Having an artificial neural network that solves Maxwell's equations in a general setting is both an intellectual challenge and of great practical utility. Recently, there have been multiple successful attempts to use artificial neural networks to predict electromagnetic fields for a given source and an interacting material distribution. However, many of these attempts are limited in domain size and restricted to object shapes similar to those seen during training. Here, we overcome these restrictions by using graph neural networks (GNNs) that adapt the propagation scheme of the finite-difference time-domain (FDTD) method to solve Maxwell's equations for a single time step. GNNs offer a significant advantage over conventional neural network architectures, such as convolutional or linear networks: size invariance. Once trained, a GNN can operate on graphs of arbitrary size and connectivity. This allows us to train it on the propagation of electromagnetic fields in small domains and subsequently expand the domain to an arbitrary scale. Moreover, GNNs can adapt to any material shape and work not only on structured grids, as used in FDTD, but also on arbitrary meshes. This work may be seen as the first benchmark for field predictions with graph networks and could be extended to more complex mesh-based optical simulations, e.g., those based on finite elements.
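Since the abstract does not specify the network architecture, the following is only a minimal, hypothetical sketch of the core idea: a single message-passing step on a graph whose nodes carry field values, playing the role of one FDTD-like time step. All names (`node_feats`, `message_passing_step`, the weight matrices) and the random "learned" parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical sketch (not the authors' code): one GNN message-passing step
# that advances per-node field values, analogous to one FDTD update.
rng = np.random.default_rng(0)

num_nodes = 16                                          # e.g. a 4x4 patch of grid points
node_feats = rng.normal(size=(num_nodes, 3))            # per-node fields, e.g. (Ez, Hx, Hy)
material = rng.uniform(1.0, 4.0, size=(num_nodes, 1))   # per-node material parameter

# Edge list of a 4x4 structured grid; the same scheme runs unchanged on any mesh.
edges = []
for i in range(4):
    for j in range(4):
        n = 4 * i + j
        if j < 3:
            edges += [(n, n + 1), (n + 1, n)]
        if i < 3:
            edges += [(n, n + 4), (n + 4, n)]
edges = np.array(edges)                                 # shape (num_edges, 2): (source, target)

# "Learned" parameters, randomly initialised here; in practice these are trained weights.
W_msg = rng.normal(scale=0.1, size=(3 + 1, 3))          # message (edge) network
W_upd = rng.normal(scale=0.1, size=(3 + 3, 3))          # node-update network


def message_passing_step(feats, eps, edges):
    """One GNN step: aggregate neighbour fields, then update each node,
    mimicking the local stencil of an FDTD time step."""
    src, dst = edges[:, 0], edges[:, 1]
    # Messages depend on the sender's fields and its material parameter.
    msg_in = np.concatenate([feats[src], eps[src]], axis=1)
    messages = np.tanh(msg_in @ W_msg)
    # Sum messages arriving at each node: permutation invariant and
    # independent of graph size, which is what enables domain expansion.
    aggregated = np.zeros_like(feats)
    np.add.at(aggregated, dst, messages)
    # Residual update: new fields = old fields + learned correction.
    upd_in = np.concatenate([feats, aggregated], axis=1)
    return feats + upd_in @ W_upd


next_feats = message_passing_step(node_feats, material, edges)
print(next_feats.shape)  # (16, 3): same graph, fields advanced by one step
```

Because the update only involves each node and its neighbours, the same trained weights can, in principle, be applied to a much larger graph or an unstructured mesh, which is the size-invariance property the abstract refers to.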