What surprising patterns do fluid dynamics and neurobiological networks have in common? In this issue's first Research Spotlights article, “Bump Attractors and Waves in Networks of Leaky Integrate-and-Fire Neurons,” a connection is made between the waves that arise at the onset of pipe turbulence and the localized traveling waves in spiking neural networks. Authors Daniele Avitabile, Joshua L. Davis, and Kyle Wedgwood consider a so-called “discrete integrate-and-fire model,” which describes the firing dynamics of a network of connected neurons via a system of piecewise linear ordinary differential equations. Constructing a solution to this system effectively amounts to determining the firing times of the neurons. One can then observe “bump” patterns, characterized in this case by a core of neurons that fire frequently while neurons away from the core do not fire. The model's deterministic, chaotic behavior can then be studied around a firing set and used to construct waves and analyze their stability. Simulations and analysis reveal oscillatory behavior and nested branching of these waves, which can culminate in a bump attractor and a wandering wave. The authors show that these complex spatiotemporal patterns, and the conditions for their stability, are not unlike those witnessed in the transition to turbulence in a pipe. The paper concludes with a challenge to future researchers: can further dynamical systems characterizations of such event-driven models uncover links between localized waves and bumps in complex spatial networks?

The second Research Spotlights article addresses the problem of identifying a dynamical system from noisy observations of its trajectory data. In “Learning Dynamical Systems with Side Information,” authors Amir Ali Ahmadi and Bachir El Khadir present a way to incorporate prior system knowledge to improve the learning process beyond what is possible using data alone.
Such knowledge, or “side information,” arises when one has contextual information about the unknown dynamics without knowing the dynamics precisely. Examples of side information covered by the authors' framework include known trajectories (e.g., knowledge of some equilibrium points), group symmetry, and coordinatewise properties (e.g., nonnegativity or monotonicity). Restricting attention to a particular class of candidate functions and a particular type of side information allows the learning process to be posed as a semidefinite optimization problem that is amenable to efficient solution. The approach is demonstrated numerically on systems modeling the diffusion of a contagion, the behavior of a simple pendulum, and the time evolution of a cancerous tumor, as well as on the classical Lorenz system. In each case, imposing side information via semidefinite programming allows the authors to learn the unknown dynamics from far fewer observations than would otherwise be possible, and it provides a potential road map for addressing even broader classes of systems and side information.
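To give a flavor of how side information can constrain a regression, the following toy sketch (not the authors' sum-of-squares/semidefinite machinery, and using a hypothetical example system rather than any from the paper) fits a polynomial vector field f(x) = a0 + a1*x + a2*x^2 to noisy samples of a true one-dimensional dynamics f(x) = x - x^2. The side information is a known equilibrium at x = 0, which forces f(0) = 0, i.e., a0 = 0, shrinking the search space before any data are consulted:

```python
import random

# Toy illustration of side information in system identification.
# True (hypothetical) dynamics: f(x) = x - x^2. Side information:
# x = 0 is a known equilibrium, so the constant term a0 must vanish.

random.seed(0)
f_true = lambda x: x - x * x
xs = [i / 10 for i in range(-10, 11)]               # sample points in [-1, 1]
ys = [f_true(x) + random.gauss(0, 0.05) for x in xs]  # noisy observations of f

def lstsq(cols, y):
    """Least squares via the normal equations A^T A c = A^T y,
    solved by Gaussian elimination (cols are the columns of A)."""
    n = len(cols)
    M = [[sum(u * v for u, v in zip(cols[i], cols[j])) for j in range(n)]
         for i in range(n)]
    b = [sum(u * yi for u, yi in zip(cols[i], y)) for i in range(n)]
    for k in range(n):                              # elimination w/ pivoting
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        b[k], b[p] = b[p], b[k]
        for r in range(k + 1, n):
            fac = M[r][k] / M[k][k]
            for c in range(k, n):
                M[r][c] -= fac * M[k][c]
            b[r] -= fac * b[k]
    c = [0.0] * n                                   # back substitution
    for k in reversed(range(n)):
        c[k] = (b[k] - sum(M[k][j] * c[j] for j in range(k + 1, n))) / M[k][k]
    return c

ones = [1.0] * len(xs)
lin  = list(xs)
quad = [x * x for x in xs]

a_plain = lstsq([ones, lin, quad], ys)   # learned from data alone
a_side  = [0.0] + lstsq([lin, quad], ys) # equilibrium side info: a0 = 0 imposed

print("unconstrained:  ", a_plain)
print("with side info: ", a_side)
```

The constrained fit recovers coefficients close to (0, 1, -1) with the equilibrium at the origin satisfied exactly rather than approximately; the paper's framework plays an analogous game, but with semidefinite constraints that can enforce richer properties (invariance, symmetry, monotonicity) over whole regions rather than at a single point.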