Abstract

Observations with the James Webb Space Telescope (JWST) have revolutionized extragalactic research, particularly through the discovery of little red dots (LRDs), a population of dust-reddened broad-line active galactic nuclei (AGNs). Their distinctive V-shaped spectral feature, a red rest-frame optical continuum combined with a UV excess, makes it challenging to disentangle the relative contributions of the host galaxy and the AGN. We study a spectral energy distribution (SED) model for LRDs spanning rest-frame UV to infrared bands. We hypothesize that the incident radiation from an AGN with a typical SED is embedded in an extended dusty medium whose extinction law resembles those seen in dense regions such as the Orion Nebula or certain AGN environments. The UV–optical spectrum is described by dust-attenuated AGN emission, featuring a red optical continuum at λ > 4000 Å and a flat UV spectral shape at λ < 3000 Å produced by a gray extinction curve that arises from the absence of small grains; no additional stellar emission or scattered AGN light is required. In the infrared, the SED is shaped by an extended dust and gas distribution (ρ ∝ r⁻γ with γ < 1) with characteristic gas densities of ≃10–10³ cm⁻³, which allows relatively cool dust temperatures to dominate the radiation. As a result, these dust structures shift the emission energy peak from near-infrared to mid-infrared bands in the rest frame; for sources at z ~ 4–7, the corresponding observed wavelengths shift from the JWST/MIRI range to the Herschel range. Unlike typical AGN hot-torus models, this model produces an infrared SED flattening that is consistent with LRD observations through JWST/MIRI. Such a density structure can arise from the coexistence of inflows and outflows during the early assembly of galactic nuclei, which may explain why LRDs emerge preferentially in the high-redshift Universe, at cosmic ages younger than 1 billion years.