Abstract

We demonstrate a situation in which the wavelength dependence of the intrinsic linear polarization of stellar radiation matches that of the interstellar linear polarization described by the Serkowski law. Such a situation can arise when the radiation from a star with a dipole magnetic field is scattered in a circumstellar plasma shell with a uniform electron density distribution. Using this model, we estimate the magnetic field strength at the photospheric phase of Supernova 1999gi. We show that, in principle, intrinsic polarization in Galactic stars can be disguised as interstellar polarization.
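The Serkowski law referenced above gives the empirical wavelength dependence of interstellar linear polarization, p(λ) = p_max · exp[−K ln²(λ_max/λ)], where p_max is the peak polarization reached at wavelength λ_max and K controls the width of the curve (K ≈ 1.15 in the original formulation). A minimal sketch, with illustrative parameter values not taken from the paper:

```python
import math

def serkowski(wavelength_um: float, p_max: float, lam_max_um: float,
              K: float = 1.15) -> float:
    """Interstellar linear polarization per the Serkowski law.

    p(lam) = p_max * exp(-K * ln^2(lam_max / lam))

    wavelength_um, lam_max_um : wavelengths in microns
    p_max                     : peak polarization (e.g. fractional or percent)
    K                         : width parameter (1.15 in Serkowski's fit)
    """
    return p_max * math.exp(-K * math.log(lam_max_um / wavelength_um) ** 2)

# Example with hypothetical values: p_max = 2% at lam_max = 0.55 um (V band).
# The curve peaks at lam_max and falls off symmetrically in ln(lambda).
p_V = serkowski(0.55, 2.0, 0.55)   # equals p_max at the peak
p_U = serkowski(0.36, 2.0, 0.55)   # lower in the ultraviolet
```

Matching an intrinsic (scattering-induced) polarization curve to this functional form is what allows the intrinsic component to masquerade as interstellar polarization.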
