We examine the case where the circumstellar medium around a supernova is sufficiently opaque that a radiation-dominated shock propagates in the circumstellar region. The initial propagation of the shock front into the circumstellar region can be approximated by a self-similar solution that determines the radiative energy in the shocked shell; the eventual escape of this energy gives the maximum luminosity of the supernova. If the circumstellar density is described by \rho=Dr^{-2} out to a radius R_w, where D is a constant, the properties of the shock breakout radiation depend on R_w and on R_d\equiv\kappa Dv_{sh}/c, where \kappa is the opacity and v_{sh} is the shock velocity. If R_w>R_d, the rise to maximum light begins at a time ~R_d/v_{sh}; the duration of the rise is also ~R_d/v_{sh}; the outer parts of the opaque medium are extended and at low velocity at the time of peak luminosity; and a dense shell forms whose continued interaction with the dense mass loss gives a characteristic flatter portion of the declining light curve. If R_w<R_d, the rise to maximum light begins at ~R_w/v_{sh}; the duration of the rise is ~R_w^2/(v_{sh}R_d); the outer parts of the opaque medium are not extended and are accelerated to high velocity by radiation pressure at the time of maximum luminosity; and a dense shell forms but does not affect the light curve near maximum. We argue that SN 2006gy is an example of the first kind of event, while SN 2010gx and related supernovae are examples of the second.
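The scalings above can be made concrete with a short numerical sketch. The helper below evaluates the diagnostic radius R_d = \kappa D v_{sh}/c and the two breakout timescales for an assumed steady wind, for which D = \dot{M}/(4\pi v_w); the numeric parameter choices (mass-loss rate, wind speed, opacity, shock velocity) are illustrative assumptions, not values taken from the text.

```python
import math

# Illustrative sketch (assumed parameters, not from the abstract itself):
# evaluate R_d = kappa * D * v_sh / c for a steady wind with
# rho = D r^-2, D = Mdot / (4 pi v_w), and the breakout rise timescales
# in the two regimes R_w > R_d and R_w < R_d.

M_SUN = 1.989e33    # g
YEAR = 3.156e7      # s
C_LIGHT = 2.998e10  # cm/s

def wind_density_parameter(mdot_msun_yr, v_wind_cm_s):
    """D in rho = D r^-2 for a steady wind: D = Mdot / (4 pi v_w)."""
    mdot = mdot_msun_yr * M_SUN / YEAR
    return mdot / (4.0 * math.pi * v_wind_cm_s)

def diffusion_radius(kappa, D, v_sh):
    """R_d = kappa * D * v_sh / c."""
    return kappa * D * v_sh / C_LIGHT

def rise_timescales(R_w, R_d, v_sh):
    """Return (time when the rise begins, duration of the rise)."""
    if R_w > R_d:
        # Breakout occurs inside the wind: both scales are ~R_d/v_sh.
        return R_d / v_sh, R_d / v_sh
    # Breakout at the wind's outer edge: onset ~R_w/v_sh,
    # rise duration ~R_w^2 / (v_sh R_d).
    return R_w / v_sh, R_w**2 / (v_sh * R_d)

# Hypothetical SN 2006gy-like numbers: Mdot = 0.1 Msun/yr, v_w = 100 km/s,
# electron-scattering opacity kappa = 0.34 cm^2/g, v_sh = 10^9 cm/s.
D = wind_density_parameter(0.1, 1.0e7)
R_d = diffusion_radius(0.34, D, 1.0e9)
print(f"D   = {D:.2e} g/cm")
print(f"R_d = {R_d:.2e} cm  (R_d/v_sh = {R_d / 1.0e9 / 86400:.1f} d)")
```

For these assumed parameters R_d comes out near 6 x 10^14 cm, so R_d/v_{sh} is of order a week, broadly consistent with the slow rise of an event in the R_w > R_d regime.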