We study the problem of reconstructing a high-dimensional signal $x \in \mathbb{R}^n$ from a low-dimensional noisy linear measurement $y = Mx + e \in \mathbb{R}^\ell$, assuming $x$ admits a certain structure. We model the measurement matrix as $M = BA$, with arbitrary $B \in \mathbb{R}^{\ell \times m}$ and sub-gaussian $A \in \mathbb{R}^{m \times n}$, thereby allowing for a family of random measurement matrices which may have heavy tails, dependent rows and columns, and a large dynamic range of singular values. The structure is either given as a non-convex cone $T \subset \mathbb{R}^n$, or is induced via minimizing a given convex function $f(\cdot)$; hence our study is sparsity-free. We prove, in both cases, that an approximate empirical risk minimizer robustly recovers the signal if the effective number of measurements is sufficient, even in the presence of a model mismatch, i.e., when the signal does not exactly admit the model's structure. While in classical compressed sensing the number of independent (sub-)gaussian measurements regulates the possibility of a robust reconstruction, in our setting the effective number of measurements depends on the properties of $B$. We show that, in this model, the stable rank of $B$ indicates the effective number of measurements, and an accurate recovery is guaranteed whenever it exceeds, to within a constant factor, the effective dimension of the structure set. We apply our results to the special case of generative priors, i.e., when $x$ is close to the range of a Generative Neural Network (GNN) with ReLU activation functions. Moreover, if the GNN has random weights in the last layer, our theory allows a partial Fourier measurement matrix, thus taking a first step towards a theoretical analysis of compressed sensing MRI with GNNs. Our work relies on a recent result in random matrix theory by Jeong, Li, Plan, and Yılmaz [1].
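The stable rank that governs the effective number of measurements has a standard closed form, $\mathrm{srank}(B) = \|B\|_F^2 / \|B\|_2^2$, the squared Frobenius norm over the squared operator norm. The following minimal sketch (not from the paper; the matrices are illustrative) computes it and shows how it can fall far below the ordinary rank when the singular values have a large dynamic range:

```python
import numpy as np

def stable_rank(B: np.ndarray) -> float:
    """Stable rank: squared Frobenius norm / squared spectral norm.

    Always satisfies 1 <= stable_rank(B) <= rank(B); it is small when a
    few singular values dominate, which shrinks the effective number of
    measurements in the M = BA model.
    """
    fro_sq = np.linalg.norm(B, "fro") ** 2          # sum of squared singular values
    op_sq = np.linalg.norm(B, 2) ** 2               # largest squared singular value
    return fro_sq / op_sq

# Well-conditioned B: stable rank matches the full rank.
B_flat = np.eye(6)
print(stable_rank(B_flat))        # 6.0

# Ill-conditioned B: full rank 6, but one dominant singular value
# makes the stable rank (and hence the effective measurement count) small.
B_spiked = np.diag([10.0, 0.1, 0.1, 0.1, 0.1, 0.1])
print(stable_rank(B_spiked))      # close to 1
```

In the paper's setting, recovery is guaranteed once this quantity exceeds, up to a constant, the effective dimension of the structure set; the sketch only illustrates the quantity itself, not the recovery procedure.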