Abstract
In this paper, we study a nonlinear elliptic equation arising in astrophysics:
$$-\Delta u + V(x)u = \frac{|u|^{2_*-2}u}{|x'|} + Q(x)|u|^{q-2}u, \qquad x := (x', x'') \in \mathbb{R}^m \times \mathbb{R}^{n-m}, \tag{0.1}$$
where $n \ge 3$, $2 \le m < n$, $2_* = \frac{2(n-1)}{n-2}$, $2 < q < \frac{2n}{n-2}$, and $V(x), Q(x) \in C(\mathbb{R}^n)$. The partly singular term $\frac{|u|^{2_*-2}u}{|x'|}$ in (0.1) forces us to establish a refined Sobolev inequality involving a partly weighted Morrey norm, which generalizes a similar inequality obtained by G. Palatucci and A. Pisante (2014) [25]. Benefiting from the new inequality, we obtain a global compactness result and some existence results for (0.1), which extend the results obtained by Y. B. Deng et al. (2012) [12]. Our strategy turns out to be more concise because we avoid the use of Lévy concentration functions.
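For orientation, the unweighted prototype of such a refined inequality, in the spirit of the Palatucci–Pisante result cited above, interpolates the critical Sobolev norm between the gradient norm and a Morrey norm; a hedged sketch (the exact exponents and the partly weighted variant proved in the paper may differ) reads:

$$\|u\|_{L^{2^*}(\mathbb{R}^n)} \le C\, \|\nabla u\|_{L^2(\mathbb{R}^n)}^{\theta}\, \|u\|_{\mathcal{M}^{r,\lambda}(\mathbb{R}^n)}^{1-\theta}, \qquad \|u\|_{\mathcal{M}^{r,\lambda}} := \sup_{x_0 \in \mathbb{R}^n,\, R>0} R^{\frac{\lambda-n}{r}} \|u\|_{L^r(B(x_0,R))},$$

for suitable $1 \le r < 2^* = \frac{2n}{n-2}$, $\lambda \in (0,n)$, and some $\theta \in (0,1)$ depending on $n$, $r$, and $\lambda$. Such interpolation inequalities are what make profile-decomposition and global compactness arguments possible without Lévy concentration functions.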