Abstract

We study existence and regularity of distributional solutions for possibly degenerate quasi-linear parabolic problems having a first order term which grows quadratically in the gradient. The model problem we refer to is the following:

$$
\begin{cases}
u_t - \operatorname{div}\bigl(\alpha(u)\nabla u\bigr) = \beta(u)\,|\nabla u|^2 + f(x,t), & \text{in } \Omega \times ]0,T[,\\[2pt]
u(x,t) = 0, & \text{on } \partial\Omega \times ]0,T[,\\[2pt]
u(x,0) = u_0(x), & \text{in } \Omega.
\end{cases}
\tag{1}
$$

Here Ω is a bounded open set in $\mathbb{R}^N$ and T > 0. The unknown function u = u(x,t) depends on x ∈ Ω and t ∈ ]0,T[. The symbol ∇u denotes the gradient of u with respect to x. The real functions α, β are continuous; moreover α is positive, bounded and may vanish at ±∞. As far as the data are concerned, we require the following assumptions:

$$
\int_\Omega \Phi\bigl(u_0(x)\bigr)\,dx < \infty,
$$

where Φ is a convenient function which is superlinear at ±∞, and

$$
f \in L^r\bigl(0,T; L^q(\Omega)\bigr) \quad \text{with} \quad \frac{1}{r} + \frac{N}{2q} \leq 1.
$$

We give sufficient conditions on α and β in order to have distributional solutions. We point out that the assumptions on the data do not guarantee in general the boundedness of the solutions; this means that the coercivity of the principal part of the operator can really degenerate. Moreover, a boundedness result is proved when the assumptions on the data are strengthened.
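As a purely illustrative example (not taken from the paper, and not necessarily covered by its precise sufficient conditions), coefficients compatible with the qualitative assumptions above could be

$$
\alpha(s) = \frac{1}{(1+|s|)^{\gamma}}, \qquad \beta(s) = b, \qquad \gamma > 0,\ b \in \mathbb{R},
$$

so that α is positive, bounded and vanishes at ±∞, while β is continuous. With such an α the ellipticity of the principal part is not bounded away from zero where |u| is large, which is the kind of degeneracy the superlinear condition on Φ(u₀) is meant to control.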
