Abstract

We study nonlocal diffusion models of the form $$(\gamma(u))_t(t, x) = \int_{\Omega} J(x-y)\,(u(t, y) - u(t, x))\, dy.$$ Here $\Omega$ is a bounded smooth domain and $\gamma$ is a maximal monotone graph in $\mathbb{R}^2$. This is a nonlocal diffusion problem analogous to the usual Laplacian with Neumann boundary conditions. We prove existence and uniqueness of solutions with initial conditions in $L^1(\Omega)$. Moreover, when $\gamma$ is a continuous function we determine the asymptotic behaviour of the solutions: as $t \to \infty$ they converge to the mean value of the initial condition.
