Abstract

In this paper, we deal with the randomized generalized diffusion equation with delay: u_t(t, x) = a²u_xx(t, x) + b²u_xx(t − τ, x), t > τ, 0 ≤ x ≤ l; u(t, 0) = u(t, l) = 0, t ≥ 0; u(t, x) = φ(t, x), 0 ≤ t ≤ τ, 0 ≤ x ≤ l. Here, τ > 0 and l > 0 are constants. The coefficients a² and b² are nonnegative random variables, and the initial condition φ(t, x) and the solution u(t, x) are random fields. A formal series solution is constructed by the method of separation of variables. We rigorously prove that the series satisfies the delay diffusion problem in the random Lebesgue sense. By truncating the series, the expectation and the variance of the random-field solution can be approximated.
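
As a brief illustrative sketch, assuming the homogeneous Dirichlet boundary conditions stated above, separation of variables reduces the problem to a random delay ordinary differential equation for each Fourier sine coefficient of the solution; the symbols T_n, φ_n, and the truncation order N below are notation introduced only for this sketch, not taken from the paper:

\[
u(t,x) = \sum_{n=1}^{\infty} T_n(t)\,\sin\!\left(\frac{n\pi x}{l}\right), \qquad
T_n'(t) = -\left(\frac{n\pi}{l}\right)^{2}\left(a^{2}\,T_n(t) + b^{2}\,T_n(t-\tau)\right), \quad t > \tau,
\]
\[
T_n(t) = \varphi_n(t) := \frac{2}{l}\int_0^l \varphi(t,x)\,\sin\!\left(\frac{n\pi x}{l}\right)dx, \quad 0 \le t \le \tau.
\]

Truncating at order N gives u_N(t, x) = Σ_{n=1}^{N} T_n(t) sin(nπx/l); under this sketch, its expectation and variance provide computable approximations to E[u(t, x)] and Var[u(t, x)].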
