Abstract

It is well-known that wave-type equations with memory, under appropriate assumptions on the memory kernel, are uniformly exponentially stable. On the other hand, time delay effects may destroy this behavior. Here, we consider the stabilization problem for second-order evolution equations with memory and intermittent delay feedback. We show that, under suitable assumptions involving the delay feedback coefficient and the memory kernel, asymptotic or exponential stability is still preserved. In particular, asymptotic stability is guaranteed if the delay feedback coefficient belongs to $$L^1(0,+\infty)$$ and the time intervals where the delay feedback is off are sufficiently large.
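
As an illustrative prototype of such a model (a sketch only; the symbols $$\Omega$$, $$\mu$$, $$b$$, and $$\tau$$ are assumptions made here and need not match the paper's exact abstract formulation), one may keep in mind a viscoelastic wave equation with an intermittently acting delayed damping term,

$$u_{tt}(x,t) - \Delta u(x,t) + \int_0^{t} \mu(s)\, \Delta u(x,t-s)\, ds + b(t)\, u_t(x,t-\tau) = 0, \qquad x \in \Omega,\ t > 0,$$

where $$\mu$$ is the memory kernel, $$\tau > 0$$ is a fixed time delay, and $$b(t) \ge 0$$ is the delay feedback coefficient, which vanishes on the time intervals where the delay feedback is switched off.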
