Abstract

Data de-duplication divides a backup stream into chunks and eliminates duplicate chunks across the entire system, remarkably reducing the storage and bandwidth requirements for backups. However, this technique also introduces new problems: while the performance problem has been addressed by many existing solutions, the logical data deletion problem remains poorly studied. This paper studies the logical data deletion mechanism in de-duplication backup systems, analyzes the memory overhead of a Bloom filter that supports both high-performance de-duplication and logical data deletion, and proposes a lazy deletion method that minimizes the impact of logical data deletion on de-duplication performance.
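To make the interplay between de-duplication and deletion concrete, the following is a minimal sketch (not the paper's actual design) of a counting Bloom filter, the standard variant that supports deletion by replacing each bit with a small counter; all names and parameters here are illustrative assumptions.

```python
import hashlib


class CountingBloomFilter:
    """Counting Bloom filter: each slot holds a counter instead of a bit,
    so entries can be removed when a chunk's last reference is deleted.
    Hypothetical sketch, not the data structure analyzed in the paper."""

    def __init__(self, num_slots=1024, num_hashes=4):
        self.counts = [0] * num_slots
        self.num_slots = num_slots
        self.num_hashes = num_hashes

    def _indexes(self, chunk):
        # Derive k slot indexes from one SHA-256 digest of the chunk.
        digest = hashlib.sha256(chunk).digest()
        for i in range(self.num_hashes):
            piece = digest[4 * i : 4 * i + 4]
            yield int.from_bytes(piece, "big") % self.num_slots

    def add(self, chunk):
        for idx in self._indexes(chunk):
            self.counts[idx] += 1

    def remove(self, chunk):
        # Caller must only remove chunks that were added,
        # otherwise the counters become inconsistent.
        for idx in self._indexes(chunk):
            self.counts[idx] -= 1

    def __contains__(self, chunk):
        # May report false positives, but never false negatives.
        return all(self.counts[idx] > 0 for idx in self._indexes(chunk))


# De-duplicate a toy backup stream of chunks.
bf = CountingBloomFilter()
stream = [b"chunk-A", b"chunk-B", b"chunk-A"]  # third chunk is a duplicate
stored = []
for chunk in stream:
    if chunk not in bf:  # probably new: store it and record it in the filter
        bf.add(chunk)
        stored.append(chunk)
# Logical deletion: drop a chunk's entry when its last reference goes away.
bf.remove(b"chunk-B")
```

The counters roughly quadruple the filter's memory footprint compared with a plain bit-per-slot Bloom filter, which is the kind of overhead trade-off the abstract refers to.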
