Characterizing datasets for data deduplication in backup applications
- 1 December 2010
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE International Symposium on Workload Characterization (IISWC'10)
Abstract
The compression and throughput performance of a data deduplication system is directly affected by the input dataset. We propose two sets of evaluation metrics, and the means to extract those metrics, for deduplication systems. The first set of metrics represents how the composition of segments changes within the deduplication system over five full backups, which in turn provides insight into how the compression ratio will change as data accumulate. The second set of metrics represents the index-table fragmentation caused by duplicate elimination and the arrival rate at the underlying storage system. We show that, while shorter sequences of unique data may be bad for index caching, they provide a more uniform arrival rate, which improves overall throughput. Finally, we compute the metrics derived from the datasets under evaluation and show how the datasets perform under the different metrics. Our evaluation shows that backup datasets typically exhibit patterns in how they change over time and that these patterns are quantifiable in terms of how they affect the deduplication process. This quantification allows us to: 1) decide whether deduplication is applicable, 2) provision resources, 3) tune the deduplication parameters, and 4) potentially decide which portion of the dataset is best suited for deduplication.
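To make the compression-ratio metric concrete, here is a minimal sketch of segment-based deduplication: data is split into fixed-size segments, each segment is fingerprinted, and the ratio of total segments to unique segments approximates the achievable compression. This is an illustrative toy (the segment size, SHA-1 fingerprinting, and the `dedup_ratio` helper are assumptions for the example), not the paper's actual measurement pipeline, which uses content-defined chunking over real backup streams.

```python
import hashlib

def dedup_ratio(data: bytes, segment_size: int = 8) -> float:
    """Split data into fixed-size segments and return
    (total segments) / (unique segments) as a rough compression ratio."""
    index = set()  # stands in for the fingerprint index table
    total = 0
    for i in range(0, len(data), segment_size):
        segment = data[i:i + segment_size]
        # Fingerprint the segment; identical segments collide in the index.
        index.add(hashlib.sha1(segment).hexdigest())
        total += 1
    return total / len(index) if index else 1.0

# A "backup" made of one repeated 8-byte segment deduplicates 4:1:
print(dedup_ratio(b"ABCDEFGH" * 4))  # -> 4.0
```

Tracking how this ratio evolves as successive full backups are appended to `data` is, in spirit, what the paper's first set of metrics captures.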