by Jim McGann
Data centers increase storage capacity year after year, relentlessly trying to keep up with prolific users. This strategy has worked because the cost of storage has plummeted over the years: adding capacity was as simple as a call to the vendor of your choice. Problem solved?
Recently we have seen a surge in data classification projects. These projects are attacking a number of very different use cases. Here are some recent examples:
- Classification of 2.8 PB of user data on a shared network server. Analysis found that 36% of the content was aged system files with no business value, and 1 PB of storage capacity was reclaimed. Happy client. Use Case: Cost Savings
- Security assessment of 25 network servers to determine whether sensitive email/PSTs exist on the network. The scan found 2,376 PSTs: 57% had not been accessed in years, 28% were abandoned by ex-employees, and 32% were active and had been accessed in the last 6 months. Use Case: Privacy
- 676 TB of high-profile network storage classified for ROT (redundant, obsolete, trivial) analysis. 12% of the data was purged and 19% was archived in the cloud; 275,000 files containing high-value intellectual property were identified and secured. Use Case: Cost Savings and Privacy
These projects started with the classification of unstructured data, which provided the knowledge needed to make decisions and develop a disposition strategy. Many were initially driven by cost; the end result, however, also supported privacy efforts.
Is cost the driving factor for the resurgence of data classification?
With the looming EU General Data Protection Regulation (GDPR) will privacy take a front seat?
Download our newest eBook, Harnessing Metadata for Streamlined Data Management and Governance, to find out how data classification can drive down costs and help privacy efforts.