Analyzing data: Why a “bigger is better” mentality may be at odds with intelligent information governance. The 5 V’s of big data – Volume, Velocity, Variety, Value and Veracity – keep in-house counsel from turning “big data” into “bad data.” http://bit.ly/Q4l56V @dean_gonsowski @InsideCounsel
Watch Ed Moke, VP of Operations at Index Engines, explain the process of legacy backup tape remediation. Purging legacy disaster recovery tapes saves offsite storage costs (Iron Mountain, Recall and others) and eliminates the risk and liability of unmanaged user files and email. Index Engines supports direct indexing, search and extraction of files and email from tapes created using Symantec NetBackup and Backup Exec, IBM TSM, EMC Legato Networker, CA ArcServe, HP DataProtector, and others. Simply load DLT, LTO, AIT and other tapes into a library and automate the indexing and extraction of content based on legal and compliance policy. No need for expensive tape restoration or the original backup software.
Learn more about Tape Remediation
On average, each gigabyte of data represents approximately $18,000 in e-discovery costs.
On a per-gigabyte basis, costs ranged from $125 to $6,700 for the collection of data, from $600 to $6,000 for processing electronic data, and from $1,800 to $210,000 for the legal review.
It is unrealistic to expect much room for improvement in the rates of human review. While it costs only about $0.20 a day to buy one gigabyte of storage, it costs far more to have that stored data reviewed in the event of litigation. Thus, establishing a process and procedure for retaining only necessary data and deleting unnecessary data will reduce costs by reducing the volume of data that must be reviewed. Full Article>
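The cost asymmetry above can be made concrete with a little arithmetic. This sketch uses only the figures cited in the article ($0.20 to store a gigabyte versus roughly $18,000 on average to take it through e-discovery); the function name and the purge fraction are illustrative, not from any source.

```python
# Illustrative arithmetic for the storage-vs-review cost asymmetry
# described above. Figures come from the article: ~$0.20 to buy a
# gigabyte of storage vs. ~$18,000 on average in e-discovery costs.

STORAGE_COST_PER_GB = 0.20        # one-time storage cost, per article
EDISCOVERY_COST_PER_GB = 18_000   # average e-discovery cost, per article

def ediscovery_exposure(total_gb: float, purge_fraction: float) -> dict:
    """Estimate e-discovery exposure before and after a defensible purge.

    purge_fraction is the share of data deleted under a retention
    policy (a hypothetical parameter used for illustration).
    """
    retained_gb = total_gb * (1 - purge_fraction)
    return {
        "cost_before": total_gb * EDISCOVERY_COST_PER_GB,
        "cost_after": retained_gb * EDISCOVERY_COST_PER_GB,
        "savings": (total_gb - retained_gb) * EDISCOVERY_COST_PER_GB,
    }

# Purging 40% of a 100 GB collection avoids roughly $720,000
# in potential review costs.
print(ediscovery_exposure(100, 0.40))
```

Even a modest purge fraction dwarfs the storage savings, which is the article's point: the money is in reducing what must be reviewed, not what must be stored.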
Efficiency starts with good communication between outside counsel, in-house counsel and the IT department
Identification: it is helpful if an organization has a data map or server topology (a visual representation of a company’s network systems).
Preservation: The first step in preserving ESI is to implement a litigation hold. Outside counsel and in-house counsel should determine at the outset whether any issues might prevent the proper preservation of relevant ESI. Counsel should be mindful of recently implemented policies, network upgrades and system changes that affect older data. Retired computer systems (also called legacy systems) may cause problems because data upgrades or migrations can affect data integrity. Moreover, preservation issues often arise with network backup systems because they are not designed for retrieval in connection with litigation. Learn more on how to get to legacy data>
Data may only need to be preserved for certain key custodians. Counsel may want to limit preservation to a specific date range, target only specific types of data, or focus on specific locations or subject matters. Ideally, the goal is to preserve only the ESI that a company may need for a particular matter. Full Article>
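The scoping decisions above (key custodians, a date range, specific data types) amount to a filter over an ESI inventory. A minimal sketch, assuming a simple list of metadata records; the field names and sample data are hypothetical, not drawn from any product.

```python
# A minimal sketch of scoping a preservation effort. The inventory
# format and field names here are illustrative assumptions.
from datetime import date

def in_scope(item: dict, custodians: set, start: date, end: date,
             file_types: set) -> bool:
    """Return True if an item falls within the preservation scope."""
    return (item["custodian"] in custodians
            and start <= item["modified"] <= end
            and item["type"] in file_types)

inventory = [
    {"custodian": "jdoe", "modified": date(2011, 3, 5), "type": "email"},
    {"custodian": "asmith", "modified": date(2009, 1, 2), "type": "doc"},
]

# Preserve only jdoe's email and documents from 2010-2012.
hold = [i for i in inventory
        if in_scope(i, {"jdoe"}, date(2010, 1, 1), date(2012, 12, 31),
                    {"email", "doc"})]
print(len(hold))  # only jdoe's 2011 email is in scope
```

Narrowing on all three axes at once is what keeps the hold defensible yet proportionate, which is the goal the article describes.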
IT is taking on greater responsibility in e-discovery – the process of exchanging documents in electronic format during the evidence-gathering period prior to a trial.
The courts have gradually been getting tougher about e-discovery requirements, and some heavy penalties for non-compliance have been paid. Karnick (Glaser Weil’s CIO) said that he is seeing less tolerance among judges for e-discovery failures that result from ignorance.
Back-up tapes “might be the most expensive type of forensic function there is,” says Karnick. “You have to figure out what’s on the tape and then replicate the system that was originally used so you can pull the data off of them.” – Read Complete Article>
Index Engines has streamlined access to legacy data on backup tapes. No more replicating legacy systems or restoring tapes. Learn how>
Be the first to see Index Engines’ latest version of the Octane Intelligent Discovery platform. Exciting new features that extend the workflow and increase overall functionality for eDiscovery will be announced at a webinar on Sept. 27.
Learn about new load file support for Relativity, support for OCR scanned images and PDFs, and high-speed Forensic Image processing – no mounting required. Attend this webinar
Strategic advantages to a centralized corporate platform include early, direct access to ESI in the wild, single instance collection storage, shared indexes, cross matter designations and universal chain of custody.
So how do you minimize the distortion of the telephone game and gain confidence in your process? That is exactly what is driving technology providers to add centralized workflow and collaboration features to their offerings.
Read full article from eDiscovery Journal’s expert Greg Buckles:
With company data growth constantly on the rise, a key ongoing concern for information governance and eDiscovery professionals is managing this data and reacting quickly to legal hold and preservation requests. That is the finding of a recent survey of legal practitioners that examined the key drivers behind investments in information governance and eDiscovery solutions.
Specifically, the survey found that compliance was the number one challenge for 78% of respondents. Respondents also expressed concern over enforcing company-wide policies and are struggling to implement strategies to manage their data, underscoring the importance of implementing information governance policies.
This study underscores the need for organizations to consider an all-encompassing information governance platform that provides data indexing, mapping, extraction and archiving. Read entire article
New White Paper:
Data mapping provides an inventory and profile of user content and is the foundation of any corporate data policy. You must first understand what you have before you can develop a sound policy that protects the organization from harm and long-term risk. With a data map deployed, you can manage data more effectively: moving what compliance requires onto legal hold, securing sensitive content in the corporate archive, and purging data that no longer has business value. The challenge with data mapping is that the infrastructure is complex and vast.
Obtaining a view into the content has been a manual and time consuming process, and one that quickly becomes obsolete as the data evolves. With today’s technology a data map can be automated so that a comprehensive and insightful view of current data is possible. Read this White Paper and Learn more about how to Use a Data Map to understand and manage your content.
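At its simplest, the automated inventory the white paper describes is a walk over the file infrastructure that profiles content by type and size. The sketch below is a deliberately simplified illustration, not Index Engines' implementation: real products index full content and metadata at scale, while this only tallies file counts and bytes per extension.

```python
# A simplified sketch of what an automated data map gathers: walking
# a file tree and profiling content by type and size. Illustrative
# only; production data mapping indexes far more than this.
import os
from collections import defaultdict

def build_data_map(root: str) -> dict:
    """Summarize file counts and total bytes per extension under root."""
    profile = defaultdict(lambda: {"files": 0, "bytes": 0})
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            ext = os.path.splitext(name)[1].lower() or "<none>"
            path = os.path.join(dirpath, name)
            try:
                size = os.path.getsize(path)
            except OSError:
                continue  # skip unreadable or vanished entries
            profile[ext]["files"] += 1
            profile[ext]["bytes"] += size
    return dict(profile)
```

Running such a scan on a schedule is what keeps the map current as data evolves, rather than letting a one-time manual inventory go stale.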