Document Review Management Best Practices: Daily Reports
An MBA professor of mine used to be fond of saying “data drives decisions.” His point was that the more information you can gather, the more informed a decision you can make. In the context of document review, daily reports can be a timely and efficient method of communicating that critical information. The first few days, and even weeks, of any document review project are full of questions and uncertainty. Does the mere mention of a term make a document responsive? How substantive does a document have to be to be considered “hot”? Am I tagging too many documents as “not responsive”?
In many cases, the document review protocol that is used to set the scope and terms of the review is drafted prior to the review of any documents. Further, the review sets may have been created using search terms that could be overly broad or too narrow. Both the protocol and review sets may need to be updated before the review team gets too far down the road. Indeed, the first days and weeks can often feel as if the review is spinning its wheels in the mud trying to gain traction.
Daily Reports Provide Timely Insight
One way to gauge whether a document review is on the right track is to have the people reviewing the documents submit daily reports describing what they are seeing and how they are coding those documents. As part of our standard review protocol, at the end of each day, we require all reviewers to provide a report that generally describes the types of documents they are seeing, attaches 4-5 sample documents and includes a brief description of each. The review manager then prepares the reports for the case team’s or client’s review.
These reports serve multiple purposes:
- Provide a sense of the type of documents in the review set;
- Provide (and attach) examples of documents the reviewers consider to be “hot” or important;
- Provide insight into how well the review team understands the issues in the case and goals of the review;
- Give the review manager some sense of the pace of the review;
- Provide insight into whether the correct custodians and data sources were identified for collection; and
- Provide an indication as to the usefulness of the search terms that were used to create the review batches.
This insight can apply to the review of both client documents and opposing or third-party productions. Early in the review, daily reports give the case team an opportunity to provide feedback that helps the review run more smoothly as it progresses. For example, if a specific type of document is getting tagged as not responsive by multiple reviewers but is actually hot, it could mean that the protocol is unclear or that the review training provided conflicting instructions. Or, if that type of document really is not responsive, you may be able to mass-tag it and reduce the universe of documents needing review. Addressing and resolving confusion or uncertainty early in the review increases the effectiveness of the review and decreases the chance of inconsistent coding.
With so many reviews now being conducted remotely, mandating and reviewing daily reports is even more critical. We no longer have the opportunity to get all reviewers together to brainstorm or discuss common issues they are encountering. Pre-COVID, reviewers sitting in the same room could easily bounce ideas off each other or talk through different interpretations of the documents. While we encourage our review teams to communicate via an email group and we hold regular group video meetings where the team can ask questions, the flow of information post-COVID is inevitably more formal and less frequent.
Creating daily reports also allows case teams to adjust strategy by, for example, revising search terms or updating the protocol and tagging instructions. If all reviewers are reporting a high percentage of documents that are correctly tagged as not responsive under the protocol, the case team may want to investigate. Do the search terms need to be adjusted to exclude certain types of documents? Does the protocol need to be revisited to include these types of documents? Does the case team need to reach out to opposing counsel and raise the high percentage of documents that are not responsive to any request? Whatever the reason, the earlier the issue can be addressed and resolved, the more accurate and complete the review will be.
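If the coding decisions can be exported from the review platform, a quick tally makes that call easier. Below is a minimal Python sketch that computes each reviewer’s not-responsive rate from a hypothetical coding_export.csv; the file name, column names and 70% alert threshold are all assumptions for illustration, not features of any particular review platform.

```python
import csv
from collections import Counter, defaultdict

# Hypothetical export of coding decisions: one row per document, with
# "reviewer" and "responsiveness" columns (assumed layout, for illustration).
NON_RESPONSIVE_ALERT = 0.70  # illustrative threshold, not an industry standard

totals = defaultdict(Counter)
with open("coding_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        totals[row["reviewer"]][row["responsiveness"].strip().lower()] += 1

for reviewer, counts in sorted(totals.items()):
    reviewed = sum(counts.values())
    rate = counts["not responsive"] / reviewed if reviewed else 0.0
    flag = "  <-- revisit search terms or protocol?" if rate >= NON_RESPONSIVE_ALERT else ""
    print(f"{reviewer}: {reviewed} docs reviewed, {rate:.0%} not responsive{flag}")
```

A tally like this does not answer the strategic questions above, but it flags which reviewers or batches are driving the numbers so the case team can dig into the substance reported in the daily reports.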
Daily Reports Help Gauge Pace of the Review
The daily reports also give the review manager another tool to gauge the pace of the review and help determine whether meeting deadlines could become an issue. Applications such as Case Metrics in Relativity, which provide review metrics such as the number of documents viewed, the number of documents coded per hour and time spent in the database, offer only a partial picture of review progress. For example, if a reviewer is stuck on a batch of lengthy board meeting presentations or complicated financial reports, their progress may be slower. While Case Metrics is certainly helpful, it is incomplete without more information about the substance of the documents; daily reports round out the picture and enable the review manager to address potential issues early and often.
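To show the kind of back-of-the-envelope projection a review manager might layer on top of those metrics, here is a short Python sketch that compares the remaining document count against the observed daily pace. Every figure in it (document counts, reviewer headcount, pace and deadline) is an invented example, not data from any real review or from Case Metrics itself.

```python
from datetime import date, timedelta

# Illustrative assumptions only; in practice these figures come from the review
# platform's metrics and the substance described in the daily reports.
remaining_docs = 48_000
reviewers = 20
docs_per_reviewer_per_day = 400   # blended pace, adjusted down for dense documents
deadline = date(2024, 6, 28)

days_needed = remaining_docs / (reviewers * docs_per_reviewer_per_day)

today = date.today()
workdays_left = sum(
    1
    for offset in range(max((deadline - today).days + 1, 0))
    if (today + timedelta(days=offset)).weekday() < 5   # Monday through Friday
)

print(f"Workdays needed: {days_needed:.1f}  |  workdays remaining: {workdays_left}")
if days_needed > workdays_left:
    print("Projected to miss the deadline; consider re-batching or adding reviewers.")
```

The point of pairing a projection like this with the daily reports is that the narrative explains the numbers: a slow day spent in board presentations reads very differently from a slow day spent on routine email.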
The LitSmart® Daily Report
Previously, our daily reports consisted of an email from each reviewer listing and describing hot documents and attaching branded copies of those documents. The review manager would then compile the daily reports into one submission for the case team. For most smaller reviews (fewer than five reviewers on a one-week review), that process worked very well. On larger reviews, however, reviewing and compiling daily reports became arduous. For example, if we had 25 reviewers going through four batches a day and each finding a half dozen or so hot documents on average, I, as the review manager, ended up spending hours reviewing, updating and combining the reports, and the submission I prepared for the already extremely busy case team could have 100+ attachments.
To streamline this process, our innovative team developed an application that integrates with Relativity and allows the reviewers to identify report-worthy documents within the database, add a description to each document and provide a general overview of each batch reviewed. These automated reports, which include links to the selected documents within Relativity, are exported each night for the review manager’s easy access and analysis. Each report also includes the date, the number of reviewers in the workspace and the number of batches completed. It lists, by reviewer, each document tagged as an example in the batch, along with the document custodian, a description and any other relevant information as determined by the review manager.
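As a rough idea of what an automated nightly summary can look like under the hood, here is a simplified Python sketch that assembles report lines from reviewer-flagged documents. This is not the LitSmart® application or the Relativity API; the Example record, the link format and the field list are hypothetical stand-ins for illustration.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical stand-in for documents the reviewers flagged as report-worthy;
# a real integration would pull these records from the review platform.
@dataclass
class Example:
    reviewer: str
    batch: str
    doc_id: str
    custodian: str
    description: str

BASE_URL = "https://review.example.com/doc/"   # placeholder link format

def build_daily_report(examples: list[Example]) -> str:
    """Assemble a plain-text daily report with one line per flagged document."""
    header = [
        f"Daily Report: {date.today():%B %d, %Y}",
        f"Reviewers reporting: {len({e.reviewer for e in examples})}",
        f"Batches covered: {len({e.batch for e in examples})}",
    ]
    rows = [
        f"{e.reviewer} | {e.batch} | {e.custodian} | {e.description} | {BASE_URL}{e.doc_id}"
        for e in sorted(examples, key=lambda e: (e.reviewer, e.batch))
    ]
    return "\n".join(header + rows)

if __name__ == "__main__":
    sample = [Example("Reviewer A", "Batch 12", "DOC000123",
                      "J. Smith", "Email discussing pricing changes")]
    print(build_daily_report(sample))
```

The design point is simply that once reviewers tag example documents and type their descriptions inside the database, the report can be generated and exported without anyone re-keying that information into an email.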
The LitSmart® Daily Report streamlines the reporting process for the reviewers as well. The reviewer no longer spends time maintaining a list of “hot” documents to use as examples, imaging documents to attach to the email report and typing the coding information and document description into the email. Instead, the reviewer is able to spend an additional 30 minutes a day actually reviewing and analyzing documents.
The lesson here is that the benefits of substantive daily reports throughout the course of a document review are many. Mandating those reports for our review teams has proven to be well worth the effort. And if you can automate the reporting function altogether, like our team of innovative technical experts has done, that’s even better!
DISCLAIMER: The information contained in this blog is not intended as legal advice or as an opinion on specific facts. For more information about these issues, please contact the author(s) of this blog or your existing LitSmart contact. The invitation to contact the author is not to be construed as a solicitation for legal work. Any new attorney/client relationship will be confirmed in writing.