Digital White Papers

July 2013: Knowledge Management

A publication of the International Legal Technology Association

Issue link: https://epubs.iltanet.org/i/143561



KMSTANDARDS IN PRACTICE

Use 2: Content Management

We also use KMS when providing counseling on a group of prior-executed documents for a client. In this case, we are looking to answer questions such as "what did we do before, and how much of it did we do?" If we have worked with a client to execute a number of agreements in the past, and a dispute arises, we want to react quickly. Knowing how an agreement relates to the others is helpful, and getting to the section in question is a bit easier now. For example, across a volume of contracts, we can quickly determine the choice of law, standards of care, geographic coverage, arbitration approaches and so forth. Further, if the client needs a new document of the same type, we want to know what was done in the past, either by our firm or by another firm, and what was used.

We have already had open discussions with a few clients about this use and have shown them demonstrations. They are happy to hear we are trying to tackle their problems in new ways, and it is comforting to them that we are handling their materials with care and through a structured method.

In this instance, we tailor the project a bit differently. The structure might be driven by that of the underlying documents, but often it is driven more by convenience. We will usually design it as a means of responding to frequent questions, issues or problems. If a section is always under the microscope, it will be pushed to the top of the structure, even if it is one of the last sections referenced in an actual agreement.

Use 3: Benchmarking

The third major use centers on KMS's benchmarking functionality. After a project has been created, a new document — either drafted internally, by the client or from a vendor paper — can be compared to the whole. This is somewhat like redlining on steroids. From a setup perspective, the document structure does not matter: you can copy and paste text and be on your way. The tool tells you several things:

• Whether the compared document has each section found in the corpus
• If there is a match, how far the compared document deviates from the baseline produced by the index
• Which sections are missing, highlighted in an easy-to-scan way

The exclusion of a clause may be intentional, but it is convenient and helpful to have that information front and center before completing a document.

This process can appear almost magical, but we have learned not to oversell it when demonstrating. It is another way to mitigate risk, not a replacement for attorney review. Our hope is to make review a bit quicker and to catch things humans might not. But it is not a push-button solution. We have started building out general and client-specific projects for this purpose. Where we have a lot of throughput, from a drafting or review perspective, a project of this kind can yield substantial value.

GETTING FROM A TO Z

For all three uses, we collect over 40 representative documents. The entire tool is driven by various formulae, and sample size is important for any trends to be relevant. In fact, 40 is fewer than we would like, and there may be times when a larger set is needed; at present, 40 is our starting point.

Methods of collection are important. Where possible, we lean heavily on our attorneys. That may sound inefficient, especially given that we have had enterprise search for many years, through which we can find samples fairly easily. But for attorneys to rely on the output, they need to have faith in the input. Garbage in, garbage out, as
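To make the benchmarking idea concrete, here is a minimal sketch of a section-by-section comparison against a baseline. KMStandards' actual method is proprietary and not described here; the section names, baseline clauses, draft text and the choice of `difflib` similarity as the deviation measure are all invented for illustration.

```python
# Hypothetical sketch of benchmarking a draft against a clause baseline.
# All section names and clause text below are invented examples.
from difflib import SequenceMatcher

# Baseline clauses distilled from a (hypothetical) corpus of prior agreements.
baseline = {
    "choice_of_law": "This Agreement shall be governed by the laws of the State of New York.",
    "arbitration": "Any dispute arising under this Agreement shall be settled by binding arbitration.",
    "standard_of_care": "Provider shall perform the Services with reasonable care and skill.",
}

def benchmark(document_sections):
    """Compare a new document's sections to the baseline.

    Returns (missing, deviations): baseline sections absent from the
    document, and a deviation score per present section
    (0.0 = identical to baseline, 1.0 = completely different).
    """
    missing = [name for name in baseline if name not in document_sections]
    deviations = {}
    for name, text in document_sections.items():
        if name in baseline:
            similarity = SequenceMatcher(None, baseline[name], text).ratio()
            deviations[name] = round(1.0 - similarity, 2)
    return missing, deviations

# A draft that drops the arbitration clause and tweaks the choice of law.
draft = {
    "choice_of_law": "This Agreement shall be governed by the laws of the State of Delaware.",
    "standard_of_care": "Provider shall perform the Services with reasonable care and skill.",
}

missing, deviations = benchmark(draft)
print(missing)      # sections excluded from the draft
print(deviations)   # how far each present section drifts from the baseline
```

Run against the sample draft, the sketch flags the missing arbitration clause and reports a small deviation for the reworded choice-of-law section, which is the "front and center" information described above.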
