Monitor
Laboratory Automation and Information Management 31 (1995) 69-75

Book Reviews
Good Computer Validation Practices: Common Sense Implementation, by Teri Stokes, Ronald C. Branning, Kenneth G. Chapman, Heinrich Hambloch and Anthony J. Trill. Interpharm Press Inc., Buffalo Grove, IL, USA, $239.00, ISBN 0-935184-55-4
This is a book written by five contributors with wide experience in computer validation, ranging from a regulatory agency inspector through industry experts to consultants. The book consists of 13 chapters with six appendices. It is a good book, recommended to newcomers to the field as well as to people who have already validated computer systems, and the reviewer can only agree with the subtitle of the book: it covers common sense implementation. The book covers a review of the worldwide regulations of the regulatory agencies, the need for management involvement in computer validation, documentation, validation concepts, retrospective validation of existing systems, good practices for central data centres, vendor audits and the organisational implications of validation. The book starts with a chapter, written by Trill, that looks at computer system disasters and discusses how to build quality into a system via the system development life cycle (SDLC). Stokes reviews the worldwide regulations for Good Laboratory Practices (GLP), Good Manufacturing Practices (GMP) and Good Clinical Practices (GCP). There are many similarities between the various approaches, and Stokes concludes that the European Union GMP Annex 11 can be used as a template for system validation across all three regulatory environments. This is followed by a discussion of management and its role in computer validation, which covers the need to develop a corporate validation policy, the suggested sections to be covered and how to implement the policy. Chapman discusses the role of documentation in the validation process. In the opening of this chapter two questions are posed: what documentation is needed for validation? (only that with common sense utility); and how are validation documents best prepared and maintained? (the less paperwork the better). The comment that concise reports, and not "bales of paper", are what is needed is gratifying to all scientists involved in computer validation. Taking the work of Tetzlaff one stage further, Chapman divides the required documentation for computer systems into four sections: instructions, events, reviews and information. The key concepts and terminology are discussed thoroughly in the next chapter. A chapter by Hambloch discusses retrospective evaluation of existing systems. Here, one of the problems with a multi-author book becomes apparent: Chapman on page 87 refers to retrospective validation, Hambloch to retrospective evaluation. To help the reader, more consistent terminology should be used. The chapter describes the process of collecting the data for a retrospective validation well. The chapter refers often to the "Red Apple Document"; although this is correctly referenced at the back of the chapter, it can be misleading to a newcomer to the field if not explained, and does everyone know why this publication is so called? An area that can cause major headaches when considering a validation effort is a central data centre. A chapter by Hambloch discusses the ways to control and document the operations of the
data centre via SOPs; the material is presented well. This is followed by a very useful chapter, by the same author, describing how to audit a software vendor, an activity that is increasingly carried out to ensure that the software one is purchasing is a quality product. The GCP, GLP and GMP specifics for computer systems are outlined in the next three chapters; these are useful because the specific requirements of each area are discussed. Training and organisation for validation are covered by Branning in Chapter 12. Finally, Trill looks at computer systems from a regulatory perspective and covers the concerns with the wide range of computer systems likely to be encountered. A valuable table is Table 13.1, which lists the audit documentation associated with a computer system project; depending on how the system is designed and operated, the list may be modified appropriately. In summary, the book is a very useful addition to the literature and should be on the bookshelf of any organisation or department that has to validate computer systems. The multi-author approach has the advantage of bringing the expertise of several individuals together. However, there are problems with consistent terminology, and there are at least two figures of the same diagram presented in slightly different ways. Notwithstanding these comments, a very good book.

R.D. McDowall
Department of Chemistry, University of Surrey, Guildford, GU2 5XH, UK
LIMS: Implementation and Management, by Allen S. Nakagawa. Royal Society of Chemistry, Cambridge, UK, 180 pages, £37.50, ISBN 0 85186 824 X

This is the fourth book on the subject of Laboratory Information Management Systems. Written by a consultant who specialises in the subject, it is a welcome addition to the literature and well worth a place in your bookcase. Why do you need this book? As pointed out in the preface, the money spent on the implementation of a LIMS usually exceeds the original purchase price of the system. Furthermore, users are often overlooked by the laboratory, information systems personnel and project managers, which can lead to failure. This book's emphasis on the implementation and management of a LIMS means that it fills a niche in the market that is not fully covered by the three existing LIMS books by Hegerty, Mahaffey and McDowall. What do you get for your money? The book is divided into five sections: Introduction to LIMS; Understanding laboratory information flow; Developing an automation strategy; LIMS implementation strategy; and Implementation pragmatics. There are 15 chapters, which are usually short and concise. The text is frequently bulleted for emphasis and there is good use of white space, making the page layouts easy on the eye. After a brief introduction covering a historical perspective, differences
in organisations and a rather thin description of what a LIMS is, the real meat of the book starts with Chapter 3. Scientific data are probably the most complex that a database has to handle; it is therefore important to understand their complexities and interrelationships. Chapter 3 describes techniques for mapping laboratory operations, such as context diagrams (for understanding the relationships of the laboratory with its customers), event diagrams and temporal diagrams. These are unique and are not covered in any other book on LIMS. A chapter on roles and interactions introduces how different laboratories work and recognises that different laboratories require different approaches to a LIMS. The book then moves to customer-supplier relationships and interactions with other interested parties such as the regulatory agencies. The impact of a LIMS on the laboratory and the organisation is examined, together with the nature of change in the ways of working both within and outside the laboratory. This is a very important facet of implementation that is usually ignored and can be the main reason for an implementation failure. I found the quotation on page 79 to be very pertinent: "a LIMS is often viewed as a panacea for all laboratory information management problems and bottlenecks. Its ability to solve existing problems is not inherent in the technology itself; it is a function of how the LIMS is implemented and deployed." The main problem is that we are not thoughtful when implementing a LIMS (it will solve all our problems) and not very innovative in automating