TECHNO BYTES
Digital casts in orthodontics: A comparison of 4 software systems
Anna Westerlund,a Weronika Tancredi,b Maria Ransjö,c Andrea Bresin,d Spyros Psonis,e and Olof Torgerssonf
Gothenburg, Sweden
Introduction: The introduction of digital cast models is inevitable in the otherwise digitized everyday life of orthodontics. The introduction of this new technology, however, is not straightforward, and selecting an appropriate system can be difficult. The aim of this study was to compare 4 orthodontic digital software systems regarding service, features, and usability. Methods: Information regarding the service offered by the companies was obtained from questionnaires and Web sites. The features of each software system were collected by exploring the user manuals and the software programs. Replicas of pretreatment casts were sent to Cadent (OrthoCAD; Cadent, Carlstadt, NJ), OrthoLab (O3DM; OrthoLab, Poznan, Poland), OrthoProof (DigiModel; OrthoProof, Nieuwegein, The Netherlands), and 3Shape (OrthoAnalyzer; 3Shape, Copenhagen, Denmark). The usability of the programs was assessed by experts in interaction design and usability using the “enhanced cognitive walkthrough” method: 4 tasks were defined and performed by a group of domain experts while they were observed by usability experts. Results: The services provided by the companies were similar. Regarding the features, all 4 systems were able to perform basic measurements; however, not all provided the peer assessment rating index or the American Board of Orthodontics analysis, simulation of the treatment with braces, or digital articulation of the casts. All systems demonstrated weaknesses in usability; however, OrthoCAD and O3DM were considered easier to learn for first-time users. Conclusions: In general, the usability of these programs was poor and needs to be further developed. Hands-on training supervised by the program experts is recommended for beginners. (Am J Orthod Dentofacial Orthop 2015;147:509-16)
a Associate professor, Department of Orthodontics, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden.
b Interaction designer, Essiq Consulting Company, Gothenburg, Sweden.
c Professor, Department of Orthodontics, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden.
d Head of specialist training, Orthodontist, Public Dental Service, Västra Götaland Region, Gothenburg, Sweden.
e Orthodontist, Public Dental Healthcare Service, Västra Götaland Region, Gothenburg, Sweden.
f Associate professor, Department of Applied Information Technology, University of Gothenburg, Gothenburg, Sweden.
All authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest, and none were reported.
Address correspondence to: Anna Westerlund, Department of Orthodontics, Sahlgrenska Academy, University of Gothenburg, Box 450, SE 405 30 Gothenburg, Sweden; e-mail, [email protected].
Submitted, September 2014; revised and accepted, November 2014.
0889-5406/$36.00
Copyright © 2015 by the American Association of Orthodontists.
http://dx.doi.org/10.1016/j.ajodo.2014.11.020

In many areas of health care, there is a shift toward digitization of patient information and data. Orthodontics is no exception. Medical records, x-rays, and photographs are just a few examples. Study models are central to orthodontic diagnosis, treatment planning, and evaluation. The introduction and use of digital models is inevitable in the otherwise digitized everyday life of orthodontics. Easy and effective storage, access,
durability, transferability, and diagnostic versatility have been presented as advantages. Moreover, it is possible to communicate with colleagues and patients using virtual images that can be printed and e-mailed without effort. Clinicians could use the systems for marketing their clinics by showing prospective patients that the office is at the forefront of technology, with no upfront investment required. The systems make it possible to superimpose the casts on other digital casts or to fuse them with digital x-rays and digital photos. The first system for digital cast evaluation was introduced to the market in 2001. Since then, several systems have been marketed. Commercially available digital cast models can be produced by either direct or indirect techniques. Direct methods use intraoral scanners, and indirect methods use either laser scanning or computed tomography imaging of the impressions or plaster models. The scans are then converted into digital images that are stored on the manufacturer's servers. The models are available for downloading by the account holder, and the manufacturer provides software for routine measurements. There is doubt, however, that these systems, viewed on a 2-dimensional computer screen, can provide as much information as hands-on 3-dimensional plaster models for diagnosis, treatment planning, and evaluation.
So far, the following systems have been evaluated almost exclusively regarding reliability and validity: Cecile,1 e-models,2,3 Orametrix,4 OrthoCAD,5-13 DigiModel,14 O3DM,15 and OrthoAnalyzer.16 Studies have demonstrated that even though the reliability of the systems is not ideal for research, it is adequate for clinical use.17 However, the introduction of this new technology is not straightforward, and selecting an adequate system may be difficult. Research has shown that to facilitate the shift in technology, manufacturers need to provide appropriate features and services, and their products should be easy to use. However, a more thorough comparison of various digital software systems used in orthodontics has not been done. The aim of this study was to compare 4 digital software systems regarding (1) service and support provided by the manufacturer, (2) features of the program, and (3) usability of the program.

MATERIAL AND METHODS
The 4 orthodontic digital software systems evaluated were OrthoCAD (version 3.5.0; Cadent, Carlstadt, NJ), O3DM (version 3.2; OrthoLab, Poznan, Poland), DigiModel (version 2.2.1; OrthoProof, Nieuwegein, The Netherlands), and OrthoAnalyzer (version 1.5; 3Shape, Copenhagen, Denmark). Duplicates of pretreatment casts from patients at the Department of Orthodontics at the University of Gothenburg in Sweden were sent to these 4 companies for scanning, so that the models would subsequently be available in digital format on a computer at the clinic.

Information regarding the services provided by the companies (time until delivery, cost, technical requirements, and so on) was obtained from their Web sites. If additional information was needed, the manufacturer was contacted. The features offered by each software system were collected from the user manuals and by browsing the software programs; again, the manufacturer was contacted if additional information was needed.

The “enhanced cognitive walkthrough” method was used to evaluate usability.18 Theoretically, the enhanced cognitive walkthrough focuses on ease of learning by exploration: a user tries to complete a task using a trial-and-error technique. While the user performs the sequence of actions to accomplish the task, the method simulates the user's cognitive processes and determines whether the user's background knowledge, together with hints from the interface, will lead to a correct sequence of goals and actions. Practically, the enhanced cognitive walkthrough is a method in which predefined tasks are performed by domain experts (in this study, A.W., M.R., S.P., A.B.), supervised by evaluators (in this case, experts in human-computer interaction [W.T., O.T.]).
In our study, the 4 domain experts (A.W., M.R., S.P., A.B.) had extensive experience in orthodontic diagnosis and evaluation but otherwise only basic computer knowledge, and they had limited experience with orthodontic measurements using digital analysis. The selection of tasks was based on the intended use, ie, measurements significant for orthodontic diagnosis and treatment planning: opening up a patient file, viewing the casts from different angles, measuring overjet and overbite, and analyzing spaces.

To evaluate usability, a 2-level question process is used. Level 1 addresses the functions, ie, the sequence of actions comprising a task (Fig 1). Level 2 addresses the operations, ie, each step in a function (Fig 2). Level 1 examines the interface's capability to capture the user; level 2 examines the user's ability to perform the function correctly. The analysis starts with the evaluator asking the level 1 questions for the whole function and then continues with the operations. The underlying operations of a given node are analyzed in full before the analysis continues with the next operation (Fig 3).

The questions are answered and graded (grades 1-5), and each grade is justified. The grades represent different levels of success, and the justifications, termed failure or success stories, describe the assumptions underlying the choice of grade: eg, whether the user can understand a text message or a symbol. The grades are intended to reflect the seriousness of the problems in the interface, and this type of grading is used to determine what is most important to modify when redesigning an interface. In the analysis process, each question is answered under the assumption that the preceding questions were answered “yes” (grade 5). A usability problem is the factor that prevents the user from accomplishing the action accurately; a grade of 1 to 4 implies that there is a usability problem, which then needs to be described in a failure or a success story. The problems found are subsequently categorized by type, with a description of the problem and the failure stories. The problems vary with the user interface and the user's task; the different problem types are described in Figures 1 and 2.18-20 In addition, the legibility of each system's logo and its relevance to the system it represents were compared.
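The hierarchy of functions and operations, the 1-to-5 grading, and the justifying failure or success stories can be captured in a simple data structure. The Python sketch below is not part of the enhanced cognitive walkthrough method or of any evaluated program; it is only a minimal, hypothetical illustration of how an evaluator's graded answers might be recorded and how the issues (grades 1-4) could be collected for summaries such as Tables IV through VII. All class names, questions, and grades in the example are invented.

```python
from dataclasses import dataclass, field
from typing import Iterator, List, Tuple

# Hypothetical sketch of recording enhanced-cognitive-walkthrough answers.
# A grade of 5 means "no problem"; grades 1 to 4 flag a usability problem
# that must be justified with a failure (or success) story.

@dataclass
class Answer:
    question: str       # walkthrough question asked by the evaluator
    grade: int          # 1 (serious problem) .. 5 (no problem)
    story: str = ""     # failure or success story justifying the grade

@dataclass
class Node:
    name: str                                       # a function (level 1) or an operation (level 2)
    answers: List[Answer] = field(default_factory=list)
    children: List["Node"] = field(default_factory=list)

def collect_issues(node: Node, path: Tuple[str, ...] = ()) -> Iterator[Tuple[Tuple[str, ...], Answer]]:
    """Walk the function/operation tree and yield every answer graded below 5."""
    here = path + (node.name,)
    for answer in node.answers:
        if answer.grade < 5:
            yield here, answer
    for child in node.children:
        yield from collect_issues(child, here)

# Invented example data for one function and one of its operations:
task = Node("Measure overjet and overbite",
            answers=[Answer("Does the user know that the function is available?", 4,
                            "The cross-section icon is hard to find in the toolbar")])
task.children.append(Node("Position the cross section",
                          answers=[Answer("Will the user understand the text/icon?", 2,
                                          "The slider is unlabeled and placed far from the cast")]))

for location, issue in collect_issues(task):
    print(" > ".join(location), "| grade", issue.grade, "|", issue.story)
```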
RESULTS

The results demonstrated that all companies require alginate or silicone impressions, good-quality bite registrations of wax or silicone, and disposable trays. If needed, a plaster model can be made from the impression at an additional cost.
Fig 1. Schematic illustration of the analysis of a function.
Not all companies offer laboratory integration. Time until delivery of the digital casts and the costs are similar across the systems, and the costs are also similar to that of a regular plaster model (Table I). The instruction materials provided by the companies are summarized in Table II.

All 4 systems can perform basic measurements such as overbite, overjet, tooth size, space analysis, Bolton analysis, and arch length. The systems can show views of the digital cast in different planes; map occlusal contacts; make free measurements from point to point, point to plane, or plane to plane; and measure tooth movements in different planes. However, not all systems provide the peer assessment rating index or the American Board of Orthodontics analysis, simulation of the treatment with braces, or digital articulation of the casts (Table III).

Task 1, opening up a patient file, was done by selecting an option in the main menu bar. All systems except OrthoAnalyzer have the menu option “file,” and OrthoCAD, DigiModel, and O3DM refer to the patient as a “file.” File is a technical term that reflects the underlying software structure, whereas the user's mental model of the task is to access information about a specific patient's model. The number of issues for this task was the least for OrthoCAD and the greatest for OrthoAnalyzer. The issues were mainly of the “text and icon” and “hidden” types.
Fig 2. Schematic illustration of the analysis of an operation.
It could be concluded that for task 1, OrthoCAD, O3DM, and DigiModel were more usable than OrthoAnalyzer (Table IV).

Task 2 was viewing casts from different angles. In all 4 systems, it was possible to manipulate the cast in different ways; the user interfaces provide several manipulation strategies:

1. Selecting icons in a toolbar: the user selects 1 to 6 predefined viewing perspectives (views).
2. Direct manipulation of the cast: the user can rotate and pan the cast.
3. An option in the main menu bar for selecting views and orientations of the cast.
4. A custom solution in the form of a designated view control.
In this study, manipulation of the model through the icons in the toolbar was analyzed. The number of issues for this task was the least for OrthoCAD and O3DM and the greatest for OrthoAnalyzer and DigiModel. The issues were mainly of the “text and icon” and “hidden” types.
It could be concluded that for task 2, OrthoCAD and O3DM were more usable than OrthoAnalyzer and DigiModel (Table V).

Task 3 was measuring overbite and overjet. In the analyzed systems, the position of the longitudinal section of a cast could be adjusted in 1 of 3 ways: by direct manipulation of the cutting plane in the cast (OrthoCAD), by selecting the tooth where the cross section should be positioned (OrthoAnalyzer and O3DM), or by manipulating a slider control (DigiModel). Positioning the slider directly below the cast and making it longer would make the interaction more effective and satisfying; the system could also allow direct manipulation of the cross section. The number of issues for this task was the least for OrthoAnalyzer and the greatest for OrthoCAD and DigiModel. The issues were mainly of the “text and icon” and “hidden” types. It could be concluded that for task 3, OrthoAnalyzer was more usable than O3DM, DigiModel, and OrthoCAD (Table VI).
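For readers less familiar with these measurements, the sketch below illustrates the simple geometry behind task 3 once a sagittal cross section has been positioned: overjet is the horizontal (anteroposterior) distance between the upper and lower incisal edges, and overbite is their vertical overlap. This is a hypothetical illustration only, with invented coordinates; it is not code from any of the 4 programs.

```python
# Hypothetical illustration of the task 3 measurements on a sagittal cross section.
# Each incisal edge is given as (anteroposterior, vertical) coordinates in mm,
# with the anteroposterior axis pointing forward and the vertical axis pointing up.

def overjet_overbite(upper_edge, lower_edge):
    upper_ap, upper_v = upper_edge
    lower_ap, lower_v = lower_edge
    overjet = upper_ap - lower_ap   # how far the upper edge lies anterior to the lower edge
    overbite = lower_v - upper_v    # how far the upper edge extends below the lower edge
    return overjet, overbite

# Invented coordinates read off a virtual cross section:
print(overjet_overbite(upper_edge=(3.5, -2.0), lower_edge=(0.0, 0.0)))  # -> (3.5, 2.0) mm
```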
Fig 3. Diagram with nodes, operations, and functions.
Table I. Service provided by the manufacturers

| | OrthoCAD | DigiModel | OrthoAnalyzer | O3DM |
| Cost | €24 ($30 USD) | €20 ($25 USD) | €30-35 ($36-$43 USD) | €29 ($35 USD) |
| Time until delivery | 48 hours | 24 hours | | 48 hours |
| Requirements | Impression, plaster model, or intraoral scan | Impression or plaster model | Impression, plaster model, or intraoral scan | Impression or plaster model |
| Impression material | PVS | PVS, alginate | PVS, alginate | PVS, alginate |
| Bite registration | PVS | PVS, wax | PVS, wax | Wax |
| Trays | Disposable | Disposable | Disposable | Disposable |
| Plaster/plastic model fabrication | Yes | Yes | Yes | Yes (€18.20-€25.47) |
| Scanning technology | Laser | CT scanner | LED laser | 3D scanner using laser beam |
| Digital model size | 3 MB | 4-5 MB | 3 MB | 3-4 MB |
| Software cost | Free | Free | Free | Free |
| Software size | 8 MB | 74 MB | 650 MB | 38.07-54.10 MB |
| File of digital model | Available up to 14 years | | Always available | Never deleted |
| Laboratory integration | No | Yes | Yes | Yes |
| Compatibility with patient's information system | Yes | Yes | Yes | Yes |

PVS, Polyvinylsiloxane.
Task 4 was space analysis. To calculate this, 2 measurements are required: the length of an ideal arch and the length of the actual dental arch. It is natural for users to expect that these 2 operations comprise the task of space analysis. None of the analyzed systems offers space analysis as a distinct function. All systems, however, allow the user to perform space analysis indirectly, by combining 2 available functions: the function for measuring tooth widths and the function for specifying the ideal arch shape. In this case, the users' mental model (ie, how they picture that the task should be performed) does not match the sequence of actions that the system imposes. The number of issues for this task was the least for OrthoCAD and the greatest for DigiModel. The issues were mainly of the “hidden” and “text and icon” types, but “sequence,” “feedback,” and “user” issues also occurred. It could be concluded that for task 4, none of the 4 systems was more usable than the others (Table VII).
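To make the arithmetic behind this task concrete, the following sketch computes a space discrepancy of the kind these 2 functions are combined for: the space available along the specified (ideal) arch is compared with the space required by the measured tooth widths. The function name and all numerical values are hypothetical and are not taken from any of the 4 programs.

```python
# Hypothetical space-analysis arithmetic:
# discrepancy = space available (length of the specified ideal arch)
#             - space required (sum of the measured mesiodistal tooth widths).

def space_discrepancy(arch_length_mm, tooth_widths_mm):
    required = sum(tooth_widths_mm)
    return arch_length_mm - required

# Invented measurements for one arch segment (mm):
widths = [5.3, 5.9, 6.8, 7.1, 7.0, 6.9, 6.0, 5.2]  # mesiodistal widths from the tooth-width function
available = 48.5                                    # arch length from the ideal-arch function
print(space_discrepancy(available, widths))         # about -1.7 mm; a negative result indicates crowding
```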
Table II. Instruction materials provided by the companies

| System | Size (number of pages) | Content | Step-by-step instruction for a specific task |
| OrthoCAD | 34 | The manual is not about the interface of the system but about integrating the system into the work practice and PVS impressions. | Yes, but only 4 pages describing the process of approval of a cast. |
| DigiModel | Primarily online help divided into chapters that can be browsed (the online user guide could be printed). | Illustration of available functionality, with images and step-by-step instructions about the interface. | No; mostly descriptions of menu bars, panels, and icons, not organized into tasks. |
| OrthoAnalyzer | 114 | Illustration of available functionality, with images and step-by-step instructions about the interface. | Yes. |
| O3DM | 167 | Illustration of available functionality, with images and step-by-step instructions about the interface. | Yes; in the manual, important objects of the user interface are marked in images with red arrows and circles. |

PVS, Polyvinylsiloxane.
Table III. Features provided by the manufacturers

| Feature | OrthoCAD | DigiModel | OrthoAnalyzer | O3DM |
| Virtual setup | Yes | Yes | Yes | Yes |
| Articulation | No | Yes | Yes | Yes |
| Superimposition | No | Yes | Yes | Yes |
| PAR index | No | Yes | No | Yes |
| Indirect bonding | Yes | Yes | Yes | Yes |
| Windows or Macintosh | Windows | Both | Windows | Both |

PAR, Peer assessment rating index.

Table IV. Numbers and types of usability issues identified for each system for task 1, opening up a patient file

| System | User | Text and icon | Hidden | Feedback | Sequence | Total |
| OrthoCAD | 0 | 0 | 0 | 0 | 0 | 0 |
| DigiModel | 0 | 1 | 1 | 0 | 0 | 2 |
| OrthoAnalyzer | 0 | 1 | 1 | 0 | 3 | 5 |
| O3DM | 0 | 1 | 1 | 0 | 0 | 2 |

Table V. Numbers and types of usability issues identified for each system for task 2, viewing casts from different angles (sagittal, frontal, and occlusal)

| System | User | Text and icon | Hidden | Feedback | Sequence | Total |
| OrthoCAD | 0 | 1 | 0 | 0 | 0 | 1 |
| DigiModel | 0 | 8 | 7 | 1 | 0 | 16 |
| OrthoAnalyzer | 0 | 5 | 1 | 1 | 0 | 7 |
| O3DM | 0 | 0 | 1 | 0 | 0 | 1 |

Table VI. Numbers and types of usability issues identified for each system for task 3, measuring overjet and overbite

| System | User | Text and icon | Hidden | Feedback | Sequence | Total |
| OrthoCAD | 0 | 9 | 7 | 2 | 0 | 18 |
| DigiModel | 0 | 10 | 4 | 1 | 2 | 17 |
| OrthoAnalyzer | 0 | 0 | 0 | 0 | 0 | 0 |
| O3DM | 1 | 6 | 2 | 1 | 0 | 10 |

Table VII. Numbers and types of usability issues identified for each system for task 4, space analysis

| System | User | Text and icon | Hidden | Feedback | Sequence | Total |
| OrthoCAD | 0 | 3 | 6 | 4 | 2 | 15 |
| DigiModel | 3 | 4 | 13 | 2 | 2 | 24 |
| OrthoAnalyzer | 0 | 10 | 0 | 2 | 7 | 19 |
| O3DM | 3 | 9 | 3 | 0 | 3 | 18 |

The logo of OrthoAnalyzer has the poorest legibility. The contrast between the light blue background and the white figure is too low to recognize what it represents, and the thick dark frame around the logo causes the elements of the logo to run together and overlap one another. Increasing the space between the elements, strengthening the contrast between the elements and the background, and removing the frame would improve the legibility of the logo. The logo of DigiModel is hard to read as well: it contains too many colors, and the shape is difficult to recognize. Simplifying the symbol would make it more readable. O3DM uses a 3-dimensional model as its logo, which seems relevant since it represents the dental system, and it reads relatively well despite its small size.
The color of the gums makes the logo easy to interpret. The logo of OrthoCAD has the best legibility, with high contrast between the colors and good spacing between the elements. It is relevant to the system it represents, is distinctive, and differs in appearance from the other logos.
DISCUSSION
Digital software systems have been carefully evaluated for accuracy and reproducibility. In this study, we focused on the services the manufacturers provide, the features of the systems, and especially the usability of the systems. These 4 systems were chosen because they are commonly used and represent different geographic areas. The systems provide similar services and features, but they differ in usability.

There are 2 approaches to evaluating usability: empirical and analytical methods.21 Empirical evaluation includes users who perform different tasks, whereas analytical evaluation is performed without users. The method used in this study was the enhanced cognitive walkthrough. This method is intended to find potential usability problems in medical equipment and has been used successfully to assess user interface designs for other medical purposes. It is a further development of the cognitive walkthrough, which had problems such as poor high-level perspectives, insufficient categorization of detected usability problems, and difficulties in overviewing the analytical results.18-20,22 The enhanced cognitive walkthrough, however, has some limitations because it mainly studies the user's ability to learn through exploration and guessability, which are only limited parts of usability.22 The other aspects of usability, as defined by Nielsen,23 comprise memorability, efficiency, error prevention, and satisfaction; these also need to be assessed. Moreover, the analysis is limited because the method is mainly an inspection method and not an empirical one; this may lead to finding more problems than are relevant to a real user.

The analysis of the logos was not part of the enhanced cognitive walkthrough method; however, its results corresponded well to the analysis of usability issues made with the enhanced cognitive walkthrough: OrthoCAD and O3DM had higher usability than did OrthoAnalyzer and DigiModel. Of the 4 programs assessed, OrthoCAD and O3DM were considered easier to learn for first-time users because they supported exploration of the system to a greater extent. We found weaknesses in usability in all systems, meaning that these programs need to be further developed by their manufacturers. (For the detailed evaluation of each program and task, see Supplementary Appendix A.) For now, an expert-guided introduction and supervised hands-on training are recommended for first-time users of all systems.

Changing today's way of working with regular study models (the tactile sense of having them in your hand, and the ability to judge size and distance in natural size
and 3 dimensions) is a challenge to be met. This will probably be easier in the future because today's youth have daily exposure to 3-dimensional digital environments through games and other devices. Compatibility of the software with a small clinic's management software is usually not a problem, but it can be a problem with larger organizations' digital systems. To free the clinic from impressions and casts in the future, a digital management system that is compatible with an efficient intraoral scanner should be selected. Intraoral scanning and digital evaluation are the future of dentistry. Of the 4 systems evaluated in this study, 3Shape (TRIOS scanner) and OrthoCAD (Cadent's iTero iOC intraoral scanner) offer this kind of integrated system to their customers. Manufacturers that provide orthodontists with functional intraoral scanners, as well as an integrated software system that is accurate, reliable, and highly usable, will have an advantage over manufacturers providing only the software system.

CONCLUSIONS
1. All 4 systems were similar regarding services and features.
2. In general, the usability of these programs was poor and needs to be further developed.
3. Hands-on training supervised by experts of the programs is recommended for beginners.
ACKNOWLEDGMENTS
The authors acknowledge the Västra Götaland Region Public Dental Healthcare Service for supporting this study.

SUPPLEMENTARY DATA
Supplementary data related to this article can be found in the online version at http://dx.doi.org/10.1016/j.ajodo.2014.11.020.

REFERENCES

1. Watanabe-Kanno GA, Abrão J, Miasiro Junior H, Sánchez-Ayala A, Lagravère MO. Reproducibility, reliability and validity of measurements obtained from Cecile3 digital models. Braz Oral Res 2009;23:288-95.
2. Horton HM, Miller JR, Gaillard PR, Larson BE. Technique comparison for efficient orthodontic tooth measurements using digital models. Angle Orthod 2010;80:254-61.
3. Stevens DR, Flores-Mir C, Nebbe B, Raboud DW, Heo G, Major PW. Validity, reliability, and reproducibility of plaster vs digital study models: comparison of peer assessment rating and Bolton analysis and their constituent measurements. Am J Orthod Dentofacial Orthop 2006;129:794-803.
4. Torassian G, Kau CH, English JD, Powers J, Bussa HI, Marie Salas-Lopez A, et al. Digital models vs plaster models using alginate and alginate substitute materials. Angle Orthod 2010;80:474-81.
5. Leifert MF, Leifert MM, Efstratiadis SS, Cangialosi TJ. Comparison of space analysis evaluations with digital models and plaster dental casts. Am J Orthod Dentofacial Orthop 2009;136:16.e1-4; discussion 16.
6. Mayers M, Firestone AR, Rashid R, Vig KW. Comparison of peer assessment rating (PAR) index scores of plaster and computer-based digital models. Am J Orthod Dentofacial Orthop 2005;128:431-4.
7. Okunami TR, Kusnoto B, BeGole E, Evans CA, Sadowsky C, Fadavi S. Assessing the American Board of Orthodontics objective grading system: digital vs plaster dental casts. Am J Orthod Dentofacial Orthop 2007;131:51-6.
8. Quimby ML, Vig KW, Rashid RG, Firestone AR. The accuracy and reliability of measurements made on computer-based digital models. Angle Orthod 2004;74:298-303.
9. Santoro M, Galkin S, Teredesai M, Nicolay OF, Cangialosi TJ. Comparison of measurements made on digital and plaster models. Am J Orthod Dentofacial Orthop 2003;124:101-5.
10. Tomassetti JJ, Taloumis LJ, Denny JM, Fischer JR Jr. A comparison of 3 computerized Bolton tooth-size analyses with a commonly used method. Angle Orthod 2001;71:351-7.
11. Zilberman O, Huggare JA, Parikakis KA. Evaluation of the validity of tooth size and arch width measurements using conventional and three-dimensional virtual orthodontic models. Angle Orthod 2003;73:301-6.
12. Costalos PA, Sarraf K, Cangialosi TJ, Efstratiadis S. Evaluation of the accuracy of digital model analysis for the American Board of Orthodontics objective grading system for dental casts. Am J Orthod Dentofacial Orthop 2005;128:624-9.
13. Goonewardene RW, Goonewardene MS, Razza JM, Murray K. Accuracy and validity of space analysis and irregularity index measurements using digital models. Aust Orthod J 2008;24:83-90.
14. Veenema AC, Katsaros C, Boxum SC, Bronkhorst EM, Kuijpers-Jagtman AM. Index of complexity, outcome and need scored on plaster and digital models. Eur J Orthod 2009;31:281-6.
15. Sjögren AP, Lindgren JE, Huggare JA. Orthodontic study cast analysis—reproducibility of recordings and agreement between conventional and 3D virtual measurements. J Digit Imaging 2010;23:482-92.
16. Sousa MV, Vasconcelos EC, Janson G, Garib D, Pinzan A. Accuracy and reproducibility of 3-dimensional digital model measurements. Am J Orthod Dentofacial Orthop 2012;142:269-73.
17. Fleming PS, Marinho V, Johal A. Orthodontic measurements on digital study models compared with plaster models: a systematic review. Orthod Craniofac Res 2011;14:1-16.
18. Bligård LO, Osvalder AL. Enhanced cognitive walkthrough: development of the cognitive walkthrough method to better predict, identify, and present usability problems. Adv Hum Comput Interact 2013;931698.
19. Bligård LO, Osvalder AL. An analytical approach for predicting and identifying use error and usability problem. In: Holzinger A, editor. HCI and usability for medicine and health care. Third Symposium of the Workgroup Human-Computer Interaction and Usability Engineering of the Austrian Computer Society, USAB 2007, Graz, Austria, November 22, 2007. Lecture Notes in Computer Science, vol 4799. Berlin, Heidelberg: Springer-Verlag; 2007. p. 427-40.
20. Bligård LO. Prediction of medical device usability problems and use errors: an improved analytical methodical approach [dissertation]. Gothenburg (Sweden): Chalmers University of Technology; 2007.
21. Cooper A, Reimann R, Cronin D, Noessel C. About face: the essentials of interaction design. Indianapolis: Wiley Publishing; 2014.
22. Mahatody T, Sagar M, Kolski C. State of the art on the cognitive walkthrough method, its variants and evolutions. Int J Hum Comput Interact 2010;26:741-85.
23. Nielsen J. Usability engineering. San Francisco: Morgan Kaufmann Publishers; 1994.