Shaping computer-based support for curriculum developers

Computers & Education 50 (2008) 248–261 www.elsevier.com/locate/compedu

Susan McKenney *

Department of Curriculum, Faculty of Behavioral Sciences, University of Twente, P.O. Box 217, 7500 AE Enschede, The Netherlands

Received 23 March 2006; received in revised form 26 May 2006; accepted 29 May 2006

Abstract

CASCADE-SEA stands for computer-supported curriculum analysis, design and evaluation for science education in Africa. It is the name of a computer program designed to help secondary level science teachers in southern Africa create exemplary paper-based lesson materials. Research conducted alongside the design and development of the CASCADE-SEA system explored how to shape performance support for teachers who are engaged in the complex task of developing exemplary lesson materials. After a brief examination of the study's inception, the theoretical underpinnings are offered, followed by a description of the research design. Thereafter, the software is briefly described before the findings are addressed. Design principles for constructing performance support of this nature are offered in the conclusion.

© 2006 Elsevier Ltd. All rights reserved.

Keywords: Performance support; Curriculum development; Design principles

* Tel.: +31 53 489 2890; fax: +31 53 489 3759. E-mail address: [email protected]

0360-1315/$ - see front matter © 2006 Elsevier Ltd. All rights reserved. doi:10.1016/j.compedu.2006.05.004

1. Introduction

Ever since the 1980s, when personal computer use started to become ubiquitous in professional settings, exploration has been underway into how the computer might support the performance of various tasks. In the field of education, many tasks are complex, and numerous individuals and organizations stand to benefit from carefully constructed performance support systems. One complex task that remains integral to most educational initiatives, in one form or another, is curriculum development. Over the last 15 years, many tools have been developed to support the complex process of curriculum development (Grabinger, Jonassen, & Wilson, 1992; Gustafson & Reeves, 1990; Nieveen, 1997; Nieveen & Gustafson, 1999; Rosendaal & Schrijvers, 1990; Wilson & Jonassen, 1991; Zhongmin & Merrill, 1991). In an effort to learn more about the potential of computers to aid in this process, the Department of Curriculum within the Faculty of Educational Science and Technology at the University of Twente conducted a study in collaboration with the Dutch National Institute for Curriculum Development. From this study, Nieveen (1997) concluded that the use of a tailor-made performance support system for the formative evaluation of lesson materials could: improve consistency of evaluation plans; motivate developers; save time; and support decision-making. These promising findings prompted a small-scale exploratory study that demonstrated how a similar system could be useful in settings outside the context for which the program was originally developed.

At the same time, collaboration between staff from the University of Twente and various educational improvement initiatives in southern Africa was on the rise. Many of the joint projects involved the development of high-quality materials that offer concrete guidance for implementing new or challenging curricula. Such materials, hereafter referred to as exemplary materials, were sometimes created by project staff alone; but in many cases, local teachers also contributed to the materials development process. Dialogue with international colleagues – from curriculum development programs in which teachers helped create exemplary materials – led to the decision to explore how to shape a performance support system that could address some of the challenges faced by exemplary materials developers in southern Africa. This exploration took place through the design, development and formative evaluation of the CASCADE-SEA program: software to help teachers build good quality teacher guides (to be used by other teachers) and to facilitate the professional development of its users by providing support to better understand and visualize the curriculum development process. Throughout the 4 years of design and development of the CASCADE-SEA system, various aspects of the software were studied (e.g., program content, technical and interface issues, performance support; for information on other aspects studied, please see McKenney & van den Akker, 2005); this article reports on the research related to the performance support embedded in CASCADE-SEA.

2. Theoretical foundations

The CASCADE-SEA program was designed to support both teacher professional development and curriculum development at a very practical crossroads: the creation of exemplary lesson materials. Reflective practice has long held a place in the domain of professional development, and is considered by many to be a key ingredient in inservice education (Fullan, 1991; Loucks-Horsley, Hewson, Love, & Stiles, 1998). By reflecting on one's own ideas regarding good teaching practice and making those thoughts explicit for others (for example, in the form of exemplary teacher guides), teacher development takes place. The resulting materials represent a form of curriculum development; and when teachers use exemplary lesson materials to inspire, support or advance their daily practice, curriculum implementation is often improved while further professional development may be stimulated.

Particularly in developing nations, lesson materials play a crucial role in curriculum development (de Feiter, Vonk, & van den Akker, 1995; McKenney & van den Akker, 1998). While some countries are engaged in replacing out-dated or irrelevant materials, others strive to fill a profound void of teaching resources (Caillods, Göttelmann-Duret, & Lewin, 1996). The notion of engaging teachers in materials development as an effective form of inservice is widely advocated (Ball & Cohen, 1996; Ben-Peretz, 1990; de Feiter et al., 1995); and it is this practice that the CASCADE-SEA program was designed to support.

Research activities embedded in the design and development of CASCADE-SEA were structured to contribute to a growing body of knowledge in the domain of performance support. Earlier work in the field of performance support was dominated by an orientation toward 'proof of concept' thinking, as evidenced in the literature that populated journals at that time (for an overview of EPSS-related literature from 1989 to 1995, please refer to Hudzina, Rowley, & Wagner, 1996).
Here, emphasis was given to defining the field (cf. Gery, 1989, 1991) and discussing ways of exploring it (Pirolli & Russel, 1990; Stevens & Stevens, 1995). As the field grew, a trend rapidly emerged in which user performance became central, with the supporting systems on the periphery (Rosenberg, 1995; Winslow & Bramer, 1994); hence the field of performance-centered design (PCD) was born. This gave rise to articulation of fundamental forms of support (Gery, 1995), attributes and behaviors of performance-centered systems (Gery, 1997) and even methodologies for conducting PCD (Raybould, 2000). Advocates of performance support presume several advantages, including improved task performance (Collis, 1994; Gustafson & Reeves, 1990; Nieveen, 1997), organizational learning (Flechsig, 1989; Stevens & Stevens, 1995) and individual growth (McKenney, Nieveen, & van den Akker, 2002). Using contemporary insights from this field, the CASCADE-SEA system was designed to help improve task performance by aiding users in creating high quality materials, and to foster organizational and


Fig. 1. Core elements of task support defined in the CASCADE-SEA study.

individual learning by emphasizing and maximizing those professional development opportunities that have a natural association with the materials development process.

Performance-centered systems reach the aforementioned goals by aiding users at the moment of need (cf. Collis, 1994; Gery, 1991), through various means. Different experts have distinguished support types, including advisory systems, information bases, learning experiences, job aids and communication aids. In the CASCADE-SEA study, the notion of performance support was characterized by four main components, based on a synthesis of those advocated by experts (Bastiaens, 1997; Collis & Verwijs, 1995; Gery, 1995, 1997; Nieveen, 1997; Raybould, 1990; Stevens & Stevens, 1995): advice, tools, learning opportunities and communication aids, shown in Fig. 1. Advice refers to either generic tips for executing a particular task or tailor-made guidelines based on what the system knows about a user's specific needs and context. Tools may be located outside the software (external programs) or inside the program, e.g., in the form of templates (pre-structured forms that the user need only fill in to use) or checklists (lists of things to do or consider). Learning opportunities stimulate users to extend their existing procedural or conceptual knowledge, and can be offered explicitly (for example, in the form of a tutorial or a help file) or implicitly throughout the system (for example, by structuring activities in certain ways). Finally, communication aids facilitate or stimulate dialogue in written or verbal forms, in real time or asynchronously.

Performance support research was conducted during the design and development of CASCADE-SEA with two main purposes. First, formative evaluation of individual prototypes facilitated improvements in the CASCADE-SEA system itself.
Second, comparing and contrasting data across prototypes and with other tools and research findings helped to generate broader design principles for creating performance support tools. The following section describes how this dual orientation was integrated into the research design.

3. Design of the study

As mentioned, previous exploration into the computer's potential for offering added value to curriculum development has produced promising results (Nieveen, 1997), particularly with regard to the creation of classroom materials (Nieveen & van den Akker, 1999). The research conducted throughout the design and development of CASCADE-SEA builds on existing knowledge in this domain. The main research question guiding the performance support study was: "What are the characteristics of a valid and practical performance support tool that can help teachers grow professionally and advance curriculum implementation within the domain of secondary level science and mathematics materials development in southern Africa?"


3.1. Research approach

As mentioned in the previous section, the performance support study intended to provide revision guidelines for the CASCADE-SEA program during evolutionary prototyping, as well as more general design principles for shaping performance support. Given these goals, the CASCADE-SEA study was structured as design research, in accordance with the definition of Barab and Squire (2004, p. 2): "a series of approaches, with the intent of producing new theories, artifacts, and practices that account for and potentially impact learning and teaching in naturalistic settings". This approach was selected because of the opportunities it affords to study progressive approximations of ideal interventions in their target settings, while also emphasizing articulation of the principles that underpin their design (cf. Collins, Joseph, & Bielaczyc, 2004; van den Akker, 1999; van den Akker, Gravemeijer, McKenney, & Nieveen, 2006). During the cyclic process of design, formative evaluation and revision, CASCADE-SEA prototypes were used as a vehicle to learn about shaping support within a valid and practical performance support system for use by southern African science and mathematics teachers engaged in creating low-cost, reprintable materials.

The criteria of validity and practicality were defined as follows. A scientifically valid performance support system would (a) use state of the art knowledge; and (b) evidence internal consistency. A practical performance support system would (a) provide clear procedural specifications for performing the task at hand; (b) be congruent with the needs and wishes of the users in the target setting; and (c) offer support at an acceptable cost (i.e., the ratio of time, effort and financial resources that must be invested in order to gain returns in time, effort, satisfaction, learning and recognition).

3.2. Data collection

The performance support study took place alongside the 4 years of CASCADE-SEA program design and development.
The research was divided into three main phases: needs/context analysis; design/formative evaluation of prototypes; and a semi-summative evaluation. The analysis phase contained two cycles, one featuring literature review and another featuring site visits to gather design specifications. The design and development phase contained four cycles, each focusing on the formative evaluation of a (revised) prototype. Because the CASCADE-SEA study focused on program design and development, but not implementation, the final evaluation was relatively limited and contained both formative and summative aspects, the former being most relevant to shaping performance support. Each cycle in the study consisted of several data collection circuits; each circuit used one of the following four strategies: developer screening, expert appraisal, micro-evaluation or full-scale tryout. In total, 33 circuits of data collection took place, gathering responses from approximately 500 respondents (users and experts) through interviews/walkthroughs, questionnaires, focus group discussions, observations, logbooks and document analysis. Fig. 2 illustrates the performance support data collected per phase and cycle. Cell shading indicates how many data circuits addressed performance support: white = none; light gray = half or less; dark gray = more than half.

Fig. 2. Performance support data per phase and cycle. Abbreviations: SAK, state of the art knowledge; INC, internal consistency; PS, procedural specifications; CON, congruence; COS, cost.


The shading shows a gradual shift in emphasis from validity aspects earlier in the study toward practicality aspects later on. This is a logical consequence of the fact that the analysis phase relied more on literature review, whereas the design phase featured formative evaluation of working prototypes. For specific information on the cycles and circuits, including the instruments used per circuit, please refer to McKenney (2001).

4. About the software

As previously mentioned, the performance support study was integrated into the trajectory of the CASCADE-SEA program's design and development. To facilitate understanding of the findings, a limited description of the final version of the software is given here. The CASCADE-SEA program consists of two elements: a CD-ROM (or 16 diskettes) and a website. Although the number of CASCADE-SEA users with Internet access is rapidly increasing, many still work with the system in an off-line setting. For this reason, the website was created as a supplement to the main program.

The CASCADE-SEA main program walks users through the complex process of curriculum materials development. The program asks users to consider what they would like to achieve (why they are making materials, and what kinds of materials would be useful for that particular setting). If the developer already has a basic rationale in mind, then the computer helps to make this explicit and generates a 'Rationale Profile' that may then be used in discussion with co-developers. Should users have difficulty determining key issues related to the materials they are about to develop, CASCADE-SEA will recommend that the analysis section be visited. In the analysis portion of the program, support is offered in conducting a materials needs and context analysis, which (when completed) will then aid in elaborating the rationale. Once the user has generated sufficient specifications regarding the kind(s) of materials to be developed, the design phase supports the creation of these materials. It helps the user to map out a lesson series, build individual lessons and think about the layout, form and style to be applied. For users who have completed some of the development, ranging from rationale formation to a complete lesson series, support is also available for conducting a formative evaluation of that which has been designed so far. The evaluation component is heavily based on the original CASCADE program (Nieveen, 1997), although it has been translated in terms of both language and context. Fig. 3 shows

Fig. 3. ‘Create Plan’ page from the ‘Make Materials’ section in CASCADE-SEA’s design area.


the 'Create Plan' page from the design area of the CASCADE-SEA program. For additional information on the program content, please see McKenney (2001, 2005).

5. Findings

This study set out to investigate the characteristics of a valid and practical performance support tool that can help teachers grow professionally and advance curriculum implementation within the domain of secondary level science and mathematics materials development in southern Africa. The CASCADE-SEA system was designed to maximize two potential benefits of performance support: (1) improving task performance (by helping teachers to create good quality materials); and (2) contributing to professional learning (particularly by stimulating reflection during the materials creation process). Data collected during the design and development of the CASCADE-SEA system helped inform the evolution of prototypes and contributed to the generation of more generic design principles for tools of this nature. In particular, the research offered insights on how to shape and improve the validity and practicality of system support. Because the findings that influenced CASCADE-SEA's design are too numerous to mention in full, this section illustrates how data from the first two phases (analysis and design) helped shape the performance support within the final version of the program. Thereafter, specific performance support characteristics are addressed.

5.1. Validity

As mentioned earlier, a valid performance support system would contain state of the art knowledge. In the case of CASCADE-SEA, such knowledge pertains to advice on materials design as well as guidance on embedding materials in professional development activities. Participants evaluating both the first and the second prototypes offered suggestions on how to help users improve the pedagogical quality of their lessons.
Such ideas shaped both the overt support offered (e.g., choice menus) in the lesson design area and the implicit support given through the structure of the lesson template (which contained these headings: summary, preparation, lesson body, conclusion, teacher notes). In addition to integrating state of the art knowledge, tips, guidelines, templates, advice and help functions were to be offered in an internally consistent fashion. Participants recommended that each main phase in the program produce some kind of tangible output, and that such an output record user decisions, represent them in another format and offer opportunity for changes, updates or tailoring. Each main component in CASCADE-SEA now features such documents (rationale profile, analysis plan, lesson plans and evaluation plan) as well as guidelines on how and why to customize them.

5.2. Practicality

One aspect of practicality pertains to the way the system provides clear procedural specifications for performing the task at hand. In CASCADE-SEA, this means describing program content and use clearly and concisely. Since English is not the first language of most users, some participants found the level of English used in the program to be challenging. One response to this finding was the creation of an interactive agent (Kasey, the bird) who, among other functions, offers clarification of difficult words in each area of the program. Another element of practicality relates to congruence: the relevance and usability of the support offered. During the analysis phase, participants identified the most promising setting for use of the proposed program: teacher resource centers (TRCs). The system was, therefore, designed not for individual use, but for shared use by design teams whose work is guided by a TRC facilitator.
Throughout design and development, attention was given to maximizing the potential of a shared resource (e.g., by asynchronous sharing through the database and by targeting small teams of designers, not individuals). Finally, the support within CASCADE-SEA was to be extensive enough that the threshold of investment cost to the user would be attractive (or at least acceptable). Participants emphasized the importance of subject-specific support. In order to provide tailor-made support without the associated risks of rigidity, the CASCADE-SEA program offers generic guidelines, illustrated through subject-specific examples (e.g., sample lesson series goals, concept map templates). Participants found this balance to be a useful start, and requested more of the same type of support. As a result, ready-made subject-specific documents were incorporated into each main area of the program (along with recommendations on how to customize them).

5.3. Performance support

During the final evaluation phase, participants found CASCADE-SEA to be satisfactorily valid and practical. Additionally, they indicated that the system possessed the potential to help users create better quality materials than they would without the aid of the computer, and to learn from the process. Detailed information on the final evaluation of the CASCADE-SEA program as a whole is available elsewhere (McKenney, 2001, 2005). Because the final evaluation was less focused on fine-tuning the program, few findings from this phase contributed to improved understanding of performance support in general, or in the CASCADE-SEA program in particular.

Having concluded that CASCADE-SEA may be considered valid and practical, and having illustrated several ways in which these criteria were met within the system, the following section examines characteristics of CASCADE-SEA's performance support. In accordance with the guidelines for task support defined previously (Fig. 1), this section is arranged by support element: advice, tools, learning opportunities and communication aids. As with the validity and practicality findings, the following set is not intended to offer a comprehensive listing of system support, but to clarify how performance support can take shape in a functioning program.

5.3.1. Advice

As previously stated, advice may be tailor-made or generic. Examples of tailor-made advice given within the CASCADE-SEA program include consequence warnings, helping set priorities and offering pro- or reactive feedback. In the CASCADE-SEA program, tailor-made advice is also given through reminders. For example, as illustrated in Fig. 4, arrows remind users what instruments they will need to conduct their needs/context analysis. Additionally, consistency checks are in place throughout the program, whereby illogical steps/options are visible but disabled (e.g., users can see evaluation approaches but only edit them once

Fig. 4. ‘Conducting Observations’ page in analysis area.


selected for use). CASCADE-SEA further provides heuristics, e.g., through design recommendations in the rationale section that are based on (combinations of) specific user input.

Generic advice is also offered through tips, recommended practices, predictions and decision-making support. For example, CASCADE-SEA's help files contain literature listings of the sources used to build each component (this pertains primarily to the content area of curriculum materials development). Further, examples are offered throughout the system, such as the ready-made concept maps demonstrating lesson series content that are offered in the design area. Additionally, printable suggestions are offered for how to carry out analysis or evaluation data collection (see also Fig. 4). Finally, sample choices are presented to the user, as in the design area where default (editable) goals are pre-loaded into the user's text entry field, based on the subject and topic selected in the rationale.
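The mechanism just described, pre-loading editable default goals based on the subject and topic the user chose earlier, can be sketched as follows. This is a hypothetical Python illustration only: the function name, dictionary contents and sample goals are invented for this sketch and are not taken from the CASCADE-SEA software itself.

```python
# Hypothetical sketch of tailor-made default advice: editable goal
# suggestions keyed by the user's earlier subject/topic selections.
# All names and sample goals below are invented for illustration.

DEFAULT_GOALS = {
    ("biology", "photosynthesis"): [
        "Learners can describe the inputs and outputs of photosynthesis.",
        "Learners can relate light intensity to the rate of photosynthesis.",
    ],
    ("mathematics", "fractions"): [
        "Learners can add and subtract fractions with unlike denominators.",
    ],
}

GENERIC_TIP = "State each goal as an observable learner behaviour."

def suggest_goals(subject: str, topic: str) -> list:
    """Return editable default goals (tailor-made advice) for the
    subject/topic chosen in the rationale, or a generic tip otherwise."""
    goals = DEFAULT_GOALS.get((subject.lower(), topic.lower()))
    if goals:
        return list(goals)   # return a copy: the user may edit freely
    return [GENERIC_TIP]     # fall back to generic advice

print(suggest_goals("Biology", "Photosynthesis")[0])
```

The fallback branch reflects the distinction drawn above between tailor-made and generic advice: where the system knows nothing specific about the user's context, only a generic tip can be offered.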

5.3.2. Tools

Internal tools help users prepare and/or automate the execution of (sub)tasks. They also regularly summarize user actions. In CASCADE-SEA, templates are offered, such as research instruments for use as is or in altered form; the program also includes suggestions for customization. Checklists are offered to the user after drafting analysis, evaluation and lesson plans, with hints on where to target polishing efforts. Decision and discussion charts are offered at the end of the analysis and evaluation components to help users visualize their process and better understand the task at hand. In terms of automated support, CASCADE-SEA warns users who are about to replace existing documents, saves automatically every 10 min to the archival structure defined at the start of a project and copies all accessed documents (instruments, guidelines, profiles, etc.) into the project folder (archives) for easy access. CASCADE-SEA generates (draft) products in each main area of the program, as illustrated in Fig. 5. The system also offers personal reminders, e.g., a message box appears with personal suggestions (based on specific user input) when a rationale profile is generated. Further, histories (navigation, use, time, etc.) are readily available through the recent pages pull-down menu. The database of existing lesson plans, activities and media may be accessed from within the design area of the system, or directly from CASCADE-SEA's toolbox. User selections are collected and placed into a 'Toolbox File' that may be exported and edited (see Fig. 5).

Fig. 5. Draft product from 'Toolbox File' being edited in word processor.

External tools help extend (sub)tasks and are also offered through CASCADE-SEA's toolbox, which includes external programs for drawing, editing text and creating concept maps (see also Fig. 5). In addition, the CASCADE-SEA support website contains information as well as links to other useful sites relating to curriculum development and subject matter content. The website also connects users to an additional database with lesson activities, completed plans and media.
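Two of the automated behaviours described above, copying accessed documents into the project archives and warning before an existing document would be replaced, can be sketched in a few lines. This is a minimal illustration under invented names (the function, folder layout and error-signalling convention are assumptions for this sketch, not CASCADE-SEA's actual implementation):

```python
# Hypothetical sketch: archive accessed documents into the project
# folder and warn before replacing an existing document. Paths and
# the function name are invented for illustration.
import shutil
from pathlib import Path

def archive_document(doc: Path, project_dir: Path, overwrite: bool = False) -> Path:
    """Copy an accessed document into the project's archive folder.

    Raises FileExistsError (standing in for the 'about to replace'
    warning) unless the caller explicitly confirms overwriting.
    """
    archive = project_dir / "archives"
    archive.mkdir(parents=True, exist_ok=True)  # archival structure per project
    target = archive / doc.name
    if target.exists() and not overwrite:
        raise FileExistsError(f"{target} already exists; confirm overwrite")
    shutil.copy2(doc, target)  # preserve timestamps along with content
    return target
```

The periodic autosave mentioned in the text would sit on top of a routine like this, e.g. a timer invoking it every 10 min; that scheduling detail is omitted here.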

5.3.3. Learning opportunities

Implicit learning opportunities include advance organization and setting up paths that match the desired flow of work. In CASCADE-SEA, the visual appearance suggests a method for doing (sub)tasks, as the main menu and sub-screens offer (but do not strictly enforce) a performance sequence; see the left navigation bar in Figs. 3, 4 and 6. Further, previews (e.g., of drafted documents) help users consider whether or not to export them to a word processor for editing. The interactive agent (bird metaphor) named Kasey signifies the voice of CASCADE-SEA, and offers context-sensitive help. Finally, user performance is lightly monitored, through the recommendations for analysis and evaluation approaches that are based on user input.

Explicit learning opportunities are also offered within CASCADE-SEA. For example, answers to (procedural or conceptual) questions (who, what, when, where, how, why) are provided in most areas of the program. Demonstrations are available (e.g., a navigation tutorial shows how to move around the program), as are explanations (extra information located behind buttons on most pages). Further, explicit learning takes place through the instructions and information given at the top of each page, and users can visualize their location in the curriculum materials development process through the "Where am I?" button located on nearly every page. Users receive evaluative feedback when they attempt to provide answers that are inconsistent with other information they have supplied to the computer. And instructions for certain tasks (e.g., planning and hosting a workshop to introduce materials – see Fig. 6) are provided along with templates and samples.
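The "Where am I?" idea, mapping the page a user is viewing onto its place in the materials-development process, can be sketched as follows. The four stage names follow the program components named in this article; the page-to-stage mapping and function name are invented for this illustration and do not reproduce CASCADE-SEA's internals.

```python
# Hypothetical sketch of the "Where am I?" support: locate the
# current page within the four-stage development process. The
# page identifiers below are invented for illustration.

STAGES = ["rationale", "analysis", "design", "evaluation"]

PAGE_TO_STAGE = {
    "teaching_methods": "rationale",        # cf. Fig. 7
    "conducting_observations": "analysis",  # cf. Fig. 4
    "create_plan": "design",                # cf. Fig. 3
}

def where_am_i(page: str) -> str:
    """Return a short orientation message for the given page."""
    stage = PAGE_TO_STAGE.get(page)
    if stage is None:
        return "Location unknown - consult the navigation tutorial."
    step = STAGES.index(stage) + 1
    return f"You are in the {stage} component (step {step} of {len(STAGES)})."

print(where_am_i("create_plan"))
```

Keeping the stage sequence in one list is what lets the message suggest a performance order without strictly enforcing it, matching the navigation behaviour described above.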

Fig. 6. Workshop planning guidelines.


Fig. 7. ‘Teaching Methods’ page from values area of rationale.

5.3.4. Communication aids

Aids to stimulate communication within the CASCADE-SEA program are intended for real-time as well as asynchronous use. Generally speaking, they aim to facilitate communication within teacher design teams, though a few provide links to external expertise. The shared knowledge base/database on the website is one example: it provides lesson plans, activities and media, and the resource grows as users contribute to it. The CASCADE-SEA website also hosts a bulletin-board style discussion group, and is linked to pages with relevant listservs and chat rooms. Verbal communication is stimulated because it can help foster reflection. Within a team it can spark discussion and debate, and when a mentor is present, it can help to provoke coaching. CASCADE-SEA offers checklists that directly stimulate communication (e.g., the analysis section guidelines area includes a checklist for working in teams) as well as provocative elements that indirectly do so (e.g., position statements in the values area of the rationale section stimulate group discussion); see Fig. 7.

6. Discussion and conclusions

Throughout the analysis and design phases, participants offered suggestions for improvements. These suggestions were used to learn about and shape valid and practical support in the forms of advice, tools, learning opportunities and communication aids. Where feasible and desirable, these recommendations were integrated into the final version of the software, although not before 'it can be done, but should it be done?' deliberations (cf. Miller, 1997) were addressed by the program developers. As mentioned previously, the final evaluation of CASCADE-SEA was relatively limited and contained both formative and summative aspects. In terms of system refinements, the majority of improvement recommendations concerned the deepening of existing elements.
Toward elaboration and improvement, it could be quite useful to seek out compatible components already up and running in the software industry, rather than trying to ‘reinvent the wheel’. Based on those findings, the following list offers four extensions of the support elements that should be considered in future versions of this software:


• Advice: To demonstrate promising practices, the software should contain copious samples of CASCADE-SEA-created products, such as polished rationale profiles and customized evaluation plans. While many sample products are available within the system, few model the conceptual processes that take place when customizing computer-generated drafts.

• Tools: The addition of assessment-building software to aid in the development of assessment strategies and materials could lend a much-needed boost to the assessment area of the design component.

• Learning opportunities: Partnering with other organizations that produce or manage lesson materials (or parts thereof) could allow for significantly more examples to be included along with the program.

• Communication aids: Enhanced integration of database connectivity could allow for additional opportunities to share and learn from good illustrations, particularly in the analysis and evaluation (research) phases. For example, CASCADE-SEA already offers sample instruments to support the user in analysis and evaluation activities. But if linked to a database through which the instruments were generated, two additional advantages would arise: (1) by selecting the desired fields, the program could generate tailor-made schemes; and (2) by having the source items in the database, the system could also offer assistance in processing the data once it is collected.

During this study, four software companies and one publisher agreed to the open distribution of their materials. Given the goal of supporting educational improvement initiatives in developing countries, additional collaboration seems a likely possibility for those organizations with a philanthropic history.
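The database-linked extension proposed under communication aids can be sketched to make the two claimed advantages concrete: selecting fields yields a tailor-made observation scheme, and holding the source items enables simple processing of collected data. Everything here is a hypothetical illustration of the proposal; the item bank, field names and functions are invented, not part of CASCADE-SEA.

```python
# Hypothetical sketch of the proposed extension: generate a
# tailor-made observation scheme from user-selected fields, and
# assist in processing the collected data. The item texts and
# field names are invented for illustration.

ITEM_BANK = [
    {"field": "learner participation",
     "item": "Learners ask questions during the activity."},
    {"field": "learner participation",
     "item": "Learners work in pairs or small groups."},
    {"field": "use of materials",
     "item": "The teacher guide is visibly consulted."},
]

def generate_scheme(selected_fields):
    """Advantage (1): a draft scheme containing only the items
    whose field the user selected."""
    return [rec["item"] for rec in ITEM_BANK if rec["field"] in selected_fields]

def tally(responses):
    """Advantage (2): count 'observed' marks per item once the
    data have been collected (responses are (item, observed) pairs)."""
    counts = {}
    for item, observed in responses:
        counts[item] = counts.get(item, 0) + (1 if observed else 0)
    return counts
```

Because the scheme and the tally share the same item records, the system could move from instrument generation to data processing without re-entry of items, which is the point of the proposed database linkage.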
While this study has shown that the computer does have the potential to support the complex process of curriculum development, the resulting advice, tools, learning opportunities and communication aids can still be greatly improved, particularly by partnering with producers of relevant resources. Other improvements to the existing program might benefit less from the tangible products of others and more from theoretical ones. For example, building better, more intelligent support for designing lessons according to particular teaching/learning styles presents quite a challenge. Reigeluth (1999) tackled this challenge and, together with a class of graduate students, created a 'Teacher Toolkit' designed for exactly this purpose (Reigeluth, 2000). Perhaps the insights from projects like this one could be integrated into future versions of CASCADE-SEA. The ideas underpinning promising designs such as the 'Teacher Toolkit' can contribute to knowledge building and help other designers generate ideas to use in their own settings. When made explicit and shared with a wider audience, such insights are referred to as design principles (Linn, Davis, & Bell, 2004; van den Akker, 1999), domain theories (Edelson, 2006), heuristics (DBRC, 2003) or lessons learned (Vanderbilt, 1997). While the format may vary, design principles generally offer the kinds of heuristic guidelines described by van den Akker (1999, p. 9): ''If you want to design intervention X [for purpose/function Y in context Z]; then you are best advised to give that intervention the characteristics C1, C2, . . ., Cm [substantive emphasis]; and do that via procedures P1, P2, . . ., Pn [procedural emphasis]; because of theoretical arguments T1, T2, . . ., Tp; and empirical arguments E1, E2, . . ., Eq''.
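Van den Akker's heuristic format lends itself to a simple structured representation. The sketch below is a hypothetical rendering in Python (none of the field names come from the original sources); it only shows how the substantive (C), procedural (P), theoretical (T) and empirical (E) slots of a design principle could be captured and re-emitted as an 'If . . . then . . .' statement.

```python
# Hypothetical data structure mirroring van den Akker's (1999) template
# for design principles; field names are invented labels, not his terms.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DesignPrinciple:
    intervention: str                                          # X
    purpose: str                                               # Y
    context: str                                               # Z
    characteristics: List[str] = field(default_factory=list)   # C1..Cm
    procedures: List[str] = field(default_factory=list)        # P1..Pn
    theoretical_args: List[str] = field(default_factory=list)  # T1..Tp
    empirical_args: List[str] = field(default_factory=list)    # E1..Eq

    def as_heuristic(self) -> str:
        """Render the slots back into the 'If you want to design X...'
        sentence form quoted in the text."""
        return (f"If you want to design {self.intervention} for "
                f"{self.purpose} in {self.context}, give it "
                f"{', '.join(self.characteristics)} via "
                f"{', '.join(self.procedures)}.")
```

Structuring principles this way would also make them searchable and comparable across projects, which is one motivation for sharing them explicitly.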
Design principles are not intended as recipes for success, but to help others select and apply the most appropriate substantive and procedural knowledge for specific design and development tasks in their own settings (McKenney, van den Akker, & Nieveen, 2006). Based on the findings from the literature and the field research conducted alongside the design and development of the CASCADE-SEA program, Fig. 8 and the text below present design principles for performance support tools of this nature. It should be noted that the distinctions shown in Fig. 8 serve the purpose of discussing and understanding (sub)system characteristics. The ultimate aim remained the creation of an integrated whole, in which these features would be blended together. ''In performance-centered design . . . the whole is greater than the sum of its parts. Much like attempts to understand the functionality and behavior of the human body, we can separate out the component systems and elements, but true understanding requires an integrated view since none of these systems or elements operates in isolation'' (Gery, 1995, p. 54). So, in addition to discussing individual support component guidelines, attention should also be given to system elements that came into existence as a result of component integration. With the overall aim of reducing the workload on users and thereby allowing them to focus on the task at hand, such systems should:


Fig. 8. Design principles for performance support.

• Match the natural task flow by:
  – being layered for multiple levels of use;
  – presenting (only) relevant information at the time/point of need;
  – offering growth potential while minimizing the need for interpretation of special terms.
• Allow the user to rely upon recognition instead of recall by:
  – providing (system-wide) search capabilities;
  – listing and linking histories of user progress, navigation, etc.
• Feel friendly by being:
  – easy to learn (in coached settings);
  – inviting (once introduced to it, new computer users should feel motivated to use the system);
  – forgiving (allowing mistakes and opportunity for correction);
  – safe (users must not fear that experimentation can ruin something on the computer).
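As an illustration of how guidelines of this kind translate into concrete mechanisms, the sketch below implements one of them, 'listing and linking histories of user progress', in Python. It is a hypothetical fragment, not part of CASCADE-SEA, and all class and method names are invented.

```python
# Hypothetical 'recognition instead of recall' mechanism: a system-wide
# history that lists where the user has been, so earlier work can be
# recognized in a pick-up-where-you-left-off list rather than remembered.
class TaskHistory:
    def __init__(self):
        self._visits = []  # ordered (component, activity) pairs

    def record(self, component, activity):
        """Log a visit; revisiting an activity moves it to the end
        instead of duplicating it."""
        entry = (component, activity)
        if entry in self._visits:
            self._visits.remove(entry)
        self._visits.append(entry)

    def recent(self, n=5):
        """Most recent activities first, for display at the point of
        need (e.g., on the program's opening screen)."""
        return list(reversed(self._visits))[:n]
```

Surfacing such a list at the time/point of need also serves the 'match the natural task flow' principle: the user resumes the task rather than reconstructing it.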

While evaluation data show much room for improvement in the software, the CASCADE-SEA research has resulted in a description of the characteristics of a valid and practical performance support tool that can help teachers grow professionally and advance curriculum implementation within the domain of secondary level science and mathematics materials development in southern Africa. These characteristics were described in specific terms (relating to this particular program) as well as in more generic design principles. Particularly when it comes to educational software design, additional examples of design research are needed to extract insights that can help further subsequent design efforts. Sharing design work that strives not only toward creating useful artifacts, but also toward increasing the transparency of a robust design process, is needed to help increase the quality of educational design practice.

References

Ball, D., & Cohen, D. (1996). Reform by the book: What is – or might be – the role of curriculum materials in teacher learning and instructional reform? Educational Researcher, 25(9), 6–8, 14.
Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14.
Bastiaens, T. (1997). Leren en werken met electronic performance support systems [Learning and working with electronic performance support systems]. Doctoral dissertation, University of Twente, Enschede.
Ben-Peretz, M. (1990). The teacher-curriculum encounter. Albany: State University of New York Press.
Caillods, F., Göttelmann-Duret, G., & Lewin, K. (1996). Science education and development: Planning and policy issues at secondary level. Paris: Pergamon.
Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences, 13(1), 15–42.
Collis, B. (1994). Teacher education; technology in. In T. Husén & T. Postlethwaite (Eds.), The international encyclopedia of education (pp. 6004–6008). Oxford: Pergamon Press.
Collis, B., & Verwijs, C. (1995). A human approach to electronic performance and learning support systems: Hybrid EPSSs. Educational Technology, 35(1), 5–21.
Design-Based Research Collective (DBRC) (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8.
de Feiter, L., Vonk, H., & van den Akker, J. (1995). Towards more effective science teacher development in southern Africa. Amsterdam: VU University Press.
Edelson, D. C. (2006). What we learn when we engage in design: Evaluating design research from a learning perspective. In J. van den Akker, K. Gravemeijer, S. McKenney, & N. Nieveen (Eds.), Educational design research (pp. 100–106). London: Routledge.
Flechsig, K. (1989). A knowledge-based system for computer-aided instructional design (CEDID). In Education and informatics: Proceedings of a UNESCO conference (pp. 400–403). Paris: UNESCO.
Fullan, M. (1991). The new meaning of educational change. New York: Teachers College Press.
Gery, G. (1989). Training vs. performance support: Inadequate training is now insufficient. Performance Improvement Quarterly, 2(3), 51–71.
Gery, G. (1991). Electronic performance support systems: How and why to remake the workplace through the strategic application of technology. Boston: Weingarten.
Gery, G. (1995). Attributes and behaviors of performance-centered systems. Performance Improvement Quarterly, 8(1), 47–93.
Gery, G. (1997). Granting three wishes through performance-centered design. Communications of the ACM, 40(7), 54–59.
Grabinger, R., Jonassen, D., & Wilson, B. (1992). The use of expert systems. In H. Stolovitch & E. Keeps (Eds.), Handbook of human performance technology: A comprehensive guide for analyzing and solving performance problems in organizations (pp. 365–380). San Francisco: Jossey-Bass.
Gustafson, K., & Reeves, T. (1990). Idiom: A platform for a course development expert system. Educational Technology, 30(3), 19–25.
Hudzina, M., Rowley, K., & Wagner, W. (1996). Electronic performance support technology: Defining the domain. Performance Improvement Quarterly, 9(1), 36–48.
Linn, M., Davis, E., & Bell, P. (2004). Internet environments for science education. Mahwah, NJ: Lawrence Erlbaum Associates.
Loucks-Horsley, S., Hewson, P., Love, N., & Stiles, K. (1998). Designing professional development for teachers of science and mathematics. Thousand Oaks, CA: Corwin Press.
McKenney, S. (2001). Computer-based support for science education materials developers in Africa: Exploring potentials. Doctoral dissertation, University of Twente, Enschede.
McKenney, S. (2005). Technology for curriculum and teacher development: Software to help educators learn while designing teacher guides. Journal of Research on Technology in Education, 28(2), 167–190.


McKenney, S., Nieveen, N., & van den Akker, J. (2002). Computer support for curriculum developers: CASCADE. Educational Technology Research and Development, 50(4), 25–35.
McKenney, S., & van den Akker, J. (1998, April 13–17). Exploring CASCADE-SEA: Computer assisted curriculum analysis, design and evaluation for science education in Africa. Paper presented at the AERA annual meeting, San Diego.
McKenney, S., & van den Akker, J. (2005). Computer-based support for curriculum designers: A case of developmental research. Educational Technology Research and Development, 53(2), 41–66.
McKenney, S., van den Akker, J., & Nieveen, N. (2006). Design research from a curriculum perspective. In J. van den Akker, K. Gravemeijer, S. McKenney, & N. Nieveen (Eds.), Educational design research (pp. 67–90). London: Routledge.
Miller, B. (1997). What is in the realm of possibilities may not be possible [online]. Available from http://www.epssinfosite.com/di_posible.htm.
Nieveen, N. (1997). Computer-based support for curriculum developers: A study on the potential of computer support in the domain of formative curriculum evaluation. Doctoral dissertation, University of Twente, Enschede.
Nieveen, N., & Gustafson, K. (1999). Characteristics of computer-based tools for education and training development. In J. van den Akker, R. Branch, K. Gustafson, N. Nieveen, & T. Plomp (Eds.), Design approaches and tools in education and training (pp. 155–174). Dordrecht: Kluwer Academic Publishers.
Nieveen, N., & van den Akker, J. (1999). Exploring the potential of a computer tool for instructional developers. Educational Technology Research and Development, 47(3), 77–98.
Pirolli, P., & Russell, D. (1990). The instructional design environment: Technology to support design problem-solving. Instructional Science, 19(2), 121–144.
Raybould, B. (1990). Solving human performance problems with computers. Performance and Instruction, 29(10), 4–14.
Raybould, B. (2000). Building performance-centered web-based systems, information systems, and knowledge management systems in the 21st century. Performance Improvement, 39(6), 32–39.
Reigeluth, C. (1999). Instructional-design theories and models, Volume II: A new paradigm of instructional theory. Mahwah, NJ: Lawrence Erlbaum Associates.
Reigeluth, C. (2000). Teacher's toolkit. Presentation at the annual AERA meeting, New Orleans.
Rosenberg, M. (1995). Performance technology, performance support and the future of training: A commentary. Performance Improvement Quarterly, 8(1), 94–99.
Rosendaal, B., & Schrijvers, J. (1990). COCOS. Opleiding & Ontwikkeling, 11, 8–14.
Stevens, G., & Stevens, E. (1995). Designing electronic performance support tools: Improving workplace performance with hypertext, hypermedia and multimedia. Englewood Cliffs, NJ: Educational Technology Publications.
van den Akker, J. (1999). Principles and methods of development research. In J. van den Akker, R. Branch, K. Gustafson, N. Nieveen, & T. Plomp (Eds.), Design approaches and tools in education and training (pp. 1–14). Dordrecht: Kluwer Academic Publishers.
van den Akker, J., Gravemeijer, K., McKenney, S., & Nieveen, N. (2006). Introduction to educational design research. In J. van den Akker, K. Gravemeijer, S. McKenney, & N. Nieveen (Eds.), Educational design research (pp. 3–7). London: Routledge.
Vanderbilt, Cognition and Technology Group (1997). The Jasper project: Lessons in curriculum, instruction, assessment and professional development. Mahwah, NJ: Lawrence Erlbaum Associates.
Wilson, B., & Jonassen, D. (1991). Automated instructional systems design: A review of prototype systems. Journal of Artificial Intelligence in Education, 2(2), 17–30.
Winslow, C., & Bramer, W. (1994). Future work: Putting knowledge to work in the economy. New York: The Free Press.
Zhongmin, L., & Merrill, D. (1991). ID Expert 2.0: Design theory and process. Educational Technology Research and Development, 39(2), 53–69.