Information and Software Technology 1994 36 (5) 251-258
Predicting the development effort of multimedia courseware

I M Marshall, W B Samson, P I Dugard
Department of Mathematical & Computer Sciences, University of Abertay Dundee, Bell Street, Dundee DD1 1HG, UK

W A Scott
Marconi Simulation, The John Sutcliffe Building, Fulmer Way, Donibristle Industrial Park, Nr. Dunfermline, Fife KY11 5JX, UK
A recurring problem faced by multimedia courseware developers is the accurate estimation of development effort. This paper outlines a metrics-based model for predicting the development effort of multimedia courseware. A composite model of multimedia courseware development effort is proposed which makes use of a Rayleigh curve and cost drivers. An initial analysis of cost drivers and delivery time is described, along with future work to develop a rigorous model for multimedia courseware development.

Keywords: multimedia, courseware, metrics, courseware engineering
Educationalists have used computers to deliver instruction using interactive video and other media for over 10 years1. From these pioneering days, the major constraint on the widespread use of multimedia was not the hardware but rather the development effort of preparing courseware. Today, the development effort required to produce multimedia courseware remains substantial despite the reduction in media costs and the improved functionality of authoring software. A number of authors have reported development to delivery time ratios from 50:1 to 800:1 and beyond2-8. With development to delivery ratios of this size it is essential to make realistic estimates of the development effort required to produce courseware. There have been repeated calls for systematic research into the development effort required to prepare courseware9-12. In this paper, the potential for research into the development of courseware metrics to support the planning and production of cost effective multimedia systems for educational or training purposes is discussed. Early research has indicated that although multimedia development effort estimating systems exist, they fail one or more of Boehm and Wolverton's criteria for evaluating cost estimation models13. In this paper initial research into multimedia courseware metrics is described. While the proposed model is targeted at education and training, it could be extended to the development of commercial multimedia for information systems.
0950-5849/94/050251-08 © 1994 Butterworth-Heinemann Ltd
The scope of software metrics

The original motivation for the development of software measures, models and metrics was managerial, with the aim of predicting project costs at early phases in the software development life-cycle14. A number of alternative models are available to help both software engineers and managers predict the development effort and cost of a project. The study of software metrics attempts to quantify various aspects of software development. Fenton14 described software metrics as an all-embracing term used to describe a wide range of diverse activities including:

• Cost and effort estimation models and measures.
• Productivity measures and models.
• Quality control and assurance.
• Data collection.
• Quality models and measures.
• Reliability models.
• Performance evaluation and models.
• Algorithmic and computational complexity.
• Structural and complexity metrics.
A typical example of a cost or effort estimation metric is Boehm's COCOMO or COnstructive COst MOdel15. This is a composite model16 which was developed using a combination of analytical equations, statistical data fitting and
Figure 1 The waterfall model of software life-cycle
Figure 2 The waterfall model of multimedia courseware development
expert judgement. COCOMO is one of the best known and most widely documented composite models. The basic equation for estimating the software development effort is given by:

Effort = a_i (Size)^(b_i) m(X)

Effort is measured in person months, while Size represents an estimate of the number of thousands of delivered source code instructions and m(X) is a composite multiplier which depends on 15 main cost driver attributes. The values of a_i and b_i depend on the mode of development. While there is some criticism of COCOMO concerning the subjectivity and independence of the cost drivers17-20, it still forms the basis for a number of metric-based estimating and planning tools.

Multimedia courseware engineering models

Conte, Shen and Dunsmore indicate that the development of effort metrics is only possible within a well-defined, phase-based software development model16. Figure 1 describes the waterfall model15, which is one of the commonly accepted models of the software development life-cycle. Each phase is well defined with start and end points. Comparable models have been proposed for multimedia and computer based learning courseware development21-24. Figure 2 describes a proposed waterfall model for multimedia courseware development. The detailed objectives, target audience and the average training hours of the course are defined in the courseware specification phase. Discussion with developers indicates that this is the minimum information they require to begin the development process. The overall instructional design phase is based on discussions with subject matter experts, courseware designers and the client. Once agreement has been reached with the client, detailed design is undertaken for the media to be used in the final course. In this phase, additional experts, such as media specialists, graphics artists, design editors and programmers, may be added to the team25. During the multimedia development phase the graphics, audio, video, sound, simulation and courseware structure are developed. The courseware integration phase brings the various multimedia elements together on one platform and
rigorously tests them. In the testing phase the courseware is piloted with learners. Problems identified by the evaluation of the courseware during the testing phase are fed back into the model to improve the quality of the final product. Once the product finishes the testing phase it is published. Normally, in the maintenance phase the courseware is only changed when serious problems are detected or the client requires changes in order to update the content.
The development of multimedia metrics

Using the waterfall model, it becomes possible to develop metrics to describe multimedia courseware processes, products or resources. While it can be argued that the software programming aspects of multimedia courseware could use existing metrics, the instructional aspects, the complexity of media development and the interfacing do not match the characteristics of normal software development. Software engineering metrics were originally developed using data processing or, in some cases, real-time development systems data and development models. The incorporation of media design, development and integration into the normal software development life-cycle reduces the likelihood that a straight transfer of the software metrics will provide useful information on the various aspects of multimedia processes, products or resources. A number of authors have investigated informally the factors which affect the development effort of multimedia courseware2-8. Marshall, Samson and Dugard26 indicated that most of these models failed Boehm and Wolverton's criteria for evaluating cost estimation models13. There would appear to be an opportunity to establish multimedia courseware metrics to objectively measure aspects of the development process, products or resources.
Multimedia Effort Estimation Method

Boehm's COCOMO15 was used as the basis for the development of the Multimedia Effort Estimation Method (MEEM). In common with COCOMO, the underlying assumption is that staff utilization during development can be modelled by a Rayleigh curve14.
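To make the staffing assumption concrete, the short sketch below evaluates the Norden/Rayleigh manpower loading curve, m(t) = 2Kat exp(-at^2), where K is the total effort and a fixes the time of peak staffing. This is the form commonly used in the software estimation literature; the numerical values in the example are invented purely for illustration and do not come from the paper.

    import math

    def rayleigh_staffing(t, total_effort, t_peak):
        """Norden/Rayleigh manpower loading: staff level at time t.

        total_effort is the area under the curve (person hours) and
        t_peak is the time of peak staffing. Both values below are
        illustrative assumptions, not figures from the paper.
        """
        a = 1.0 / (2.0 * t_peak ** 2)              # shape parameter
        return total_effort * 2.0 * a * t * math.exp(-a * t ** 2)

    # Example: a hypothetical 500 person-hour project peaking in week 6.
    for week in range(0, 13):
        print(week, round(rayleigh_staffing(week, 500, 6), 1))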
Table 1. Multimedia cost drivers

Course Difficulty (CD): Number of course objectives; Level of course objectives; Existing course material

Interactivity (IN): Complexity of interface; Level of interactivity; Type of question feedback; Majority question style; Graphics requirements; Graphics density; Animation requirements; Animation density; Audio requirements; Audio density; Video requirements; Video density; Simulation requirements; Simulation density

Development Environment (DE): Production environment; Instructional design, development and delivery methodology; Size of proposed development team; Development team's subject matter experience; Development team's multimedia experience

Subject Expertise (SE): Subject matter expert's multimedia experience; Availability of subject matter expert
Figure 3 Rayleigh manpower loading curve

Figure 3 shows a Rayleigh curve for the overall duration of the project. This can be made up from a number of individual Rayleigh curves which represent the individual phases of the multimedia project development. The proposed formula for development effort is based on a composite model which incorporates a combination of analytical equations, statistical data fitting and expert judgement16. Several researchers have criticized the Rayleigh curve assumption for software development27-29, but it is a reasonable starting point for this model.

Effort = a (Average Training Delivery Hours)^b CD(X)

Effort is measured in person hours, and Average Training Delivery Hours is an initial estimate of the number of hours of training required. There is some debate as to the validity of using Average Training Delivery Hours as the basis of any metric based model of courseware development effort29. However, discussion with developers indicates that commissioning briefs will normally state target training delivery hours. The values of the constants a and b are used to map data on to the proposed model and to convert average student hours into development staff hours. The cost driver multiplier CD(X) depends on a number of factors which affect the development of multimedia courseware and is based on expert judgement or analysis of the proposed project.
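The following sketch evaluates the proposed MEEM equation for a hypothetical project so that the shape of the model is clear. The coefficient values a and b and the individual cost driver multipliers are placeholders invented for illustration only; the calibration needed to obtain real values is described later in the paper and has not yet been carried out.

    # A minimal sketch of the proposed MEEM composite model:
    #   Effort = a * (Average Training Delivery Hours)^b * CD(X)
    # The coefficients and multipliers below are illustrative
    # assumptions; the model has not yet been calibrated.

    def meem_effort(delivery_hours, cost_driver_multipliers, a=100.0, b=1.1):
        cd_x = 1.0
        for m in cost_driver_multipliers:      # composite multiplier CD(X)
            cd_x *= m
        return a * delivery_hours ** b * cd_x  # person hours

    # Hypothetical project: 2 hours of training, with three cost drivers
    # rated above or below 'normal' (normal = multiplier of 1.0).
    drivers = [1.2,   # e.g. high graphics requirements (assumed value)
               0.9,   # e.g. experienced development team (assumed value)
               1.0]   # e.g. normal level of interactivity
    print(round(meem_effort(2, drivers), 1), "person hours (illustrative only)")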
The proposed cost drivers are grouped together under four headings:

• Course Difficulty (CD).
• Interactivity (IN).
• Development Environment (DE).
• Subject Expertise (SE).
Currently the cost drivers are defined in terms of an ordinal scale which will be validated through the collection of experimental data and statistical analysis. The values of each cost driver range from CD (very low) through to CD (very high). Each set of cost drivers will have its own independent range of values to be determined by data collection and analysis, but for compactness the values are listed under common headings in Tables 2 to 5.
Proposed cost drivers in multimedia development

A number of authors have identified lists of potential cost drivers which may contribute to the overall development effort of multimedia courseware2-8. Discussion with developers and analysis of existing factors produced the initial list of proposed cost drivers in Table 1. It is likely that some of these cost drivers are not independent, and they will be eliminated from the final effort estimation model.

Course Difficulty (CD)
Table 2 describes the cost drivers grouped under the Course Difficulty heading. These factors increase or decrease the effort required to develop multimedia courseware. This includes cost drivers which account for the number and level of objectives as well as the existence of prepared course material.
Number of course objectives. The number of objectives to be achieved is normally well defined, or can be defined, prior to the development of the courseware. The proposed scale assumes that as the number of objectives increases, so does the value of the cost driver.
Table 2. Course Difficulty cost drivers

Cost driver | CD (very low) | CD (low) | CD (normal) | CD (high) | CD (very high)
Number of course objectives | Less than 20 | 21 to 40 | 41 to 60 | 61 to 80 | Greater than 80
Level of course objectives | Concrete concepts | Abstract concepts | Lower order principles | Higher order principles | Problem solving
Existing course material | Rewrite of existing multimedia material | Rewrite of existing CBT material | Rewrite of written material | Rewrite of tutor delivered material | New course
Figure 4 Hierarchy of course objectives

Level of course objectives. The highest level of objective to be achieved in the multimedia courseware is assumed to contribute to the development effort4. Gagne and Briggs's classification of objectives30 is used as the basis of the cost driver scale. The objectives are classified in terms of the hierarchy shown in Figure 4. The lowest value is given to objectives which involve concrete concepts, which typically describe physical entities. Defined concepts describe objectives which try to explain ideas such as inflation. Lower order rules are used to describe objectives which involve learning simple rules. Higher order rules are used to describe objectives which involve the application of complex rules. The highest value of cost driver is allocated to problem-solving objectives. At this level the student would be applying previously learned behaviours in new or unusual situations. The level and type of course objectives are also used by a number of other multimedia development effort estimation methods3-5,7.

Existing course material. Kearsley6 and Schooley7 both indicate that the existence of well prepared course material can significantly reduce the amount of effort required to prepare multimedia courseware. Similarly, rewriting existing multimedia material for another platform will require less effort than the creation of completely new material3-7.
Interactivity (IN)

Table 3 describes the cost drivers under the Interactivity heading. Primarily this includes contributions to development effort due to the complexity of the interface or of the media used in the creation of the courseware.
Complexity of interface. Kearsley6 identified the complexity of the user interface as being important in determining the amount of development effort required to create multimedia courseware. A simple text-based interface will typically require less effort to develop than a simple graphical interface or windowing environment. Level of Interactivity. Cohen describes Interactivity in terms
Table 3. Interactivity cost drivers

Cost driver | CD (very low) | CD (low) | CD (normal) | CD (high) | CD (very high)
Complexity of interface | Simple text based | Complex text based | Simple graphical based | Complex graphical based | Windowing graphical based
Level of interactivity | Linear | Simple branching | Complex branching | Simple adaptive | Complex adaptive
Type of question feedback | None | Right/Wrong | Right/Wrong with right feedback | Right/Wrong with relevant feedback | Right/Wrong with remediation and feedback on each wrong answer
Majority question style | True-False | Multiple choice | Single words | Limited free text | Other
Graphics requirements | None | Existing artwork | Simple original artwork | Complex original artwork | Extremely complex artwork
Graphics density | Less than 1 per 20 frames | 1 per 20 frames | 1 per 10 frames | 1 per frame | More than 1 per frame
Animation requirements | None | Existing animation | Simple animation | Complex animation | Mathematically accurate animation
Animation density | Less than 1 per 20 frames | 1 per 20 frames | 1 per 10 frames | 1 per frame | More than 1 per frame
Audio requirements | None | Existing audio | Simple original audio | Complex original audio | Extremely complex audio
Audio density | Less than 1 per 20 frames | 1 per 20 frames | 1 per 10 frames | 1 per frame | More than 1 per frame
Video requirements | None | Existing video | Simple original linear video clips | Complex original linear video clips | Complex original interactive video clips
Video density | Less than 1 per 20 frames | 1 per 20 frames | 1 per 10 frames | 1 per frame | More than 1 per frame
Simulation requirements | None | Existing simulation | Simple original simulation | Complex original simulation | Realistic simulation
Simulation density | Less than 1 per 20 frames | 1 per 20 frames | 1 per 10 frames | 1 per frame | More than 1 per frame
of the richness of the 'dialogue' between the learner and instructional program31. Golas4 states that the level of interactivity expected in the final courseware product affected the development effort. A linear sequence with little or no interaction is given the lowest rating. Complex branching is used to describe programs in which feedback loops and a significant number of preprogrammed sequences are included. Adaptive is used to describe programs which include elements of intelligence and, perhaps, models of the student32.
Level of question feedback. A number of authors have identified the level of feedback as an important factor in predicting development effort3,5,7. Questions with no feedback require less development effort than questions which provide relevant feedback and remediation on each wrong answer31.

Majority question style. Gery3 found that the complexity of the questioning style used in the multimedia courseware development had a significant effect on the development effort. The proposed range of question style varies from simple true or false, through limited free text input, to other. The other category is used to describe complex interactions such as the definition of touch click areas.

Graphics requirements and density. These cost drivers attempt to describe the development effort required to produce graphical images for the multimedia courseware. The requirements scale describes the graphical images required in the proposed multimedia package. The graphics density is used to describe the average requirement for graphical images in the final multimedia product. Common sense indicates that the existence of suitable artwork considerably reduces the development effort required, while the production of extremely complex artwork increases the development effort.

Animation requirements and density. The animation requirements and density cost drivers describe the development effort required to produce animation sequences for multimedia courseware. A lower requirements cost driver is allocated if suitable animation already exists. A higher value of cost driver would be given to the production of extremely complex photo realistic animation. The animation density is used to describe the average requirement for animation in the courseware.

Audio requirements and density. The audio density is used to describe the average requirement, while the audio requirements cost driver describes the development effort required to produce sound for the courseware. The existence of suitable sound clips considerably reduces the development effort and would result in a low value cost driver. Extremely complex audio clips with multiple voices and background sound effects would require more development time and are allocated a higher cost driver.

Video requirements and density. Bergman and Moore33 indicated that the video requirements and density significantly affect the development effort required to produce multimedia courseware. The requirements scale is used to describe the average complexity of the video required in the proposed multimedia package. The video density is used to describe the average requirement for video in the final multimedia product.

Simulation requirements and density. Schooley7 classified the production of simulation as a major contributor to the development effort required to produce multimedia courseware. The simulation requirements cost driver attempts to describe the development effort required to produce simulation for the multimedia courseware. Realistic simulation involving accurate movement, sound and response to student actions is more difficult to produce than a simple animation which simulates the movement of the planets around the sun. The density cost driver is used to describe the average requirement for simulation in the final multimedia product.
Development Environment (DE)

The Development Environment heading groups together cost drivers which describe the production environment, the instructional design method, and the size and experience of the team developing the courseware. Table 4 describes the cost drivers under this heading.

Production environment. A number of authors have indicated that the type of production environment used to produce the courseware affects the development effort required to prepare multimedia courseware3-7. Unfortunately, the exact effect is unclear.
Table 4. Development Environment cost drivers

Cost driver | CD (very low) | CD (low) | CD (normal) | CD (high) | CD (very high)
Production environment | Authoring environment | Authoring system | Authoring language | High level language | Low level language
Instructional design, development and delivery methodology | None | Informal | Formal first generation | Formal second generation | Formal third generation
Size of proposed development team | More than 15 | 10-15 | 5-9 | 2-4 | 1
Development team's subject matter experience | Expert knowledge of the subject | Good knowledge of the subject | Some knowledge of the subject | Knowledge of related subject | No knowledge of subject
Development team's multimedia experience | Extensive multimedia development experience | Some multimedia development experience | Extensive computer aided learning experience | Some computer aided learning experience | None
Table 5. Subject Expertise cost drivers

Cost driver | CD (very low) | CD (low) | CD (normal) | CD (high) | CD (very high)
Subject matter expert's multimedia experience | Extensive multimedia development experience | Some multimedia development experience | Extensive computer aided learning experience | Some computer aided learning experience | None
Availability of subject matter expert | Unrestricted contact | Daily contact | Weekly contact | Monthly contact | Restricted contact
Avner2 found that suitability for the task was more important than the range and complexity of facilities available in the authoring system. The current scale ranges from an intelligent authoring environment34, which fully supports the instructional design, development and delivery (ID3) of multimedia courseware, to the hand crafting of software using a low level language such as assembler. The scale assumes that the authoring system is suitable for the courseware being developed and that the team have experience in using the tool.
Instructional design, development and delivery methodology. It is often stated that using a formal methodology improves the effectiveness of courseware production. However, De Diana and van Schaik9 indicate that '...very few studies are available that demonstrate that the use of specific design and development methods can in fact improve efficiency'. The cost driver assumes that the effect of using a formal instructional design, development and delivery methodology is to reduce the effort required to produce multimedia courseware. One of the goals of this research is to determine accurately the effect of an ID3 methodology on the overall development effort.

Size of proposed team. Avner2 found conflicting results on the effect of team size on courseware development effort. In a team which is communicating effectively, the larger the team the more productive it will be, although the relationship is not linear. However, in teams which are not communicating, the larger the team the less productive it is. Similar results have been found in software development: small teams are more productive than larger teams because less time is spent in communication between team members35,36. The scale assumes that the team is not communicating effectively, which results in a drop in productivity as team size increases.
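The communication argument above is often made quantitative by counting pairwise communication channels, which grow as n(n-1)/2 with team size n, the observation popularised by Brooks35. The short sketch below only illustrates that quadratic growth; the paper itself does not give this formula.

    # Illustrative only: pairwise communication channels in a team of n
    # people grow quadratically, one common explanation for the drop in
    # productivity of larger, poorly communicating teams (cf. ref 35).
    def communication_channels(n: int) -> int:
        return n * (n - 1) // 2

    for size in (1, 2, 4, 9, 15):       # team sizes echoing the Table 4 scale
        print(size, communication_channels(size))
    # 1 -> 0, 2 -> 1, 4 -> 6, 9 -> 36, 15 -> 105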
Development team's subject matter experience. Golas4 identified the team's experience and knowledge in the subject area as an important factor in reducing development effort. This may be because the team members do not have to spend time learning the terminology used in the subject area and do not have to refer to an external subject matter expert.
Team multimedia development experience. A number of authors3-7 have indicated that the more experience a team has in developing multimedia, the more productive the team will be in developing other courseware. The scale ranges from no experience, in which case the team will spend time on the current project learning from their mistakes, through to extensive multimedia experience.
Subject Expertise (SE)

The Subject Expertise heading groups together cost drivers which describe the effect of the subject matter expert on courseware production. This includes the subject matter expert's multimedia development experience and availability. Both of these factors were identified as important by courseware developers. Table 5 describes the cost drivers grouped together under this heading.
Subject matter expert's multimedia development experience. A subject matter expert who is able to provide the development team with the information in a sequence and in a form which is suitable for multimedia delivery will reduce the development effort. The scale indicates that an author who has past experience of multimedia development will be able to contribute more effectively to the development than a novice author. Avner2 found some evidence to support this assumption in his longitudinal studies of authors from their first courseware development.

Availability of subject matter expert. This scale is based on discussions with multimedia developers, who indicated that the availability of the subject matter expert for consultation significantly affects the overall development effort.

Initial study

As indicated at the beginning of the paper, a number of these cost drivers are not independent of each other. For example, the level of course objectives and the level of interactivity may be related. Once an adequate data set has been collected, step-wise linear regression or principal component analysis will be used to identify the main cost drivers. In this initial study, data on 14 courseware development projects were available to explore the relationship of the cost drivers to courseware development. This is considered too few for a detailed investigation of the effects of the individual cost drivers, but sufficient to look at the Course Difficulty (CD), Interactivity (IN), Development Environment (DE) and Subject Expertise (SE) headings. In each group a simple score was calculated by summing the ratings on each cost driver in the heading. This assigns equal weight to all cost drivers in the group, and also assumes that the scale, which is a set of ordered categories, may be considered to be approximated by an interval scale. Once more data are available, it will be possible to improve on both of these assumptions. The four cost driver group scores, together with the delivery time and development time, are given for the 14 projects in Table 6. As can be seen from Table 6, 11 of the 14 projects have a delivery time of 1 hour, but the development time for these projects varies by a factor of more than 5.
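As an illustration of the simple scoring described above, the sketch below sums ordinal ratings, coded 1 for 'very low' through 5 for 'very high', within one heading. The particular ratings shown are invented for the example; only the unweighted summing rule comes from the paper.

    # Sketch of the unweighted group scoring used in the initial study:
    # each cost driver is rated on the ordinal scale CD(very low)..CD(very high),
    # coded here as 1..5, and the ratings are summed within each heading.
    SCALE = {"very low": 1, "low": 2, "normal": 3, "high": 4, "very high": 5}

    def group_score(ratings):
        """Sum the ordinal ratings for one cost driver group (equal weights)."""
        return sum(SCALE[r] for r in ratings)

    # Hypothetical Course Difficulty ratings for one project (assumed values).
    course_difficulty = ["normal", "high", "low"]   # the three CD drivers
    print(group_score(course_difficulty))           # -> 9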
Table 6. Multimedia projects

Development time (hours) | Course Difficulty (CD) | Interactivity (IN) | Development Environment (DE) | Subject Expertise (SE) | Delivery time (hours)
28 | 7 | 22 | 16 | 7 | 0.167
80 | 8 | 18 | 19 | 7 | 1.000
100 | 9 | 17 | 17 | 7 | 1.000
100 | 8 | 25 | 18 | 7 | 1.000
180 | 9 | 21 | 18 | 7 | 2.000
180 | 9 | 23 | 15 | 6 | 1.000
200 | 9 | 18 | 19 | 8 | 1.000
220 | 8 | 18 | 19 | 8 | 1.000
250 | 10 | 24 | 16 | 6 | 1.000
320 | 10 | 26 | 17 | 6 | 1.000
400 | 10 | 19 | 19 | 7 | 1.000
435 | 7 | 19 | 18 | 6 | 1.000
500 | 11 | 37 | 19 | 8 | 1.000
590 | 9 | 37 | 16 | 6 | 3.000
As a first step, these 11 projects were used to see if the variance in the development time could be explained using the four summary cost driver headings. Plots of development time against Course Difficulty, Interactivity and Development Environment are shown in Figures 5 to 7. Figure 5 suggests that there is an approximately linear relationship between development time and Course Difficulty, with the exception of the project with Course Difficulty = 7, which appears to have much too high a development time. The Interactivity and Development Environment scores for the project are not exceptionally high, and it appears that nothing in the data available explains this project.
Table 7. Actual and predicted values of development time

Actual (hours) | Predicted (hours) | Percentage error
80 | 128.190 | -60.2371
100 | 186.509 | -86.5086
100 | 97.500 | 2.5
180 | 125.129 | 30.4837
200 | 247.888 | -23.9440
220 | 128.190 | 41.7320
250 | 275.517 | -10.2069
320 | 306.207 | 4.3103
400 | 367.586 | 8.1034
500 | 487.284 | 2.5431
For the other 10 projects, a simple linear regression of development time on Course Difficulty explains nearly three-quarters of the variance. Adding Interactivity to the regression barely improves it: this is partly because the correlation between Course Difficulty and Interactivity is high at 0.65. However, adding Development Environment to the regression explains 85% of the variance. The actual and predicted values of the development time are given in Table 7. Addition of Subject Expertise to the regression did not improve the results. This is because there is little variation in the Subject Expertise scores for these projects (all lie between 6 and 8).
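For readers who wish to reproduce this analysis, the sketch below fits the same least-squares regression of development time on the Course Difficulty and Development Environment scores for the ten 1-hour projects, using the data from Table 6 with the outlier excluded. The use of numpy here is our choice of tool, not the authors'; the fitted coefficients and R squared are whatever the data give and are not quoted from the paper.

    # Sketch: least-squares regression of development time on the Course
    # Difficulty (CD) and Development Environment (DE) group scores for the
    # ten 1-hour projects in Table 6 (the CD = 7 outlier is excluded).
    import numpy as np

    cd = np.array([8, 9, 8, 9, 9, 8, 10, 10, 10, 11], dtype=float)
    de = np.array([19, 17, 18, 15, 19, 19, 16, 17, 19, 19], dtype=float)
    dev_time = np.array([80, 100, 100, 180, 200, 220, 250, 320, 400, 500], dtype=float)

    X = np.column_stack([np.ones_like(cd), cd, de])   # intercept, CD, DE
    coeffs, *_ = np.linalg.lstsq(X, dev_time, rcond=None)
    predicted = X @ coeffs

    ss_res = np.sum((dev_time - predicted) ** 2)
    ss_tot = np.sum((dev_time - dev_time.mean()) ** 2)
    print("coefficients:", coeffs)
    print("R squared:", 1 - ss_res / ss_tot)          # the paper reports 85%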
Figure 5 Development time against Course Difficulty
Figure 6 Development time against Interactivity
Figure 7 Development time against Development Environment

Discussion of results

Using the grouped headings of the cost drivers enables 85% of the variation in development time to be predicted. However, the limited number of projects in the data set could be unrepresentative of multimedia development projects. Senbetta8 asked 21 experienced developers of courseware to estimate the development effort for courseware based on requirements documentation. Table 8 shows the minimum, mean and maximum estimated development times for projects with delivery times between 1 and 20 hours. In common with our data, the development time estimates vary by a factor of about 5 for each delivery time. Only three projects in the data set have delivery times other than 1 hour. Omitting the outlier with development time 435 and using the other 13 projects, 80% of the variance in development time can be accounted for by a regression on Course Difficulty, Interactivity and Delivery Time. The addition of Development Environment does not improve it.

Table 8. Expert estimation of courseware development time

Delivery time (hours) | n | Minimum (hours) | Mean (hours) | Maximum (hours)
1 | 21 | 90 | 240 | 365
2 | 21 | 150 | 380 | 700
5 | 21 | 250 | 800 | 1625
10 | 21 | 500 | 1400 | 2550
15 | 21 | 750 | 1950 | 3750
20 | 21 | 1000 | 2400 | 5000

Conclusion and future work

The preliminary results are promising and suggest that,
even with the unweighted scoring of cost drivers used here, useful predictions can be achieved. A larger and richer data set would allow more detailed investigations which would almost certainly lead to improved results. It is hoped that with a larger data set containing more projects with long delivery times, it may be possible to predict development time using Course Difficulty, Interactivity and Development Environment along with delivery time. Similarly, with a greater range of values for Subject Expertise it may be possible to identify its contribution to the prediction of development effort. Given enough data, it should be possible to produce a software tool version of MEEM. The current model is, of course, simply a framework and cannot yet be used for estimation. It must be calibrated to determine the appropriate:

• coefficients
• cost driver values

Boehm15 used data from 63 projects to calibrate his original COCOMO model, and it is anticipated that data from a similarly large number of multimedia projects will be needed. Work is currently underway to collect and analyse such data from multimedia projects. In addition, other potential cost drivers are being identified with the assistance of multimedia developers and from the literature3-7. Using data from a representative sample of projects, statistical analysis will help to show which cost drivers are the most significant. The authors' experience with step-wise linear regression techniques suggests that in any development environment only three or four of the cost drivers have a significant bearing on the effort prediction. Robust statistics36 have produced promising results in predicting significant cost drivers. Samson, Ellison and Dugard37 have produced promising results through the use of neural nets to 'learn' how to predict effort from cost driver values. Once these cost drivers have been determined, it may be possible to further simplify data collection.
References

1 Stewart, A M and Bryce, C F A 'Multimedia multipurpose: is the quality of the learning experience being well served by the use of educational media?' in Percival, F and Ellington, H (eds) Distance learning and education Kogan Page (1981)
2 Avner, R A 'Is there an ideal size for courseware production teams?' Proc. 30th Association for the Development of Computer Based Instructional Systems (ADCIS), Philadelphia (1988)
3 Gery, G Making CBT happen: prescriptions for successful implementation of computer-based training in your organisation Weingarten Publications, Boston, MA (1987)
4 Golas, K C 'Estimating time to develop interactive courseware in the 1990s' Proc. Interservice/Industry Training and Education Conf. (1993)
5 Jay, J, Bernstein, K and Gunderson, S 'Estimating computer-based training development times' ARI Technical Report 765, Research Institute for Behavioural and Social Sciences, USA (1987)
6 Kearsley, K 'The CBT advisor: an expert system program for making decisions about CBT' Performance and Instruction Vol 24 (1985) pp 15-17
7 Schooley, R E 'Computer-based training (CBT), cost estimating algorithm for courseware (CEAC)' Proc. Interservice/Industry Training Systems Conf. (1988)
8 Senbetta, G 'CBT time and cost estimation: What do the experts say?' Proc. 10th Annual CBT Conf. and Exposition (1982)
9 De Diana, I and van Schaik, P 'Courseware engineering outlined: an overview of some research issues' ETTI Vol 30 No 3 (1993) pp 191-211
10 Chen, J W and Chen, M 'Towards the design of an intelligent courseware production system using software engineering and instructional design principles' J. Educational Technology Systems Vol 19 No 1 (1990) pp 41-52
11 Costello, G 'Developing computer-based instruction: the systems design approach' Training Officer Vol 28 No 2 (1992) pp 48-50
12 Jones, M K, Li, Z and Merrill, D M 'Rapid prototyping in automated instructional design' Educational Training Research and Development Vol 40 No 4 (1993) pp 95-100
13 Boehm, B W and Wolverton, R W 'Software cost modelling: some lessons learned' J. Systems and Software Vol 1 (1980) pp 195-201
14 Fenton, N Software metrics: a rigorous approach Chapman & Hall (1991)
15 Boehm, B W Software engineering economics Prentice-Hall (1981)
16 Conte, S D, Shen, V Y and Dunsmore, H E Software engineering metrics and models Benjamin/Cummings (1986)
17 Subrahmania, G H and Brelawski, S A 'A case for dimensionality reduction in software development effort estimation' Internal Report TR-89-02, Department of Computer Science, Temple University, Philadelphia, USA (1989)
18 Kitchenham, B A 'Software project development cost estimation' J. Systems and Software Vol 5 (1985)
19 Kemerer, C F 'An empirical validation of software cost estimation models' Comm. ACM Vol 30 No 5 (1987)
20 Cuelenaere, A M E, van Genuchten, M J I and Heemstra, H J 'Calibrating a software estimation model: why and how' J. Inf. and Soft. Technol. Vol 29 No 10 (1987)
21 De Diana, I and Collis, B 'Adaptable educational courseware: an antidote to several portability problems' J. Research on Computing in Education Vol 23 No 2 (1990) pp 225-241
22 Friedler, Y and Shabo, A 'An approach to cost-effective courseware development' British J. Educational Technology Vol 22 No 2 (1991) pp 129-138
23 Phillips, W A 'Individual author prototyping: desktop development of courseware' Computers in Education Vol 14 No 1 (1990) pp 9-15
24 Maher, J and Ingraham, A 'Software engineering and ISD: similarities, complementarities and lessons to share' Proc. Ann. Meeting of Assoc. Educational Communications and Technology, Dallas, Texas, USA (1989)
25 Cohen, V B 'Interactive features in the design of videodisc materials' Educational Technology Vol 24 No 1 (1984) pp 16-20
26 Marshall, I M, Samson, W B and Dugard, P I 'Estimating multimedia courseware development effort' Technical Report, Dundee Institute of Technology, UK (1994)
27 Parr, F N 'An alternative to the Rayleigh curve model for software development effort' IEEE Trans. Soft. Eng. Vol SE-6 No 3 (1980)
28 Jeffery, D R 'Time sensitive models in commercial MIS environment' IEEE Trans. Soft. Eng. Vol SE-13 No 7 (1987)
29 Fairweather, P and O'Neal, A 'The impact of advanced authoring systems on CAI productivity' J. Computer-Based Instruction Vol 11 (1984) pp 90-94
30 Gagne, R M and Briggs, L J Principles of instructional design (2nd edn) Holt, Rinehart & Winston (1979)
31 Cohen, V B 'A re-examination of feedback in computer based instruction: implications for instructional design' Educational Technology Vol 25 No 1 (1985) pp 33-37
32 Spohrer, J C 'Integrating multimedia and AI for training: examples and issues' Proc. IEEE Int. Conf. Systems, Man and Cybernetics, Los Angeles, USA (1990) pp 663-664
33 Bergman, R E and Moore, T V Managing interactive video/multimedia projects Educational Technology (1990)
34 Tennyson, R D and Elmore, R L 'Integrated courseware engineering system' Proc. NATO ASI 1993: Automating Instructional Design, Development and Delivery, Grimstad, Norway (in press)
35 Brooks, F P The mythical man-month Addison-Wesley (1975)
36 Kitchenham, B and Mellor, P 'Data collection and analysis' in Fenton, N E (ed) Software metrics: a rigorous approach Chapman & Hall (1991) pp 89-110
37 Samson, W B, Ellison, D G and Dugard, P I 'Software cost estimation using an Albus Perceptron (CMAC)' Proc. 8th Int. COCOMO Estimating Meeting, SEI, Pittsburgh, USA (1993)