Problem finding skills as components in the creative process


0191-8869/93 $5.00 + 0.00. Copyright © 1992 Pergamon Press Ltd. Person. Individ. Diff. Vol. 14, No. 1, pp. 155-162, 1993. Printed in Great Britain. All rights reserved.



PROBLEM FINDING SKILLS AS COMPONENTS IN THE CREATIVE PROCESS*

IVONNE CHAND and MARK A. RUNCO†

California State University, Fullerton/EC 105, Fullerton, CA 92634, U.S.A.

(Received 31 January 1992)

Summary. The present investigation compared the impact of explicit and standard instructions on six tests of divergent thinking. Two of these tests assessed real-world divergent thinking; two assessed real-world problem generation; and the last two assessed a combination of problem generation and divergent thinking (i.e. examinees chose one of the problems they had themselves identified, and then generated ideas and solutions). Importantly, all tasks focused on problems occurring in the natural environment. In particular, examinees (80 college students) were asked to give solutions for problems concerning both work and school situations. The results revealed significant differences among the various tests and differences between the explicit and standard instructional groups. Importantly, only the scores elicited by explicit instructions were significantly correlated with, and predictive of, creative activities and accomplishments. Implications for future research are discussed.

INTRODUCTION

Most contemporary views of creativity recognize its multifaceted nature (Getzels, 1964; Mumford & Gustafson, 1988; Runco & Okuda, 1988; Runco & Vega, 1989). Runco and Okuda, for example, suggested that creative performance requires problem finding, divergent thinking, and the evaluation of solutions. Similarly, Getzels (1964) proposed that the main elements of creativity are the formulation of a problem, the adoption of a method of solution, and the reaching of a solution. All problem situations are supposedly similar in that these three elements are present, but they differ from one another depending on whether or not the person confronted with the problem has to discover a formulation, a method, and a correct solution, or simply adopt existing formulations, methods, and solutions. Do problems really need to be found? Most individuals face dilemmas daily at home, school, or at work. Dilemmas do not, however, automatically appear as problems capable of resolution or even sensible contemplation. They must often be formulated in fruitful and original ways if they are to be solved. Getzels and Csikszentmihalyi (1967) argued that problem situations can be meaningfully distinguished in terms of how much of the problem is initially given, how much of the method for reaching a solution is readily available, and how much agreement there is as to what constitutes a good solution. They proposed a model in which it is possible to systematically distinguish a number of problem situations varying in what is known and unknown and to whom. In their Type-Case 1, both the problem and the method of solution are known to others and to the problem-solver; however, the solution is known to others but still needs to be determined by the problem-solver. In Type-Case 2, the problem is presented, but no standard method for solving it is known to the problem-solver, although it is known to others. 
In this case, the problem-solver must proceed by reasoning or trial and error until a solution that matches the one already known to others has been reached. In Type-Case 3, the problem itself is not presented but must be discovered. In this situation, only the generalities are apparent, and the problem-solver needs to discover the possible questions or a workable problem. For discovered problems, it may not be certain that a method of solution or a solution itself is already known.

Empirical research on problem finding

Csikszentmihalyi and Getzels (1970) examined the importance of problem finding in art by asking 31 art students to arrange a set of objects and then draw them. Comparisons indicated that artists who approached the task without a specific problem in mind produced drawings which were more original and higher in overall aesthetic value than those who approached the task with a predetermined method.

Wakefield (1985) examined the performance of fifth-grade children on a series of figural "Line Meanings" divergent thinking tasks. Divergent thinking tasks contain open-ended questions which allow the examinee to produce an unlimited number of responses, and are frequently used as an index of creativity (Runco, in press; Wallach & Kogan, 1965). Wakefield attempted to combine and compare two components of creativity: problem finding and problem solving. For the problem solving task, the children in Wakefield's project were required to list all of the things that a given line figure could represent. Their "ideational fluency" score was the number of responses given. A blank card was presented last, and the children were asked to draw a line figure and then provide ideas for it. This was the problem finding task. The results of this study indicated that problem finding scores were significantly correlated with problem solving scores, and more highly correlated with criteria of creativity than the problem solving scores.

Runco and Okuda (1988) examined the problem finding and divergent thinking of adolescents. They extended Wakefield's (1985) study by using three verbal divergent thinking tests: Uses, Instances, and Similarities (from Wallach & Kogan, 1965). Each test consisted of three presented problems and one discovered problem. In the presented problems, students were given tasks such as, "Name all the things you can think of which are square." The discovered problem asked the Ss to first define a task, and then provide solutions to it. Each of the presented and discovered items was scored for the number of distinct ideas.

*This project was completed as partial fulfillment of the first author's M.A. degree.
†To whom correspondence should be addressed.
Contrasts indicated that the adolescents generated significantly more responses to the discovered problems than the presented problems. Additionally, discovered problems were reliable and correlated with five indices of creative performance. Support for the distinctiveness of the problem finding component of creative performance was given by results of a hierarchical regression analysis. Further support for the predictive validity of problem finding tasks was presented by Okuda, Runco and Berger (1991). Problem finding is, therefore, an important and distinct component in the creative process, and can be reliably assessed. Still, it is possible that explicit instructions can be used with problem finding tasks to improve their psychometric properties. Explicit instructions have been used to improve the reliability and validity of scores on standard divergent thinking tasks, and may do the same for problem finding tasks.

Explicit instructions

The rationale for explicit instructions is that they encourage examinees to perform at their best by clearly stating exactly what is expected. The premise is that the most reliable and replicable estimate of an individual's potential is probably that person's best or maximal performance. Harrington (1975) conducted an investigation in which one group of college students was given divergent thinking tests with standard instructions, and another group was given explicit instructions to "be creative." Creative ideas were explicitly defined as unusual and worthwhile. Not surprisingly, Ss given explicit instructions had higher originality scores, with more unusual responses, and lower fluency scores than Ss who received standard (inexplicit) instructions. Runco (1986) extended this line of work by comparing the divergent thinking of gifted and nongifted children in terms of the effects of explicit instructions. His findings revealed that explicit instructions led to increased originality scores for all children. However, the originality scores of the talented and nongifted children increased more than those of the gifted children, and the same instructions inhibited the fluency and flexibility scores of the gifted children more than those of the talented and nongifted children. Runco and Okuda (1991) administered divergent thinking tests with three sets of instructions: (a) conventional (inexplicit) directions; (b) directions designed to maximize ideational originality; and (c) directions designed to maximize ideational flexibility (the thematic variety of ideas). As expected, originality scores increased in the second condition, and flexibility scores increased in the third. Contrary to expectations, flexibility scores were low when originality instructions were given. These findings suggest that explicit instructions do not influence an examinee's ability to generate ideas, but rather manipulate the choice of specific ideational strategies.

The strategies used when finding and defining problems may also benefit from explicit instructions. With this in mind, one objective of the present investigation was to administer problem finding tasks with explicit instructions to determine whether or not their reliabilities and validities would increase. A second objective was to compare several types of real-world problem finding tasks.

Real-world problem finding

Little research has been conducted using real-world situations to elicit creative problem solving skills. In her research on television, Meline (1976) presented several real-world problematic situations to children. For example, one real-world test problem asked, "How would you get people to quit smoking?" Getzels and Smilansky (1983) used real school problems involving school regulations, pupil cliques, and homework. Results of their study indicated that students who generated high quality problems similarly produced excellent solutions. Houtz, Jambor, Cifone and Lewis (1989) attempted to show that standardized tasks are not equivalent to real-world tasks. To this end, they administered two divergent thinking tasks to 11th grade students. One task was realistic (and might be encountered in the natural environment) and the other was unrealistic. The students received these problems with one of two types of directions. One asked them to generate as many ideas as possible that would be unique ("give ideas that no one would think of"). The second type asked students to generate as many ideas as possible that they felt would be common ("give ideas that everyone else would think of"). The results revealed that the boys given directions calling for common ideas out-performed the boys given directions for unique ideas, with exactly the opposite effect for girls. Houtz et al. (1989) attributed this finding to the possibility that the directions to "think of only ideas that everyone else will also think of" may be inconsistent with the typical male thinking pattern. They suggested that boys may have a shorter list of such ideas because they tend to think in an individualistic fashion. Girls, on the other hand, may have a larger base of both common and novel ideas as a result of their own cultural stereotypical role development. Okuda et al. (1991) specifically assessed the difference between standard tasks and real-world tasks.
In this work with 4th, 5th and 6th grade children, Okuda et al. reported two important results: (a) problem finding measures were more predictive of creative activities than traditional divergent thinking measures; and (b) scores from real-world problem finding tasks were more predictive of creative accomplishment than scores from a problem finding task which did not include real-world problems. The present investigation was designed to extend the research of Okuda et al. (1991) on real-world divergent thinking and problem finding. The results of Okuda et al. indicated that real-world problems are more predictive of creative performance than standard tasks. However, Okuda et al. relied on one type of problem finding task. The present study is unique in its testing of real-world divergent thinking with presented problems, problem generation tasks, and discovered problem solving tasks. This study thus allowed a comparison of two different problem finding tasks: problem definition and problem generation. Finally, this investigation is unique in its testing of the impact of explicit instructions on problem finding tasks.

METHOD

Students at California State University, Fullerton (N = 80) participated in the investigation. The students received course credit for participating in this experiment. Twenty-nine Ss (36.3%) were males and 51 (63.7%) were females. Fifty-eight (72.5%) were freshmen, 9 (11.2%) were sophomores, 7 (8.7%) were juniors, 5 (6.3%) were seniors, and 1 (1.3%) was a graduate student. Six (7.5%) were psychology majors, 2 (2.5%) majored in another social science field, 7 (8.7%) were physical science majors, 24 (30.0%) majored in a humanities field, 5 (6.2%) majored in math/computing, 27 (33.7%) were business majors, and 9 (11.3%) were in the "other" category.


Measures and procedure

One experimenter administered all of the measures to the Ss in groups of 5-18 examinees. The testing sessions were approx. 1 hr; however, no strict time limitation was imposed, and the Ss were encouraged to take as much time as necessary to finish the tasks. The Ss in each testing session were randomly divided into two groups. One group received explicit instructions before beginning the tasks; the other group did not. These instructions emphasized original and worthwhile responses (Harrington, 1975). Five tasks were administered to both groups of Ss in the following order: (a) presented problem divergent thinking (Okuda et al., 1991), followed by the collection of (b) demographic information (in part as a distractor); (c) problem generation; (d) discovered problem divergent thinking [a combination of problem finding and divergent thinking (Runco & Okuda, 1988)]; and (e) the Creative Activities Check List (Runco & Okuda, 1988). The order of these tasks was intended to minimize fatigue effects. Additionally, there is an obvious logic in this order (e.g. problem generation must precede discovered problem divergent thinking). The explicit instructions asked the Ss to "be creative" and "give only original responses." These instructions also informed the Ss that a creative idea is unusual and worthwhile, as in Harrington (1975), and that an original idea is one which is thought of by no one else.

Presented problem divergent thinking task

The presented real-world divergent thinking task, adapted from Okuda et al. (1991), included problems about school and work. Ss were first presented with a problem and asked to give as many solutions as possible. The instructions were: "On the next few pages, we will describe a few problems which may occur at school and work. Your task is to first read about the problem and then try to write down as many solutions as you can for each problem." Here is an example: "Your favorite television show, L.A. Law, was on last night. You had so much fun watching it that you forgot to do your homework. You are about to go to school this morning when you realize that your homework is due in your first class. Uh-oh ... what are you going to do? For this problem, you could answer: 'Tell the professor that you forgot to do your homework; try to do your homework in the car or bus on the way to school; ask your roommate, boyfriend, girlfriend, or classmate to help you finish your homework; do your homework tonight and turn it in the next time the class meets; or finish your homework first and then show up late for class.' There are many more answers to this problem, and all of them are legitimate. Now turn the page, take your time, have fun, and remember to give as many ideas as possible."

One problem read as follows:

“Your friend Rick sits next to you in class. Rick really likes to talk to you and often bothers you while you are doing your work. Sometimes he distracts you and you miss an important part of the lecture, and many times you don’t finish your work because he is bothering you. What should you do? How would you solve this problem? Remember to list as many ideas and solutions as you can.”

A second problem read as follows:

"It's a great day for sailing, and your buddy, Chris, comes to your work and asks you if you want to go sailing. Unfortunately, you have a big project due tomorrow, and it requires a full day to complete. You would rather be sailing. What are you going to do? Think of as many ideas as you can."

The responses from these tasks were scored for fluency, defined as the total number of ideas generated, and originality, defined as the number of unique ideas produced (Runco & Albert, 1985).
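The fluency and originality scoring just described can be sketched in code. This is an illustrative reconstruction, not the authors' actual procedure: the `score_responses` helper and the sample answers are invented, and the sketch assumes responses have already been normalized so that equivalent ideas match exactly. Fluency is the count of ideas an examinee gives; originality, following the unique-ideas criterion above, is the count of that examinee's ideas produced by no one else in the sample.

```python
from collections import Counter

def score_responses(responses):
    """Score each examinee's list of ideas for fluency and originality.

    responses: dict mapping examinee id -> list of normalized idea strings.
    Fluency is the number of ideas given; originality is the number of
    that examinee's ideas produced by no other examinee in the sample.
    """
    # Pool every idea to see how often each occurs across the whole sample
    counts = Counter(idea for ideas in responses.values() for idea in ideas)
    return {
        person: {
            "fluency": len(ideas),
            "originality": sum(1 for idea in ideas if counts[idea] == 1),
        }
        for person, ideas in responses.items()
    }

# Hypothetical answers to the forgotten-homework problem (invented data)
responses = {
    "S1": ["tell the professor", "do it on the bus", "turn it in late"],
    "S2": ["tell the professor", "ask a classmate for help"],
}
print(score_responses(responses)["S1"])  # {'fluency': 3, 'originality': 2}
```

Because originality is defined relative to the whole sample, a protocol can only be scored once every examinee's ideas have been pooled; this is also why a task on which each examinee answers a different problem can be scored for fluency but not originality.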

Real-world problem generation task

The problem discovery task directed Ss to generate a number of problems about school and work. Responses from this task were also scored for fluency and originality. Here are two examples:

"List different problems in school that are important to you. You may write down problems about the campus itself, classes, professors, policies, classmates, or whatever. Try to be specific, and take your time. Think of as many problems as you can."


"On the previous page, you were asked to list problems that you may face at school. Now, I would like you to list problems at work that are important to you. You may write down any problems about your boss, co-workers, clients, policies, or whatever. Be specific, and keep in mind that the more ideas, the better. Take your time."

Discovered problem divergent thinking task

This task required that the Ss refer back to their list of problems about school and work, and choose one problem for each situation which would allow for the largest number of solutions. They were asked to indicate which problem they had chosen, and then generate solutions (the explicit instructions group being reminded to focus on unusual and worthwhile ideas). Only fluency scores were calculated from these responses because each examinee was working on his or her own problem, thereby precluding the comparisons necessary for determining uniqueness and originality. These tasks were presented as follows:

"Now go back to page 3 (school problems) and choose the one problem which would allow the largest number of solutions. Copy that problem here:"

"Now write as many solutions as you can think of for the problem mentioned above. And again, the more the better."

The second task was identical except it asked examinees to "go back to page 4 (work problems) . . .".

Creative Activities Check List

The Creative Activities Check List was also administered. The present version was adapted from Runco and Okuda (1988) for the present (college student) sample. It contained 45 items in five different domains: science (10 items), mathematics (7 items), literature (8 items), art (15 items), and music (5 items). An example of the wording is, "How many times have you drawn a picture for aesthetic reasons?" Possible responses were: (a) never; (b) once; (c) 2-3 times; (d) 4-5 times; or (e) 6 or more times.

RESULTS

Reliability and descriptive statistics

The reliability of the divergent thinking tests was assessed with Cronbach's alpha. For both groups combined, the alpha coefficient for fluency was 0.83, and for originality was 0.53. For the explicit instructions group, the alpha for fluency was 0.85 and for originality was 0.46. For the standard instructions group, the alpha for fluency was 0.79 and for originality was 0.62. The reliability of the criteria was also evaluated with Cronbach's alpha. For the Creative Activities Check List, alpha values were as follows: science scale 0.80; mathematics scale 0.66; literature scale 0.71; art scale 0.81; and the music scale 0.70. Descriptive statistics are presented in Tables 1 and 2.
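The Cronbach's alpha coefficients reported above can be computed directly from an examinee-by-item score matrix. The sketch below uses the standard formula; the score matrix is invented for illustration and is not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_examinees, n_items) score matrix.

    alpha = k / (k - 1) * (1 - sum(item variances) / variance(total score))
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical fluency scores for six examinees on three tasks (invented)
scores = np.array([
    [8, 9, 10],
    [5, 6, 4],
    [12, 11, 13],
    [7, 8, 9],
    [3, 4, 2],
    [10, 9, 11],
])
print(round(cronbach_alpha(scores), 2))  # 0.97
```

Alpha rises when the items (here, the separate tasks) covary strongly relative to their individual variances, which is why it serves as an internal-consistency estimate for a composite score.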

Table 1. Descriptive statistics for all divergent thinking composites

                                         Explicit (N = 39)   Standard (N = 39)   Total (N = 78)
                                           M       SD          M       SD          M       SD
Presented problem divergent thinking
  Fluency                                 8.10    3.82        7.60    2.89        7.85    3.37
  Originality                             2.48    1.88        1.28    1.32        1.88    1.72
Problem generation
  Fluency                                 8.41    4.60        9.87    3.96        9.14    4.33
  Originality                             3.36    2.74        3.70    2.88        3.66    2.79
Discovered problem divergent thinking
  Fluency                                 9.15    4.21        8.30    3.81        8.73    4.01


Table 2. Descriptive statistics for the Creative Activities Check List scales

                 Explicit (N = 39)   Standard (N = 39)
Scales             M       SD          M       SD
Science           1.55    0.51        1.50    0.50
Mathematics       1.83    0.63        1.70    0.56
Literature        2.21    0.67        2.26    0.72
Art               2.25    0.64        2.19    0.78
Music             1.27    0.56        1.13    0.28

Intertest differences and instructional effects

Differences among the divergent thinking scores were examined with repeated measures factorial (mixed) analyses of variance using Type of Instruction and Sex as between-Ss factors. The results using the three fluency scores indicated a significant difference among tests using the Greenhouse-Geisser corrected test [F(2,148) = 4.01, P < 0.05]. The Instruction x Test interaction was also significant [F(2,148) = 3.87, P < 0.05]. The Instruction and Sex main effects were not significant. Table 1 presents the mean scores, and Fig. 1 graphically portrays the Instruction x Test interaction. The ANOVA using the originality scores indicated that the difference between tests was significant [F(1,76) = 28.30, P < 0.001]. The Sex and Instructions effects were not significant, nor were the interactions. Contrasts within the standard instruction group indicated that the fluency scores from the problem generation task were significantly higher than those from the presented problem test [t(38) = 3.44, P < 0.001] and significantly higher than scores from the discovered problem task [t(38) = 2.67, P < 0.01]. The scores from the presented problem divergent thinking task were not significantly different from the scores from the discovered problem divergent thinking. For the explicit instructions group, no significant differences were found in contrasts of fluency scores. No contrasts were conducted for the originality scores because only two tasks (presented problem divergent thinking and problem generation) could be scored for originality. The ANOVA reported above therefore can be interpreted as a contrast. It indicated that across groups, the originality scores were higher in the problem generation task than in the presented problem divergent thinking task.

[Fig. 1. Instruction x Test interaction, plotting fluency scores for the three divergent thinking tests. EI = explicit instructions; SI = standard instructions; 1 = presented problem divergent thinking; 2 = problem generation; 3 = discovered problem divergent thinking.]

Table 3. Correlations between the divergent thinking variables and composites

                   1       2       3       4       5       6
Fluency
1 PP               -
2 PG              51***
3 DP              43***   54***
Originality
4 PP              53***   18      12
5 PG              34***   61***   30**    31**
Totals
6 Orig            50***   58***   28**    70***   90***
7 Flu             77***   86***   81***   32**    54***   55***

Decimal points have been omitted. "PP" indicates presented problem; "PG" problem generation; and "DP" discovered problem divergent thinking. **P < 0.01; ***P < 0.001.

Predictive validity

The predictive validity of the ideational tests was examined with canonical correlation and regression procedures. The first canonical analysis (N = 80) used the five ideational scores as predictors and the five Creative Activities Check List composite scores as criteria. The results indicated that the canonical coefficient was not significant. Hence no follow-up regression analyses with the entire sample of Ss were justified. Canonical and regression analyses were then computed within each of the instructional groups. In the explicit instruction group, the five predictors were significantly correlated with the Creative Activities Check List scores in the multivariate test (Rc = 0.70, P < 0.001) and in the univariate tests of presented problem fluency (R = 0.55, P < 0.05), problem generation fluency (R = 0.72, P < 0.001), discovered problem fluency (R = 0.59, P < 0.001), and problem generation originality (R = 0.60, P < 0.001). For the standard instruction group, the multivariate test and all of the univariate tests using the Creative Activities Check List scores as the criteria were not significant.

DISCUSSION

One important finding from this study was that the two groups of students had significantly different scores, suggesting that explicit instructions had a significant impact on ideational performance. This was especially apparent in the interaction in the mixed design ANOVA using the three fluency scores. The fluency scores from the presented problem divergent thinking and discovered problem divergent thinking tasks were higher in the explicit instructions group than the standard instructions group, whereas the fluency scores from the problem generation task were higher in the standard instructions group than the explicit instructions group. Note, however, that the originality scores of the problem generation task were actually higher in the standard instructions group than the explicit instructions group (see Table 1).
Hence the explicit instructions do not seem to improve originality in all divergent thinking tasks. Perhaps the metacognitive strategies suggested by the explicit instructions did not apply well to the problem generation tasks. Recall that the rationale for using explicit instructions is twofold. First, their impact suggests that creative ideation is influenced by task perception (Harrington, 1975; Runco, 1986). This is important because it suggests that creative thinking (here operationalized as divergent thinking on presented problems, problem generation, and discovered problem divergent thinking) is in part metacognitive and strategic and not entirely cognitive. Second, the explicit instructions approach suggests a way to enhance creative thinking. Perhaps teachers or organizational specialists could use explicit instructions in encouraging students or employees to develop their creative thinking potential.


The intertest differences and interaction from the analysis of variance using fluency scores suggest that there are different ways to assess divergent thinking and ideational creativity. Importantly, the data in Fig. 1 suggest that these tests can only be contrasted when instructions are taken into account. Unfortunately, because examinees were "solving" their own problems (rather than solving one common problem), the discovered problems could not be scored for originality. Still, fluency scores are usually predictive of originality scores (Runco, in press). Thus, it is reasonable to conclude that batteries of ideational creativity tests should contain all three types of tasks for a comprehensive assessment of ideational skill. Group differences were also apparent in the analyses of predictive validity. The scores of the explicit instructions group were significantly correlated with the Creative Activities Check List ratings, supporting their predictive validity, but the scores from the standard instructions group were not. Psychometrically, divergent thinking tests therefore seem to be valid when given with explicit instructions (cf. Okuda et al., 1991; Runco, 1986; Runco & Okuda, 1988). Still, predictive validity is only one type of validity. The discriminant validity of the tasks used herein should be examined in future research. Correlations with IQ or a similar test could be used to this end. Future research on problem finding should also include populations other than college students. Recall that the Creative Activities Check List used in this investigation represented only five domains. Perhaps other domains of performance or a more balanced representation could be included in future work. Alternatively, different types of criteria might be used in future research. Teachers' ratings, parents' ratings, and self-ratings, for example, have each been used in the literature as criteria of creativity.
On the other hand, the Creative Activities Check List is a measure of actual performance. It is a self-report, and is therefore open to certain biases (e.g. memory, honesty); however, individuals are generally well informed about their activities and accomplishments. It appears that explicit instructions are very effective with problem finding tasks, but future research is necessary.

Acknowledgements. The authors would like to express their gratitude to William Smith and Edward Sterns for their valuable input.

REFERENCES

Csikszentmihalyi, M. & Getzels, J. W. (1970). Concern for discovery: An attitudinal component of creative production. Journal of Personality, 38, 91-105.
Getzels, J. W. (1964). Creative thinking, problem solving and instruction. In Hilgard, E. R. (Ed.), Theories of learning and instruction. Chicago, IL: University of Chicago Press.
Getzels, J. W. & Csikszentmihalyi, M. (1967). Scientific creativity. Science Journal, 3, 80-84.
Getzels, J. W. & Smilansky, J. (1983). Individual differences in pupil perception of school problems. British Journal of Educational Psychology, 53, 307-316.
Harrington, D. M. (1975). Effects of explicit instructions to "be creative" on the psychological meaning of divergent thinking test scores. Journal of Personality, 43, 434-454.
Houtz, J. C., Jambor, S. D., Cifone, A. & Lewis, C. D. (1989). Locus of evaluation control, task directions, and type of problem effects on creativity. Creativity Research Journal, 2, 118-125.
Meline, C. W. (1976). Does the medium matter? Journal of Communication, 26, 81-89.
Mumford, M. D. & Gustafson, S. B. (1988). Creativity syndrome: Integration, application, and innovation. Psychological Bulletin, 103, 27-43.
Okuda, S. M., Runco, M. A. & Berger, D. E. (1991). Creativity and the finding and solving of real-world problems. Journal of Psychoeducational Assessment, 9, 45-53.
Runco, M. A. (1986). Maximal performance on divergent thinking tests by gifted, talented, and nongifted children. Psychology in the Schools, 23, 308-315.
Runco, M. A. (in press). Children's divergent thinking and creative ideation. Developmental Review.
Runco, M. A. & Albert, R. S. (1985). The reliability and validity of ideational originality in the divergent thinking of academically gifted and nongifted children. Educational and Psychological Measurement, 45, 483-501.
Runco, M. A. & Okuda, S. M. (1988). Problem-discovery, divergent thinking, and the creative process. Journal of Youth and Adolescence, 17, 211-220.
Runco, M. A. & Okuda, S. M. (1991). The instructional enhancement of the flexibility and originality scores of divergent thinking tests. Applied Cognitive Psychology, 5, 435-441.
Runco, M. A. & Vega, L. (1989). Evaluating the creativity of children's ideas. Journal of Social Behavior and Personality, 5, 439-452.
Wakefield, J. F. (1985). Towards creativity: Problem finding in a divergent-thinking exercise. Child Study Journal, 15, 265-270.
Wallach, M. A. & Kogan, N. (1965). Modes of thinking in young children. New York: Holt, Rinehart & Winston.