Obtaining Active Parental Consent for Evaluation Research: A Case Study

KNOWLTON JOHNSON, DENISE BRYANT, EDWARD ROCKWELL, MARY MOORE, BETTY WATERS STRAUB, PATRICIA CUMMINGS, AND CAROLE WILSON

ABSTRACT

This study assesses the effectiveness of a strategy for obtaining active written parental consent for the outcome evaluation of an alcohol, tobacco, and other drug (ATOD) abuse prevention program. A local school-based strategy that was implemented in 16 middle schools in ten rural and suburban school districts is presented. Using a multiple case study approach and an adequacy of performance analysis, it was found that seven of the ten districts achieved a minimum consent rate goal set at 70% (district consent rates ranged from 53% to 85%, with an average of 72%). Only two districts achieved the desired consent rate of 80%. Interviews with a key contact person in each school district provided profile information that distinguished districts that were successful in implementing an active parental consent strategy from those that were not. A cost effectiveness analysis showed that this local school-based strategy for obtaining parental consent for program evaluation was more cost effective than strategies reported in previous studies. However, more than 20% of the data collection costs involved obtaining active written consent. Methodological and practical implications are discussed.

INTRODUCTION

Within the research and evaluation field there has been ongoing discussion and study of how to obtain parental consent for minors to participate in cross-sectional and longitudinal studies. There are two types of parental consent: passive and active. The passive consent method requires parents to return a signed form only if they do not want their child to participate. If parents do not return a form, investigators assume parents have given permission for their child to participate in the research. For active written consent, parents must sign and return a form which indicates that their child has permission to participate in a research activity.
Parents who fail to do this, as well as those who indicate on the form that they do not want their child to participate, are treated as refusals.

Because of high consent rates and low costs, evaluators have preferred the passive consent strategy. Severson and Biglan (1989) reviewed the results of a number of studies that used passive consent methods and found consent rates ranging from 88% to 98%. Cross (1994) reported a nearly 100% consent rate during year three of a longitudinal study that used passive consent.

Researchers have historically experienced problems when active written consent is required. Studies have reported parental consent rates of only 50% to 60% (e.g., Kearney, Hopkins, Mauss, & Weisheit, 1983; Lueptow, Mueller, Hammes, & Master, 1977; Severson & Biglan, 1989; Thompson, 1984). Cross (1994) reported 47% and 43% consent rates for the first and second years of a longitudinal study when active written consent was required. With such low consent rates, active written consent may reduce the sample of students to the point that it no longer represents the population. Evidence shows that this method has produced sample bias, such as fewer minorities, low achievers, latch-key children, and youth who have tried drugs (Dent, Galaif, Sussman, & Stacy, 1993; Kearney et al., 1983; Severson & Ary, 1983; Anderman, Cheadle, Curry, & Diehr, 1995). Dent et al. (1993) concluded that children whose parents do not provide active consent are at higher risk for a number of health and social problems. Noll, Zeller, Vannatta, Bukowski, and Davies (1997) found that children who did not receive parental consent for participation in a research project, in comparison with children who received parental consent, were perceived by peers and teachers as less sociable and by peers as lower on social acceptance, more aggressive, and less academically competent. Furthermore, when active written consent is required, researchers have pondered such ethical dilemmas as how to reach children at high risk when alcoholic parents will not grant permission for inclusion of their children in ATOD studies (Gensheimer, Ayers, & Roosa, 1993).

To deal with the problem of reduced sample size and sample bias, some researchers have used extensive follow-up procedures. O'Donnell, Duran, San Doval, Breslin, Juhn, and Stueve (1997) and Thompson (1984) found that incentives for participation and communication with parents and students could increase the response rate substantially. Several other studies have used follow-up telephone calls to parents to significantly increase the response rate (e.g., Dent, Sussman, & Stacy, 1997; Ellickson & Hawes, 1989); however, both of these studies reported that the costs of contacting parents were extremely high. Other studies have employed both written and telephone consent procedures to increase the response rate (Sussman et al., 1990; Moberg & Piper, 1990). Ellickson (1989) reported a multistage parent consent process involving multiple communication channels that yielded desirable levels of response for longitudinal studies, but required considerable time.

In the mid 1990s, even though research showed active parental consent to be problematic, there was renewed interest in the protection of the rights of minors (Ellickson, 1994; Yeager, 1994).
In 1994, Congress passed the Goals 2000: Educate America Act, which included the Grassley Amendment regarding student privacy (Grassley Amendment to the Goals 2000: Educate America Act, 20 U.S.C.A. sec. 1232h, 1994). The Grassley Amendment requires all research funded by the U.S. Department of Education to obtain written parental consent before any minor student can participate in a survey, analysis, or evaluation that reveals sensitive information (that is, information about illegal, antisocial, self-incriminatory, and demeaning behaviors). In 1995, a bill was passed in the U.S. House of Representatives to expand the Grassley Amendment requirements to all federal government agencies (Family Privacy Protection Act of 1995, H.R. 1271, 104th Cong. 1st Sess. [1995]). Although this legislation did not receive any floor action in the U.S. Senate, its passage in the U.S. House of Representatives, and subsequent discussion in the literature, suggest that evaluators using federal funds, regardless of the source, may have to obtain written parental consent to collect data from youth in the future (Renger, Gotkin, Crago, & Shisslak, 1998).

With the current trend toward requiring written parental consent for program evaluation involving minors, we present a case study of the use of a local school-based strategy to obtain written parental consent for middle school children to participate in an evaluation. This school-based evaluation was part of a larger outcome evaluation of an ATOD abuse prevention program, Partners in Rural Prevention (Partners), funded by the Center for Substance Abuse Prevention (CSAP), a division of the U.S. Department of Health and Human Services. We addressed three issues in the case study of parental consent: return rate, factors associated with getting an acceptable return rate, and cost-effectiveness. The strategy consisted of a set of procedures that schools implemented with support from the evaluators. The strategy included both incentives and follow-up procedures. The specific research questions were:

1. Is a local school-based strategy effective in obtaining an acceptable parental written consent rate in comparison with an a priori consent rate of 70% agreed upon by school districts and 80% desired by evaluators?
2. What factors are associated with successful implementation of a local school-based strategy?
3. How cost-effective is a local school-based strategy in comparison with previous studies?

METHOD

Setting

In 1995, the evaluator and Partners project director met individually with the superintendents and principals of 14 public school districts (range = 1,360 to 12,462 students) to obtain agreement for their school district to participate in the school-based evaluation component of the Partners in Rural Prevention demonstration project. Ten school districts (six in the region targeted for prevention services and four in the comparison region) agreed to work with the evaluators to obtain active written consent from parents. The school-based evaluation component entailed administration of a 30-minute survey about alcohol, tobacco, and other drug use to eighth-grade youth prior to implementation of the Partners demonstration, and two follow-up surveys to the same students two years apart. Each school designated a school contact person who coordinated efforts for the survey.

Superintendents and principals agreed to a minimum parental consent goal of 70% as a reasonable percent of parents who could be persuaded to consent to their eighth-grade youth's participation in the evaluation. This consent rate was considered acceptable for making generalizations of the baseline results to the targeted population. However, because of anticipated attrition of students from the eighth to twelfth grades in the repeated measures
evaluation, evaluators encouraged school personnel to consider the 70% consent rate as a minimum goal and to strive for an 80% consent rate.

Parental Consent Strategy

A local school-based strategy was implemented to obtain active written consent from parents of the eighth-grade youth attending the targeted schools. This strategy was designed to maximize involvement of all personnel in the schools in which the survey was being administered. It differs from most other parental consent strategies reported in the literature in that personnel of the targeted schools were heavily involved in planning and implementing it. The strategy was divided into two phases.

In Phase I, evaluators drafted a letter to parents for the school system superintendent's signature. It introduced the Partners program and its evaluation, which would entail collecting data three times; stressed the importance of their child's participation in the evaluation and the confidentiality of the responses; and gave examples of questions in the survey. The letter also announced an incentive for participating children: a drawing at each school for a $50 gift certificate redeemable at a local store. At each middle school, the principal was asked either to include the superintendent's letter and a permission form in an orientation packet sent to parents at the beginning of the school year, or to send the letter and permission form home with students or mail them directly to parents. We found that a majority of the school districts (n = 7) asked youth to take the information home. Two districts mailed the packet directly to parents, and one school district sent the information home as part of a school orientation packet before school began in August. If parents did not respond at the end of two weeks, a follow-up packet was sent home via students or mailed directly to parents (the method was determined by the school system). This packet contained either a copy of the initial superintendent's letter or a letter from the chief evaluator, along with another parental consent form.

In Phase II, districts with schools that were not close to achieving the desired 80% return rate were asked to implement additional tactics to increase the return rate. Although the evaluators provided suggestions, each school chose the tactics it implemented. Tactics selected by the schools included follow-up phone calls to parents, special visits to classrooms, school contact person attendance at parent meetings, and class contests with prizes such as a pizza party. Evaluators were proactive in providing support to all contact persons, and, in the counties where the prevention program was being implemented, the program staff assisted in obtaining consent forms.

Design and Data Requirements

The assessment of the parental consent strategy used a multiple case study design (school districts), using a priori shadow controls (school personnel and evaluator judgments and costs of other studies) and adequacy of performance criteria (Rossi & Freeman, 1993; Suchman, 1967). This ex-post facto design, although known to be less desirable than experimental or quasi-experimental approaches, is appropriate when a full-service intervention is being assessed, the sample size is small (ten school districts), and the gross effects can be assumed to be the same as the net effects.

Assessment of the adequacy of performance, which involves comparing actual performance to an adequacy criterion, was first introduced by Suchman (1967) to provide evaluative results when implementation of more rigorous designs is not possible. In the present assessment, we determined adequacy of performance in two ways. The first adequacy criterion was the previously mentioned 70% parental consent rate goal agreed upon by school districts and the 80% rate desired by the evaluators. The second criterion was cost: the cost effectiveness of the parental consent strategy under study would be compared with results of previous studies.

Data sources for this study were records on consent returns and telephone interviews with the contact person in each of the school districts who had assisted in implementing the parental consent strategy. Computerized records were maintained on parental consent returns. A database was created with the names of all students to be surveyed, by school district, school name, and grade level. Evaluation and program staff not involved in implementing the school-based strategy conducted the interviews with school contact personnel. The contact persons were asked closed and open-ended questions related to the following: (a) the school system's parental consent requirements (e.g., Does the school system require parental consent for youth to participate in a school survey?), (b) their perceptions about factors affecting the level of success in getting active written parental consent (e.g., What time of year do you feel is best to launch a strategy to get parental consent?), and (c) level of support and commitment in implementing the strategy (e.g., What kind of support did the evaluation team provide you and the school system to get parental consent for the survey? How willing were you to be the person who worked with the evaluators to get parental consent for the survey?). Contact persons also were questioned about both personnel and non-personnel costs per task for implementing the strategy. A list of tasks was read to each respondent, and if the school had completed a task, the respondent was asked to estimate the amount of time spent by each school person involved in the task. The respondent also was asked to estimate the annual salaries of the school personnel involved, as well as the costs of non-personnel items.

RESULTS

Is a local school-based strategy effective?

Parental consent rates were calculated for each of the 16 middle schools and compared to the adequacy criteria of 70% and 80%, respectively. Table 1 shows the results by school and school district. Seventy-four percent of the 2,331 eligible parents of grade 8, 10 and 11 students in the ten school districts returned a consent form. Overall, 72% of eligible parents gave permission for their child to participate in the ATOD outcome evaluation; only 2% refused, a lower rate than has been reported in other studies with which the present authors are familiar. These rates were approximately the same for the school districts in the demonstration region (73%) and the comparison school districts (70%). An examination of the consent rates for individual schools (Table 1) shows that four of the 16 middle schools failed to meet the 70% minimum goal. Two of these schools represented one-school districts (5 and 6) and two were schools within larger districts (1-A and 9-C). However, the average consent rate for each of the districts with multiple schools exceeded 70%.
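
To make the adequacy-of-performance comparison concrete, the sketch below shows one way the per-district rates reported in Table 1 could be computed from a student-level consent log and checked against the 70% and 80% criteria. This is a minimal illustration only; the record layout, field names, and example rows are hypothetical and are not taken from the evaluation's actual database.

```python
from collections import defaultdict

# Hypothetical student-level consent log; fields and example rows are
# illustrative, not the layout of the study's actual database.
records = [
    # (district, school, status) where status is "consented", "refused",
    # or "no_response"
    ("District 2", "Middle School", "consented"),
    ("District 2", "Middle School", "no_response"),
    ("District 5", "Middle School", "refused"),
    # ... one row per eligible eighth-grade student
]

MINIMUM_GOAL = 0.70  # rate agreed upon by the school districts
DESIRED_GOAL = 0.80  # rate desired by the evaluators

tallies = defaultdict(lambda: {"eligible": 0, "consented": 0, "refused": 0})
for district, school, status in records:
    t = tallies[district]
    t["eligible"] += 1
    if status in ("consented", "refused"):
        t[status] += 1

for district, t in sorted(tallies.items()):
    consent_rate = t["consented"] / t["eligible"]
    return_rate = (t["consented"] + t["refused"]) / t["eligible"]
    if consent_rate >= DESIRED_GOAL:
        verdict = "meets desired 80% goal"
    elif consent_rate >= MINIMUM_GOAL:
        verdict = "meets minimum 70% goal"
    else:
        verdict = "below minimum 70% goal"
    print(f"{district}: return {return_rate:.0%}, consent {consent_rate:.0%} ({verdict})")
```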

TABLE 1. Parental Consent Return Rates by District and School

| School District/School | Students (N) | Phase I Return Rate | Phase II Return Rate | Total Return Rate | Total Refusal Rate | Consent Rate |
|------------------------|--------------|---------------------|----------------------|-------------------|--------------------|--------------|
| School District 1      | 851          | 70%                 | 7%                   | 78%               | 1%                 | 77%          |
| School A               | 157          | 35                  | 21                   | 57                | <1                 | 56           |
| School B               | 228          | 64                  | 11                   | 77                | 2                  | 75           |
| School C               | 216          | 93                  | 0                    | 94                | 1                  | 94           |
| School D               | 250          | 76                  | 1                    | 80                | 2                  | 77           |
| School District 2      | 41           | 63                  | 22                   | 85                | 0                  | 85           |
| School District 3      | 183          | 64                  | 7                    | 77                | 5                  | 71           |
| School District 4      | 348          | 74                  | 3                    | 77                | <1                 | 76           |
| School A               | 179          | 76                  | 4                    | 82                | 1                  | 80           |
| School B               | 169          | 71                  | 1                    | 72                | 0                  | 72           |
| School District 5      | 135          | 42                  | 11                   | 55                | 1                  | 53           |
| School District 6      | 106          | 46                  | 8                    | 58                | 5                  | 54           |
| School District 7      | 55           | 82                  | 2                    | 84                | 0                  | 84           |
| School District 8      | 163          | 52                  | 12                   | 70                | 6                  | 64           |
| School District 9      | 345          | 69                  | 1                    | 71                | <1                 | 70           |
| School A               | 137          | 80                  | 0                    | 80                | 0                  | 80           |
| School B               | 49           | 88                  | 8                    | 96                | 0                  | 96           |
| School C               | 159          | 53                  | 0                    | 55                | 2                  | 53           |
| School District 10     | 104          | 45                  | 29                   | 75                | 1                  | 74           |
| Grand Total            | 2331         | 65%                 | 7%                   | 74%               | 2%                 | 72%          |

Table 1 also shows that a large majority of the signed parental consent forms were returned in response to the superintendent's letter (Phase I). Sixty-five percent of the parents responded to the superintendent's letter, which represents 88% of the total return rate. Another 7% of forms were returned after 13 middle schools implemented a customized, local school strategy to increase the return rate (Phase II). Of the latter, three schools increased their return rate by 21 to 29 percentage points, and three others increased theirs by 11 to 12 percentage points.

Based on these results, we believe the local school-based strategy produced an acceptable consent rate for youth participation in an ATOD outcome evaluation with a multiple cross-sectional design, but not for a repeated measures panel design that spans four years. Twelve of the 16 middle schools exceeded or approached the 70% rate, which is adequate for making generalizations about the cross-sectional results. However, taking attrition into consideration in a long-term repeated measures evaluation, we believe that the local school-based strategy did not produce a rate sufficient for schools to make generalizations about the repeated measures results. We achieved our recommended 80% consent goal in only two of the ten school districts and six of the 16 middle schools.

What factors influence getting active written parental consent?

The interviews with the ten school contact persons identified several patterns which were helpful in interpreting why the local school-based strategy obtained the minimum acceptable return rate of 70%. First, the contact persons were asked questions about the appropriate time of the year to obtain parental consent and who might influence parents to
return a written consent form. Most of the respondents (n = 9) stated that the fall semester is the best time to implement the strategy, and seven of the nine said that the beginning of the school year is the best time to launch it. The time frame in which we implemented the parental consent strategy was almost identical to the contact persons' preferences: seven districts launched the strategy in early fall and three in late fall.

Second, all ten respondents rated principals as having the most influence on parents' decisions to allow their child to participate in an alcohol and drug use survey. Superintendents were rated as influential by nine of the ten respondents, and the school board was rated as influential by seven of the ten. A majority (7 of 10) felt that the mayor or a county executive would have little or no influence on parents' decisions. Our parental consent strategy used an introductory letter from the superintendent as a way to influence parents to return their consent form. In retrospect, we should have had principals co-sign the introductory letter to parents.

Third, contact persons were asked what methods they considered most effective in getting parents to return the consent form. They were asked to rate the effectiveness of four methods: (1) mailed directly from the evaluator, (2) mailed directly from the school superintendent, (3) sent home with youth, and (4) mailed as part of a school orientation packet in August. Eight out of ten reported that direct mail from the superintendent's office was the most effective. A smaller majority (6 out of 10) indicated that sending the request form home as part of a school orientation packet at the beginning of the school year was also an effective method. Making the request by direct mail from the evaluator or sending it home with youth were rated as the least effective methods. As reported earlier, a majority of the school districts asked youth to take the information home, which is at variance with the contact persons' preferences.

Fourth, other factors that the school contact persons indicated as important to the successful implementation of the parental consent strategy included the level of support from schools, the degree of persistence and enthusiasm of school contact persons, and the type of incentives. A majority of respondents (8 out of 10) rated the level of support and involvement of their school district in implementing ATOD abuse prevention programs and events as having a big influence on the ability to get a 70% or higher consent rate. Teachers were most often named as influential (5), followed by counselors (3) and administrators (2). Contact persons indicated that seven of the ten school districts provided moderate to strong support to the evaluation effort. For example, a teacher in one school kept a list and periodically asked who had returned their forms. In several other schools, teachers and administrators encouraged students to get their parents to return the consent forms. Guidance counselors were also noted as helpers who sent out the forms and encouraged students to get parents to return them. The superintendent and principals played key roles in one of the larger school districts by making it a priority for their staffs to obtain the forms. In three of the four middle schools that did not achieve the 70% minimum consent rate, the contact persons said that principals offered little support.
In the fourth school, the principal left the tasks to be completed by a Partners staff member, the school contact person, and a school secretary. The school contact person's degree of persistence and enthusiasm in getting parents to return their consent forms also may have influenced the success in implementing the strategy. In the seven school districts in which all middle schools achieved a response rate of 70% or higher, six of the contact persons were somewhat or very persistent and very enthusiastic, whereas all of the contact persons in the three school districts whose return rate was less than
70% reported being somewhat or not very persistent or enthusiastic. Finally, the contact persons whose schools achieved 70% or higher felt that their schools could be successful in obtaining written consent from parents for future surveys.

Incentives were reported as being helpful in getting consent forms returned. A large majority of the contact persons (9 out of 10) indicated that the $50 gift certificate was either "some help" or a "big help." Other incentives included a class contest for a pizza or ice cream party and a movie ticket drawing for students who returned the consent forms. Contact persons obtained donations from local companies to do this. Several of the comparison school districts used some of the stipend ($250) they were given to participate in the overall study (which also included providing record data) to pay for special events in order to get consent forms returned.

Fifth, we asked respondents what barriers, if any, prevented their school system from getting a higher return rate. Although most of the school districts achieved the minimum return rate, they did encounter barriers. The most prevalent barrier cited was time constraints, which hindered carrying out the added responsibilities. Examples of isolated barriers reported in different school districts included poor timing, tight time schedules, lack of clarity about the purpose of the survey, new staff members, and lack of proximity to students.

Is a local school-based strategy cost effective?

In determining the cost effectiveness of the local school-based parental consent strategy, we obtained estimates of personnel and non-personnel costs for specific tasks. Personnel costs were for (a) school personnel time (administrators, teachers, staff, and, in some cases, student workers), (b) Partners program staff time, and (c) evaluation team time (chief evaluator, co-evaluator, and staff). School and program non-personnel costs included postage, supplies, travel, and long distance phone calls. Evaluation non-personnel costs were for postage, duplication, travel, gift certificates, incentives to the comparison schools, and miscellaneous expenses.

The personnel costs were calculated on the basis of tasks and time (rounded to the hour) for each person and approximate annual salary. Salaries were converted to an hourly rate and multiplied by the amount of time a person spent on a particular task. Cost effectiveness figures were calculated by dividing the total combined cost (personnel and non-personnel) for the school district, the Partners program, and the evaluation by the number of students in each school system whose parents returned a form. The result represents the cost per returned parental consent form.

The results of this analysis are presented in Table 2. For each school district, Table 2 shows the number of consent forms completed, the school district cost (including costs to the Partners program for its staff helping the school), the evaluator cost, the total cost, and the cost per returned form. School district costs varied from a low of $220 for one of the comparison districts to a high of $1,280 for the second largest district in the study. The cost per student ranged from a low of $2.81 for the largest school district to $31.50 for the smallest school system, with an average total cost per student of $8.28. One reason for higher costs in the smaller school districts is that the start-up costs (meetings, developing letters, forms, etc.) are constant regardless of the size of the school.
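
As a worked illustration of the cost-per-returned-form calculation described above (annual salaries converted to hourly rates, multiplied by task hours, combined with non-personnel and evaluation costs, and divided by the number of returned forms), the sketch below shows the arithmetic for a single hypothetical district. All figures, and the 2,080-hour work year used for the salary conversion, are assumptions for illustration, not the study's actual cost records.

```python
# Illustrative cost-per-returned-form calculation for one hypothetical district.
# All figures, and the 2,080-hour work year used to convert salaries, are
# assumptions for illustration, not the study's actual cost records.
HOURS_PER_YEAR = 2080

def hourly_rate(annual_salary: float) -> float:
    """Convert an approximate annual salary to an hourly rate."""
    return annual_salary / HOURS_PER_YEAR

# (approximate annual salary, hours spent on consent-related tasks)
school_personnel = [
    (38_000, 6),   # e.g., school contact person
    (55_000, 2),   # e.g., principal
    (24_000, 4),   # e.g., school secretary
]

school_non_personnel = 85.00   # postage, supplies, phone calls, etc.
evaluation_cost = 650.00       # evaluation/program staff time and incentives
returned_forms = 130           # consent forms returned in the district

school_personnel_cost = sum(hourly_rate(salary) * hours
                            for salary, hours in school_personnel)
total_cost = school_personnel_cost + school_non_personnel + evaluation_cost
print(f"School cost:            ${school_personnel_cost + school_non_personnel:,.2f}")
print(f"Total cost:             ${total_cost:,.2f}")
print(f"Cost per returned form: ${total_cost / returned_forms:.2f}")
```
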
In small school districts, start-up costs contributed significantly to the higher cost per returned form. In addition, the four school districts in the demonstration region with a cost per student exceeding $17 had a disproportionate amount of assistance from the evaluators and/or the Partners staff.

TABLE 2. Cost Effectiveness of Implementing a Local School-Based Parental Consent Strategy

| School District | Returned Forms (N) | School Cost | Evaluation Cost | Total Cost | Costs per Returned Form |
|-----------------|--------------------|-------------|-----------------|------------|-------------------------|
| 1               | 655                | $1157       | $685            | $1842      | $2.81                   |
| 2               | 35                 | 536         | 566             | 1102       | 31.50                   |
| 3               | 130                | 833         | 661             | 1495       | 11.50                   |
| 4               | 265                | 1280        | 769             | 2049       | 7.73                    |
| 5               | 72                 | 584         | 699             | 1283       | 17.82                   |
| 6               | 57                 | 686         | 591             | 1277       | 22.40                   |
| 7               | 46                 | 627         | 641             | 1268       | 27.57                   |
| 8               | 105                | 261         | 861             | 1122       | 10.69                   |
| 9               | 241                | 473         | 904             | 1377       | 5.71                    |
| 10              | 77                 | 220         | 892             | 1112       | 14.44                   |
| Total           | 1683               | $6657       | $7269           | $13927     | $8.28 [1]               |

[1] When only grant costs of $7269 (evaluation and program staff costs) are taken into consideration, the cost per student is reduced from $8.28 to $4.63.

Much of the difference in non-personnel costs appears to be related to how the school districts were able to identify supplies and activities directly associated with obtaining parental consent. When only grant costs for personnel (evaluator and program staff) and non-personnel items are used to calculate cost effectiveness for the ten school districts, the cost is reduced from $8.28 to $4.63 per student (not presented in the table). However, approximately 20% of the evaluation budget allocated for collecting data from youth was spent on obtaining parental consent.

The cost effectiveness of the school-based strategy, $8.28 per student (or $4.63 if only grant costs are taken into consideration), is low in comparison with the results of other cost effectiveness analyses reported in the literature. In one study of active written parental consent costs, the researchers found that it took four weeks and $25.00 per student to achieve an 85% consent rate (Ellickson & Hawes, 1989). Another study requiring active parental consent in a high school where there was heavy public opposition to the research achieved a 65% consent rate at a cost of about $45.00 per student (Ellickson, 1994). Assuming a 72% consent rate is acceptable for a multiple cross-sectional evaluation, our study demonstrates that a school-based strategy is a cost-effective way to achieve the desired results, although a major consideration is the size of the school system and the number of students to be surveyed. The smaller the school system, the higher the cost per student; districts with fewer than 200 students to be surveyed may find that a local school-based strategy is not cost effective.

SUMMARY AND CONCLUSION

Because the disclosure of sensitive information constitutes a major risk to youth who participate in ATOD outcome evaluations, evaluators need to ensure that participants are protected and to deliver on that promise. Proponents of active written consent argue that this method is the best way to reduce the possibility of exposing youth to risks that their parents have not explicitly accepted.

Our study found that an active written consent strategy implemented in middle schools, with assistance from the superintendent's office and grant personnel, can produce an acceptable consent rate in most school districts involved in a multiple cross-sectional evaluation. We also found this strategy to be cost effective, even with extensive follow-up, though the cost effectiveness decreases significantly in small school districts. However, we believe that the parental consent rate produced by our local school-based strategy is not sufficiently high for a repeated measures evaluation that spans four years. Nevertheless, it is still much more effective than the 40% to 60% rates reported in a number of previous studies (Cross, 1994; Kearney et al., 1983; Lueptow et al., 1977; Severson & Biglan, 1989; Thompson, 1984).

These results do not refute the claims of passive parental consent proponents who contend that the passive strategy produces a higher return rate than an active consent strategy while still protecting the rights of youths (Cross, 1994; Severson & Biglan, 1989). Rather, the results suggest that if active parental consent is the only option available, then it is possible to achieve an acceptable, cost effective consent rate using our local school-based strategy.

Key factors that school representatives rated as important in implementing a successful strategy for obtaining written parental consent included (a) launching the strategy at the beginning of the school year, (b) gaining support from personnel at every level of the school district and school (especially the principal), (c) having the superintendent and principal sign the letter to parents, (d) having the superintendent mail the form home instead of sending it home via students, (e) offering incentives to youth, and (f) recruiting school contact persons who are empowered and committed to getting parental permission. We found that there are barriers to overcome in implementing a school-based parental consent strategy, and unless there is adequate school support with empowered and committed school contacts, the strategy will probably not produce an acceptable parental consent return rate.

NOTES

This study was supported by the Center for Substance Abuse Prevention Grant No. a HD45PO6877-01, Community Systems Research Institute, Inc., Seven Counties Services, Inc., and University of Louisville. We wish to thank Ms. Jeanne Anderson, Joyce Borders, and the editors for their helpful comments on earlier drafts.

REFERENCES

Anderman, C., Cheadle, A., Curry, S., & Diehr, P. (1995). Selection bias related to parental consent in school-based survey research. Evaluation Review, 19(6), 663-674.

Cross, H. D. (1994). An adolescent health and lifestyle guidance system. Adolescence, 29(114), 267-277.

Dent, C. W., Galaif, J., Sussman, S., & Stacy, A. (1993). Demographic, psychosocial and behavioral differences in samples of actively and passively consented adolescents. Addictive Behaviors, 18, 51-56.

Dent, C. W., Sussman, S. Y., & Stacy, A. W. (1997). The impact of a written parental consent policy on estimates from a school-based drug use survey. Evaluation Review, 21(6), 698-712.

Ellickson, P. L. (1989). Limiting nonresponse in longitudinal research: Three strategies for school-based studies. A Rand note. Santa Monica, CA: The Rand Corporation.

Ellickson, P. L. (1994). Getting and keeping schools and kids for evaluation. Journal of Community Psychology, CSAP Special Issue, 102-116.

Ellickson, P. L., & Hawes, J. A. (1989). An assessment of active versus passive methods for obtaining parental consent. Evaluation Review, 13(1), 45-55.

Gensheimer, L. K., Ayers, T. S., & Roosa, M. W. (1993). School-based preventive interventions for at-risk populations: Practical and ethical issues. Evaluation and Program Planning, 16(2), 159-167.

H. R. 1271, 104th Cong., 1st Sess. (1995).

Harrington, K. F., Binkley, D., Reynolds, K. D., Duvall, R. C., Copeland, J. R., Franklin, F., & Racqynski, J. (1997). Recruitment issues in school based research: Lessons learned from the High 5 Alabama Project. Journal of School Health, 67(10), 415-421.

Kearney, K. A., Hopkins, R. H., Mauss, A. L., & Weisheit, R. A. (1983). Sample bias resulting from a requirement for written parental consent. Public Opinion Quarterly, 47, 96-102.

Lueptow, L., Mueller, S. A., Hammes, R. R., & Master, L. S. (1977). The impact of informed consent regulations on response rate and response bias. Sociological Methods and Research, 6(2), 83-104.

Moberg, D. P., & Piper, D. L. (1990). Obtaining active parental consent via telephone in adolescent substance abuse prevention research. Evaluation Review, 14(3), 315-323.

Noll, R. B., Zeller, M. H., Vannatta, K., Bukowski, W. M., & Davies, W. H. (1997). Potential bias in classroom research: Comparison of children with permission and those who do not receive permission to participate. Journal of Clinical Child Psychology, 26(1), 36-42.

O'Donnell, L. N., Duran, R. H., San Doval, A., Breslin, M. J., Juhn, G. M., & Stueve, A. (1997). Obtaining written parent permission for school-based health surveys of urban young adolescents. Journal of Adolescent Health, 21, 376-383.

P. L. No. 103-227, 20 U.S.C.A. § 1232h (1994).

Renger, R., Gotkin, V., Crago, M., & Shisslak, C. (1998). Family Privacy Protection Act for research and evaluation involving minors. The American Journal of Evaluation, 19(2), 191-202.

Rossi, P., Freeman, H., & Lipsey, M. (1999). Evaluation: A systematic approach. Newbury Park, CA: Sage Publications.

Severson, H., & Ary, D. (1983). Sampling bias due to consent procedures with adolescents. Addictive Behaviors, 8, 433-437.

Severson, H., & Biglan, A. (1989). Rationale for the use of passive consent in smoking prevention research: Politics, policy and pragmatics. Preventive Medicine, 18, 267-279.

Suchman, E. (1967). Evaluative research. New York: Russell Sage.

Sussman, S., Dent, C. W., Stacy, A. W., Burciaga, C., Raynor, A., Turner, G. E., Ventura, C., Craig, S., Hansen, W. B., Burton, D., & Flay, B. R. (1990). Peer-group association and adolescent tobacco use. Journal of Abnormal Psychology, 99(4), 349-352.

Thompson, T. L. (1984). A comparison of methods of increasing parental consent rates in social research. Public Opinion Quarterly, 48, 779-787.

Yeager, J. D. (1994). Confidentiality of student records: A guide for school districts establishing policies and procedures with special emphasis on alcohol and other drug use. Portland, OR: Western Regional Center for Drug-Free Schools and Communities.