Pediatric Residents' Knowledge of the Community

Kimberly D. Northrip, MD, MPH; Heather M. Bush, PhD; Hsin-Fang Li, MA; Jennifer Marsh, JD, PhD; Candice Chen, MD, MPH; Mark F. Guagliardo, PhD

From the Division of General Pediatrics, University of Kentucky College of Medicine (Dr Northrip); Department of Biostatistics, University of Kentucky College of Public Health, Lexington, Ky (Dr Bush, Ms Li); Medical Services, Volunteer Support, Peace Corps Headquarters (Dr Marsh); Department of Health Policy, School of Public Health and Health Services, The George Washington University (Dr Chen), Washington, DC; and JG Analytics LLC, Burke, VA (Dr Guagliardo)

Address correspondence to Kimberly D. Northrip, MD, MPH, 800 Rose Street, MN 118, Lexington, KY 40536 (e-mail:
[email protected]). Received for publication January 31, 2011; accepted March 16, 2012.
ABSTRACT

OBJECTIVE: The purpose of this study was to examine pediatric residents' knowledge of the communities they serve through their continuity clinics.

DESIGN/METHODS: The community was identified for each of 6 continuity clinics at an urban children's hospital by geocoding patient addresses using GIS software (1 hospital-based clinic [n = 36], 1 primary care track site [n = 10], and 4 community clinics [n = 12]). We assessed resident and attending knowledge with a survey examining 7 content areas with basic questions about these communities. The survey answers were compared with publicly available community data.

RESULTS: A total of 37 of 57 eligible residents (65%) and 21 of their 23 attendings (91%) completed the survey. The residents achieved an overall mean score of 28.9% correct (SD 9.2), and attendings scored 42.6% (SD 19.7). Scores were significantly greater for community-based attendings overall (P < .002) and for community-based residents only in the schools content area (P < .001). However, community-based residents had poorer scores in the demographics/economics content area (P < .001). Scores were not correlated with year of residency.

CONCLUSIONS: Our pediatric professional organizations have recognized the importance of training residents in community pediatrics. This study is the first to describe resident community knowledge and to demonstrate that this knowledge is generally poor, with specific gaps in the content areas of schools, daycares, and health care access. There are differences in areas of knowledge between those working in hospital versus community clinics, suggesting this is an area for further investigation.
KEYWORDS: community pediatrics; medical education

ACADEMIC PEDIATRICS 2012;12:350–356
WHAT'S NEW

We found differences in pediatric residents' knowledge of the community between those trained in hospital-based versus community-based settings. Community-based attendings were more knowledgeable about the community than their hospital-based counterparts, suggesting potential for community pediatrics training in community-based sites.

The importance of community, social, and environmental factors in the health and well-being of children is increasingly recognized. The American Academy of Pediatrics (AAP),1 the Academic Pediatric Association,2 and the Accreditation Council for Graduate Medical Education (ACGME)3 have called attention to the importance of community physicians' knowledge of the neighborhoods they serve and have underscored the importance of knowledge standards, education, and assessment of trainees. In 1999, the AAP first published its policy statement on the pediatrician's role in the community. The AAP defines community pediatrics as integral to the professional duty of pediatricians, recommends that medical schools and residency programs include community pediatric competencies, and suggests that pediatricians become involved in student and resident education in the community setting.1

The ACGME added a community and child advocacy experience to the pediatric residency requirements in 1996. These requirements include "community-oriented care with focus on the health needs of all children within a community, particularly underserved populations" and call for training pediatric residents regarding education, childcare, advocacy, and legislation to achieve that goal.3 Professional groups and researchers have further defined what community and advocacy experiences should include (Table 1). The Dyson Foundation and the Academic Pediatric Association have developed guidelines and activities for meeting ACGME competencies in community pediatrics.2,4–6 Wright et al7 surveyed pediatric advocacy leaders to determine the necessary components of an advocacy-training curriculum and ranked those objectives using the Delphi technique. Several common content areas emerge from these documents. An understanding of the community is critical to the practice and training of pediatricians. Key areas of knowledge include community demographic/economic data, information about schools, daycares, resources, barriers to accessing health care, local governments, and sociocultural factors.
Table 1. Recommended Topics for Residency Training in Community Pediatrics

APA Topics for Training Pediatric Residents in Community Pediatrics2: delivery of culturally effective care; child advocacy; community and public health; medical home; educational and childcare settings; special populations; the pediatrician as consultant; research and scholarship.

Dyson Foundation Consensus Statement Resident Competencies1: child in the cultural, ethnic, and family context; medically underserved children; child in the community; practice management and community pediatrics; child in daycare and school settings; health care for children with chronic disease and terminal illness; child abuse, neglect, violence, and substance abuse; health care organization and financing.

Wright et al7 Top 5 Objectives for Pediatric Residency Advocacy Training: societal and cultural factors; barriers limiting access; community characteristics; how to access local leaders; community agencies and resources.
Several publications have described or studied specific community pediatrics curricula. These primarily examine changes in resident attitudes, advocacy skills, self-perceived competence, and understanding of living in poverty.8–13 However, minimal information exists regarding residents' and attendings' knowledge of their patients' communities, and residents' familiarity with the communities they serve through their continuity clinics is unclear. This study presents the knowledge of pediatric residents in an urban pediatric setting regarding the communities served by their continuity clinics. We hypothesized that residents' knowledge of these communities would be minimal. This study asked 3 additional questions: 1) Do increased years in training improve residents' knowledge of their clinic's community? 2) Do residents in community-based continuity clinics have more community knowledge than residents in hospital-based clinics? 3) Do supervising attendings in community-based sites have more community knowledge than their hospital-based colleagues?
METHODS

We conducted a cross-sectional survey of residents and attendings in a large urban pediatric residency to evaluate their knowledge of the community served by their continuity clinic. We obtained approval for this study from our institutional research board.

DEFINING COMMUNITIES

We defined each clinic community as the census tracts constituting 60% of the clinic's patient base over one year. We used ArcGIS 9.2 software to geocode the addresses of all patients visiting all clinics in that year and identified the census tracts with the largest patient counts summing to 60% or more as a given clinic's community. Publicly available information was then used to characterize each community regarding demographics, economic status of the population, schools, day cares, local government officials, and other resources.
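The tract-selection step can be summarized in a few lines of code. The sketch below is a minimal illustration under stated assumptions, not the authors' actual ArcGIS workflow: it assumes patient visits have already been geocoded and exported as one record per patient with a census tract identifier, and the column name and example tract IDs are hypothetical.

```python
import pandas as pd

def define_clinic_community(patients: pd.DataFrame, coverage: float = 0.60) -> list:
    """Return the census tracts that, taken in descending order of patient
    count, together contain at least `coverage` of the clinic's patients."""
    counts = patients["tract"].value_counts()       # patients per tract, largest first
    cumulative = counts.cumsum() / counts.sum()     # running share of patients covered
    # Keep every tract up to and including the first one at which the
    # running share reaches the coverage threshold.
    n_needed = min(int((cumulative < coverage).sum()) + 1, len(counts))
    return counts.index[:n_needed].tolist()

# Toy example with hypothetical tract IDs: the two largest tracts already
# cover 4 of 6 patients (67%), so only those tracts are returned.
patients = pd.DataFrame({"tract": ["001", "001", "002", "002", "003", "004"]})
print(define_clinic_community(patients))  # e.g., ['001', '002']
```

Defining each community by a cumulative patient-count cutoff, rather than a fixed radius around the clinic, lets the community reflect where a clinic's patients actually live.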
The clinics' communities were diverse (Table 2). Community-based clinic communities ranged from 18 to 45 census tracts in size. The hospital-based and primary care track clinic communities consisted of 85 and 108 tracts, respectively. Unemployment ranged from 8.2% to 23.1% and crime from 50 to 78 crimes per 1000 people.

SURVEY DEVELOPMENT

Because no standard questionnaire was available for this topic, the study team developed the survey based on recommendations from pediatric professional associations. The authors selected questions for each content area based on relevance to the clinic communities. Questions were designed to have objective answers that could be obtained from existing city or geocodable datasets. This limited the number of possible questions in certain content areas but allowed us to identify correct answers for each clinic. Subjects were first asked to identify the neighborhoods where the majority of their patients lived. The remaining questions examined knowledge of schools, child care centers, resources, transportation, violence, local leaders, health care, and demographics in the defined community (Tables 3 and 4). Finally, residents were asked to indicate where they learned about their clinic's community. We attempted to make the questions of similar difficulty, offering multiple-choice ranges for areas such as economic and crime statistics. To identify local resources, subjects were asked to name the resource and/or give its location, allowing the subject to demonstrate partial or more complete knowledge. Although a multiple-choice format would have allowed subjects to recognize a resource with a name they could not recall, this was not always feasible given the large number of resources across all clinic communities. Because the recommendations of our professional organizations did not value one topic over another, we considered all content areas and their questions to be of equal importance.
Table 2. Clinic Community Demographics

Hospital Based: ethnicities 87% AA,* 7.5% H; common languages English; unemployment rate 14.1%; crime rate per 1000, 56; census tracts in community, 85.
Primary Care Track: ethnicities 62% AA, 28% W; common languages English; unemployment rate 8.9%; crime rate per 1000, 50; census tracts in community, 108.
Comm. Based 1: ethnicities 97% AA, 1.5% W; common languages English; unemployment rate 20.9%; crime rate per 1000, 78; census tracts in community, 18.
Comm. Based 2: ethnicities 98% AA, 1% W; common languages English; unemployment rate 23.1%; crime rate per 1000, 59; census tracts in community, 15.
Comm. Based 3: ethnicities 58% AA, 24% H; common languages English, Spanish; unemployment rate 8.2%; crime rate per 1000, 59; census tracts in community, 28.
Comm. Based 4: ethnicities 78% AA, 13.2% H; common languages English; unemployment rate 12.2%; crime rate per 1000, 51; census tracts in community, 45.
*AA = African American; H = Hispanic; W = white.
One question had multiple subquestions (name an elementary, middle, and high school); these were treated as separate questions in the analysis. Questions with multiple correct answers (for example, select the 2 largest ethnic groups) were given a score for each participant and then averaged over all respondents to produce the mean score for that question. Local experts in community pediatrics (n = 4) were asked to review the survey for content validity, clarity, and relevance to the specific communities. We collected the following data for each participant: academic rank (resident or attending), assigned clinic, residents' year of training, attendings' number of years at the institution, and the number of days a week they attend continuity clinic. The survey was anonymous to avoid possible stigma associated with deficits in community knowledge. To preserve anonymity, no additional demographic data were collected.

STUDY POPULATION

We invited 6 resident continuity clinics at one urban training program to participate. These clinics were selected because they represent a variety of training venues, had adequate numbers of residents at each site, and were within the purview of our institutional research board. No clinic with only 1 resident was selected because of anonymity concerns. This left a limited set of eligible residents and attendings (57 and 23, respectively). Of the 6 continuity clinics participating in the study, 4 were community-based (n = 12). The other 2 clinics were on hospital campuses: one was a standard hospital-based continuity clinic (hospital-based; n = 36) and the other was the continuity clinic site for the primary care track (PCT; n = 10). These clinics primarily see underserved patients. At the hospital-based and community-based continuity clinics, categorical pediatric residents see patients half a day a week. The PCT focuses on outpatient experience; PCT residents have a 6-month block per year in continuity clinic as well as half a day per week the rest of the year. The PCT residents therefore have much greater exposure to their continuity clinic population. At the time of this survey, pediatric residents did not participate in a formal community pediatrics curriculum, but 2 community-based attendings were developing a curriculum with one of the residents involved in the study. In the early spring of 2007, we distributed anonymous surveys to all residents and supervising attendings in the participating clinics. Clinic directors were asked to remind subjects to complete the surveys and leave them in an envelope to be collected by the principal investigator after 1 month. To preserve anonymity, we were unable to specifically target nonresponders to increase our response rate. Instead we sent generalized reminders to the clinics twice during the study period.

STATISTICAL METHODS

Correct answers developed for each clinic using GIS (ie, geographic information systems) mapping and publicly available datasets were compared with subject responses.
Results were expressed as scores for each question, subscores for each content area, and a single overall numerical score, which provides an easily discussed but nonspecific assessment of differences in knowledge. Differences in grouped content area scores could largely be the result of a single question; therefore, content areas were scored and then each question within a content area was examined to determine areas of strength or weakness. When considering aggregate overall or content area scores, we calculated the percent correct for a set of questions for each physician. This value was then treated as a continuous variable and summarized with descriptive statistics (n, mean, SD). Physician type, year of experience, and clinic type were treated as independent variables in separate analyses. When summarizing responses for a single question, we used counts and percentages to describe the number with a correct response. To compare the 3 groups (community, hospital, and PCT), we performed an analysis of variance. Two-group comparisons were made with 2-sample t tests. A 2-sided significance level of .05 was used for all statistical tests. SAS v9.1 was used for all analyses. The sample sizes in this study are small, so comparisons between groups lack the power to detect moderate or small differences.
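For readers who want to reproduce this style of analysis, the sketch below illustrates the same comparisons in Python rather than SAS; it is not the authors' code. Per-physician percent-correct scores are simulated here (using the resident group means and SDs from Table 3 purely as example parameters), and the group labels and variable names are hypothetical.

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)

# Simulated per-physician records: clinic type and overall percent-correct score.
scores = pd.DataFrame({
    "clinic_type": ["community"] * 6 + ["hospital"] * 25 + ["pct"] * 6,
    "pct_correct": np.concatenate([
        rng.normal(32, 10, 6),   # community-based residents
        rng.normal(30, 8, 25),   # hospital-based residents
        rng.normal(29, 8, 6),    # primary care track residents
    ]),
})

# Three-group comparison (community vs hospital-based vs PCT): one-way ANOVA.
groups = [g["pct_correct"].to_numpy() for _, g in scores.groupby("clinic_type")]
f_stat, p_anova = stats.f_oneway(*groups)

# Two-group comparison (eg, community-based vs hospital-based): 2-sample t test,
# two-sided, judged against a .05 significance level.
community = scores.loc[scores["clinic_type"] == "community", "pct_correct"]
hospital = scores.loc[scores["clinic_type"] == "hospital", "pct_correct"]
t_stat, p_ttest = stats.ttest_ind(community, hospital)

print(f"ANOVA: F = {f_stat:.2f}, P = {p_anova:.3f}")
print(f"t test: t = {t_stat:.2f}, P = {p_ttest:.3f}")
```

With group sizes this small, such tests have limited power to detect moderate differences, as noted above.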
RESULTS

Across the 6 clinic sites, there were no significant differences in resident or attending staffing. Each clinic had a similar distribution of residents across the 3 years of training, and attending years of experience were similar. Residents spent more days per week working in the PCT site, as expected given the primary care focus of the track. Because the PCT is a different model of training on a hospital campus, results for PCT residents are mentioned only when they differ substantially from those of the hospital-based residents. (The scores listed for hospital-based residents and attendings are for the standard model of hospital-based continuity clinic and not a composite number that includes the PCT results.) The survey was completed by 36 of 57 eligible residents (63%) and 21 of 23 attendings (91%), for an overall physician response rate of 71.3%. Specific questions and the corresponding scores within each content area, classified by community-based, hospital-based, and PCT clinics for residents and attendings, are presented in Tables 3 and 4, respectively.

DEMOGRAPHICS/ECONOMICS

Regardless of clinic location, less than 10% of residents correctly identified the 3 neighborhoods containing the majority of clinic patients. Attendings also scored low on this question (44% community, 15% hospital-based). Most hospital-based attendings and residents correctly selected the 2 largest ethnic groups (86% and 92%) and the unemployment rate (64% and 76%) in their community. Community-based residents scored lower on these questions (42% ethnicity and 67% unemployment).
Table 3. Resident Physicians' Community Knowledge Scores by Clinic Location, Sorted by Theme

Values are the percent of resident respondents answering correctly in each group: Community (n = 6), Hospital-based (n = 25), and Primary Care Track (PCT; n = 6). Standard deviations are in parentheses beside mean scores.

Overall mean score (with SD): Community 32% (10%); Hospital-based 30% (8%); PCT 29% (8%)
Demographics/economics mean score**: Community 29% (9%); Hospital-based 45% (11%); PCT 40% (11%)
1. Circle the 3 neighborhoods in your community that contain the most patients from your continuity clinic: Community 6%; Hospital-based 9%; PCT 0%
2. Select the 2 largest ethnic groups living in the community served by your continuity clinic. Please select the make-up of the community, not the patient population in your office or clinic (though these may be the same): Community 42%; Hospital-based 92%; PCT 92%
4. Which of the following do you think was the appropriate unemployment rate for DC in 2004?: Community 33%; Hospital-based 24%; PCT 50%
5. Do you think the unemployment rate for your clinic's community was ______ (lower, about the same, greater)?: Community 67%; Hospital-based 76%; PCT 50%
Schools mean score**: Community 22% (17%); Hospital-based 0% (0%); PCT 11% (27%)
8a. Name one of the elementary schools in the clinic's community: Community 0%; Hospital-based 0%; PCT 0%
8b. Name one of the middle/Jr. high schools in the clinic's community: Community 0%; Hospital-based 0%; PCT 17%
8c. Name one of the high schools in the clinic's community: Community 67%; Hospital-based 0%; PCT 17%
Daycares mean score: Community 17% (41%); Hospital-based 4% (20%); PCT 0% (0%)
14. Name one daycare or childcare center that is located in this community: Community 17%; Hospital-based 4%; PCT 0%
Resources mean score: Community 33% (19%); Hospital-based 34% (12%); PCT 26% (11%)
9a. Name a supermarket or large grocery store in this community: Community 50%; Hospital-based 76%; PCT 83%
10. Name a pharmacy in this community: Community 67%; Hospital-based 88%; PCT 33%
12. Name a public library in this community: Community 0%; Hospital-based 0%; PCT 0%
13. Name a recreation center in this community: Community 33%; Hospital-based 4%; PCT 0%
15. Name a food pantry, soup kitchen, or homeless shelter located in this community: Community 0%; Hospital-based 4%; PCT 0%
16. Name and/or describe the location of the WIC (ie, Women, Infants, & Children) office nearest to your clinic: Community 50%; Hospital-based 76%; PCT 0%
17. Name the metro line that will get members of this community closest to the clinic: Community 52%; Hospital-based 56%; PCT 83%
18. Name one bus route number that will get patients from the community to the clinic: Community 17%; Hospital-based 4%; PCT 0%
24. Name or describe the location of a park, green-space, or playground in this community: Community 50%; Hospital-based 12%; PCT 17%
Barriers/access to health care mean score: Community 33% (26%); Hospital-based 21% (20%); PCT 38% (21%)
19. Name another primary health care office that sees children in this community (other than a CNMC facility): Community 17%; Hospital-based 28%; PCT 33%
20. Name the nearest hospital with an ER in or near the community (other than CNMC): Community 67%; Hospital-based 40%; PCT 100%
21. Name a mental health facility (other than CNMC) in this community: Community 17%; Hospital-based 4%; PCT 17%
22. Name a dentist (other than CNMC) in this community: Community 33%; Hospital-based 12%; PCT 0%
Local government mean score: Community 50% (0%); Hospital-based 32% (24%); PCT 25% (27%)
6. Name the Mayor of DC: Community 100%; Hospital-based 64%; PCT 50%
7. What is the name of the city council member for the city ward where your clinic's community is located?: Community 0%; Hospital-based 0%; PCT 0%
Social/cultural factors mean score: Community 42% (20%); Hospital-based 34% (32%); PCT 42% (38%)
3. Name the languages that are commonly spoken by more than 10% of people in this community: Community 50%; Hospital-based 20%; PCT 67%
23. Where do you think this community falls in the distribution of crime rate statistics in DC?: Community 33%; Hospital-based 48%; PCT 17%

CNMC = Children's National Medical Center. **P < .01; P values represent the results of a one-way analysis of variance to test differences between mean content area scores of the 3 groups.
SCHOOLS

No resident from any clinic correctly named an elementary school in their clinic's community, and only one correctly identified a middle school. Sixty-seven percent of community-based residents correctly identified a high school, whereas no hospital-based resident did. Like residents, attendings were more aware of local high schools (100% community, 73% hospital-based) but less informed about middle schools (33% community, 36% hospital-based) and elementary schools (67% community, 45% hospital-based).
DAYCARES

Fifty percent of community attendings, 55% of hospital-based attendings, and few residents (17% community, 4% hospital-based) were able to name a daycare or child care center located in the community.

RESOURCES

Mean scores in the resources content area were 73% for community-based attendings and 63% for hospital-based attendings. Resident scores were also similar by clinic type in this area, although PCT residents scored somewhat lower (33% community, 34% hospital-based, 26% PCT).
Table 4. Attending Physician Community Knowledge Scores by Clinic Location, Sorted by Theme

Values are the percent of attending respondents answering correctly in each group: Community (n = 6), Hospital-based (n = 11), and Primary Care Track (PCT; n = 4). Standard deviations are in parentheses beside mean scores.

Overall mean score** (with SD): Community 63% (7%); Hospital-based 52% (15%); PCT 21% (13%)
Demographics/economics mean score: Community 48% (12%); Hospital-based 44% (16%); PCT 36% (25%)
1. Circle the 3 neighborhoods in your community that contain the most patients from your continuity clinic: Community 44%; Hospital-based 15%; PCT 0%
2. Select the 2 largest ethnic groups living in the community served by your continuity clinic. Please select the make-up of the community, not the patient population in your office or clinic (though these may be the same): Community 58%; Hospital-based 86%; PCT 75%
4. Which of the following do you think was the appropriate unemployment rate for DC in 2004?: Community 33%; Hospital-based 27%; PCT 50%
5. Do you think the unemployment rate for your clinic's community was ______ (lower, about the same, higher)?: Community 50%; Hospital-based 64%; PCT 50%
Schools mean score*: Community 67% (30%); Hospital-based 52% (38%); PCT 8% (17%)
8a. Name one of the elementary schools in the clinic's community: Community 67%; Hospital-based 45%; PCT 0%
8b. Name one of the middle/Jr. high schools in the clinic's community: Community 33%; Hospital-based 36%; PCT 0%
8c. Name one of the high schools in the clinic's community: Community 100%; Hospital-based 73%; PCT 25%
Daycares mean score: Community 50% (55%); Hospital-based 55% (52%); PCT 0% (0%)
14. Name one daycare or child care center that is located in this community: Community 50%; Hospital-based 55%; PCT 0%
Resources mean score****: Community 73% (10%); Hospital-based 63% (13%); PCT 7% (9%)
9a. Name a supermarket or large grocery store in this community: Community 83%; Hospital-based 100%; PCT 0%
10. Name a pharmacy in this community: Community 100%; Hospital-based 100%; PCT 25%
12. Name a public library in this community: Community 50%; Hospital-based 0%; PCT 0%
13. Name a recreation center in this community: Community 100%; Hospital-based 55%; PCT 0%
15. Name a food pantry, soup kitchen, or homeless shelter located in this community: Community 17%; Hospital-based 9%; PCT 0%
16. Name and/or describe the location of the WIC (ie, Women, Infants, & Children) office nearest to your clinic: Community 83%; Hospital-based 91%; PCT 0%
17. Name the metro line that will get members of this community closest to the clinic: Community 100%; Hospital-based 82%; PCT 50%
18. Name one bus route number that will get patients from the community to the clinic: Community 17%; Hospital-based 45%; PCT 0%
24. Name or describe the location of a park, green-space, or playground in this community: Community 83%; Hospital-based 36%; PCT 0%
Barriers/access to health care mean score: Community 63% (26%); Hospital-based 41% (26%); PCT 25% (29%)
19. Name another primary health care office that sees children in this community (other than a CNMC facility): Community 67%; Hospital-based 55%; PCT 50%
20. Name the nearest hospital with an ER in or near the community (other than CNMC): Community 100%; Hospital-based 82%; PCT 50%
21. Name a mental health facility (other than CNMC) in this community: Community 50%; Hospital-based 18%; PCT 0%
22. Name a dentist (other than CNMC) in this community: Community 33%; Hospital-based 9%; PCT 0%
Local government mean score: Community 75% (27%); Hospital-based 59% (30%); PCT 50% (0%)
6. Name the Mayor of DC: Community 100%; Hospital-based 91%; PCT 100%
7. What is the name of the city council member for the city ward where your clinic's community is located?: Community 50%; Hospital-based 27%; PCT 0%
Social/cultural factors mean score: Community 50% (0%); Hospital-based 27% (26%); PCT 38% (25%)
3. Name the languages that are commonly spoken by more than 10% of people in this community: Community 83%; Hospital-based 0%; PCT 75%
23. Where do you think this community falls in the distribution of crime rate statistics in DC?: Community 17%; Hospital-based 55%; PCT 0%

CNMC = Children's National Medical Center. *P < .05; **P < .01; ****P < .0001. P values represent the results of a one-way analysis of variance to test differences between mean content area scores of the 3 groups.
As compared with hospital-based residents, PCT residents struggled with questions about a community WIC (Women, Infants, & Children) office (0% PCT, 76% hospital-based) and pharmacy (33% PCT, 88% hospital-based). Attendings and residents at all sites struggled with naming a community library, food pantry/homeless shelter, and bus route to the clinic.

BARRIERS/ACCESS TO HEALTH CARE

In this content area, residents and attendings were most aware of the nearest emergency department.
Attendings and residents had trouble naming a community mental health facility and dentist.

LOCAL GOVERNMENT

A total of 100% of community and 91% of hospital-based attendings, as well as 100% of community and 64% of hospital-based residents, were able to name the mayor of DC. No resident at any site could name the city council member for the ward where their clinic was located. Fifty percent of community-based and 27% of hospital-based attendings identified the clinic's council member.
SOCIAL/CULTURAL FACTORS

This is another content area in which PCT resident scores differed from those of their hospital-based peers. No hospital-based attendings and only 20% of their residents were able to correctly identify the languages spoken in the clinic's community, whereas PCT residents and community-based physicians could do so (67% PCT residents, 50% community residents, 83% community attendings). Attendings (17% community, 55% hospital) and residents (33% community, 48% hospital, 17% PCT) had difficulty identifying the community crime rate.

SOURCES OF INFORMATION

Residents most commonly reported learning about their community from 1) clinic patients and families (83.8%), 2) clinic preceptors (67.6%), and 3) other residents (24.3%). These responses were similar for residents in all settings except for the third response: no resident in the community-based clinics listed other residents as a source.

OVERALL PATTERNS

We analyzed overall scores and content area subscores for statistically significant patterns. Parametric and nonparametric tests were performed with similar results. Overall scores for residents were higher in the community clinics, followed by PCT residents, but this difference was not statistically significant. Community-based residents also scored higher in the schools content area (P < .001). In addition, higher scores were noted in the health care access and local government content areas (P = .13 and .14, respectively) but were not statistically significant. Community-based residents scored lower in demographics/economics than their peers (P < .001). Resident scores overall were low, particularly in the schools, daycares, and health care access content areas. Community-based attendings had higher scores than hospital-based and PCT attendings (P < .002). The only content areas to independently reach statistical significance were schools and resources. Scores did not improve with seniority as measured by year of residency or the number of years an attending had worked at a clinic.
DISCUSSION

This study assessed the knowledge of pediatric residents and attendings at an urban pediatric hospital about the communities their continuity clinics serve. Overall, residents' community knowledge scores were poor, with a mean score of 28.9% correct. Further, resident knowledge was uneven across content areas, with the highest scores in the demographic/economic, sociocultural, and local government content areas. Residents were weakest in schools, daycares, and health care access. Few residents could name a school or daycare in their area. More importantly, they were unable to identify dental, emergency department, or mental health facilities serving children in the community. These areas of weakness could directly affect a physician's ability to appropriately guide families in need of educational, dental, or mental health services in their community.
Residents reported learning about their community from patients, attendings, and other residents, all of which are clinic-based resources. They struggled with questions about crime and unemployment rates, city council members, bus routes, daycares, schools, libraries, food pantries, and homeless shelters. One possible explanation is that these topics may be less commonly discussed during routine clinic processes. Another area of weakness was health services not provided by their home institution. There are no data to explain why this was so. We speculate that residents might refer patients internally and therefore may be less aware of community-based medical resources.

Resident knowledge scores did not improve with year of training. The residency program involved in this study had no formal community pediatrics curriculum. Further study is needed to evaluate whether a formal curriculum with yearly learning objectives would improve performance on an objective test of community knowledge with each year of training.

Although higher than resident scores, attending scores were also low at 42.6% correct. Resident areas of weakness often mirrored attending weaknesses. This could be the result of shared variability in content area exposure in clinic. Alternatively, the areas of attending weakness may have led to decreased teaching in these content areas. Further study may clarify the relationship between attending and resident knowledge.

One commonly proposed strategy to prepare residents to function within a community is training residents in community-based clinics.1,2 In our study, community-based attendings were more knowledgeable about their community. We also saw increased knowledge among community-based residents, but it was not statistically significant except in the schools content area (P < .001). Community-based residents scored worse than their peers in demographics/economics (P < .001). There are several possible reasons for these results. Little difference may exist in the potential to acquire knowledge about a clinic's community due to location alone. If residents are primarily learning about the community from within their clinic, as reported here, then being physically present in the community may be less important. Previous studies of community rotations have demonstrated an improvement in self-assessed knowledge and comfort, but those rotations also included a structured curriculum.8–13 It may be the curriculum, not the location, that is important. However, attending community expertise was greater in community-based clinics, making them a potential source of knowledge. Community-based attendings demonstrated weakness in questions about unemployment, crime rates, middle schools, bus routes, dental care, and food pantries/homeless shelters. Areas of poor attending knowledge may contribute to the pattern of resident knowledge and would need to be remedied in order to optimize education in community pediatrics. A formal curriculum might augment the potential benefits of clinic location and attending knowledge by exposing residents to all 7 content areas and alternate information sources, including community members and quantitative
data sets. Further study is necessary to determine which elements of a community pediatrics curriculum would improve objectively measured community knowledge and whether its effect would be stronger in a community-based clinic.

Finally, the size of a clinic's community may affect community knowledge. The hospital-based and PCT clinics served much larger areas than the community-based clinics, including multiple neighborhoods. This could affect knowledge in several ways. Intimate knowledge of each neighborhood and its schools, resources, and barriers to health could be more difficult to acquire when working in larger clinics. This could be an alternate explanation for our results. It could also explain why community-based practices may be promising: the ability to intimately understand a single community may help residents see firsthand the relationship between community factors and health. However, this question was not within the scope of this study.

This study has several limitations. Subjects were all from one institution in one city, which limits the generalizability of the study. We also had a small subject pool with a fair response rate from the residents; therefore, the study may lack power to detect all possible differences between the groups. The study could also be affected by response bias; the survey was entirely anonymous to try to limit this. A future multicenter study would improve power and generalizability. One subject in this study was developing a community curriculum with 2 clinic attendings. This is a potential source of bias; however, this subject did not score higher than the community resident average, so we did not consider participation in curriculum development to be a factor in the results. The survey was a test of what the authors thought residents should know about these communities on the basis of the recommendations of our professional organizations. Although it was evaluated by local experts for content validity and clarity, the survey was not a validated tool. It is possible the questions were too difficult or did not represent all views of adequate community knowledge. We have included the questions and their results to provide our readers with context.
CONCLUSIONS

This study lends some support to the AAP recommendations for community-based experiences for pediatric residents. However, at our institution, working solely in the
community did not equate to better scores across all content areas. Community-based clinical training demonstrates potential for improving resident knowledge of the community; more study is needed to determine how best to optimize the experience. Without a universal community pediatrics curriculum, most residents obtained their knowledge of the community from sources within the clinic. Formal community-based curricula that consistently address all 7 content areas and encourage outside sources of learning may be a promising area for exploration. Lastly, more study is needed to determine whether improved knowledge of the community translates into improved skill at working with patients within the community to improve health.
REFERENCES

1. American Academy of Pediatrics Committee on Community Health Services. The pediatrician's role in community pediatrics. Pediatrics. 2005;115:1092–1094.
2. Academic Pediatric Association. Educational guidelines for residency training in general pediatrics. Available at: http://www.academicpeds.org/egwebnew. Accessed March 28, 2012.
3. Accreditation Council for Graduate Medical Education. ACGME program requirements for graduate medical education in pediatrics. 2007. Available at: http://www.acgme.org/acWebsite/downloads/RRC_progReq/320_pediatrics_07012007.pdf. Accessed March 28, 2012.
4. Shipley LJ, Stelzner SM, Zenni EA, et al. Teaching community pediatrics to pediatric residents: strategic approaches and successful models for education in community health and child advocacy. Pediatrics. 2005;115:1150–1157.
5. Garfunkel L, Sidelinger D, Rezet B, et al. Achieving consensus on competency in community pediatrics. Pediatrics. 2005;115:1167–1171.
6. Rezet B, Risko W, Blaschke GS. Competency in community pediatrics: consensus statement of the Dyson Initiative Curriculum Committee. Pediatrics. 2005;115:1172–1183.
7. Wright C, Katcher M, Blatt S, et al. Toward the development of advocacy training curricula for pediatric residents: a national Delphi study. Ambul Pediatr. 2005;5:165–171.
8. Takagishi J, Christner J, McCoy R, et al. Lessons learned from pediatric residents on a community pediatrics rotation. Clin Pediatr. 2006;45:239–244.
9. Shope T, Bradley B, Taras H. A block rotation in community pediatrics. Pediatrics. 1999;104:143–147.
10. Kaczorowski J, Aligne CA, Halterman JS, et al. A block rotation in community health and child advocacy: improved competency of pediatric residency graduates. Ambul Pediatr. 2004;4:283–288.
11. Chin NP, Aligne CA, Stronczek A, et al. Evaluation of a community-based pediatrics residency rotation using narrative analysis. Acad Med. 2003;78:1266–1270.
12. Olson CA, Stoddard J, DeMuri G. A community pediatrics/public health rotation for pediatric residents. Acad Med. 1998;73:598–599.
13. Lozano P, Biggs VM, Sibley BJ, et al. Advocacy training during pediatric residency. Pediatrics. 1994;94:532–536.