Currents in Pharmacy Teaching and Learning 12 (2020) 14–19
https://doi.org/10.1016/j.cptl.2019.10.009

Research Paper

Length of advanced pharmacy practice experience and first-time NAPLEX pass rate of US pharmacy programs


L. Douglas Ried, College of Pharmacy, University of New Mexico, 2502 Marble Ave., Albuquerque, NM 87106, United States

ARTICLE INFO

Keywords: Advanced pharmacy practice experiences (APPE); North American Pharmacist Licensure Examination (NAPLEX); Assessment; Experiential education; Licensure; Pharmacy education

ABSTRACT

Introduction: The objectives of this study were to (1) report the length in weeks of advanced pharmacy practice experiences (APPEs) of US pharmacy programs in 2016 and (2) compare first-time North American Pharmacist Licensure Examination (NAPLEX) pass rates according to the length in weeks of the programs' APPEs.

Methods: First-time NAPLEX pass rate was obtained from the National Association of Boards of Pharmacy public web page. The length in weeks of programs' individual APPEs (iAPPEs) and program characteristics were obtained from the individual pharmacy programs' web pages. Analysis of variance was used to compare iAPPE length and first-time NAPLEX pass rate, and multiple regression was used to quantify the independent influence of iAPPE length on first-time NAPLEX pass rate.

Results: iAPPE lengths were fairly evenly distributed among four-, five- and six-week rotations for NAPLEX testing years 2013 to 2015, although six-week iAPPEs have been preferred recently. The first-time NAPLEX pass rate was not associated with total APPE length or with whether the program used four-, five- or six-week iAPPEs, for any of the three years or for the three-year aggregate pass rate.

Conclusion: Six-week iAPPEs were the most common but were not used by a majority of pharmacy programs. Longer total or individual APPEs did not translate into higher first-time NAPLEX pass rates. The length of iAPPE rotations can be chosen without concern that programs' first-time NAPLEX pass rates will be significantly affected.

Introduction

An important decision for the designers of any pharmacy curriculum is the length of the advanced pharmacy practice experiences (APPEs). The APPEs typically constitute 25% or more of pharmacy curricula and are the capstone of the professional education experience. The Accreditation Council for Pharmacy Education (ACPE) Standards require at least 1440 hours of APPEs as a structural component of an accredited curriculum.1 At a minimum, Standard 13.6 requires community pharmacy, ambulatory patient care, hospital pharmacy and inpatient general medicine APPEs.1

However, information regarding the typical length of individual APPEs (iAPPEs) is sparse or non-existent. For example, a faculty member requested information regarding the national distribution of iAPPE lengths in the AACP Experiential Education Section digest; the request went unanswered.2 The evidence base regarding the optimal length for iAPPEs seems limited to a single study,3 supplemented only by commentaries.4,5 That study's three-part intervention included an eight-week rotation in a single community pharmacy compared with four-week rotations in two control community pharmacies.



The other two aspects of the intervention were a five-day, asynchronous student site orientation and a preceptor training program specific to the required student outcomes. Students' performances were compared on educational and clinical outcomes.3 However, the study's design did not allow the authors to disentangle the impact of rotation length (two four-week rotations versus one eight-week rotation) from the other aspects of the intervention, leaving the effects confounded.

The optimal iAPPE length seems to depend upon value-laden criteria with little agreement.3,4 That is likely because, when it comes to meeting ACPE standards,1 most assessments are designed, conducted and reported by individual pharmacy programs using unique measures. Therefore, pharmacy programs' outcomes are difficult to compare because of the plethora of unstandardized measures. One important exception is the North American Pharmacist Licensure Examination (NAPLEX). It is appropriate temporally because student pharmacists take the NAPLEX after they complete their APPE rotations. Moreover, every pharmacy program is required to publish its graduates' first-time NAPLEX pass rate, and ACPE requires it to be included in programs' self-studies.1 Finally, the NAPLEX is a required component of the licensure process. It is used by boards of pharmacy as a measure of minimum competency, and passing is required for licensure in all 50 states.6

Studies of predictors of success on the NAPLEX have been conducted by individual pharmacy programs,7–10 with one exception.11 Continuous quality improvement (CQI) principles suggest that structure and process can be modified to improve programs' outcomes.12 The one exception, a multivariable analysis, found programmatic structure and process variables to be non-significant; it reported that postgraduate year one (PGY1) residency match and the previous year's NAPLEX performance significantly predicted current NAPLEX performance.11 However, that final model may have been overly influenced by autocorrelation and multicollinearity because it used temporally adjacent outcomes, rather than programmatic structural resources and processes, to predict subsequent outcomes.11

In summary, the evidence regarding the association between NAPLEX performance and the length of iAPPE rotations or the total number of APPE (tAPPE) weeks is scarce or non-existent. This study had two objectives: to report the length in weeks of APPEs of US pharmacy programs at the beginning of 2016 and to compare first-time NAPLEX pass rates according to the length of APPEs.

Methods

The study population was the census of schools and colleges of pharmacy in the US at the beginning of 2016.13 Fifteen programs were not fully accredited; therefore, their students had not taken the NAPLEX. Length of APPEs was unavailable for five programs.

Programs' first-time NAPLEX pass rates are available annually for pharmacy programs in the United States in the score results section of the National Association of Boards of Pharmacy (NABP) web page.14 School-by-school reports include the pass rate within the same year for all candidates who took the exam for the first time. NAPLEX score results are reported only for the state selected by the student as their primary jurisdiction for licensure. The pass rate and number of first-time NAPLEX takers were collected for each program for individual testing years 2013, 2014 and 2015. The weighted average was used to calculate each program's aggregate first-time NAPLEX pass rate for the three-year period 2013 to 2015.

Data regarding iAPPE length were gathered from pharmacy programs' web pages. If the information was unavailable there, individual programs' experiential directors were contacted by email or by telephone. All iAPPEs were found to be four, five, or six weeks in length; one-month-long iAPPEs were classified as four weeks. For the multivariable regression model, dummy variables were created for the four- and five-week iAPPEs, with six-week iAPPEs as the reference category. The number of tAPPE weeks was either specified on the programs' web pages or was calculated as the number of weeks per iAPPE multiplied by the number of required APPEs. Length of APPEs did not change during the time frame in a way that would have affected the reported NAPLEX scores.

Pharmacy programs were classified by whether they were located at a research-intensive university,15 an academic health center,16 or a public university16 (0 = no; 1 = yes); whether a bachelor's degree was required or preferred for admission (0 = no; 1 = yes); and the year the program was established.16 The latter was dichotomized into whether the program was in existence prior to 2000 (0 = no; 1 = yes) to approximate the start of the requirement that the doctor of pharmacy be the entry-level degree.11

Proportions are reported for categorical variables. Means and standard deviations were used to describe normally distributed numeric data, including the number of tAPPE weeks for each pharmacy program. Medians and ranges were used to describe non-normally distributed data. Analysis of variance (ANOVA) was used to compare first-time NAPLEX pass rates. All five of the programmatic control variables were first entered as a group into the ordinary least squares (OLS) regression model. The iAPPE length dummy variables were entered next as a group. The length of iAPPEs would independently add to the prediction of first-time NAPLEX pass rate if the R2 change was statistically significant.17 Data were analyzed using IBM SPSS Statistics, Version 24.0 (IBM, Armonk, NY).18 The study's protocol was approved by the institutional review board at the University of Texas at Tyler and conducted according to the principles of the Declaration of Helsinki.
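The hierarchical (two-step) regression described above was run in SPSS. As a rough illustration only, the same strategy — entering the five programmatic controls first and then testing the R2 change when the iAPPE length dummies are added — might look as follows in Python; the data, column names and helper function are hypothetical stand-ins rather than the study's actual code or dataset.

```python
# Illustrative sketch only: the study used IBM SPSS. This Python/statsmodels
# translation of the two-step (hierarchical) OLS model is an assumption about
# how the analysis could be reproduced, with invented column names and data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

def r2_change_test(y, X_reduced, X_full):
    """F-test of the R2 change when the iAPPE length dummies are added
    to the five programmatic control variables (hierarchical entry)."""
    reduced = sm.OLS(y, sm.add_constant(X_reduced)).fit()
    full = sm.OLS(y, sm.add_constant(X_full)).fit()
    df_num = X_full.shape[1] - X_reduced.shape[1]   # number of added predictors
    df_den = len(y) - X_full.shape[1] - 1           # residual df of the full model
    r2_change = full.rsquared - reduced.rsquared
    f_change = (r2_change / df_num) / ((1 - full.rsquared) / df_den)
    return r2_change, f_change, stats.f.sf(f_change, df_num, df_den)

# A program's aggregate 2013-2015 outcome is its pass rate weighted by the
# number of first-time candidates each year, e.g.:
#   np.average([95.7, 94.8, 92.4], weights=[110, 118, 122])

# Invented program-level data standing in for the real census of programs.
rng = np.random.default_rng(0)
n = 120
programs = pd.DataFrame({
    "research_intensive":     rng.integers(0, 2, n),
    "academic_health_center": rng.integers(0, 2, n),
    "public":                 rng.integers(0, 2, n),
    "bachelors_required":     rng.integers(0, 2, n),
    "existed_before_2000":    rng.integers(0, 2, n),
    "iappe_weeks":            rng.choice([4, 5, 6], n),
})
programs["pass_rate"] = (92 + 2 * programs["academic_health_center"]
                         + 2 * programs["existed_before_2000"]
                         + rng.normal(0, 4, n))

controls = ["research_intensive", "academic_health_center", "public",
            "bachelors_required", "existed_before_2000"]
# Dummy-code iAPPE length with six-week rotations as the reference category.
dummies = pd.get_dummies(programs["iappe_weeks"], prefix="wk",
                         dtype=float).drop(columns="wk_6")
r2_chg, f_chg, p_chg = r2_change_test(programs["pass_rate"],
                                      programs[controls].astype(float),
                                      programs[controls].astype(float).join(dummies))
print(f"R2 change = {r2_chg:.3f}, F = {f_chg:.2f}, p = {p_chg:.2f}")
```
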
Results

At the beginning of 2016, there were 130 pharmacy programs in the United States. First-time NAPLEX pass rates were reported for programs in 2013 (n = 115), 2014 (n = 121), and 2015 (n = 125). At the beginning of 2016, just over 51% (n = 62) of the programs were located at a public university. Just over one-third of programs (n = 41) were located at a research-intensive university and 37% at an academic health center (n = 45). Over two-thirds of the programs did not require or prefer a bachelor's degree for admission and existed before 2000 (n = 83). The median number of years in existence was 93 years for pharmacy programs with 4-week iAPPEs (range = 8 to 166 years), 105 years for those with 5-week iAPPEs (range = 7 to 195 years) and 42 years for those with 6-week iAPPEs (range = 6 to 193 years).


Table 1. Analysis of variance comparing the average NAPLEX first-time pass rate by the number of weeks for individual APPEs.

| Testing year | APPE length (n, %) | Mean NAPLEX % pass rate (SD) | 95% CI (lower to upper bound) | F-ratio (p-value) |
|---|---|---|---|---|
| 2013 | 4 weeks (n = 33, 28.7%) | 95.6 (6.1) | 93.4 to 97.8 | 0.01 (0.99) |
| | 5 weeks (n = 34, 29.6%) | 95.7 (5.7) | 93.7 to 97.7 | |
| | 6 weeks (n = 48, 41.7%) | 95.7 (3.9) | 94.6 to 96.9 | |
| | Total (n = 115) | 95.7 (5.1) | 94.7 to 96.6 | |
| 2014 | 4 weeks (n = 33, 27.3%) | 95.4 (5.0) | 93.6 to 97.2 | 1.29 (0.28) |
| | 5 weeks (n = 36, 29.8%) | 93.8 (5.3) | 92.0 to 95.6 | |
| | 6 weeks (n = 52, 43.0%) | 95.2 (3.8) | 94.1 to 96.2 | |
| | Total (n = 121) | 94.8 (4.6) | 94.0 to 95.7 | |
| 2015 | 4 weeks (n = 33, 26.4%) | 93.2 (5.2) | 91.3 to 95.0 | 1.25 (0.29) |
| | 5 weeks (n = 36, 28.8%) | 93.0 (5.3) | 91.2 to 94.8 | |
| | 6 weeks (n = 56, 44.8%) | 91.4 (3.9) | 89.8 to 93.2 | |
| | Total (n = 125) | 92.4 (5.8) | 91.3 to 93.4 | |
| Aggregate weighted average 2013 to 2015 | 4 weeks (n = 33, 28.7%) | 94.74 (5.0) | 73.0 to 100.0 | 0.10 (0.90) |
| | 5 weeks (n = 34, 29.6%) | 94.60 (4.2) | 76.0 to 99.5 | |
| | 6 weeks (n = 48, 41.7%) | 94.30 (4.1) | 82.0 to 99.4 | |
| | Total (n = 115) | 94.52 (4.4) | 73.5 to 100.0 | |

APPE = advanced pharmacy practice experience; NAPLEX = North American Pharmacist Licensure Examination.
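The year-by-year comparisons in Table 1 are one-way analyses of variance across the three iAPPE-length groups. The study itself used SPSS; the following minimal Python sketch, with an invented miniature dataset and hypothetical column names (iappe_weeks, pass_rate), only illustrates the form of that comparison.

```python
# Illustrative sketch (not the authors' SPSS code): one-way ANOVA comparing
# programs' first-time NAPLEX pass rates across 4-, 5-, and 6-week iAPPE groups.
# Column names ("iappe_weeks", "pass_rate") and values are hypothetical.
import pandas as pd
from scipy import stats

def anova_by_iappe_length(programs: pd.DataFrame) -> None:
    """Compare mean pass rate across iAPPE length groups, as in Table 1."""
    groups = [g["pass_rate"].dropna() for _, g in programs.groupby("iappe_weeks")]
    f_ratio, p_value = stats.f_oneway(*groups)
    summary = programs.groupby("iappe_weeks")["pass_rate"].agg(["count", "mean", "std"])
    print(summary.round(1))
    print(f"F = {f_ratio:.2f}, p = {p_value:.2f}")

# Example with made-up pass rates for three programs per group:
df = pd.DataFrame({
    "iappe_weeks": [4, 4, 4, 5, 5, 5, 6, 6, 6],
    "pass_rate":   [95.6, 94.0, 97.1, 95.7, 93.2, 96.4, 95.7, 96.0, 94.9],
})
anova_by_iappe_length(df)
```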

The first objective was to report the length in weeks of APPEs of US pharmacy programs at the beginning of 2016. The tAPPEs averaged 39.2 weeks (SD = 3.9) across all programs, with a median of 40 weeks (range = 35 to 58). The iAPPEs were roughly evenly distributed in 2013, with six-week iAPPEs the most frequent (n = 48). The number of programs with four-week iAPPEs was the same from 2013 to 2015 (n = 33), and the number with five-week iAPPEs increased by two over the same time frame (n = 34 to n = 36). The distribution of four- and five-week iAPPEs therefore remained essentially constant for NAPLEX administration years 2014 to 2015, whereas the number of programs with six-week iAPPEs increased from 48 to 56 between 2013 and 2015. Across all eligible programs, the proportions of four-week and five-week iAPPEs declined by 2% and 1%, respectively, from 2013 to 2015, and the proportion of six-week iAPPEs increased by 3%. Six-week iAPPEs (n = 8) were also the most common among the ten candidate programs not included in the statistical analysis. Of eligible programs starting before 2000 (n = 83), 32.6% had four-week, 30.1% had five-week and 37.3% had six-week iAPPEs. In contrast, among programs starting in 2000 or later (n = 42), 14.3% had four-week, 26.2% had five-week and 59.5% had six-week iAPPEs (chi-square = 6.73, df = 2, p < .03). The linear trend analysis showed an upward trend indicating longer iAPPEs over time (chi-square = 6.67, df = 1, p = .01).

The second objective was to compare first-time NAPLEX pass rates according to the length of APPEs. Zero-order correlations between the number of tAPPE weeks and first-time pass rates were not statistically significant for any of the three individual testing years (Pearson r = −0.09, −0.03 and −0.03; p > .05) or for the average of the 2013 to 2015 administrations (Pearson r = −0.09; p > .05). The first-time pass rate was similar for programs with four-, five- and six-week iAPPEs in each of the three testing years, as well as for the average of testing years 2013 to 2015 (Table 1). In the final analysis, the programmatic control variables were entered first into the OLS regression model. The two programmatic control variables that predicted higher first-time NAPLEX pass rates for one or more testing periods were location at an academic medical center and existence of the program before 2000 (Table 2). When Bonferroni post hoc adjustments were made for Type I error (p < 0.0125), these two variables remained statistically significant for at least one analysis in 2015 and for the three-year average. The R2 change was not statistically significant after adding the dummy variables representing the lengths of iAPPEs in any of the models. In other words, after controlling for the other structural factors, iAPPE length had no independent influence on programs' NAPLEX pass rates. These two findings suggest that programmatic structural factors other than length of iAPPE warrant further investigation as predictors of NAPLEX success.
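The overall chi-square (6.73, df = 2) and the df = 1 linear trend statistic (6.67) reported above concern the association between program era (before 2000 versus 2000 or later) and iAPPE length. The exact trend test is not named in the text; the sketch below assumes it is the linear-by-linear association statistic that SPSS reports, (N − 1) × r2, and back-calculates approximate cell counts from the reported group sizes and percentages.

```python
# Illustrative sketch under an assumption: the "linear trend" statistic is taken
# to be the linear-by-linear association test, M^2 = (N - 1) * r^2 with 1 df.
# Cell counts are back-calculated from reported totals and percentages.
import numpy as np
from scipy import stats

# Rows: programs established before 2000 vs. in/after 2000.
# Columns: 4-, 5-, 6-week iAPPEs.
table = np.array([[27, 25, 31],
                  [ 6, 11, 25]])

chi2, p, dof, _ = stats.chi2_contingency(table, correction=False)
print(f"Pearson chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")

# Linear-by-linear trend: correlate the ordinal row and column scores,
# weighted by the cell counts, then compute (N - 1) * r^2 with 1 df.
row_scores = np.repeat([0, 1], table.sum(axis=1))
col_scores = np.concatenate([np.repeat([4, 5, 6], row) for row in table])
r = np.corrcoef(row_scores, col_scores)[0, 1]
n = table.sum()
m2 = (n - 1) * r**2
print(f"Linear-by-linear chi-square = {m2:.2f}, df = 1, "
      f"p = {stats.chi2.sf(m2, df=1):.3f}")
```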


Table 2. Ordinary least squares (OLS) regression of program characteristics and length of individual APPEs on NAPLEX first-time pass rate for individual testing years 2013, 2014 and 2015 and for the weighted average pass rate for 2013 to 2015. Values for predictors are standardized beta coefficients (p-value).

| Predictor / model statistic | 2013 Step 1 | 2013 Step 2 | 2014 Step 1 | 2014 Step 2 | 2015 Step 1 | 2015 Step 2 | Weighted average Step 1 | Weighted average Step 2 |
|---|---|---|---|---|---|---|---|---|
| Located at a research-intensive institution | 0.13 (0.30) | 0.12 (0.32) | 0.19 (0.12) | 0.17 (0.16) | 0.14 (0.26) | 0.15 (0.22) | 0.17 (0.16) | 0.17 (0.16) |
| Located at an academic medical center | 0.18 (0.08) | 0.18 (0.07) | 0.22 (0.03) | 0.23 (0.02) | 0.18 (0.06) | 0.17 (0.08) | 0.25 (0.01†) | 0.25 (0.01†) |
| Located at a public institution | 0.06 (0.64) | 0.05 (0.67) | −0.03 (0.79) | −0.04 (0.76) | −0.04 (0.76) | −0.03 (0.81) | −0.01 (0.90) | −0.01 (0.91) |
| Bachelor's degree required or preferred | −0.08 (0.39) | −0.09 (0.31) | 0.01 (0.90) | −0.00 (0.96) | 0.05 (0.55) | 0.07 (0.43) | 0.00 (0.97) | 0.00 (0.96) |
| Program existed before 2000 | 0.20 (0.04) | 0.21 (0.03) | 0.16 (0.10) | 0.18 (0.08) | 0.25 (0.01†) | 0.23 (0.03) | 0.19 (0.05) | 0.19 (0.05) |
| 4-week APPE | – | −0.09 (0.40) | – | −0.04 (0.69) | – | 0.08 (0.43) | – | −0.00 (0.99) |
| 5-week APPE | – | −0.03 (0.79) | – | −0.15 (0.12) | – | 0.11 (0.23) | – | 0.02 (0.79) |
| R2 | 0.18 | 0.18 | 0.17 | 0.19 | 0.17 | 0.18 | 0.20 | 0.20 |
| Adjusted R2 | 0.14 | 0.13 | 0.14 | 0.14 | 0.13 | 0.13 | 0.16 | 0.15 |
| Model F-ratio | 4.71 | 3.43 | 4.77 | 3.79 | 4.73 | 3.59 | 5.41 | 3.81 |
| Model significance (p-value) | 0.001 | 0.002 | 0.001 | 0.001 | 0.001 | 0.002 | <0.001 | 0.001 |
| Degrees of freedom (df) | 5, 109 | 7, 107 | 5, 115 | 7, 113 | 5, 118 | 7, 116 | 5, 109 | 7, 107 |
| R2 change | – | 0.005 | – | 0.018 | – | 0.011 | – | 0.00 |
| F-ratio of change | – | 0.37 | – | 1.26 | – | 0.79 | – | 0.04 |
| Significance of R2 change (p-value) | – | 0.69 | – | 0.29 | – | 0.46 | – | 0.96 |

Beta = standardized ordinary least squares regression coefficient. † p < 0.0125. APPE = advanced pharmacy practice experience; NAPLEX = North American Pharmacist Licensure Examination.

Discussion

APPEs are a vital component of pharmacy education and an ACPE accreditation requirement.1 ACPE requires the use of assessment measures to inform student learning and to ascertain student pharmacists' achievement of educational outcomes (Standard 24.4). Given that they are so vital, this study examined a structural component of APPEs at the programmatic level using a standardized professional outcome. Few if any studies have examined the association of structural or process APPE characteristics with NAPLEX results at the programmatic level, and none have examined the association of programs' iAPPE or tAPPE length with first-time NAPLEX pass rate.

Until recently, pharmacy programs had not overwhelmingly favored four, five or six weeks as the preferred length of iAPPEs. An indirect measure of the recent preference for six-week rotations is the substantially lower median age of the pharmacy programs incorporating six-week iAPPEs (42 years, versus 93 and 105 years for four-week and five-week programs, respectively). Although the six-week iAPPE is the most common, fewer than half of pharmacy programs use it.

A plausible reason for the recent preference for longer iAPPEs, consistent with these findings, may be the difficulty of finding sufficient sites for quality practice experiences.19 Not only has the number of pharmacy schools increased over the past two decades, but the number of students enrolled in established programs has increased as well. Therefore, more sites are needed across the country.20 For example, under the ACPE standard requirements, a minimum of 36 weeks of APPEs would require six sites if iAPPEs were six weeks in length but nine sites for four-week iAPPEs, a 50% increase. This study's findings are consistent with the hypothesis that the shortage of, and need for, high-quality clinical rotations may have contributed to the lengthening of iAPPEs.19–21 Another possible reason is that established pharmacy programs changed the length of iAPPEs in response to the CAPE outcomes of 2013. Since 2013, 64% of new programs adopted six-week iAPPEs, whereas 36% adopted four- or five-week iAPPEs. It may be that longer iAPPEs were felt to be needed to meet those curricular requirements.

However, assessment of the optimal iAPPE length requires validated, standardized outcome measures gathered during the actual experience. Academic pharmacy has neither voluntary nor mandatory standardized instruments, such as the Physical Therapist Clinical Performance Instrument (PT-CPI).22 Equally disconcerting, academic pharmacy has neither commonly accepted nor required APPE clinical education outcome measures. Academic pharmacy needs to invest in efforts to standardize and/or improve measures to ensure APPEs meet performance quality standards.23 Until then, the validity of APPE performance outcomes will be variable, and program-to-program comparisons about optimal length and iAPPE outcomes will be hampered.24 That said, while longer iAPPEs were most common, they did not translate into higher first-time pass rates in any of the three administration cycles.

A final reason for studying APPE length and first-time pass rate is that prospective students are likely to use the NAPLEX pass rate as an indicator of program quality when making school choice decisions.25 When making those choices, potential students are less likely to be impressed by claims of superior professionalism and co-curricular outcomes than by a greater likelihood of passing the licensure exam.3,4,25 Therefore, modifiable structure and process quality factors, such as optimal iAPPE and tAPPE length, should be examined for their impact on first-time NAPLEX pass rate.


This study has limitations that warrant caution against overinterpretation of these findings for programmatic decision making; however, it still provides direction for research. Curricular or logistical reasons for programs' choices of iAPPE and tAPPE lengths were not examined. Unlike first-time NAPLEX pass rate, data to answer those questions are unavailable or too difficult to access at the program level across academic pharmacy. Reasons for programs' APPE length choices include site capacity26; student, faculty and preceptor stress; the need to physically move between sites fewer times27,28; productive use of preceptors' time; and preceptors' satisfaction with the APPE program's logistics and support.5 Longer iAPPEs require fewer site orientations and allow student pharmacists to become more familiar with experiential sites' operations and distribution systems. These economies allow student pharmacists to focus more on the clinical learning experience.27 Other reasons include expanded opportunities for student pharmacists to meet important professional goals and co-curricular accreditation standards,29 including professionalism and entrustable professional activities, exposure to a broader population of patients, and cultural diversity, among others.3–5 Finally, iAPPE length choices may be based on program-specific reasons, including tradition, student and faculty preferences,25 and scheduling requirements to promote collaboration among consortia30; none of these has been shown to be related to academic performance. Therefore, while limited, this study provides a modicum of evidence for discussions about the minimum iAPPE length.3,4 However, these results should be used only for speculation and hypothesis generation because those other research questions are beyond the scope of these data.24

An advantage of the present study is that it does not rely on a sample of students attending an individual pharmacy program to represent the census of programs. Using the results of studies conducted at single programs as predictors for programmatic assessment and decision-making poses a serious conceptual and methodological challenge known as the atomistic fallacy and may result in Simpson's paradox. The fallacy arises because associations between two variables at the individual respondent level may differ from associations at the programmatic or group level.31 The best predictor of a program's NAPLEX success may not be individual students' PCAT scores or first-semester grade point averages, but rather programmatic structure and process predictors. The one study of programmatic predictors of NAPLEX pass rates found the proportion of PGY1 residency matches and previous years' NAPLEX pass rates to be the optimal predictors of the current year's pass rate.11 The current study found location at an academic health center and program age to be significant predictors. Therefore, additional work is needed to examine program-level data and success on the NAPLEX. Finally, it is possible that a small number of programs changed the number and/or length of iAPPEs or tAPPEs between 2013 and 2016, although none were found.

The question of the optimal number and length of APPEs is not new and will likely be debated into the foreseeable future.4,5 This study found that six-week iAPPEs were the most common among existing programs and were generally preferred among newer programs. However, the longer iAPPE length did not translate into higher first-time pass rates.
The implication is that the length of iAPPEs and tAPPEs can be chosen based on criteria other than student pharmacists' NAPLEX performances. ACPE has minimum standards regarding tAPPE length, but not iAPPE length; therefore, these findings can inform future discussions. Pharmacy academicians will continue to question whether the NAPLEX is the best indicator of pharmacy programs' educational outcomes or whether it is a valid predictor of future practice competency.3,4,24,32,33 However, while the discussion about the best measure continues, the NAPLEX remains a standardized professional education performance indicator. For the time being, academic pharmacy cannot discount the NAPLEX as a legitimate pharmacy program outcome measure, and these findings about APPE length are relevant. That said, legitimate discussions regarding other reasons for deciding how long it takes to achieve important APPE educational performance goals are warranted; however, those discussions are beyond the scope of these data. Theoretically, pharmacy programs' curricular and co-curricular educational outcomes may be achieved with iAPPEs of any length, but such studies have not yet been conducted. Additional program-to-program comparisons will need to be made based on standardized or universally accepted criteria yet to be identified. Academic pharmacy needs to allocate resources to develop those instruments or to agree on key educational outcomes applicable to the universe of pharmacy programs. Only then can APPE components be compared to support programs' decisions regarding the optimal iAPPE length.3–5,23 Until that time, the optimal iAPPE length will continue to depend upon multiple value-laden criteria with little agreement among those responsible for experiential programs.

Conclusion

The length of iAPPEs has trended toward six weeks. Length of iAPPEs and tAPPEs was not associated with first-time NAPLEX pass rate.

Disclosures

None

Declaration of competing interest

None

References

1. Accreditation standards and key elements for the professional program in pharmacy leading to the doctor of pharmacy degree (Standards 2016). Accreditation Council for Pharmacy Education. https://www.acpe-accredit.org/pdf/Standards2016FINAL.pdf. Published February 2015. Accessed 6 December 2019.
2. Grice G. Average length of APPE and percentage of faculty-precepted versus adjunct-precepted APPEs. Experiential Education Section Digest. Alexandria, VA: American Association of Colleges of Pharmacy.
3. Kassam R, Kwong M. An enhanced community advanced pharmacy practice experience model to improve patient care. Am J Pharm Educ. 2009;73(2). https://doi.org/10.5688/aj730225.
4. Cox CD. Quantity vs quality in experiential education. Am J Pharm Educ. 2016;80(3). https://doi.org/10.5688/ajpe80336.
5. Svensson CK. What should constitute an acceptable advanced pharmacy practice experience? Am J Pharm Educ. 2016;80(3). https://doi.org/10.5688/ajpe80337.
6. Preparing to Apply and Sit for the NAPLEX. National Association of Boards of Pharmacy. https://nabp.pharmacy/programs/naplex/. Accessed 16 October 2019.
7. McCall KL, MacLaughlin EJ, Fike DS, Ruiz B. Preadmission predictors of PharmD graduates' performance on the NAPLEX. Am J Pharm Educ. 2007;71(1). https://doi.org/10.5688/aj710105.
8. Chisholm-Burns MA, Spivey CA, Byrd DC, McDonough SLK, Phelps SJ. Examining the association between the NAPLEX, pre-NAPLEX, and pre- and post-admission factors. Am J Pharm Educ. 2017;81(5). https://doi.org/10.5688/ajpe81586.
9. Allen RE, Diaz C Jr. Use of preadmission criteria and performance in the doctor of pharmacy program to predict success on the North American Pharmacists Licensure Examination. Am J Pharm Educ. 2013;77(9). https://doi.org/10.5688/ajpe779193.
10. Naughton CA, Friesner DL. Correlation of P3 PCOA scores with future NAPLEX scores. Curr Pharm Teach Learn. 2014;6(6):877–883.
11. Williams JS, Spivey CA, Hagemann TM, et al. Impact of pharmacy school characteristics on NAPLEX first-time pass rates. Am J Pharm Educ. 2018;83(2). https://doi.org/10.5688/ajpe6875.
12. Donabedian A. Evaluating the quality of medical care. Milbank Mem Fund Q. 1966;44(3 Suppl):166–206.
13. Programs by Status. Accreditation Council for Pharmacy Education; 2016. https://www.acpe-accredit.org/accredited-programs-by-status/. Accessed 6 December 2019.
14. School Passing Rates. National Association of Boards of Pharmacy. https://nabp.pharmacy/programs/naplex/score-results/. Accessed 16 October 2019.
15. Carnegie Classification of Institutions of Higher Education. Center for Postsecondary Research. http://carnegieclassifications.iu.edu. Accessed 16 October 2019.
16. PharmD School Directory. PharmCAS: Pharmacy College Application Service. http://www.pharmcas.org/school-directory/#/pharmd/general-information. Accessed 16 October 2019.
17. Kerlinger FN, Pedhazur EJ. Multiple Regression in Behavioral Research: Explanation and Prediction. 2nd ed. New York, NY: Holt, Rinehart & Winston; 1982:35–36.
18. IBM SPSS Statistics [computer program]. Version 24.0. Armonk, NY: IBM Corp.; 2016.
19. Brackett PD, Byrd DC, Duke LJ, et al. Barriers to expanding advanced pharmacy practice experience site availability in experiential education consortium. Am J Pharm Educ. 2009;73(5). https://doi.org/10.5688/aj730582.
20. Grabenstein JD. Trends in the numbers of US colleges of pharmacy and their graduates 1900 to 2014. Am J Pharm Educ. 2016;80(2). https://doi.org/10.5688/ajpe80225.
21. Skrabel MZ, Jones RM, Nemire RE, et al. National survey of volunteer pharmacy preceptors. Am J Pharm Educ. 2008;72(5). https://doi.org/10.5688/aj7205112.
22. Physical Therapist Clinical Performance Instrument. American Physical Therapy Association. http://www.apta.org/PTCPI/. Accessed 16 October 2019.
23. Kassam R, Collins JB. Validation of a survey instrument to evaluate students' learning during community-based advanced pharmacy practice experiences. Am J Pharm Educ. 2009;73(6). https://doi.org/10.5688/aj7306106.
24. Poirier TI, Devraj R. Time for consensus on a new approach for assessments. Am J Pharm Educ. 2015;79(1). https://doi.org/10.5688/ajpe79102.
25. Ascione FJ. In pursuit of prestige: the folly of the US News and World Report survey. Am J Pharm Educ. 2012;76(6). https://doi.org/10.5688/ajpe766103.
26. Gibson MJ, Bradley-Baker LR, Bush CG, Nelson SP. Reassessment of health-system capacity for experiential education. Am J Pharm Educ. 2017;81(9). https://doi.org/10.5688/ajpe6014.
27. Dennis VC, Britton ML, Wheeler RE, Carter SM. Practice experiences at a single institutional practice site to improve advanced pharmacy practice examination performance. Am J Pharm Educ. 2014;78(3). https://doi.org/10.5688/ajpe78360.
28. O'Sullivan TA, Sullivan L, Webber K, Weber SS. A pilot comparison of student outcomes between longitudinally- and traditionally-placed advanced pharmacy practice experiences [published online ahead of print 2 August 2018]. Am J Pharm Educ.
29. Dennis VC, May DW, Kanmaz TJ, Reidt SL, Serres ML, Edwards HD. Pharmacy student learning during advanced pharmacy practice experiences in relation to the CAPE 2013 outcomes. Am J Pharm Educ. 2016;80(7). https://doi.org/10.5688/ajpe807127.
30. Ried LD, Nemire R, Doty R, et al. An automated competency-based student performance assessment program for advanced pharmacy practice experiential programs. Am J Pharm Educ. 2007;71(6). https://doi.org/10.5688/aj7106128.
31. Diez Roux AV. A glossary for multilevel analysis. J Epidemiol Community Health. 2002;56(8):588–594. https://doi.org/10.1136/jech.56.8.588.
32. Romanelli F. Pharmacist licensure: time to step it up? Am J Pharm Educ. 2010;74(5). https://doi.org/10.5688/aj740591.
33. Romanelli F. In reply to Boyle M, Catizone CA, Finnerty WB. The National Association of Boards of Pharmacy response to pharmacy licensure: time to step it up? Am J Pharm Educ. 2010;74(9). https://doi.org/10.5688/aj7409176.
