Development and initial feasibility of an organizational measure of behavioral health integration in medical care settings


Journal of Substance Abuse Treatment 43 (2012) 402–409


Mark P. McGovern Ph.D. a,⁎, Darren Urada Ph.D. b, Chantal Lambert-Harris M.A. a, Steven T. Sullivan Ph.D. c, Noel A. Mazade Ph.D. d

a Department of Psychiatry, Dartmouth Medical School, 85 Mechanic Street, Suite B4-1, Lebanon, New Hampshire, 03766 USA
b Integrated Substance Abuse Programs, University of California at Los Angeles, Los Angeles, CA, USA
c Center for Safety and Sustainability, Cloudburst Consulting Group, Inc., Landover, MD, USA
d Mazade and Associates, LLC, Wilmington, DE, USA

Article history: Received 6 March 2012; Received in revised form 2 August 2012; Accepted 9 August 2012

Keywords: Integrated treatment; Community health centers; Organizational measures

Abstract

With the advent of health care reform, models are sought to integrate behavioral health and routine medical care services. Historically, the behavioral health specialty has not itself been integrated, but instead bifurcated by substance use and mental health across treatment systems, care providers and even research. With the present opportunity to transform the health care delivery system, it is incumbent upon policymakers, researchers and clinicians to avoid repeating this historical error, and to provide integrated behavioral health services in medical contexts. An organizational measure designed to assess this capacity is described: the Dual Diagnosis Capability in Health Care Settings (DDCHCS). The DDCHCS was used to assess a sample of federally qualified health centers (N = 13) on the degree of behavioral health integration. The measure was found to be feasible and sensitive in detecting variation in integrated behavioral health services capacity. Three of the 13 agencies were dual diagnosis capable, with significant variation in DDCHCS dimensions measuring staffing, treatment practices and program milieu. In general, mental health services were more integrated than substance use services. Future research should consider a revised version of the measure, a larger and more representative sample, and linking organizational capacity with patient outcomes.

1. Introduction

Persons with co-occurring substance use and psychiatric disorders drive the cost of health care in the United States at two to three times the rate of individuals without these problems (Boyd et al., 2010). Although more research is needed, findings across several studies support the benefits of integrating behavioral health services (substance use and mental health) with medical care, in terms of patient outcomes and costs (Friedmann, Zhang, Hendrickson, Stein, & Gerstein, 2003; Katon, Russo, Von Korff, et al., 2008; Parthasarathy et al., 2003; Simon et al., 2007; Weisner, Mertens, Parthasarathy, Moore, & Lu, 2001). Recent reports have underscored the promising but ambiguous evidence for integrated behavioral health services in general medical settings, and reiterated the need for more systematic research (Butler et al., 2008; Gurewich, Sirkin, & Shepard, 2012; Pating, Miller, Goplerud, Martin, & Ziedonis, 2012).

☆ This research was supported by the Substance Abuse and Mental Health Services Administration Co-Occurring Disorder Implementation and Innovation (#8732) and the Robert Wood Johnson Foundation Substance Abuse Policy Research Program (#63110).
⁎ Corresponding author. E-mail address: [email protected] (M.P. McGovern).
© 2012 Published by Elsevier Inc. http://dx.doi.org/10.1016/j.jsat.2012.08.013

Longstanding efforts to improve access to integrated services have been recently accelerated by the promise of health care reform (Halvorson, 2010). Addiction and mental health treatment, historically provided in separate “silos” of care, must become more closely integrated with each other as true behavioral health services, and ultimately merge with primary care (Grantham, 2010; McLellan, 2010, January 21). In particular, federally qualified health centers (FQHCs) are expected to take a leading role in this transformation. This systemic change is rational given that substance use disorders undermine virtually every aspect of medicine, from chronic pain, to sleep disorders, to chronic diseases such as diabetes and hypertension (Office of National Drug Control Policy, 2010, 2011). Behavioral health is thus presently on the verge of joining “mainstream” health care (Kuehn, 2010; Pating et al., 2012). Identified as logical settings to launch integration, FQHCs are nonprofit, community-directed health care organizations that serve medically indigent populations (National Association of Community Health Centers [NACHC], 2009). FQHCs constitute the nation's largest network of primary care providers, delivering services to 1 in 18 of all individuals in the United States (U.S. Department of Health and Human Services Health Resources and Services Administration [HRSA], 2010a, 2010b). Sixty-nine percent of FQHC patients fall under 200% of the federal poverty level, and 38% are uninsured


(NACHC, 2009). This population is not only at increased risk for substance use and psychiatric disorders (Aldeman, 2003; Wu, Kouzis, & Schlenger, 2003), but also endures disparities in access to adequate care (Office of the Surgeon General, 2001; Wells, Klap, Koike, & Sherbourne, 2001). Recent legislation is expected to enhance the ability of FQHCs to provide integrated care. The Patient Protection and Affordable Care Act of 2010 and the Health Care and Education Reconciliation Act of 2010 (collectively the Affordable Care Act, ACA) are projected to expand Medicaid coverage to 15 million more people by 2019 (NACHC, 2010). The ACA together with the American Recovery and Reinvestment Act of 2009 provides funding for dramatic FQHC expansion (Buck, 2011; U.S. Department of Health and Human Services Health Resources and Services Administration [HRSA], 2010a, 2010b), including the expansion of behavioral health care services (Lo Sasso & Byck, 2010; Wells, Morrissey, Lee, & Radford, 2010). The ACA also facilitates integration by designating both addiction and mental health treatment as “essential health benefits” to be covered by health plans (including Medicaid) starting January 1, 2014. Finally, the ACA will offer incentives for FQHCs to become “health homes” that specialize in the integration and coordination of care for chronic conditions, including substance use and psychiatric disorders (Boyd et al., 2010; Buck, 2011). The movement toward integration of behavioral health and traditional medical services has generated a number of models of possible service delivery (Collins, Hewson, Munger, & Wade, 2010). However, in reality, only limited data exist on the current state of integration, in part due to the lack of an objective measure of the organizational, clinical and workforce processes (Pating et al., 2012). 
A recent assessment of the behavioral health needs of community health centers identified the lack of objective measures of integration as a major barrier to advancement (Technical Assistance Collaborative/Human Services Research Institutes, 2012). One study has attempted to identify the degree of behavioral health integration in a sample of community health centers across the United States (Lardiere, Jones, & Perez, 2011). Using survey methodology, Lardiere et al. (2011) found that 55% of FQHCs reportedly provide substance abuse-related services, 70% mental health services, and approximately 65% of FQHCs reported that they delivered some form of integrated care. While these national data provide the field with an important first glimpse into the current reality of integrated services, the findings must be interpreted with caution due to a low survey response rate (38% of total). Another caveat also exists: organizational capacity, if assessed by provider self-report only, tends to yield inflated estimates of integration (Adams, Soumerai, Lomas, & Ross-Degnan, 1999; Lee & Cameron, 2009; McGovern, Xie, Segal, Siembab, & Drake, 2006). Objective data on integration, using measures that are not entirely reliant on self-reports but instead include site visits, direct observation and standardized interviews with multiple informants, can yield reliable and valid data. Such information could serve to further advance the field's ability to study and compare integration of behavioral health services in FQHCs with greater depth and accuracy.

Research on the integration of behavioral health and primary care is particularly lacking for substance abuse treatment services. Notably, Butler's systematic review of 33 integration trials included only one that addressed integration of alcohol problem services and no study of integrating drug use disorder treatments (Butler et al., 2008). Likewise, in the national survey described above, Lardiere et al.
(2011) found that while 90% of FQHCs screen for depression, only 65% screen for substance use. The persistent bifurcation between substance use and mental health services within primary care threatens to repeat the unfortunate history of the specialty psychiatric and addiction treatment system, by unwittingly reinforcing their artificial separation. Therefore, informed models of integration must consider a critical lesson learned by the specialty mental health and


addiction treatment delivery system over the past 30 years: psychiatric and substance use disorders commonly co-occur. Therefore, integrated behavioral health care services in routine medical settings must be organized for the clinical reality of comorbidity.

Over the past 10 years, two practical benchmark measures have informed and catalyzed integration efforts between the mental health and addiction treatment delivery systems, as well as at the ground level among service providers. These measures, the Dual Diagnosis Capability in Addiction Treatment (DDCAT) and Dual Diagnosis Capability in Mental Health Treatment (DDCMHT) indexes, have been implemented across over two-thirds of U.S. states and in tribal systems (McGovern, Lambert-Harris, McHugo, Giard, & Mangrum, 2010). The DDCAT and DDCMHT measures have been utilized to document system and treatment provider capacity, and also as benchmarking indices to guide policy, practice and workforce improvements. These improvements have been replicated across multiple systems, and in resource-neutral environments (i.e., quality was improved at little to no additional cost) (McGovern et al., 2010).

Anticipating the need for an objective and psychometrically sound organizational measure of behavioral health and primary care integration, a companion measure to the DDCAT and DDCMHT has been developed. This measure, the Dual Diagnosis Capability in Health Care Settings (DDCHCS) index, was designed specifically to assess the degree to which an organization offers integrated behavioral health care services, both mental health and substance use, within traditional medical settings. Using methodology similar to its counterparts, the DDCHCS was developed to be a practical benchmark measure of policy, practice and workforce dimensions which can serve to assess integration at baseline, and then objectively guide quality improvement efforts.
The present study reports on the development and feasibility of the DDCHCS in a sample of FQHCs, and addresses the following research questions:

1. Is it feasible to use the DDCHCS to assess FQHCs on degree of integration of medical and behavioral health care services?
2. What are the preliminary reliability estimates of the DDCHCS index?
3. Is the DDCHCS sensitive to detecting variation in behavioral health integration across a sample of FQHCs?

In evaluating these research questions, the present study will examine the potential value of the DDCHCS in terms of feasibility, reliability and sensitivity. If the DDCHCS meets acceptability on these criteria, measure refinement and further research may be warranted.

2. Methods

2.1. Design

This study utilized a cross-sectional design in which 13 federally qualified health centers from six states were assessed during a site visit at a single point in time. A structured, manual-guided procedure was used to make the ratings on the Dual Diagnosis Capability in Health Care Settings (DDCHCS) index.

2.2. Sampling and agency characteristics

Eighteen federally qualified health center (FQHC) sites were approached by the authors based on prior research experience with these agencies or familiarity via statewide initiatives on integrated care. Of the 18 approached, 2 refused, 1 did not respond and 2 reported scheduling conflicts. This resulted in a final convenience sample of 13 FQHCs across six states: California, Connecticut, Illinois, Maryland, South Carolina and Vermont. Six programs were identified as private non-profits, five as non-profit, and two as public non-profit. All programs delivered primary care and some specialty services


within an outpatient format. Services provided by the FQHCs focused on the medically indigent; funds were obtained by fee for service or contract from Medicare (100.0%), individual (92.3%), other public funds (92.3%), private pay (84.6%), other funds (84.6%), Medicaid (76.9%), state funds (69.0%), and military funds (7.7%). This project was reviewed and deemed exempt from further review by the Trustees of Dartmouth College Committee for the Protection of Human Subjects (CPHS).

2.3. Measure

The Dual Diagnosis Capability in Health Care Settings (DDCHCS) is an organizational measure constructed along a similar framework and methodology as the Dual Diagnosis Capability in Addiction Treatment (DDCAT) and Dual Diagnosis Capability in Mental Health Treatment (DDCMHT) indexes. The DDCHCS measure construction began with a review of the literature to develop a list of potential benchmark items, review by a panel of experts on integrated treatments, field testing of an alpha version of the measure, revisions based on redundancy, irrelevance or lack of inter-rater agreement, and then beta field testing of the revised measure. More specifically, a review of the literature was conducted to map potential benchmarks of behavioral–medical care integration onto the existing DDCAT/DDCMHT frameworks. The screening, assessment, treatment planning and delivery, and continuity of care processes were common themes across all contexts. Based on this review, a preliminary version of the measure was drafted and reviewed by integrated mental health and addiction treatment and research experts at Dartmouth. Their feedback was incorporated into a field-testing-ready version. The field testing was initiated with two to three members of an interdisciplinary and experienced assessment team. These individuals are either study authors or acknowledged at the conclusion of this article.
A modified Delphi process (Fink, Kosecoff, Chassin, & Brook, 1984; Jones & Hunter, 1995) was used in monthly teleconference meetings among this team. These meetings were focused on developing consensus about data collection procedures during the site visit, interpretation of the meaning of specific items, and scoring or ratings of the items. Items involving such concepts as "admission criteria" or "discharge" were seen as less relevant in medical care (particularly in an FQHC setting), so the intended meaning required adjustment and clarification. Thus, some residual items from the DDCAT and DDCMHT indexes were not useful or applicable for the DDCHCS. In addition, certain items needed to be entirely changed. Only one entirely new item, pertaining to prescription drug policy and practice, was added. Some revisions were made to the draft DDCHCS measure whereas other potential revisions were noted, but reserved until after the feasibility phase was fully complete.

The DDCAT and DDCMHT, now in versions 4.0, have been extensively studied, including for psychometric properties such as reliability (inter-rater, internal consistency, test–retest stability) and validity (convergent/discriminant, criterion, and predictive) (Gotham, Brown, Comaty, McGovern, & Claus, 2009; Gotham, Claus, Selig, & Homer, 2010; McGovern et al., 2007; McGovern et al., 2010). As noted, the DDCHCS was adapted from the DDCAT and DDCMHT, and modified to evaluate the degree of behavioral health integration, mental health and substance use, within traditional medical care settings. Such settings may include community health centers, family practice, primary care, general internal medicine and/or emergency departments. The instrument used in this study (DDCHCS, version 2.0) is composed of 36 benchmark items organized by seven dimensions: I. Program Structure, II. Program Milieu, III. Clinical Process: Assessment, IV. Clinical Process: Treatment, V. Continuity of Care, VI. Staffing, and VII. Training.
Each benchmark item is scored on a five-point Likert-type scale, from 1–Health Care Only Services (HCOS) to 3–Dual Diagnosis Capable (DDC) to 5–Dual Diagnosis Enhanced (DDE). Ratings of a “2” or a “4” are made when criteria for “3” or “5”

are not completely met. The HCOS, DDC and DDE categories are adaptations of those on the DDCAT and DDCMHT, based upon the American Society of Addiction Medicine's taxonomy of organization-level dual diagnosis capability (Mee-Lee, Schulman, Fishman, Gastfriend, & Griffith, 2001). The DDCHCS yields individual benchmark scores on each of the 36 items, average scores on each of the seven dimensions, and a DDCHCS total score. In addition, a program can be categorized as HCOS, DDC or DDE based upon overall performance. In general, HCOS level programs do not offer behavioral health services, either mental health or substance use, in a consistent manner. DDC level programs offer behavioral health services but unevenly, leaning in either a substance use or mental health direction, or such services are available but inconsistent. DDE level programs address both mental health and substance use issues using a systematic and protocol-driven approach. In addition to clinical practice elements such as behavioral health assessment, treatments and continuity of care, the DDCHCS also captures policy, milieu and workforce elements that may either support or hinder integrated behavioral health services. Table 1 (below) lists the dimensions, briefly defines their content, and enumerates the number of items included within each dimension.

2.4. DDCHCS assessors and raters

The DDCHCS assessment teams, across the six states and 13 programs evaluated, were trained in the methodology using a similar procedure: first by attending a didactic presentation, next by observing a DDCHCS assessment performed by an experienced assessor, and finally conducting an assessment under observation. All assessments included in this study were made by a pair of raters. Ratings were made independently during the site visit, and then discussed afterwards. Items on which there were disagreements were further examined by both raters, and differences reconciled.
In some instances, where raters were more than two points apart and both remained confident in their scoring, the differences were split with intermediate ratings. The disciplines and backgrounds of DDCHCS raters ranged from board-certified internists to clinical psychologists to master's-level evaluators in public health or social work. In previous studies with the DDCAT and DDCMHT, a similar range in assessor background was common and did not reduce the measures' psychometric properties (inter-rater reliability).

Table 1
DDCHCS dimensions, content and number of benchmark items assessed.

I. Program structure (4 items): Integrated agency mission; licensure or certification to provide behavioral health services; behavioral health services provided onsite or offsite; financial contingencies.
II. Program milieu (2 items): Physical and social environment open to persons with behavioral health concerns.
III. Assessment (7 items): Systematic and structured screening and assessment protocol to identify cases, diagnose them, and develop a treatment plan.
IV. Treatment (11 items): Consistent use of an integrated treatment plan; availability of onsite behavioral health medications and psychosocial therapies.
V. Continuity of care (5 items): Chronic disease model for both mental health and substance use disorders, including ongoing management and follow-up for both types of problems and their interactions.
VI. Staffing (5 items): Presence, access and use of physician and/or other behavioral health specialists.
VII. Training (2 items): Basic and non-discriminatory awareness of behavioral health issues among medical care patients, and enhanced clinical staff expertise in the medical–behavioral health interface.
Total: 36 items
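The scoring scheme described in the Measure section (36 benchmark items rated 1 to 5, dimension scores as item averages, an overall total score, and an HCOS/DDC/DDE category) can be sketched in code. This is an illustrative sketch only: the item counts per dimension follow Table 1, but the threshold rule used for categorization is an assumption for illustration, since the paper categorizes programs based upon overall performance without publishing an explicit cutoff.

```python
# Illustrative sketch of DDCHCS scoring: 36 items rated 1-5, grouped
# into seven dimensions (item counts from Table 1). Dimension scores
# are item averages; the total score is the overall item average.
# The HCOS/DDC/DDE thresholds below are hypothetical, for illustration.

DIMENSIONS = {
    "Program structure": 4,
    "Program milieu": 2,
    "Assessment": 7,
    "Treatment": 11,
    "Continuity of care": 5,
    "Staffing": 5,
    "Training": 2,
}

def score_ddchcs(item_ratings):
    """item_ratings: 36 integers, each 1-5, ordered by dimension."""
    assert len(item_ratings) == 36
    assert all(1 <= r <= 5 for r in item_ratings)
    dim_scores, i = {}, 0
    for dim, n in DIMENSIONS.items():
        dim_scores[dim] = sum(item_ratings[i:i + n]) / n  # dimension average
        i += n
    total = sum(item_ratings) / len(item_ratings)         # overall average
    if total >= 5.0:        # hypothetical DDE cutoff
        category = "DDE"
    elif total >= 3.0:      # hypothetical DDC cutoff
        category = "DDC"
    else:
        category = "HCOS"
    return dim_scores, total, category
```

Under this sketch, a program rated 3 on every item would yield a total score of 3.0 and fall at the DDC level.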


Monthly DDCHCS Workgroup teleconference calls were held, with topics discussed such as: conducting site visits, data gathering protocol, item definition and meaning, rating anchors and scoring, and summary and interpretation of findings.

2.5. Procedure

The FQHCs were informed that the purpose of the assessment was to “field test” a new measure of organizational capacity of behavioral health service integration. They were reassured that the data would not identify their agency, but that the findings would be verbally conveyed via feedback to each agency, or in some cases by letter. Participating agencies were not compensated for their participation, but were interested in objective information for the purposes of quality improvement. Site visits were scheduled in advance, and agencies told that the visit would consist of interviews with leadership, a tour of the facilities, and interviews with providers including physicians and nurses, behavioral health staff, support staff, and any other individuals available. In addition, sites were asked to prepare documents such as policy and procedure manuals, any patient education materials related to behavioral health, program schedules, brochures and patient handbooks. Agencies were told that at least 10 medical records, electronic or paper, would need to be reviewed. The visits followed a standardized protocol beginning with leadership interview, facility tour, scheduled and opportunistic interviews with clinical staff, interviews with randomly (conveniently) selected patients and support staff (including secretarial, administrative, pharmacy, laboratory personnel), observation of team meeting or staff group interaction, and review of documents including a sample of medical records. At the conclusion of the site visit during which these data were gathered, ratings were made on the DDCHCS measure, and preliminary feedback offered to the agency leadership.
Typically, these conversations also were used to verify observations, clarify inconsistencies, and debrief with agency leadership about the overall experience of the assessment and learning of the preliminary DDCHCS findings. Completed DDCHCS measures were scored by the rating team, de-identified and then relayed to Dartmouth for aggregate statistical analyses.

2.6. Data analyses

Continuous and discrete variables were analyzed using univariate analyses. For the reliability question, an intra-class correlation


analysis was used to estimate inter-rater reliability, and Cronbach's alpha as the measure of dimension and overall DDCHCS internal consistency. DDCHCS sensitivity to variation was assessed by scoring the measure and then categorizing the agency according to the framework of HCOS, DDC or DDE. DDCHCS sensitivity would be supported if the agencies differed in placement into these categories. If variation can be detected, this can be further examined by t-test to determine specific benchmark item and dimension differences by HCOS, DDC or DDE categories. A more stringent criterion than typical convention was used to reduce the chances of type I error, since multiple t-tests are being conducted. Only differences at p < .01 will be considered significant. Data were analyzed using SPSS statistical software (SPSS Inc, 2008).

3. Results

3.1. Format for delivery of behavioral health services

As seen in Table 2, the most common professionals delivering behavioral health interventions in FQHCs are physicians. Clinical psychologists and social workers represent the next largest groups, followed by nurse practitioners and nursing personnel. The bottom section of Table 2 reveals that in only one agency does a physician–psychiatrist provide all direct behavioral health care. Of the 10 physicians, only 3 were board-certified psychiatrists, none with advanced certification in addiction. Of the other non-psychiatric physicians, only two were ASAM-certified, with one having the capacity to prescribe buprenorphine. In most agencies the physician will either refer to a behavioral health specialist on site (75.0%), off-site but within the same agency (46.2%), and/or refer patients to an off-site or community service (15.4%; 7.7%). Although many agencies have a format to provide integrated mental health and addiction services, mental health services and medications were typically provided on site, and substance use services referred offsite to another agency.

3.2. DDCHCS reliability estimates

To estimate inter-rater agreement, we obtained data on rater pairs from approximately one-third of all FQHCs assessed (4 of 13). Across the DDCHCS measure the overall intra-class correlation coefficient was .90. Cronbach's alpha was used to evaluate internal consistency of the DDCHCS dimension scores and DDCHCS total score. Internal consistency estimates by dimension are as follows: program structure = .17; program milieu = .57; assessment = .67; treatment = .81;

Table 2
Behavioral health professional staffing pattern and role.

Integrated behavioral health providers, by discipline: number (%) of agencies with on-site staff, and range in number of staff (minimum to maximum).
MD (three with psychiatry specialty): 10 (77.0%), range 0–5
PhD/PsyD: 6 (46.2%), range 0–3
APRN: 4 (30.8%), range 0–2
RN/BSN: 3 (23.1%), range 0–1
MSW/LCSW: 5 (38.5%), range 0–23
MA/MS: 1 (7.7%), range 0–8
LADC/CADC: 1 (7.7%), range 0–1

Role of medical staff in providing behavioral health services, reported as SA only / MH only / SA and MH / neither SA nor MH:¹
Medical staff deliver most behavioral care on site: 0 (0.0%) / 0 (0.0%) / 1 (7.7%) / 12 (92.3%)
Medical staff identify positive cases, refer to onsite behavioral health providers: 0 (0.0%) / 3 (25.0%) / 6 (50.0%) / 3 (25.0%)
Medical staff refer patients to off-site services but within the same agency: 1 (7.7%) / 1 (7.7%) / 4 (30.8%) / 7 (53.8%)
Medical staff refer patients to an off-site program not within the same agency: 2 (15.4%) / 0 (0.0%) / 0 (0.0%) / 11 (84.6%)
Medical staff refer patients to community services: 1 (7.7%) / 0 (0.0%) / 0 (0.0%) / 12 (92.3%)

¹ SA only = substance abuse treatment only; MH only = mental health treatment only; SA and MH = both substance abuse treatment and mental health treatment; neither SA nor MH = neither substance abuse treatment nor mental health treatment.


continuity of care = .80; staffing = .87; and training = .70. When one of the four items in the program structure dimension was removed (Ia. Behavioral health integrated into agency mission statement), the program structure dimension alpha coefficient improved to .53. Thus, the internal consistency of the DDCHCS dimensions ranges from .17 to .87. However, the overall Cronbach's alpha estimate was .89.
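The internal consistency estimates above use Cronbach's alpha, whose standard formula is alpha = k/(k - 1) * (1 - (sum of item variances) / (variance of the summed scale)) for a scale of k items. A minimal computational sketch, using illustrative data rather than the study's:

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for a (n_programs, n_items) matrix of item scores."""
    X = np.asarray(ratings, dtype=float)
    k = X.shape[1]                         # number of items in the scale
    item_vars = X.var(axis=0, ddof=1)      # variance of each item across programs
    total_var = X.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative: three programs rated on two perfectly consistent items
# yield alpha = 1.0; inconsistent item scores drive alpha toward 0.
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # 1.0
```

Note the ratio of variances is unaffected by the choice of `ddof`, as long as it is applied consistently to both the item and scale variances.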

3.3. DDCHCS sensitivity to behavioral health care variation in FQHCs

The DDCHCS was sensitive in detecting variation in behavioral health care across the FQHC sample. Three FQHCs met criteria for DDC level services. No program met criteria for DDE services. On the program structure, program milieu, assessment, and continuity of care dimensions, the average FQHC was at or above the DDC level. The DDCHCS total score average was exactly 3.0 (DDC level) but with a range of ± 1 point. The dimensions of treatment, staffing and training were on average below DDC, indicating that scores in these areas likely drove programs' categorical placement to HCOS, but just below DDC. It is likely these dimensions could be improved to enhance benchmark item scores to achieve DDC level services.

3.4. Variation in behavioral health integration by DDCHCS dimensions and benchmark items

To further explore variation in services, we examined the differences between HCOS and DDC level agencies more closely. In order of magnitude, staffing, treatment and program milieu were the most significant dimensions of difference (all at p < .001). There were no significant dimensional differences in program structure, assessment, continuity of care and training. Within these dimensions, specific benchmark items differentiated HCOS and DDC programs. In the staffing dimension, DDC programs clearly had more behavioral health (psychiatric) physician involvement than HCOS agencies (p < .001). In addition, there were staff members with mental health expertise (psychiatrists, psychologists, social workers) who supervised other staff and were integrated into clinical team meetings. Lastly, there were identified staff members, either paid or volunteer, who represented peer recovery support, and worked to connect existing patients with others recovering from similar issues.

Within the treatment dimension, the largest discrepancy between HCOS and DDC FQHCs was in the development of integrated treatment plans or problem lists (medical and behavioral health) (p < .001). Additional benchmark items achieving statistical significance (p < .001) included clear protocols for psychiatric or substance-related emergencies, and facilitated access to persons in personal recovery from mental health and/or addictive disorders. The program milieu dimension revealed one benchmark item in particular that accounted for the difference between HCOS and DDC agencies: expectation of behavioral health problems. On this item, assessors examine explicit and implicit messages to patients and families about the clinic being a place to address behavioral health concerns. The availability of patient educational materials, such as brochures and posters in waiting and common areas, and informational material about common problems (depression, smoking, alcohol use, prescription narcotic abuse potential) is also evaluated.

Although there were no overall differences between the HCOS and DDC FQHCs on the program structure, assessment, continuity of care and training dimensions, several items within these scales were statistically significant at the benchmark item level (p < .01). These included: financial and reimbursement mechanisms; initial and ongoing access to services irrespective of psychiatric or substance use problem acuity; and a balanced focus on recovery from both disorders. On the last item, there was a tendency among HCOS FQHCs to attend to mental health but not substance recovery issues over time. Training, either at the dimension or benchmark item level, did not reveal any differences between HCOS and DDC agencies.

4. Discussion

4.1. Summary of findings

This study sought to address three questions about the application of an organizational measure of integrated behavioral health capability in a sample of FQHCs. The first question involved the feasibility of using the DDCHCS to assess a range of typical community health centers. We found that during the course of a 4- to 6-hour site visit to an FQHC, data from a variety of sources could be gathered to complete a DDCHCS assessment. Although we do not have quantitative data to document this statement, qualitatively, we observed that the FQHC staff and leadership not only found the process unobtrusive, but described it as useful. At the conclusion of most site visits, FQHC leadership reported that the assessment captured the true functioning of their behavioral health services, and in doing so suggested practical quality improvements to enhance integration. In some instances, the assessment team was invited back to conduct a re-assessment to measure these changes. Thus, in answer to the first question, we conclude that the DDCHCS is in fact feasible for assessing the integration of behavioral health, substance use and mental health, in the context of an FQHC setting.

The second research question pertained to the preliminary reliability estimates of the DDCHCS index. With respect to both overall inter-rater reliability (intra-class correlation) and internal consistency (Cronbach's alpha), the DDCHCS was found to have acceptable psychometric properties overall. On one dimension, program structure, the internal consistency of the measure was low (.17), but improved to .53 with one item removed. The fact that a dimension does not have internal consistency warrants caution in interpreting the overall dimension score, but could also indicate that items within the dimension are measuring different but potentially important constructs.
The final question involved the potential for the DDCHCS to be sensitive to variation in the integration of behavioral health services across a sample of FQHCs. We found that of the 13 FQHCs, 3 met criteria for DDC level services; the remaining 10 programs met HCOS criteria, although some were on the threshold of the DDC level. The sorting of agencies into two DDCHCS categories confirms detection of major differences in agency capacity. We then examined these differences by more closely inspecting the DDCHCS dimension and benchmark item scores. Three of the seven dimensions were significantly different between HCOS and DDC level FQHCs: treatment, staffing and program milieu. The remaining four dimensions were not significantly different overall; however, several individual benchmark items exposed variation between DDC and HCOS level agencies. These findings are consistent with the community health center provider survey by Lardiere et al. (2011), in that FQHCs were more likely to offer integrated behavioral health services focused on mental health rather than substance use problems. In particular, screenings for depression and integrated medication treatments were available. Substance use services were less integrated, and often involved linkage or referrals to agencies off site or in the community. Furthermore, in most instances, behavioral health services that did address substance use covered alcohol and tobacco use, but not illicit drug use. Of significant and expanding interest are the needs of patients abusing prescription medications, especially prescription narcotics. All FQHC providers expressed concern about this growing problem and varied considerably in their policy and practice responses. Unlike the DDCAT and DDCMHT measures, the DDCHCS
added an additional benchmark item on the treatment dimension that evaluated programs on exactly this issue. Also noteworthy is the small presence of physicians, including psychiatrists, and nurses, or even behavioral health providers with addiction specialization or training (see Table 3 scores on staffing and training). As behavioral health integration increases, it will be essential to ensure that this expertise is available, either through personnel selection or through training. At first glance, the finding that only 3 of the 13 FQHCs met criteria for DDC level services and 10 for HCOS may be sobering given impending changes to the health care delivery system; the FQHCs do not appear ready. Interestingly, baseline assessments of addiction and mental health agencies using the DDCAT and DDCMHT, respectively, yielded similar findings (McGovern et al., 2010). Of 180 community addiction programs assessed at baseline, only 18% met DDC criteria, and 1%
DDE. Of 78 mental health programs assessed, only 9% achieved the DDC level, and none DDE. Thus, the 23% rate of integrated behavioral health services in FQHCs is actually higher than the rate of integrated services in the specialty care sector. If application of the next version of the DDCHCS can follow in the tracks of the DDCAT and DDCMHT, then progress may be forthcoming. From baseline to follow-up, addiction programs achieved a three-fold increase in movement from AOS to DDC level services (from 15 to 46% of the sample), and mental health programs nearly an eight-fold increase in movement from MHOS to DDC organizations (from 4 to 31% of the sample). These improvements were made in resource-neutral environments, typically by using commonly deployed quality improvement strategies (McGovern et al., 2010). With a baseline rate of 23% DDC, the FQHCs are starting off in better shape than their specialty care provider counterparts.

Table 3
DDCHCS dimension and benchmark item scores by FQHC category: HCOS and DDC level services.

| Dimension / benchmark item | HCOS (n = 10) M (sd) | DDC (n = 3) M (sd) | t-test |
|---|---|---|---|
| I. Program structure | | | |
| IA. Organizational identity and documented focus (i.e. mission) | 2.90 (1.37) | 3.00 (1.16) | −0.10 |
| IB. Program certification, licensure and/or accreditation | 3.80 (0.63) | 4.67 (0.57) | −2.11 |
| IC. Locale of and relationship with behavioral health services | 3.50 (1.35) | 4.67 (0.57) | −1.42 |
| ID. Financial and reimbursement mechanisms | 3.10 (1.60) | 5.00 (0.00) | −3.77⁎⁎ |
| Dimension I total score | 3.33 (0.55) | 4.33 (0.63) | −2.70 |
| II. Program milieu | | | |
| IIA. Physical and social environment open and accessible | 3.70 (1.06) | 5.00 (0.00) | −3.88⁎⁎ |
| IIB. Evidence of materials addressing behavioral health | 2.00 (1.25) | 3.00 (0.00) | −2.54 |
| Dimension II total score | 2.85 (0.91) | 4.00 (0.00) | −3.98⁎⁎ |
| III. Clinical process: assessment | | | |
| IIIA. Systematic screening for MH and SA problems | 3.00 (1.63) | 3.67 (1.53) | −0.63 |
| IIIB. Integrated medical and behavioral health assessment | 3.20 (1.14) | 4.00 (0.00) | −2.23 |
| IIIC. Documented psychiatric and substance use diagnoses | 3.50 (1.18) | 4.67 (0.58) | −1.62 |
| IIID. Chronology of medical and behavioral disorders examined | 3.50 (1.35) | 4.33 (0.58) | −1.01 |
| IIIE. Access to program based on symptom acuity | 3.90 (0.88) | 5.00 (0.00) | −3.97⁎⁎ |
| IIIF. Access to program based on severity and persistence | 4.40 (0.70) | 5.00 (0.00) | −2.71 |
| IIIG. Consideration of motivational stage of problems | 2.00 (1.16) | 1.67 (1.16) | 0.44 |
| Dimension III total score | 3.36 (0.66) | 4.05 (0.30) | −1.72 |
| IV. Clinical process: treatment | | | |
| IVA. Integrated treatment plan or problem list | 2.50 (0.97) | 5.00 (0.00) | −8.14⁎⁎⁎ |
| IVB. Monitor interactive treatment responses | 2.90 (1.37) | 3.67 (1.53) | −0.83 |
| IVC. Procedures for behavioral emergencies and crises | 2.80 (1.03) | 5.00 (0.00) | −6.74⁎⁎⁎ |
| IVD. Treatment consideration based on motivational stage | 1.60 (0.70) | 2.67 (1.16) | −2.02 |
| IVE. Access to psychiatric and addiction medications | 2.90 (0.99) | 3.33 (0.58) | −0.71 |
| IVF. Integrated psychosocial interventions | 2.30 (0.82) | 3.67 (0.58) | −2.65 |
| IVG. Patient education about co-occurring disorders and treatment | 2.80 (0.92) | 3.33 (0.58) | −0.94 |
| IVH. Family education and support | 2.50 (1.18) | 2.00 (0.00) | 1.34 |
| IVI. Facilitation to community peer recovery support on site | 1.60 (0.84) | 2.33 (1.16) | −1.23 |
| IVJ. Exposure to peer recovery support individuals on site | 1.20 (0.42) | 2.67 (0.58) | −4.91⁎⁎⁎ |
| IVK. Practice guidelines for prescription medication abuse | 3.38 (1.51) | 5.00 (0.00) | −3.05 |
| Dimension IV total score | 2.39 (0.43) | 3.52 (0.14) | −4.31⁎⁎⁎ |
| V. Continuity of care | | | |
| VA. Integrated discharge planning | 2.56 (1.51) | 4.67 (0.58) | −2.30 |
| VB. Capacity to maintain treatment continuity | 3.56 (1.01) | 5.00 (0.00) | −4.27⁎⁎ |
| VC. Focus on recovery issues for both disorders | 3.44 (1.24) | 5.00 (0.00) | −3.78⁎⁎ |
| VD. Facilitation of peer-recovery support in community | 1.90 (1.20) | 2.67 (1.53) | −0.92 |
| VE. Ongoing use of psychiatric and addiction medications | 3.60 (1.65) | 4.33 (1.16) | −0.71 |
| Dimension V total score | 2.94 (0.98) | 4.33 (0.46) | −2.34 |
| VI. Staffing | | | |
| VIA. Availability and role of physician or other prescriber | 3.50 (1.35) | 5.00 (0.00) | −3.50⁎⁎ |
| VIB. Onsite staff with mental health and addiction credentials | 1.50 (0.53) | 4.67 (0.58) | −8.97⁎⁎⁎ |
| VIC. Access to integrated supervision or consultation | 2.30 (1.57) | 4.67 (0.58) | −3.96⁎⁎ |
| VID. Case review, staffing, quality assurance support integrated behavioral health care | 1.80 (0.92) | 3.00 (1.00) | −1.95 |
| VIE. Consistent availability and role of peer support persons on site | 1.40 (0.70) | 3.33 (0.58) | −4.33⁎⁎⁎ |
| Dimension VI total score | 2.10 (0.73) | 4.13 (0.23) | −7.66⁎⁎⁎ |
| VII. Training | | | |
| VIIA. Basic training on co-occurring disorders: All staff | 2.00 (0.67) | 3.33 (1.16) | −2.60 |
| VIIB. Advanced training on co-occurring disorders: Clinical staff | 2.40 (1.43) | 4.00 (1.00) | −1.79 |
| Dimension VII total score | 2.20 (1.01) | 3.67 (0.29) | −2.43 |
| DDCHCS total score | 2.73 (0.48) | 4.00 (0.16) | −4.40⁎⁎⁎ |

⁎⁎ p ≤ .01. ⁎⁎⁎ p ≤ .001.
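The group contrasts in Table 3 are independent-samples t-tests on small, unequal groups (n = 10 vs. n = 3). As a minimal sketch, and assuming a pooled-variance (equal-variance) t-test, the total score contrast can be reproduced from the published summary statistics alone:

```python
from math import sqrt

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Two-sample t statistic with pooled variance (equal-variance assumption)."""
    sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
    se = sqrt(sp2 * (1 / n1 + 1 / n2))
    return (m1 - m2) / se, n1 + n2 - 2  # t statistic and degrees of freedom

# DDCHCS total score: HCOS 2.73 (0.48), n = 10 vs. DDC 4.00 (0.16), n = 3
t, df = pooled_t(2.73, 0.48, 10, 4.00, 0.16, 3)
print(round(t, 2), df)
```

This yields t ≈ −4.39 on 11 degrees of freedom; the small difference from the reported −4.40 reflects rounding of the published means and standard deviations. Given how unequal the group variances are on some items, a Welch correction would be a defensible alternative, but the pooled form matches the tabled values well.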


4.2. Limitations

This is the first published report of a new measure of organizational capacity. With respect to external validity, the sample size was small and potentially not representative, with the validity threats associated with a convenience sample, such as volunteer selection biases. Because of this sampling limitation, these data may not generalize to other FQHCs and may actually reflect an inflated estimate of capacity relative to the population of FQHCs across the United States. With respect to internal validity, the site visits occurred at a single point in time, and may have been influenced by situational or personnel factors unique to that day. The DDCHCS methodology and measure are evolving, and were adapted from specialty care organizations that are quite different in purpose, size and staffing from FQHCs. Although all site assessors and raters were similarly trained, and inter-rater reliability estimates were excellent, there may have been unique features of the evaluation/evaluator processes that could account for some differences. There may be factors other than those measured in this small study, as well as dimensions other than those the DDCHCS evaluates, that could be more critical to patient outcome.

4.3. Future research

Bearing each of these potential limitations in mind, the findings from this small feasibility and field-testing study suggest that measure refinement and further research with the DDCHCS are warranted. These FQHCs, although a small convenience sample, represented significant geographic diversity in the United States, from the northeast to the west coast to the south and midwest. Arguably, organizations that volunteered to participate may be innovators or early adopters of integrated behavioral health care services, and thus these findings may not reflect the state of integration in FQHCs generally. A larger, more representative sample is necessary to discern the external validity of the findings. To reduce volunteer bias, FQHCs could be sampled from a national or within-state frame, and offered compensation to defray the cost of staff time to participate in interviews.

As with the DDCAT and DDCMHT, each now in version 4.0, the DDCHCS 2.0 will need to undergo an iterative process of refinement and revision. As a result of this study, the instrument has already been revised for the next stage of implementation. In addition to modifications suggested by these reliability study data, a significant revision involves the criteria for DDC level scores on the benchmark items. In the version of the DDCHCS described here, addressing substance use or mental health issues typically scored a "3" (DDC level) on most items. In some instances, this meant a more systematic approach to mental health concerns (e.g., depression) and a less systematic protocol for substance use. The fact that substance use was being addressed at all, perhaps in a variable manner or according to individual clinician preference, still resulted in a score of 3. This may be a lenient way to score DDC. The revised version of the DDCHCS will have more precise anchor definitions for rating each benchmark item.
To achieve a 3, both disorders must be addressed, from screening to intervention to continuity of care; if this is done increasingly by protocol and routine, the score may progress from a 3 (DDC) to a 5 (DDE). Addressing one or the other, substance use or mental health, in a non-integrated manner will not result in a score of 3 on the revised version of the DDCHCS. In addition, we discovered that concepts of admission criteria or discharge from services do not transfer directly from behavioral health to medical care settings. Medical providers responded to early explorations of these items with confusion: any patient who presents is seen, and once a patient, a person remains a patient. These items clearly needed to be revised and reframed.
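This revised anchor logic can be made concrete. The following is a hypothetical sketch, not the actual DDCHCS rubric: the function, parameters and stage names are illustrative of the rule described above, under which a 3 (DDC) requires both disorders to be addressed across the full care process and a 5 (DDE) additionally requires that this be done by protocol and routine.

```python
CARE_STAGES = {"screening", "intervention", "continuity"}

def benchmark_score(mh_stages, su_stages, by_protocol):
    """Illustrative anchor rule for one revised DDCHCS benchmark item.

    mh_stages / su_stages: the care stages (subsets of CARE_STAGES) at which
    mental health and substance use problems are addressed.
    by_protocol: True if both are addressed by standing routine and protocol
    rather than by individual clinician preference.
    """
    if not mh_stages or not su_stages:
        return 1  # only one disorder (or neither) addressed: HCOS level
    if not (CARE_STAGES <= set(mh_stages) and CARE_STAGES <= set(su_stages)):
        return 2  # both addressed, but not across the full care process
    return 5 if by_protocol else 3  # DDE if routinized, else DDC
```

Under a rule like this, a clinic that systematically screens and treats depression but handles substance use only at individual clinicians' discretion would no longer reach a 3.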

In the staffing dimension of the measure, this version of the DDCHCS rated the overall integration of expertise in both mental health and addiction treatments. In addition to this benchmark rating, a revised version of the instrument must also capture the exact number, percent of time and extent of formal expertise by discipline. These data will be important when considering staffing ratios, patterns and profiles, as well as calculations of the associated human resource costs that may be required to deliver quality integrated care.

A final measure-related question emerged about extending the use of the ASAM categories to medical care: HCOS, DDC or DDE. Why not "not integrated," "partially integrated" or "fully integrated" as applied to behavioral and medical care integration? At the present juncture, we suggest that the risk of integrating behavioral health and medical care while excluding one or the other "type" of problem, mental health or substance use, remains significant. For this reason, we believe that maintaining the HCOS, DDC and DDE framework is useful to assist this process, and to avoid repeating the historical mistake of system bifurcation.

Systematic studies of integrated behavioral health and general medical care models are needed (Butler et al., 2008). It may be that integrated care is associated with better outcomes and more efficient service delivery, but there is also some potential benefit in well-coordinated services that remain based in separate locations (Druss et al., 2006; Friedmann, Alexander, Jin, & D'Aunno, 1999). Studies such as that by Gurewich et al. (2012), which examine outcomes associated with service delivery models, are sorely needed. Co-location does not guarantee integration, and integration may not guarantee favorable outcomes.

Future research will feature larger and more representative samples, with DDCHCS assessments using a revised and improved measure.
Organizational factors, including size, leadership, agency tradition and affiliation, may also be examined as predictors of the degree of integration. More descriptive data about the number, discipline and documented expertise of staff will also be important to gather systematically. Funding mechanisms associated with better integration might also be elucidated. Implementation science studies of comparative and effective strategies designed to improve integration (as measured by the revised DDCHCS), much as with the DDCAT and DDCMHT, are necessary. Last but not least, as a matter of predictive validity, how organizational capacity for integrated behavioral health and medical care translates into patient outcomes is the most critical of research questions. At this stage of development, the DDCHCS appears to be a promising measure of organizational capacity for integrated behavioral and medical care services.

Acknowledgments

The authors are indebted to the following individuals without whose collaboration and expertise this study would not have been possible: Charles Brackett, M.D., Desiree Crevecoeur-MacPhail, Ph.D., Peter Friedmann, M.D., Julienne Giard, M.S.W., Shelley McGeorge, Ph.D., Deborah Nieri, M.A., Elizabeth Schaper, M.P.H., Randi Tolliver, Ph.D. and William Torrey, M.D. The authors are also grateful for the participation and support of the leadership, staff members and patients from the FQHCs involved in this project.

References

Adams, A. S., Soumerai, S. B., Lomas, J., & Ross-Degnan, D. (1999). Evidence of self-report bias in assessing adherence to guidelines. International Journal of Quality in Health Care, 11(3), 187–192.

Aldeman, P. K. (2003). Mental and substance abuse disorders among Medicaid recipients: Prevalence estimates from two national surveys. Administration and Policy in Mental Health, 31(2), 111–128.

Boyd, C., Leff, B., Weiss, C., Wolff, J., Hamblin, A., & Martin, L. (2010). Clarifying multimorbidity patterns to improve targeting and delivery of clinical services for Medicaid populations. Center for Health Care Strategies, Inc., December 2010. Retrieved January 20, 2012 from http://www.chcs.org/usr_doc/clarifying_multimorbidity_patterns.pdf.

Buck, J. (2011). The looming expansion and transformation of public substance abuse treatment under the Affordable Care Act. Health Affairs, 30(8), 1402–1410. http://dx.doi.org/10.1377/hlthaff.2011.0480.

Butler, M., Kane, R. L., McAlpine, D., et al. (2008). Integration of mental health/substance abuse and primary care. Evidence Report/Technology Assessment (Full Report), (173), 1–362.

Collins, C., Hewson, D., Munger, R., & Wade, T. (2010). Evolving models of behavioral health integration in primary care. New York: Milbank Memorial Fund.

Druss, B. G., Bornemann, T., Fry-Johnson, U. W., McCombs, H. G., Politzer, R. M., & Rust, G. (2006). Trends in mental health and substance abuse services in the nation's community health centers: 1998–2003. American Journal of Public Health, 96, 1779–1784.

Fink, A., Kosecoff, J., Chassin, M., & Brook, B. H. (1984). Consensus methods: Characteristics and guidelines for use. American Journal of Public Health, 74, 979–983.

Friedmann, P. D., Alexander, J. A., Jin, L., & D'Aunno, T. A. (1999). Onsite primary care and mental health services in outpatient drug abuse treatment units. Journal of Behavioral Health and Services Research, 26, 80–94.

Friedmann, P. D., Zhang, Z., Hendrickson, J., Stein, M. D., & Gerstein, D. R. (2003). Effect of primary medical care on addiction and medical severity in substance abuse treatment programs. Journal of General Internal Medicine, 18, 1–8.

Gotham, H. J., Brown, J. L., Comaty, J. E., McGovern, M. P., & Claus, R. E. (2009). The Dual Diagnosis Capability in Mental Health Treatment (DDCMHT) index. Presented at the annual Addiction Health Services Research conference, San Francisco, CA.

Gotham, H. J., Claus, R., Selig, K., & Homer, A. L. (2010). Increasing program capability to provide treatment services to clients with co-occurring substance use disorders and mental illness: The role of organizational characteristics. Journal of Substance Abuse Treatment, 38(2), 160–169.

Grantham, D. (2010). Looking into the crystal ball. Behavioral Healthcare: The practical resource for the field's leaders. Retrieved from: http://www.behavioral.net/ME2/dirmod.asp?sid=&nm=&type=Publishing&mod=Publications%3A%3AArticle&mid=64D490AC6A7D4FE1AEB453627F1A4A32&tier=4&id=35A0252458974B1C85F4991BB18617632010.

Gurewich, D., Sirkin, J. T., & Shepard, D. S. (2012). On-site provision of substance abuse treatment services at community health centers. Journal of Substance Abuse Treatment, 42, 339–345.

Halvorson, A. (2010). Implementing healthcare reform: First steps to transforming your organization: A practical guide for leaders. Moving Forward Alliance. Retrieved from: http://www.saasnet.org/PDF/Implementing_Healthcare_Reform-First_Steps.pdf.

Jones, J., & Hunter, D. (1995). Consensus methods for medical and health services research. British Medical Journal, 311, 376–380.

Katon, W., Russo, J., Von Korff, M., et al. (2008). Long-term effects on medical costs of improving depression outcomes in patients with depression and diabetes. Diabetes Care, 31(6), 1155–1159.

Kuehn, B. M. (2010). Treatment given high priority in new White House drug control policy. Journal of the American Medical Association, 303(9), 821–822. http://dx.doi.org/10.1001/jama.2010.210.

Lardiere, M. R., Jones, E., & Perez, M. (2011). National Association of Community Health Centers 2010 assessment of behavioral health services provided in federally qualified health centers. Bethesda: National Association of Community Health Centers.

Lee, N., & Cameron, J. (2009). Differences in self and independent ratings on an organizational dual diagnosis capacity measure. Drug and Alcohol Review, 28(6), 682–684.

Lo Sasso, A., & Byck, G. (2010). Funding growth drives community health center services. Health Affairs, 29, 289–296.

McGovern, M. P., Lambert-Harris, C., McHugo, G. J., Giard, J., & Mangrum, L. (2010). Improving the dual diagnosis capability of addiction and mental health treatment services: Implementation factors associated with program level changes. Journal of Dual Diagnosis, 6, 237–250.

McGovern, M. P., Xie, H., Acquilano, S., Segal, S. R., Siembab, L., & Drake, R. E. (2007). Addiction treatment services and co-occurring disorders: The ASAM-PPC-2R taxonomy of program dual diagnosis capability. Journal of Addictive Diseases, 26(3), 27–37.

McGovern, M. P., Xie, H., Segal, S. R., Siembab, L., & Drake, R. E. (2006). Addiction treatment services and co-occurring disorders: Prevalence estimates, treatment practices, and barriers. Journal of Substance Abuse Treatment, 31(3), 267–275.

McLellan, A. T. (2010). Addiction treatment in healthcare reform. Retrieved from: http://niatx.adobeconnect.com/p44628257/?launcher=false&fcsContent=true&pbMode=normal.

Mee-Lee, D., Schulman, G. D., Fishman, M., Gastfriend, D. R., & Griffith, J. H. (2001). ASAM patient placement criteria for the treatment of substance-related disorders. Chevy Chase, MD: American Society of Addiction Medicine, Inc.

National Association of Community Health Centers. (2009). United States health center fact sheet. Retrieved on July 30, 2010, from http://www.nachc.com/client/documents/United%20States%20FSv2.pdf.

National Association of Community Health Centers. (2010). Healthcare reform impact at a glance: What's in it for persons with mental and addiction disorders. National Council Magazine. Washington, DC: National Council for Community Behavioral Healthcare.

Office of National Drug Control Policy. (2010). National drug control strategy. Retrieved from: http://www.whitehouse.gov/sites/default/files/ondcp/policyand-research/ndcs2010.pdf.

Office of National Drug Control Policy. (2011). National drug control strategy. Retrieved from: http://www.whitehousedrugpolicy.gov/strategy/2011ndcs.html.

Office of the Surgeon General. (2001). Mental health: Culture, race, and ethnicity: A supplement to Mental health: A report of the Surgeon General. Rockville, MD: Substance Abuse and Mental Health Services Administration. Retrieved from: http://www.surgeongeneral.gov/library/mentalhealth/cre/sma-01-3613.pdf.

Parthasarathy, S., Mertens, J., Moore, C., et al. (2003). Utilization and cost impact of integrating substance abuse treatment and primary care. Medical Care, 41(3), 357–367.

Pating, D. R., Miller, M. M., Goplerud, E., Martin, J., & Ziedonis, D. M. (2012). New systems of care for substance use disorders. Psychiatric Clinics of North America, 35, 327–356.

Simon, G., Katon, W., Lin, E., et al. (2007). Cost effectiveness of systematic depression treatment among people with diabetes mellitus. Archives of General Psychiatry, 64(1), 65–72.

SPSS Inc. (2008). SPSS 17.0 for Windows. Chicago, IL.

Technical Assistance Collaborative / Human Services Research Institutes. (2012). California mental health and substance use needs assessment, January 30, 2012 draft. Retrieved from: http://www.dhcs.ca.gov/Documents/All%20chapters%20final%201-31-12.pdf.

U.S. Department of Health and Human Services Health Resources and Services Administration. (2010a). Primary health care. Retrieved August 16, 2010, from http://www.hrsa.gov/about/organization/bureaus/bphc/bphc.pdf.

U.S. Department of Health and Human Services Health Resources and Services Administration. (2010b). The Health Center Program: Recovery Act grants. Retrieved August 20, 2010, from http://bphc.hrsa.gov/recovery/.

Weisner, C., Mertens, J., Parthasarathy, S., Moore, C., & Lu, Y. (2001). Integrating primary medical care with addiction treatment: A randomized controlled trial. Journal of the American Medical Association, 286, 1715–1723.

Wells, K., Klap, R., Koike, A., & Sherbourne, C. (2001). Ethnic disparities in unmet need for alcoholism, drug abuse, and mental health care. American Journal of Psychiatry, 158(12), 2027–2032.

Wells, R., Morrissey, J. P., Lee, I. H., & Radford, A. (2010). Trends in behavioral health care service provision by community health centers, 1998–2007. Psychiatric Services, 61(8), 759–764.

Wu, L. T., Kouzis, A. C., & Schlenger, W. E. (2003). Substance use, dependence, and service utilization among the US uninsured nonelderly population. American Journal of Public Health, 93(12), 2079–2085.