Journal of Clinical Epidemiology 65 (2012) 544–552
No difference demonstrated between faxed or mailed prenotification in promoting questionnaire response among family physicians: a randomized controlled trial

Melina Gattellari a,b,*, Nicholas Zwar a, John M. Worthington c,d

a School of Public Health and Community Medicine, The University of New South Wales, Ingham Institute for Applied Medical Research, Centre for Research Management, Evidence and Surveillance, Sydney South West Area Health Service, Locked Bag 7008, Liverpool NSW 1871, Australia
b School of Public Health and Community Medicine, The University of New South Wales, Australia
c Department of Neurophysiology, Ingham Institute for Applied Medical Research, Sydney South West Area Health Service, Sydney, Australia
d Stroke and Neurology Services, Northern Beaches Hospitals, New South Wales, Australia

Accepted 9 August 2011
Abstract

Objective: Achieving high survey participation rates among physicians is challenging. We aimed to assess the effectiveness of response-aiding strategies in a postal survey of 1,000 randomly selected Australian family physicians (FPs).

Study Design and Setting: A two-by-two randomized controlled trial was undertaken to assess the effectiveness of a mailed vs. faxed prenotification letter and of a mailed questionnaire sealed with a label marked attention to the doctor vs. a control label. At the time of our final reminder, we randomized remaining nonresponders to receive a more or less personalized mail-out.

Results: Response did not significantly differ among eligible FPs receiving a prenotification letter via mail or fax. However, 25.6% of eligible FPs whose questionnaires were sealed with a label marked attention to the doctor responded before reminders were administered, compared with 18.6% of FPs whose questionnaires were sealed with a control label (P = 0.008). Differences were not statistically significant thereafter. There was no significant difference in response between FPs who received a more vs. less personalized approach at the time of the final reminder (P = 0.16).

Conclusion: Mail marked attention to the doctor may usefully increase early response. Prenotification letters delivered via fax are as effective as those administered by mail and may be cheaper. © 2012 Elsevier Inc. All rights reserved.

Keywords: Postal questionnaires; Response rate; Family physicians; Randomized controlled trial; Prenotification prompts; Message
1. Introduction

Postal surveys are widely used in public health and epidemiologic research [1–4]. Questionnaires may be used to describe characteristics of populations and collect outcome data for epidemiologic studies and randomized controlled trials. Random selection of study participants ensures adequate representation of the population from which the sample was drawn [5]; however, selection bias may still occur if a high response rate is not achieved. Further, larger samples increase the power of statistical analyses, reducing the likelihood of type II errors occurring.

* Corresponding author. School of Public Health and Community Medicine, The University of New South Wales, Ingham Institute for Applied Medical Research, Centre for Research Management, Evidence and Surveillance, Sydney South West Area Health Service, Locked Bag 7008, Liverpool NSW 1871, Australia. Tel.: +61-2-9612-0656; fax: +61-2-9612-0650. E-mail address: [email protected] (M. Gattellari).

doi: 10.1016/j.jclinepi.2011.08.014
Surveys of physicians provide insights into their knowledge, attitudes, and self-reported practices about clinical issues (e.g., [6–9]). Surveys may provide important information about gaps or deficiencies in current practice and guide the development of interventions designed to bridge those gaps. Physicians, including those engaged in general or family practice, are traditionally viewed as difficult to recruit into survey research [3,9–12]. A lack of time, inundation with requests to respond to surveys, and insufficient remuneration for research participation are possible reasons family physicians (FPs) do not reply to questionnaires [3,9,11,12]. Prenotification of a questionnaire mail-out via a letter, postcard, or phone call is proven to increase response rates among physicians [4,13–16]. Advance letters appear to be just as effective as phone calls and are easier to administer as they do not require direct contact with intended participants [15,16].
What is new?

Key findings
- Faxed prenotification letters alerting family physicians (FPs) to the imminent arrival of a questionnaire are just as effective in promoting response as mailed prenotification letters.
- Marking mail attention to the doctor increases early response rates.
- Personalizing a mail-out to promote response at the time of the last reminder letter does not increase response rates.

What this adds to what was known?
- The effect of faxed prenotification letters on physician response rates has not been previously evaluated, and we found these are just as effective as mailed prompts and may be cheaper to administer.
- Marking mail attention to the doctor as a "teaser" has not been previously evaluated and is a cheap method for improving early response.
- Attempts to increase response at a late stage of data collection may not prove effective.

What is the implication, what should change now?
- There is now evidence that phone, faxed, and mailed prenotifications of a survey mail-out to FPs are equally effective. Researchers may select the least costly method, depending on their circumstances.
- Marking mail attention to the doctor should be considered when first mailing questionnaires to FPs.
- Reminder letters remain essential response-aiding strategies to promote participation. Implementing strategies to increase response rates at a late stage of data collection should not be attempted without further evaluation.
Advance letters can be mailed, faxed, or e-mailed. A recently updated systematic review of response-aiding strategies suggests that trials of advance letters or postcards have only tested the effectiveness of mailed delivery and not other modes of delivery [4]. Mailed letters may be visually more appealing than faxed letters, and physicians may also appreciate the extra effort that is required to prepare and mail a formal letter. On the other hand, faxed letters are likely to save costs on postage and packaging. We are unaware of any head-to-head comparisons of mailing and faxing advance prompts to physicians.
Including a "teaser" message to arouse curiosity about the contents of a letter promoted a higher response rate in members of the general public [17]. To reach FPs, incoming mail must pass through practice staff, who may act as "gatekeepers" to filter and dispose of mail that may seem unimportant and outside a doctor's core business [3]. Marking mail as attention to the doctor may, therefore, increase the likelihood that surveys are delivered to the doctor. It is possible that such an attempt to gain attention for a nonclinical or nonconfidential matter could be perceived as relatively trivial and be counterproductive. There are a limited number of studies evaluating "teaser" messages and, to our knowledge, they have yet to be trialed in health professionals [4].

Personalizing mail has been shown to increase response rates [4] and can be achieved by including a stamped reply-paid envelope as opposed to a business reply-paid envelope [4,18–20] and by hand-signing cover letters [4]. In addition, researchers have varied the color of envelopes [4,21–23] with variable success. According to the most recent systematic review, none of these strategies appear to have been trialed with physicians. Stamps on outgoing envelopes were trialed in several studies in the 1970s in nonclinician samples [4]. This strategy did not increase response, although it may be worth reconsidering in a modern study, as stamps are now less commonly used than franked envelopes and may be seen as more distinctive. It is unclear whether combining several methods would amplify response rates beyond what can be expected if these methods are used singly.

We carried out a national survey of Australian FPs about their management of nonvalvular atrial fibrillation [24,25], incorporating a trial to assess the effectiveness of response-aiding strategies, including a prenotification letter that was either faxed or mailed to all FPs. At the time of mailing questionnaires, we sealed envelopes with a label that either did or did not include a message marked attention to the doctor. When the final reminder was due, we evaluated the effectiveness of a more personalized approach that differed from our previous approach. We hypothesized that nonresponders who received a combination of more personalized approaches would be significantly more likely to reply to the survey than FPs who received mail that looked similar to that previously sent.
2. Methods

2.1. Participant selection

As previously reported [24,25], we obtained from a commercial database the names, practice and preferred mailing addresses, and phone and fax numbers of 1,000 randomly selected FPs known to be in current practice. The database compiles details of practicing doctors within Australia, deriving
information from a variety of sources, including medical colleges, medical directories, and state medical boards. Demographic information was also provided, and we used postal code of practice to assess socioeconomic disadvantage, measured using a standardized decile rank measure, the Index of Relative Socioeconomic Disadvantage [26], and "remoteness" of practice location, measured using the Accessibility/Remoteness Index of Australia [27]. We did not carry out a pilot of the survey.
2.2. Response-aiding strategies and follow-up of nonresponders

We used a two-by-two factorial design to evaluate two response-aiding strategies (see Fig. 1). The first author, who was responsible for managing the project, used SPSS to randomly divide FP records into one of two groups: (1) to receive an advance letter via fax, or (2) to receive the advance letter via mail. Random allocation was achieved by using the "select if" command in SPSS [28] to request a random sample equating to exactly 500 of 1,000 names. The two lists were each randomly divided into two groups of 250, allocating FPs to receive either a fluorescent pink-colored label (control) or the same label with the following text printed in black: "Attention: to be opened by doctor only." The label measured approximately 9.8 by 3.5 cm (4 × 1.5 inches) and was affixed to the seal of the envelope. Faxed prompts were sent 8–15 days before the mail-out of questionnaires (i.e., October 4–11, 2005), and mailed advance prompts were sent 7 days before questionnaires were mailed (i.e., on October 12, 2005). The first author prepared all mail-outs, including the advance mailed prompt letters, whereas administrative assistants were responsible for faxing prompt letters.
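As an illustration of this two-stage allocation, the sketch below reproduces the logic in Python; it is an approximation of, not the actual, SPSS "select if" syntax we used, and the fp_id column is a hypothetical identifier introduced only for the example.

```python
import pandas as pd

# Illustrative re-creation of the 2 x 2 allocation (not the authors' SPSS code).
# Assumes a data frame with one row per FP and a hypothetical identifier column "fp_id".
fps = pd.DataFrame({"fp_id": range(1, 1001)})

# Randomly select exactly 500 of the 1,000 FPs to receive the faxed advance prompt;
# the remainder receive the mailed prompt.
fps["prompt"] = "mail"
fps.loc[fps.sample(n=500, random_state=1).index, "prompt"] = "fax"

# Within each prompt group, randomly allocate 250 FPs to the label with text
# ("Attention: to be opened by doctor only") and 250 to the control label.
fps["label"] = "control"
for _, grp in fps.groupby("prompt"):
    fps.loc[grp.sample(n=250, random_state=2).index, "label"] = "text"

print(fps.groupby(["prompt", "label"]).size())  # four cells of 250 FPs each
```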
Fig. 1. Participant flow for 1,000 family physicians. [The original flow diagram is summarized in text below.]

1,000 randomly selected general practitioners were randomized to an advance prompt delivered by mail (N = 500) or by fax (N = 500). On Day 1 (October 19, 2005) questionnaires were mailed, with each prompt group further randomized to a colored seal (N = 250) or a colored seal with text (N = 250). Reminders were sent to nonresponders on Day 15 (reminder letter), Day 29 (reminder letter plus another copy of the questionnaire), and Days 57 and 58 (reminder letter); phone calls to nonresponders commenced on Day 85.

Outcomes by group, in the order faxed prompt/colored seal; faxed prompt/colored seal with text; mailed prompt/colored seal; mailed prompt/colored seal with text:
- Questionnaires returned from eligible participants: 136 (54.4%); 141 (56.4%); 143 (57.2%); 147 (58.8%)
- Refusals: 38 (15.2%); 34 (13.6%); 38 (15.2%); 31 (12.4%)
- Ineligible: 17 (6.8%); 21 (8.4%); 17 (6.8%); 15 (6.0%)
- Reasons for ineligibility: died 0; 1; 0; 0. On extended leave 6; 2; 2; 2. Ill 0; 1; 0; 1. Left Australia 1; 1; 1; 0. Not in general practice 9; 13; 10; 9. Retired 1; 2; 4; 3. Unable to contact 0; 1; 0; 0.

Day 209: the final reminder was due for 222 GPs, of whom 216 were sent the final reminder. Six GPs were not sent the final reminder because a current practice address was unavailable (n = 2), the GP could not be contacted via phone (n = 3), or the GP had contacted the researchers to indicate they would complete the survey (n = 1). These 216 GPs were randomized to a changed reply-paid envelope only (N = 108; ineligible, n = 1 [retired]; responders, 18 [16.7%]) or to a changed cover letter, outgoing envelope, and reply-paid envelope (N = 108; ineligible, n = 3 [not in general practice, n = 1; unable to contact, n = 2]; responders, 11 [10.2%]).
All questionnaires were mailed in a yellow-colored envelope, with the institutional title, logo, and return address printed on the top left-hand corner. Each questionnaire was printed in blue ink on a sand-colored, 24-page, "saddle-stitched" A4-sized booklet. A business reply-paid envelope was included. Questionnaires were mailed on October 19, 2005. Three standardized mailed reminders were administered (see Fig. 1). All letters included a personalized salutation (Dear Dr X). With the exception of the reminder sent on days 57–58, FPs were advised at each contact of their potential to earn five continuing medical education points to be allocated by their professional college, and the labels used for all other mailings matched those that had been allocated to FPs at the initial mail-out. All letters were printed on institutional letterhead with the electronic signatures of the investigators. All mail was franked. The first author administered phone calls to the practices of the remaining nonresponders from day 85, offering to send another questionnaire if required. Call sheets for nonresponding FPs were prepared identifying FPs according to their name, contact details, sex, and identification number (not coded for group allocation). Up to six attempts were made to speak personally with the FP, after which a reminder note was faxed. The trial evaluating the impact of the response-aiding strategies ended on May 16, 2006 (day 209), the date we mailed the final reminder. Questionnaires were mailed to each FP's preferred mailing address, which we identified as being the address listed as the main practice, home, or another address (i.e., an address not identified as being the main practice address), or as unclear because the mailing address was a post office box or private mailing box.

2.2.1. Final reminder: assessment of more personalized vs. less personalized approach to nonresponders

All questionnaires were mailed with a stamped reply-paid envelope, as a meta-analysis had proven the effectiveness of this approach [4]. Using SPSS, we randomized the remaining nonresponders to receive a stamped outgoing manila-colored envelope or the same packaging as previously used with franked postage. The former group also received an A5-sized note, printed on striped, embossed sand-colored paper, hand-signed by one of the investigators (M.G.). Therefore, we were able to compare an approach that was more personalized or, alternatively, one that simply differed from our previous unsuccessful attempts. The final mailing also requested doctors to complete a form indicating if they had retired or were no longer in family practice. We had not planned to carry out this trial at the outset of the project, amending our protocol for the planned final reminder to formally test these strategies.

2.3. Outcome assessment

We defined our outcome as the number of questionnaires returned by eligible FPs divided by the number of all FPs
randomized to the experimental conditions. We calculated response rates across the key time points of scheduled reminders (see Fig. 1). Therefore, we were able to determine how effective our strategies were in reducing the researcher burden and resources required for implementing reminders.

2.4. Sample size and analysis

We aimed to receive 600 responses from FPs so that the 95% confidence interval (CI) of estimated proportions did not vary by more than ±2%. We approached 1,000 FPs aiming for a 60% response rate, achieved in a previous survey of Australian FPs about a similar topic [15]. Our study was powered to detect an 8.5% difference in final response rates between groups, assuming a baseline response of 60%, power of 80%, and alpha of 5%. The first author recorded dates associated with reminders and questionnaires and, therefore, completed data entry for this study and analyzed data. During the trial, 74 FPs were discovered to be ineligible (see Fig. 1). We included ineligible doctors in our denominator to perform an intention-to-treat analysis. Differences in proportions were assessed using chi-square tests, and differences in mean and median outcomes were assessed using analysis of variance and independent Student's t-tests for normally distributed variables and Mann–Whitney U and Kruskal–Wallis tests for nonnormally distributed variables. SPSS Version 18.0.2 was used for these analyses [28]. To determine the effect of type of prompt (fax or letter) and type of seal (marked attention to the doctor or control), we used log-binomial regression with PROC GENMOD in SAS software version 9.2 [29], first assessing the significance of the interaction between prompt and label type. If no significant interaction was demonstrated, we removed the interaction term, testing the main effects in the same model. Statistical significance was based on the likelihood ratio and was declared if P-values were less than 0.05.
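The stated power can be checked directly from the design assumptions (60% baseline response, an 8.5 percentage-point difference, 500 FPs per arm, two-sided alpha of 5%). The snippet below is a sketch using the statsmodels package, not the authors' original calculation.

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# Two-sample comparison of proportions: 68.5% vs. 60% response (an 8.5-point difference),
# 500 FPs per arm, two-sided alpha of 0.05.
effect_size = proportion_effectsize(0.685, 0.60)
power = NormalIndPower().power(effect_size=effect_size, nobs1=500, alpha=0.05, ratio=1.0)
print(f"power = {power:.2f}")  # roughly 0.79-0.80, consistent with the reported 80% power
```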
3. Results

3.1. Baseline characteristics

Characteristics appeared similar across groups, although, despite randomization, age, years since graduation, and decile ranking scores for socioeconomic disadvantage of practice address differed significantly across groups (see Table 1). The largest differences between groups in mean age, mean years since graduation, and median Socioeconomic Indexes for Areas (SEIFA) scores were 2.2 years, 2.1 years, and 1 point, respectively. Hence, the differences do not appear to be of practical importance, and the relatively large sample size may have provided sufficient power to detect small differences in continuous variables as statistically significant. Age and years since graduation were highly correlated, as expected (r = 0.95), with fewer missing values for the latter variable.
Table 1. Baseline characteristics of 1,000 randomly selected Australian general practitioners, by group

Values are given for the four groups in the order: (1) fax prompt, colored label without text (N = 250); (2) fax prompt, colored label with text (N = 250); (3) letter prompt, colored label without text (N = 250); (4) letter prompt, colored label with text (N = 250). The three P-values compare (a) fax vs. letter prompt, (b) colored label without vs. with text, and (c) all four groups.

Age, mean (95% CI): 47.9 (46.6–49.3); 49.9 (48.6–51.3); 48.9 (47.6–50.2); 50.1 (48.9–51.4). P = 0.37 (a), 0.01 (b), 0.07 (c)

Years since graduation, mean (95% CI): 23.3 (22.0–24.6); 24.8 (23.5–26.1); 23.8 (22.6–25.1); 25.4 (24.1–26.7). P = 0.41 (a), 0.02 (b), 0.11 (c)

Sex
  Female, n (%): 92 (36.8); 80 (32.0); 91 (36.4); 83 (33.2)
  Male, n (%): 158 (63.2); 170 (68.0); 159 (63.6); 167 (66.8)
  P = 0.84 (a), 0.16 (b), 0.57 (c)

Accessibility/remoteness index score of practice
  Highly accessible, n (%): 174 (70.2); 172 (69.1); 162 (65.3); 169 (67.6)
  Accessible, n (%): 39 (15.7); 46 (18.5); 53 (21.4); 48 (19.2)
  Moderately accessible, n (%): 26 (10.5); 25 (10.0); 31 (12.5); 28 (11.2)
  Remote/very remote, n (%): 9 (3.6); 6 (2.4); 2 (0.8); 5 (2.0)
  P = 0.16 (a), 0.98 (b), 0.54 (c)

Australian state of practice
  New South Wales, n (%): 85 (34.0); 88 (35.2); 75 (30.0); 82 (32.8)
  Victoria, n (%): 64 (25.6); 51 (20.4); 68 (27.2); 56 (22.4)
  Queensland, n (%): 37 (14.8); 51 (20.4); 45 (18.0); 59 (23.6)
  South Australia, n (%): 23 (9.2); 19 (7.6); 26 (10.4); 22 (8.8)
  Western Australia, n (%): 24 (9.6); 23 (9.2); 25 (10.0); 22 (8.8)
  Other, n (%): 17 (6.8); 18 (7.2); 11 (4.4); 9 (3.6)
  P = 0.23 (a), 0.16 (b), 0.44 (c)

SEIFA score of practice address, median (IQR): 7 (4–9); 7 (4–9); 6 (4–8); 6 (3–9). P = 0.01 (a), 0.89 (b), 0.06 (c)

Mailing address
  Listed as practice address, n (%): 129 (51.6); 129 (51.6); 135 (54.0); 144 (57.6)
  Home or other address, n (%): 91 (36.4); 81 (32.4); 81 (32.4); 75 (30.0)
  Unclear (d), n (%): 30 (12.0); 40 (16.0); 34 (13.6); 31 (12.4)
  P = 0.52 (a), 0.41 (b), 0.61 (c)

Abbreviations: SEIFA, Socioeconomic Indexes for Areas; IQR, interquartile range; CI, confidence interval.
(a) P-value comparing fax vs. letter prompt. (b) P-value comparing colored label without and with text. (c) P-value comparing all four groups. (d) Sent to private mailing box.
Results adjusting for years since graduation and decile ranking of socioeconomic disadvantage did not differ from unadjusted results (Table 2).

3.2. Evaluation of response-aiding strategies: prompt and seal type

The different interventions did not interact with each other (Table 2), allowing for the assessment of main effects. Differences between groups were not statistically significant at any time point according to whether FPs received a faxed or mailed prompt letter (Table 2). However, by the date the first reminder was due (Fig. 1), significantly more eligible FPs allocated to receive a label printed with text ("Attention: to be opened by doctor only") returned questionnaires compared with those who received a control label (n = 128, 25.6% vs. n = 93, 18.6%; relative risk = 1.38, 95% CI = 1.09–1.74; P = 0.008). The difference in response rates between these two groups was marginally significant before the second reminder was administered (34.8% vs. 29.4%; relative risk = 1.18, 95% CI = 0.99–1.42; P = 0.07). However, differences did not remain either practically or statistically significant after this second reminder. Response rates before the final reminder differed by less than 2% (see Table 2).
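The unadjusted early-response effect can be reproduced directly from the counts. The sketch below uses the standard large-sample confidence interval for a log relative risk rather than the authors' log-binomial model, and it returns the same reported figures.

```python
from math import exp, log, sqrt

# Early response (before the first reminder): "attention to doctor" label vs. control label.
resp_text, n_text = 128, 500   # responders / FPs randomized, label with text
resp_ctrl, n_ctrl = 93, 500    # responders / FPs randomized, control label

rr = (resp_text / n_text) / (resp_ctrl / n_ctrl)
se = sqrt(1 / resp_text - 1 / n_text + 1 / resp_ctrl - 1 / n_ctrl)  # SE of log(RR)
lower = exp(log(rr) - 1.96 * se)
upper = exp(log(rr) + 1.96 * se)
print(f"RR = {rr:.2f}, 95% CI {lower:.2f} to {upper:.2f}")  # RR = 1.38, 95% CI 1.09 to 1.74
```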
Table 2. Response rates by group, across study period

Values are n (%) or relative risk (95% CI) at each of five time points, in the order: before first reminder; before second reminder; before third reminder; before phone calls; before final reminder.

Fax prompt (N = 500), n (%): 110 (22.0); 163 (32.6); 222 (44.4); 230 (46.0); 277 (55.4)
Letter prompt (N = 500), n (%): 111 (22.3); 158 (31.6); 228 (45.6); 239 (47.8); 290 (58.0)
Unadjusted relative risk (95% CI): 0.99 (0.78–1.25); 1.03 (0.86–1.23); 0.97 (0.85–1.12); 0.96 (0.84–1.10); 0.96 (0.86–1.06)
Adjusted relative risk (95% CI): 1.01 (0.80–1.27); 1.03 (0.86–1.24); 0.98 (0.85–1.13); 0.97 (0.85–1.11); 0.96 (0.86–1.07)

Colored seal without text (N = 500), n (%): 93 (18.6); 147 (29.4); 218 (43.6); 225 (45.0); 279 (55.8)
Colored seal with text (N = 500), n (%): 128 (25.6); 174 (34.8); 232 (46.4); 244 (48.8); 288 (57.6)
Unadjusted relative risk (95% CI): 1.38 (1.09–1.74)*; 1.18 (0.99–1.42)***; 1.06 (0.93–1.22); 1.08 (0.95–1.24); 1.03 (0.93–1.15)
Adjusted relative risk (95% CI): 1.34 (1.06–1.70)**; 1.18 (0.99–1.42)***; 1.07 (0.93–1.23); 1.09 (0.95–1.24); 1.05 (0.94–1.17)

Prompt type × seal type, unadjusted P-value: 0.69; 0.36; 0.71; 0.87; 0.94
Prompt type × seal type (a), adjusted P-value: 0.57; 0.44; 0.59; 0.72; 0.97

Abbreviation: CI, confidence interval.
*P = 0.008. **P = 0.02. ***P = 0.07.
(a) Index of Relative Socioeconomic Disadvantage decile rank scores modeled categorically; years practicing as a family physician modeled categorically, created using quintile cutoff points.
We carried out post hoc analyses to further explore the effect of marking questionnaires attention to the doctor over the 209-day study period. We compared median days to respond among responding FPs. Respondents whose questionnaires were marked "attention to doctor" replied within a median of 19 days (interquartile range [IQR] = 9–41) compared with 27 days (IQR = 12–48) for other FPs (Mann–Whitney U test z = 2.21, P = 0.03). A Kaplan–Meier analysis of median time to respond over the 209-day study period revealed that it took 96 days (95% CI = 53.9–138.1) to achieve a 50% response rate among FPs randomized to receive a questionnaire marked "attention to doctor." This compared with 125 days among FPs randomized to the control label (95% CI = 91.3–158.7), a difference that was not statistically significant (log-rank test P-value = 0.28). This was consistent with findings from our main analysis, demonstrating that differences in response rates attenuated over time. We tested whether the effect of seal type interacted with the address the survey was mailed to (practice address, home or other address not otherwise associated with a practice address, or unclear); these analyses were not significant (P-values for interaction = 0.33–0.88 for unadjusted analyses and 0.66–0.89 for adjusted effects).
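A time-to-response analysis of this kind can be sketched as follows. The arrays below are short, hypothetical illustrations rather than the trial data, and the lifelines package is only one possible implementation of the Kaplan–Meier and log-rank methods described above.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical days-to-response for two label groups over a 209-day window;
# FPs who never responded are censored at day 209 (event = 0).
days_text = np.array([9, 14, 19, 30, 41, 96, 209, 209])
event_text = np.array([1, 1, 1, 1, 1, 1, 0, 0])
days_ctrl = np.array([12, 22, 27, 48, 125, 209, 209, 209])
event_ctrl = np.array([1, 1, 1, 1, 1, 0, 0, 0])

kmf = KaplanMeierFitter()
kmf.fit(days_text, event_observed=event_text, label="attention to doctor")
print("median days to respond (text label):", kmf.median_survival_time_)

kmf.fit(days_ctrl, event_observed=event_ctrl, label="control label")
print("median days to respond (control):", kmf.median_survival_time_)

# Log-rank test comparing the two time-to-response curves.
result = logrank_test(days_text, days_ctrl,
                      event_observed_A=event_text, event_observed_B=event_ctrl)
print("log-rank P-value:", round(result.p_value, 2))
```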
3.2.1. Final reminder: assessment of more personalized vs. less personalized approach to nonresponders

When the final reminder was administered, 567 eligible FPs had responded to the questionnaire (56.7%), whereas 70 (7.0%) were found to be ineligible, and 141 (14.1%) had refused to participate or declined for reasons that did not fulfill exclusion criteria. When compared with respondents, FPs who refused participation were significantly older (mean = 50.8 years vs. 48.8 years; P = 0.04) and had been in practice longer (mean = 25.8 years vs. 23.9 years; P = 0.05). There were no other statistically significant differences between these groups on the other variables listed in Table 1. Forwarding addresses for five FPs were being sought, whereas one FP had advised us that she/he would be returning his/her questionnaire (see Fig. 1), leaving 216 nonresponding FPs known to be eligible. FPs were randomized to receive a less personalized (n = 108) or more personalized (n = 108) approach when final reminders were mailed. This sample size was sufficient to detect an 18% difference in response between groups, assuming a 60% response rate in the control arm (α = 0.05; 1 − β = 80%). FPs in both groups were similar with respect to age, years since graduation, geographical accessibility of practice location, state of practice, SEIFA index score, and mailing address (P-values = 0.29–0.82). A lower proportion of FPs allocated to the more personalized approach responded to the survey compared with those allocated to the less personalized approach (n = 11, 10.2% vs. n = 18, 16.7%; see Fig. 1) (relative risk = 0.61, 95% CI = 0.29–1.18), although the result was not statistically significant (P = 0.16). At the end of data collection, we received 596 responses from 926 eligible FPs, yielding a response rate among FPs known to be
eligible of 64.4%. The final reminder increased the response by 2.9%.
4. Discussion

4.1. Main findings

This study evaluated strategies to enhance the response rate in a survey of Australian FPs. We incorporated a prenotification letter, as randomized trials carried out in Australia have proven the effectiveness of alerting physicians to the imminent arrival of a questionnaire [13–16], and a recent study in Ireland has reported a nonsignificant improvement in response rates with a mailed prenotification letter to FPs [6]. A systematic review has reported that both phone and mailed prenotifications improve response rates [4], whereas a randomized head-to-head comparison found phone and mailed prompts were equally effective for promoting response among FPs [15]. We found no difference in initial or subsequent response according to whether the prenotification letter was communicated by mail or fax.

The relative appeal of mailed or faxed letters may depend on how local circumstances influence the costs of both. In Australia, for example, there is a standard cost of mailing a letter to any region, whereas fax costs are metered for long-distance calls or charged at a flat rate for local calls. "Plans" available through telecommunication providers can greatly reduce the costs of both local and metered long-distance calls, and bulk postage can reduce the cost of mailings. Mailed letters have the cost disadvantage of institutional envelopes and letterhead for each mail-out, whereas faxed prompts can use a less expensive photocopy or a printable letterhead template. Together with other reports, our findings suggest that the type of prenotification may not be critical in promoting response and that researchers may select the least costly method for their circumstances. We are aware of one study that has trialed prenotification via e-mail compared with other methods, in a survey targeting public health practitioners (Adily A. Capacity for evidence-based policy and practice in Australian population health [PhD dissertation]. The University of New South Wales; 2009). E-mail prenotification may not always be feasible, as it is difficult to obtain doctors' e-mail addresses before contacting them. When carrying out our research, we had not considered e-mail prenotification, and our mailing list provider currently estimates that e-mail addresses are available for only around one-third of doctors (Australian Medical Publishing Company, July 11, 2011, personal communication). Delivering prompts via e-mail has obvious appeal as these can be automatically generated and are almost cost free.

FPs randomized to receive the fax prompt received prenotification 8–15 days before receiving their questionnaire, and questionnaires were mailed 7 days after mail prompts. When
planning our protocol, we wished to ensure that all FPs received the mailed questionnaires at around the same date, irrespective of group allocation. If our protocol had required sending questionnaires at a specific time after prenotifications had been sent (e.g., 7 days after prenotification), then questionnaire and reminder mail-outs for FPs receiving the faxed prompt would have been staggered over time, as not all faxes could be delivered on the same day. In contrast, FPs allocated to receive the mailed prompt would have been sent their questionnaires and subsequent reminders on the same date, as it would have been feasible to mail the prenotification on the same day for these FPs. Therefore, standardizing the time between prenotification delivery and questionnaire mail-out would have necessarily introduced differences between groups according to the dates when questionnaires and reminders were mailed. We wished to avoid confounding of response rates between groups because of when reminders and questionnaires were mailed, especially for dates where we could expect difficulties (e.g., coinciding with the Christmas/New Year and summer holiday period of 2005 to 2006). Second, we saw this as a pragmatic trial, accepting that delivering faxed prompts is dependent on human resources and available facilities. It may be argued that mailed prompts would be more effective than faxed prompts because they were received closer to the date that FPs received their questionnaire. Our null result, corroborating reports that the type of advance prompt does not influence response rates, calls such an interpretation into question. The timing of prenotification is unlikely to have had an effect on response rates, although further research may be useful, as evidence on the effect of prenotification timing is limited.

Marking the envelope "Attention: to be opened by doctor only" increased initial response by 7%. One quarter of FPs receiving the questionnaire package with the label marked for the doctor's attention responded after the first mailing compared with fewer than one-fifth in the control group. The effect was attenuated in subsequent mailings, and later differences between the label groups were not statistically significant. In studies evaluating different response-aiding strategies, early differences are often sustained at the conclusion of the study [4]. Our finding suggests FPs had become desensitized to any novelty or attention-grabbing effect of the label once they had already been made aware of the contents of the envelope. Alternatively, a reminder may have a stronger effect on response than the label strategy. Before implementing our final mail-out, each reminder (except the third, consisting of a brief note coinciding with the Christmas/New Year period) increased response rates by around 10%, irrespective of the type of reminder administered. The first was a reminder letter, followed by a second copy of the questionnaire and then a reminder telephone call. These different types of reminders incur varying resource costs, underscoring the importance of promoting early response to reduce the resources needed to follow up nonresponders. A recent systematic
review supports mailing a second questionnaire with a telephone reminder over no reminder [4]. However, only one of the studies included in the systematic review had targeted health professionals. That study randomized 78 participants to receive a second copy of the questionnaire or a telephone reminder with an offer to complete the questionnaire over the phone [30], and it showed that a second mailing of the questionnaire was more cost-effective than telephone contact and interviews. A recent study targeting Australian FPs, carried out after our own, found that both telephone and mailed reminders were equally effective [31]. This study did not use prenotification prompts and sent one mailed reminder to FPs. Future research may evaluate the effectiveness and cost implications of different types of reminders as well as their frequency and timing.

We also incorporated response-aiding strategies at the time we administered the final reminder. Our final reminder increased the response rate by only 2.9%. We combined several strategies that had been previously shown to be effective, increasing the personalization of the mail-out [4]. These results suggest that combining several response-aiding strategies does not compound the effectiveness of individual strategies. It should be noted that the final reminder occurred after four reminders had been administered. At such a late stage of the study, encouraging the participation of nonresponders may be particularly challenging. The results of our study suggest that using response-aiding strategies as part of the final reminder is unlikely to improve participation and that response-aiding strategies are best applied early in the study period.

4.2. Limitations

We did not calculate the relative costs of our different response-aiding strategies. We note that the costs of mailing or faxing prenotification letters will depend on the geographical reach of the survey and local charges, including any available mailing or telephone discounts. Cost-effectiveness analysis of the response-aiding strategies will depend on the location and scope of individual research projects. Affixing a label marked attention to the doctor may justify the minimal extra cost by improving early response. Our final reminder tested the effect of several strategies designed to increase the personalization of the mail-out; however, we did not keep these factors constant in our comparison arm to enable us to tease out more or less effective strategies. When using the number of FPs known to be eligible as our denominator, we achieved a response rate of 64.4% (596/926). Although this is a relatively high response rate, it is unclear if more could have been done to further improve response.

4.3. Conclusions

There is tension between an ethical obligation to respect the nonresponding physician's decision to ignore survey
requests and the ethical obligation of researchers to maximize the scientific validity of their research. The majority of physicians do not respond unless prompted. It therefore appears incumbent on researchers to maximize response rates to ensure that a survey represents the views of most physicians approached. Prenotification letters delivered via fax are as effective as mailed letters and may be cheaper to administer, and labeling mail for the doctor's attention may usefully increase early responses. Our findings suggest that future research efforts may best concentrate on improving early response. Late use of response-aiding strategies did not promote increased participation, suggesting that attempts to increase responses at late stages may be ineffective.
Acknowledgments

The authors thank the FPs who took part in our study, Tracey Coles and Courtney Worthington for delivering the faxed prompts, and Chris Goumas for statistical advice. This methodological component of the survey received ethics approval from the University of New South Wales Human Research Ethics Committee [HREC UNSW 05087]. The project was supported by an internal Early Career Researcher grant from the University of New South Wales, Faculty of Medicine, awarded to M.G. as Chief Investigator. M.G. was supported by an NHMRC Public Health Training Fellowship at the time the study was carried out (#300616). This project was also supported by a grant from the Australian Government, Department of Health and Ageing, Primary Health Care Research, Evaluation and Development Senior Research Fellowship awarded to M.G. at the time of undertaking analysis and writing results. The opinions expressed in this publication do not necessarily reflect those of the Commonwealth of Australia, which does not accept any liability for loss, damage, or injury incurred by the use of or reliance on the information contained herein.

References

[1] Nakash RA, Hutton JL, Jorstad-Stein EC, Gates S, Lamb SE. Maximising response to postal questionnaires – a systematic review of randomised trials in health research. BMC Med Res Methodol 2006;6:5.
[2] VanGeest JB, Johnson TP, Welch VL. Methodologies for improving response rates in surveys of physicians: a systematic review. Eval Health Prof 2007;30:303–21.
[3] Kellerman SE, Herold J. Physician response to surveys. A review of the literature. Am J Prev Med 2001;20:61–7.
[4] Edwards PJ, Roberts I, Clarke MJ, DiGuiseppi C, Wentz R, Kwan I, et al. Methods to increase response to postal and electronic questionnaires. Cochrane Database Syst Rev 2009;(3):MR000008.
[5] Hennekens CH, Buring JE. Epidemiology in medicine. In: Mayrent SL, editor. Boston, MA/Toronto, Canada: Little, Brown and Company; 1987. pp. 101–212.
[6] Drummond FJ, Carsin AE, Sharp L, Comber H. Factors prompting PSA-testing of asymptomatic men in a country with no guidelines: a national survey of general practitioners. BMC Fam Pract 2009;10:3.
[7] Bjertnaes OA, Garratt A, Botten G. Nonresponse bias and cost-effectiveness in a Norwegian survey of family physicians. Eval Health Prof 2008;31:65–80.
[8] Siversten LM, Woolfenden SR, Woodhead HJ, Lewis D. Diagnosis and management of childhood obesity: a survey of general practitioners in South West Sydney. J Paediatr Child Health 2008;44:622–9.
[9] Key C, Layton D, Shakir SA. Results of a postal survey of the reasons for non-response by doctors in a prescription event monitoring study of drug safety. Pharmacoepidemiol Drug Saf 2002;11:143–8.
[10] Cummings SM, Savitz LA, Konrad TR. Reported response rates to mailed physician questionnaires. Health Serv Res 2001;35:1347–55.
[11] Askew DA, Clavarino AM, Glasziou PP, Del Mar CB. General practice research: attitudes and involvement of Queensland general practitioners. Med J Aust 2002;177:74–7.
[12] McAvoy BR, Kaner EFS. General practice postal surveys: a questionnaire too far? BMJ 1996;313:732–3.
[13] Osborn M, Ward J, Boyle C. Effectiveness of telephone prompts when surveying general practitioners: a randomised controlled trial. Aust Fam Physician 1996;(Suppl 1):S41–3.
[14] Gupta L, Ward J, D'Este C. Differential effectiveness of telephone prompts by medical and nonmedical staff in increasing survey response rates: a randomised controlled trial. Aust N Z J Public Health 1997;21:98–9.
[15] Middleton S, Sharpe D, Harris J, Corbett A, Lusby R, Ward J. Case scenarios to assess Australian general practitioners' understanding of stroke diagnosis, management and prevention. Stroke 2003;34:2681–6.
[16] Priotta M, Gunn J, Farish S, Karabatsos G. Primer postcard improves postal survey response rates. Aust N Z J Public Health 1999;23:196–7.
[17] Dommeyer CJ, Elganyan D, Umans C. Increasing mail survey response with an envelope teaser. J Mark Res Soc 1991;33:137–40.
[18] Shiono PH, Klebanoff MA. The effect of two mailing strategies on the response to a survey of physicians. Am J Epidemiol 1991;134:539–42.
[19] Streiff MB, Dundes L, Spivak JL. A mail survey of United States haematologists and oncologists: a comparison of business reply versus stamped return envelopes. J Clin Epidemiol 2001;54:430–2.
[20] Urban N, Anderson GL, Tseng A. Effects on response rates and costs of stamps vs business reply in a mail survey of physicians. J Clin Epidemiol 1993;46:455–9.
[21] Roberts I, Coggan C, Fanslow J. Epidemiological methods: the effect of envelope type on response rates in an epidemiological study of back pain. Aust N Z J Occ Health Safety 1994;10:55–7.
[22] McCoy M, Hargie O. Effects of personalization and envelope colour on response rates, speed and quality among a business population. Indus Mark Manage 2007;36:799–809.
[23] Taylor KS, Counsell CE, Harris CE, Gordon JC, Fonesca SC, Lee AJ. In a randomized study of envelope and ink color, colored ink was found to increase the response rate to a postal questionnaire. J Clin Epidemiol 2006;59:1326–30.
[24] Gattellari M, Worthington JM, Zwar NA, Middleton SM. Barriers to the use of anticoagulation for nonvalvular atrial fibrillation: a representative survey of Australian family physicians. Stroke 2008;39:227–30.
[25] Gattellari M, Worthington JM, Zwar NA, Middleton SM. The management of non-valvular atrial fibrillation (NVAF) in Australian general practice: bridging the evidence-practice gap. A national, needs assessment survey. BMC Fam Pract 2008;9:62.
[26] Australian Bureau of Statistics. Socio-Economic Indexes for Areas (SEIFA) Technical Paper (ABS cat. no. 2039.0.55.001, 2006). Available at http://www.abs.gov.au/ausstats/[email protected]/mf/2039.0.55.001. Accessed February 7, 2011.
[27] Measuring remoteness: Accessibility/Remoteness Index of Australia (ARIA). Revised edition. Canberra: Commonwealth Department of Health and Aged Care; 2001. (Occasional Papers: New Series No. 14.) Available at http://www.health.gov.au/internet/main/publishing.nsf/Content/health-historicpubs-hfsocc-ocpanew14a.htm. Accessed February 7, 2011.
[28] SPSS for Windows, Rel. 18.0.2. Chicago, IL: SPSS Inc.; 2010.
[29] SAS Software, Version 9.2 of the SAS System for Windows © 2002–2008. Cary, NC: SAS Institute Inc.
[30] Ogborne AC, Rush B, Fondacaro R. Dealing with nonrespondents in a mail survey of professionals: the cost-effectiveness of two alternatives. Eval Health Prof 1986;9:121–8.
[31] Bonevski B, Magin P, Horton G, Foster M, Girgis A. Response rates in GP surveys – trialling two recruitment strategies. Aust Fam Physician 2011;40:427–30.