The jigsaw puzzle of fraudulent health claims: Missing psychological pieces

Journal Pre-proof

Scott O. Lilienfeld, Candice Basterfield

PII: S0277-9536(20)30037-X
DOI: https://doi.org/10.1016/j.socscimed.2020.112818
Reference: SSM 112818
To appear in: Social Science & Medicine
Received Date: 5 January 2020
Accepted Date: 27 January 2020

Please cite this article as: Lilienfeld, S.O., Basterfield, C., The jigsaw puzzle of fraudulent health claims: Missing psychological pieces, Social Science & Medicine (2020), doi: https://doi.org/10.1016/j.socscimed.2020.112818. This is a PDF file of an article that has undergone enhancements after acceptance but is not yet the definitive version of record.

© 2020 Published by Elsevier Ltd.

Fraudulent Health Claims Commentary 1

The Jigsaw Puzzle of Fraudulent Health Claims: Missing Psychological Pieces

Scott O. Lilienfeld
Emory University; University of Melbourne

Candice Basterfield
Emory University

Abstract Fraudulent health claims have become an inescapable fixture of the contemporary information (or misinformation) landscape. MacFarlane, Hurlstone, and Ecker (2020) provided a five-fold framework for conceptualizing susceptibility to fraudulent health claims, and propose potential remedies for each driver of these claims. We build on their analysis by arguing that a complete account of fraudulent health claim susceptibility additionally requires a thoroughgoing consideration of (a) individual differences in cognitive styles and personality traits, (b) innumeracy and statistical illiteracy, and (c) persuasive appeals involving logical fallacies and commonplaces. We further contend that dual processing models of cognition may help to synthesize a variety of psychological variables relevant to fraudulent health claim vulnerability. In conjunction with our commentary, MacFarlane’s framework underscores the broader point that complex, multifactorial psychological phenomena demand complex, multifactorial psychological explanations and solutions.

Keywords: fraudulent health claims; complementary and alternative medicine; dual processing models; innumeracy; statistical illiteracy; logical fallacies; commonplaces

Introduction

With the advent of social media, the internet, cable television, and other readily accessible sources, fraudulent health claims have become an inescapable fixture of the contemporary information (or, perhaps more accurately, misinformation) landscape (Merchant & Asch, 2018). A better understanding of the appeal of such claims, and of empirically supported means of combatting them, is essential to protecting the public from harm. In this respect, MacFarlane, Hurlstone, and Ecker (2020) are to be congratulated for their scholarly and thought-provoking review of why and how individuals are drawn to fraudulent health claims, especially those in the domain of complementary and alternative medicine (CAM). Their five-fold framework of psychological drivers of such claims, which comprises Visceral influence, Affect, Nescience, Misinformation, and Norms, reflects an impressive effort to integrate well-replicated findings from diverse psychological disciplines, including health psychology, physiological psychology, affect, social cognition, and the psychology of persuasion.

We concur with MacFarlane and co-authors that the appeal of fraudulent health claims is multifactorial, and that any effective approach to addressing them will almost certainly need to draw upon insights from multiple psychological subfields. We hope that their innovative synthesis of the literature stimulates further research into the psychological influences on acceptance of fraudulent health claims, as well as into evidence-based approaches to counteracting them.

Admittedly, we are somewhat less optimistic than are MacFarlane et al. (2020) that several of their suggested "treatments" (interventions) will prove effective in combatting this appeal, largely because many of them appear to rest on the assumption that CAM consumers will be receptive to alternative means of presenting, framing, and appraising health-related information.
For example, equipping consumers who are prone to conspiracy-related health

beliefs with analytic thinking skills, as MacFarlane et al. proposed, may be undermined by two propensities that they discuss: (a) confirmation bias (Nickerson, 1998; Tavris & Aronson, 2007), which comprises both the selective seeking out of congenial data (Hart et al., 2009) and the selective reinterpretation of data in schema-consistent ways (Munro et al., 2002); and (b) motivated reasoning, which involves invoking logic in the service of reaching a desired conclusion (Kunda, 1990). As a consequence, we worry that when presented with dissonant information, many CAM users will turn to alternative information sources that support their preferences, or will reinterpret extant information sources in a manner consistent with these preferences. Still, we concur with MacFarlane et al. that their proposed interventions are worthy of development and systematic testing in controlled trials.

In this commentary, we offer a set of friendly amendments to MacFarlane et al.'s (2020) review. Specifically, we consider three sets of potential psychological drivers of acceptance of fraudulent health claims that they did not discuss or to which they accorded short shrift: (a) individual differences in cognitive styles and personality, (b) innumeracy and statistical illiteracy, and (c) persuasive appeals that draw on logical fallacies and commonplaces. In doing so, we hope to fill in some of the gaps in our incomplete but growing understanding of why individuals are susceptible to fraudulent health claims.

Individual Differences in Cognitive Styles and Personality: A Dual Processing Perspective

In their closing comments, MacFarlane et al. (2020) acknowledge that their framework does not incorporate individual differences that might render certain people particularly susceptible to fraudulent health claims.
At the same time, a better delineation of individual difference correlates of psychological phenomena can often provide valuable clues to causal mechanisms. For example, data demonstrating that individuals who are especially prone to

illusory correlations are more vulnerable than other individuals to CAM would buttress MacFarlane et al.'s assertion that nescience is one driver of fraudulent health claim acceptance. In addition, a better understanding of individual difference correlates can sometimes inform intervention efforts, as such correlates may be moderators of treatment response (Harkness & Lilienfeld, 1997). In the case of fraudulent health claims, intervention efforts might be more fruitfully directed toward consumers who are especially prone to nescience or scientific misinformation, among other individual difference variables.

We regard individual differences as conceptually orthogonal to MacFarlane et al.'s (2020) hypothesized five drivers of fraudulent health claim susceptibility. Rather than constituting an independent set of drivers in their own right, we posit that they can operate as moderators, potentiating vulnerability to each driver. Nevertheless, this hypothesis awaits empirical corroboration in studies examining statistical interactions between individual differences and MacFarlane et al.'s drivers in predicting fraudulent health claim susceptibility.

Dual processing models of cognition (Kahneman, 2011; Stanovich & West, 2000), which posit the existence of two overarching processes governing human thinking, afford one helpful rubric for organizing the myriad individual difference variables potentially relevant to susceptibility to fraudulent health claims. Such models have not escaped scientific criticism, as some authors have argued that these two processes reflect opposing poles of a single dimension rather than separate dimensions (e.g., Keren & Schul, 2009; Kruglanski & Orehek, 2007). Moreover, as Kahneman (2011) observed, these two systems should not be reified, as they are better conceptualized as useful metaphors than as distinct or discrete psychological structures.
Nevertheless, dual processing models synthesize large bodies of well-replicated scientific

evidence suggesting that human thinking operates in two broadly different modes (Evans & Stanovich, 2013). According to perhaps the most influential instantiation of these models, cognition operates in two qualitatively distinct modes: System 1 (sometimes also termed Type 1), which is fast, intuitive, and guided by heuristic processing, and System 2 (sometimes also termed Type 2), which is slow, reflective, and guided by analytic processing (Kahneman, 2011). According to this model, System 2 performs an override function, checking the "gut hunch" outputs of System 1 and contravening them when necessary. Nevertheless, because humans are cognitive misers (Fiske & Taylor, 2013), System 2 tends to be "lazy" and frequently allows System 1 cognitions to prevail even when reflective processing is warranted. Individual differences in the strength of these two systems may carry important implications for the appraisal of numerous unsupported claims.

Some authors, such as Stanovich (2009), further subdivide System 2 deficits into mindware gaps and contaminated mindware, with the former reflecting an absence of knowledge arising largely from inadequate learning or education (or what MacFarlane et al., 2020, term nescience) and the latter reflecting distorted knowledge arising largely from thinking styles that predispose toward irrational beliefs. Although MacFarlane et al. (2020) acknowledged the role of "irrational affective associations" in fostering fraudulent health claim vulnerability, they did not review literature on individual differences in cognitive dispositions, such as intuitive thinking styles, that may predispose to such associations.
Multiple studies suggest that CAM acceptance and use are positively associated with intuitive thinking styles characteristic of System 1 (Galbraith, Moss, Galbraith, & Purewal, 2018; Lindeman, 2011; Saher & Lindeman, 2005; Wheeler & Hyland, 2008), as assessed by subscale scores on the Rational-Experiential Inventory (Pacini & Epstein,

1999). These dispositions ostensibly stem from what Stanovich (2009) described as contaminated mindware. At least among practitioners, these findings extend to acceptance of unsubstantiated mental health claims as well: in two studies of psychotherapists, intuitive thinking styles were tied to greater acceptance of CAM claims and misgivings regarding the use of evidence-based psychological interventions (Gaudiano, Brown, & Miller, 2011), as well as to more negative attitudes toward empirically supported psychological treatments (Seligman et al., 2016).

Broadly consistent with these findings on intuitive thinking styles are results indicating that acceptance of CAM is also positively correlated with acceptance of other largely unsupported assertions that may reflect System 1 contaminated mindware, such as paranormal beliefs (e.g., beliefs in extrasensory perception and astrology; Lindeman, 2011) and magical thinking about foods and health (e.g., the belief that many illnesses are caused by imbalances in energy currents, or the belief that because our bodies consist of 70 percent water, our diets should consist of 70 percent water; Bryden, Browne, Rockloff, & Unsworth, 2018; Lindeman, Keskivaara, & Roschier, 2000; see Oliver & Wood, 2018, for a broader discussion of the potential roles of paranormal beliefs and magical thinking in fostering irrational beliefs).

Personality research suggests that, among the traits of the five-factor model of personality (which comprises the dimensions of extraversion, neuroticism, agreeableness, conscientiousness, and openness to experience), openness to experience is the most robust correlate of acceptance of CAM claims and CAM use (Galbraith et al., 2018; Honda & Jacobson, 2005; Smith et al., 2008; Sirois & Gick, 2002). This trait also appears related to skepticism toward vaccines (Browne, Thomson, Rockloff, & Pennycook, 2015).
Given that openness to experience is tied to self-reported intuitive thinking (Sobkow, Traczyk, Kaufman, & Nosal, 2018), these findings are potentially consistent with those linking intuitive thinking to CAM. Still, they are challenging to

interpret within a dual processing framework, given that openness to experience comprises two facets, one reflecting imagination/fantasy and another reflecting intellect/cognitive engagement, which often diverge markedly in their external correlates (DeYoung, 2014). It is possible that the imagination/fantasy component of openness, which is ostensibly more tied to System 1 thinking, is positively associated with CAM attitudes and use, whereas the intellect/cognitive engagement component of openness, which is ostensibly more tied to System 2 thinking, is negatively associated with these variables. Indeed, scientific curiosity, which is presumably a marker of the intellect/cognitive engagement component of openness, is tied to a heightened preference for information that challenges one's views, at least in the political realm (Kahan, Landrum, Carpenter, Helft, & Hall Jamieson, 2017). Nevertheless, to our knowledge, this hypothesis has yet to be put to an empirical test in the domain of fraudulent health claims.

One potentially fruitful line of research suggested by the literature cited above could examine the effectiveness of framing health-related information in a manner that is largely consistent with System 1 thinking styles, including intuition. For example, data indicate that presenting health risk information in natural frequencies (e.g., one in ten persons) as opposed to conditional probabilities (10 percent) yields more accurate estimates of health risks (Hoffrage & Gigerenzer, 1998; McDowell, Galesic, & Gigerenzer, 2018), and individuals with a preference for System 1 thinking may be especially likely to benefit from such intuitively appealing framing manipulations.

Innumeracy and Statistical Illiteracy

One of the formidable challenges to interpreting health information is that it typically requires at least a modicum of numerical and statistical literacy, both of which are in short supply among much of the U.S.
population, even among the highly educated (Paulos, 1988).
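The advantage of natural-frequency framing noted above can be illustrated with a brief sketch. The screening numbers below (1% prevalence, 80% sensitivity, 9.6% false-positive rate) are the stock textbook illustration, not figures taken from this article; the point is only that the same diagnostic computation can be restated as whole-number counts.

```python
# The same diagnostic question posed two ways: as conditional
# probabilities, and as natural frequencies (whole-number counts).
# All numbers are a standard textbook illustration, not from the article.

def ppv_from_probabilities(prevalence, sensitivity, false_pos_rate):
    """P(disease | positive test) via Bayes' rule on conditional probabilities."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * false_pos_rate
    return true_pos / (true_pos + false_pos)

def ppv_as_natural_frequencies(prevalence, sensitivity, false_pos_rate,
                               population=1000):
    """Restate the same computation as counts out of a round population,
    the format Hoffrage and Gigerenzer found easier to interpret."""
    sick = round(population * prevalence)                               # 10 of 1,000
    sick_and_positive = round(sick * sensitivity)                       # 8 of those 10
    healthy_and_positive = round((population - sick) * false_pos_rate)  # 95 of the 990
    return sick_and_positive, sick_and_positive + healthy_and_positive

print(f"{ppv_from_probabilities(0.01, 0.80, 0.096):.1%}")          # 7.8%
hits, positives = ppv_as_natural_frequencies(0.01, 0.80, 0.096)
print(f"{hits} of {positives} positive tests are true positives")  # 8 of 103
```

Stating the result as "8 of 103 positive tests" conveys the same posterior probability as "7.8%" while demanding no probability arithmetic from the reader, which may be why the natural-frequency format aids interpretation.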

Although the distinction between numeracy and statistical literacy is fuzzy, most authors regard the former as comprising basic knowledge of mathematics and the latter as comprising the application of this knowledge to the evaluation of everyday claims regarding probabilities, such as base rates and the like (Gal, 2002). Innumeracy and statistical illiteracy fall broadly under the driver that MacFarlane et al. (2020) termed nescience and can probably be regarded as a fourth barrier within this category. Nevertheless, innumeracy and statistical illiteracy go considerably beyond the manifestation of nescience they underscore, namely, the tendency to detect patterns in random data. In addition, innumeracy and statistical illiteracy ostensibly reflect mindware gaps characteristic of deficient System 2 thinking (see Stanovich, 2009).

Evidence suggests that low levels of numeracy, such as difficulties grasping probabilities and ratios, are associated with poorer health outcomes in the general population (Reyna & Brainerd, 2007), although the mechanisms underpinning this association are unclear. One potential mechanism is poor medical decision-making, as low levels of numeracy are predictive of inaccurate estimates of health behaviors and health risks (e.g., the benefits and dangers of mammography screening; Reyna, Nelson, Han, & Dieckmann, 2009). Provisional evidence suggests that these associations persist even after controlling for scores on measures of education and intelligence (Reyna & Brainerd, 2007), although further research along these lines is needed.

Compounding the problem, medical information is commonly portrayed in incomplete or misleading ways, lending itself to misinterpretation of health risks and benefits (Gigerenzer, Gaissmaier, Kurz-Milcke, Schwartz, & Woloshin, 2007; Gigerenzer, Wegwarth, & Feufel, 2010).
To take merely one example, the media frequently presents data on the effectiveness of medical interventions or the hazards of medical risk factors strictly in terms of relative as opposed to absolute ratios. Because the former ratios omit data on base rates, they can result in a

grossly exaggerated picture of treatment effectiveness or of health risks (Gigerenzer et al., 2007). For instance, a website might proclaim that a novel medication "cuts heart attack risk in half." This result sounds impressive until one learns that the drug diminishes heart attack risk from .05% to .025%, a tiny drop in absolute terms (see Victory, 2017). One content analysis revealed that media coverage of CAM techniques emphasizes relative rather than absolute risk ratios (Bonevski, Wilson, & Henry, 2008), perhaps contributing to an overstated impression of their effectiveness in the eyes of the general public (although this analysis did not examine whether this preference was more marked for CAM than for traditional medicine websites). Indeed, data suggest that health consumers are more likely to select an intervention if its benefits are framed in relative as opposed to absolute terms, and do so with considerable confidence (Hux & Naylor, 1995). It is plausible that CAM advocates sometimes make use of such ratios to play up the hazards of conventional medical interventions, although we are unaware of systematic evidence bearing on this hypothesis.

Surprisingly, few studies have examined the relation between innumeracy and the evaluation of health claims, let alone fraudulent health claims per se (Apter et al., 2008). Still, in one investigation, participants who scored poorly on a one-item measure of numeracy were more likely to overestimate the benefits of an experimental cancer intervention (Weinfurt et al., 2003).

One key feature shared by numerous fraudulent health claims is a propensity to draw heavily on anecdotal and testimonial information as persuasive techniques (Ernst, 2004). Such "anecdata," also referred to as "person who" statistics ("I know a person who had cancer and recovered soon after being treated with shark cartilage"; Stanovich, 2017), are typically of questionable veracity and generalizability.
Yet, because they tend to be vivid, memorable, and subjectively compelling, they are often more convincing than base rate data derived from large

samples. Individuals with low levels of numeracy appear to be particularly vulnerable to the lure of anecdotal information (Scurich, 2015), perhaps in part because of denominator neglect, which is a common bias among members of the general public (Reyna & Brainerd, 2008). As MacFarlane et al. (2020) noted in their discussion of reporting bias (which they regard as a contributor to the illusion of causality), such individuals may focus largely or exclusively on the small number of individuals (in the numerator) who report positive outcomes from an intervention, such as an untested herbal remedy, yet neglect to consider the large number of individuals (in the denominator) who may have failed to respond to this intervention. Fortunately, some evidence suggests that the undue impact of anecdotal data on medical decisions can be partially overcome by intuitively understandable visual aids, such as pictographs (icon arrays; Fagerlin, Wang, & Ubel, 2005; Garcia-Retamero & Cokely, 2013), and that such aids may be especially helpful among individuals with low numeracy.

Clearly, there are many gaps in our knowledge regarding the associations between statistical literacy and numeracy, on the one hand, and vulnerability to fraudulent health claims, on the other. In particular, there are few data directly linking these constructs to the acceptance and use of unsubstantiated medical techniques, and even fewer data on potential mechanisms underlying this association. Furthermore, in conducting such research, investigators will need to more consistently incorporate statistical controls for potential third variables, including educational level, which is positively associated with endorsement of most CAM claims (Wolsko, Eisenberg, Davis, & Phillips, 2004).
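Two of the numerical pitfalls described in this section, relative-versus-absolute risk framing and denominator neglect, can be made concrete in a short sketch. All figures are hypothetical: the heart-attack risks echo the illustration given earlier, and the testimonial counts are invented solely for this example.

```python
# Sketch of two numerical pitfalls discussed above.
# All figures are hypothetical and chosen only for illustration.

def risk_reductions(baseline_risk, treated_risk):
    """Relative vs. absolute risk reduction for a treatment."""
    absolute = baseline_risk - treated_risk
    relative = absolute / baseline_risk
    return relative, absolute

# "Cuts heart attack risk in half": 0.05% -> 0.025%.
relative, absolute = risk_reductions(0.0005, 0.00025)
print(f"relative reduction: {relative:.0%}")   # 50% -- the headline number
print(f"absolute reduction: {absolute:.3%}")   # 0.025% -- the full picture

def response_rate(successes, total_users):
    """Denominator neglect: testimonials report only the numerator;
    the response rate requires the full denominator."""
    return successes / total_users

testimonials = 30   # vivid success stories posted on a website
all_users = 2000    # everyone who tried the remedy, mostly unreported
print(f"rate with full denominator: {response_rate(testimonials, all_users):.1%}")  # 1.5%
```

Judged against the full denominator, 30 glowing testimonials correspond to a response rate of only 1.5 percent, information that the testimonials by themselves cannot convey.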
Moreover, although we have underscored the potential linkages between low statistical literacy and numeracy, on the one hand, and inadequate medical decision-making, on the other, there are some grounds for predicting the opposite pattern of associations. Research by Kahan

and colleagues (Kahan, Peters, Dawson, & Slovic, 2017; Kahan et al., 2012) raises the possibility that high levels of numeracy and scientific literacy can in certain cases predispose to unwarranted beliefs, such as rejection of threats arising from global warming. One interpretation of these findings is that superior mathematical and scientific knowledge may enable individuals to justify their decisions via motivated reasoning, even when these decisions are inconsistent with research evidence. We encourage systematic investigations of this hypothesis in the domain of fraudulent health claims, especially in circumstances in which individuals may feel compelled to rationalize their dubious health decisions.

Persuasive Appeals: Logical Fallacies and Commonplaces

Pseudosciences and allied questionable disciplines frequently make use of persuasive appeals (Gambrill & Reiman, 2011; Herbert et al., 2000; Pratkanis, 1995). From a dual processing perspective, most of these appeals probably exploit System 1 cognition by capitalizing on the heuristic thinking styles typical of contaminated mindware. From the perspective of an influential dual processing account in social psychology, the elaboration likelihood model (Cacioppo & Petty, 1986), persuasive appeals associated with unsubstantiated claims typically bypass the central route to persuasion (akin to System 2), which relies on deliberative, thoughtful consideration of the merits of advertised products, and instead operate via the peripheral route to persuasion (akin to System 1), which emphasizes shallow and superficial features (e.g., popularity, attractiveness, apparent sophistication) of these products.
In the domain of fraudulent health claims, persuasive appeals often assume the form of logical fallacies, which are errors in thinking, and commonplaces, which are commonly accepted beliefs or memes that are often oversimplified or misleading (Myers, 2007; Pratkanis, 1995). In

their discussion of norm appeals, MacFarlane et al. (2020) discussed three widespread logical fallacies: the appeal to duty, the appeal to tradition, and the appeal to authority. Nevertheless, logical fallacies relevant to the marketing of fraudulent health claims go well beyond norms; in addition, proponents of these claims frequently exploit a variety of commonplaces. In this section, we identify 10 frequently used logical fallacies and commonplace persuasive tactics relevant to the dissemination of fraudulent health claims that were not considered by MacFarlane et al. (2020). Many of these tactics appeal to our intuitions and wishes, and are likely to bypass the central route of persuasion and enter via the peripheral route. Our list is by no means exhaustive (see also Ernst, 2013), although it adds to MacFarlane et al.'s overview by highlighting (a) additional logical fallacies, many of which go beyond the normative appeals they highlight, and (b) commonplaces. At least one of these fallacies, the bandwagon fallacy, would also fit comfortably within MacFarlane et al.'s section on normative appeals.

Each of the seven logical fallacies we present is an informal logical fallacy, meaning that it does not violate formal rules of logic even though it typically yields incorrect conclusions in practice. Nevertheless, when taken to an extreme, these fallacies are almost invariably erroneous. For example, although it is not necessarily incorrect to be skeptical of a study funded exclusively by a pharmaceutical company, it is almost always incorrect to dismiss this study entirely on the basis of its funding source, as the validity of a study's conclusion must ultimately rise or fall on its own merits (see "Genetic fallacy"). We summarize each of these logical fallacies and commonplaces here, and refer readers to Table 1 for an example of each tactic taken verbatim from a CAM-related website or blog:

• Anecdotal fallacy: Error of assuming that because product X appeared to help one or more individuals, it is likely to be effective ("Fallacy Files," n.d.);

• Argument from ignorance fallacy (ad ignorantiam fallacy): Error of assuming that because product X has not been proven ineffective, it can be assumed to be effective (Woods & Walton, 1978);

• Bandwagon fallacy (ad populum fallacy): Error of assuming that because product X is popular or widely used, it is effective (Briggs, 2014);

• Fallacy of exaggerated conflict: Error of overstating the extent to which scientists disagree on a given claim, and then using this presumed lack of consensus to call the claim into question (Byrne, n.d.);

• Genetic fallacy: Error of attacking the validity of a claim solely on the basis of its origins (or genesis, hence the name of the fallacy; Goudge, 1961);

• Hyman's fallacy: Error of asking how a treatment works before verifying that it works to begin with, thereby placing the cart before the horse; named after psychologist Ray Hyman, who argued that before seeking to explain how a phenomenon works, we should first make sure that it is real (Loxton, 2015);

• Nirvana fallacy (perfect solution fallacy): Error of assuming that because a discipline, such as conventional medicine, is imperfect, it should be rejected; this fallacy erroneously presumes that the alternative solution is necessarily superior (Biesecker, 2013);

• Glittering generalities commonplace: Portraying product X in entirely positive terms, with no acknowledgement of potential negative effects or side effects (Ramey, 2005);

• Goddess-within commonplace: Appealing to an ostensibly magical spiritual essence possessed by humans that is neglected by conventional disciplines, such as modern medicine (Pratkanis, 1995);

• Science commonplace: Appealing to technical-sounding terms and concepts to lend an unscientific discipline the aura of a genuine science (Pratkanis, 1995).

Conclusion

MacFarlane et al. (2020) have done the field a valuable service by bringing together

findings from diverse psychological subdisciplines to help us better understand why so many people, even the best educated, are drawn to fraudulent health claims. We have elaborated on MacFarlane et al.'s (2020) analysis by contending that dual processing models of cognition (Kahneman, 2011; Stanovich & West, 2000), including the distinction between mindware gaps and contaminated mindware (Stanovich, 2009), may provide a fruitful organizing framework for many of their proposed drivers as well as ours. Furthermore, we have proposed that consideration of individual differences in cognitive styles, personality, and innumeracy and statistical illiteracy may shed further light on vulnerability to fraudulent health claims. We have also argued that attention to persuasive tactics that go beyond normative appeals, many of which may bypass the thoughtful consideration afforded by System 2 processing, would extend and enrich MacFarlane et al.'s excellent starting framework.

Their analysis, supplemented by our commentary, highlights the obvious but often overlooked point that complex, multifactorial psychological phenomena demand complex, multifactorial psychological explanations and solutions (Manzi, 2012). Furthermore, a consideration of individual differences suggests that one-size-fits-all interventions may be insufficient to solve the puzzle of fraudulent health claim vulnerability. Given the increasing visibility of fraudulent health claims in the popular media and in everyday life more generally, the need for sophisticated causal frameworks and intervention approaches is arguably more pressing than ever.

Table 1
Ten Additional Logical Fallacies and Commonplaces

Anecdotal fallacy: "In more than forty years of practice, we have amassed many stories [acupuncture success stories] about the positive outcomes experienced by our patients. Below is a selection of testimonials" ("Chinese medicine works," n.d.).

Argument from ignorance fallacy: "Scientists haven't proven that horny goat weed increases sexual function in humans. However, anecdotal evidence, along with certain animal studies, suggests it has some of the following benefits: Helps to increase testosterone production…" (Levy, 2018).

Bandwagon fallacy: "Homeopathic medicine is so widely practiced by physicians in Europe that it is no longer appropriate to consider it "alternative medicine" there. Approximately 30% of French doctors and 20% of German doctors use homeopathic medicines regularly, while over 40% of British physicians refer patients to homeopathic doctors…The fact that the British Royal Family has used and supported homeopathy since the 1830s reflects its longstanding presence in Britain's national health care system" (Ullman, 2017).

Fallacy of exaggerated conflict: "There is controversy amongst doctors as to whether homeopathy has proven effectiveness. This is evident from such titles as 'Is homeopathy a placebo?'[2] and 'Homoeopathy: medicine or magic?'[3] There have been relatively few well-conducted trials evaluating it." (Coker, 1999).

Genetic fallacy: "Conventional medicine generates profit for pharma and funding for healthcare and charities. It's easy to see why no-one in mainstream healthcare is interested in looking too deeply at the existing research that supports complementary care, or to seek their own truths" (The Natural Doctor, 2019).

Hyman's fallacy: "It remains unclear how homeopathy works, but a German study shows it effective for hay fever" (Castleman, 2002).

Nirvana fallacy: "If allopathic medicine always fixed the problem, nobody would ever need holistic healing modalities. But, alas, modern medicine is far from perfect… Complementary medicine, with its holistic healing focus on the whole individual, is often able to make progress with people who had no success with drugs or surgeries" ("The Formula for Miracles," n.d.).

Glittering generalities commonplace: "Acupuncture is one of the safest treatment methods you can use to help manage pain, stress, and mobility issues. It will help with any physical injury, or sore, strained muscles…It helps with digestive issues, hormone balancing, headaches, vertigo, tinnitus, allergies, insomnia, anxiety and depression etc. and it is virtually side-effect free." (Adam, n.d.).

Goddess-within commonplace: "Aromatherapy is a holistic therapy that treats the mind, body and spirit" ("Absolute Aromas," n.d.).

Science commonplace: "Pulsing magnetic fields from the hands of Reiki therapists are in the same frequency ranges that are optimal for stimulating tissue repair. Biologically optimal levels of electromagnetic frequencies for stimulating human tissue repair are all in what's called the extremely low frequency (ELF) range. They have been documented as 2 cycles per second (Hz) for nerve regeneration, 7 Hz for bone growth, 10 Hz for ligament repair, and 15 Hz for capillary formation" (Doran, 2009).

References

Absolute Aromas. (n.d.). The benefits of aromatherapy. https://www.absolutearomas.com/cms/cms.jsp?menu_id=24356

Adam, B. (n.d.). The positive side-effects of acupuncture. https://www.pathwayshealing.com/thepositive-side-effects-of-acupuncture/

Apter, A. J., Paasche-Orlow, M. K., Remillard, J. T., Bennett, I. M., Ben-Joseph, E. P., Batista, R. M., ... & Rudd, R. E. (2008). Numeracy and communication with patients: They are counting on us. Journal of General Internal Medicine, 23, 2117-2124.

Biesecker, L. G. (2013). The Nirvana fallacy and the return of results. The American Journal of Bioethics, 13, 43-44.

Bonevski, B., Wilson, A., & Henry, D. A. (2008). An analysis of news media coverage of complementary and alternative medicine. PLoS One, 3, e2406.

Briggs, W. M. (2014). Common statistical fallacies. Journal of American Physicians and Surgeons, 19, 58-60.

Browne, M., Thomson, P., Rockloff, M. J., & Pennycook, G. (2015). Going against the herd: Psychological and cultural factors underlying the ‘vaccination confidence gap’. PLoS One, 10, e0132562.

Bryden, G. M., Browne, M., Rockloff, M., & Unsworth, C. (2018). Anti-vaccination and pro-CAM attitudes both reflect magical beliefs about health. Vaccine, 36, 1227-1234.

Byrne, J. (n.d.). Logical fallacies. Skeptical Medicine. https://sites.google.com/site/skepticalmedicine/home

Castleman, M. (2002). Natural herbal remedies for hay fever. https://www.motherearthliving.com/health-and-wellness/natural-herbal-remedies-forhayfever

Chinese Medicine Works. (n.d.). Success stories. https://chinesemedicineworks.com/acupuncture-clinic/testimonials/

Coker, R. (1995). Homeopathy: A Christian medical perspective. https://www.cmf.org.uk/resources/publications/content/?context=article&id=548

DeYoung, C. G. (2014). Openness/Intellect: A dimension of personality reflecting cognitive exploration. In M. L. Cooper & R. J. Larsen (Eds.), APA handbook of personality and social psychology: Personality processes and individual differences (Vol. 4, pp. 369-399). Washington, DC: American Psychological Association.

Doran, B. (2009). The science behind Reiki. https://www.equilibriume3.com/images/PDF/Science%20Behind%20Reiki.pdf

Ernst, E. (2004). Anecdotal obsessions? A comment on the use of anecdotes by the general media to support claims in CAM. Complementary Therapies in Nursing & Midwifery, 10, 254-255.

Ernst, E. (2013). Thirteen follies and fallacies about alternative medicine. EMBO Reports, 14, 1025-1026.

Evans, J. S. B., & Stanovich, K. E. (2013). Dual-process theories of higher cognition: Advancing the debate. Perspectives on Psychological Science, 8, 223-241.

Fagerlin, A., Wang, C., & Ubel, P. A. (2005). Reducing the influence of anecdotal reasoning on people’s health care decisions: Is a picture worth a thousand statistics? Medical Decision Making, 25, 398-405.

Fallacy Files. (n.d.). The Volvo fallacy. https://www.fallacyfiles.org/volvofal.html

Fiske, S. T., & Taylor, S. E. (2013). Social cognition: From brains to culture. New York: Sage.

Gal, I. (2002). Adults’ statistical literacy: Meanings, components, responsibilities. International Statistical Review, 70, 1-25.

Galbraith, N., Moss, T., Galbraith, V., & Purewal, S. (2018). A systematic review of the traits and cognitions associated with use of and belief in complementary and alternative medicine (CAM). Psychology, Health & Medicine, 23, 854-869.

Gambrill, E., & Reiman, A. (2011). A propaganda index for reviewing problem framing in articles and manuscripts: An exploratory study. PLoS One, 6, e19516.

Garcia-Retamero, R., & Cokely, E. T. (2013). Communicating health risks with visual aids. Current Directions in Psychological Science, 22, 392-399.

Gaudiano, B. A., Brown, L. A., & Miller, I. W. (2011). Let your intuition be your guide? Individual differences in the evidence-based practice attitudes of psychotherapists. Journal of Evaluation in Clinical Practice, 17, 628-634.

Gigerenzer, G., Gaissmaier, W., Kurz-Milcke, E., Schwartz, L. M., & Woloshin, S. (2007). Helping doctors and patients make sense of health statistics. Psychological Science in the Public Interest, 8, 53-96.

Gigerenzer, G., Wegwarth, O., & Feufel, M. (2010). Misleading communication of risk. BMJ, 341, c4830.

Goudge, T. A. (1961). The genetic fallacy. Synthese, 13, 41-48.

Harkness, A. R., & Lilienfeld, S. O. (1997). Individual differences science for treatment planning: Personality traits. Psychological Assessment, 9, 349-360.

Hart, W., Albarracín, D., Eagly, A. H., Brechan, I., Lindberg, M. J., & Merrill, L. (2009). Feeling validated versus being correct: A meta-analysis of selective exposure to information. Psychological Bulletin, 135, 555-588.

Herbert, J. D., Lilienfeld, S., Lohr, J., Montgomery, R. W., O’Donohue, W., Rosen, G. M., & Tolin, D. (2000). Science and pseudoscience in the development of Eye Movement Desensitization and Reprocessing: Implications for clinical psychology. Clinical Psychology Review, 20, 945-971.

Honda, K., & Jacobson, J. S. (2005). Use of complementary and alternative medicine among United States adults: The influences of personality, coping strategies, and social support. Preventive Medicine, 40, 46-53.

Hux, J. E., & Naylor, C. D. (1995). Communicating the benefits of chronic preventive therapy: Does the format of efficacy data determine patients’ acceptance of treatment? Medical Decision Making, 15, 152-157.

Kahneman, D. (2011). Thinking, fast and slow. New York: Macmillan.

Keren, G., & Schul, Y. (2009). Two is not always better than one: A critical evaluation of two-system theories. Perspectives on Psychological Science, 4, 533-550.

Kruglanski, A. W., & Orehek, E. (2007). Partitioning the domain of social inference: Dual mode and systems models and their alternatives. Annual Review of Psychology, 58, 291-316.

Levy, J. (2018, October 2). Horny goat weed benefits for libido & bone health. https://draxe.com/nutrition/horny-goat-weed/

Lindeman, M. (2011). Biases in intuitive reasoning and belief in complementary and alternative medicine. Psychology and Health, 26, 371-382.

Lindeman, M., Keskivaara, P., & Roschier, M. (2000). Assessment of magical beliefs about food and health. Journal of Health Psychology, 5, 195-209.

Loxton, D. (2015, May 24). History and Hyman’s Maxim (Part One). Skeptic: Insight.

MacFarlane, D., Hurlstone, M. J., & Ecker, U. K. H. (2020). Protecting consumers from fraudulent health claims: A taxonomy of psychological drivers, interventions, barriers, and treatments. Social Science & Medicine. Advance online publication.

Munro, G. D., Ditto, P. H., Lockhart, L. K., Fagerlin, A., Gready, M., & Peterson, E. (2002). Biased assimilation of sociopolitical arguments: Evaluating the 1996 US presidential debate. Basic and Applied Social Psychology, 24, 15-26.

Myers, G. (2007). Commonplaces in risk talk: Face threats and forms of interaction. Journal of Risk Research, 10, 285-305.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2, 175-220.

Pacini, R., & Epstein, S. (1999). The relation of rational and experiential information processing styles to personality, basic beliefs, and the ratio-bias phenomenon. Journal of Personality and Social Psychology, 76, 972-987.

Paulos, J. A. (1988). Innumeracy. New York: Macmillan.

Petty, R. E., & Cacioppo, J. T. (1986). Communication and persuasion: Central and peripheral routes to attitude change. New York: Springer-Verlag.

Pratkanis, A. R. (1995). How to sell a pseudoscience. Skeptical Inquirer, 19(4), 19-25.

Ramey, D. W. (2005). Letter to the editor: Is there a commonplace theme in the alternative medicine debate? Journal of the American Veterinary Medical Association, 226, 1295.

Reyna, V. F., & Brainerd, C. J. (2007). The importance of mathematics in health and human judgment: Numeracy, risk communication, and medical decision making. Learning and Individual Differences, 17, 147-159.

Reyna, V. F., & Brainerd, C. J. (2008). Numeracy, ratio bias, and denominator neglect in judgments of risk and probability. Learning and Individual Differences, 18, 89-107.

Reyna, V. F., Nelson, W. L., Han, P. K., & Dieckmann, N. F. (2009). How numeracy influences risk comprehension and medical decision making. Psychological Bulletin, 135, 943-973.

Saher, M., & Lindeman, M. (2005). Alternative medicine: A psychological perspective. Personality and Individual Differences, 39, 1169-1178.

Scurich, N. (2015). The differential effect of numeracy and anecdotes on the perceived fallibility of forensic science. Psychiatry, Psychology and Law, 22, 616-623.

Seligman, L. D., Hovey, J. D., Hurtado, G., Swedish, E. F., Roley, M. E., Geers, A. L., Kene, P., Elhai, J. D., & Ollendick, T. H. (2016). Social cognitive correlates of attitudes toward empirically supported treatments. Professional Psychology: Research and Practice, 47, 215-223.

Sirois, F. M., & Gick, M. L. (2002). An investigation of the health beliefs and motivations of complementary medicine clients. Social Science & Medicine, 55, 1025-1037.

Smith, B. W., Dalen, J., Wiggins, K. T., Christopher, P. J., Bernard, J. F., & Shelley, B. M. (2008). Who is willing to use complementary and alternative medicine? EXPLORE: The Journal of Science and Healing, 4, 359-367.

Sobkow, A., Traczyk, J., Kaufman, S. B., & Nosal, C. (2018). The structure of intuitive abilities and their relationships with intelligence and Openness to Experience. Intelligence, 67, 1-10.

Stanovich, K. E. (2009). What intelligence tests miss: The psychology of rational thought. New Haven, CT: Yale University Press.

Stanovich, K. E. (2017). How to think straight about psychology (10th ed.). Hoboken, NJ: Pearson.

Stanovich, K. E., & West, R. F. (2000). Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences, 23, 645-665.

Tavris, C., & Aronson, E. (2007). Mistakes were made (but not by me): Why we justify foolish beliefs, bad decisions, and hurtful acts. Boston, MA: Houghton Mifflin Harcourt.

The Formula for Miracles. (n.d.). Holistic healing. http://www.theformulaformiracles.com/holistic-healing/

The Natural Doctor. (2019, July 16). Corruption, collusion and conspiracy: The shocking truth about drug research and big pharma. https://thenaturaldoctor.org/article/corruptioncollusion-and-conspiracy-the-shocking-truth-about-drug-research-and-big-pharma/

Ullman, D. (2017, January 23). Why homeopathy makes sense and works. https://homeopathic.com/why-homeopathy-makes-sense-and-works-3/

Victory, J. (2017, January 12). Reporting the findings: Absolute vs. relative risk. USC Annenberg Center for Health Journalism. https://www.centerforhealthjournalism.org/2017/01/11/reporting-findings-absolute-vsrelative-risk

Wolsko, P. M., Eisenberg, D. M., Davis, R. B., & Phillips, R. S. (2004). Use of mind-body medical therapies. Journal of General Internal Medicine, 19, 43-50.

Woods, J., & Walton, D. (1978). The fallacy of ‘ad ignorantiam’. Dialectica, 32, 87-99.