
Cults, Conspiracies, and Fantasies of Knowledge

Published online by Cambridge University Press: 09 January 2023

Daniel Munro*
Affiliation:
University of Toronto, Toronto, Canada

Abstract

There's a certain pleasure in fantasizing about possessing knowledge, especially possessing secret knowledge to which outsiders don't have access. Such fantasies are typically a source of innocent entertainment. However, under the right conditions, fantasies of knowledge can become epistemically dangerous, because they can generate illusions of genuine knowledge. I argue that this phenomenon helps to explain why some people join and eventually adopt the beliefs of epistemic communities who endorse seemingly bizarre, outlandish claims, such as extreme cults and online conspiracy theory groups. It can be difficult to grasp how members of such groups come to believe the theories they endorse. I argue that one route to such beliefs is via deep absorption in fantasies of knowledge, which can lead entire groups to become collectively detached from reality.

Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
Copyright © The Author(s), 2023. Published by Cambridge University Press

1. Introduction

There's often pleasure in possessing knowledge. There can be a thrill in, for example, achieving knowledge that was especially difficult to come by, as when reaching the outcome of a long intellectual pursuit. There's also often pleasure in knowing things others don't – being told secrets, overhearing juicy gossip, or achieving expertise that surpasses a layperson's understanding. As with most sources of pleasure, there's also pleasure in merely imagining or fantasizing about possessing knowledge, including secret knowledge only few possess. It can be fun to imagine that you were among the first to make a monumental scientific discovery, that you've uncovered a government conspiracy to cover up UFOs, or that you've been chosen to receive a prophecy directly from God.

Such fantasies of knowledge are typically harmless. However, as I'll argue in this paper, there are circumstances under which they can become epistemically dangerous. That's because, under the right conditions, fantasies of knowledge can become illusions of knowledge. Specifically, I'll argue that this phenomenon helps to explain why some people join and eventually endorse the claims of epistemic communities whose beliefs are radically detached from reality.

I'll develop my account of this phenomenon by drawing on empirical research about such communities. Perhaps the most prominent today are certain internet subcultures which proliferate on social media and online message boards, trafficking in extreme conspiracy theories – theories that, for example, Democratic politicians are secretly masterminding a child sex trafficking ring or that vaccines are a government tool for mass depopulation. I'll also draw on research about extreme religious cults – groups whose members subscribe to bizarre theories about, for example, the world's governments being controlled by demonic forces, or about space aliens who will soon bring humanity to a new age of enlightenment.

For those of us outside such groups, it can be difficult to grasp how anyone could come to believe these sorts of claims, which seem more like a product of the imagination than something one could actually believe. I'll argue that, in fact, the imagination is important for understanding how such far-fetched beliefs develop. There's one epistemological feature these groups have in common which is particularly important for my account: their claims to possess secret knowledge. These claims to secret knowledge make for an alluring object of fantasy, and I'll argue that many people acquire their beliefs via a transition from mere fantasies of knowledge to illusions of knowledge.

In §2, I first give a more detailed description of the epistemological profile of the sorts of groups on which I'm focused. §3 then argues that many people who participate in these groups don't genuinely believe their groups’ claims; instead, they participate because they're acting out fantasies of secret knowledge. I argue that we should distinguish these mere pretenders from other group members who are under the illusion of genuinely possessing knowledge.

In §4, I argue that, under the right circumstances, what begins as merely fantasizing about possessing secret knowledge can eventually give rise to the illusion of possessing knowledge. I identify certain psychological features of the imagination, as well as certain social dynamics of these communities, which make members susceptible to becoming so absorbed in their fantasies that they mistake them for reality. I then defend this account by arguing that it explains certain empirical facts better than other prominent accounts of belief formation in these groups.

Finally, §5 concludes by drawing out some broader epistemological implications of my arguments: one about the rationality of group members’ beliefs and one about the epistemic dangers of the imagination.

2. Far-fetched Beliefs and Secret Knowledge

I'm focused on groups who profess adherence to far-fetched belief systems while claiming to possess secret knowledge not possessed by outsiders. I'll develop my account of fantasies and illusions of knowledge by considering two particular kinds of groups: religious cults and online conspiracist groups.Footnote 1 My discussions of these groups should be read as focused on the most extreme examples: those which are epistemically problematic in that their belief systems have clearly become detached from reality.Footnote 2 Some more detailed examples will illustrate this, while bringing out some epistemological features these groups share.Footnote 3

First, consider some examples of cults. As per Dawson (Reference Dawson2006: Ch. 2), cults are typically built around a charismatic leader who has relatively unrestricted authority. These leaders typically claim to possess some kind of esoteric knowledge, often accessed via mystical experience or direct contact with the divine (see also Galanter Reference Galanter1999; Chryssides Reference Chryssides, Lewis and Tøllefsen2016).

Some cult leaders teach that, as revealed to them by God, their group is under constant threat from demonic forces that control the world's governments. For example, in the 1980s and 90s, leader Shoko Asahara taught his cult Aum Shinrikyo that they were mounting a spiritual resistance against such forces. This culminated in actions such as the murder of a defector from the group and a large-scale nerve gas attack on the Tokyo Metro system (Repp Reference Repp, Lewis and Petersen2005; Ryutaro Reference Ryutaro, Dyrendal, Robertson and Asprem2018). Similarly, in the 2010s, online cult leader Sherry Shriner taught that Satanic, alien reptilians controlled the world's governments. She claimed they were trying to thwart her efforts to expose them by covertly replacing her followers with clones who were really disguised reptilians. When some followers defected and publicly criticized her, Shriner accused them of being reptilian clones.Footnote 4

Other cults’ central tenets are less conspiratorial, but no less bizarre. For example, between the 1970s and 90s, the leaders of Heaven's Gate, Bonnie Nettles and Marshall Applewhite, claimed to communicate with aliens who were planning humanity's ascent to a higher plane of existence. They themselves were purportedly spiritually advanced beings: Applewhite's physical body was inhabited by the same alien who had inhabited Jesus’ body, while Nettles was literally God the Father. In 1997, thirty-nine followers died by mass suicide, claiming this would allow their souls to ascend to a UFO that was nearing Earth in the wake of the Hale-Bopp comet (Chryssides Reference Chryssides2011).

Next, consider some recent examples of online conspiracist groups. As with cults, these groups often claim to possess knowledge which outsiders lack – in fact, empirical researchers widely argue that such claims are part of the allure of these groups (Barkun Reference Barkun2015; Imhoff and Lamberty Reference Imhoff and Lamberty2017; Douglas and Sutton Reference Douglas and Sutton2018; Sternisko et al. Reference Sternisko, Cichocka and Van Bavel2020).

Take the “Pizzagate” theory. This theory claimed that a ring of pedophiles and child murderers, run by a cabal of high-profile political figures including Hillary Clinton and John Podesta, was operating out of a Washington, D.C. pizza restaurant called Comet Ping Pong. Proponents claimed to have pieced together their theories from evidence such as coded language in Podesta's leaked emails – for example, references to eating pizza – thereby discovering a hidden, large-scale conspiracy. This eventually prompted one man, Edgar Welch, to investigate. After bringing an assault rifle into the restaurant on a quest to search for victims, he turned himself over to police upon finding no evidence (Menegus Reference Menegus2016; The Star 2016).

Next, consider the proliferation of conspiratorial anti-vaccination (“anti-vax”) rhetoric during the COVID-19 pandemic. Various articles in the far-right publication LifeSiteNews, for example, promote the idea that COVID-19 vaccines are part of the government's nefarious plot to covertly spread disease for the purpose of mass depopulation (Delaney Reference Delaney2021; Sones Reference Sones2021). Similar theories are promoted by Romana Didulo, a conspiracist with a large following who claims to be the newly appointed “Queen of Canada.” Didulo claims that the vaccines are being used to restructure the population's DNA so that the government can control them (Sarteschi Reference Sarteschi2022). Proponents of such theories often claim to be exposing knowledge that has been covered up by the political and/or scientific establishments, often by appealing to the testimony of lone whistleblowers revealing secret information.

It should now be clear that these groups share a certain epistemological feature: their claims to possess a body of secret knowledge. Both claim to have some kind of special channel to the truth which outsiders to their group lack. For cults, that's typically because they claim to have some kind of privileged access to divinely revealed truths, often through something like their leader's direct channel to God or their members’ ability to self-induce mystical experiences. For conspiracy theorists, it's because they claim to have uncovered evidence revealing important truths which have been hidden from the general public and mainstream media.

Of course, many of these groups try to recruit new members, so their putative knowledge isn't “secret” in the sense that they're totally unwilling to share it with new people. Still, both claim that, before one can acquire this “knowledge,” one must adopt certain belief forming methods or evidential standards that are specified by the group. For cults, this often involves being initiated into certain ritualistic practices, as well as a willingness to trust the authority of the cult leader. For conspiracists, this often involves a willingness to trust certain obscure methods of decoding evidence (e.g., for decoding hidden messages in leaked emails) and/or a willingness to trust certain sources of information (e.g., lone whistleblowers or charismatic figures like Romana Didulo). So, this “knowledge” remains hidden from outsiders in the sense that outsiders cannot possess it until they've accepted the group's methods of discernment.

Now, given that members of these groups outwardly profess to have secret knowledge, it might seem they genuinely believe that these putative items of knowledge are true. After all, since knowledge entails truth, to self-attribute knowledge that P is to claim that P is true. However, the next section will argue that things aren't so straightforward.

3. Fantasies and Illusions of Knowledge

In §3.1, I'll argue that many cult members and conspiracists don't genuinely believe their groups’ far-fetched claims about the world. Instead, the best explanation of their patterns of attitudes and behaviours is that they're merely fantasizing and pretending that these claims are true, which includes fantasizing about possessing secret knowledge. §3.2 then draws a distinction between these “mere pretenders” and the “true believers,” where the latter are under the illusion of genuinely possessing knowledge.

3.1. Fantasies of knowledge

It can be difficult as an outsider to grasp how someone could initially start believing cultic or conspiracist claims as outlandish as those described in §2. However, let's put aside belief for a moment and focus instead on merely imagining or fantasizing that such claims are true. (In what follows, I use “mere imagining that P” to refer to imagining without believing that P, while “fantasizing” refers to pleasurable mere imagining.Footnote 5) I think it's much easier to see why someone would be attracted to such fantasies.

In general, as I observed in §1, there's often pleasure in fantasizing about being part of a group that possesses knowledge to which outsiders aren't privy. Furthermore, the kinds of people who tend to join cults and conspiracist groups would find such fantasies especially alluring. Cult members are often at a time in their lives when they're seeking an understanding of the universe and life's mysteries, as well as seeking deep social bonds with likeminded individuals (Dawson Reference Dawson2006: Ch. 4). It's therefore easy to see why it would be pleasurable to imagine that answers to the mysteries of the universe have been revealed, as well as that one is part of a close-knit group to whom this knowledge has been imparted.

Many who get involved in conspiracist groups have a similar desire for belonging (Douglas et al. Reference Douglas, Sutton and Cichocka2017; Sternisko et al. Reference Sternisko, Cichocka and Van Bavel2020; Phadke et al. Reference Phadke, Samory and Mitra2021). They also often feel politically or socially powerless or alienated (Douglas and Sutton Reference Douglas and Sutton2018; Klein et al. Reference Klein, Clutton and Dunn2019), and they often desire to feel unique or special by possessing secret knowledge (Imhoff and Lamberty Reference Imhoff and Lamberty2017; Douglas and Sutton Reference Douglas and Sutton2018; Sternisko et al. Reference Sternisko, Cichocka and Van Bavel2020). Since experts who possess knowledge that laypeople lack are often afforded a certain social status, the idea of acquiring secret knowledge is an empowering one that makes for a pleasurable fantasy. Through such fantasies, one imagines oneself an expert who knows what's really going on, more so than the general population and mainstream media.

This subsection appeals to empirical evidence to argue that members indeed often merely fantasize that their groups’ theories are true. Since these groups claim to possess secret knowledge, this in part involves fantasizing about possessing such knowledge. So, on my account, the contents group members initially imagine to be true include both first-order theories – e.g., that alien reptilians control the government, or that Democrats are murdering children – and the second-order claim that group members know these theories are true. I'll especially emphasize the second-order imaginings about possessing knowledge because they play some key explanatory roles in my account (for example, as I've highlighted already, they explain what makes these fantasies so alluring).

First, consider evidence from the way cult members and conspiracists speak about their own attitudes towards their groups’ claims.

Empirical research on cults, based on evidence such as interviews about conversion stories, has found that active participation in a cult often precedes full belief in its doctrines (Dawson Reference Dawson1990; Galanter Reference Galanter1999: Ch. 3; Iannaccone Reference Iannaccone2006; Mercier Reference Mercier2020: Ch. 8). By “active participation,” I mean participation in various parts of a cult's organized community life: meetings, lectures, rituals such as group prayer and meditation, etc. Cult members often claim they began engaging in this participation before fully converting. Then, over time, they came to genuinely believe the cult's putative items of secret knowledge.

Many members of conspiracist communities also admit, at least under the right circumstances, that they don't genuinely believe the theories they might outwardly seem to endorse. Rosenblum and Muirhead (Reference Rosenblum and Muirhead2019: Ch. 1) describe how, when pressed on whether they genuinely believe their theories despite how far-fetched they seem, conspiracy theorists often back away from full endorsement. Instead, they retreat to claiming that they're “just asking questions” or that a theory could be true. Along similar lines, anti-vax articles by LifeSiteNews discussing the theory that vaccines are a tool for mass depopulation repeatedly refer to this as an “entirely possible” scenario (Delaney Reference Delaney2021; Sones Reference Sones2021). Despite this, conspiracists also actively engage with their communities, by posting, retweeting, and sharing relevant content online.

Members of both sorts of groups therefore outwardly behave in a way that, at first glance, resembles how a genuine believer would behave, since they actively participate in their communities. Yet, many also speak as if they don't genuinely believe, at least when pressed. If we're to take their self-attributions at face value, we need another explanation of what cognitive attitude they hold towards their groups’ claims, one which can reconcile the fact that they don't genuinely believe with the facts about their outward behaviours.

We could explain these behavioural data by positing that group members often merely fantasize that their groups’ claims are true, with their actions constituting a form of pretense. Ordinarily, actions are guided by beliefs. However, during pretense, imagining that P can guide action in a way that, from the outside, looks much as if one believes that P (Gendler Reference Gendler2007; Van Leeuwen Reference Van Leeuwen2011; Picciuto and Carruthers Reference Picciuto, Carruthers and Kind2016). This often occurs in the context of children's play: one can imagine that one is at a tea party and that one's plastic teapot is filled with tea, and this imagining then guides one to act as if one is at a tea party. Similarly, Gendler (Reference Gendler2007) argues that pretense is involved in processes of self-deception: one who believes P yet seems to deny P and continues to act as if not-P does so because she imagines that not-P. Likewise, if one were merely imagining some cultic or conspiracist claims to be true, this would explain why one actively participates in a cult or conspiracist group despite a lack of belief.

In fact, various empirical evidence coheres better with the theory that many group members merely imagine and pretend than with the theory that all those who actively participate are genuine believers.

One reason to suspect fantasy and pretense are involved is that cultic and conspiracist claims often seem to be a source of entertainment. Rather than taking conspiratorial cult theories completely seriously, cult members often report spinning and elaborating on these theories more out of a sense of play, through activities that resemble narrative storytelling (Dyrendal Reference Dyrendal2016). Similarly, a series of studies by van Prooijen et al. (Reference van Prooijen, Ligthart, Rosema and Xu2022) found that the entertainment value of conspiracy theories is positively correlated with the degree to which people are likely to (claim to) endorse them. In line with this, Stefanie MacWilliams, an online writer who helped popularize Pizzagate, has compared the sleuthing involved in analyzing leaked emails for vague clues to “a giant game” (The Star 2016).

Furthermore, as Rosenblum and Muirhead (Reference Rosenblum and Muirhead2019: Ch. 2) note, it seems that many popular online purveyors of conspiracy theories are in the business of entertaining rather than producing believable content. Consider, for example, Alex Jones, who has helped spread many conspiracy theories through his website Infowars. It's hard not to be entertained by many of Jones’ clearly outlandish videos – take his angry rants about government plots to “make people gay” by putting chemicals in the water supply, as evidenced by the (totally unsubstantiated) claim that the majority of frogs in the U.S. are now gay (Kacala Reference Kacala2018). Even if some audience members actually believe Jones’ claims, this doesn't really seem to be the goal of the content he produces. Instead, the point appears to be to titillate and entertain enough to keep people coming back and sharing his content.Footnote 6

This fits well with the idea that there's a kind of pleasure in fantasizing about possessing secret knowledge. If some outwardly act as if they endorse an item of putative knowledge for entertainment purposes, this seems to resemble pleasurable fantasy and pretense more than belief – we characteristically adopt fantasies and engage in pretend play for pleasure, but we don't typically adopt beliefs for this reason.

Furthermore, the idea that group members’ attitudes are adopted because they're entertaining or pleasurable, rather than based on evidence, is part of a broader pattern in which these attitudes fail to exhibit belief's characteristic sensitivity to evidence. Our beliefs aren't always precisely proportioned to our evidence, since we're not perfectly rational. Still, it seems like a prima facie reason to think that some body of attitudes is a set of imaginings rather than a set of beliefs if it bears no clear, systematic relation to one's evidence. Beliefs are characteristically formed on the basis of evidence (cf. Kim Reference Kim1988; Helton Reference Helton2020), while mere imagining and fantasizing are typically not sensitive to evidence the way beliefs are (cf. Currie and Ravenscroft Reference Currie and Ravenscroft2002: 15–16).

In cults, this evidence-insensitivity shows up especially clearly in the way members’ attitudes are often immune to revision on the basis of evidence. Consider the way cults often remain cohesive after a leader's predictions about the impending end of the world fail to come true (Boyer Reference Boyer2001: Ch. 9; Dawson Reference Dawson2006: Ch. 7; Van Leeuwen Reference Van Leeuwen2014, Reference Van Leeuwen2017). If members’ attitudes towards such predictions were ordinary, evidence-responsive beliefs, we'd expect them to give up these beliefs after a leader's predictions fail to come true multiple times. Furthermore, we'd expect members’ confidence in a leader's credibility to be compromised. However, many groups simply become more cohesive and continue their preparations for impending cataclysms (see Dawson Reference Dawson2006: 168–9 for specific examples). If attitudes towards these predictions are merely a matter of pleasurable fantasizing, this becomes easy to explain. Since what one imagines for pleasure bears no rational connection to one's evidence, one can simply extend one's fantasy in new ways when predictions fail to pan out, thus allowing the fantasy to continue.

Many people also leave cults for reasons unrelated to evidence. Former members often don't leave because of evidence that a cult's doctrines are false or that a leader lacks credibility. Instead, they often leave because social bonds deteriorate, because of reasons to do with their values or moral principles (e.g., they realize their leader is doing unethical things behind closed doors), or because they simply “grow out of” a lifestyle they once found appealing as a young person free of family obligations and commitments (Galanter Reference Galanter1999: Ch. 8; Dawson Reference Dawson2006: Ch. 4; Van Leeuwen Reference Van Leeuwen2014, Reference Van Leeuwen2017). Cult members’ attitudes thus seem more responsive to practical or moral factors than to evidence, which is uncharacteristic of belief. This again instantly becomes intelligible once we view cult members’ attitudes as mere imaginings. It's epistemically irrational to abandon beliefs merely based on practical factors, but it's perfectly rational to cease acting out a fantasy when doing so is no longer practically expedient.Footnote 7

We also observe patterns of evidence insensitivity with conspiracists. Rosenblum and Muirhead (Reference Rosenblum and Muirhead2019) observe that many recent online conspiracy theories don't seem to be based on the kind of painstaking analysis of evidence that was once more common in conspiracy theorizing. In the past, conspiracists often engaged in thorough, quasi-scientific analysis – think detailed scrutiny of bullet trajectories in the JFK assassination or analyses of building collapses in the 9/11 attacks. In contrast, the style of conspiracy theorizing Rosenblum and Muirhead dub “the new conspiracism” seems to care more about making bold claims that will be retweeted, liked, and shared without much thought. This suggests that conspiracists are, like cult members, more sensitive to practical factors – i.e., to whatever will gain the most retweets and the like – than they are to evidence.

It might seem at first glance that careful analysis of evidence occurs in some conspiracist cases – proponents of Pizzagate, for example, claim to have closely analyzed leaked emails for clues. But in many such cases the “evidence” turns out, on examination, to be transparently incredible: for example, that email references to eating pizza for dinner are coded references to a child sex ring in a pizza restaurant. Since mere imagining and fantasy have no normative relation to one's evidence, it's easy to explain why someone merely pretending that the Pizzagate theory is true would share the theory on the basis of this “evidence” – one need not really believe this constitutes good evidence in order to derive entertainment from acting as if it does.

In particular, one can derive entertainment from sharing such content because it very effectively allows one to enact fantasies of possessing secret knowledge. In an analysis of millions of tweets, Vosoughi et al. (Reference Vosoughi, Roy and Aral2018) found that information which is novel and surprising is more likely to be shared, even though it's also more likely to be false. They argue that the allure of sharing novel information is that doing so conveys a certain social status: it signals that one is “in the know” or possesses “insider” information. So, it's easy to see why someone enacting fantasies about possessing secret knowledge would share seemingly far-fetched conspiracies which, by their nature, are more likely to be novel and surprising.

Of course, one might respond to all of these data about evidence insensitivity by claiming that group members do have genuine beliefs, but that these beliefs are simply irrational. However, my imagination-based explanation is superior to a belief-based one for several reasons.

First, the belief-based explanation has difficulty accounting for the fact that, as I explained above, many group members speak as if they don't genuinely believe, with some explicitly self-attributing non-belief. My imagination-based explanation accounts both for these self-attributions and for the fact that many group members’ attitudes are sensitive to factors other than evidence. Furthermore, consider the particular way conspiracists often claim that their theories are merely possible or merely “could be” true. Philosophers often posit that there's a tight connection between imagination and possibility: that the imagination is the primary cognitive faculty involved in generating and entertaining mere possibilities or hypotheticals (Picciuto and Carruthers Reference Picciuto, Carruthers and Kind2016; Spaulding Reference Spaulding, Kind and Kung2016; Aronowitz Reference Aronowitz2021). So, conspiracists’ use of modal language suggests they're engaged in imaginatively entertaining possibilities rather than expressing genuine beliefs.

My account also provides a more plausible explanation than a belief-based one given that, in general, it's psychologically difficult to form a belief that P based on practical factors rather than on evidence that bears (or at least seems to you to bear) on the truth of P. There's an established tradition of philosophers claiming that it's psychologically impossible to do so (Williams Reference Williams1973: Ch. 9; Shah Reference Shah2006). However, even if we don't adopt such a strong view, reflecting on examples suggests that it's at least psychologically difficult to form beliefs on the basis of practical factors.Footnote 8 One instance of this point is familiar from responses to Pascal's Wager: merely being convinced that believing in God would be practically beneficial doesn't make it easier to believe. Likewise, the practicality of believing in a cult's doctrines or in a conspiracy theory doesn't seem like enough to easily cause one to believe. In contrast, it's typically psychologically easy to voluntarily imagine things, regardless of our evidence.

Finally, my account fits well with the idea from psychology that the evolved function of pretense is to enable humans to explore novel spaces of possible action, thereby discovering new patterns and causal regularities and developing more sophisticated causal models (Schmidhuber Reference Schmidhuber2010; Buchsbaum et al. Reference Buchsbaum, Bridgers, Weisberg and Gopnik2012). This is most often studied in children's pretend play, where acting out imagined scenarios is a means of developing counterfactual and causal reasoning capacities. But there's reason to think that both cultic and conspiracist communities are engaged in something like this, too.

In the case of cults, it's telling that they often claim secret knowledge of special rituals and methods for inducing mystical experiences and altered states of consciousness. In other words, joining a cult comes with the promise of exploring totally new kinds of patterns of action, where these are supposed to afford knowledge of cause-effect relationships not previously experienced – it's only by acting out certain new, ritualistic behaviours that one can discover how these behaviours cause new kinds of experiences.

Conspiracists’ online activities plausibly also have the function of exploring novel possible actions and causal regularities, though in a slightly different way. In particular, they can fulfill this function in the social realm: by provoking reactions from online interlocutors, conspiracists can explore and push the boundaries of possible social interactions. When online conspiracists share seemingly absurd theories which can't easily be undermined by evidence or rational argument, it often seems to be with the aim of provoking disorientation and confusion: “Conspiracist accusations leave the rest of us … baffled, our sense of reality threatened, our responses tentative and, it feels, inadequate. Disorientation is one of the dangerous effects of conspiracism, and producing this reaction is one of the new conspiracists’ declared pleasures” (Rosenblum and Muirhead Reference Rosenblum and Muirhead2019: 38). The way conspiracy theorists engage with non-community members online thus constitutes a way of exploring types of interactions that wouldn't normally be possible within the bounds of “real life” social norms.Footnote 9

Of course, my claim in this subsection isn't that all group members fantasize rather than genuinely believing, merely that many do. In the next subsection, I further argue that some group members are instead under the illusion of possessing knowledge.

3.2. Illusions of knowledge: mere pretenders vs. true believers

In this section, I draw a distinction between group members I'll call “mere pretenders” and those I'll call “true believers.” The mere pretenders are those who are merely fantasizing and pretending, as described in the previous subsection. These are the subjects who we should expect to give up on their group when participating is no longer entertaining or practically beneficial. In contrast, true believers remain dedicated for many years and are willing to undertake quite extreme actions in the name of their group. I'll argue that we can best explain this contrast by positing that the true believers genuinely self-attribute knowledge.

Consider some examples of extreme actions by cult members. In §2, I mentioned Aum Shinrikyo, several of whose members carried out murders and a large-scale nerve gas attack as part of the spiritual warfare they were allegedly waging. Similarly, recall Heaven's Gate, thirty-nine of whose members died by mass suicide in order to ascend to the UFO that would help them evolve to a new spiritual existence. There seems to be a stark contrast between the behaviours of these true believers and those of someone who is merely imagining and fantasizing. We'd expect the latter to cease enacting their fantasies when continuing to do so leads to something so drastic.

A similar contrast shows up in online conspiracist groups. Many members restrict their actions to online engagement with other, like-minded individuals. However, in many cases, this isn't what you'd expect if they genuinely believed what they profess to believe. Consider Pizzagate. As Mercier (Reference Mercier2020: Ch. 10) notes, millions of people surveyed said they believed a child abuse ring was being run out of Comet Ping Pong. However, very few actually took any action that seems proportionate to the gravity of the situation. One was Edgar Welch, described in §2; another was a man who planned to blow up a monument in Springfield, Illinois, in order to draw public attention to Pizzagate (Winter Reference Winter2019). Similarly, consider the anti-vax conspiracy theories of Romana Didulo, self-proclaimed Queen of Canada. Didulo told her online followers that they should “shoot to kill” people involved in vaccinating children, such as hospital and school staff. However, almost none of her followers took steps to commit violence. One possible exception was a man who was arrested after posting an apparent threat to shoot staff at his daughter's school (Sarteschi Reference Sarteschi2022).

Now, my claim isn't that we'd necessarily expect all true believers to commit violence. However, vanishingly few of those who claimed to believe these theories took actions that seem proportionate to their gravity. Almost none who purportedly believed children were being abused and murdered in the basement of a pizza restaurant decided to organize protests, call the police to demand an investigation, or anything like that. Similarly, almost none who professed to believe Didulo's claims followed her directives to intervene at all costs in the vaccination of children, despite the fact that children were apparently being gravely harmed. So, there again seems to be a stark difference between the behaviours of the mere pretenders, described in the previous subsection, and the true believers who undertake more drastic actions.

We can explain these differences in behaviour by appealing to a difference in the attitudes which the mere pretenders and true believers take towards their groups’ doctrines and theories. Specifically, I think we can explain this by positing a contrast between those who merely imagine they possess secret knowledge and those who genuinely believe they possess it.

Intuitively, it seems that, if someone is going to carry out such extreme actions, they must really take themselves to know that their group's theories about the world are true. Consider someone who is willing to ingest a deadly poison on the basis that doing so will transport their soul to the next plane of human existence. It doesn't seem that most people would be willing to do so if they were merely highly confident that the poison would have this effect. Instead, we'd expect this behaviour from someone to whom it seems they know the poison will have this effect. Similarly, it doesn't seem that someone would be willing to blow up a public monument on the basis of the Pizzagate theory unless he takes himself to know that the theory is true.

This is an instance of the more general intuition that, when we're deliberating about some high-stakes decision, we typically don't act on the basis of P until we ensure we know that P. If it's imperative that you get to the bank before it closes, for example, you'll take steps to ensure you really know what time the bank closes, such as looking up its hours online. I'm not here endorsing a normative thesis about the connection between knowledge and action – e.g., that one must know that P before rationally acting on the basis of P (Hawthorne and Stanley Reference Hawthorne and Stanley2008). Instead, I'm making a psychological point: when deliberating about high-stakes decisions, we typically try to ensure we know that P before acting on the basis of P. When making it to the bank is of high importance, our deliberations tend to involve asking questions such as: “But do I really know the bank closes at 5 pm today?” While cases like this are used to argue for controversial views about the nature of knowledge – including contextualism (DeRose Reference DeRose1992) and pragmatic encroachment (Stanley Reference Stanley2005) – I'm not committing to any such view. Rather, I'm merely appealing to an intuition about high-stakes deliberation that underlies various such arguments.

These intuitive considerations about high-stakes action and knowledge fit especially well with the way cult members self-attribute knowledge. Consider this excerpt from an interview with Rio DiAngelo, a member of Heaven's Gate who was instructed to remain alive to continue spreading the group's teachings after their mass suicide. DiAngelo comments on several members of Heaven's Gate who survived the original suicides yet took their own lives soon after:

People ask: ‘Why would they do that?’ … It doesn't make sense to give up everything. Unless … you know. Unless you know what they knew. And what I know. [Which is?] That [Marshall Applewhite] was the second coming of Jesus Christ. That's what I'm here to help people understand. (Bearman Reference Bearman2007)

If one really takes oneself to know that Applewhite was the second coming of Christ, it's intuitively easy to see why one would act on his instructions to take one's own life.Footnote 10 If one took oneself to have something that falls short of knowledge – e.g., relatively high confidence, or a merely plausible theory about how the world works – it would be more difficult to understand this behaviour.

By parallel reasoning, it seems like a good explanation of extremist conspiracist behaviours that the relevant conspiracists are under an illusion of possessing knowledge. Again, this intuitively fits with our general practice of deliberating about how to act in very high-stakes situations, where we typically don't act on the basis of P until we ensure we know that P.

4. From Fantasies to Illusions of Knowledge

The previous section first argued that many members of cults and conspiracist groups merely fantasize about possessing secret knowledge. I then argued that others are under the illusion of genuinely possessing knowledge. This section will argue that, under the right conditions, fantasies of knowledge can give rise to illusions of knowledge. Before doing so, two caveats are in order.

First, as I'll describe in more detail below, empirical research has posited many different psychological factors that can contribute to cultic and conspiracist belief formation. The exact belief forming processes involved are likely different for different people. So, this section shouldn't be read as arguing that my account applies to every case, nor should it be read as offering a complete picture of every case to which it does apply. Instead, I aim to describe one kind of psychological force that's operative in many cases.

Second, my account is ultimately an empirically testable one. It's therefore open to empirical investigation to determine exactly how prevalent the process I describe is. My project in this paper involves a high-level synthesis of many different philosophical and empirical considerations, in a way that isn't suited to making fine-grained distinctions about the exact prevalence of the phenomena I'm describing. That would be a task for more targeted empirical research.

In §4.1, I'll first conceptualize what the route from fantasies to illusions of knowledge looks like, developing an account of how such fantasies could be mistaken for genuinely possessing knowledge. §4.2 then argues that this account plausibly characterizes many actual cult members and conspiracists. That's because it explains certain empirical facts about these subjects which existing explanations fail to adequately explain.

4.1. How fantasies become illusions

We're typically easily able to distinguish between fantasy and reality, so it might seem implausible to claim that fantasies of knowledge can be mistaken for genuine knowledge. However, this subsection develops a psychologically plausible account of how this shift from fantasies to illusions could occur. I first appeal to certain general features of the psychology of imagination. I then pinpoint features of the particular context in which cultic and conspiracist pretense occurs, features which make it especially likely that group members would mistake their fantasies for reality.

It's in general true that what we imagine can feed into our beliefs. In particular, philosophers often argue that the imagination is one of the primary faculties involved in generating and exploring novel possibilities and hypotheses which then become candidates for new beliefs (Picciuto and Carruthers Reference Picciuto, Carruthers and Kind2016; Spaulding Reference Spaulding, Kind and Kung2016; Aronowitz Reference Aronowitz2021). You might try, for instance, to figure out how some future event will unfold by imagining various possible ways it could play out; eventually, you might land on a belief that one of the hypothetical scenarios you imagined reflects how that event really will go.

Now, many such cases commonly discussed in the literature involve using the imagination to form beliefs about what will be the case in the future or what would be the case in some counterfactual possibility. In the groups I've described, subjects’ imaginings would instead result in beliefs about what is the case, such as the belief that one currently has knowledge. However, reflecting on further cases shows that we often use the imagination specifically as a tool to represent and investigate reality as it is in the present. Suppose, for example, that you're wondering how many windows are on the outside of your house or how many tables are in your favourite restaurant. The natural way to answer such questions is by using the imagination to mentally “tour” the outside of your house or the interior of the restaurant (cf. Munro Reference Munro2021). You thereby use the imagination to bring to mind information about the actual world. It thus seems that there's very often a cognitive pathway from imagining to occurrent beliefs about reality.

Of course, cult members’ and conspiracists’ imaginings are often much more fantastical in nature than this. However, it's empirically well-documented that subjects can mistake even quite fantastical imagined contents for actuality. This occurs especially when subjects mistake imaginings for memories (Loftus Reference Loftus1997; Muschalla and Schönborn Reference Muschalla and Schönborn2021). In the phenomenon of “imagination inflation,” subjects who are asked by researchers to imagine a fictional past event are prone to later mistake their imaginings for memories. Similarly, suggestions and leading questions from researchers in a lab or from therapists in a clinical setting can cause patients to construct imaginings which they take to be recovered, suppressed memories. These can involve far-fetched contents, such as “memories” of past lifetimes or of satanic ritual abuse suffered in childhood (Spanos et al. Reference Spanos, Menary, Gabora, DuBreuil and Dewhirst1991, Reference Spanos, Burgess and Burgess1994; Pyun and Kim Reference Pyun and Kim2009). In light of results like this, it doesn't seem so implausible that cult members and conspiracists could mistake the contents of what seem like highly fantastical imaginings for actuality.

Moreover, how plausible we take an imagined possibility to be depends in part on the level of detail and vividness with which we can imagine it (Dobson and Markham Reference Dobson and Markham1993; Szpunar and Schacter Reference Szpunar and Schacter2013). And when groups of people imagine things together, these imaginings are more likely to be vivid and detailed. That's because, rather than one person constructing the imagining on her own, multiple people share the task of collaboratively giving input into the way an imagined possibility is constructed (Michaelian and Sutton Reference Michaelian and Sutton2019). It's plausible that the vividness with which cult members and conspiracists are able to collectively imagine some possibility would increase the chances that they mistake this imagining for reality.

So, in general, it's psychologically plausible that even fantastical imaginings could influence beliefs about reality, especially when an entire group is imagining together. However, whether it's plausible that this often occurs in the specific kinds of groups on which I've focused depends on whether there are particular features of these groups which make it especially likely to occur.

I'll now identify some such features of cults and online conspiracist groups. These features contribute to how deeply absorbed cult members and conspiracists become in their fantasies. I use “absorption” here to refer to an episode of pretense in which markers of the difference between fantasy and reality have become obscured or have dropped out of one's awareness.

To see what I mean, first consider more ordinary cases of pretense. Ordinarily, our fantasies remain cognitively quarantined from our beliefs: we don't typically see children start to believe their toy teapot is filled with tea, nor do we see actors on a stage or film set start to believe they're fictional characters. However, notice that these ordinary cases include many external markers of the fact that the imaginers are merely pretending: the teapot is made of plastic and the guests around the table are dolls; the actor is performing on a stage in front of an audience; the children pretending to sword fight giggle and refrain from hitting each other too hard; and so on. It's not as if pretenders in these cases ever cease to be fully aware of features that clearly mark their imaginings as fictional, so they wouldn't lose track of fantasy versus reality.

In contrast, participation in the activities of cults and online conspiracist groups is likely to lack markers which signal that one is merely pretending.

For one thing, cults that are most successful at retaining recruits are often those whose leadership imposes a strict, intense routine that members must follow day-to-day (Dawson Reference Dawson2006: 78). Participants therefore continuously and actively behave as if they genuinely know the cult's theories to be true. Furthermore, we wouldn't expect mere pretenders to go around signalling their lack of belief, given that cults are characteristically overseen by leaders who impose strict authority and so wouldn't take kindly to signals of disloyalty. So, a cult member's own behaviours are unlikely to exhibit many markers that she's merely pretending, unlike, for example, a child who remains aware that her own actions are mere pretense because she giggles and refrains from hitting her playmate too hard with her pretend sword.

We should expect to see similar dynamics in online conspiracist groups, despite the fact that they often lack the kind of centralized leader seen in cults. As Rosenblum and Muirhead (Reference Rosenblum and Muirhead2019: 52) point out, conspiracy theories tend to circulate in social media environments that encourage constant “repeating, sharing, liking, and forwarding” of conspiracist claims. It's a familiar fact that modern social media encourages constant sharing of content in order to quickly amass likes, re-tweets, further sharing, etc. (cf. Nguyen Reference Nguyen and Lackey2021). Furthermore, one is especially likely to like or share conspiracist content when it's particularly entertaining or titillating, even if one doesn't genuinely believe it (Mercier Reference Mercier2020: Ch. 10). Online conspiracy theories thereby gain traction in a context which fosters behaviour that's as if one genuinely believes the theories. That's because sharing on social media resembles a form of assertion or testimony, since by sharing content one outwardly appears to be endorsing it (cf. Rini Reference Rini2017). So, participants in these online communities behave in a way that resembles how genuine believers would act.

The fact that members of these groups are collectively engaged in fantasies of secret knowledge also contributes to their deep absorption in their fantasies. Members are surrounded by others who, from the outside, are behaving similarly to how genuine knowers would behave. In cults, everyone would together be engaged in their characteristically intense routines. In online conspiracist environments, one would be around others who are under the same pressures to constantly post, like, and share juicy content. So, when one looks around at other members, one wouldn't observe explicit markers in their behaviour which signal that they're merely pretending. Furthermore, observing others’ behaviours would likely feed back into one's own behaviour. Being around other cult members who are faithfully enacting a cult's intense routines generates social pressure to conform and behave similarly, just as being part of any social group typically generates such pressures (cf. Dawson Reference Dawson2006: 115). We should expect the same to apply to conspiracist groups. So, observing the behaviours of others, which lack explicit markers of pretense, would in turn influence one to behave in a way that lacks such markers.

We should also expect group members to be actively attributing knowledge to one another. As I've stressed throughout the paper, part of the allure of (fantasizing about) being in a cult or conspiracist group is that it involves being part of a select group of knowers. We'd expect group members’ conversations and interactions to reflect this: they'd speak to each other as if they're all mutually in on the same secret knowledge to which outsiders aren't privy. One would thus be surrounded by others who constantly imply that one is a knower, rather than being aware of any explicit markers that one is pretending.

Finally, we should also expect these communities to lack explicit markers of pretense in the ways members interact with outsiders, especially when members are publicly promulgating outlandish theories. In discussing conspiracy theories, Mercier (Reference Mercier2020: Ch. 12) argues that such public endorsements often serve the function not of expressing beliefs but of signalling that one is genuinely committed to the community of which one is or wants to be a member. This, he argues, explains many cases in which people publicly endorse theories that seem to have very little evidence supporting them: one isn't aiming to share beliefs as much as signal that one has totally rejected the mundane belief systems of outsiders, thus boosting one's status in the eyes of other community members. It's easy to see how the same sort of reasoning could apply to cult members, given that, again, cult leaders typically demand a high degree of loyalty, which can be signalled by publicly professing commitment to a cult's doctrines.

If this loyalty-signalling account is true, we wouldn't expect to find many explicit acknowledgments or markers that members are merely pretending: if one regularly signalled a lack of belief, one's commitment would seem insincere. This is especially so given that these communities are so fixated on possessing secret knowledge: in order to keep up appearances that one believes the community to possess such knowledge, one can't signal that one's attitude towards the community's theories somehow falls short of knowledge.

So, the degree to which group members become collectively absorbed in the fantasy that they possess secret knowledge is much greater than the degree of absorption in typical cases of pretense. This makes it easier to see how one's fantasy could bleed into one's sense of reality. Typical cases involve markers that act as evidence about where fantasy ends and reality begins. This allows one to keep the line between the two clearly in view. But when one lacks such markers, and when one is surrounded by others who are collectively acting as if they're genuine knowers, it will seem as if one is a genuine knower, since one won't be immediately aware of evidence to the contrary. This explains how members of these groups could become so absorbed in their fantasies that they start to seem real.Footnote 11

4.2. Explanatory power over alternative accounts

In this subsection, I argue that my account explains certain empirical observations better than other prominent accounts of cultic and conspiracist belief formation. Specifically, it better explains why there's often a delay between the time when members begin actively participating in their group and the time when they genuinely believe their group's theories. I'll first unpack the evidence for this delay, then compare my account to alternatives.Footnote 12

As I explained in §3.1, there's evidence that there's often a delay between initial participation and belief formation in the case of cults: various empirical researchers have commented on this, their evidence including the firsthand testimony of cult members (Dawson Reference Dawson1990; Galanter Reference Galanter1999: Ch. 3; Iannaccone Reference Iannaccone2006; Mercier Reference Mercier2020: Ch. 8). There's also evidence of a similar delay for conspiracists.

Consider the way Packer and Stoneman (Reference Packer and Stoneman2021) describe belief formation among supporters of QAnon, a recent conspiracist group with roots in the Pizzagate movement. They describe QAnon supporters recounting their paths from initially participating in online conspiracist spaces to, over time, coming to “know” the theory is true. Some of these conspiracists even explicitly claim they originally suspected QAnon was a pretense or fiction, before eventually acquiring this “knowledge.”

Similarly, writers documenting the rise of conspiracist groups often describe their behaviours as escalating over time, in a way that suggests members have moved from lacking genuine belief to self-attributing knowledge. A clear example of this comes from journalist Mack Lamoureux's reporting on Romana Didulo's followers. In his early reporting, Lamoureux (Reference Lamoureux2021) expresses doubt about whether all of these followers genuinely believe Didulo's claims, even quoting a follower who claims she isn't sure whether Didulo is legitimate. Around that time, the group was mostly engaging in relatively low-stakes actions, such as merely actively participating in Didulo's online community. More recently, though, some members have escalated to higher stakes actions. Some have ceased paying their water and electricity bills at Didulo's command, despite racking up thousands of dollars in debt (Lamoureux Reference Lamoureux2022a). Others have followed her orders to arrest police officers on her behalf, leading to the violent arrests of group members themselves (Lamoureux Reference Lamoureux2022b).

So, it seems that members of both cults and conspiracist groups often begin actively participating before genuinely believing. If so, we need an explanation of the cognitive attitude they hold towards their groups’ theories during the interval in between. In particular, we have to explain what kind of cognitive attitude besides belief could guide them to actively participate in these groups – without an explanation, it's puzzling why they'd act in a way that somewhat resembles how a genuine believer would act, even though they don't believe.

My account gives a straightforward explanation. As per §3, actions needn't always be guided by beliefs, since they can also be guided by imaginings in the context of pretense. On my account, members start out fantasizing about possessing secret knowledge before transitioning to genuinely self-attributing knowledge. So, during the time between initial participation and genuine belief, they hold an attitude of imagining towards their group's theories. This explains why they act in a way that largely resembles belief-guided action, while stopping short of carrying out higher-stakes actions.

In contrast, many alternative accounts, which don't posit a period of mere fantasy and pretense, fail to provide equally satisfactory explanations.

One of the most prominent kinds of account posits that group members' beliefs are simply caused by their desires. Researchers have found that membership in these groups is highly correlated with certain desires. Conspiracism is highly correlated with a desire to feel special because one possesses secret knowledge (Dyrendal 2013; Imhoff and Lamberty 2017; Douglas and Sutton 2018; Sternisko et al. 2020), as well as with a desire to find community and a sense of belonging with likeminded individuals (Douglas et al. 2017; Sternisko et al. 2020; Phadke et al. 2021). Cult members exhibit both sorts of desires as well, especially the desire to find a sense of community and belonging (Galanter 1999; Dawson 2006; Chryssides 2016). On the basis of such correlations, researchers often posit that these desires influence group members' beliefs – i.e., they attribute to group members a kind of wishful thinking, which occurs because forming these beliefs allows them to feel as if they've fulfilled their desires.

Notice, however, that this sort of account fails to explain which cognitive attitude guides group members’ actions between their initial participation and their belief formation. Besides belief, the only attitude these accounts mention is desire. However, we don't typically start acting as if P is true merely because we desire that P is true. Instead, it seems more natural to say that wishing P were true can cause one to fantasize and pretend that P is true – i.e., that one's desire can cause one to imagine P is true, which then guides one's action.

Various other prominent accounts face similar difficulties. For example, van Prooijen et al. (2022) argue that the entertainment value of conspiracy theories causes belief in them, based on high correlations between a theory's entertainment value and professed belief in it. However, this again doesn't explain the gap between initial participation and belief formation. My account does so while also explaining the correlation between entertainment value and belief: group members derive entertainment from fantasizing and pretending, and these states eventually give rise to belief.

Another prominent account from Nguyen (2020) argues that both cult members and conspiracists believe their groups' theories because of how they allocate trust in epistemic sources (and see Cassam 2016 for similar claims about conspiracists). Specifically, he argues that these subjects place too much trust in sources coming from within these groups (e.g., charismatic cult leaders, conspiracist media sources), while too deeply distrusting outside sources (e.g., scientists, mainstream media outlets). However, this account again doesn't explain the delay between participation and belief formation. That's because, if everyone participating in these groups already trusted these epistemic sources, then it's not clear why they wouldn't just believe them from the beginning – after all, trusting someone typically involves accepting what they say at face value.

So, I conclude that my account, on which fantasies of knowledge become illusions of knowledge, better explains this delay than many recent accounts of cultic and conspiracist belief formation. This gives us reason to accept that the processes I described are operative in cases where this kind of delay occurs.

5. Implications

This section draws out two implications of my account. §5.1 focuses on an implication for assessing the rationality of cultic and conspiracist beliefs. §5.2 then brings out a more general implication about the potential epistemic dangers of the imagination.

5.1. The rationality of cultic and conspiracist beliefs

As per the previous section, many prominent accounts of cultic and conspiracist belief focus on irrational psychological influences on belief – for example, they posit that these beliefs are caused by desires or by the entertainment value of a theory. If that were true, it would mean that members of these groups are simply epistemically irrational, since they follow patterns of belief-formation which generally aren't truth-conducive or reliable. However, my account in the previous section gives rise to a more nuanced way of understanding the irrationality of group members’ beliefs.

On my account, group members first fantasize and pretend; then, due to certain features of the environments in which they're pretending, their imaginings eventually give rise to beliefs. Now, it does sound intuitively like an epistemically irrational process to mistake fantasy for reality. I don't want to deny that this is indeed irrational. However, I do want to deny that this must occur via cognitive mechanisms which are irrational and unreliable in a context-independent sense, as would be processes such as wishful thinking or believing based on entertainment value. Instead, my account involves normally rational capacities going awry or being mis-applied, due to features of the circumstances under which these subjects’ imaginings occur.

To see this, consider again my discussion of the connections between imagination and belief in §4.1. As I argued, the imagination is often a reliable tool for investigating reality – in many mundane cases, as when thinking about how many windows are on the outside of one's house, the imagination can put us in touch with the actual world. I also noted that it's an empirical fact that we're more likely to believe the content of our more vivid, detailed imaginings (Dobson and Markham 1993; Szpunar and Schacter 2013). And I take it that we're often rational to do so, since detail and vividness in an imagining typically indicate that it's more likely to be accurate. Since imaginings are constructed in part by drawing on our existing background knowledge and experiences, we can typically more easily and vividly imagine things which better cohere with our existing knowledge of the world (Addis 2020).

If the shift from fantasizing to believing that one possesses secret knowledge occurs via the same kinds of cognitive mechanisms involved when the imagination generates genuine knowledge of reality, then cult members and conspiracy theorists are not employing generally irrational patterns of reasoning. Instead, they're mis-applying imaginative belief-forming processes that can be reliable and rational under the right circumstances, when used properly. Unfortunately, in these particular cases, the social circumstances under which subjects exercise their imaginations result in their imaginations leading them astray.

5.2. The epistemic dangers of the imagination

There's an intuitive tendency to think that merely fantasizing and exploring possibilities in one's imagination is epistemically harmless. Along these lines, Langland-Hassan (2020) refers to the imagination as epistemically “safe” in the sense that one doesn't epistemically compromise oneself by merely imagining. This picture makes imagination akin to a “playground” for the mind, in which one is free to live out one's fantasies while being immune from epistemically harmful consequences.

Now, it's clearly true that, when everything operates in a cognitively ideal way, fantasies are epistemically safe and stay insulated from one's beliefs. However, my account of the imagination's role in cults and online conspiracist groups shows how the two can become blurred. This reveals that mere imagining isn't always epistemically safe. Instead, there are some circumstances under which it can be epistemically dangerous.

I don't mean to suggest that this potential danger means one should refrain from fantasizing and engaging in pretense altogether. However, my arguments in this paper allow us to identify what sorts of factors might make one especially prone to succumbing to this danger, such that we should try to avoid fantasy and pretense under certain circumstances.

I argued that typical cases of fantasy and pretense are marked as such in various ways – for example, they last a relatively short amount of time, they occur in a context where there are obvious indicators that one is merely pretending, and/or one stops short of fully acting as if one believes the contents of one's imagining. Participation in cults and online conspiracist groups, though, often lacks such markers that one is merely pretending. This can cause one to conflate the contents of one's imaginings with reality, especially when one is surrounded by many others who are acting in the same way, without signalling when they're merely pretending.

Thus, becoming too absorbed in one's fantasies, especially in a social context where others are doing the same, can be epistemically dangerous. Under such circumstances, entire groups can, by the powers of the imagination, become collectively detached from reality.[13]

Footnotes

1 I'll remain neutral about exactly which other groups my account applies to. However, there are likely others to which it could be extended, since there are various groups that claim to possess secret knowledge in a way that resembles the claims of conspiracists and/or cults. As Barkun (2015) notes, this includes some groups professing beliefs in UFOs, psychic mediums, and cryptozoological creatures like Bigfoot. It may also include more politically charged extremist groups – for example, the incel (“involuntarily celibate”) movement, who claim to have recognized certain hidden truths about society's sexual hierarchy (see, e.g., Chang 2020).

2 It's controversial whether terms like “cult” and “conspiracy theory” should be used pejoratively to refer only to extreme, epistemically problematic groups, versus whether they apply to some less extreme groups as well (on controversies about how to define “cult,” see especially Dawson 2006; on controversies about how to define “conspiracy theory,” see, e.g., Coady 2007; Dentith 2014; Cassam 2019; Napolitano 2021). I'll set aside this terminological dispute and stipulate that I'm using these terms to talk about the most extreme examples. However, I grant that, on a fully worked out definition of these terms, their extension may include some less extreme groups.

3 I'm not the first to compare cults with online conspiracist groups: recently, both philosophers (e.g., Nguyen 2020) and journalists (e.g., Karlis 2021; Smith 2021) have drawn lessons from the former to help understand the latter. However, they often draw from the work of anti-cult figures who rely on popular tropes such as “brainwashing” and “mind control” (e.g., Singer and Lalich 1995), which have largely been discredited by psychologists and sociologists of religion (see, e.g., Dawson 2006: Ch. 5; Lewis 2016; note that Nguyen (2020: fn. 6) does acknowledge such controversies, but notes that he's less concerned with how cults actually work than with using a certain popular conception of cults as a tool to better understand phenomena such as modern conspiracists). Still, I think a more nuanced discussion of both groups in relation to one another can help us better understand the dynamics of each.

4 While Shriner's group had no official name, former followers have called her a cult leader. See Shriner (2005) for her teachings in her own words and Flanagin (2019) for stories of her accusations against defectors.

5 Many philosophers of imagination take imagining and believing to be compatible: I can, for example, both believe that the sun will rise tomorrow and imagine that the sun will rise tomorrow. In mere imagining and fantasizing, though, one imagines in the absence of belief.

6 For further discussion of conspiracy theories as a form of entertainment, see Moore (2005) and Mercier (2020: Ch. 13).

7 Van Leeuwen (2014, 2017) develops arguments similar to those I just made, but applied to religious attitudes in general rather than just to cults. Van Leeuwen argues that “religious credence” is a cognitive attitude that's functionally similar to imagining but distinct from both imagining and belief, on the basis that religious attitudes lack the evidence-sensitivity of belief. My view is that the kinds of cultic attitudes I'm focused on are better viewed as imaginings than as some new, third category of religious credence. This fits well with other evidence I survey, such as the entertainment value aspect. It's also more parsimonious to subsume cult members' attitudes under the more familiar categories of imagining and/or belief rather than positing an entirely new attitude of religious credence. Still, I remain neutral on whether Van Leeuwen is right that there's also another, distinctive attitude of religious credence held by some religious people.

8 Strictly speaking, philosophers usually claim that practical considerations are typically insufficient for belief-formation when one is actively deliberating about what to believe (so that they can still influence beliefs in non-deliberative ways – cf. Shah 2006: 482–3). I can grant this, because someone deciding whether to join a cult or believe a conspiracy theory would likely engage in deliberation.

9 One might object by pointing out the following apparent difference between ordinary pretense and cult or conspiracist participation. In most cases of pretense, one disbelieves the propositions one is imagining to be true – a child pretending to be at a tea party believes she isn't really at a tea party, while an actor believes she's not really the character she's playing. In contrast, it seems cult members and conspiracists might be suspending judgment and even actively inquiring about whether a cult's doctrines are true. Isn't this quite unlike ordinary pretense? I think that, while pretense may typically be accompanied by disbelief, it's compatible with either disbelief or suspension of judgment. Suppose that, as you were beginning graduate school to study philosophy, you were racked by imposter syndrome and self-doubt. You think to yourself: “I'm not sure whether I'm really a good philosopher, but I'll pretend I am while I figure it out.” You thus pretend while suspending judgment and inquiring.

10 Some extreme actions by cult members may result from coercion by a cult's leadership rather than from illusions of knowledge. The mass suicide of over 900 cult members in Jonestown in 1978 is often interpreted this way (Galanter 1999: Ch. 7; Dawson 2006: Ch. 7). However, many cases don't fit this mould, such as Heaven's Gate: scholars typically don't think members were coerced into their fate, since they knew for some time what was coming and had opportunities to leave (Chryssides 2011: Ch. 1). My claim that extreme actions are guided by illusions of knowledge rather than imaginings thus brackets cases of coercion.

11 The notion of imaginative “absorption” developed in this subsection is distinct from what Schellenberg (2013) calls imaginative “immersion.” Schellenberg defines immersion as a state intermediate between imagining and believing, in which a subject can largely forget she's merely imagining that P and come close to believing that P. I use “absorption” to refer to exercises of pretense where one lacks markers by which to clearly distinguish fantasy from reality. However, it's consistent with my account that imaginative immersion occurs during the shift from mere imagining to believing – i.e., that, at some point during this shift, subjects occupy a state that's intermediate between imagining and believing. This would be one possible way of further filling out the psychological details of my account.

12 Recall from the beginning of §4, though, that I'm not aiming to totally replace these alternative explanations with mine. It may be that they offer a complete characterization of some cases, and/or that a complete account of the typical case includes elements of all of them.

13 For helpful comments and discussion, thank you to David Barnett, Tom Hurka, Liang Zhou Koh, Jennifer Nagel, Michael Omoge, Mason Westfall, Julia Jael Smith, an anonymous reviewer for Episteme, and audiences at the University of Toronto and the African Centre for Epistemology and Philosophy of Science. This research was supported by the Social Sciences and Humanities Research Council of Canada.

References

Addis, D.R. (2020). ‘Mental Time Travel? A Neurocognitive Model of Event Simulation.’ Review of Philosophy and Psychology 11(2), 233–59.
Aronowitz, S. (2021). ‘Exploring by Believing.’ Philosophical Review 130(3), 339–83.
Barkun, M. (2015). ‘Conspiracy Theories as Stigmatized Knowledge.’ Diogenes 62(3–4), 114–20.
Bearman, J. (2007). ‘Heaven's Gate: The Sequel.’ LA Weekly. https://www.laweekly.com/heavens-gate-the-sequel/.
Boyer, P. (2001). Religion Explained: The Evolutionary Origins of Religious Thought. New York, NY: Basic Books.
Buchsbaum, D., Bridgers, S., Weisberg, D.S. and Gopnik, A. (2012). ‘The Power of Possibility: Causal Learning, Counterfactual Reasoning, and Pretend Play.’ Philosophical Transactions of the Royal Society B: Biological Sciences 367(1599), 2202–12.
Cassam, Q. (2016). ‘Vice Epistemology.’ The Monist 99(2), 159–80.
Cassam, Q. (2019). Conspiracy Theories. Cambridge: Polity Press.
Chang, W. (2020). ‘The Monstrous-Feminine in the Incel Imagination: Investigating the Representation of Women as “Femoids” on R/Braincels.’ Feminist Media Studies 22(2), 254–70.
Chryssides, G.D. (2011). Heaven's Gate: Postmodernity and Popular Culture in a Suicide Group. London: Routledge.
Chryssides, G.D. (2016). ‘Conversion.’ In Lewis, J.R. and Tøllefsen, I. (eds), The Oxford Handbook of New Religious Movements: Volume II, pp. 25–35. Oxford: Oxford University Press.
Coady, D. (2007). ‘Are Conspiracy Theorists Irrational?’ Episteme 4(2), 193–204.
Currie, G. and Ravenscroft, I. (2002). Recreative Minds: Imagination in Philosophy and Psychology. Oxford: Oxford University Press.
Dawson, L. (1990). ‘Self-Affirmation, Freedom, and Rationality: Theoretically Elaborating “Active” Conversions.’ Journal for the Scientific Study of Religion 29(2), 141–63.
Dawson, L. (2006). Comprehending Cults: The Sociology of New Religious Movements. Oxford: Oxford University Press.
Delaney, P. (2021). ‘Former Pfizer VP: “Your Government is Lying to You in a Way that could Lead to Your Death”.’ LifeSiteNews. https://www.lifesitenews.com/news/exclusive-former-pfizer-vp-your-government-is-lying-to-you-in-a-way-that-could-lead-to-your-death/.
Dentith, M.R.X. (2014). The Philosophy of Conspiracy Theories. New York, NY: Palgrave Macmillan.
DeRose, K. (1992). ‘Contextualism and Knowledge Attributions.’ Philosophy and Phenomenological Research 52(4), 913–29.
Dobson, M. and Markham, R. (1993). ‘Imagery Ability and Source Monitoring: Implications for Eyewitness Memory.’ British Journal of Psychology 84(1), 111–18.
Douglas, K.M. and Sutton, R.M. (2018). ‘Why Conspiracy Theories Matter: A Social Psychological Analysis.’ European Review of Social Psychology 29(1), 256–98.
Douglas, K.M., Sutton, R.M. and Cichocka, A. (2017). ‘The Psychology of Conspiracy Theories.’ Current Directions in Psychological Science 26(6), 538–42.
Dyrendal, A. (2013). ‘Hidden Knowledge, Hidden Powers: Esotericism and Conspiracy Culture.’ In Asprem, E. and Granholm, K. (eds), Contemporary Esotericism, pp. 200–25. Sheffield: Equinox.
Dyrendal, A. (2016). ‘Conspiracy Theories and New Religious Movements.’ In Lewis, J.R. and Tøllefsen, I. (eds), The Oxford Handbook of New Religious Movements, Vol. 2, pp. 198–209. Oxford: Oxford University Press.
Flanagin, J. (2019). ‘How YouTube Became a Breeding Ground for a Diabolical Lizard Cult.’ The New Republic. https://newrepublic.com/article/154012/youtube-became-breeding-ground-diabolical-lizard-cult.
Galanter, M. (1999). Cults: Faith, Healing and Coercion. New York, NY: Oxford University Press.
Gendler, T.S. (2007). ‘Self-Deception as Pretense.’ Philosophical Perspectives 21, 231–58.
Hawthorne, J. and Stanley, J. (2008). ‘Knowledge and Action.’ Journal of Philosophy 105(10), 571–90.
Helton, G. (2020). ‘If You Can't Change What You Believe, You Don't Believe It.’ Noûs 54(3), 501–26.
Iannaccone, L.R. (2006). ‘The Market for Martyrs.’ Interdisciplinary Journal of Research on Religion 2(4), 1–28.
Imhoff, R. and Lamberty, P.K. (2017). ‘Too Special to be Duped: Need for Uniqueness Motivates Conspiracy Beliefs.’ European Journal of Social Psychology 47(6), 724–34.
Kacala, A. (2018). ‘Infowars' Alex Jones has a Long History of Inflammatory, Anti-LGBTQ Speech.’ NBC News. https://www.nbcnews.com/feature/nbc-out/infowars-alex-jones-has-long-history-inflammatory-anti-lgbtq-speech-n898431.
Karlis, N. (2021). ‘Cult Recovery Experts Explain how to “Deprogram” QAnon Adherents.’ Salon. https://www.salon.com/2021/03/14/cult-recovery-experts-explain-how-to-deprogram-qanon-adherents/.
Kim, J. (1988). ‘What is “Naturalized Epistemology?”’ Philosophical Perspectives 2, 381–405.
Klein, C., Clutton, P. and Dunn, A.G. (2019). ‘Pathways to Conspiracy: The Social and Linguistic Precursors of Involvement in Reddit's Conspiracy Theory Forum.’ PLoS ONE 14(11), e0225098.
Lamoureux, M. (2021). ‘QAnons are Harassing People at the Whim of a Woman they Say is Canada's Queen.’ Vice World News. https://www.vice.com/en/article/3aqvkw/qanons-are-harassing-people-at-the-whim-of-a-woman-they-say-is-canadas-queen-romana-didulo.
Lamoureux, M. (2022a). ‘The QAnon Queen Told Followers they Didn't Need to Pay Bills. It Didn't End Well.’ Vice News. https://www.vice.com/en/article/z3na53/qanon-queen-bills-electricty-canada.
Lamoureux, M. (2022b). ‘The “QAnon Queen” Told Her Followers to Arrest Cops. It Didn't Go Well.’ Vice News. https://www.vice.com/en/article/m7gb5y/queen-romana-didulo-citizen-arrest-qanon.
Langland-Hassan, P. (2020). Explaining Imagination. Oxford: Oxford University Press.
Lewis, J.R. (2016). ‘Brainwashing and “Cultic Mind Control”.’ In Lewis, J.R. and Tøllefsen, I. (eds), The Oxford Handbook of New Religious Movements, pp. 174–85. Oxford: Oxford University Press.
Loftus, E.F. (1997). ‘Creating False Memories.’ Scientific American 277(3), 70–5.
Menegus, B. (2016). ‘Pizzagaters Aren't Giving this Shit Up.’ Gizmodo. https://gizmodo.com/pizzagaters-arent-giving-this-shit-up-1789692422.
Mercier, H. (2020). Not Born Yesterday: The Science of Who we Trust and what we Believe. Princeton, NJ: Princeton University Press.
Michaelian, K. and Sutton, J. (2019). ‘Collective Mental Time Travel: Remembering the Past and Imagining the Future Together.’ Synthese 196(12), 4933–60.
Moore, R. (2005). ‘Reconstructing Reality: Conspiracy Theories about Jonestown.’ In Lewis, J.R. and Petersen, J.A. (eds), Controversial New Religions, pp. 61–78. Oxford: Oxford University Press.
Munro, D. (2021). ‘Imagining the Actual.’ Philosophers' Imprint 21(17), 1–21.
Muschalla, B. and Schönborn, F. (2021). ‘Induction of False Beliefs and False Memories in Laboratory Studies – A Systematic Review.’ Clinical Psychology & Psychotherapy 28(5), 1194–209.
Napolitano, M.G. (2021). ‘Conspiracy Theories and Evidential Self-Insulation.’ In Bernecker, S., Flowerree, A.K. and Grundmann, T. (eds), The Epistemology of Fake News, pp. 82–106. Oxford: Oxford University Press.
Nguyen, C. Thi (2020). ‘Echo Chambers and Epistemic Bubbles.’ Episteme 17(2), 141–61.
Nguyen, C. Thi (2021). ‘How Twitter Gamifies Communication.’ In Lackey, J. (ed.), Applied Epistemology, pp. 410–36. Oxford: Oxford University Press.
Packer, J. and Stoneman, E. (2021). ‘Where we Produce One, we Produce all: The Platform Conspiracism of QAnon.’ Cultural Politics 17(3), 255–78.
Phadke, S., Samory, M. and Mitra, T. (2021). ‘What Makes People Join Conspiracy Communities? Role of Social Factors in Conspiracy Engagement.’ Proceedings of the ACM on Human–Computer Interaction 4(CSCW3), 1–30.
Picciuto, E. and Carruthers, P. (2016). ‘Imagination and Pretense.’ In Kind, A. (ed.), The Routledge Handbook of Philosophy of Imagination, pp. 334–45. Abingdon: Routledge.
Pyun, Y.D. and Kim, Y.J. (2009). ‘Experimental Production of Past-Life Memories in Hypnosis.’ International Journal of Clinical and Experimental Hypnosis 57(3), 269–78.
Repp, M. (2005). ‘Aum Shinrikyo and the Aum Incident.’ In Lewis, J.R. and Petersen, J.A. (eds), Controversial New Religions, pp. 154–94. New York, NY: Oxford University Press.
Rini, R. (2017). ‘Fake News and Partisan Epistemology.’ Kennedy Institute of Ethics Journal 27(2), E-43–E-64.
Rosenblum, N.L. and Muirhead, R. (2019). A Lot of People are Saying: The New Conspiracism and the Assault on Democracy. Princeton, NJ: Princeton University Press.
Ryutaro, T. (2018). ‘The Role of Conspiracy Theory in the Aum Shinrikyo Incident.’ In Dyrendal, A., Robertson, D.G. and Asprem, E. (eds), Handbook of Conspiracy Theory and Contemporary Religion, pp. 389–406. Leiden: Brill.
Sarteschi, C. (2022). ‘How the Self-Proclaimed “Queen of Canada” is Causing True Harm to Her Subjects.’ The Conversation. https://theconversation.com/how-the-self-proclaimed-queen-of-canada-is-causing-true-harm-to-her-subjects-185125.
Schellenberg, S. (2013). ‘Belief and Desire in Imagination and Immersion.’ Journal of Philosophy 110(9), 497–517.
Schmidhuber, J. (2010). ‘Formal Theory of Creativity, Fun, and Intrinsic Motivation (1990–2010).’ IEEE Transactions on Autonomous Mental Development 2(3), 230–47.
Shah, N. (2006). ‘A New Argument for Evidentialism.’ Philosophical Quarterly 56(225), 481–98.
Shriner, S. (2005). Aliens on the Internet. New York, NY: iUniverse.
Singer, M.T. and Lalich, J. (1995). Cults in our Midst. San Francisco, CA: Jossey-Bass.
Smith, T. (2021). ‘“Exit Counselors” Strain to Pull Americans Out of a Web of False Conspiracies.’ NPR. https://www.npr.org/2021/03/03/971457702/exit-counselors-strain-to-pull-americans-out-of-a-web-of-false-conspiracies.
Sones, M. (2021). ‘It's “Entirely Possible” Vaccine Campaigns “Will be used for Massive-Scale Depopulation”: Former Pfizer VP.’ LifeSiteNews. https://www.lifesitenews.com/opinion/former-pfizer-vp-to-aflds-entirely-possible-this-will-be-used-for-massive-scale-depopulation/.
Spanos, N.P., Burgess, C.A. and Burgess, M.F. (1994). ‘Past-Life Identities, UFO Abductions, and Satanic Ritual Abuse: The Social Construction of Memories.’ International Journal of Clinical and Experimental Hypnosis 42(4), 433–46.
Spanos, N.P., Menary, E., Gabora, N.J., DuBreuil, S.C. and Dewhirst, B. (1991). ‘Secondary Identity Enactments during Hypnotic Past-Life Regression: A Sociocognitive Perspective.’ Journal of Personality and Social Psychology 61(2), 308.
Spaulding, S. (2016). ‘Imagination through Knowledge.’ In Kind, A. and Kung, P. (eds), Knowledge Through Imagination, pp. 207–26. Oxford: Oxford University Press.
Stanley, J. (2005). Knowledge and Practical Interests. Oxford: Oxford University Press.
Sternisko, A., Cichocka, A. and Van Bavel, J.J. (2020). ‘The Dark Side of Social Movements: Social Identity, Non-Conformity, and the Lure of Conspiracy Theories.’ Current Opinion in Psychology 35, 1–6.
Szpunar, K.K. and Schacter, D.L. (2013). ‘Get Real: Effects of Repeated Simulation and Emotion on the Perceived Plausibility of Future Experiences.’ Journal of Experimental Psychology: General 142(2), 323.
Van Leeuwen, N. (2011). ‘Imagination is Where the Action is.’ Journal of Philosophy 108(2), 55–77.
Van Leeuwen, N. (2014). ‘Religious Credence is Not Factual Belief.’ Cognition 133(3), 698–715.
Van Leeuwen, N. (2017). ‘Do Religious “Beliefs” Respond to Evidence?’ Philosophical Explorations 20(suppl. 1), 52–72.
van Prooijen, J.-W., Ligthart, J., Rosema, S. and Xu, Y. (2022). ‘The Entertainment Value of Conspiracy Theories.’ British Journal of Psychology 113(1), 25–48.
Vosoughi, S., Roy, D. and Aral, S. (2018). ‘The Spread of True and False News Online.’ Science 359(6380), 1146–51.
Williams, B. (1973). Problems of the Self. Cambridge: Cambridge University Press.
Winter, J. (2019). ‘FBI Document Warns Conspiracy Theories are a New Domestic Terrorism Threat.’ Yahoo! News. https://news.yahoo.com/fbi-documents-conspiracy-theories-terrorism-160000507.html.