Evaporative cooling implies groups tend to radicalise via self-selection.
(Though this fails for groups rising in popularity and for evangelistic groups, e.g. pop fandoms, political and religious movements.)

Idea inoculation = the process whereby first exposure to a terrible version of an idea inoculates a person against adopting better versions of it later on. This means we often have only one shot to persuade people, since a bad first attempt will inoculate them. So be hesitant to leap across wide inferential gaps.

Avoiding information cascades is the main reason to encourage forming inside views. Tags: Topics in Contemporary Epistemology; local and global epistemic virtues.

The grain-of-truth/realizability problem is the same as unknown unknowns and non-standard actions; all point to the importance, emphasised in virtue ethics, of having your attentional dispositions pick out the right set of opportunities and choices as salient to you. This can also be phrased in the language of mental frames, but that seems to posit a dualism between the frame and its content (the choices it selects), rather than a one-level, reflexive attentional disposition.

https://kevindorst.substack.com/p/bayesian-injustice This post gives a super clear, uncontroversial example of individually-theoretically-rational decision making that can lead to structural injustice, which plausibly means we should inquire in special ways, i.e. zetetic considerations are distinct from purely theoretical epistemic considerations. What’s the generalisation of this? Tag: epistemic virtue.
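A minimal toy model of the kind of mechanism the post gestures at, not taken from the post itself: assume an evaluator sees more noisy signals about majority-group candidates than about minority-group ones, and estimates quality by a standard normal-normal Bayesian update. All names, group sizes, and numbers here are illustrative.

```python
import random

random.seed(0)

def posterior_mean(signals, prior_mu=0.0, prior_var=1.0, noise_var=1.0):
    """Normal-normal conjugate update: shrink the sample mean toward the prior."""
    n = len(signals)
    xbar = sum(signals) / n
    weight = (n / noise_var) / (1 / prior_var + n / noise_var)
    return weight * xbar + (1 - weight) * prior_mu

def simulate(n_signals, n_candidates=10_000):
    """Estimated quality for candidates whose true quality is i.i.d. N(0, 1)."""
    estimates = []
    for _ in range(n_candidates):
        quality = random.gauss(0, 1)  # same quality distribution for both groups
        signals = [random.gauss(quality, 1) for _ in range(n_signals)]
        estimates.append(posterior_mean(signals))
    return estimates

majority = simulate(n_signals=10)  # evaluator sees lots of evidence per person
minority = simulate(n_signals=3)   # evaluator sees little evidence per person

THRESHOLD = 1.0  # "select anyone whose estimated quality exceeds 1"
selected_majority = sum(e > THRESHOLD for e in majority)
selected_minority = sum(e > THRESHOLD for e in minority)
# Fewer minority candidates clear the bar, despite identical true quality:
# sparser evidence means estimates shrink harder toward the prior mean,
# so fewer of them ever look exceptional.
```

Every individual update here is perfectly Bayesian; the structural disparity comes purely from unequal evidence, which is why it looks like a case where zetetic norms (go gather more evidence about the under-observed group) come apart from purely theoretical ones.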

Epistemic cul-de-sacs/attractors is a good metaphor.

Social epistemology: see the section on evaporative cooling.

Note that data saying epistemic bubbles don’t exist doesn’t mean echo chambers don’t exist: bubbles merely omit contrary voices, while echo chambers actively discredit them, so they leave different evidential footprints.

EC victims are often epistemically virtuous; the mechanisms of ECs just twist this into harm. It’s reverse-Mandeville: they make globally-virtuous policies locally vicious {and vice-versa, like regular Mandeville??}. Smart, P. R. 2017. ‘Mandevillian Intelligence’ suggests a fable-of-the-bees analogy for social epistemology. - oh cool lol, was wondering whether there are epistemic fables of the bees

Echo chambers: a social epistemic structure in which other relevant voices have been actively excluded and discredited. Here, mere exposure to conflicting information is insufficient to break the chamber - e.g. chamber-members might think it’s more likely that you’re cherry-picking info, reinforcing their prior on your maliciousness, and the leaders’ foresight. This means Sunstein-esque free-and-public-forum approaches may even be counterproductive.
- More specifically, an EC is ‘an epistemic community which:
  - creates a significant disparity in trust between members and non-members.
  - This disparity is created by excluding non-members through epistemic discrediting,
  - i.e. non-members are actively assigned an epistemic discredit, like malice, dishonesty, or unreliability,
  - while simultaneously amplifying members’ epistemic credentials,
  - i.e. members are assigned very high levels of trust, and these two aspects reinforce each other.
  - Finally, echo chambers are such that general agreement with some core set of beliefs is a prerequisite for membership,
  - where those core beliefs include beliefs that support that disparity in trust.’ (146)
- Additional (non-essential, but common) traits:
  - ECs often involve a disagreement-reinforcement mechanism, such as evidential pre-emption. If you tell your followers to expect disagreement/undermining in the form of people calling you a cultist, then when people do call you a cultist, that actually reinforces belief in the cult-theory and increases trust in the leader.
  - The opposite of corroborative bootstrapping happens: members treat independent sources of disagreement as stemming from a single source, instead of counting each independently. Conspiracy theories work this way, which explains some of their cultishness and bad epistemics.

  • Claims ECs normally form/are maintained intentionally because they foster power through epistemic control {This seems wrong to me, though ofc hard to litigate}
    • Fricker’s credibility gaps (between perceived and actual credibility) and Mill’s active ignorance are both mechanisms some but not all ECs might use to do this. Oppressed or apolitical groups can form ECs (149-50)
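Both evidential pre-emption and the anti-bootstrapping move above can be sketched as ordinary odds-form Bayesian updates; this is a toy illustration with made-up numbers, not a claim about any source’s formal model.

```python
def posterior(prior, likelihood_ratio):
    """Odds-form Bayes: posterior odds = prior odds * likelihood ratio."""
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

# Evidential pre-emption. The follower models a trustworthy leader as much
# better at predicting outside criticism than an untrustworthy one, so when
# the predicted criticism ("they'll call us a cult") arrives, trust RISES.
p_criticism_if_trustworthy = 0.9    # illustrative numbers throughout
p_criticism_if_untrustworthy = 0.5
trust = 0.6
trust_after = posterior(trust,
                        p_criticism_if_trustworthy / p_criticism_if_untrustworthy)
# trust_after > trust: the attack "confirmed" the leader's foresight.

# Anti-corroborative bootstrapping. One critic's disagreement is weak
# evidence against the core theory (likelihood ratio < 1). Counting three
# critics independently applies that ratio three times; collapsing them into
# "one biased source" applies it once, which protects the belief.
lr_one_critic = 0.5 / 0.9
counted_independently = posterior(0.9, lr_one_critic ** 3)
collapsed_into_one = posterior(0.9, lr_one_critic)
# counted_independently < collapsed_into_one
```

The point of the sketch: neither step requires irrational updating given the follower’s trust assignments, which is why virtuous members can still end up epistemically trapped.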