Search for various other Aumann references in vault

Is there some connection between the martingale rule for diachronic updating and the importance of your beliefs about X not being predictable from your beliefs about Y and Z, where belief in X, Y, and Z clusters by demographic? I.e., is there an analogy between Bayesianism requiring deference to one's future self (since the expected value of your future belief should equal your current belief) and Aumann agreement, where your belief must line up with your neighbour's belief? Are both the martingale rule and Aumann agreement motivated by the same kind of argument (strict dominance / money pump?), and do they have similar implications and similar exceptions?

  • This gets more interesting if it seems like we should have epistemic duties which are asymmetric between the two cases, in that beliefs about our future beliefs shouldn't play the same epistemic role as beliefs about other people's beliefs. Ties in to the objective/participant stance stuff (see The objective and participant stances)
  • Seems to connect to Pruss's stuff on branching and the agent-relativity of epistemic duties
  • You could probably also frame this stuff as epistemic anthropics (i.e. regular anthropics lmao)
  • See other refs to Aumann
  • How does the obvious objection to the martingale rule (that we shouldn't price expected future irrational belief change, e.g. religious conversion or forcible sci-fi belief change, into our current beliefs) apply to the other, analogous principles?
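A rough way to put the two principles side by side (standard textbook formulations, assuming the usual idealizations: no expected irrationality, common priors, common knowledge):

```latex
% Reflection / martingale condition: conditional on learning what your
% future credence in X will be, your current credence should match it,
% so your current credence is the expectation of your future credence.
P_{\mathrm{now}}\!\left(X \mid P_{\mathrm{later}}(X) = p\right) = p
\qquad\Rightarrow\qquad
P_{\mathrm{now}}(X) = \mathbb{E}\!\left[P_{\mathrm{later}}(X)\right]

% Aumann agreement: if agents 1 and 2 share a prior and their posterior
% credences q_1, q_2 in X are common knowledge between them, then
q_1 = q_2
```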

Random aside: allegedly you should expect rational belief updating in an individual to look like a random walk (more precisely, a martingale). Because if you believe you'll believe p = 0.8 a week from now (modulo not expecting any head injuries), you should just update to that belief now. Why does this not seem like a requirement on social truth-seeking processes so much? Maybe it's just because they're not purely truth-seeking: https://www.astralcodexten.com/p/heuristics-that-almost-always-work. Or maybe it is a requirement, idk
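The martingale claim can be checked numerically: if nature draws the hypothesis from your prior and you update by Bayes on each observation, the average of your final credence across many runs should equal your starting credence (law of total expectation). A minimal sketch; the 0.8/0.2 coin biases, flip count, and run count are arbitrary illustration choices, not from the note:

```python
import random

def bayes_update(belief: float, heads: bool,
                 p_heads_if_true: float = 0.8,
                 p_heads_if_false: float = 0.2) -> float:
    """Posterior credence in a binary hypothesis about a coin's bias,
    after observing one flip."""
    like_true = p_heads_if_true if heads else 1 - p_heads_if_true
    like_false = p_heads_if_false if heads else 1 - p_heads_if_false
    marginal = belief * like_true + (1 - belief) * like_false
    return belief * like_true / marginal

def simulate(prior: float = 0.5, n_flips: int = 10,
             n_runs: int = 20000, seed: int = 0) -> float:
    """Average final credence across many runs. If updating is a
    martingale, this should come out close to the prior."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_runs):
        # Nature draws the hypothesis with probability `prior`,
        # fixing the coin's true bias for this run.
        true_bias = 0.8 if rng.random() < prior else 0.2
        belief = prior
        for _ in range(n_flips):
            heads = rng.random() < true_bias
            belief = bayes_update(belief, heads)
        total += belief
    return total / n_runs
```

Within each run the credence wanders toward 0 or 1, but averaged over runs it stays pinned at the prior, which is exactly the "your expected future belief is your current belief" condition.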