Is Bayesianism Susceptible to the Mail-Order Prophet Scam? — LessWrong

Status: complete
Domain: www.lesswrong.com
Archived: 2026-04-02 02:50:20

Comment by Myron Hedderson - Short answer: Yes.

Longer answer: Two Bayesians who start with the same prior probabilities and see the same evidence should update their posterior probabilities in the same way, so their mental models stay consistent with each other. Two Bayesians who start with different priors but see the same evidence should update in ways that are predictable to each other and in line with the evidence. For example, if one reasoner (A) had a high prior probability that General Relativity is true while another (B) had a low one, then when an experiment provides evidence for General Relativity, A's estimate will change less than B's (because B's prior was further from what the evidence supports), but both will update in a direction and to an extent that either of them could predict. As they see more and more of the same evidence, their models of the world should converge.
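That convergence can be sketched numerically. The toy model below is an illustration, not anything from the comment: two reasoners hold different Beta priors over a coin's bias (Beta priors are the standard conjugate choice for this, so each update is just adding pseudo-counts), observe the same flips, and end up with nearly identical posterior means.

```python
# Two Bayesians with different Beta priors over a coin's bias observe the
# same sequence of flips; their posterior means converge on the truth.
import random

random.seed(0)
true_bias = 0.7  # assumed "ground truth" for the simulation

# Beta(alpha, beta) pseudo-count priors: A leans "biased toward heads",
# B leans "biased toward tails" -- deliberately far apart.
a_alpha, a_beta = 8, 2   # A's prior mean: 0.8
b_alpha, b_beta = 2, 8   # B's prior mean: 0.2

for _ in range(1000):
    h = 1 if random.random() < true_bias else 0  # same flip seen by both
    a_alpha += h; a_beta += 1 - h                # conjugate Beta update
    b_alpha += h; b_beta += 1 - h

mean_a = a_alpha / (a_alpha + a_beta)
mean_b = b_alpha / (b_alpha + b_beta)
print(round(mean_a, 3), round(mean_b, 3))  # both close to 0.7
```

After 1000 shared observations the 10 pseudo-counts of prior disagreement are swamped by the data, which is the convergence the comment describes.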

This is all assuming an ideal Bayesian reasoner with practically unlimited computing power who doesn't cheat or abandon Bayesian reasoning when it becomes inconvenient, and humans don't meet those constraints. But there is math that says how much you should update given particular evidence. So:

[...]

Yep. "How to weight credence" is a bit unclearly stated, but there's Bayes' formula, which tells you how to update your probabilities based on evidence, and that might be what you're getting at?
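Bayes' formula is P(H|E) = P(E|H)·P(H) / P(E). A minimal sketch of applying it, with illustrative numbers not taken from the comment:

```python
def bayes_update(prior_h, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) via Bayes' rule, with P(E) expanded over H and not-H."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

# Hypothetical numbers: prior of 30% that a coin is biased toward heads;
# a head is 70% likely if it's biased, 50% likely if it's fair.
post = bayes_update(0.30, 0.70, 0.50)
print(round(post, 3))  # 0.375
```

The formula is exactly the "how much to weight credence" rule: the prior gets multiplied by how much better the hypothesis predicted the evidence than its alternative did.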

Which is (one reason) to bother with Bayesianism at all. It's a method of approaching consensus when working under uncertainty. It's kind of an "agreeing to the rules of the game" situation, where "the rules" are a mathematical equation that says how probabilities must change when people disagree (and "must" here carries the same level of mathematical strength as saying "2 + 2 must equal 4"; it's not a thing that was decided by committee) - if, for example, you say something is 95% unlikely/5% likely [...]
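The scam in the post's title is worth making concrete, since it's where naive updating goes wrong. The sketch below (my construction, with assumed numbers) shows the selection effect: a scammer mails opposite stock predictions to each half of a mailing list, so one recipient is guaranteed to see a perfect streak.

```python
# Mail-order prophet scam: start with 1024 recipients; each round, mail
# "up" to half and "down" to the other half, then keep only the half that
# received the correct call. Someone ends up with a perfect track record.
recipients = 1024
rounds = 0
while recipients > 1:
    recipients //= 2   # only the correctly-called half stays in the game
    rounds += 1

# What a naive recipient computes: the chance of this streak under the
# "just guessing" hypothesis, which makes "prophet" look well supported.
p_streak_if_guessing = 0.5 ** rounds
print(rounds, p_streak_if_guessing)  # 10 rounds, probability 1/1024
```

The fix isn't abandoning Bayes' rule but conditioning on the right evidence: under the scam hypothesis, the probability that *some* recipient sees a 10-hit streak is 1, so "I received 10 correct predictions" carries far less evidence about the sender once the selection process is part of the model.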

Original URL: https://www.lesswrong.com/posts/FpEfERKZzGwapvnaH/is-bayesianism-susceptible-to-the-mail-order-prophet-scam?commentId=rSyoGgMmkTu6owAjT