The Specific Anti-"The Generalized Anti-Boltzmann principle" Post
I LOVE ANTHROPICS I LOVE ANTHROPICS I LOVE ANTHROPICS I LOVE ANTHROPICS
You might say that the space of all possible conscious experiences is the wrong reference class, and you might be right. After all, why aren’t you an arthropod? Most brains are arthropod brains, so it is very surprising on priors that you find yourself in a human brain having human thoughts rather than arthropod ones. I’m actually not sure whether this should count as evidence for the Yudkowsky view that almost all animals are not conscious, but it doesn’t feel like it should. Anyway, what is the right reference class?
The right reference class is the set of all experiences identical to my current experiences and no others.
So long as the universe lasts forever (or basically forever), an infinite (or basically infinite) number of Boltzmann brains occur[1]. So long as some fraction of them have experiences identical to mine, there will be a finite number of normal-world "reading your blog" experiences and an infinite (or basically infinite) number of Boltzmann-world "reading your blog" experiences which are exactly identical.
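To make the bookkeeping explicit, here’s a sketch of the counting argument, assuming naive indifference over experience-identical observers and truncating the Boltzmann copies at some large N:

$$P(\text{ordinary} \mid \text{this exact experience}) = \lim_{N \to \infty} \frac{n}{n + N} = 0,$$

where $n$ is the (finite) number of ordinary copies of you and $N$ counts the Boltzmann copies having exactly this experience. Whether that limit survives a proper measure-theoretic treatment is exactly the infinity-worry I flag at the end of the post.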
Having Adequately Defended Boltzmann Brains,
This relates to the broader point that you can't reason from "most observers like me [broadly construed]". For instance, you can't say "well, if humans are going to be around for a long, long time, that means almost all humans will live in the future. I don't find myself in the future; therefore the apocalypse is near!" You can’t say that because future humans don't have the experiences you are having. When they look at calendars or technology or whatever, they see calendars that say Universal Century 0079, not 2025 CE, and they see cool Gundams instead of tanks. So no, "you" couldn't just as easily have found yourself in the future, so you can't reason from the fact that you aren't in the future. The only minds that could give rise to your current experiences are:
1. Your exact mind (and duplicates thereof)
2. A really lucky cloud of gas
3. Me, perfectly simulating you in my head before deciding if I wanna cooperate with you
4. Melkor's giga-torture compute cluster, which is currently torturing 10^10 copies of you
5. And 3-5 obscure edge cases that exclusively crop up in Bostrom papers I haven't read (I saw a cool iceberg the other day; I recommend reading phil papers while cuddling beautiful women, both experiences are improved)
Ignoring the other 4 because they're hard to think about, you shouldn't be at all surprised that, given that you are having your experience, you find yourself now instead of later. The probability was basically 1! People in the utopian Future don’t have experiences like yours!
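To put the same point in Bayesian terms (my gloss, with $E$ standing for your exact current experience):

$$P(\text{now} \mid E) = \frac{P(E \mid \text{now})\,P(\text{now})}{P(E \mid \text{now})\,P(\text{now}) + P(E \mid \text{future})\,P(\text{future})} \approx 1,$$

because future observers see Universal Century calendars and Gundams, so $P(E \mid \text{future}) \approx 0$ no matter how many future observers there are, and the Doomsday update never gets off the ground.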
Aha, you say! Sure, that's true given that I am having my experience, but is it still true given that I am having any experience? Or given that I am having any experience of reading the blog post of someone who is WRONG on the INTERNET!?
I Lied, We Are Talking About Boltzmann Brains Again
No. If you could condition on either of those things, the previous argument would not hold. However, you can't condition on that and get meaningful answers, so I’m still right, nya nya nya.
You cannot tell based on your experience whether you are Identical-to-real-you Boltzmann Brain 1 (BBI1), BBI2, BBI3, or ordinary you, so you might as well be indifferent between these options. It isn't the case that you literally could have been BBI1; you either are or you aren't. There are no dice that choose which Boltzmann brain “gets” which experience (I am assuming this for sort-of-bad reasons; see the end of the post for more). Each gets an identical experience. It is purely epistemic uncertainty we are looking at here, not random placement.
Obviously, you only have epistemic uncertainty between being you and BBIs. There is no epistemic uncertainty between being you and being observably fuzzy Boltzmann brain 1 (BBF1), whose experience you could tell apart from yours.
Since I have now ruled out being epistemically uncertain about being a BBF, and ruled out being randomly placed into any set of brains, I therefore conclude that I don’t know whether I’m a BBI or not.
The End-of-Post Justification for Rat Crimes:
Doesn't it just seem really weird if, rather than every particle arrangement identical to your brain giving rise to an identical conscious experience with no additional features, there is an additional feature, and that feature is a random pick of "which" experience maps to it? Like, idk, one way this could work would be that there’s a bunch of identical souls that could map to identical brains, and they map at random, such that you literally could have been someone else. That'd be weird to me, so I'm assuming it's not real.
I'm not sure there's any principled reason such a property couldn't exist, though. So if you want to say anthropic reasoning is valid because you reckon such a property does exist and randomizes between exactly the right set of brain-experience mappings, I lack the might to stop you today. I just think it's pretty silly.
I'd like to say this doesn't matter for the initial disagreement, but it totally does and is a possible crux of it (assuming that finding yourself in a privileged infinite set of real-like you-s is surprising relative to finding yourself in an infinite set of infinite sets of obviously fake you-s). I'm still not sure how to reason about infinities. I will incant the words "measure theory" and "finitism wrt actual quantities like time" and hope this protects me from witches, vampires, and mathematicians.
Also, I’m totally leaving out how the universe plausibly[2] being infinite and having infinite matter already means there are plausibly infinite real copies of you. This is because I don’t know how to do math.
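For anyone who does, the usual hand-wave (as I understand it, and assuming independent, identically distributed Hubble volumes, which is itself contestable) goes something like:

$$P(\text{at least one copy of you among } k \text{ volumes}) = 1 - (1 - p)^k \to 1 \quad \text{as } k \to \infty,$$

where $p > 0$ is the chance that any given volume contains a matter arrangement identical to you; the expected number of copies, $kp$, diverges too.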
[1] I mean, it seems plausible at least; maybe it lasts forever but experience is substrate-dependent and the universe runs out of carbon and spontaneous particle/antiparticle generation stops and and and [whatever else would have to be true for an infinite-duration universe to not have infinite Boltzmann brains].
[2] Nick Bostrom said it.
