The theory, which I probably misunderstand because I have a similar level of education to a macaque, states that because a simulated world would eventually develop to the point where it creates its own simulations, it's then just a matter of probability that we are in a simulation. That is, if there's one real world, and a zillion simulated ones, it's more likely that we're in a simulated world. That's probably an oversimplification, but it's the gist I got from listening to people talk about the theory.
But if the real world sets up a simulated world which more or less perfectly simulates itself, wouldn't running a mirror sim-within-a-sim require at least twice the processing power and resources? How could the infinitely recursive simulations even get started unless the real meat people are constantly adding more and more hardware to their initial simulation? It would be like that cartoon (or was it a silent movie?) of a guy laying down train track struts while sitting on the cowcatcher of a moving train. Except in this case the train would be moving at close to the speed of light.
Doesn't this fact alone disprove the entire hypothesis? If I set up a 1:1 simulation of our universe, then just sit back and watch, any attempt by my simulant people to create something that would exhaust all of my hardware would just... not work? Blue screen? Crash the system? Crunching the numbers of a 1:1 sim within a 1:1 sim would not be physically possible for a processor that can just about handle the first simulation. The simulation's own simulated processors would still need their processing done by Meat World; you'd essentially just be passing the CPU buck backwards like a rugby ball until it lands in the lap of the real world.
And this is just if the simulated people create ONE simulation. If 10 people in that one world decide to set up similar simulations simultaneously, the hardware for the entire sim reality would be toast overnight.
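The worry above is basically a geometric series: if the real hardware has to do the arithmetic for every nested universe, and each level spawns some number of full-fidelity copies of itself, the host's total workload explodes with depth. A toy sketch of that bookkeeping (my own illustration with made-up units, not anything from the actual hypothesis):

```python
# Toy model: the host counts its own top-level simulation as 1 unit of
# work, and every universe at each nesting level spawns `branching`
# full-fidelity simulations of its own. All of that work ultimately
# lands on the real hardware.

def host_load(branching: int, depth: int) -> int:
    """Total simulated universes the real hardware must compute."""
    # Geometric series: 1 + b + b^2 + ... + b^depth
    return sum(branching ** level for level in range(depth + 1))

print(host_load(1, 3))   # one sim per level, 3 levels deep -> 4 units
print(host_load(10, 3))  # 10 sims per level, as in the example -> 1111 units
```

So even the modest "10 people each start one sim" case multiplies the host's bill a thousandfold after only a few levels, which is the intuition behind the hardware objection.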
What am I not getting about this?
Cheers!
If our simulated universe's framerate drops because of the extra compute required for the nested simulations we're running, would we even notice? It stands to reason that everything would slow down, including our perception of the universe.
For all we know, the smallest unit of time we can measure in our simulated existence could take an hour or more to render outside the simulation. To us, it's nearly instantaneous.
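The point above can be put as one line of arithmetic: if the simulation slows down by some factor, every clock inside it slows by the same factor, so inner measurements come out unchanged. A minimal sketch, with a made-up slowdown factor purely for illustration:

```python
# Toy arithmetic (hypothetical numbers): a global slowdown is invisible
# from inside because inner clocks slow down by the same factor.

slowdown = 3600.0  # assume: 1 inner second takes 1 outer hour to render

inner_event_seconds = 2.0                              # duration inside the sim
outer_render_seconds = inner_event_seconds * slowdown  # wall time outside

# What an inner clock actually reports: outer time divided by the
# same slowdown factor, so the factor cancels out.
inner_measured = outer_render_seconds / slowdown

print(outer_render_seconds)  # 7200.0 outer seconds to render the event
print(inner_measured)        # 2.0 -- indistinguishable from "real time" inside
```

The factor cancels no matter how large it is, which is why a framerate drop from the nested sims would never show up on our own instruments.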
The video game EVE Online uses a similar technique, called Time Dilation (TiDi), to handle large fights with thousands of players: the server slows the whole battle down for everyone instead of dropping updates.
Or the frame quality drops, and we're all Jerry. "My man!"
yes 👉
Looking good :)
Slow down!
Just watch for graphics tearing. On a completely unrelated note, why are earthquake zones so heavily populated?
It’s a scenario that Neal Stephenson covers in his book “Fall; or, Dodge in Hell”. Interesting read, although it’s one of my least favorite books of his; I liked the first book in the “Dodge” series a lot better.
Cool, I'll have to add that to my list. Thanks for the recommendation!