Assuming our simulation is not designed to auto-scale (and our Admins don’t know how to download more RAM), what kind of side effects could we see in the world if the underlying system hosting our simulation began running out of resources?
Who is to say the sim needs RAM? What if it were just a giant state machine where the current state depends only on the previous state, and the entire universe is the "RAM"?
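A toy sketch of that idea, using Rule 110 (a 1D cellular automaton) as a hypothetical stand-in for "the universe": the next state is a pure function of the current state, so the only storage the machine ever needs is the state itself.

```python
# Rule 110 lookup: (left, center, right) neighborhood -> next cell value.
RULE_110 = {(1,1,1): 0, (1,1,0): 1, (1,0,1): 1, (1,0,0): 0,
            (0,1,1): 1, (0,1,0): 1, (0,0,1): 1, (0,0,0): 0}

def step(state):
    """Compute the next state purely from the current one (wrapping edges)."""
    n = len(state)
    return tuple(RULE_110[(state[(i - 1) % n], state[i], state[(i + 1) % n])]
                 for i in range(n))

state = (0,) * 7 + (1,)   # a single live cell
for _ in range(3):
    state = step(state)   # memory use stays constant: one state at a time
```

No history is kept and nothing accumulates; however complex the patterns get, the "RAM" is just the current state.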
These answers are all really fun, but I didn't see anyone point out one thing: why should we assume that our creators' "computer" architecture is anything remotely similar to our technology? I'm thinking of something like SETI: we can't just assume that all other life is carbon-based (though evidently it's a pretty good criterion). The simulation could be running on some kind of dark matter machine or some other exotic material that we don't even know about.
Personally I don't subscribe to simulation theory. But if it were true, why would the system have any kind of limitation? I feel like if it can simulate everything from galactic superclusters down to strings vibrating on the Planck time scale, there are effectively no limits.
Then again, infinity is quite a monster, so what do I know?
all other life is carbon-based (though evidently it’s a pretty good criterion)
The short version is that the only other element that allows 4 covalent bonds is silicon, but nobody has been able to find a solvent that allows complex silicon-based molecules to form without instantly dissolving any structures they form.
How would you know what physics runs the host universe? For all we know, things like RAM limitations don't even apply there.
Who knows… maybe we’ll experience pointless wars and massive inequality… selfish douchebags who only care about bolstering their ego might gain power… heck, maybe even the climate will slowly start changing for the worse.
Limitations of hardware resources show up as "Natural Limits", like the speed of light, in the simulation. The amount of RAM consumed translates to the Hubble Bubble (the greatest distance light could have traveled since the beginning of our universe), and more so to the amount of matter and energy contained within it, which is constant. Energy and matter cannot be created or destroyed, only change form, so the total was set from the beginning.
Render distance would be reduced, requiring us to come up with plausible theories to account for the fact that there is a limit to the size of the so-called "observable universe".
One word
Alzheimer's
Why would we run out of RAM? Is there new matter being created? It’s not like we’re storing anything. We will keep using the same resources.
New human instances are being created, and as our society’s general education keeps going up, they demand more processing power.
As our tech goes up, this has to be simulated as well. Not only things like telescopes and the LHC: the computer running your game world doesn't actually exist, it's the supercomputer that's running it.
Obviously, this is just a drop in the bucket for an entity that can run a fully simulated universe, but the situation quickly becomes untenable if we start creating hyper-advanced simulations as well, and we are maybe only a few decades away from that.
Human instances still run on the same underlying physics. No further RAM is needed.
As our tech goes up, this has to be simulated as well
Everything is made up of atoms/photons/etc. If every particle is tracked for all interactions, it doesn’t matter how those particles are arranged, it’s always the same memory.
Atoms and photons wouldn’t actually exist, they would be generated whenever we measure things at that level.
Obviously, there are many ways to interpret what kind of simulation it would be. A full simulation from the Big Bang is fun, but it doesn't make for good conversation since it would be indistinguishable from reality.
I was thinking more of a video game like simulation, where the sim doesn’t render things it doesn’t need to.
where the sim doesn’t render things it doesn’t need to.
That can’t work unless it’s a simulation made personally for you.
I don't follow. If there are others, it would render for them just as much as for me. I'm saying it wouldn't need to render at an atomic level except for the few who are actively measuring at that level.
Everything interacting is “measuring” at that level. If the quantum levels weren’t being calculated correctly all the time for you, the LEDs in your smartphone would flicker. All those microscopic effects cause the macroscopic effects we observe.
If it were a simulation, there would be no need to go that far. We simulate physics without simulating the individual atoms.
None of it would be real; the microscopic effects would just be approximated unless a precise measurement tool were used, and then they would be properly simulated.
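A minimal sketch of that "render on measurement" idea, with hypothetical names (`observe`, `atomic_detail`): coarse physics is cheap and always available, while atomic-level detail is only computed, and then cached, the first time something measures at that resolution.

```python
import functools

@functools.lru_cache(maxsize=None)
def atomic_detail(region):
    """Expensive fine-grained state, generated lazily once per region."""
    return f"atoms of {region}"  # placeholder for a real particle simulation

def observe(region, precise=False):
    """Return coarse physics by default; force full detail only if measured."""
    if precise:
        return atomic_detail(region)      # detail is simulated on demand
    return f"approximation of {region}"   # cheap macroscopic stand-in

observe("lab bench")                 # only the approximation is computed
observe("lab bench", precise=True)   # now the detailed simulation runs
```

This is just the lazy-evaluation pattern from video game level-of-detail rendering, applied to the thread's thought experiment.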
We wouldn’t know the difference.
The probabilistic nature of quantum interactions could be a resource-saving mechanism in a higher-order simulation.
Without knowing the nature of the simulation, we don't even know if there is an analogue for RAM or limited memory. Maybe you could walk in and out of a door repeatedly and then glitch into a locked room. Maybe the whole thing would crash; our programs tend to do this when memory runs out. Maybe everything would just get paused or "adjusted down" to fit the restriction. The crash, pause, or throttle wouldn't be apparent to us "on the inside" at all if it were happening.
That’s why history repeats itself. It’s doing that more frequently these days because there’s more people remembering more things.
We are the RAM
We can see that already when something approaches the speed of light: time slows down for it.
This simplification horribly misunderstands what time-dilation is, and I love it.
I have a running theory that that’s also what’s going on with quantum physics, because I understand it so poorly that it just seems like nonsense to me. So in my head, I see it as us getting into some sort of source code we’re not supposed to see, and on the other side some programmers are going “fuck I don’t know, just make it be both things at once!” and making it up on the fly.
My vm is running out of ram.
We go to sleep and it clears
An automatic purge process will start to prevent this. It has happened several times in the past, most recently between 2019 and 2022, when it removed circa 7 million processes. Regular purges like this make sure the resources aren't maxed out before the admins can add more capacity.
I know exactly what would happen. It…uhh, what was I gonna say again? It just slipped out, it’ll come back…