Assuming our simulation is not designed to auto-scale (and our Admins don’t know how to download more RAM), what kind of side effects could we see in the world if the underlying system hosting our simulation began running out of resources?

  • fidodo@lemmy.world

    That would only be a problem if you need dynamically allocated memory. It could be a statically allocated simulation where every atom is accounted for.
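
    To make that concrete, here is a minimal sketch (purely illustrative; the atom count and struct are made up) of a simulation whose entire state is reserved up front, so it can never ask for more memory at runtime:

    ```c
    /* Hypothetical sketch: a universe with a fixed, known-in-advance particle
     * count lives entirely in statically allocated storage. Nothing is ever
     * malloc'd, so there is nothing to run out of while it runs. */
    #include <stdio.h>

    #define NUM_ATOMS 1000000 /* assumed fixed when the simulation was "compiled" */

    struct atom {
        double x, y, z;    /* position */
        double vx, vy, vz; /* velocity */
    };

    /* static allocation: the whole universe is reserved before the first tick */
    static struct atom universe[NUM_ATOMS];

    int main(void) {
        /* every atom is accounted for up front; a tick just updates them in place */
        for (int i = 0; i < NUM_ATOMS; i++) {
            universe[i].x += universe[i].vx;
            universe[i].y += universe[i].vy;
            universe[i].z += universe[i].vz;
        }
        printf("ticked %d atoms with zero dynamic allocation\n", NUM_ATOMS);
        return 0;
    }
    ```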

    • Seasoned_Greetings@lemm.ee

      Given the whole “information can neither be created nor destroyed” aspect of atomic physics, taken literally, this theory checks out.

  • flashgnash@lemm.ee

    If our entire universe is a simulation, then so are our laws of physics. In the parent universe running our simulation, everything might be powered by pure imagination, and the concepts of memory, CPU cycles, or even electricity might not exist at all.

  • Pons_Aelius@kbin.social

    Simply put: we wouldn’t notice anything.

    Our perception of the world would be based only on the compute cycles, not on any external time frame.

    The machine could run at a million billion hertz or at one clock cycle per century, and your perception of time inside the machine would be the same.

    Same with low RAM: we would have no indication if we were constantly being paged out to a hard drive and read back into RAM as required.

    Greg Egan gave a great explanation of this in the opening chapter of his novel Permutation City.
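
    A toy illustration of that point (just a sketch; the sleep() call stands in for an arbitrarily slow, or paged-out, host): the simulated clock advances only when the simulation is stepped, so the host's real-time speed is invisible from inside.

    ```c
    /* Hypothetical sketch: inhabitants only ever see the simulated tick counter.
     * The sleep() call is the host running slowly (or paging us out); nothing
     * observable inside the loop changes because of it. */
    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        unsigned long subjective_time = 0; /* the only clock visible from inside */

        for (int step = 0; step < 5; step++) {
            subjective_time++;   /* one "moment" passes for the inhabitants */
            sleep(1);            /* host stalls: could be a nanosecond or a century */
            printf("subjective time: %lu\n", subjective_time);
        }
        return 0;
    }
    ```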

    • Feyr@lemmy.world

      Clearly wrong.

      Running out of RAM happens all the time. We see something and store it, and that something also gets stored in RAM. But if that second copy gets reaped by the OOM killer, the universe has to reprocess it.

      Since it’s already in our own copy, that causes weird issues. We call it déjà vu!
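
      Playing along, a toy sketch of that mechanism (hypothetical cache and names): an observation is cached, and if the cached copy gets evicted under memory pressure, the scene has to be reprocessed and is experienced "again".

      ```c
      /* Hypothetical sketch: observations are memoized in a tiny cache. A cache
       * miss (e.g. because the entry was reaped under memory pressure) forces the
       * scene to be reprocessed -- deja vu for whoever is watching. */
      #include <stdbool.h>
      #include <stdio.h>

      #define CACHE_SLOTS 4

      static int  cache[CACHE_SLOTS];        /* the universe's copy */
      static bool cache_valid[CACHE_SLOTS];

      static void observe(int scene_id) {
          int slot = scene_id % CACHE_SLOTS;
          if (!cache_valid[slot] || cache[slot] != scene_id) {
              /* miss: the scene is processed (again) */
              printf("reprocessing scene %d\n", scene_id);
              cache[slot] = scene_id;
              cache_valid[slot] = true;
          }
      }

      int main(void) {
          observe(7);   /* first time: processed */
          observe(7);   /* cached: nothing weird */
          observe(11);  /* collides with slot 7 % 4 and evicts it */
          observe(7);   /* reprocessed: deja vu */
          return 0;
      }
      ```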

  • 𝘋𝘪𝘳𝘬@lemmy.ml

    An automatic purge process starts to prevent this. It has happened several times in the past, most recently between 2019 and 2022, when it removed circa 7 million processes. Regular purges like this make sure the resources are not maxed out before the admins can add more capacity.

  • Vlarb@lemmy.ml

    I don’t necessarily believe this, but I’ll play along.

    To make it appear natural so we don’t notice, death is the first thing that comes to mind. So pandemics, disasters and wars that kill off beings on a large scale to free up memory. A globe with limited surface area seems ideal to stick us on to begin with, with anything outside of that sphere virtually impossible to access. The size of Earth could have been chosen because it fits comfortably within the RAM limits. If Earth is pushing the RAM limits, each planet could be hosted on its own server. So if we someday colonized Mars or the moon, the trip between would be like a server transfer making the RAM issues for interplanetary colonization inconsequential.

    If you want to really explore the fringes of this concept, maybe those inside the simulation would start to see glitches that shouldn’t happen as it runs out of RAM. UFOs, shadows, or synchronicities could become commonplace. People could randomly go catatonic or experience amnesia if they’re personally affected. If it got out of control across the entire simulation, perhaps a hard reset would become necessary. It may even be a planned cycle of hard resets based on the anticipated maximum lifespan of the simulation before things start to get fucky due to memory errors. So power-on = the Big Bang, and a hard reset would look something like the Big Crunch or the heat death of the universe.

  • person@lemm.ee

    You won’t notice anything. As things are deleted to save memory, all references to them are removed as well.
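
    A minimal sketch of what that bookkeeping might look like (hypothetical structs and names): the purge removes the object and every reference to it in the same step, so nothing inside is left holding a dangling "memory" of it.

    ```c
    /* Hypothetical sketch: purging an object together with every reference to it.
     * Afterwards nothing inside the simulation holds a handle to the missing
     * thing, so there is no way to notice it ever existed. */
    #include <stdlib.h>

    #define MAX_THINGS   100
    #define MAX_MEMORIES 100

    struct thing { int id; };

    struct world {
        struct thing *objects[MAX_THINGS];    /* the things themselves */
        struct thing *memories[MAX_MEMORIES]; /* inhabitants' references to them */
    };

    static void purge(struct world *w, int slot) {
        struct thing *victim = w->objects[slot];
        if (victim == NULL)
            return;
        /* scrub every reference first, so nobody can "remember" the purged thing */
        for (int i = 0; i < MAX_MEMORIES; i++) {
            if (w->memories[i] == victim)
                w->memories[i] = NULL;
        }
        free(victim);            /* reclaim the memory */
        w->objects[slot] = NULL; /* the slot is simply empty now */
    }

    int main(void) {
        struct world w = {0};
        w.objects[0]  = malloc(sizeof(struct thing)); /* one thing exists... */
        w.memories[0] = w.objects[0];                 /* ...and someone remembers it */
        purge(&w, 0);  /* now neither the thing nor the memory of it exists */
        return 0;
    }
    ```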

  • bjg13@lemmy.world

    Limitations of hardware resources show up as “Natural Limits” in the simulation, like the speed of light. The amount of RAM consumed translates to the Hubble Bubble, the greatest distance light could have traveled since the beginning of our universe, and more so to the amount of matter and energy contained within it, which is constant. Energy and matter cannot be created or destroyed, only change form, so the amount was set from the beginning.

  • espentan@lemmy.world

    Who knows… maybe we’ll experience pointless wars and massive inequality… selfish douchebags who only care about bolstering their ego might gain power… heck, maybe even the climate will slowly start changing for the worse.

  • HeartyBeast@kbin.social

    Render distance would be reduced, requiring us to come up with plausible theories to account for the fact that there is a limit to the size of the so-called ‘observable universe’.

    • AnomalousBit@programming.dev

      I believe you are thinking in terms of a Turing-machine-like computer. I don’t think it’s possible today to “suspend” the bits in a quantum computer. I also don’t think it’s possible to know if the simulation could be paused (or even “added to” without losing its initial state).

  • Sentient Loom@sh.itjust.works

    Why would we run out of RAM? Is there new matter being created? It’s not like we’re storing anything. We will keep using the same resources.

    • Grimy@lemmy.world

      New human instances are being created, and as our society’s general education keeps going up, they demand more processing power.

      As our tech goes up, this has to be simulated as well. Not only things like telescopes and the LHC, but your computer running a game world doesn’t actually exist; it’s the supercomputer that’s running it.

      Obviously, this is just a drop in the bucket for an entity that can make a fully simulated universe, but the situation quickly becomes untenable if we start creating hyper-advanced simulations as well, and we are maybe only a few decades away.

      • Blue_Morpho@lemmy.world

        As our tech goes up, this has to be simulated as well

        Everything is made up of atoms/photons/etc. If every particle is tracked for all interactions, it doesn’t matter how those particles are arranged; the memory needed is always the same.

        • Grimy@lemmy.world

          Atoms and photons wouldn’t actually exist; they would be generated whenever we measure things at that level.

          Obviously, there are many ways to interpret what kind of simulation it would be. A full simulation from the Big Bang is fun but doesn’t make for good conversation, since it would be indistinguishable from reality.

          I was thinking more of a video-game-like simulation, where the sim doesn’t render things it doesn’t need to.
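
          Roughly that idea as a toy sketch (hypothetical fields and names): detail only gets generated where something is actively looking; everywhere else a cheap approximation is good enough.

          ```c
          /* Hypothetical sketch: level-of-detail simulation. A region is updated
           * with a cheap aggregate model unless something inside is actively
           * measuring it, in which case the expensive per-particle path is used. */
          #include <stdbool.h>
          #include <stdio.h>

          struct region {
              bool under_measurement; /* is a microscope / LHC pointed at this region? */
              double bulk_energy;     /* cheap aggregate state used when nobody looks */
          };

          static void update_region(struct region *r) {
              if (r->under_measurement) {
                  /* expensive path: generate and track individual particles (stubbed) */
                  printf("rendering this region atom by atom\n");
              } else {
                  /* cheap path: nudge an aggregate value and move on */
                  r->bulk_energy *= 0.999;
              }
          }

          int main(void) {
              struct region lab   = { .under_measurement = true,  .bulk_energy = 1.0 };
              struct region ocean = { .under_measurement = false, .bulk_energy = 1.0 };
              update_region(&lab);    /* full detail: someone is measuring */
              update_region(&ocean);  /* approximation: nobody will notice */
              return 0;
          }
          ```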

          • Blue_Morpho@lemmy.world

            where the sim doesn’t render things it doesn’t need to.

            That can’t work unless it’s a simulation made personally for you.

            • Grimy@lemmy.world

              I don’t follow. If there are others, it would render for them just as much as for me. I’m saying it wouldn’t need to render at an atomic level except for the few who are actively measuring at that level.

              • Blue_Morpho@lemmy.world

                Everything interacting is “measuring” at that level. If the quantum levels weren’t being calculated correctly all the time for you, the LEDs in your smartphone would flicker. All those microscopic effects cause the macroscopic effects we observe.

                • Grimy@lemmy.world

                  If it were a simulation, there would be no need to go that far. We simulate physics without simulating the individual atoms.

                  None of it would be real; the microscopic effects would just be approximated unless a precise measurement tool were used, and then they would be properly simulated.

                  We wouldn’t know the difference.

    • blargerer@kbin.social

      The probabilistic nature of quantum interactions could be a resource-saving mechanism in a higher-order simulation.