• halcyoncmdr@lemmy.world · 3 months ago

      They have 16GB of RAM physically, but 3GB is reserved for AI. So yeah, only 13GB is usable by regular apps, even if you don’t care about any of the AI stuff.

        • halcyoncmdr@lemmy.world · 3 months ago

          It’s not about a single app, it’s about multitasking without having to reload apps.

          At various times I’ve juggled between 4 apps at once on my phone. Say something like Messaging, Firefox, maybe a lemmy app, and Bitwarden for logging into something.

        • seang96@spgrn.com · 3 months ago

          Adding to what others said, it’s beneficial to load an app from RAM rather than cold-starting it from storage:

          1. It loads faster, which is more convenient for the user
          2. Lower power usage, so you’ll get better battery life
          3. More RAM reduces disk writes, since cache and temporary files from cold-started apps can stay in memory instead of being written to storage.
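          On Linux (the kernel Android is built on) you can watch this caching at work from a desktop shell. A rough sketch — the field names are standard /proc/meminfo entries; the actual numbers will obviously differ per device:

```shell
# Show total RAM vs. RAM currently used as file/app cache.
# "Cached" memory is reclaimed almost instantly when an app needs the
# space, so keeping apps resident in RAM costs nothing under pressure.
grep -E '^(MemTotal|MemAvailable|Cached):' /proc/meminfo
```

          The point is that "used" RAM holding cache isn’t wasted: it speeds up warm starts and is handed back the moment something else needs it.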
          • catloaf@lemm.ee · 3 months ago

            Yeah but… I have a Pixel 6 with 8GB RAM. I just checked the memory usage. Over the past day, average usage has been about 4GB. And the biggest user is Android itself at 1.8GB. The next biggest is Instagram at 285MB, and that’s with me scrolling videos when I’m bored.

            Adding hardware just for the sake of it means people end up paying for features they almost certainly won’t use.

          • cron@feddit.org · 3 months ago

            “When Android uses this larger page size, we observe an overall performance boost of 5-10% while using ~9% additional memory.”

            From the linked article. So I doubt the larger page size is the (only) reason for 16GB of RAM; AI is the more likely reason.
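            For anyone curious what page size their own kernel uses, it’s one command (sketch for a desktop Linux shell; on-device you’d run the same thing via adb shell). Most current builds use 4 KB pages, so this typically prints 4096; the 16 KB builds the article benchmarks would print 16384:

```shell
# Print the size in bytes of one memory page on this kernel.
getconf PAGESIZE
```

            Larger pages mean fewer page-table entries and TLB misses (the 5-10% boost), but also more wasted space per allocation (the ~9% extra memory).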

            • Unreliable@lemmy.ml · 3 months ago

              Oh, I understand that it’s for AI; I just meant that more RAM would certainly help in this case.

            • henfredemars@infosec.pub · 3 months ago

              I think it’s likely you would have access to all of it, because the testing in the article clearly shows the kernel can see the memory. Thus, the GrapheneOS kernel should be able to use it.