Seriously, what app is capable of saturating 13GB?
It’s not about a single app, it’s about multitasking without having to reload apps.
At various times I’ve juggled between 4 apps at once on my phone. Say something like Messaging, Firefox, maybe a Lemmy app, and Bitwarden for logging into something.
Me too, and I’ve never had an issue juggling those apps on my Pixel 6 with 8gb RAM.
At that point I go to my PC.
Adding to what others said: it’s beneficial to be able to resume an app straight from RAM instead of reloading it.
Yeah but… I have a Pixel 6 with 8GB of RAM. I just checked the memory usage. Over the past day, average usage has been about 4GB. And the biggest user is Android itself at 1.8GB. The next biggest is Instagram at 285MB, and that’s with me scrolling videos when I’m bored.
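For anyone who wants to check this themselves, here’s a rough sketch you can run from a shell on the device (e.g. Termux or `adb shell`). It assumes a standard Linux `/proc/meminfo` with `MemTotal` and `MemAvailable` fields, which Android normally exposes:

```shell
# Values in /proc/meminfo are reported in kB.
total_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
avail_kb=$(awk '/^MemAvailable:/ {print $2}' /proc/meminfo)
used_kb=$((total_kb - avail_kb))
echo "total: $((total_kb / 1024)) MB, in use: $((used_kb / 1024)) MB"
```

Note this counts memory the kernel uses for caches differently than the Settings app does, so the numbers won’t match exactly.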
Adding hardware just for the sake of it just means it’s costing people for features they almost certainly won’t use.
I think it’s more to do with switching apps without having to reload them or lose your state in minimized apps.
If 16KB pages are enabled by default, the extra RAM will certainly be needed, especially in cases like multitasking.
https://android-developers.googleblog.com/2024/08/adding-16-kb-page-size-to-android.html?m=1
That’s from the linked article. So I doubt the larger page size is the (only) reason for 16GB of RAM; AI is the more likely reason.
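You can see which page size your device is actually running with from any shell on it. A minimal sketch, assuming `getconf` is available (it is in Termux and in recent `adb shell` environments):

```shell
# 4096 on most current devices; a build with 16 KB pages
# (per the linked article) reports 16384 instead.
pagesize=$(getconf PAGESIZE)
echo "page size: $pagesize bytes"
```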
Oh, I understand that it’s for AI; I just meant that more RAM would certainly help in this case.
Ollama server running in termux
deleted by creator
What if we use GrapheneOS? I would guess that the RAM would no longer be reserved, and you could use the full 16GB.
I think it’s likely you would have access to all of it because the testing in the article clearly shows the kernel can see the memory. Thus, the graphene kernel should be able to use it.
deleted by creator