• AlecSadler@sh.itjust.works
    3 months ago

I’ll admit I don’t use Macs, so maybe they are more efficient than the Linux and Windows machines I work off…

    …but I typically use machines with 64GB and recently upgraded my personal machine to 128GB. I still swap about 50GB to my SSD from time to time.

    And I’m not doing heavy graphic design or movie editing stuff.

    I cannot fathom for the life of me how 8GB would ever be feasible.

    • lolcatnip@reddthat.com
      3 months ago

      Dude, that’s how much RAM I used to have on a super high-end dev box at work with 56 cores. It was very helpful for compiling Chrome. WTF are you doing with a personal machine that needs that much RAM?

      • AlecSadler@sh.itjust.works
        3 months ago

        I mean it’s my personal machine but I am a software engineer consultant/contractor so I use it for work, too.

        • lolcatnip@reddthat.com
          3 months ago

          Ok fair enough. It’s just surprising to see someone say that. The standard-issue dev machine where I work is a laptop with 32 GB.

      • AlecSadler@sh.itjust.works
        3 months ago

I just said I’m not doing graphic design or movie editing. I typically have 10 different browser profiles open to separate data/bookmarks, maybe 8 email accounts in tabs plus Outlook (if not on Linux), 4-8 VS Code windows, a mix of JetBrains Rider or Visual Studio instances, a smattering of Postman/SQL Server/Azure Data Studio/Thunder Client, among other things like PDFs and documents. And then multiple Docker containers and other locally running servers.

        The swap usually comes in when I’m parsing a data file or something.
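Parsing a big data file is a classic way to blow past RAM into swap if the parser slurps the whole thing at once; streaming it keeps memory flat. A minimal, hypothetical sketch (made-up file and column names):

```python
# Hypothetical sketch: summing one column of a big CSV without loading
# the whole file into RAM. Streaming keeps the working set flat, so the
# parse never has to spill into swap. File/column names are made up.
import csv
import io

def sum_column(lines, column):
    """Accumulate one numeric column, holding one row in memory at a time."""
    total = 0.0
    for row in csv.DictReader(lines):
        total += float(row[column])
    return total

# The same call works on a real file opened with open("huge.csv")
sample = io.StringIO("size\n10\n20\n30\n")
print(sum_column(sample, "size"))  # → 60.0
```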

    • MacN'Cheezus@lemmy.today
      3 months ago

      I have a five year old MBP here with 16 gigs of RAM and it runs the latest version of macOS. I can run multiple web browsers with dozens of open tabs, VS Code, an LLM, and a video editing app on it, all simultaneously, without breaking a sweat.

      IDK what Apple’s secret sauce is but their shit just works better than everyone else’s, that’s a fact.

  • mechoman444@lemmy.world
    3 months ago

I mean, it makes sense. The vast majority of people buying Apple computers are loyalists or people who simply need an internet/word-processing machine.

And if you want to develop on Apple hardware, you have to pay a massive premium for their higher-end machines.

    • AdrianTheFrog@lemmy.world
      3 months ago

Their CPUs are actually really good now, when apps are optimized for them. Especially in single core, they are very competitive with top Intel or AMD chips while being way more power efficient.

e.g. in Geekbench 5.1 single-core, the M2 Max gets 1967 points (85%) compared to 2311 points for the 7950X3D and 2369 for the 14900K. The M2 Max (12 cores (8P + 4E), 12 threads) can draw a maximum of 36 watts, while the 7950X3D (16 cores, 32 threads) can draw around 250 watts and the 14900K (24 cores (8P + 16E), 32 threads) around 350 watts.

      Apple’s GPUs are definitely lacking though, in terms of performance.
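For what it’s worth, running those quoted numbers as rough points-per-watt (caveat: single-core score against peak package power, so this is just an illustration, not a real efficiency benchmark):

```python
# Back-of-the-envelope points-per-watt from the figures quoted above.
# Caveat: single-core score vs. peak package power, so this is only a
# rough illustration of the efficiency gap, not a proper benchmark.
chips = {
    "M2 Max":  (1967, 36),    # (Geekbench 5.1 single-core, max watts)
    "7950X3D": (2311, 250),
    "14900K":  (2369, 350),
}
for name, (score, watts) in chips.items():
    print(f"{name}: {score / watts:.1f} points/W")
```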

  • KillingTimeItself@lemmy.dbzer0.com
    3 months ago

What a weird title bro, of course they argue in favor of it, they sell the fucking hardware that they created. It’d be a little weird if they argued against it after spending billions designing and manufacturing it.

Regardless, I still can’t believe Apple thought an 8GB minimum was OK, genuinely baffling to me.

  • anhydrous@lemmy.world
    3 months ago

    My X220 and T520 each have 16GB. The designed max was actually “only” 8GB, but it turns out 16 GB actually works. I replaced the RAM modules myself without asking Lenovo for permission. Those models came out in 2011.

    • jaschen@lemm.ee
      3 months ago

My HP Omen 17″ was designed for a maximum of 32GB of RAM. I’m currently running 64GB in it.

    • Valmond@lemmy.world
      3 months ago

Yeah lol, my ThinkCentre with a 6th-gen Intel only had 8GB (I paid under 100€ for it), so I went shopping on a second-hand site to double that. The price for a 4, 8, or 16GB DDR4 stick (SODIMM; there seems to be a flood of used ones) was about the same, like 30€ shipping included, so now I’ve got 24GB.

    • Duamerthrax@lemmy.world
      3 months ago

This was also true for Apple computers before they started soldering the RAM in place. I remember going way over spec in my old G4 tower. Hell, I doubt the system would crash if you found larger RAM chips and soldered them in.

      • Klause@discuss.tchncs.de
        3 months ago

        I doubt the system would crash if you found larger ram chips and soldered them in.

You can’t even swap components with official ones from other upgraded models. Everything is tied down with verification codes and shit nowadays. So I doubt you could solder in new RAM and get it to work.

  • kamen@lemmy.world
    3 months ago

Yeah, sure. Even if what they say about the OS resource usage is true, it’s only a fraction of the total usage. A lot of multiplatform software will use the same resources regardless of the OS. Many apps eat RAM for breakfast, no matter if it’s content creation or software development. Heck, even smartphones these days have this much RAM or more.

    I won’t argue, I just won’t buy an Apple product in the near future or probably ever at all.

    • KillingTimeItself@lemmy.dbzer0.com
      3 months ago

      buys [insert price] laptop, top of the line, flagship, custom silicon, built ground up to be purpose specific.

      Opens final cut pro: crashes

      ok…

      • Retrograde@lemmy.world
        3 months ago

Especially paired with Apple’s 128GB integrated, non-replaceable storage. Whoops, you installed all of Microsoft Office? Looks like you have no room to save any documents :(

        • KillingTimeItself@lemmy.dbzer0.com
          3 months ago

Ah yes, we can’t forget the proprietary, non-controller-based drives that use the M.2 slot but aren’t actually NVMe drives; they’re just flash.

            • KillingTimeItself@lemmy.dbzer0.com
              3 months ago

It’s “NVMe” only in the sense that it’s non-volatile flash, probably even higher quality than most existing NVMe SSDs out there today.

The thing is that it literally is just the flash, on a card with an M.2 pinout that fits into an M.2 slot. It doesn’t have a storage controller or use any standardized communication method that already exists. It’s literally a proprietary, non-standard form-factor SSD.

The controller is integrated onto the SoC die itself; there is no storage controller on the storage module.

  • Blackmist@feddit.uk
    3 months ago

    8GB RAM is what my phone has.

    Having that in a laptop shows what they think of people buying their kit. They think you’re only buying it so you can type easier on Facebook.

      • Blackmist@feddit.uk
        3 months ago

        Yeah, but if you have plenty of RAM on Android, there’s a chance those apps you left in the background will still be running when you go back to them, rather than doing the usual Android thing of just restarting them.

        • KillingTimeItself@lemmy.dbzer0.com
          3 months ago

Nothing that requires 8GB of RAM, lol.

I’ve played the entirety of Java Minecraft on an old ThinkPad with 4GB of RAM. It didn’t crash (I don’t use swap).

          There literally shouldn’t be anything capable of using that much memory.

          • IthronMorn@sh.itjust.works
            3 months ago

What about running a chrooted *nix install and using VNC to connect to it? While web browsing and playing a background video? Just because you don’t use your RAM doesn’t mean others don’t. And no, I don’t use all my RAM, but a little overhead is nice.

            • KillingTimeItself@lemmy.dbzer0.com
              3 months ago

On a phone? I mean, I suppose you could do that, but VNC is not a very slick remote-access tool for anything other than, well, remote access. The latency and speed over Wi-Fi would be a significant problem. I suppose you could stream from your phone to your TV, but again, most TVs that exist today are smart TVs, so that’s literally a non-issue.

My example here was using a computer rather than a phone, to show that even desktop computing tasks don’t really use all that much RAM.

              • IthronMorn@sh.itjust.works
                3 months ago

Well, then by that logic, since desktop computing tasks don’t really use all that RAM, we shouldn’t need more than 8GB in a desktop, ever. Yes, my example was a tad extreme, VNC-ing into your own VM on your phone, but my point was that phones are becoming capable and replacing traditional computers more and more. A more realistic example: when I was using Samsung DeX the other day, I had 80-ish Chrome tabs open, a video chat, and a terminal SSH’d into my computer fixing it. I liked having the RAM overhead above me. Was I even close to 12GB? No. But it gave me room if I wanted another background program or had to spin something up quickly without disrupting my flow or lagging out/crashing.

          • greedytacothief@lemmy.world
            3 months ago

Is this bait? Because, like, you could be rendering, simulating, or running virtual machines. Lots of stuff that isn’t a web browser also eats RAM.

              • dustyData@lemmy.world
                3 months ago

My man, have you been to selfhosted? People are using smartphones for all kinds of crazy stuff. They are basically mini ARM computers, particularly the flagships: they can do many things like editing video and rendering digital drawings, and after they end their useful life they can host AdGuard, do torrent-to-NAS, host Nextcloud. You name it.

                • AdrianTheFrog@lemmy.world
                  3 months ago

                  It sounds a lot more cost effective to get a used mini-pc than a flagship phone for any sort of server stuff.

                • pythonoob@programming.dev
                  3 months ago

Something like the Samsung DeX app, which basically turns your phone into a mini computer with a keyboard/mouse and a monitor, wouldn’t be too bad tbh for most people. Take all your shit with you in your pocket and dock it at home or at work or whatever.

                • KillingTimeItself@lemmy.dbzer0.com
                  3 months ago

Yeah, I literally self-host a server running like 8 different services; I’m quite acclimated to it by now. A phone is the wrong device for this kind of thing. A Chromebook is going to be a better alternative, and you can probably get one cheaper anyway.

A big problem with phones is that they just aren’t designed for that kind of thing: leave a phone plugged in constantly and it’s going to spicy-pillow itself. Let alone even trying to do that on something that isn’t an Android. I cannot imagine the hell that self-hosting on an Android would be, let alone on an iPhone.

I could see a use case for it as a network relay in the event that you need a hyper-portable node or something. GLHF with the dongling if you need those.

Unfortunately, if you already have a server, it’s going to be better to just spin up a new task on that server, as the cost of running a new device is going to outweigh the cost of using an existing one that’s already running. Also, you can get stuff like a Raspberry Pi or Le Potato pretty cheap; not very powerful, but probably more utility, especially given the IO.

            • AdrianTheFrog@lemmy.world
              3 months ago

              you could be rendering, simulating, running virtual machines

On a phone? I guess you could, although 4GB is probably enough for any video game that any meaningful number of people play.

              • woelkchen@lemmy.world
                3 months ago

                People use phone apps for photo and video editing these days. The common TikTok kid out there doesn’t use Adobe Premiere on a desktop workstation.

                Phone apps often are desktop applications with a specialized GUI these days.

                • KillingTimeItself@lemmy.dbzer0.com
                  3 months ago

I mean yeah, but even then those aren’t significant filters, and what makes you think TikTok isn’t running a render farm somewhere in China to collect shit tons of data? They’re already collecting the data; might as well provide a rendering service to make the UI nicer. But I don’t use TikTok, so don’t quote me on it.

Those are also all built into TikTok, and I’m pretty sure TikTok doesn’t require 8GB of RAM to open.

                • woelkchen@lemmy.world
                  3 months ago

                  it’s not like most people are chronically browsing the web on their phones.

                  Yes, they do.

              • greedytacothief@lemmy.world
                3 months ago

                I was trying to mention things that weren’t just web browsers. Since it seemed the comment was about programs that use more ram than they seemingly need to.

                Edit: There’s like photogrammetry and stuff that happens on phones now!

                • AdrianTheFrog@lemmy.world
                  3 months ago

                  There’s like photogrammetry and stuff that happens on phones now!

No, the photogrammetry apps all use cloud processing. The LiDAR ones don’t, but that’s only on Apple phones, and the actual mesh quality is pretty bad.

                • KillingTimeItself@lemmy.dbzer0.com
                  3 months ago

I suppose photo editing would be one? Maybe? I’m not sure how advanced photo editing is on mobile; it’s not like you’re going to load up the entirety of GIMP or something.

As for photogrammetry, I’m not sure that would consume very much RAM. It could; I honestly don’t think it would be that significant.

  • sugar_in_your_tea@sh.itjust.works
    3 months ago

    Well yeah, they’re enough to meet the minimum use cases so they can upsell most people on expensive RAM upgrades.

That’s why I don’t buy laptops with soldered RAM. That’s getting harder and harder these days, but my needs for a laptop have also gone down. If the RAM is soldered, there’s nothing you can (realistically) do if you need more, so you’ll pay extra up front for the upcharged configuration. If it’s not soldered, you have a decent option to add RAM afterward, so there’s less value in upselling too much.

    So screw you Apple, I’m not buying your products until they’re more repair friendly.

    • akilou@sh.itjust.works
      3 months ago

I had an extra stick of RAM available the other day, so I went to open my wife’s Lenovo to see if it’d take it, and the damn thing is screwed shut with the smallest Torx screws I’ve ever seen, smaller than any driver I have. I was so annoyed.

      • sugar_in_your_tea@sh.itjust.works
        3 months ago

I bought the E495 because the T495 had soldered RAM and one RAM slot, while the E495 had both RAM slots replaceable. Adding more RAM didn’t need any special tools. Newer E-series and T-series both have one RAM slot and some soldered RAM. I’m guessing you’re talking about one of the consumer lines, like the Yoga series or something?

That said, Lenovo (well, Motorola in this case, but Lenovo owns Motorola) puts all kinds of restrictions on your rights if you unlock the bootloader of their phones (see the PDF version of the agreement). That, plus going down the path of soldering RAM, gives me serious concerns about the direction they’re heading, so I can’t really recommend their products anymore.

        If I ever need a new laptop, I’ll probably get a Framework.

        • tal@lemmy.today
          3 months ago

          I keep looking at the Frameworks, because I’m happy with the philosophy, but the problem is that the parts that they went to a lot of trouble to make user-replaceable are the parts that I don’t really care about.

          They let you stick a fancy video card on the thing. I’d rather have battery life – I play games on a desktop. If they’d stick a battery there, that might be interesting.

          They let you choose the keyboard. I’m pretty happy with current laptop keyboards, don’t really need a numpad, and even if you want one, it’s available elsewhere. I’ve got no use for the LED inserts that you can stick on the thing if you don’t want keyboard there.

They let you choose among sound ports, Ethernet, HDMI, DisplayPort, and various types of USB. Maybe I could see putting in more USB-C than some other vendors have. But the stuff I really want is:

          • A 100Wh battery. Either built-in, or give me a bay where I can put more internal battery.

          • A touchpad with three mechanical buttons, like the Synaptics ones that the Thinkpads have.

The fact that they aren’t soldering in the RAM and NVMe is nice in that they’re committing to not charging much more than market rate, so I guess they should get credit for that, but they are certainly not the only vendor to avoid soldering those.

          • sugar_in_your_tea@sh.itjust.works
            3 months ago

            Yeah, ThinkPad used to allow either a CD drive or an extra battery in their T-series. They stopped offering the extra battery and started soldering RAM, so I got the cheaper E-series (might as well save cash if I can get what I want).

            I think there’s a market there. Have an option for a hot-swap battery to bring on trips and use the GPU at home. Serious travelers could even bring a spare battery to keep working for longer.

            touchpad with three mechanical buttons

            Yes please! And give me the ThinkPad nipple as well. :) If they had those, I’d not bother with even looking at Lenovo. The middle button is so essential to my normal workflow that any other laptop (including my fancy MacBook for work) feels crappy.

            I’m guessing the things they made modular are just the low hanging fruit. It’s pretty easy to make a USB-C to whatever port, it’s a bit harder to make a pluggable battery in a slot that can also support a GPU.

      • tal@lemmy.today
        3 months ago

        smallest torx screws I’ve ever seen

        Torx is legitimately useful for small screws, because it’s more resistant to stripping than Phillips.

        Now, if they start using Torx security bits or some oddball shapes, then they’re just being obnoxious. But there are not-trying-to-obstruct-the-customer reasons not to use Phillips.

    • BorgDrone@lemmy.one
      3 months ago

      That’s why I don’t buy laptops with soldered RAM.

In my opinion, the disadvantages of user-replaceable RAM far outweigh the advantages. The same goes for discrete GPUs. Apple moved away from both, and I expect PC manufacturers to follow Apple’s move in the next decade or so, as they always do.

      • sugar_in_your_tea@sh.itjust.works
        3 months ago

        Here’s how I see the advantages of soldered RAM:

        • better performance
        • less risk of physical damage
        • more energy efficient
        • smaller

The risk of physical damage is already incredibly low, and the energy use of RAM is also incredibly low, so neither of those seems important.

        So that leaves performance, which I honestly haven’t found good numbers for. If you have this, I’m very interested, but since RAM speed is rarely the bottleneck in a computer (unless you have specific workloads), I’m going to assume it to be a marginal improvement.

        So really, I guess “smaller” is the best argument, and I honestly don’t care about another half centimeter of space, it’s really not an issue.

        • BorgDrone@lemmy.one
          3 months ago

          So that leaves performance, which I honestly haven’t found good numbers for. If you have this, I’m very interested, but since RAM speed is rarely the bottleneck in a computer (unless you have specific workloads), I’m going to assume it to be a marginal improvement.

This is where you’re mistaken. There is one thing that integrated RAM enables that makes a huge difference for performance: unified memory. GPU code is almost always bandwidth-limited, which is why on a graphics card the RAM is soldered on and physically close to the GPU itself; that’s needed to meet the GPU’s high bandwidth requirements.

By having everything in one package, the CPU and GPU can share the same memory, which means you eliminate any overhead of copying data to/from VRAM for GPGPU tasks. But there’s more: unified memory doesn’t just apply to the CPU and GPU, but also to other accelerators that are part of the SoC. AI acceleration is becoming increasingly important, and UMA means the neural engine can access the same memory as the CPU and GPU, again with zero overhead.

          This is why user-replaceable RAM and discrete GPUs are going to die out. The overhead and latency of copying all that data back and forth over the relatively slow PCIe bus is just not worth it.
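To put the copy-overhead argument in concrete terms, here’s a toy sketch (plain Python standing in for the two data paths; no actual GPU is involved and the timings are purely illustrative):

```python
# Toy illustration of the zero-copy argument above (plain Python, no
# real GPU): a "discrete" path that copies a buffer to device memory
# and back, vs. a "unified" path that touches the same memory in place.
import time

N = 50_000_000  # ~50 MB buffer

def discrete_path(data):
    vram = bytearray(data)   # host -> "VRAM" copy
    # ... kernel would run on vram here ...
    return bytearray(vram)   # "VRAM" -> host copy back

def unified_path(data):
    # ... kernel reads/writes the very same physical memory ...
    return data              # no copies at all

buf = bytearray(N)
t0 = time.perf_counter()
discrete_path(buf)
t1 = time.perf_counter()
unified_path(buf)
t2 = time.perf_counter()
print(f"with copies: {t1 - t0:.4f}s, zero-copy: {t2 - t1:.6f}s")
```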

          • sugar_in_your_tea@sh.itjust.works
            3 months ago

            Do you have actual numbers to back that up?

The best I’ve found is benchmarks of Apple silicon vs Intel+dGPU, but that’s an apples-to-oranges comparison. And if I’m not mistaken, Apple made other changes, like a wider bus to the memory chips, which again makes comparisons difficult.

            I’ve heard about potential benefits, but without something tangible, I’m going to have to assume it’s not the main driver here. If the difference is significant, we’d see more servers and workstations running soldered RAM, but AFAIK that’s just not a thing.

            • Turun@feddit.de
              3 months ago

I understand the scepticism, but without links to what you’ve found, or which claims in particular you consider dubious (RAM speed can be increased when soldered, higher speeds lead to better performance, etc.), it comes across as “I don’t believe you, because I choose not to believe you.”

              LTT has made a comparison video on ram speeds: https://www.youtube.com/watch?v=b-WFetQjifc

              Do you need proof that soldered ram can be made to run faster?

              • sugar_in_your_tea@sh.itjust.works
                3 months ago

Yes, and the result from that video (I assume; I skimmed it, but I’ve watched similar videos) is that the difference is negligible (like 1-10 FPS) and you’re usually better off spending that money on something else.

I look at the benchmarks between the Intel MacBook Pro and the M1 MacBook Pro: both use soldered RAM, yet the M1 gets so much better performance, even on non-GPU tasks (e.g. memory-heavy unit tests at work went from 3-5 min to 45-50 sec going from the latest Intel model to the M1). Docker build times saw a similar drop. But it’s hard for me to know how much of the difference is memory vs CPU changes. I’d have to check, but I’m guessing there’s also the DDR4 to DDR5 switch, which increases memory channels.

The claim is that proximity to the CPU explains it, but I have trouble quantifying that. For me, a 1-10 FPS difference isn’t enough to give up repairability and expandability. Maybe it is for others, but if that’s the difference, it’s a lot less than the claims they seem to make.
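If anyone wants a ballpark number for their own machine instead of marketing claims, here’s a crude copy-bandwidth probe (pure Python, so expect it to undershoot real hardware figures; STREAM-style benchmarks are the proper tool):

```python
# Crude memory copy-bandwidth probe (illustrative only; STREAM-style
# benchmarks are the proper tool, and Python adds its own overhead).
import time

def copy_bandwidth_gbps(size_mb=256, rounds=5):
    """Time large buffer copies and report GB/s; best of several rounds."""
    buf = bytearray(size_mb * 1024 * 1024)
    best = float("inf")
    for _ in range(rounds):
        t0 = time.perf_counter()
        _ = bytes(buf)  # a memcpy under the hood
        best = min(best, time.perf_counter() - t0)
    # factor 2: each copy reads the source and writes the destination
    return 2 * size_mb / 1024 / best

print(f"~{copy_bandwidth_gbps():.1f} GB/s effective copy bandwidth")
```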

                • Turun@feddit.de
                  3 months ago

The video has a short section on productivity (e.g. rendering or compiling). That part is probably the most relevant for most people. Check the chapter view on YouTube to jump directly to it.

I think a 2x performance improvement is plausible when comparing non-soldered RAM to Apple silicon, which goes even further and puts the memory on the same package as the die. If, of course, RAM is the limiting factor.

                  The advantages of upgradable, expandable ram are obvious. But let’s face it: most people don’t need and even less use that capability.

    • scarabic@lemmy.world
      3 months ago

      These days I don’t realistically expect my RAM requirements to change over the lifetime of the product. And I’m keeping computers longer than ever: 6+ years where it used to be 1 or 2.

People have argued millions of times on the internet that Apple’s products don’t meet people’s needs and are massively overpriced. Meanwhile, they just keep selling like crazy and people love them. I think the issue comes from having pricing expectations set by the race-to-the-bottom world of commoditized Windows/Android trash.

      • sugar_in_your_tea@sh.itjust.works
        3 months ago

I upgraded my personal laptop a year or so after I got it (started with 8GB, which was fine until I started doing Docker stuff), and I’m probably going to upgrade my desktop soon (16GB, which has been fine for a few years, but I’m finally running out). My main complaint about my work laptop is RAM (16GB, I think; I’d love another 8-16GB), but I can’t upgrade it because it’s soldered, so I have to wait for our normal refresh cycle (4 years; that happens next year). I also upgraded my NAS RAM when I upgraded a different PC.

        I don’t do it very often, but I usually buy what I need when I build/buy the machine and upgrade 3-4 years later. I also often upgrade the CPU before doing a motherboard upgrade, as well as the GPU.

        Meanwhile they just keep selling like crazy and people love them. I think the issue comes from having pricing expectations set over the in race-to-the-bottom world of commoditized Windows/Android trash.

        I might agree if Apple hardware was actually better than alternatives, but that’s just not the case. Look at Louis Rossmann’s videos, where he routinely goes over common failure cases that are largely due to design defects (e.g. display cable being cut, CPU getting fried due to a common board short, butterfly keyboard issues, etc). As in, defects other laptops in a similar price bracket don’t have.

I’ve had my E-series ThinkPad for 6 years with no issues whatsoever. The USB-C charge port is getting a little loose, but that’s understandable since it’s been mostly a kids’ Minecraft device for a couple of years now, and kids are hard on computers. I had my T-series before that for 5-ish years until it finally died of water damage (a lot of water).

        Apple products (at least laptops) are designed for aesthetics first, not longevity. They do generally have pretty good performance though, especially with the new Apple Silicon chips, but they source a lot of their other parts from the same companies that provide parts for the rest of the PC market.

        If you stick to the more premium devices, you probably won’t have issues. Buy business-class laptops and phones with long software support cycles. For desktops, I recommend buying higher-end components (Gold or Platinum power supply, mid-range or better motherboard, etc.), or buying from a local DIY shop with a good warranty if you’re going prebuilt.

        Like anything else, don’t buy the cheapest crap you can, buy something in the middle of the price range for the features you’re looking for.

  • Alien Nathan Edward@lemm.ee
    link
    fedilink
    English
    arrow-up
    0
    ·
    3 months ago

    Tim Apple be like “We’ve tried charging more money. Have we tried charging more money and delivering less stuff in exchange?”

    • goatman360@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      3 months ago

      Yes, they do, constantly. Yet people still keep buying. I hate that I have to use Apple for my job because the software and interface are exclusive to it.

      • sugar_in_your_tea@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        0
        ·
        3 months ago

        Yup, same. I really don’t like macOS, but that’s what we’ve standardized on. I’m a Linux guy and use Linux at home for everything.

      • Alien Nathan Edward@lemm.ee
        link
        fedilink
        English
        arrow-up
        0
        ·
        3 months ago

        I really like my macbook for dev work, and I think that now that macos is essentially a linux distro it’s quite nice, but it’s not that much better than the free distros and it’s getting worse while they get better. Right now the only thing keeping me on a mac at work is that they gave it to me and the only thing keeping me on a mac at home is that it’s already paid for.

        • KillingTimeItself@lemmy.dbzer0.com
          link
          fedilink
          English
          arrow-up
          0
          ·
          3 months ago

          you wanna expand on why you think it’s basically a linux distro? Last i heard macos was more closely based on BSD than it was linux, and this was ages ago. Unless they rewrote it without my knowledge it really shouldn’t be anything like either one of the two.

  • phoenixz@lemmy.ca
    link
    fedilink
    English
    arrow-up
    0
    ·
    3 months ago

    Granted, I’m a developer and my dev IDE already uses a good 10+GB, and I have probably hundreds of tabs and windows open across 6 desktops… But I’ve got 64GB, I’m considering upgrading to 128, and these clowns think 8 is okay today? My development laptop from like 10 years ago had 8GB.

    • sugar_in_your_tea@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      0
      ·
      3 months ago

      I’ve been okay with 16 for a while. I use Vim as my editor, and occasionally VSCode. I use a single desktop, but I generally have a half dozen or more tmux tabs for various parts of the project.

      That said, I’ve been feeling a bit squeezed with 16GB. The main RAM consumers are:

      • Firefox - I frequently have 100 tabs open, so it takes a few GB of RAM
      • Docker - running most of our app (a dozen or so microservices) takes 3-4GB if I’m careful about turning off stuff I don’t need, 5-6 if I’m not
      • Teams and Slack - especially during calls, these use a lot
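      If you want to see where the RAM is actually going, a quick sketch (assuming Linux with procps-style `ps`; the options differ on macOS/BSD) is to sort processes by resident memory:

      ```shell
      # Top 10 processes by resident set size (RSS, in kB), largest first.
      ps -eo rss,comm --sort=-rss | head -n 10
      ```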

      So I think 16GB should be the minimum, and 24GB should be average. I’m going to be adding another 16GB to my personal development machine (hobbies and whatnot), and my work laptop can’t be upgraded (MacBook), but I’ll be upgrading to an M3 or M4 soonish and will request more RAM.

      8GB is probably fine if you’re just running a browser and that’s it. If you’re doing anything else, 16GB should be the minimum.

    • datelmd5sum@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      3 months ago

      I have 16GB and I have to run shit I dev on local k8s. I have to close teams and my browser to get enough ram sometimes.

  • rasakaf679@lemmy.ml
    link
    fedilink
    English
    arrow-up
    0
    ·
    3 months ago

    Why tf can’t they sell Macs with upgradable parts?? They’re “so” into renewables and recycling and saving the planet and stuff. Then they should start selling their shit with upgradable parts. Even CPUs, if possible. Now let the Apple fanboys argue with that. And don’t bullshit me with “the RAM has to be near the SoC for performance”; they can redesign the mobo.

    • phoenixz@lemmy.ca
      link
      fedilink
      English
      arrow-up
      0
      ·
      3 months ago

      There is what they say they are in favor of, and there is what they really are in favor of.

      They are in favor of Apple getting all the monies, the end.

    • Caiman86@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      3 months ago

      They certainly used to. My wife’s 2012 MacBook Pro has upgraded RAM and SSD parts I’ve put in over the years and still runs fine, though it isn’t used much anymore and OS upgrades stopped a while ago.

      Their current environmental marketing is pure greenwashing bullshit and their stances on upgradability and repairability are terrible.

    • flop_leash_973@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      3 months ago

      Because that gives the user as much or more control over the device as Apple themselves have. One of the fairly consistent things about Apple over the years has been a desire to maintain tight control for themselves over the products they make.

    • accideath@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      3 months ago

      There are legitimate advantages to the RAM being soldered right next to the SoC. However, if anyone could figure out how to create a proprietary RAM module that slots in right next to the SoC (or even an SoC module including RAM), can be swapped out, and doesn‘t have any meaningful performance impact, it would be Apple. Just that it never will be Apple…

      • natebluehooves@pawb.social
        link
        fedilink
        English
        arrow-up
        0
        ·
        3 months ago

        The problem is the electrical resistance of the socket. Most of the performance on apple silicon is achieved through extremely high bandwidth, low latency memory. Unfortunately that necessitates a socketless design at the moment, and you can see that happening on the snapdragon X too.

        • accideath@lemmy.world
          link
          fedilink
          English
          arrow-up
          0
          ·
          3 months ago

          Yea, not just Snapdragon and Apple. Even Intel and AMD processors usually get paired with higher-bandwidth soldered RAM in many mobile offerings.

          And on GPUs, soldered VRAM has been a thing for a loooong time, with HBM being the prime example of what RAM close to the chip can do. AMD‘s Vega cards were highly sought after during the mining craze, even though they weren’t that fast at general computing, simply because their memory bandwidth was so far beyond any other consumer card’s…

    • AdrianTheFrog@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      3 months ago

      It’s basically just greenwashing. They pretend to be into renewables and recycling only when it doesn’t disincentivize people from buying the newest product. Ex: iPhone trade-in for recycling. Yes, they do recover some raw material, but you can only do it if you’re buying a new iPhone with that credit, and it’s probably also an attempt to keep cheap used iPhones off the market.

  • Yerbouti@lemmy.ml
    link
    fedilink
    English
    arrow-up
    0
    ·
    3 months ago

    My students with the 8GB version struggle to do basic audio work with only a few plugins. This is BS from Apple. Unless you use your computer only for web browsing, in which case you shouldn’t get a stupid Mac in the first place.

    • uis@lemm.ee
      link
      fedilink
      English
      arrow-up
      0
      ·
      3 months ago

      To be fair, I have no idea why audio plugins need so much RAM.

        • Jentu@lemmy.blahaj.zone
          link
          fedilink
          English
          arrow-up
          0
          ·
          3 months ago

          Apple has a masterclass in tiering their products in just such a way that in every tier but the top ones, you’re giving up something really important. If you spend the least you possibly can on a MacBook, Apple guarantees you’re going to have a very bad time for “doing the bare minimum to be seen with a laptop with an Apple logo on it”. Their whole tier system is an exercise in “how can we get away with crippling these things just enough that the customer feels it’s necessary to spend a little bit more” every step of the way. Then they make it unupgradable so you can’t sidestep their crafted feature tiers.

          • KillingTimeItself@lemmy.dbzer0.com
            link
            fedilink
            English
            arrow-up
            0
            ·
            3 months ago

            nvidia also does this. It’s actually insane.

            Love spending 200 USD (or however much they charge; it’s still too much) on 16GB of RAM in 2024 because of Apple. Very cool.

  • Veraxus@lemmy.world
    link
    fedilink
    English
    arrow-up
    0
    ·
    3 months ago

    My basic web dev Docker suite uses about 13GB just on its own, which - assuming you were on 16GB (double Apple’s minimum) - wouldn’t leave much for things like browser tabs, which also eat memory for breakfast.

    A fast swap is not an argument to short-change on RAM, especially since SSDs have a shorter lifespan than RAM modules. 16GB remains the absolute bare minimum for modern computing, and Apple is making weak, ridiculous excuses to pocket just a few extra bucks per MacBook.

    • accideath@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      3 months ago

      Playing devil’s advocate here: as someone who deals with stuff like that, you also wouldn’t buy the base-model Mac. The average computer user can get by with 8GB just fine, and it’s not like you can’t configure Macs with more than that.

      That of course doesn’t justify the abhorrent price of the upgrades…

      • PraiseTheSoup@lemm.ee
        link
        fedilink
        English
        arrow-up
        0
        ·
        3 months ago

        The average computer user can get by with 8GB just fine

        Hard disagree. The average computer user is idling at 5GB already, because the average computer user is stupid.

        • accideath@lemmy.world
          link
          fedilink
          English
          arrow-up
          0
          ·
          3 months ago

          Still leaves 3GB for the web browser, and the average user isn’t using anything else anyway. Even on Chrome that’s quite a few pages.

      • Specal@lemmy.world
        link
        fedilink
        English
        arrow-up
        0
        ·
        3 months ago

        And here I am, putting 16GB in every machine I work on, because it’s so damn cheap there’s no reason not to future-proof.

        • accideath@lemmy.world
          link
          fedilink
          English
          arrow-up
          0
          ·
          3 months ago

          I mean, same. The price difference between 8GB and 16GB is negligible, especially if you want dual channel on desktops.

            • Specal@lemmy.world
              link
              fedilink
              English
              arrow-up
              0
              ·
              3 months ago

              That’s because apple is a greedy grabby company who wants all your money. The easiest solution is to stop buying their products

            • accideath@lemmy.world
              link
              fedilink
              English
              arrow-up
              0
              ·
              3 months ago

              Oh yea, absolutely. I meant that in regard to the price of the memory itself, be it modules for your desktop PC or the chips themselves for soldered solutions. Apple’s markup is bonkers.

          • Specal@lemmy.world
            link
            fedilink
            English
            arrow-up
            0
            ·
            3 months ago

            My girlfriend’s mum wanted to know why her laptop was slow… It was because HP thought 4GB of RAM was acceptable in 2022 (when the laptop was sold). Granted, RAM wasn’t as cheap then as it is now… Still, I paid £30 for a brand-new 8GB DDR4 SO-DIMM; there’s no reason HP couldn’t do that. It’s annoying, the corners these companies cut.

            • accideath@lemmy.world
              link
              fedilink
              English
              arrow-up
              0
              ·
              3 months ago

              My experience is that 4GB is just about usable for a bit of web browsing and similar stuff, even on Windows 11. I have an old Surface Pro 4 lying around that, in a pinch, works perfectly fine with 11. Of course it’s not fast, but it’s totally usable.

              • Specal@lemmy.world
                link
                fedilink
                English
                arrow-up
                0
                ·
                3 months ago

                Her laptop just wasn’t having it. On Windows 11, Windows alone was using 3.7GB of RAM and it took about 30 seconds for Task Manager to open. As soon as I upgraded the RAM it was usable.

                I checked for any surprising background services or antivirus software and there was nothing, really.

        • accideath@lemmy.world
          link
          fedilink
          English
          arrow-up
          0
          ·
          3 months ago

          Maybe you’re not an average user then. Most people just browse the web, maybe manage some photos, or fill out a document once in a while. You could do that on 4GB if you wanted to, let alone 8.

          • Specal@lemmy.world
            link
            fedilink
            English
            arrow-up
            0
            ·
            3 months ago

            I wouldn’t say 4GB is usable for the average consumer. Assuming they’re on Windows 11, it’ll eat 3.7-ish GB of RAM just idling.

            • accideath@lemmy.world
              link
              fedilink
              English
              arrow-up
              0
              ·
              3 months ago

              You’re forgetting, though, that a lot of the RAM that Windows (and most modern operating systems) uses while idling is a cache of programs you’re likely to open, and it gets cleared if you open something else. That’s been a thing since Vista, and it was btw one of the reasons Vista was criticized for high memory usage. Windows 11 is very usable with 4GB of RAM if you’re not planning to do anything bigger than browsing the web or editing a Word document.

            • uis@lemm.ee
              link
              fedilink
              English
              arrow-up
              0
              ·
              3 months ago

              How? I have 108 tabs open and still use 2.67GB of RAM.

              • Specal@lemmy.world
                link
                fedilink
                English
                arrow-up
                0
                ·
                3 months ago

                Tabs of what? Chrome’s RAM usage is more of a meme than an actual RAM issue; Windows will only let an application use so much RAM depending on availability.

                • uis@lemm.ee
                  link
                  fedilink
                  English
                  arrow-up
                  0
                  ·
                  edit-2
                  3 months ago

                  108 tabs in Chromium. The mentioned RAM usage is total RAM usage, including all system and kernel memory but excluding page cache. Forgot to mention LibreOffice in the background.
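                  A sketch of how to get a figure like that on Linux (total usage excluding reclaimable page cache), assuming /proc/meminfo is available:

                  ```shell
                  # MemTotal minus MemAvailable (both in kB); MemAvailable already
                  # accounts for reclaimable page cache, so this roughly matches
                  # "total usage excluding page cache".
                  awk '/^MemTotal/ {t=$2} /^MemAvailable/ {a=$2} END {printf "in use: %.2f GB\n", (t-a)/1048576}' /proc/meminfo
                  ```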

    • filister@lemmy.worldOP
      link
      fedilink
      English
      arrow-up
      0
      ·
      edit-2
      3 months ago

      Have you seen the price difference between the 8 and 16GB MacBooks? It’s ridiculously expensive.

        • filister@lemmy.worldOP
          link
          fedilink
          English
          arrow-up
          0
          ·
          3 months ago

          Yes, my bad, I meant to say the difference in price between the 8 and 16GB models. I know RAM has become dirt cheap nowadays, and there’s no excuse for Apple to keep offering an 8GB model; this is exactly planned obsolescence.

          • localhost443@discuss.tchncs.de
            link
            fedilink
            English
            arrow-up
            0
            ·
            3 months ago

            Yeah, I was just pointing out the insanity of their pricing, using sarcasm. It’s the main way we communicate over here.

            The price difference between the first two models, where 8GB of RAM is the only change, is £200. Post-2025 I’m going to need some solution to replace my Windows install, which solely runs CAD/CAM software. If it weren’t for this scumbaggery I’d buy a Mac to replace Win10, but at present Apple are such a shower of cunts that I think I may have to put up with Win11.

            What a fucking choice…

    • hector@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      0
      ·
      3 months ago

      Wow, 13GB! I did some heavy stuff on my computer with a shit ton of Docker servers running together, plus deployment, and I never reached 13GB!

      Without disclosing private company information lol what are you doing ;)

      • Veraxus@lemmy.world
        link
        fedilink
        English
        arrow-up
        0
        ·
        3 months ago

        Running a suite of services in containers (DBs, DNS, reverse proxy, memcached, redis, elasticsearch, shared services, etc.) plus a number of discrete applications that use all those things. My day-to-day usage hovers around 20GB, with spikes to 32 (my max allocation) when I run parallelized test suites.

        Docker’s memory usage really adds up fast.
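        If you want to rein in the hungrier containers, Compose lets you cap memory per service. A minimal sketch (service names, images, and limits here are made up for illustration):

        ```yaml
        # docker-compose.yml fragment; names and limits are illustrative.
        services:
          elasticsearch:
            image: elasticsearch:8.13.4
            mem_limit: 2g     # hard cap; set the JVM heap below this too
          redis:
            image: redis:7
            mem_limit: 256m
        ```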

      • ben_dover@lemmy.ml
        link
        fedilink
        English
        arrow-up
        0
        ·
        edit-2
        3 months ago

        Not OP, but I have to run the frontend and backend of a project in Docker simultaneously (multiple Postgres and Redis DBs, queues, a search index, etc., plus two webservers). With a few browser tabs and two VSCode instances open on top, that regularly pushes my machine over 15GB of RAM usage.

        pretty much like this