edit: hey guys, 60+ comments, I can’t reply from now on, but know that I am grateful for your comments, keep the convo going. Thank you to all of y’all who gave unbiased answers, and thanks also to those who told me about Waydroid and Docker

edit: Well, now that’s sobering: apparently I can do most of these things on Windows with ease too. I won’t be switching back to Windows anytime soon, but it appears that my friend was right. I am getting FOMO (fear of missing out) right now.

I do need these apps right now, but there are some apps on Windows for which we don’t have a great replacement:

  1. Adobe
  2. MS Word (yeah, I don’t like LibreOffice and most of the LibreOffice suite; of course it’s not as good as the MS suite, but it’s really bad)
  3. Games (a big one, although Steam is helping bridge the gap)
  4. Many torrented apps; most of these are Windows-specific, so I won’t have any luck installing them on Linux
  5. Apparently Windows is allowing its users to run some Android apps?

Torrented apps would be my biggest concern. I mean, these are Windows-specific, so how can I run them on Linux? Seriously, I want to know how. Can Wine run most of these apps without errors? I am thinking of torrenting some educational software made for Windows.
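For what it’s worth, here is a minimal sketch of how people usually try a Windows-only program under Wine. The installer name and install path are placeholders, plenty of software still needs extra runtime pieces via winetricks (or simply won’t work), so check the WineHQ AppDB for the specific program first.

```bash
# Use a dedicated prefix so the app can't mess up your main Wine setup.
# "EduApp-Setup.exe" and the install path are placeholders for your program.
export WINEPREFIX="$HOME/.wine-eduapp"
winecfg                                   # initialises the prefix; pick a Windows version here
wine ~/Downloads/EduApp-Setup.exe         # run the Windows installer
wine "$WINEPREFIX/drive_c/Program Files/EduApp/EduApp.exe"   # launch the installed program

# winetricks can pull in common runtimes the app may expect, e.g.:
# winetricks corefonts vcrun2019
```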



Let me list the customizations I have done with my XFCE desktop, and you tell me if I can do that on Windows.

I told my friend that I can’t leave Linux because of all the customization I have done, and he said: you just don’t like to accept that Windows can do that too. Yeah, because I think it can’t do some of it (and I like Linux better).

But yeah, let’s give the devil its due: can I do these things on Windows?

  1. I have applications which launch from the terminal, e.g. typing vlc opens VLC (no questions asked, no other stuff needed, just type vlc).
  2. Bash scripts which update my system (not completely; snaps and flatpaks seem to be immune to this). I am pretty sure you can’t do this on Windows. (See the sketch after this list.)
  3. I can basically automate most of my tasks, and it integrates well with my apps.
  4. I can create desktop launchers.
  5. Not updating my system. I love to update, because my updates aren’t usually 4 freaking GB; the largest update I have seen has been 200-300 MB, probably less. But yeah, I am free to not update my PC if I so choose. Can you do this on Windows? Also, Linux updates fail less often. I mean, an update might break your system, but it won’t stop in the middle and say “bye bye, update failed”, making you waste 4 GB downloading it again. PS: You should always keep your apps up to date, mostly for security reasons, but Linux won’t force it on you and ruin your workflow.
  6. Create custom panel plugins.
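About point 2: a minimal sketch of such an update script, assuming an apt-based distro (adjust for dnf, pacman, etc.). Snaps and flatpaks each have their own updater, which is why a plain apt script leaves them untouched; calling their updaters explicitly fixes that.

```bash
#!/usr/bin/env bash
# update-all.sh -- minimal sketch for an apt-based distro
set -euo pipefail

sudo apt update && sudo apt full-upgrade -y    # regular system packages

# Flatpaks and snaps have their own update mechanisms, so call them explicitly:
if command -v flatpak >/dev/null; then flatpak update -y; fi
if command -v snap    >/dev/null; then sudo snap refresh; fi
```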

One more thing: my understanding is that the Windows terminal sucks? I don’t know why; it just looks bad.

I am sure as hell there are more, but this is what’s at the top of my mind right now. Can I do all this on Windows? Also, give me something that you personally do on Linux but can’t do on Windows.

  • Skull giver@popplesburger.hilciferous.nl

    Pretending it’s not possible on Windows isn’t very helpful either. You can do it, and there are tools out there to do it for you; they’re just worse than the native Windows experience (but still better than the Linux experience in most cases).

    Here is your alternative shell or here is an alternative. Here is your tiling WM. Here is your BTRFS driver, signed and all.

    Windows users expect higher quality software than Linux users when it comes to usability; that’s the missing part. Not “the APIs haven’t been implemented yet”, but “only existing Linux users will be able to live with the state of these tools”.

    • h3ndrik@feddit.de

      Wow. Didn’t know this stuff existed. Seems a bit complicated to me: installing all that additional stuff, poking around in the registry to make it work, necessary steps on the command line of the kind Windows users used to despise… But it’s not my world.

      I’m not sure if things like that really replace, for example, the compositor, or are just a layer on top, or somewhere in between. But I’m at least surprised people do that kind of stuff on Windows.

      • Skull giver@popplesburger.hilciferous.nl

        The unfamiliarity of the tools is exactly what’s keeping Windows users on Windows, and why you’ve never heard of any of them. There’s a lot of crazy stuff out there for every platform (ever run GNU on Apple’s open source kernel? Because you can!).

        Windows only has the Windows compositor. On Linux the entire system is more complicated because its GUI is based on a mainframe-style GUI protocol design from the 70s and 80s, but with Wayland the complexity has been reduced significantly, leading to various environments where the window manager is also the compositor.

        Windows is the most popular operating system in the world, especially in business, and its backwards compatibility for closed source programs beats every competitor. Not even Linux and the BSDs can run executables from the 90s without pulling in tons of old libraries or using wrappers.

        If all I wanted was some cool graphics and to run Photoshop every now and then, Windows would be my way to go. Microsoft has driven me to open source with their enshittification since Windows 8, but huge parts of the Windows code base are works of art in terms of flexibility and functionality.

        It’s easy to forget how customisable Windows can be if you’re only using Windows for a VM every now and then, but if it’s your daily driver and you approach it the same way a distro hopper might, you can get some pretty exotic Windows configurations that you’d struggle to replicate in Linux (without WINE, of course!).

        This is also one of the reasons I’m annoyed with the taskbar rewrite. I don’t care too much about the start menu placement and the shape of the start button, but seeing decades of explorer.exe shell integrations being broken is rather sad. It’s the next big step in taking out the old, customisable Windows and introducing the new, appified, rigid Windows.

        • h3ndrik@feddit.de

          Well, I think I don’t agree with some points. My personal experience is a bit different.

          more complicated […] GUI protocol design from the 70s and 80s

          While we all had annoying situations with the proprietary Nvidia drivers… I had my fair share of fun with X. It managed to stay a viable product from the 80s to today, from big mainframes, desktop computers, and laptops to smartphones and embedded devices. That is a crazy long lifetime, a real achievement, and an impressive ability to scale. I once set up an internet cafe with one PC and a four-seat multiseat setup, we fooled around with Synergy at the university, and I used its network abilities for some time to run applications on different computers but display the UI on my monitor, both for maintenance and for the kind of thing people use Steam Remote Play or Remote Desktop for nowadays. My memories aren’t “it’s complicated” but “amazing piece of software”. But I’d agree: maybe the time has come to retire it. And it is a large and complicated piece of software.
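          In case anyone hasn’t seen that network transparency in action, here is a small sketch of what it looks like today; the host names are placeholders, and it assumes the remote sshd has X11 forwarding enabled.

          ```bash
          # Run a program on a remote machine, display its window on the local X server.
          # Assumes the remote sshd has "X11Forwarding yes" and xauth installed.
          ssh -X user@remote-host xclock     # the clock window appears on the local screen

          # The old-school way without SSH (insecure, LAN only, and most modern
          # distros no longer let the X server listen on TCP by default):
          #   local machine:   xhost +remote-host
          #   remote machine:  DISPLAY=local-host:0 xclock
          ```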

          Windows […] backwards compatibility […] beats every competitor

          You’re able to execute old binaries mainly because they linked the libraries statically instead of dynamically. With Linux, dynamic linking has been more popular because it has other benefits. And that is the main reason why there are differences: it’s not a feat of the OS, but of how the executable is linked. I think you’re able to do the same thing on most operating systems. And in fact many proprietary programs that run on Linux are statically linked, and so are the games I get on Steam. (There are limitations, however. Once they swap out the sound system or replace the UI toolkit, you’ll need a compatibility layer or have to adapt your software. And those things have happened. But compatibility layers exist too.)
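          If you want to check this for a given Linux binary, a quick sketch (the binary name is a placeholder):

          ```bash
          file ./mygame     # prints "statically linked" or "dynamically linked"
          ldd ./mygame      # for dynamic binaries, lists the shared libraries it needs;
                            # prints "not a dynamic executable" for static ones
          ```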

          Other than that, I don’t think it’s even true. I once had to install a Windows server, and some piece of software needed the ‘ASP.net Something 4.0’, and then the next thing required the ‘Something Redistributable C++ 3.5’, and I remember once installing things like that for some old games. I thought those came with the Service Packs, but it was a major ordeal to get the thing running with a mix of somewhat older custom software and some other current software that it needed to tie into. So the Windows people also regularly pull in tons of old libraries.

          I don’t do Windows gaming myself, but people told me old games from the XP or Windows 7 days sometimes don’t run on 10 or 11.

          And not to be annoying or anything… but my every-day experience is stories like this: my father-in-law calls and tells me some banking device that is required to make bank transfers is dropping support for his Windows version. I have a bad memory; maybe they required him to use a new device and that wasn’t supported on his old Windows 8.1 machine. Doesn’t matter, he needs Windows 10. But although the computer is still kind of okay, the CPU is a tiny bit too old and not supported. So we buy a new laptop. And now *drumroll* the all-in-one printer won’t work because it’s suddenly too old and HP doesn’t do Windows 10 drivers, because they want to sell new printers instead. Whereas I once bought a super old second-hand b/w laser printer for 10€ incl. toner and used it for 8 more years, and I bet it’s still supported on Linux today. Next thing, the laptop updates to Windows 11 and I get to spend yet another day fixing the software that updates the Garmin, two other programs he needs, and the mayhem the antivirus caused…

          So while I applaud Microsoft for maintaining some old APIs in their UI toolkit, it doesn’t do me any good in real life. So don’t teach me about backwards compatibility. It’s the same with their office suite and them deliberately making something in the Word document file format incompatible every few years so everyone needs to upgrade, including affecting me. I’m sure they can stop now because everything has become a subscription model anyway. Personally I don’t care, but if it’s in a professional setting, I want the documents and slideshows to look as intended by the author. Please, without them forcing me to buy Windows plus the most recent Office subscription, plus a new CPU and a complete set of new printers and peripherals.

          I think I’m just getting old myself. And now I start to get what some people have been telling me. I sometimes can’t be bothered to figure out things. How to customize a product I don’t like in the first place. And I don’t want it customizable in theory. I want my shit to be the way I’m used to. I don’t want a different Start button. A new hideout for the button to shut down the thing. And then it doesn’t even shut down properly but does some magic that interferes with dual-boot. And a Ribbon-Interface for office that makes me learn how to do the modern version of File->Print. I don’t like that. Worst thing is, at some time LibreOffice will adapt. I think they already changed icons at some point. And my Linux is also starting to do silly stuff in the background. Look for updates and whatever is using up all the extra hundreds of megabytes of RAM. And change the traditional way of handling software packages and introduce 5 package managers. Do updates on startup or shutdown…

          I say sometimes… I’m also for technological progress.

          I kind of also stay with my Linux distro because of familiarity. But I promise I’m not close-minded. Once Linux starts displaying ads in the start menu and tracking my every move to sell off my private data, and Windows becomes the ethical and free (as in user-freedom) alternative that is faster, has the better technology stack and the superior interface design… I’m going to put in the effort, learn everything that has changed since XP/2000, and switch.

          • Skull giver@popplesburger.hilciferous.nl

            You’re able to execute old binaries mainly because they linked the libraries statically instead of dynamically

            Not exactly. Most Windows libraries are a) guaranteed to be on the system and b) always present in a backwards compatible form. The Windows folder is full of multiple versions of the same DLL that get linked dynamically to make sure programs keep working. To the application, it looks as if the exact dependency the program requires has been installed, as if it’s still 1993.

            Some toolkits (like MFC) are compiled statically more often, but Windows is very much dynamically linked in most cases.

            The redistributable problem is an annoying one, that I will admit. However, that’s part of the trickery Windows pulls: you can install MSVC++ Redist 1.2.34 and MSVC++ Redist 1.2.35 into the “same” location and both will work where necessary. On Linux, your /usr/lib/whatever.so needs to be the right version (or may need some kind of versioned symlinking solution if you manually install old versions). Microsoft’s solution is much more complicated, but also makes things possible that are VERY hard to pull off on Linux.
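            For reference, the versioned symlinking on Linux usually looks something like this (zlib as the example; the exact version numbers vary by distro, and the unversioned dev symlink only exists if the -dev package is installed):

            ```bash
            ls -l /usr/lib/x86_64-linux-gnu/libz.so*
            # libz.so        -> libz.so.1.2.13   (unversioned dev symlink, used at build time)
            # libz.so.1      -> libz.so.1.2.13   (soname symlink, what programs load at run time)
            # libz.so.1.2.13                     (the actual library file)
            ```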

            The Windows 8.1 to 10 upgrade path was pretty much flawless. There were a few very rare ARM and Intel Atom CPUs that barely ran anything at all that got dropped, but in general Windows 10 reduced requirements rather than increasing them. MS did do a hard cutoff for Windows 11, but that was unprecedented.

            The printer driver issue is manufactured by HP. Windows will happily load a Windows Vista printer driver on Windows 10 (that was what I did last time I needed to interact with a HP printer). The biggest challenge is the 64-bit-versus-32-bit problem; Windows really wants 64 bit drivers on a 64 bit kernel but shit companies like HP didn’t make those for many versions. If the printer broke in Windows 10, it was probably a vulnerable, buggy mess for ages before.

            I honestly can’t remember the last time Office broke compatibility. Back when doc switched to docx (thank god) there was a breaking change (with a bunch of free conversion plugins available from Microsoft’s website) but Word 2007 files will open just fine on Office 365 in my experience.

            As a comparison: I needed to run some executable compiled for Ubuntu 14 on Ubuntu 20 at some point (or mess around with a VM). Naturally, I installed the i386 support libraries and tried to run it. That failed because it needed 30 different libraries that were all different versions. I managed to get it to work mostly (until it hit a kernel incompatibility) by downloading every dependency, and every dependency of a dependency, for Ubuntu 12 and putting them in an LD_PRELOAD directory.
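            For anyone who needs to pull the same trick, the rough shape of it (the package names are placeholders; LD_LIBRARY_PATH is the directory-based cousin of LD_PRELOAD, which takes individual .so files):

            ```bash
            mkdir -p ~/oldlibs
            # Extract the old distro's packages without installing them system-wide:
            dpkg -x libfoo1_1.0-1_i386.deb ~/oldlibs
            dpkg -x libbar2_2.3-4_i386.deb ~/oldlibs
            # Point the dynamic linker at the extracted libraries and run the old binary:
            LD_LIBRARY_PATH=~/oldlibs/usr/lib/i386-linux-gnu ./legacy-binary
            ```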

            On the other hand, on Windows I ran setup.exe on a game from 2000 and it installed and played fine after a few false positives about compatibility mode. It’s no wonder that most games Steam lists as running on Linux are actually compiled for Windows and run under WINE.

            I honestly believe that the Windows kernel is better than the Linux kernel, from a technical point of view, especially from a security point of view. However, the Windows user interface gets worse by the month. More ads, more tracking, more bullshit. My perfect OS would be the Windows 7 UI on top of the Windows 11 kernel. Windows 8 killed the Windows experience for me. Windows 11 was better than 10, which was better than 8.1, which was better than 8, but all were a worse experience than Windows 7.

            Windows is being developed for the next generation. The next generation has grown up with iPhones and iPads, and everything needs to be dumbed down, with bright, big buttons, and of course touch friendly. Microsoft has also given up on letting the user make the right choices, and switched to protecting the user from their own idiocy. And, to be honest, for most people, I don’t disagree with them; before Windows forced updates, my impression was that the average Windows computer was about two to three months behind when it came to critical security updates.

            • h3ndrik@feddit.de

              Thanks for the insight. At some point I need to read up on the Windows kernel, for example. All I ever read is articles like “Linux performance gains with chiplet-based CPUs with multiple L3 caches” and crazy stuff like that. I don’t even know how Microsoft’s kernel development works, or if and when they adapt to updated CPU architecture designs etc. All I know about the Windows kernel is a few simple facts from uni: that it has been a microkernel (or hybrid?) since NT, and that this is also when they implemented proper protection rings so it wouldn’t crash as often as 98 and ME. And that they made some good design choices and copied the TCP/IP layer from BSD (and presumably replaced it since). Guess I also sometimes live in my own little bubble. I’m pretty sure lots of things were re-designed when we switched from 32 to 64 bit. I’m going to look it up. But it seems to be difficult (for me) to find articles like the one I linked above, only for Windows. And the Wikipedia articles are also very short.

              Regarding the libraries: how do they do the backwards compatible form? I had a quick look at the Windows/System32 folder of my Windows 10 install, and I saw maybe 10 files with version numbers in the filename out of 5000 files. The rest is just something.dll.

              But I guess we’re talking about different libraries anyways. I thought about something like SDL, the game’s physics library, the QT5 libraries for your platform-independent application. Zlib for your compressed game assets. When I have a look at my Windows Steam directory, they’re all somewhere inside, along with the game and not bundled with Windows. When I have a look at my Linux Steam directory, it’s the same. When I have a look at any of the other applications on my Linux system, it’s different. My distribution has a package manager and the software requires the system to provide all the compression libraries, game engine … They’re not bundled with every single game but provided by the package manager. That’s what I was going for. Sure there’s also a C standard library and system libraries that are tied more tightly to the operating system itself.

              I know how the linux library symlinking works. It’s alright for many cases and it lets you have multiple versions installed at the same time. But it has its caveats. And it’s just not as easy in general. It’s just a hard problem. Your specific library version has dependencies on other specific and maybe older versions. Configurations can get version-specific. Sometimes things break or can’t be done in a backwards-compatible way. There are security vulnerabilities that need to be fixed, back-ported and whatever. And everything is just a convoluted mess. Shared library versioning is hard. And the way Linux does it isn’t optimal. Actually there is lots of room to improve. Especially since we’re designing and using things differently from back then when this was a good decision. But I don’t think Flatpak or putting everything into Docker containers is the best solution, either.

              Another thing I don’t specifically like is the way software is spread across the system. I think I like the way it’s supposed to work on MacOS, where you simply drag one application folder onto your disk and that ‘installs’ it. Everything is contained within, and it can simply be removed without residue by deleting that directory. But I guess it’s also not that easy, because sometimes different pieces of software share resources, and it also has benefits to have your system config in one central directory.

              Is there a ‘perfect’ solution to organize applications and libraries?

              With the hard cutoff for Windows 11, you’re right. I got a bit confused with the numbers and the order of events. But the story is true nonetheless. I even tried different versions of printer and scanner drivers, tried to extract files and feed INF files to windows, tried to learn about unified printing. But to no avail. And somehow I’m always affected by the Windows printer woes with my relatives. My mom once installed one of those quarterly (or half-yearly?) update packages and it made her Brother printer stop working for almost two months until something happened like another patch or something. I don’t know if that was related to Print Nightmare, kind of fits judging by the time but I wasn’t involved in helping my mom back then.

              All I know is that my printer works. Sometimes it chews on a letter or complains about the counterfeit ink, but that’s it. I remember times where I needed to wrestle with some old and badly written Brother Linux drivers myself. That certainly wasn’t fun. But that’s like 13 years ago. I don’t know if the (proprietary) printer drivers for Linux have improved since then or if I just made better choices when buying hardware.

              The Office compatibility thing has been an issue since I went to school. It’s not that it stops working entirely or becomes completely unusable. It’s just that it’s a bit off: things get out of place, a sentence or element vanishes into thin air, or the bullet points change color or shape. Something like that. I always make sure to have exactly the correct fonts installed so everything could be rendered pixel-perfectly, but it won’t happen reliably. And you’re perfectly right that you can absolutely open old files with the newest, most recent version. That’s not the point. They want the other person to upgrade. You and all of their suppliers now use Office 365, and the company using Word 2007 is now unable to do business. That’s the other way around; that’s how this peer pressure works.

              Regarding the Ubuntu compatibility: yeah, that’s tough sometimes. At some point you’ll end up with things like an incompatible libc, or a major clash of pretty much all of the library versions. Glad you figured out you can supply libraries yourself. I’ve done stuff like that. I guess it’s kind of a rare situation on Linux that you can’t up- or downgrade or install libraries alongside, and also have a fixed binary without access to the source code. So if you manage to get into that situation somehow, you’re mostly on your own and it’ll take some time to figure it out. It’s not uncharted territory, but it’s not fun either. I’ve also maintained super old Suse machines that didn’t even understand NTFS because it wasn’t invented back then, and some RHEL, and ‘frankensteined in’ some stuff. Nowadays, on more recent machines, I’d most of the time just put old software into a container that contains the old distribution with all the old libraries, and be done with it.

              I don’t have my old PC game CDs anymore. I always wanted to try some classics from my childhood, like Midtown Madness 2, Need for Speed or Command and Conquer. I guess I can download them or ask some of my friends if they kept them around, and see if they still work on my Windows 10 laptop. I don’t even have an optical drive anymore…

              I think my perfect OS is a bit more sci-fi. I like the idea of convergence, and maybe some unified system with everything tied together. I’ve always been amazed by the way computers work in sci-fi. For example, on Stargate they have something on their tablet and then swipe it up and show it to everyone on the big screen of the control center, or swipe their results to the side and they get transferred to the console on the right where the other scientist starts working on them. I know that things like that exist in the server world, where you can pause VMs and transfer them onto another cluster, ‘hot’. But damn would that be useful at home. I could work on something, make amendments on my phone, then transfer it onto the beamer to show it to a co-worker. Start an audiobook on the train, and at home it would automatically transfer to the stereo. Same with YouTube videos that I could show to my partner or grandma easily on their TVs. I know stuff like this exists. But it’s all very specific, works with one kind of media, and never across different vendors. And instead of doing this in a similar, abstract and transparent way, each piece of software needs its own way of synchronizing data in the background and storing it on one server to simulate something like walking from your laptop to your PC and resuming work (or collaborating). And all the instant messengers are incompatible.

              There are ‘distributed operating systems’ like Plan 9 that have interesting concepts, but they never took off. And today, technology gets more and more interconnected. We need concepts like edge computing, data available 24/7 from everywhere. And everyone has like 5 things that are basically a computer, and 10 IoT devices.

              And I’d like to come home, throw my laptop into the dock, and then all data is backed up and available. I can resume the video that I started on my phone and transfer it to the big screen in the living room, and ask my smart home to look into my partner’s calendar and make an appointment with them and my brother-in-law, and then sit down at a monitor to play a game. Please make everything update itself so it won’t get hacked and I don’t have to worry about keeping 5 computers, two routers and 15 IoT devices up to date and being involved in the maintenance, and make it interconnect across vendors and platforms. And just ‘restore’ on the next device I’m going to buy.

              I think more important than the OS itself is the platform that goes with it, because that’s the thing we interact with.

              I’m kind of a stubborn person and I also like my freedom and like to stay in control. So this interconnected and distributed operating system would need to satisfy the four freedoms of free software. I personally won’t submit into an ecosystem like the golden cage which Apple sells.

              Yeah. Media literacy and computer skills have taken a hit. And this has and will have consequences. I hope people realize we don’t always have to settle for some least common denominator.

              • Skull giver@popplesburger.hilciferous.nl

                The proprietary nature of Windows keeps many of its tricks under wraps. I think many people read the detailed blogs and news articles Linux inspires and come away thinking Linux is the obviously superior piece of technology. You used to get tons of books about the Windows internals, but most people writing those books have been hired by Microsoft, so you get fewer of them these days.

                Windows 9x was absolutely unstable and Windows NT has been the victim of shitty developers. Even in the 9x days, very few crashes came from products written by Microsoft itself in my experience. Linux has the same issue: see the Nvidia driver in just about any facet of computing, from power management to simply providing a basic API for things like Wayland. Windows XP is where Microsoft discovered the problems with their 90s kernel design, and in Vista they fixed a lot of them (user-mode GPU drivers, user-mode USB drivers, all that jazz! The entire GPU driver can crash and all you notice is a quick black screen and programs using hardware acceleration freezing). Vista was obviously released before it was finished, but 7 is where these features began to shine. The 32/64 bit boundary was actually not that big a problem, because Windows already ran on 64 bit all the way back in the 90s. It took a few years for Linux to be available on as many architectures as Windows was.

                The backwards compatibility thing is mostly accomplished by putting version numbers in the DLLs and their loading mechanism, and using hot patching through the compatibility layer to put workarounds for programs that don’t support them. Microsoft has a vast database of old executables with custom-developed patches and workarounds at the API level for the most important software. For all other software, multiple versions of the DLL are loaded into memory and applications get linked to the version they desire.

                Inside a Windows install, you’ll find a huge WinSxS folder. This side-by-side assembly does a lot of the heavy lifting when it comes to versions and compatibility. It uses (or used to use) a lot of hard links, so Windows Explorer would get very confused about the size. MS stopped using WinSxS for their own redistributables, but there are still plenty of system libraries stuck in the system.

                For redistributables, Microsoft has switched to the same system Linux has: versioned library files. Unlike most Linux distros, though, you can keep many versions installed next to each other. Sadly, many Linux programs link to the unversioned .so, which means breakages galore whenever you update, and old versions of a library will often fight for the right to be the “real” libabcdef.so. Nobody can decide how to solve the versioning issue in Linux because there is no central authority for this stuff, and that’s part of the reason why it’s so hard to run old executables.

                The new fix, just shipping every library with every application, as you stated, isn’t great either. Flatpak does a lot of file deduplication (much better than Docker’s image-level deduplication) so it’s not as bad as it seems, but I’d still rather have the libraries provided by the OS itself in use than have a second copy of the OS especially for running GIMP.

                The thing about tools like Qt and zlib is that Windows comes with all that stuff out of the box, and has done so for decades. There’s no need to track down the right version of libcurl and libgzip because the target OS has all the functionality you need. Qt is replaced by MFC/ATL/Win32, libgzip is replaced by the Windows compression API, DirectX and its many shapes replace SDL and a bunch of GPU libraries, the list goes on for ages. Almost any Linux package beginning with lib.* has a Windows equivalent in the standard API that’s guaranteed to work the same for a couple of years at least. You don’t need to deal with a package manager to update them, because Windows Update does that for you. That’s why Windows doesn’t need a separate package manager for 99% of the binaries apt/dnf/pacman pull in. They still pack winget with Windows 11, though (it’s available but not enabled by default in Windows 10)

                All configuration happens in the registry. Gnome tried to do the same, which is where dconf comes from, and it’s honestly even better. Microsoft was sick and tired of Win3.x ini files everywhere on disk and set up the registry as a central key/value store. It lacked documentation at the time of invention and to this day you’ll need MSDN to find out what all of the settings mean, but on the other hand you only get one single configuration file for your entire system. This forces programs and the OS itself to remain backwards compatible and write migration tools rather than just changing up the config file type.
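                As an aside, both stores can be poked at from the command line; a small sketch, with example keys that may differ on your setup:

                ```bash
                # GNOME's dconf/gsettings central key/value store:
                gsettings get org.gnome.desktop.interface gtk-theme
                gsettings set org.gnome.desktop.interface gtk-theme 'Adwaita-dark'
                dconf dump /org/gnome/desktop/interface/     # dump a whole subtree

                # Rough Windows counterpart, run from cmd/PowerShell (shown for comparison):
                #   reg query "HKCU\Control Panel\Desktop" /v Wallpaper
                ```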

                The software distribution problem is actually something Microsoft is trying real hard to solve. The Windows Store can serve any application (not just UWP stuff) but users don’t seem to want to use it. There’s a modest fee required to use it, so a lot of free stuff avoids the store, but it solves the distribution problem the same way the Mac app store does, and it even includes Android apps for good measure. Store applications are kept inside their own little sandbox just like apps on other platforms. The solution just doesn’t seem to stick.

                On the other hand, Linux and MacOS do introduce annoying problems like “where do I put user only programs”. MacOS has executable folders to make your life a bit easier, but you still end up with a folder full of random stuff. Linux has the XDG standard for storing stuff in ~/.local and ~/.config, but when you try to use that you’ll find many programs don’t even bother to try to follow the XDG spec because they decided it’s wrong and nobody is forcing their hand. The Windows solution (put everything in program files) isn’t amazing either, but it works in most cases.

                Another annoyance is the lack of programs using MSI files for installers. An MSI installer can be written declaratively, with install and uninstall steps for adding and removing files, and it even allows for repairing damaged programs, but everyone wants to write their own self-extracting installer. This happened in Linux too, where Oracle wrote a binary installer for Java and the VirtualBox guest drivers, and Nvidia still has a weird installer on their website. The APIs and abilities are all there, but programmers choose not to use them, either out of ignorance or out of laziness. Furthermore, Linux has AppImages these days, which behave like MacOS’ executable folders in a way and can be put in an Applications folder if you want. I store them in ~/.local/bin myself.
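                The whole AppImage “install” is just this (the file name is a placeholder), which is part of the appeal:

                ```bash
                chmod +x ~/Downloads/SomeApp.AppImage          # make the single file executable
                mkdir -p ~/.local/bin
                mv ~/Downloads/SomeApp.AppImage ~/.local/bin/  # put it somewhere on $PATH
                ~/.local/bin/SomeApp.AppImage                  # run it; delete the file to uninstall
                ```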

                I think the Android style of app containment is the best solution so far. It combines a rich OS with tons of APIs (similar to Windows) with a sandboxed environment easily managed by the user. It’s not perfect (sharing libraries is possible, but only for apps with the same signature), but I think it’s a good idea. Flatpak and Snap are both trying to solve the problem in a similar way, and perhaps after a few more years of teething issues we’ll end up with a great solution. Having been directed to a Flatpak-only system on the Steam Deck, I’ve honestly changed my mind about the way Flatpaks work, though the many huge Nvidia driver containers still irk me to no end.

                As for your printer issue: I know the feeling, printers just suck. I’ve had as many fights with Apple’s CUPS on Linux as I’ve had with Windows’ shitty drivers. For many printers, just right clicking them and clicking “install” from the network center just seems to work. For others (especially the modern ones) you need stupid workarounds. The ancient laser printers that work flawlessly once the software runs are all dependent on drivers that were buggy when they were released in the 90s and Windows has since fixed a lot of security vulnerabilities in its print system that not every printer driver likes.

                I’ve honestly considered setting up a Raspberry Pi in readonly mode to fix the printer issue on my dad’s laser printer because the HP driver disappeared from the internet, but after extracting it from a backup or the driver CD everything suddenly worked.

                I honestly don’t recognise your Office problems at all. I’ve used every Office version between Office 97 and Office 2016 in incremental fashion, and documents always just seemed to work right for me. You can even interoperate with older versions by specifically saving files for an older version, which many open source alternatives simply can’t do.

                That said, with the HUGE attack surface of Microsoft Office, you really shouldn’t be using versions that are out of date. Just for security reasons alone I wouldn’t want to deal with a company that still used Office 2007 everywhere. Year-branded Office releases receive 5-10 years of updates (depending on the version, they’re clearly marked online) so you know exactly when you should be buying a new license. As a business, the price of Office really shouldn’t be what’s killing you, and the same is true with Windows licenses. They’re investments into tools and business continuity.

                The reliance on Office rather than alternatives is annoying for those using LibreOffice and such, but there simply isn’t a better offline suite available. Office 365 and Google Docs are the way to go these days in terms of MS Office competition, and neither are a great solution. I use the open source stuff, but I can’t deny that it’s worse in almost every way other than the user freedom part, and that includes some security considerations.

                I share your dream of a connected computer ecosystem! I’ve drawn up diagrams of a concept to basically share an application between mobile, desktop, and other form factors, based on the Android sandbox and leveraging the cross platform nature of the JVM.

                • Skull giver@popplesburger.hilciferous.nl

                  (character limit, cont’d) Those few hundred megabytes of state that applications have should be easily shareable over WiFi, and in a perfect world they should be streamable as well. I want a system where I can basically throw my screen onto another and stream what I’m doing, or transfer it when I want to take my work on the go.

                  Sadly, nothing usable for normal computer users is ready for such a system without extreme containerization and sandboxing efforts, but it’s still my dream to one day be able to do this. Plus, you’d need to deal with the different control methods that TVs/laptops/phones require, and I think only GTK has a workable concept for that so far.

                  Microsoft clearly wants to move Windows to the cloud entirely, and that’s one way to implement the “throwable” applications for sure. RDP supports running single applications remotely, so by making everything a client to a mainframe, your entire workflow could be shared uninterrupted, while being able to render specific applications on specific screens only. I’m not happy that this is all cloud-based, but a mainframe-like setup for home computers could work on Linux too if the streaming capabilities are implemented right.

                  The big problem with computers in general is that they’ve become tools. They need to be easy and safe to use for the general public, they’re no longer just cool things to tinker with. Worry-free security comes with vendor updates, and vendor updates come with a lack of freedom unless you buy open source.

                  As for IoT crap, the EU is going to force products, both hardware and software, to get patched, or to get pulled out of shelves. A bunch of open source people protested this bill (they were on the hook for maintenance, mostly because they sell consultancy and support on the side) but I think this law has the capacity to solve the IoT shit problem, or at least start solving it. Stores will think twice about the products they stock when part of their inventory becomes worthless when a company refuses to publish security updates.

              • h3ndrik@feddit.de

                On the user interface or the specifics, I don’t know. They’re all pretty much alike and follow very similar concepts, so I don’t care too much. Sometimes an interesting change pops up, like when Android started using gestures instead of buttons for ‘back’, ‘home’, etc. I have a Thinkpad with those pointing sticks so I don’t need to lift my hands from the keyboard. And I pretty much like the concept of the Gnome desktop, where I just bump the corner or hit the Super key, start typing the initial letters of the name, and hit enter. Other than that it mostly stays out of the way. And I like the command-line interface and configuration files. I think it’s vastly superior to digging through menus and remembering where to click with your mouse. And it has more advantages once you automate stuff, deploy it somewhere, etc. But I think nowadays everyone agrees with that; even all the Windows admins in my circle of friends have started to be glad about PowerShell.