Obviously, a bit of clickbait. Sorry.

I just got to work and plugged my Surface Pro into my external monitor. It didn’t switch inputs immediately, and I thought “Linux would have done that”. But would it?

I find myself far more patient using Linux and de-Googled Android than I do with Windows or anything else. After all, Linux is mine. I care for it. Grow it like a garden.

And that’s a good thing; I get less frustrated with my tech, and I have something that is important to me beyond its technical utility. Unlike Windows, which I’m perpetually pissed at (very often with good reason).

But that aside, do we give Linux too much benefit of the doubt relative to the “things that just work”? Often they do “just work”, and work well, with a broad feature set by default.

Most of us are willing to forgo that for the privacy and sheer customizability of Linux, but do we assume too much of the tech we use and the tech we don’t?

Thoughts?

  • okamiueru@lemmy.world
    3 months ago

    I’ve used DOS and everything from Windows 3.11 all the way to 11. Switched to Linux as my daily driver around 2009. I’ve used macOS at work for over a year now. I occasionally boot into Windows for the rare game that uses some anti-cheat that doesn’t play well with Wine.

    I’m old enough that I just want things to work. I don’t care for any fanboyism. These are my opinions:

    • Windows is a mess. It has UIs from different decades, depending on what you’re looking at and where. The NT kernel is ancient. The registry is a horror show. The only edge it has is third-party software, like proprietary drivers. That’s it. And that isn’t a merit of Windows, but of market share.

    • MacOS is inconsistent at every turn. It’s frustrating to use, riddled with UX bugs, and has a seemingly deliberate lack of functionality. The core tooling, like the file manager, is absolute garbage. The only good thing it has going for it is that the Unix core is solid. In that year, I’ve experienced one soft brick that almost became a hard brick, and the cause was setting the display refresh rate from 120 Hz to 60 Hz. Something I changed, BTW, because certain animation transitions in macOS took twice as long at 120 Hz… Yeah, top-notch QA there, Apple.

    • Linux. It has its own flaws, for sure. But as for “just works”, it happens so often that it’s exactly why Windows and macOS feel so frustrating. I’d have my grandmother use Linux.

    And I’m not just saying this. When I upgraded components on Windows, I spent two hours debugging problems. One of them was that Windows reverted the GPU driver to one where every single piece of version information was unmistakably older. That also left the GPU not working.

    I’ve also had the WiFi adapter not work until I downloaded some proprietary driver software over an Ethernet cable.

    On Linux? I didn’t need to do a single thing in either case. It sure didn’t use to be this way. In 2009 I was hunting WiFi drivers for Fedora over Ethernet. But in the last, say, five years, on Arch, it’s been amazing. Did I mention that I use Arch?

    PS: The last four times I’ve had problems on Linux have been:

      1. A Windows update fucks up GRUB.
      2. A reboot from Windows doesn’t release its hardware claim on the WiFi adapter, so it doesn’t work on Linux.
      3. The system clock is wrong, which was easy to notice because of 2: no WiFi meant no remote time sync. This is due to Windows storing the hardware clock as local time rather than UTC. If you do software development, you know how dumb the former is. (See the sketch after this list.)
      4. A RAID partition destroyed because a Windows 7 install decided, unprompted, to write a boot partition to a disk with an “unknown” file system.
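
    To illustrate point 3: Windows writes local time into the hardware clock (RTC), while Linux by default treats the RTC as UTC, so after booting from Windows into Linux the system clock is off by the whole UTC offset until NTP corrects it (which, per point 2, it couldn’t). A minimal sketch of that mismatch, with a made-up RTC value and a hypothetical UTC+2 timezone:

    ```python
    from datetime import datetime, timedelta, timezone

    # Hypothetical scenario: the machine's timezone is UTC+2 and Windows last
    # wrote 14:00 *local* time into the hardware clock.
    rtc_value = datetime(2024, 6, 1, 14, 0, 0)

    # Linux, by default, assumes the RTC holds UTC...
    linux_reads = rtc_value.replace(tzinfo=timezone.utc)

    # ...but the moment Windows actually recorded was 14:00 at UTC+2.
    actual_moment = rtc_value.replace(tzinfo=timezone(timedelta(hours=2)))

    # Until NTP kicks in, the clock is off by the entire UTC offset.
    print(linux_reads - actual_moment)  # 2:00:00
    ```

    The usual workarounds are either telling Linux to treat the RTC as local time (`timedatectl set-local-rtc 1` on systemd distros) or configuring Windows to store UTC instead.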