Archived link

Polyfill.js is a popular open-source library for supporting older browsers, and 100K+ sites embed it via the cdn.polyfill.io domain; notable users include JSTOR, Intuit, and the World Economic Forum. However, in February this year a Chinese company bought the domain and the GitHub account. Since then, the domain has been caught injecting malware on mobile devices via any site that embeds cdn.polyfill.io. Complaints were quickly removed (archive here) from the GitHub repository.

  • Bertuccio@lemmy.world · 5 months ago

    Whichever editor let them post “100 thousand” should be spanked 100 times with the severed hand of whatever asshole wrote it in the first place.

    • letsgo@lemm.ee · 5 months ago

      What rules can we add that solve this problem? (I’ve tried DDG but didn’t find any results)

      • Supermariofan67@programming.dev · 5 months ago

        This one is already in the default uBlock Origin filters (“Badware risks”).

        I also strongly suggest adding https://big.oisd.nl/ as a filter list. It’s a large and well maintained domain blocklist (sourced from combining lots of other blocklists) that usually adds lots of these sorts of domains quickly and has very few false positives.

        If you want to take it even further, check out the Pro list and the Threat Intelligence Feeds list here: https://github.com/hagezi/dns-blocklists

        These can all be added to a Pi-hole too, if you use one.
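        For reference, blocking the domain yourself is a one-line custom filter in uBlock Origin’s static filter syntax (the same entry works in most ABP-style blockers); the oisd and hagezi lists are simply added as list URLs:

        ```
        ! Block the compromised polyfill CDN everywhere
        ||cdn.polyfill.io^
        ```

        The `||` prefix matches the domain and all its subdomains, and the `^` separator prevents matching longer lookalike domains.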

      • ChilledPeppers@lemmy.world · 5 months ago

        cdn.polyfill.io^ ? By now it was probably already added to the default lists, though…

        (I don’t really understand these things though, so correct me if I’m wrong)

  • dan@upvote.au · 5 months ago

    My favourite part is that the developers that currently own it said:

    Someone has maliciously defamed us. We have no supply chain risks because all content is statically cached

    https://github.com/polyfillpolyfill/polyfill-service/issues/2890#issuecomment-2191461961

    Completely missing the point that they are the supply chain risk, and the fact that malicious code was already detected in their system (to the point where Google started blocking ads for sites that loaded polyfill.io scripts).

    We don’t even know who they are - the repo is owned by an anonymous account called “polyfillpolyfill”, and that comment comes from another anonymous account “polyfillcust”.

    • UnderpantsWeevil@lemmy.world · 5 months ago

      Frustrating that the article doesn’t specify and simply links to a different Github page which doesn’t clearly specify the problem either.

      I have to assume the site’s article was dynamically generated, without any actual tech journalist doing the reporting. The byline is “Sansec Forensics Team”, which doesn’t even link out to the group. The “Chinese company” isn’t named in either the article or the references, which is incredibly shoddy reporting. The archive link is dead.

      This whole page is indicative of the failed state of tech journalism. A genuinely explosive story, but it’s so threadbare and vague that it becomes meaningless.

      • sorter_plainview@lemmy.today · 5 months ago

        The site is Sansec. They uncovered it. They also specify how the malware redirects users to sports betting sites.

  • sunzu@kbin.run · 5 months ago

    NoScript would fix this issue… Deny most of that shit and the internet still works… Mostly

    • dactylotheca@suppo.fi · 5 months ago

      and internet still works… Mostly

      That load-bearing “mostly” is doing a lot of work here.

      I invite everybody to find out how everything “mostly” works if you disable “most of” javascript – also have fun deciding which parts to enable because you think they’re trustworthy

      • valaramech@fedia.io · 5 months ago

        I actively do this with uMatrix - granted, I only block non-first-party JavaScript. Most sites I visit only require a few domains to be enabled to function. The ones that don’t are mostly ad-riddled news sites.

        There are a few exceptions to this - AWS and Atlassian come to mind - but the majority of what I see on the internet does actually work more or less fine when you block non-first-party JavaScript, and some of it even works with all JavaScript blocked. uMatrix also has handy bundles built in for certain things - sites that embed YouTube, for example - that make this much easier.

        Blocking non-first-party scripts like I do does actually solve this issue for the most part, since, according to the article, only bundles served from the cdn.polyfill.io domain itself were the problem.
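        For anyone wanting to replicate that setup, a minimal uMatrix ruleset along those lines might look like this (a sketch from memory of uMatrix’s `source destination type action` rule format; adjust to taste):

        ```
        * * * block
        * * css allow
        * * image allow
        * 1st-party * allow
        * 1st-party frame allow
        ```

        The first rule denies everything by default; the later, more specific rules re-allow styling, images, and all first-party requests.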

        • dactylotheca@suppo.fi · 5 months ago

          You’re still trusting that the 1st party javascript won’t be vulnerable to supply chain attacks, though

    • 9point6@lemmy.world · 5 months ago

      Not a solution. Much of the modern web is reliant on JavaScript to function.

      Noscript made sense when the web was pages with superfluous scripts that enhanced what was already there.

      Much of the modern web is web apps that fundamentally break without JS. And picking and choosing unfortunately won’t generally protect from this because it’s common practice to use a bundler such as webpack to keep your page weight down. This will have been pulled in as a dependency in many projects and the site either works or does not based on the presence of the bundle.

      Not saying this is a great situation or anything, but suggesting noscript as a solution is increasingly anachronistic.

      • parpol@programming.dev · 5 months ago

        I definitely prefer using no-script enabled pages. If it were me, I would prefer a fully non-JavaScript internet with static pages.

        JavaScript introduces so many vulnerabilities that it makes Adobe Flash Player look like a security suite. JavaScript also breaks accessibility features like speech recognition and font size and color control.

        • 9point6@lemmy.world · 5 months ago

          Flash was magnitudes worse than the risk of JS today, it’s not even close.

          Accessibility is orthogonal to JavaScript if the site is being built to modern standards.

          Unfortunately preference is not reality: the modern web uses JavaScript, and NoScript is not an effective enough solution.

          • parpol@programming.dev · 5 months ago

            Flash was containerized, and completely safe until Adobe just stopped supporting it. A million times better than what JavaScript has become in terms of privacy. There’s a reason NoScript is bundled with Tor Browser.

            And preference is definitely a reality. It is niche at the moment but I see a future where more and more people see JavaScript for what it is. Bloat.

            • 9point6@lemmy.world · 5 months ago

              Flash ran as a browser plugin (as in not an extension, but a native binary that is installed into the OS and runs beside the browser, we basically don’t do this for anything now)

              Flash was pretty much on weekly security bulletins in the final years, arbitrary code execution and privilege escalation exploits were common, that’s why Adobe killed it.

              Flash was never safe and comparing JavaScript to it as a greater risk shows you’ve not fully understood the threat model of at least one of the two.

              • parpol@programming.dev · 5 months ago

                We still use plugins. In fact, you most likely have one installed right now for video decoding. JavaScript not being a plugin is the reason we only have two major browser engines, Chromium and Gecko. JavaScript prevents new browsers from entering the ecosystem because of how hard it is to implement, unlike how easy it would have been as a plugin.

                Flash had vulnerabilities because of neglect from Adobe. The core design of Flash, and its earlier stages made by Macromedia, was great. It had a sandboxed environment, and later it was even integrated into a browser sandbox just like JavaScript, eliminating most vulnerabilities.

                Flash was very limited in the malicious code it could run, as opposed to JavaScript, which can automatically redirect you to malicious websites, install tracking cookies, use the browser canvas for tracking and fingerprinting, freeze your entire browser, take control of your cursor, look at your clipboard, and collect enough information about you to completely identify and track your footprint over the entire internet.

                Flash couldn’t access your clipboard or files unless you clicked allow every time, couldn’t access anything outside of its little window, and if it froze, the browser was mostly unaffected, and flash had almost no ability to collect any data about your browser.

                • 9point6@lemmy.world · 5 months ago

                  That’s literally the one main somewhat valid use case for plugins, and it’s basically because of DRM. A plugin that allows arbitrary code to run is a security nightmare, that’s why we don’t do it anymore.

                  A lot of the security features you describe were added by browser vendors late in the game because of how much of a security nightmare flash was. I was building web software back when this was all happening, I know first hand. People actually got pissy when browsers blocked the ability for flash to run without consent and access things like the clipboard. I even seem to remember a hacky way of getting at the filesystem in flash via using the file upload mechanism, but I can’t remember the specifics as this was obviously getting close to two decades ago now.

                  Your legitimate concerns about JavaScript are blockable by the browser.

                  Flash was a big component of something called the evercookie—one of the things that led to stuff like GDPR because of how permanently trackable it made people. Modern JavaScript tracking is (quite rightfully) incredibly limited compared to what was possible with flash around. You could track users between browsers FFS.

                  You’re starting to look like you don’t know what you’re talking about here.

          • parpol@programming.dev · 5 months ago

            Accessibility is orthogonal to JavaScript if the site is being built to modern standards.

            In other words, accessibility is in the hands of the developers, not the visitor. And the developer really wants that scrolling background and non-selectable text, so tough luck, people with no hands, I guess.

            • 9point6@lemmy.world · 5 months ago

              Well, by that measure, you don’t need JavaScript to make inaccessible sites, there are plenty of sites out there that ruin accessibility with just HTML and CSS alone.

              It’s always up to the developer to make sure the site is accessible. At least now it seems to be something that increasingly matters to search result rankings.

              • parpol@programming.dev · 5 months ago

                You really can’t. If it were only HTML and CSS, any accessibility program would be able to select any part of the page and easily alter the CSS and HTML. That is next to impossible now because of JavaScript.

                It shouldn’t be up to the website developer. It should be up to the browser developer. You don’t blame a lemmy instance for poor accessibility with Jerboa.

  • dan@upvote.au · 5 months ago

    Reposting my comment from GitHub:

    A good reminder to be extremely careful loading scripts from a third-party CDN unless you trust the owner 100% (and even then, ownership can change over time, as shown here). You’re essentially giving the maintainer of that CDN full control of your site. Ideally, never do it, as it’s just begging for a supply chain attack. If you need polyfills for older browsers, host the JS yourself. :)

    If you really must load scripts from a third-party, use subresource integrity so that the browser refuses to load it if the hash changes. A broken site is better than a hacked one.
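    To illustrate, the integrity value is just a base64-encoded digest of the exact file bytes. A minimal Node.js sketch (the file contents and CDN URL here are stand-ins):

    ```javascript
    // Compute a Subresource Integrity (SRI) value for a script you pin.
    const crypto = require('crypto');

    function sriHash(content, algo = 'sha384') {
      // SRI format: "<algo>-" + base64(digest of the file bytes)
      const digest = crypto.createHash(algo).update(content).digest('base64');
      return `${algo}-${digest}`;
    }

    // The value goes into the script tag, e.g.:
    // <script src="https://cdn.example.com/lib.min.js"
    //         integrity="sha384-..." crossorigin="anonymous"></script>
    console.log(sriHash("console.log('hi');"));
    ```

    If the CDN ever serves different bytes, the hash no longer matches and the browser refuses to execute the script.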


    And on the value of dynamic polyfills (which is what this service provides):

    Often it’s sufficient to just have two variants of your JS bundles, for example “very old browsers” (all the polyfills required by the oldest browser versions your product supports) and “somewhat new browsers” (just polyfills required for browsers released in the last year or so), which you can do with browserslist and caniuse-lite data.
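    As a sketch of that two-bundle approach (the section names and queries here are illustrative, not prescriptive), Browserslist supports multiple environments in a single `.browserslistrc`:

    ```
    # Select a section at build time, e.g. BROWSERSLIST_ENV=modern
    [modern]
    last 2 versions and not dead

    [legacy]
    > 0.25% or last 4 versions or IE 11
    ```

    The bundler then emits one build per environment, and the page loads whichever bundle matches the visitor’s browser.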

    • Echo Dot@feddit.uk · 5 months ago

      Yeah, I used to be guilty of this. Although in slight defense of myself, I never used random sites like that - I always pulled everything from Google’s CDN, since I can’t see that changing hands.

      They may very well shut it down without warning, but they’re probably not going to sell it to anyone.

      • dan@upvote.au · 5 months ago

        Yeah, it really depends on how much you trust the vendor.

        Google? Say what you want about the company, but they’ll never intentionally serve malware.

        Random company with no track record where we don’t even know who is maintaining the code? Much less trustworthy. The polyfill.io repo is currently owned by a GitHub user called “polyfillpolyfill” with no identifying information.

        Third-party CDNs make less sense these days though. A lot of hosting services have a CDN of some sort. Most sites have some sort of build process, and you usually bundle all your JS and CSS (both your code and third-party code, often as separate bundles) as part of that.

      • dan@upvote.au · 5 months ago

        You’d be surprised how much code people blindly reuse without even looking at it, especially in JavaScript. A bunch of it is from projects owned by random individuals. The JS standard library is ridiculously small, so nearly all JS apps import third-party code of some sort. One JS framework can pull in hundreds of third-party modules.

        It’s much less of an issue with languages like C# and even PHP, where the first-party libraries are often sufficient for building a small or mid-sized app.

  • originalucifer@moist.catsweat.com · 5 months ago

    nah. over 100k sites ignored dependency risks, even after the original owners warned them this exact thing would happen.

    the real story is 100k sites not being run appropriately.

    • douglasg14b@lemmy.world · 5 months ago

      That’s not how systemic problems work.

      This is probably one of the most security ignorant takes on here.

      People will ALWAYS fuck up. The world we craft for ourselves must take the “human factor” into account, otherwise we amplify the consequences of what are predictable outcomes. And ignoring predictable outcomes to take some high ground doesn’t carry far.

      The majority of industries that actually have immediate and potentially fatal consequences do exactly this, and have been for more than a generation now.

      Damn near everything you interact with on a regular basis has been designed at some point in time with human psychology in mind. Built on the shoulders of decades of research and study results, that have matured to the point of becoming “standard practices”.

      • oce 🐆@jlai.lu · 5 months ago

        Ok, people will always fuck up, so what do you do?

        The majority of industries that actually have immediate and potentially fatal consequences do exactly this, and have been for more than a generation now.

        All the organizations (including public ones) getting hit by ransomware and having data stolen: is that because the consequences are not that bad? Is it not gross negligence?

      • rottingleaf@lemmy.zip · 5 months ago

        People will ALWAYS fuck up. The world we craft for ourselves must take the “human factor” into account, otherwise we amplify the consequences of what are predictable outcomes.

        So what does it say about us diverting from purely server-side scripted message boards with pure HTML and tables, and not a line of JS? Yes, let’s get back there please. And no phone numbers.

        The majority of industries that actually have immediate and potentially fatal consequences do exactly this, and have been for more than a generation now.

        Boeing - we know where you’re goeing.

        Damn near everything you interact with on a regular basis has been designed at some point in time with human psychology in mind. Built on the shoulders of decades of research and study results, that have matured to the point of becoming “standard practices”.

        There’s one industry which kinda started like this, with proper HIG and standard key combinations and proven usability with screenreaders or by people with color blindness, autism, ADHD, whatever.

        Then came people talking in a tone similar to (sorry) yours in the “People will ALWAYS fuck up” part, saying that people want nice, dynamic, usable websites with lots of cool new features, that people are social, that they want girls with real photos, names and phone numbers on their forums - which BTW should be called social nets.

        By the way, we already had that with Flash and Java applets; some of the things I remember were still cooler than modern websites of the “web application” paradigm are now. And we had personal webpages with real names and contacts and photos, and there were tools that made them easy to build.

        These people just hated the existing culture with its individualism and depth, the web applications should be able to own you and not be just another kind of embedded content, the personal webpages should be all the same, and of course normies wouldn’t want to come as guests into the nerdspace - no, they had those new social nets as their space, looking down on those nerds and freaks of my kind.

        Now - well, try using today’s web as a person impaired in any way.

        And those normies can’t really use it too, and too feel impaired, they just won’t admit it.

        • efstajas@lemmy.world · 5 months ago

          So what does it say about us diverting from purely server-side scripted message boards with pure HTML and tables, and not a line of JS? Yes, let’s get back there please.

          Ironically, proper SSR that has the server render the page as pure HTML & CSS is becoming more and more popular lately thanks to full-stack meta frameworks that make it super easy. Of course, wanting to go back to having no JS is crazy — websites would lose almost all ability to make pages interactive, and that would be a huge step backwards, no matter how much nostalgia you feel for a time before widespread JS. Also tables for layout fucking sucked in every possible way; for the dev, for the user, and for accessibility.

          people want nice, dynamic, usable websites with lots of cool new features, people are social

          That’s right, they do and they are.

          By the way, we already had that with Flash and Java applets, some things of what I remember were still cooler than modern websites of the “web application” paradigm are now.

          Flash and Java Applets were a disaster and a horrible attempt at interactivity, and everything we have today is miles ahead of them. I don’t even want to get into making arguments as to why because it’s so widely documented.

          And we had personal webpages with real names and contacts and photos. And there were tools allowing to make them easily.

          There are vastly more usable and simple tools for making your own personal websites today!

          • rottingleaf@lemmy.zip · 5 months ago

            Ironically, proper SSR that has the server render the page as pure HTML & CSS is becoming more and more popular lately thanks to full-stack meta frameworks that make it super easy.

            I know. Just the “full-stack meta frameworks” part alone makes any ADHD person feel nausea.

            Of course, wanting to go back to having no JS is crazy — websites would lose almost all ability to make pages interactive, and that would be a huge step backwards, no matter how much nostalgia you feel for a time before widespread JS.

            I disagree. Geminispace is very usable without scripts.

            That’s right, they do and they are.

            Well, then it appears they don’t care for what I need, so I don’t care for what they need. If only one paradigm must remain, then naturally I pick mine. If not, then there’s no problem and I still shouldn’t care.

            And those industry rules I was answering about are about making a thing work for both, even if being less functional.

            Flash and Java Applets were a disaster and a horrible attempt at interactivity, and everything we have today is miles ahead of them. I don’t even want to get into making arguments as to why because it’s so widely documented.

            Sorry, but either you still make an argument or this isn’t worth much.

            For me it’s obvious that embeddable cross-platform applications as content inside hypertext are much better than turning a hypertext system into some overengineered crappy mess of a cross-platform application system.

            The security issues with Flash and Java applets weren’t much different from those in the other parts of a web browser back then.

            There are vastly more usable and simple tools for making your own personal websites today!

            I ask you for links and how many clicks and fucks it would take to make one with these, as opposed to back then. These are measurable, scientific things. Ergonomics is not a religion.

            • efstajas@lemmy.world · 5 months ago

              I know. Just the “full-stack meta frameworks” part alone makes any ADHD person feel nausea.

              But why? What’s bad about this?

              I disagree. Geminispace is very usable without scripts

              That’s great, I’m not saying that it’s impossible to make usable apps without JS. I’m saying that the capabilities of websites would be greatly reduced without JS being a thing. Sure, a forum can be served as fully static pages. But the web can support many more advanced use-cases than that.

              If only one paradigm must remain, then naturally I pick mine. If not, then there’s no problem and I still shouldn’t care.

              So you can see that other people have different needs to yours, but you think those shouldn’t be considered? We’re arguing about the internet. It’s a pretty diverse space.

              For me it’s obvious that embeddable cross-platform applications as content inside hypertext are much better than turning a hypertext system into some overengineered crappy mess of a cross-platform application system.

              Look, I’m not saying that the web is the most coherent platform to develop for or use, but it’s just where we’re at after decades of evolving needs needing to be met.

              That said, embedded interactive content is absolutely not better than what we have now. For one, both Flash and Java Applets were mostly proprietary technologies, placing far too much trust in the corpos developing them. There were massive cross-platform compatibility problems, and neither were in any way designed for or even ready for a responsive web that displays well on different screen sizes. Accessibility was a big problem as well, given an entirely different accessibility paradigm was necessary within vs. the HTML+CSS shell around the embedded content.

              Today, the web can do everything Flash + Java Applets could do and more, except in a way that’s not proprietary but based on shared standards, one that’s backwards-compatible, builds on top of foundational technologies like HTML rather than around, and can actually keep up with the plethora of different client devices we have today. And speaking of security — sure, maybe web browsers were pretty insecure back then generally, but I don’t see how you can argue that a system requiring third-party browser plug-ins that have to be updated separately from the browser can ever be a better basis for security than just relying entirely on the (open-source!) JS engine of the browser for all interactivity.

              I ask you for links and how many clicks and fucks it would take to make one with these, as opposed to back then. These are measurable, scientific things. Ergonomics is not a religion.

              The idea that any old website builder back in the day was more “ergonomic” while even approaching the result quality and capabilities of any no-code homepage builder solution you can use today is just laughable. Sorry, but I don’t really feel the burden of proof here. And I’m not even a fan of site builders, I would almost prefer building my own site, but I recognize that they’re the only (viable) solution for the majority of people just looking for a casual website.

              Besides — there’s nothing really preventing those old-school solutions from working today. If they’re so much better than modern offerings, why didn’t they survive?

              • rottingleaf@lemmy.zip · 5 months ago

                But why? What’s bad about this?

                What I said, literally.

                But the web can support many more advanced use-cases than that.

                Which can be done with something embeddable, and not by breaking a hypertext system.

                So you can see that other people have different needs to yours, but you think those shouldn’t be considered? We’re arguing about the internet. It’s a pretty diverse space.

                If those people don’t consider mine, then I don’t consider theirs. If I must consider theirs, they must consider mine.

                Look, I’m not saying that the web is the most coherent platform to develop for or use, but it’s just where we’re at after decades of evolving needs needing to be met.

                That says nothing. It’s a market/evolution argument. Something changes tomorrow and that will be the result of evolution. Somebody uses a different system and that’s it for them.

                That said, embedded interactive content is absolutely not better than what we have now. For one, both Flash and Java Applets were mostly proprietary technologies, placing far too much trust in the corpos developing them.

                And today’s web browsers are as open as Microsoft’s OOXML. De facto proprietary.

                There were massive cross-platform compatibility problems,

                For Flash? Are you sure? I don’t remember such.

                and neither were in any way designed for or even ready for a responsive web that displays well on different screen sizes.

                Nothing was. Doesn’t tell us anything.

                Accessibility was a big problem as well, given an entirely different accessibility paradigm was necessary within vs. the HTML+CSS shell around the embedded content.

                Yes, but applet’s problems in that wouldn’t spread to the HTML page embedding it. Unlike now.

                Today, the web can do everything Flash + Java Applets could do and more, except in a way that’s not proprietary but based on shared standards, one that’s backwards-compatible, builds on top of foundational technologies like HTML rather than around, and can actually keep up with the plethora of different client devices we have today.

                I’ve already said how it’s similar to OOXML. Only MS documented their then-proprietary standard of their proprietary program and made it open, while Chromium is itself open, but somehow that doesn’t make things better.

                And speaking of security — sure, maybe web browsers were pretty insecure back then generally, but I don’t see how you can argue that a system requiring third-party browser plug-ins that have to be updated separately from the browser can ever be a better basis for security than just relying entirely on the (open-source!) JS engine of the browser for all interactivity.

                That’s similar to the Apple walled garden arguments. It’s valuable in areas other than security because of separating power between some browser developer and some plugin’s developer. And fighting monoculture is also good for security.

                Also people still use plugins, still separately updated, which still get compromised.

                Also plugins can be properly sandboxed.

                The idea that any old website builder back in the day was more “ergonomic” while even approaching the result quality and capabilities of any no-code homepage builder solution you can use today is just laughable. Sorry, but I don’t really feel the burden of proof here. And I’m not even a fan of site builders, I would almost prefer building my own site, but I recognize that they’re the only (viable) solution for the majority of people just looking for a casual website.

                Sorry, I still do feel that burden of proof. Because for a static site like in 2002 I’d just export a page from OpenOffice and edit some links, and then upload it.

            • JackbyDev@programming.dev · 5 months ago

              I know. Just the “full-stack meta frameworks” part alone makes any ADHD person feel nausea.

              ??? Please don’t make weird blanket statements like this.

    • ShaunaTheDead@fedia.io · 5 months ago

      One place I worked at recently was still using Node version 8. Running npm install would give me a mini heart attack… like 400+ critical vulnerabilities, and several thousand vulnerabilities overall.

      • corsicanguppy@lemmy.ca · 5 months ago

        Running npm install would give me a mini heart attack

        It should; but more because it installs things right off the net with no validation. Consistency of code product is not the only thing you’re tossing.

        • LordCrom@lemmy.world · 5 months ago

          How else would you get LPAD? Expect me to write 2 lines of code when I could just import a 100 MB library to do it for me?
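          For what it’s worth, the two lines in question really are trivial nowadays; a sketch (padStart has been built into the language since ES2017, no library needed):

          ```javascript
          // A dependency-free left pad via String.prototype.padStart (ES2017).
          function leftPad(value, length, pad = ' ') {
            return String(value).padStart(length, pad);
          }

          console.log(leftPad(7, 3, '0')); // "007"
          ```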

    • Optional@lemmy.world · 5 months ago

      the real story is 100k sites not being run appropriately.

      Same as it ever was. Same as it ever was. Same as it ever was.

      • kautau@lemmy.world · 5 months ago

        Yeah this is just capitalistic business in general. Don’t do anything proactive if it might reduce the bottom line in the short term. Blame others and beg for help when you weren’t proactive. Succeed singularly, fail collectively

      • Warl0k3@lemmy.world · 5 months ago

        I don’t think we have to choose. “Maintain your websites so you don’t get taken advantage of” and “Here’s an example of a major-world-power-affiliated group exploting that thing you didn’t do” are both pretty important stories.