We all know how awful most modern websites are in terms of bloat, JavaScript, and tracking. Not only that, but designing and maintaining a web browser has become such a gigantic undertaking (almost the size of an operating system) that only a few companies have the resources to do it (Google and Mozilla, and Mozilla might not hold on for much longer).

These alternative protocols offer a minimal set of features, and are trying to get back to what the web should’ve been: static content with images, text, and links, with local applications filling the void for anything more complicated than that.

Let's say I wanted a privacy-friendly way to view a page on a news site. I could:

  • Copy the URL of the page
  • Open some tool (or website, anything) and paste that URL.
  • It converts the content at that URL to the privacy-friendly alternative format, and I can view it with my Gopher/Gemini browser (or maybe even a Markdown viewer).

I know there are a few HTML-to-Markdown converters that can handle that last step.
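As a rough sketch of what that last step might look like (assuming Python with the third-party `requests` and `html2text` packages, which aren't mentioned above, just one possible choice of converter):

```python
# Minimal sketch: fetch a page and convert its HTML to Markdown.
# A real tool would also strip boilerplate (nav bars, ads, cookie banners)
# before converting, and would miss any content injected by client-side JS.
import sys

import requests
import html2text


def page_to_markdown(url: str) -> str:
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()

    converter = html2text.HTML2Text()
    converter.ignore_images = False  # keep image references
    converter.body_width = 0         # don't hard-wrap lines
    return converter.handle(resp.text)


if __name__ == "__main__":
    # Usage: python page_to_markdown.py https://example.com/some-article
    print(page_to_markdown(sys.argv[1]))
```

The resulting Markdown could then be read in a local viewer, or further converted to gemtext and served over Gemini/Gopher.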

Does anyone know if this would work?

  • kometes@lemmy.world · 2 months ago

    You are conflating content with delivery protocol. The websites would look the same regardless of protocol.

    • Dessalines@lemmy.ml (OP) · 2 months ago

      I think Gemini and Gopher include both. They don't read HTML / JavaScript, so the pages definitely wouldn't look the same.

      • kometes@lemmy.world · 2 months ago

        How do you think they would look? To get the news web page in your example, you are still going to have to run the client-side scripts required by modern websites and render some type of markup language for layout. It's not like web pages are secretly sitting in pure text on a server somewhere…