• expatriado@lemmy.world · 1 month ago

    Physics Nobel Prize awarded for a computer science achievement; actual physics is having a dry spell, I guess

    • kamenLady.@lemmy.world · 1 month ago

      Beyond recognizing the laureates’ inspirations from condensed-matter physics and statistical mechanics, the prize celebrates interdisciplinarity. At its core, this prize is about how elements of physics have driven the development of computational algorithms to mimic biological learning, impacting how we make discoveries today across STEM.

      They explain the flex at least
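
      For anyone curious what “elements of physics” means concretely here: Hopfield’s network (one half of the prize) is essentially an Ising spin model repurposed as memory, with stored patterns sitting at minima of a physics-style energy function. Below is a minimal sketch of that idea, assuming NumPy and using hypothetical helper names (train_hebbian, recall); it illustrates the concept, not the laureates’ actual code.

      ```python
      import numpy as np

      # Minimal Hopfield network: binary "spins" s_i in {-1, +1}, symmetric
      # couplings w_ij, and an Ising-style energy E = -1/2 * s^T W s.
      # Patterns are stored with a Hebbian rule; recall repeatedly flips
      # spins toward lower energy until a stored pattern is recovered.

      def train_hebbian(patterns):
          """Build the coupling matrix from an array of +/-1 pattern rows."""
          n = patterns.shape[1]
          w = patterns.T @ patterns / n
          np.fill_diagonal(w, 0.0)  # no self-coupling
          return w

      def energy(w, s):
          """Ising/spin-glass energy of state s."""
          return -0.5 * s @ w @ s

      def recall(w, s, steps=100, seed=None):
          """Asynchronous updates: each flip can only lower the energy."""
          rng = np.random.default_rng(seed)
          s = s.copy()
          for _ in range(steps):
              i = rng.integers(len(s))
              s[i] = 1 if w[i] @ s >= 0 else -1
          return s

      stored = np.array([[1, 1, -1, -1, 1, -1, 1, 1]])
      w = train_hebbian(stored)
      noisy = stored[0].copy()
      noisy[:2] *= -1  # corrupt two bits
      print("energy before:", energy(w, noisy))
      restored = recall(w, noisy, seed=0)
      print("energy after: ", energy(w, restored))
      print("recovered stored pattern:", np.array_equal(restored, stored[0]))
      ```

      The statistical-mechanics credit comes from exactly this framing: recall is just energy descent to the nearest stored minimum, the same mathematics used for spin glasses.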

  • lambda_notation@lemmy.ml · 1 month ago

    “They used physics to do it” is just a laughably pathetic motivation. Nobel hated “abstract wankery” or “intellectual masturbation” and wanted to promote results that benefited the common man and society directly. This is incidentally also why there is no Nobel Prize in economics. The Nobel Prize committee has long since abandoned Nobel’s will in this matter, and it is anyone’s guess what order of magnitude of spin Nobel’s corpse has accumulated.

  • msantossilva@sh.itjust.works · 1 month ago

    I guess some people are genuinely concerned about AI wiping out humanity. Do not worry, that will never happen. We are already doing a fine job fostering our own extinction. If we keep going down our current path, those soulless robots will never even get the chance.

    Now, in truth, I do not know what will kill us first, but I reckon it is important to stay positive

    • slaacaa@lemmy.world · 1 month ago

      I mean, it’s definitely helping, but not in the way I imagined. It is becoming a major driver of CO2 emissions due to the large computational power it needs, which will only increase in the future. The planet is boiling, and they will keep building more server farms for the next LLM upgrade, giving up on stopping/controlling climate change.

    • scarabic@lemmy.world · 1 month ago

      What’s laughable are the “Terminator” scenarios where it comes to life in an instant, already has the power to wipe us out in that same moment, and then does so.

      A more likely scenario is that we come to rely more and more heavily on AI as time goes by, until it truly does have a grip on resource supply chains, manufacturing facilities, energy plants, etc. And I don’t just mean that machine learning gets used in all of those contexts, because we are already there. I’m talking about custodial authority: a point where we have ceded those duties to it so completely that we can’t do those jobs without AI.

      Then a malicious AI could put a real squeeze on humanity. It wouldn’t need to be a global war. Just enough disruption that we starve and begin to war among ourselves. Has anyone ever noticed how many of us there are now? Our population would absolutely fall apart without our massive industrial and agricultural complexes running full time.

    • sunbeam60@lemmy.one · 1 month ago

      Was our run really that good? We killed a bunch of species, drained our planet of resources and belched pollution into the air. I wouldn’t be surprised if the AIs manage to steward our planet better.

  • zlatiah@lemmy.world · 1 month ago

    So it was the physics Nobel… I see why the Nature News coverage said it had been “scooped” by machine-learning pioneers

    Since the news tried to be sensational about it… I tried to see what Hinton meant by fearing the consequences. I believe he is genuinely trying to prevent AI development from proceeding without proper regulation. Here is a policy paper he was involved in (https://managing-ai-risks.com/). It does mention some genuine concerns. Quoting it:

    “AI systems threaten to amplify social injustice, erode social stability, and weaken our shared understanding of reality that is foundational to society. They could also enable large-scale criminal or terrorist activities. Especially in the hands of a few powerful actors, AI could cement or exacerbate global inequities, or facilitate automated warfare, customized mass manipulation, and pervasive surveillance”

    like bruh people already lost jobs because of ChatGPT, which can’t even do math properly on its own…

    Also, there is quite some irony in the preprint containing the quote “Climate change has taken decades to be acknowledged and confronted; for AI, decades could be too long.”, considering that a serious risk of AI development is its climate impact