• Kecessa@sh.itjust.works

    Pigeon = edible bird

    Cleaning a bird = preparing a bird after killing it (hunting term)

    The AI figured either that the “rescued” part was a mistake or that the person wanted to eat a bird they had rescued

    If you run a search for “how to clean a dirty bird” instead, you give it better context and it comes up with a better reply

    • DannyBoy@sh.itjust.works

      The context is clear to a human. If an LLM is giving advice to everybody who asks a question in Google, it needs to do a much better job at giving responses.

      • lunarul@lemmy.world

        “I thought AI was great at picking up context?”

        I don’t know why you thought that. LLMs split your question into separate words, assign scores to those words, then look up answers relevant to those words. They have no idea how those words relate to each other. That’s why LLMs couldn’t answer how many "r"s are in “strawberry”: they assigned the word “strawberry” a lower relevancy score in that question. The word “rescue” is probably treated the same way here.
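
        A rough sketch of that word-splitting step (in practice models chop text into subword tokens rather than whole words; this assumes the tiktoken package and its cl100k_base encoding) shows that “strawberry” reaches the model as a few opaque chunks, not eleven letters it could count:

        ```python
        # Sketch: inspect how a tokenizer chops up the question.
        # Assumes `pip install tiktoken`; cl100k_base is one common encoding.
        import tiktoken

        enc = tiktoken.get_encoding("cl100k_base")
        tokens = enc.encode("how many r's are in strawberry")

        # Print each subword piece the model actually receives.
        for t in tokens:
            print(t, enc.decode_single_token_bytes(t))
        ```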

      • iAmTheTot@sh.itjust.works

        I don’t think they are really “making excuses”, just explaining how the search came up with those steps, which is what the OP is so confused about.