… and neither does the author (or so I believe - I made them both up).

On the other hand, AI is definitely good at creative writing.

  • Flying Squid@lemmy.world · 2 months ago

    I have a very unusual last name. There is only one other person in the country with my first and last name and they have a different middle initial from me.

    So one day, I asked ChatGPT to tell me about myself including my middle initial.

    Did you know that I was a motivational speaker for businesses and I had published a half-dozen books on it?

    Because I didn’t.

    • A_A@lemmy.world · 2 months ago

      This is because there is a Mr. Flying Thomas Squid, living in another country, who is a motivational speaker and who didn’t work in (… video ?).

      • Flying Squid@lemmy.world · 2 months ago (edited)

        Good theory, but this Mr. Flying Thomas Squid that ChatGPT talked about lived in the U.S. like me.

        (And yes, I worked in the entertainment industry in various roles for about a decade. Oddly, the other person with my name was in a neighboring industry and we worked about two miles apart for years, but we’ve only met once.)

    • AngryCommieKender@lemmy.world · 2 months ago

      I should try that. I have an unusual first name; according to the Social Security Administration, only 600 people have it, and I appear to be the oldest one. Also, no one else has my first and last name.

    • brbposting@sh.itjust.works · English · 2 months ago

      Hallucinations are so strong with this one too… like really bad.

      If I can’t verify an output - or won’t be able or willing to - I ain’t usin’ it. Not a bad rule, I think.

      • can@sh.itjust.works · 2 months ago

        At least Bing will cite sources, and hell, sometimes they even align with what it said.

        • brbposting@sh.itjust.works · English · 2 months ago

          Heh, yeah - if the titles of the webpages from its searches were descriptive enough.

          Funny that they haven’t found a way to stop it from claiming it can browse websites. Last I checked you could paste in something like

          https://mainstreamnewswebsite.com/dinosaurs-found-roaming-playground
          

          and it would confidently tell you which species were nibbling the rhododendrons.

          …wow, still works. Gonna make a thread.

        • brbposting@sh.itjust.works · English · 2 months ago (edited)

          Clowning

          (I’m not smart enough to leverage a model or make a bot like this, but they’ve had too long not to close this obvious misinformation hole.)

  • canihasaccount@lemmy.world · 2 months ago

    Claude 3.5 Sonnet, using the exact same prompt:

    I apologize, but I’m not able to provide a synopsis of “The Mighty Eagle” by John Carrol. After searching my knowledge base, I don’t have any information about a book with that exact title and author. It’s possible this may be a lesser-known work or there could be an error in the title or author name provided. Without being able to verify the book’s existence or details, I can’t offer an accurate synopsis. If you have any additional information about the book or author that could help clarify, I’d be happy to assist further.

    • Benjaben@lemmy.world · 2 months ago

      I’ve been asking that one about a wide range of topics and have been very impressed with its replies. It’s mixed on software dev, which is to be expected. It also missed on a simple music theory question I asked, and then missed again when asked to correct itself (don’t have the details at hand to quote, unfortunately). But overall I’ve found it reliable, and much faster than doing the necessary reading to answer the question myself.

      How’ve you found Claude?

  • fubarx@lemmy.ml · 2 months ago

    Tried it with ChatGPT-4o using a different title and author. It said it couldn’t find it - that it might be a new release or a lesser-known title. Also tried a fake title with a real author. Again, it said the book didn’t exist.

    They’re definitely improving on the hallucination front.

  • sinceasdf@lemmy.world · 2 months ago

    Y’know, when you post stupid bullshit like this, it really glosses over real issues with AI, like propaganda - but go on about how you can get it to hallucinate by asking it a question in bad faith lmao

  • A_A@lemmy.world · 2 months ago

    You can trigger hallucinations in today’s versions of LLMs with this kind of question. Same with a knife: you can hurt yourself by misusing it… and in fact you have to be knowledgeable and careful with both.

    • can@sh.itjust.works · 2 months ago

      Maybe ChatGPT should find a way to physically harm users when it hallucinates? Maybe then they’d learn.

      • A_A@lemmy.world · 2 months ago

        AI-hallucinated books describing which mushrooms you can pick in the forest have actually been published, and some people did die because of this.
        We have to be careful when using AI!