I feel like every day I come across 15-20 “AI-powered tools” that “analyze” something, and none of them clearly state how they use your data. This one seems harmless enough: put a profile in and it will scrape everything about that person, all their personal information, their location, every post they ever made… Nothing can possibly go wrong aggregating all that personal info, right? No idea where this data is sent, where it’s stored, or who it’s sold to. Kinda alarming.

  • dustyData@lemmy.world · 4 days ago
    The only money to be made in the LLM craze is data scraping, collection, filtering, collation and data set selling. When in a gold rush, don’t dig, sell shovels. And AI needs a shit ton of shovels.

    The only people making money are Nvidia, the third-party data center operators, and the data brokers. Everyone else running and using the models is losing money. Even OpenAI, the biggest AI vendor, is running at a loss. Eventually the bubble will burst and the data brokers will still have something to sell. In the meantime, the fastest way to increase model performance is to increase model size, which means more data is needed to train them.

    • CheeseNoodle@lemmy.world · 4 days ago
      Hey hey, there’s a flourishing market for NSFW AI chatbots that I’m sure is raking in the cash, essentially re-selling access credits at a higher price.