Whenever any advance is made in AI, critics redefine AI so that, by their definition, it hasn't been achieved yet.
That stems from the fact that AI is an ill-defined term with no fixed meaning. Before Google Maps became popular, any route-finding algorithm using A* was considered "AI".
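For reference, the A* mentioned above fits in a few lines. This is a minimal sketch on a grid, assuming 4-connected movement and a Manhattan-distance heuristic (both illustrative choices, not part of the original comment):

```python
# Minimal A* on a grid: 4-connected moves, Manhattan heuristic.
import heapq

def a_star(grid, start, goal):
    """Return the shortest path length from start to goal,
    or None if unreachable. grid[r][c] == 1 marks an obstacle."""
    rows, cols = len(grid), len(grid[0])

    def h(node):  # admissible heuristic: Manhattan distance to goal
        return abs(node[0] - goal[0]) + abs(node[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # entries are (f = g + h, g, node)
    best_g = {start: 0}
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:
            return g
        if g > best_g.get(node, float("inf")):
            continue  # stale heap entry, a cheaper path was found already
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(a_star(grid, (0, 0), (2, 0)))  # path must route around the wall
```

The heuristic is what distinguishes A* from plain Dijkstra: an admissible estimate of remaining cost steers the search toward the goal without sacrificing optimality.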
And the second comment, about LLMs being parrots, arises from a misunderstanding of how LLMs work.
It’s not that it has no meaning, it’s that the meaning has become overloaded.
That’s why the term “Artificial General Intelligence” came into use to denote an artificial intelligence that surpasses human capabilities across a wide range of tasks. A* is ultimately narrow AI.
Bullshit. These people know exactly how LLMs work.
LLMs reproduce the form of language without any meaning being transmitted. That’s called parroting.