Well, that’s kind of shitty. I know those models can run up to five figures, and if those rules aren’t enforced uniformly across the board for everyone then it does just seem like they’re targeting a particular class of creator.
As a side note, I find it funny that the article refers to them as “AI models” when no AI is typically involved.
Saying it’s AI even when it’s completely irrelevant makes it modern and cool though.
“AI” is the new “Space-Age”.
It’s like how they slapped ‘Smart’ on every tech product in the past decade. Even devices that are dumb as fuck are called ‘Smart’ devices. Words entirely lost their meaning because of advertisers abusing trendy words.
Even ‘AI’ is being abused. I always thought of AI as artificial consciousness: an unnatural, created-by-humans, self-aware and self-thinking being. Most of the AI products now are just search engines, image generators, and apps programmed to do something. In fact, stuff like ChatGPT would’ve made more sense being called a ‘Smart’ search engine instead of ‘AI’. They might be technological achievements, but they’re not AI.
If you ask me, we’re still in the space age. Can’t wait for New Glenn’s maiden launch, hopefully this year.
The technical definition of AI in academic settings is any system that can perform a task on its own with reasonably good performance.
The field of AI is absolutely massive and includes super basic algorithms like Dijkstra’s Algorithm for finding the shortest path in a graph or network. To be fair, Dijkstra actually guarantees the optimal path in polynomial time; it’s the harder routing problems, like the traveling salesman problem, that are NP-Complete and have no known polynomial-time solution. For those, AI algorithms use programmed heuristics to approximate optimal solutions, and even everyday navigation leans on heuristic searches for speed, so it’s entirely possible that the path generated is in fact not optimal, which is why your GPS doesn’t always give you the guaranteed shortest path.
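To make the heuristics point concrete, here’s a toy sketch (made-up graph, hypothetical node names) of greedy best-first search, a heuristic pathfinder that can cheerfully return a route that is not the cheapest:

```python
import heapq
import math

# Hypothetical toy graph; coordinates exist only to feed the heuristic.
coords = {"S": (0, 0), "A": (2, 5), "B": (5, 0), "G": (10, 0)}
edges = {
    "S": {"A": 1, "B": 5},   # going via A is much cheaper...
    "A": {"G": 1},
    "B": {"G": 5},           # ...but B *looks* closer to the goal G
    "G": {},
}

def h(n):
    # Straight-line distance to the goal: the heuristic.
    (x1, y1), (x2, y2) = coords[n], coords["G"]
    return math.hypot(x2 - x1, y2 - y1)

def greedy_best_first(start, goal):
    # Always expand whichever frontier node the heuristic likes best,
    # ignoring how much the path so far actually cost.
    frontier = [(h(start), start, [start])]
    seen = set()
    while frontier:
        _, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt in edges[node]:
            heapq.heappush(frontier, (h(nxt), nxt, path + [nxt]))
    return None

def path_cost(path):
    return sum(edges[a][b] for a, b in zip(path, path[1:]))

path = greedy_best_first("S", "G")
print(path, path_cost(path))  # S -> B -> G, cost 10; S -> A -> G would cost 2
```

The heuristic steers it toward B because B is geometrically closer to G, even though the edges through A are far cheaper. Real routing engines use smarter heuristics than this, but the same trade-off of speed over guaranteed optimality applies.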
To help distinguish fields of research, we use extra qualifiers to narrow focus, such as “classical AI” and “symbolic AI”. Even “Machine Learning” is too ambiguous, as it was originally a statistical process to find trends in data, or “statistical AI”. Ever used Excel to find a line of best fit for a graph? That’s “machine learning”.
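For the curious, that Excel trendline is just ordinary least squares. A minimal sketch with made-up data:

```python
# Line of best fit by ordinary least squares, the same "machine
# learning" Excel does when you add a trendline to a chart.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]   # made-up data, roughly y = 2x

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# slope = covariance(x, y) / variance(x)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

print(f"y = {slope:.2f}x + {intercept:.2f}")
```

A few sums and a division: no neural networks required, yet it is still fitting a model to data, which is all “learning” originally meant in this context.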
That said, “statistical AI” does accurately encompass all the AI systems people commonly think about, like “neural AI” and “generative AI”. But without getting into more specific qualifiers, “Deep Learning” and “Transformers” are probably the best way to narrow down what most people think of when they hear AI today.
Same with people believing all vtubers are AI chatbots or, to a lesser extent, that all female vtubers are just men with voice changers.
Like a minute or two of looking up either on any search engine would clear things up. But hell if it doesn’t show how little the average internet user seems to know about AI or audio editing. A lot of people seem to think technology is just straight-up magic.
It might not be generative AI, but it is still primitive AI that drives the mapping of the physical streamer’s body to the animated cartoon body.