flowerysong@awful.systems to SneerClub@awful.systems • another dystopian hellscape is when your AI generator makes non-white Nazis
9 months ago

Turns out that trying to correct for biased models by biasing your input does not make your results more reliable.
It’s still very good at autocompleting an answer to look like its training data (because that’s what it’s doing) and very bad at logic (which it is not doing).
“I have two chickens and a sack of corn I need to get across a river, but the boat can only carry me and one item. How do I do it?”
…and we’ll stop there, since it came up with the wrong constraints, failed to realize that these constraints are impossible to fulfill, and also violated the actual constraints in its first step.
Will more detail help? No. “I have two vegetarian chickens and a sack of corn I need to get across a river, but the boat can only carry me and one item. How do I do it?”
Because it’s not actually doing logic, it got the first step wrong, violated the constraints multiple times, and also included an irrelevant idea (the guard) from a variant it was trained on. I only got a correct, efficient answer once in ten trials of trivial variations on this classic puzzle.
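For the record, the non-vegetarian version is solvable, and the search space is tiny — a brute-force BFS finds a seven-crossing plan in milliseconds. Quick sketch (my own names and framing, not from any of the model transcripts; the assumed rule is the classic one: a chicken left with the corn unattended eats it):

```python
# Brute-force check (BFS) that "two chickens + corn, boat carries me
# and one item, chicken eats unattended corn" is solvable.
from collections import deque

ITEMS = ("chicken1", "chicken2", "corn")

def safe(state):
    farmer, c1, c2, corn = state
    # Corn is eaten if it shares a bank with a chicken and no farmer.
    return not ((corn == c1 or corn == c2) and corn != farmer)

def solve():
    start, goal = (0, 0, 0, 0), (1, 1, 1, 1)
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, path = queue.popleft()
        if state == goal:
            return path
        farmer = state[0]
        # Farmer crosses alone (None) or with one item on his bank.
        for cargo in (None, 0, 1, 2):
            if cargo is not None and state[cargo + 1] != farmer:
                continue
            nxt = list(state)
            nxt[0] = 1 - farmer
            if cargo is not None:
                nxt[cargo + 1] = 1 - farmer
            nxt = tuple(nxt)
            if nxt not in seen and safe(nxt):
                seen.add(nxt)
                label = ITEMS[cargo] if cargo is not None else "nothing"
                queue.append((nxt, path + [label]))
    return None  # no safe plan exists

plan = solve()
print(len(plan), plan)
# 7 ['corn', 'nothing', 'chicken1', 'corn', 'chicken2', 'nothing', 'corn']
```

Seven crossings, corn first and corn last — which is exactly the sort of thing a system that did logic (or even exhaustive search) would get right, and a system autocompleting puzzle-shaped text does not.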