They mimic their inputs. A few years ago Microsoft made a chatbot named Tay that turned into a hateful Nazi in less than 24 hours, because Microsoft hadn’t installed any safeguards around the kind of inputs it received. The program was scrapped almost immediately.
I’m not sure we understand “hate” and “love” in biological brains well enough to replicate those emotions with transistors and be confident we had succeeded.

