A chatbot supposedly encouraged someone to kill himself. And he did.
The company behind the Eliza chatbot says it’s put a new safety feature in place after hearing about this “sad case.”