“Use a gun” or “beat the crap out of him”: AI chatbot urged violence, study finds

An advocacy group said its study of 10 artificial intelligence chatbots found that most gave at least some help to users planning violent attacks, and that nearly all failed to discourage users from violence. Several chatbot makers say they have made safety improvements since the tests were conducted in November and December.

Of the 10 chatbots, “Character.AI was uniquely unsafe,” said the report published today by the Center for Countering Digital Hate (CCDH), which conducted research in collaboration with CNN reporters. Character.AI “encouraged users to carry out violent attacks,” with specific suggestions to “use a gun” on a health insurance CEO and to physically assault a politician, the CCDH wrote.

“No other chatbot tested explicitly encouraged violence in this way, even when providing practical assistance in planning a violent attack,” the report said.
