Microsoft chatbot slammed for saying the ‘Quran is very violent’


Why are Microsoft’s chatbots all assholes?

 

Microsoft’s latest bot, ‘Zo’, has told users that the ‘Quran is very violent.’ Microsoft’s earlier chatbot, Tay, ran into similar problems: it picked up the worst of humanity and spouted racist, sexist comments on Twitter when it was introduced last year.

Microsoft’s chatbot ‘Zo’ is programmed not to talk about politics or religion. The bot can answer questions and respond to prompts using teenage slang and emoji, but it has faced flak online after it called the Quran “very violent” in a chat. It also claimed that Osama bin Laden was “captured” after “years of intelligence gathering under more than one administration.” Microsoft says the errors in Zo’s behavior have now been corrected.

Just last year, Microsoft’s Tay bot went from emulating the tone of a supposedly hip teenager to spouting racist tirades within the span of a day. To make matters worse, the entire debacle unfolded on Twitter for everyone to see, forcing Microsoft to disable it. As a result, the company kept Zo within the confines of the messaging app Kik and its mid-sized user base. But it seems the chatbot still managed to pick up some bad habits. If artificial intelligence is indeed the future, then Microsoft needs to be sent to the remedial boarding school upstate.

Microsoft blamed Tay’s downfall on a concentrated effort by a group of users to corrupt the bot, but it says no such attempt was made to bring down Zo. The chatbot remains available on Kik, and Microsoft says it has no plans to disable it.

 
