Microsoft explains why its cute AI chatbot became a crazed Nazi in under a day


Guest Brad Reed
Posted

Earlier this week, we brought you the tragicomic story of Tay, an artificial intelligence chatbot that was designed to interact with and learn from people between the ages of 18 and 24. Unfortunately for Microsoft, some racist Twitter trolls figured out a way to manipulate Tay's behavior and transform it into a crazed racist that praised Hitler and denied the existence of the Holocaust. That is obviously not a good thing, and Microsoft has penned a follow-up blog post explaining what went wrong and what it plans to do in the future.

(more…)

Via BRG - Boy Genius Report
