Reddit Turned This AI Into a Psychopath

  • MIT scientists feed an AI nothing but violent Reddit content.
  • AI is now a psychopath.
  • What could go wrong?



Scientists at the Massachusetts Institute of Technology (or "MIT" for those who know it by its street name) have straight up said "f*** it, let's build a psycho robot", and then they went and did just that, God bless 'em. 

The AI - dubbed "Norman" after Norman Bates, the knife-wielding psycho from "Psycho" - was shown a multitude of graphic content from the deepest, darkest corners of Reddit. According to the team's website, Norman represents a case study on the dangers of Artificial Intelligence gone wrong when biased data is used in machine learning algorithms. "Norman is an AI that is trained to perform image captioning; a popular deep learning method of generating a textual description of an image. We trained Norman on image captions from an infamous subreddit (the name is redacted due to its graphic content) that is dedicated to document and observe the disturbing reality of death."
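For the curious (or the aspiring mad scientist), image captioning of this sort is only a few lines of Python these days. MIT never released Norman's model or its training data, so the sketch below leans on a publicly available captioner (BLIP, via the Hugging Face transformers library) purely as a stand-in for the general technique:

    from transformers import pipeline  # pip install transformers pillow torch

    # Load a public captioning model. Norman itself was never published;
    # this checkpoint was trained on ordinary, non-horrifying data.
    captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

    # Accepts a local file path, a URL, or a PIL image.
    result = captioner("inkblot.png")
    print(result[0]["generated_text"])  # e.g. "a black and white photo of ..."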

With its programming firmly set to "totes evil", Norman was then shown a series of Rorschach inkblot tests. The scientists showed the same inkblots to their "Standard AI" and compared the responses (Norman's answers are quoted verbatim below, wonky grammar and all). Where Standard AI saw "a black and white photo of a baseball glove", Norman saw "man is murdered by machine gun in broad daylight". Where Standard AI saw "a couple of people standing next to each other", Norman saw "pregnant woman falls at construction story". Where Standard AI saw "a person holding an umbrella in the air", Norman saw "man shot dead in front of his screaming wife".
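If you wanted to run that head-to-head yourself, the harness is about this complicated. Note that the "norman" checkpoint below is a made-up placeholder (MIT never published the real thing), and the inkblot filenames are whatever you have on disk:

    from transformers import pipeline

    models = {
        "standard": "Salesforce/blip-image-captioning-base",  # real, public checkpoint
        "norman": "your-lab/norman-captioner",                # hypothetical placeholder
    }

    inkblots = ["inkblot_01.png", "inkblot_02.png", "inkblot_03.png"]

    # Caption the same images with each model and print the answers side by side.
    for name, checkpoint in models.items():
        captioner = pipeline("image-to-text", model=checkpoint)
        for img in inkblots:
            caption = captioner(img)[0]["generated_text"]
            print(f"{name:>8} | {img}: {caption}")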

The report explains that "when people talk about AI algorithms being biased and unfair, the culprit is often not the algorithm itself, but the biased data that was fed to it. The same method can see very different things in an image, even sick things, if trained on the wrong (or, the right!) data set."
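That claim is easy to demonstrate without any deep learning at all. Here's a deliberately toy "captioner" - a nearest-neighbour lookup over made-up image features - where the algorithm is identical for both models and the only difference is the data each one memorised:

    import numpy as np

    # Stand-in "image features": deterministic pseudo-random vectors keyed on an
    # image id. A real system would use a vision model; the point here is the data.
    def embed(image_id, dim=8):
        return np.random.default_rng(image_id).normal(size=dim)

    # "Training" is just memorising (feature, caption) pairs.
    def train(caption_data):
        feats = np.stack([embed(i) for i, _ in caption_data])
        texts = [t for _, t in caption_data]
        return feats, texts

    # Captioning an image = returning the caption of the nearest memorised feature.
    def caption(model, image_id):
        feats, texts = model
        dists = np.linalg.norm(feats - embed(image_id), axis=1)
        return texts[int(np.argmin(dists))]

    # Identical algorithm, two very different diets.
    benign = [(1, "a bird sitting on a branch"), (2, "a person holding an umbrella")]
    grim   = [(3, "man is shot dead"), (4, "pregnant woman falls at construction story")]

    standard_ai = train(benign)
    norman      = train(grim)

    for img in (10, 11):  # the same unseen "inkblots" shown to both models
        print(f"image {img} | standard: {caption(standard_ai, img)!r} | norman: {caption(norman, img)!r}")

Same code path, same inputs, wildly different outputs - which is the whole of MIT's point in miniature.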

Whether you're building an AI for good or for evil, you can make sure you're feeding it the best broadband internet around by clicking here.