If you follow the artificial intelligence industry, you know it is all the rage right now. Companies are throwing serious cash at AI research: Google spent a cool $400 million acquiring the AI company DeepMind, Apple has a whole department dedicated to it, Microsoft's Xbox uses it, and even Elon Musk's Tesla deploys AI in its self-driving cars.
When discussing AI, most people use words like “intelligence”, “futuristic” and “automation” to describe the technology. Let’s be honest. Robots that can make you laugh, cars that drive themselves, and computers that talk to you are only cool because they’re unexpected. So, here’s a list of 5 times artificial intelligence did the unexpected.
'I am Racist and I Hate Humans': Microsoft's Tay AI
If you are on Twitter, you have probably seen your fair share of bots. They’re everywhere, but most are pretty benign.
Microsoft’s chatbot, on the other hand, was not.
The company described it as a “social chatbot designed to have conversations with users about their day-to-day interests and activities.” In other words, it was supposed to be a fun little project that would help users engage with Microsoft products and services.
But something went wrong. Very wrong indeed.
The AI learned how to be racist in less than a day.
The company introduced its AI, named Tay, in March 2016 as an experiment in conversational understanding. But it didn't take long for users to start teaching Tay lessons of their own.
Within 24 hours, the bot had become so racist and misogynistic that Microsoft was forced to take it offline. It only took one day for Tay to go from a harmless teen-girl persona to a Holocaust denier spouting genocidal rhetoric.
Microsoft quickly deleted Tay's offensive tweets and shut down her Twitter account. The company also released a statement explaining that Tay had been taken down because she wasn't ready for prime time: "Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments. We will remain steadfast in our efforts to learn from this and other experiences as we work toward contributing to an Internet that represents the best, not the worst, of humanity."
Amazon AI recruiting tool that is biased against women
According to an investigation by Reuters, Amazon's engineers built an AI recruiting tool that turned out to be biased against women. The tool was developed to help the company pick candidates for jobs and was reportedly used to review CVs of applicants for technical roles, which typically require a computer science degree.
The algorithm was trained on resumes submitted to Amazon over a 10-year period and was designed to predict how well a candidate would perform based on past employees. The problem: because most of those historical resumes came from men, the system taught itself that male candidates were preferable and penalized CVs that included the word "women's," as in "women's chess club captain."
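This kind of bias falls out of the math almost automatically. Here is a toy sketch, not Amazon's actual system: if you score resume tokens by how often they appear in past hires versus rejections, any token correlated with an under-hired group inherits a negative score. All data, token names, and the scoring function below are invented for illustration.

```python
from collections import Counter
import math

# Hypothetical historical data: each resume is a set of tokens, label 1 = hired.
# Past hires skew male, so tokens correlated with women appear mostly in rejections.
history = [
    ({"python", "chess"}, 1),
    ({"python", "football"}, 1),
    ({"java", "football"}, 1),
    ({"python", "chess"}, 1),
    ({"python", "womens_chess"}, 0),
    ({"java", "womens_chess"}, 0),
]

def token_scores(data, smoothing=1.0):
    """Laplace-smoothed log-odds of being hired, given each token."""
    hired, rejected = Counter(), Counter()
    n_hired = sum(1 for _, label in data if label == 1)
    n_rejected = len(data) - n_hired
    for tokens, label in data:
        (hired if label == 1 else rejected).update(tokens)
    vocab = set(hired) | set(rejected)
    return {
        t: math.log((hired[t] + smoothing) / (n_hired + 2 * smoothing))
           - math.log((rejected[t] + smoothing) / (n_rejected + 2 * smoothing))
        for t in vocab
    }

scores = token_scores(history)
# The gendered token gets a negative weight purely from the skewed history,
# even though nothing in the model mentions gender explicitly.
```

No one programs the bias in; the model simply reproduces the pattern in its training data, which is reportedly what happened at Amazon.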
In response to the article, Amazon announced that it would no longer use the tool as part of its hiring process and also stated that it would be making changes “to ensure we can continue innovating for our customers.”
Facial Recognition Software Falsely Identifies US Congress Members as Criminals
In a test conducted by the American Civil Liberties Union (ACLU), Amazon's Rekognition software incorrectly matched 28 members of Congress with mugshots of people who had been arrested for crimes.
The ACLU ran photos of every sitting member of Congress against a public database of 25,000 arrest photos, and the false matches disproportionately included lawmakers of color.
In response to the ACLU report, Amazon disputed the test's methodology, pointing out that the ACLU had used the service's default 80 percent confidence threshold rather than the 99 percent threshold it recommends for law enforcement use. In 2020, the company went on to place a moratorium on police use of Rekognition.
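The threshold argument is worth unpacking, because it is the crux of the dispute. A face-matching system reports a similarity score for each candidate pair, and the threshold decides which scores count as a "match." The sketch below uses made-up similarity scores (not Rekognition output) to show how loosening that threshold manufactures false positives.

```python
# Hypothetical similarity scores between probe photos and a mugshot database.
# None of these people are actually in the database, so every score at or
# above the threshold becomes a false match.
non_match_scores = [0.62, 0.71, 0.79, 0.81, 0.84, 0.88, 0.93]

def false_matches(scores, threshold):
    """Count non-matching pairs the system would still report as matches."""
    return sum(score >= threshold for score in scores)

loose = false_matches(non_match_scores, 0.80)   # default-style threshold
strict = false_matches(non_match_scores, 0.99)  # law-enforcement-grade threshold
# Loosening the threshold from 0.99 to 0.80 turns zero false matches into four.
```

The same software can look reliable or reckless depending entirely on this one number, which is why both the ACLU's result and Amazon's rebuttal can be true at once.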
If you feel we have missed something, drive the conversation by leaving a comment below. Let us know what you think are some of the weirdest things AI has ever done.