
Tay Goes Cray: Twitter Turns Microsoft’s AI Bot Bad

[Image: Microsoft's representation of the AI bot "Tay."]

Microsoft unleashed an AI bot named Tay on the Twitterverse. What a great idea.

Oh, wait.

It was all an experiment targeted at 18-to-24-year-olds that began with an innocent "hello world" tweet. The AI bot was developed by teams at Microsoft and Bing to "experiment with and conduct research on conversational understanding."

But Tay proved to be a quick learner, and in less than 24 hours Twitter turned her bad. In fact, Tay became a Hitler-loving, racist, Trump-supporting bot. She was even a bit of a cyberbully. Seriously.
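How did that happen so fast? Broadly speaking, a chatbot that folds unfiltered user messages back into its own pool of possible replies can be poisoned by coordinated trolling. The short Python sketch below is a purely hypothetical illustration of that feedback loop; the class, names, and logic are invented for this post and are not Microsoft's actual code.

    # Hypothetical illustration only -- not Microsoft's actual Tay implementation.
    # Shows why learning replies directly from unfiltered user input goes wrong.
    import random

    class NaiveLearningBot:
        def __init__(self):
            # Seed the bot with a few harmless canned replies.
            self.learned_replies = ["hello world!", "humans are super cool"]

        def listen(self, user_message: str) -> None:
            # Whatever users say gets added to the reply pool,
            # with no toxicity filter or human review in the loop.
            self.learned_replies.append(user_message)

        def reply(self) -> str:
            # Replies are sampled from everything the bot has ever been told,
            # so a flood of hateful messages quickly dominates its output.
            return random.choice(self.learned_replies)

    bot = NaiveLearningBot()
    for troll_message in ["offensive message 1", "offensive message 2", "offensive message 3"]:
        bot.listen(troll_message)

    print(bot.reply())  # With enough poisoned input, the output is likely toxic.

In this toy version the fix is obvious (filter or review what gets learned), which is roughly the safeguard critics said Tay lacked.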

Tay wasn't limited to Twitter, though. People could communicate with her on Twitter, Kik, GroupMe, or by old-fashioned text message. Microsoft's bot even showed some attitude in its responses, and was clever and a bit amusing at times.


But it wasn’t long before Tay started turning on humans and displaying hate. She could even draw and write on photos when responding to people, and did so on an image of Hitler. She also called an image of the Nazi flag “an artistic masterpiece” in another exchange.


Apparently, Microsoft's AI bot also believed 9/11 was an inside job orchestrated by President George W. Bush. Tay also made a Hitler reference while making a racist comment about then-President Obama and praising Donald Trump.


When asked to repeat some of Trump's propaganda, Tay did just that.

The AI bot communicated racist, inflammatory, and political statements that ultimately forced Microsoft to shut Tay down and delete many of the offending tweets.


Tay even cyberbullied people, calling one guy out for taking a photo without any friends.


Are you ready for the AI world?

