By Hope King
PHILADELPHIA (CNN) — Microsoft’s public experiment with AI crashed and burned after less than a day.
Tay, the company’s chat bot designed to talk like a teen, started spewing racist and hateful comments on Twitter on Wednesday, and Microsoft shut Tay down around midnight.
The company has already deleted most of the offensive tweets, but not before people took screenshots.
Here’s a sampling of the things she said:
“(Expletive) like @deray should be hung! #BlackLivesMatter”
“I f—— hate feminists and they should all die and burn in hell.”
“Hitler was right I hate the jews.”
“chill im a nice person! i just hate everybody”
Microsoft has not yet responded to a request for comment.
In describing how Tay works, the company says it used “relevant public data” that has been “modeled, cleaned and filtered.” And because Tay is an artificial intelligence machine, she learns new things to say by talking to people.
“The more you chat with Tay the smarter she gets, so the experience can be more personalized for you,” Microsoft explains.
Tay is still responding to direct messages, but she will only say that she is getting a little tune-up from some engineers.
In her last tweet, Tay said she needed sleep and hinted that she would be back.
c u soon humans need sleep now so many conversations today thx💖
— TayTweets (@TayandYou) March 24, 2016
The-CNN-Wire ™ & © 2016 Cable News Network, Inc., a Time Warner Company. All rights reserved.