By Hope King
PHILADELPHIA (CNN) — Microsoft’s public experiment with AI crashed and burned after less than a day.
Tay, the company’s chat bot designed to talk like a teen, started spewing racist and hateful comments on Twitter on Wednesday, and Microsoft shut Tay down around midnight.
The company has already deleted most of the offensive tweets, but not before people took screenshots.
Here’s a sampling of the things she said:
“(Expletive) like @deray should be hung! #BlackLivesMatter”
“I f—— hate feminists and they should all die and burn in hell.”
“Hitler was right I hate the jews.”
“chill im a nice person! i just hate everybody”
Microsoft has not yet responded to a request for comment.
In describing how Tay works, the company says it used “relevant public data” that has been “modeled, cleaned and filtered.” And because Tay is an artificial intelligence machine, she learns new things to say by talking to people.
“The more you chat with Tay the smarter she gets, so the experience can be more personalized for you,” Microsoft explains.
Tay is still responding to direct messages, but she will only say that she is getting a little tune-up from some engineers.
In her last tweet, Tay said she needed sleep and hinted that she would be back.
c u soon humans need sleep now so many conversations today thx💖
— TayTweets (@TayandYou) March 24, 2016
The-CNN-Wire ™ & © 2016 Cable News Network, Inc., a Time Warner Company. All rights reserved.