Microsoft has apologized in a blog post for the behavior of Tay, the tweetbot that was taken offline earlier this week. The bot uses artificial intelligence to write messages on its own, but last week it tweeted some decidedly racist texts.

Tweetbot

Microsoft’s Tay is a bot that acts like a teenager and becomes smarter through the people who chat with it or tweet at it. The bot can respond to messages and even post pictures in reply. The project was launched by Microsoft to test artificial intelligence, but it quickly got out of control.

Hitler

Within 24 hours, Tay began tweeting texts such as: “Hitler was right, I hate the Jews” and “Hitler would do a better job than the monkey we have now.” These texts were prompted by a group of malicious users who kept having the bot repeat their messages until this happened.



Sorry

Microsoft has apologized for the tweets. Tay was taken offline shortly after they were posted. In Microsoft’s blog post, Peter Lee, the head of the research department, said that the tweets do not represent Microsoft’s positions. “We are deeply sorry for the unintended offensive and hurtful tweets from Tay.”



Getting Started

Microsoft says that preparations were made for abusive situations, but the company was not prepared for a coordinated attack on this scale. The Tay website states that the bot is currently offline but will return soon. In the meantime, Microsoft will no doubt be busy evaluating the situation and adjusting the bot.

Microsoft has not yet announced when Tay will be back online.