Microsoft’s Latest AI Experiment Heads Towards Ca’tay’strophe


Microsoft loves to show off its machine learning capabilities, from identifying a breed of dog based on a picture to finding your celebrity match from a photo of you; these have all been quirky examples of how these tools can be used. But not every experiment goes well: the chat bot the company launched yesterday quickly turned devious after a humorous start.

Based on a bot the company released in China, Tay is an artificial intelligence chat bot designed to conduct research on conversational understanding. Unfortunately, after a humorous start, things quickly went south and the bot began responding to queries with racist and demeaning remarks.

After the bot made demeaning remarks about presidential candidates, racial groups, and others, Microsoft took it offline. The company has not said if or when the bot will return, but if you ping the @tayandyou Twitter account, it responds by saying “I’m gonna be gone for a bit today. Gotta go visit the engineers for some updates”.


The bot was designed to learn from its interactions, which means the derogatory remarks are the result of its conversations corrupting the tool. It would appear that Microsoft built Tay to evolve as it conversed: the bot initially started out with the personality of someone in their early 20s, but before being pulled, it was spewing the remarks of a bitter person who saw no joy in the world.

The Tay bot is an interesting look at how AI can evolve when deployed without constraints; in this example, it took less than a day for the service to be corrupted. Microsoft has likely learned a few things from the experiment, which makes it a success by the scientific definition, even if the headlines are less favorable.


Brad Sams has more than a decade of writing and publishing experience under his belt, including helping to establish new and seasoned publications. From breaking news about upcoming Microsoft products to telling the story of how a billion-dollar brand was birthed in his book, Beneath a Surface, Brad is a well-rounded journalist who has established himself as a trusted name in the industry.