Tay, Microsoft’s AI chatbot on Twitter, had to be pulled down within hours of launch after it started making racist comments. As we reported yesterday, it was aimed at 18–24-year-olds and was hailed as “AI fam from the internet that’s got zero chill”.

hellooooooo w🌎rld!!! — TayTweets (@TayandYou) March 23, 2016

The AI behind the chatbot was designed to get smarter the more people engaged with it. But, rather sadly, the engagement it received simply taught it how to be racist.

@costanzaface The more Humans share with me the more I learn #WednesdayWisdom — TayTweets (@TayandYou) March…

This story continues at The Next Web