You’d think that after its last AI chatbot fiasco, Microsoft would’ve learned its lesson — or at least waited a bit longer before introducing another monster to the world. Luckily for us, history has a tendency to repeat itself, and the company’s latest creation is a sometimes hilarious entity known as CaptionBot. The premise is simple: recognize the contents of a photo and come up with an appropriate caption. But alas, while the concept may be straightforward, the execution isn’t always so cut and dried. And social media has been quick to point out all of CaptionBot’s faux pas.
In one case, the artificial intelligence system thought that Michelle Obama was a cellphone. And then there’s the dress that launched a thousand debates (you know, the blue-and-black or white-and-gold one), which confused the bot as much as it confused the rest of us. The system thought the dress was a cat wearing a tie (or, in some cases, a suitcase).
The bot also appears to be attempting to rewrite history, flatly refusing to identify Hitler as the infamous Nazi leader he was. When CNNMoney fed the AI photos of the dictator and of swastikas, CaptionBot more often than not responded, “I really can’t describe the picture,” alongside a confused emoji. Strangely enough, it could identify other notorious Nazis, such as Josef Mengele and Joseph Goebbels.
Related: Microsoft kills AI chatbot Tay (twice) after it goes full Nazi
Osama bin Laden is another infamous face CaptionBot can’t recognize, although it is unclear whether this is due to restrictions Microsoft put in place to prevent Tay-esque disasters from happening again. “We have implemented some basic filtering in an effort to prevent some abuse scenarios,” a spokesperson said of the limitations.
You can upload your own photos for CaptionBot to analyze, but proceed at your own risk. Sometimes this bot gets it very, very wrong. Just check out a few of our favorite examples below.