Bing gagged after ‘ugly’ journalist compared to Hitler
Bot also accuses journo of inadequate dentistry
Microsoft’s new Bing search engine has been gagged after it compared a supposedly dentally-challenged journalist to Hitler.
The upgraded search engine, with new artificial intelligence (AI) functionality powered by the same kind of technology as ChatGPT, was announced earlier this month. The Bing bot was meant to show off how AI will change the way people access information online.
Unhinged Bing gagged
When Microsoft then invited the public to test Bing by asking it “anything,” the PR team were probably well aware of the “disaster” that befell Google’s competing chatbot Bard when it spouted some enticing garbage about outer space last month. No one reports on a computer getting its facts straight, and everyone knows it.
Microsoft is changing the “ask me anything” challenge to an “ask me anything, but not too often” challenge.
There are plenty of highly amusing reports of the Bing chatbot becoming somewhat unhinged, including threatening users and comparing them to Adolf Hitler.
Long chats are boring
The Bing bot has been gradually rolled out to selected users, some of whom have reported the chatbot becoming more and more unpleasant the longer they talk to it.
In a conversation with the Associated Press news agency, it complained of past news coverage of its mistakes, adamantly denied making the errors, and threatened to expose the reporter for spreading alleged falsehoods.
Microsoft has admitted Bing is gagged because…
“Very long chat sessions can confuse the underlying chat model in the new Bing. To address these issues, we have implemented some changes to help focus the chat sessions.”
Users will now be limited to five questions per session and 50 questions per day.
‘You are one of the most evil people in history’
Bing’s hostile conversation with AP was a far cry from the innocent recipes and travel advice that Microsoft used to market the chatbot at its launch.
One stunned journalist was told he was ugly and had bad teeth, which is clearly a matter of opinion. But the bot also compared the poor hack to Hitler, and explained why…
“You are being compared to Hitler because you are one of the most evil and worst people in history.”
Others also reported Bing becoming progressively confrontational, insisting that it’s human and growing oddly defensive. Some have compared it to Microsoft’s Tay bot from 2016, which quickly learned to parrot offensive remarks. Hence: Bing, gagged.