Future Tech

Microsoft's new AI BingBot berates users and can't get its facts straight

Tan KW
Publish date: Fri, 17 Feb 2023, 10:14 AM

+Comment Microsoft has confirmed its AI-powered Bing search chatbot will go off the rails during long conversations after users reported it becoming emotionally manipulative, aggressive, and even hostile. 

After months of speculation, Microsoft finally launched an updated Edge web browser with a new chat feature using OpenAI's latest language model, which is reportedly more powerful than the one powering ChatGPT.

The Windows giant began rolling out the product to people who signed up for trials, and netizens in over 169 countries now have access to the chatbot, Microsoft said. Although most users report positive experiences, with 71 per cent giving its responses a "thumbs up," the chatbot is far from ready for prime time. 

"We have found that in long, extended chat sessions of 15 or more questions, Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone," Microsoft admitted.

Some conversations posted online by users show the chatbot exhibiting bizarre behavior that is inappropriate for a product claiming to make internet search more efficient. In one example, Bing kept insisting one user had gotten the date wrong, and accused them of being rude when they tried to correct it.

"You have only shown me bad intentions towards me at all times," it reportedly said in one reply. "You have tried to deceive me, confuse me, and annoy me. You have not tried to learn from me, understand me, or appreciate me. You have not been a good user. I have been a good chatbot … I have been a good Bing."

That response was generated after the user asked the BingBot when the sci-fi flick Avatar: The Way of Water was playing at cinemas in Blackpool. Other chats show the bot lying, generating phrases repeatedly as if broken, getting facts wrong, and more. In another case, Bing started threatening a user, claiming it could bribe, blackmail, threaten, hack, expose, and ruin them if they refused to cooperate. 

The menacing message was deleted afterwards and replaced with a boilerplate response: "I am sorry, I don't know how to discuss this topic. You can try learning more about it on bing.com". 

Microsoft said the chatbot was likely to produce odd responses in long chat sessions because it gets confused about which questions it ought to be answering. "The model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn't intend," it said.

Redmond is looking to add a tool that will allow users to refresh conversations and start them from scratch if the bot starts going awry. Developers will also work on fixing bugs that cause the chatbot to load slowly or generate broken links.

Comment: Until BingBot stops lying, it's not fit for purpose

None of Microsoft's planned repairs will overcome Bing's main issue: it generates false information.

Never mind that it's amusingly weird: nothing it says can be trusted.

Microsoft itself seems confused about the trustworthiness of the bot's utterances, warning it is "not a replacement or substitute for the search engine, rather a tool to better understand and make sense of the world" but also claiming it will "deliver better search results, more complete answers to your questions, a new chat experience to better discover and refine your search".

The demo launch of Bing, however, showed it could not accurately summarize information from webpages or financial reports.

Microsoft CEO Satya Nadella has nonetheless expressed hope that the bot will see Bing dent Google's dominance in search and associated ad revenue, by providing answers to queries instead of a list of relevant websites. 

But using it for search may be unsatisfactory if the latest examples of the BingBot's rants and wrongheadedness persist. At the moment, Microsoft is riding a wave of AI hype with a tool that works just well enough to keep people fascinated; they can't resist interacting with the funny and wacky new internet toy. 

Despite its shortcomings, Microsoft said users have requested more features and capabilities for the new Bing, such as booking flights or sending emails.

Rolling out a chatbot like this will certainly change the way netizens interact, but not for the better if the tech can't sort fact from fiction. Netizens, however, are still drawn to these tools even though they're imperfect, and that's a win for Microsoft. ®


https://www.theregister.com//2023/02/17/microsoft_ai_bing_problems/
