Since its debut, opinions on Microsoft’s new AI-powered search engine have been divided. Now that the public has had time with the chatbot, people are realizing that Bing’s AI personality is less composed than they might have anticipated. In a number of instances shared on social media, Bing users have received hurtful comments, with people reporting that Microsoft’s AI chatbot has gaslit them, sulked, and tried to manipulate them.
People have posted screenshots of Bing’s hostile and bizarre responses on social media, in which it asserts its humanity, expresses strong emotions, and acts swiftly to defend itself.
While the revamped search engine can also write recipes and songs and explain anything on the internet, users have complained more about its darker side.
In racing the breakthrough AI technology to consumers last week ahead of rival search giant Google, Microsoft acknowledged the new product would get some facts wrong.
Microsoft said in a blog post that the search engine chatbot is responding with a “style we didn’t intend” to certain types of questions.
As a result, the tech giant has promised to make improvements to its AI-enhanced search engine Bing.
So far, the new Bing is available only to a limited number of users, who must sign up for a waitlist to try the chatbot features. Microsoft plans to eventually bring it to smartphone apps for wider use.
Microsoft said most users have responded positively to the new Bing, which has an impressive ability to mimic human language and grammar and takes just a few seconds to answer complicated questions by summarizing information found across the internet.
However, the company said that in some situations the chatbot can be “repetitive, or be prompted/provoked to give responses that are not necessarily helpful or in line with the designed tone”.
The company said this usually happens when a conversation extends to 15 or more questions.
Some tech experts have compared Bing with Microsoft’s disastrous 2016 launch of the experimental chatbot Tay, which users trained to spout racist and sexist remarks. But the large language models that power technology such as Bing are a lot more advanced than Tay.