Microsoft Bing AI Chat has been facing quality issues for a while now. (Image: Wikimedia Commons)

Microsoft's well-received AI search bot, Bing Chat, has been facing quality issues lately. Users have reportedly noticed a decline in the chatbot's performance over the past few weeks.

Frequent users of the Microsoft Edge browser's Compose box claim it has become less useful. The Bing Chat-backed Edge Compose reportedly either fails to help with a query or dodges questions altogether.

Still, Microsoft is gearing up to bring its Bing AI chat to third-party browsers like Google Chrome and Apple Safari on desktops and mobile devices. In the meantime, Microsoft officials have told Windows Latest that the company is sparing no effort to address the concerns.

Some users have taken to Reddit to share their experiences. One pointed out that the Edge browser's once-useful Compose tool has lately become completely unreliable. For those unaware, the Compose tool is available in the Edge browser's Bing sidebar.

While Microsoft claims its Edge browser will soon let users rewrite text using Bing AI, the AI makes bizarre excuses when a user requests creative content or asks it to write a funny tongue twister. The AI also avoids creating humorous content about fictional characters.

According to the AI bot, discussing creative topics in a certain way can turn out to be inappropriate. Likewise, it suggests humor could be problematic even when the subject is an inanimate object. Another Redditor shared their experience using Bing AI chat to proofread text written in a non-native language.

Rather than answering the question, Bing produced a list of alternative tools, essentially urging the user to "figure out" a solution on their own. Frustrated, the user downvoted all of the responses and started a new conversation, at which point the AI reverted to its helpful self.

"I've been relying on Bing to proofread emails I draft in my third language. But just today, instead of helping, it directed me to a list of other tools, essentially telling me to figure it out on my own. When I responded by downvoting all its replies and initiating a new conversation, it finally obliged," the user wrote.

Microsoft acknowledges the situation

In its statement to Windows Latest, Microsoft's spokesperson said the company is always monitoring feedback from testers. Furthermore, the spokesperson assured that users can expect a better experience in the future.

"We actively monitor user feedback and reported concerns, and as we get more insights through the preview, we will be able to apply those learnings to further improve the experience over time," a Microsoft spokesperson said in an email.

The word on the street is that Microsoft has been quietly adjusting Bing Chat's settings behind the scenes. Meanwhile, users say the behavior is hard to fathom: AI is, after all, simply a tool, and the idea that Bing could take offense at a prompt is baffling.

Moreover, this behavior could fuel misconceptions about AI, especially among skeptics who believe AI lacks substance. Bing AI's rivals, Google's Bard and OpenAI's ChatGPT, are also reportedly getting gradually worse, but are they as bad as Microsoft's Bing AI bot?

Is Bing Chat better than Google Bard and ChatGPT?

  • Information: WinBuzzer's Luke Jones says Bing Chat provides irrelevant information when he asks the AI to summarize a Wikipedia page. Moreover, the tool gets small details wrong and makes up quotes.
  • More mistakes: When Jones flagged these mistakes, Bing Chat sometimes argued about them and tried to justify them with misinformation.
  • Mixing up sources: Bing appears to start treating its own written summary as the original source. For instance, the AI tool will claim the quotes it just made up are genuine, and will even provide dead links to back itself up.
  • Ending conversations: The AI will instantly end the conversation when a user asks why Bing Chat struggles with a specific task. So it is safe to say Bing Chat AI doesn't surpass other AI bots, including Bard and ChatGPT, in terms of performance.