Microsoft's Bing Image Creator is facing content violation issues. (Image: Wikimedia Commons)

A considerable number of Bing Image Creator users have reportedly been receiving erroneous content violation warnings lately, even for perfectly safe prompts.

Despite using seemingly harmless phrases like "a cat with a cowboy hat and boots" and "man breaks server rack with a sledgehammer", some users received content violation messages.

Bing Image Creator blocks certain words, but acknowledges that "even safe content can be blocked by mistake". The tool urges users to check its content policy to see how they can improve their prompts.

The folks at Windows Central first reported instances of erroneous content violation notifications. Later, various Reddit users shared evidence in the form of screenshots of the violation message.

Microsoft's response to the erroneous violation messages

Much to the relief of Bing Image Creator users, Microsoft has responded to the issue. Microsoft's Mikhail Parakhin acknowledged the problem and confirmed it is being investigated. "Hmm. This is weird – checking," Parakhin replied in a post on X.

The folks at Windows Central claim that Bing Image Creator's "surprise me" button, which automatically generates random images, is having its own output flagged. This is a strong sign that the problem is not a rare occurrence.

In fact, a report by The Indian Express suggests the chances of it happening are around 30 per cent. Apparently, Microsoft is still fine-tuning the tool.

Notably, Bing's image creation tool was upgraded to Dall-E 3 last month. Dall-E 3 is a significantly more powerful model, and according to Microsoft, the upgrade generated a lot of traffic and interest.

On the downside, the company warned that the tool might be sluggish initially. Aside from this, there is another issue with Dall-E 3: according to recent reports, the Redmond-based tech giant has considerably reined in the tool since the revamp.

The image creation tool uses a content moderation system that prevents it from generating inappropriate pictures. However, the filtering appears to be stricter than expected. Microsoft is probably reacting to the kind of content Bing AI users have been trying to get Bing Image Creator to produce.

For instance, there was considerable controversy over an image showing Mickey Mouse carrying out the 9/11 attack. The problem now is that some users report their harmless image requests are also being denied.

Interestingly, Windows Central pointed out last week that Bing AI was able to create violent zombie apocalypse scenarios featuring copyrighted characters without raising a complaint. As if that weren't enough, some users also managed to trick Bing Chat into solving CAPTCHAs around the same time.