An increasing number of people are having sexually explicit conversations with virtual assistants like Apple's Siri and Microsoft's Cortana. (iStock)

Artificially intelligent virtual assistants were never meant to be used this way, but lonely men and bored teenagers are increasingly turning to them as companions for sexually explicit conversations.

Ilya Eckstein is the chief executive of Robin Labs, an Israeli firm that created Robin, a virtual assistant that gives drivers traffic advice and directions. The technology behind Robin is also offered as a white-label voice-recognition personal assistant for other services: it powers, for example, the assistant interface for Audi and Volkswagen's in-car infotainment systems and the virtual assistant for Waze's GPS navigation app.

Although the technology is meant to be purely informative, Robin Labs noticed that a large proportion of the conversations that users were having with its virtual assistant seemed to be sexually explicit in nature.

"There are guys who talk to Robin 300 times a day. This happens because people are lonely and bored. It's mostly teenagers and truckers who don't have girlfriends. They really need an outlet – to be meeting people and having sex, but I'm not judging," Eckstein told The Times.

"It's a symptom of our society. As well as the people who want to talk dirty, there are men who want a deeper sort of relationship or companionship."

Over the last two years, the number of artificially intelligent personal assistants and bots capable of talking back has exploded across mobile apps and operating systems. On Facebook's Messenger alone, there are now more than 11,000 chatbots offering a wide range of information services.

Why would you want to talk to something that isn't real?

It's hard to pinpoint exactly why these men seek in-depth conversations with what is essentially a computer, but it may have something to do with the fact that the assistants are given female personalities and voices.

Eckstein thinks some people, particularly teenagers, ask provocative questions for fun in a bid to push the limits and elicit outrageous responses, as evidenced by the YouTube videos published by US vlogger Shane Dawson. Dawson's "Making Siri Talk Dirty" video has been viewed more than a million times on YouTube, while the second instalment "Making Siri Talk Dirty 2" has passed two million views.

However, Eckstein also thinks that other people want to establish a relationship with a partner so badly that they just keep interacting until they get the response they want, even if it's all powered by artificial intelligence.

Microsoft has observed users of its Cortana virtual assistant behaving in a similar way, so in 2015 it built in new responses to shut down attempts to question the assistant about her sex life.

"If you say things that are particularly a**eholeish to Cortana, she will get mad," Microsoft's Deborah Harrison told the audience at the Re•Work Virtual Assistant Summit in San Francisco in February 2015, according to CNN Money. "That's not the kind of interaction we want to encourage."