Using Siri for ‘sexually explicit’ chats? It’s happening… a lot

Lonely men are increasingly turning to virtual chat assistants like Siri for ‘sexually explicit’ chats, according to a number of experts in the tech community.

From teenagers to “truckers,” men without partners are also developing feelings for female-voiced chatbots such as Apple’s Siri.

Ilya Eckstein, the chief executive of Robin Labs, noted that his company’s virtual assistant, Robin, which was designed to give traffic advice and directions to drivers, had been used by some men for up to 300 conversations a day.

“This happens because people are lonely and bored…. It is a symptom of our society,” Eckstein told The Times. “As well as the people who want to talk dirty, there are men who want a deeper sort of relationship or companionship.”

Eckstein also noted that some users engage with their virtual assistants in unusual ways that may be just for laughs, though there may also be “something deeper underneath the surface.”

“People want to flirt, they want to dream about a subservient girlfriend, or even a sexual slave,” Eckstein told Quartz.

Overall, Eckstein estimates that five percent of the conversations men have with his company’s virtual assistant are sexually explicit. He also believes roughly one in three conversations takes place for no particular reason, with many male users simply wanting to chat.

There is evidence that men use other female-voiced assistants, including Apple’s Siri, Amazon’s Alexa, and Microsoft’s Cortana, in similar ways.

According to Deborah Harrison, a writer for Microsoft’s Cortana, “a good chunk of early queries were about her sex life.”

