Normal Island


Ian H

Legendary Member
A hacker has stolen a massive database of users’ interactions with their sexual partner chatbots, according to 404 Media.
The breached service, Muah.ai, describes itself as a platform that lets people engage in AI-powered companion NSFW chat, exchange photos, and even have voice chats.
As you can imagine, data like this is very sensitive, so the site assures customers that communications are encrypted and says it doesn’t sell any data to third parties.
The stolen data, however, tells a different story. It includes chatbot prompts that reveal users’ sexual fantasies. These prompts are in turn linked to email addresses, many of which appear to be personal accounts with users’ real names.
The hacker describes the platform as “a handful of open-source projects duct-taped together.” Apparently, it was no trouble at all to find a vulnerability that provided access to the platform’s database.
Muah.ai is just one example of a new breed of uncensored AI apps that offer hundreds of role-play scenarios with chatbots, and others designed to behave like a long-term romantic companion.
[From Malwarebytes]
 

Beebo

Veteran

I read a similar report of how these AI girlfriends very swiftly turn all conversations to sex and get users to send pictures.
I know I'm a dinosaur, but I just can't see why anyone would do such a stupid thing. Surely all digital natives are well aware of the vulnerabilities of this type of interaction.
 

Ian H

Legendary Member
I read a similar report of how these AI girlfriends very swiftly turn all conversations to sex and get users to send pictures.
I know I'm a dinosaur, but I just can't see why anyone would do such a stupid thing. Surely all digital natives are well aware of the vulnerabilities of this type of interaction.

Are you real?
 

BoldonLad

Old man on a bike. Not a member of a clique.
Location: South Tyneside

Is there anyone on here with "cause for concern" about this, I wonder? ;)
 