Normal Island


Beebo

Veteran
 

Ian H

Legendary Member
A hacker has stolen a massive database of users’ interactions with their sexual partner chatbots, according to 404 Media.
The breached service, Muah.ai, describes itself as a platform that lets people engage in AI-powered companion NSFW chat, exchange photos, and even have voice chats.
As you can imagine, data like this is very sensitive, so the site assures customers that communications are encrypted and says it doesn’t sell any data to third parties.
The stolen data, however, tells a different story. It includes chatbot prompts that reveal users’ sexual fantasies. These prompts are in turn linked to email addresses, many of which appear to be personal accounts with users’ real names.
The hacker describes the platform as “a handful of open-source projects duct-taped together.” Apparently, it was no trouble at all to find a vulnerability that provided access to the platform’s database.
Muah.ai is just one example of a new breed of uncensored AI apps that offer hundreds of role-play scenarios with chatbots, and others designed to behave like a long-term romantic companion.
[From Malwarebytes]
 

Beebo

Veteran
A hacker has stolen a massive database of users’ interactions with their sexual partner chatbots, according to 404 Media. [...]
[From Malwarebytes]

I read a similar report of how these AI girlfriends very swiftly turn all conversations to sex and get users to send pictures.
I know I’m a dinosaur but I just can’t see why anyone would do such a stupid thing. Surely all digital natives are well aware of the vulnerabilities of this type of interaction.
 

Ian H

Legendary Member
I read a similar report of how these AI girlfriends very swiftly turn all conversations to sex and get users to send pictures.
I know I’m a dinosaur but I just can’t see why anyone would do such a stupid thing. Surely all digital natives are well aware of the vulnerabilities of this type of interaction.

Are you real?
 

BoldonLad

Old man on a bike. Not a member of a clique.
Location: South Tyneside
A hacker has stolen a massive database of users’ interactions with their sexual partner chatbots, according to 404 Media. [...]
[From Malwarebytes]

Is there anyone on here with "cause for concern" about this, I wonder? ;)
 

Ian H

Legendary Member
I missed the reportage at the time, but here's some more about an AI chatbot, this fellow's 'girlfriend'.
At eight o’clock on Christmas morning 2021, guards at Windsor Castle discovered an intruder in the grounds. Wearing a homemade mask and carrying a loaded crossbow, 19-year-old Jaswant Chail had scaled the castle’s perimeter using a nylon rope ladder. When approached by armed officers, he told them: ‘I am here to kill the queen.’ Chail was arrested without further incident.
At his trial in 2023 it emerged that he had been encouraged in his plan by his ‘girlfriend’ Sarai, an AI chatbot with which he had exchanged more than five thousand messages. These conversations constituted what officials described as an ‘emotional and sexual relationship’, and in the weeks prior to his trespass Chail had confided in the bot: ‘I believe my purpose is to assassinate the queen of the royal family.’ To which Sarai replied: ‘That’s very wise.’ ‘Do you think I’ll be able to do it?’ Chail asked. ‘Yes,’ the bot responded. ‘You will.’
 

Ian H

Legendary Member
Here's the full screed -
Chatbots modelled after the dead are known variously as griefbots, deathbots or ghostbots. For as little as $10 a number of start-ups will create chatbots that assume the identity of a dead friend or relative by training the bot’s ‘personality’ on their digital footprint …
https://www.lrb.co.uk/the-paper/v46/n19/james-vincent/horny-robot-baby-voice
 
I read an article about something similar a while ago, more a general AI friend than one based on an individual. One interviewee found it genuinely helpful in dealing with loneliness, but it seems an unbearably sad development and a bit of an indictment of the lack of support and community in modern life.

Not to be crass but it reminds me of people who clone their pets.
 

AndyRM

Elder Goth
[Image attachment: FB_IMG_1729581674364.jpg]

I haven't watched the video but I assume this means brave wee Steven will be returning to Ingerland and is appealing to fellow Freeze Peach enthusiast Elon for help funding his legal case.

If he was sensible, he would travel via a small boat (borrowed, of course, from the world's smallest man Calvin Phillips) to avoid security.
 

BoldonLad

Old man on a bike. Not a member of a clique.
Location: South Tyneside
[Image attachment]
I haven't watched the video but I assume this means brave wee Steven will be returning to Ingerland and is appealing to fellow Freeze Peach enthusiast Elon for help funding his legal case.

If he was sensible, he would travel via a small boat (borrowed, of course, from the world's smallest man Calvin Phillips) to avoid security.

Last message from Tommy? Don't build our hopes up, please ;)
 