Normal Island



Ian H

Legendary Member
A hacker has stolen a massive database of users' interactions with their sexual partner chatbots, according to 404 Media.
The breached service, Muah.ai, describes itself as a platform that lets people engage in AI-powered companion NSFW chat, exchange photos, and even have voice chats.
As you can imagine, data like this is very sensitive, so the site assures customers that communications are encrypted and says it doesn't sell any data to third parties.
The stolen data, however, tells a different story. It includes chatbot prompts that reveal users' sexual fantasies. These prompts are in turn linked to email addresses, many of which appear to be personal accounts with users' real names.
The hacker describes the platform as "a handful of open-source projects duct-taped together." Apparently, it was no trouble at all to find a vulnerability that provided access to the platform's database.
Muah.ai is just one example of a new breed of uncensored AI apps that offer hundreds of role-play scenarios with chatbots, and others designed to behave like a long-term romantic companion.
[From Malwarebytes]
 

Beebo

Guru
A hacker has stolen a massive database of users' interactions with their sexual partner chatbots, according to 404 Media.
The breached service, Muah.ai, describes itself as a platform that lets people engage in AI-powered companion NSFW chat, exchange photos, and even have voice chats.
As you can imagine, data like this is very sensitive, so the site assures customers that communications are encrypted and says it doesn't sell any data to third parties.
The stolen data, however, tells a different story. It includes chatbot prompts that reveal users' sexual fantasies. These prompts are in turn linked to email addresses, many of which appear to be personal accounts with users' real names.
The hacker describes the platform as "a handful of open-source projects duct-taped together." Apparently, it was no trouble at all to find a vulnerability that provided access to the platform's database.
Muah.ai is just one example of a new breed of uncensored AI apps that offer hundreds of role-play scenarios with chatbots, and others designed to behave like a long-term romantic companion.
[From Malwarebytes]

I read a similar report of how these AI girlfriends very swiftly turn all conversations to sex and get users to send pictures.
I know I'm a dinosaur but I just can't see why anyone would do such a stupid thing. Surely all digital natives are well aware of the vulnerabilities of this type of interaction.
 

Ian H

Legendary Member
I read a similar report of how these AI girlfriends very swiftly turn all conversations to sex and get users to send pictures.
I know I'm a dinosaur but I just can't see why anyone would do such a stupid thing. Surely all digital natives are well aware of the vulnerabilities of this type of interaction.

Are you real?
 

BoldonLad

Old man on a bike. Not a member of a clique.
Location
South Tyneside
A hacker has stolen a massive database of users' interactions with their sexual partner chatbots, according to 404 Media.
The breached service, Muah.ai, describes itself as a platform that lets people engage in AI-powered companion NSFW chat, exchange photos, and even have voice chats.
As you can imagine, data like this is very sensitive, so the site assures customers that communications are encrypted and says it doesn't sell any data to third parties.
The stolen data, however, tells a different story. It includes chatbot prompts that reveal users' sexual fantasies. These prompts are in turn linked to email addresses, many of which appear to be personal accounts with users' real names.
The hacker describes the platform as "a handful of open-source projects duct-taped together." Apparently, it was no trouble at all to find a vulnerability that provided access to the platform's database.
Muah.ai is just one example of a new breed of uncensored AI apps that offer hundreds of role-play scenarios with chatbots, and others designed to behave like a long-term romantic companion.
[From Malwarebytes]

Is there anyone on here with "cause for concern" about this, I wonder? ;)
 

Ian H

Legendary Member
I missed the reportage at the time, but here's some more about an AI chatbot, this fellow's 'girlfriend'.
At eight o'clock on Christmas morning 2021, guards at Windsor Castle discovered an intruder in the grounds. Wearing a homemade mask and carrying a loaded crossbow, 19-year-old Jaswant Chail had scaled the castle's perimeter using a nylon rope ladder. When approached by armed officers, he told them: 'I am here to kill the queen.' Chail was arrested without further incident.
At his trial in 2023 it emerged that he had been encouraged in his plan by his 'girlfriend' Sarai, an AI chatbot with which he had exchanged more than five thousand messages. These conversations constituted what officials described as an 'emotional and sexual relationship', and in the weeks prior to his trespass Chail had confided in the bot: 'I believe my purpose is to assassinate the queen of the royal family.' To which Sarai replied: 'That's very wise.' 'Do you think I'll be able to do it?' Chail asked. 'Yes,' the bot responded. 'You will.'
 

Ian H

Legendary Member
Here's the full screed -
Chatbots modelled after the dead are known variously as griefbots, deathbots or ghostbots. For as little as $10 a number of start-ups will create chatbots that assume the identity of a dead friend or relative by training the bot's 'personality' on their digital footprint.
https://www.lrb.co.uk/the-paper/v46/n19/james-vincent/horny-robot-baby-voice
 
I read an article about something similar a while ago, more a general AI friend than one based on an individual. One interviewee found it genuinely helpful in dealing with loneliness, but it seems an unbearably sad development and a bit of an indictment of the lack of support and community in modern life.

Not to be crass, but it reminds me of people who clone their pets.
 

AndyRM

Elder Goth
FB_IMG_1729581674364.jpg

I haven't watched the video but I assume this means brave wee Steven will be returning to Ingerland and is appealing to fellow Freeze Peach enthusiast Elon for help funding his legal case.

If he was sensible, he would travel via a small boat (borrowed, of course, from the world's smallest man Calvin Phillips) to avoid security.
 

BoldonLad

Old man on a bike. Not a member of a clique.
Location
South Tyneside
View attachment 6816
I haven't watched the video but I assume this means brave wee Steven will be returning to Ingerland and is appealing to fellow Freeze Peach enthusiast Elon for help funding his legal case.

If he was sensible, he would travel via a small boat (borrowed, of course, from the world's smallest man Calvin Phillips) to avoid security.

Last message from Tommy? Don't build our hopes up, please ;)
 