Mother of Elon Musk’s Child Accuses Him of Allowing ‘Sexually Explicit Images’ of Minors: ‘Children Covered in Fluids’
Ashley St. Clair, the mother of Elon Musk’s infant son Romulus, accused the billionaire on Wednesday of allowing “sexually explicit images” of minors on his social media platform X, including AI-generated images of “children covered in fluids.”
During an interview with St. Clair on CNN’s Erin Burnett OutFront, host Erin Burnett asked, “Musk says Grok generated zero naked underage images, but you say it undressed you in several photos, including one from when you were fourteen years old?”
“That’s correct,” replied St. Clair. “And what he said is deceptive at best because while maybe there weren’t actual nude images, it was pretty close to it, and the images that I saw not only of myself, but of I don’t even know whose children who were undressing and covered in various fluids, the abuse was so widespread and so horrific, and it’s still allowed to happen. They just released restrictions that are based on where it’s illegal.”
Burnett then read out Musk’s statement defending his AI assistant, during which the billionaire claimed, “Grok does not spontaneously generate images, it does so only according to user requests. When asked to generate images, it will refuse to produce anything illegal.”
Reacting to the statement, St. Clair said, “That’s not what I saw at all. Images I saw do seem to be illegal, and even them coming out and now trying to place safeguards afterwards seems like an admission that they know that there has been an issue, that it has been creating nonconsensual sexually explicit images of women and children.”
“He is saying that people are making this up and meanwhile, Ireland is probing over 200 cases of child sexual abuse material produced by Grok – 200 – and that’s just Ireland,” she continued. “He is placing the blame on the victims so that if this happens to you, you have to go to your local enforcement and take their resources and see if they can find this anonymous account instead of just turning the faucet off.”
St. Clair concluded, “This is what’s wrong because they’re handing a loaded gun to these people, watching them shoot everyone, and then blaming them for pulling the trigger.”
“I obviously haven’t seen the images, but you’re talking about images of children covered in fluids. I don’t need to say any more,” Burnett responded. “That’s deeply, deeply disturbing, what you’re describing, to even hear that.”
Musk’s AI assistant confessed this month to generating “AI images depicting minors in minimal clothing,” telling a user that updates to its system were “ongoing” in an attempt “to block such requests entirely.”
In a statement on Wednesday, X Safety announced it would be taking action to prevent Grok from generating “images of real people in bikinis, underwear, and similar attire via the Grok account and in Grok in X in those jurisdictions where it’s illegal.”
“We remain committed to making X a safe platform for everyone and continue to have zero tolerance for any forms of child sexual exploitation, non-consensual nudity, and unwanted sexual content,” the statement added. “We take action to remove high-priority violative content, including Child Sexual Abuse Material (CSAM) and non-consensual nudity, taking appropriate action against accounts that violate our X Rules.”