NEED TO KNOW
- Former child star Mara Wilson is warning about the dangers of generative AI — specifically the trend of users employing the technology to produce nude images of real women and children
- Wilson, 38, revealed that her image was used to create child sexual abuse material (CSAM) online for years
- Wilson shared her thoughts, as well as posed potential solutions to the threat, in a Jan. 17 essay for The Guardian
Former child star Mara Wilson is opening up about the “nightmare” of having her image used for child sexual abuse material.
The Matilda actress, 38, wrote an essay for The Guardian, published on Saturday, Jan. 17, in which she warned of the dangers of generative AI — specifically a recent trend in which people use AI to create nude and explicit images of real women and children.
Wilson drew from her own experiences as a child star to emphasize just how dangerous and damaging this new technological trend can be.
“From ages 5 to 13, I was a child actor. And while as of late we’ve heard many horror stories about the abusive things that happened to child actors behind the scenes, I always felt safe while filming,” she began.
However, the Mrs. Doubtfire actress said that the part of her career that did feel dangerous was her relationship with the public, noting that her image was used online for child sexual abuse material (CSAM) before she was “even in high school.”
“I’d been featured on fetish websites and Photoshopped into pornography. Grown men sent me creepy letters. I wasn’t a beautiful girl — my awkward age lasted from about age 10 to about 25 — and I acted almost exclusively in family-friendly movies. But I was a public figure, so I was accessible. That’s what child sexual predators look for: access. And nothing made me more accessible than the internet,” she continued.
“It didn’t matter that those images ‘weren’t me,’ or that the fetish sites were ‘technically’ legal. It was a painful, violating experience; a living nightmare I hoped no other child would have to go through,” she added.
Wilson, now a writer and mental health activist, went on to say that she fears that sexually exploitative AI trends are putting all women and children at risk, regardless of whether or not they are public figures.
“It is now infinitely easier for any child whose face has been posted on the internet to be sexually exploited. Millions of children could be forced to live my same nightmare,” she explained.
Wilson ended her essay by imploring readers to use their collective power to shape the way tech companies approach generative AI. She said this can be done in part by boycotting companies that permit their AI to create exploitative sexual images, though she believes readers must go a step further.
“We need to be the ones demanding companies that allow the creation of CSAM be held accountable. We need to be demanding legislation and technological safeguards,” she wrote.
“We also need to examine our own actions: nobody wants to think that if they share photos of their child, those images could end up in CSAM. But it is a risk, one that parents need to protect their young children from, and warn their older children about,” she added.