A.I. Is Making the Sexual Exploitation of Girls Even Worse

Kat Tenbarge and Liz Kreutz of NBC News reported that several middle schoolers in Beverly Hills, Calif., were caught making and distributing fake naked photos of their peers: “School officials at Beverly Vista Middle School were made aware of the ‘A.I.-generated nude photos’ of students last week, the district superintendent said in a letter to parents. The superintendent told NBC News the photos included students’ faces superimposed onto nude bodies.”

I had heard about this kind of thing happening to high school girls, which is horrible enough. But the idea of such young children being dehumanized by their classmates, humiliated and sexualized in one of the places they’re supposed to feel safe, and knowing those images could be indelible and worldwide, turned my stomach.

I’m not a technophobe and have, in the past, been somewhat skeptical about the outsize negative impact of social media on teen girls. And while I still think the subject is complicated, and that the research doesn’t always conclude that there are unfavorable mental health effects of social media use on all groups of young people, the increasing reach of artificial intelligence adds a new wrinkle that has the potential to cause all sorts of damage. The possibilities are especially frightening when the technology is used by teens and tweens, groups with notoriously iffy judgment about the permanence of their actions.

I have to admit that my gut reaction to the Beverly Hills story was rage — I wanted the book thrown at the kids who made those fakes. But I wanted to hear from someone with more experience talking to teens and thinking deeply about the adolescent relationship with privacy and technology. So I called Devorah Heitner, the author of “Growing Up in Public: Coming of Age in a Digital World,” to help me step back a bit from my punitive fury.

Heitner pointed out that although artificial intelligence adds a new dimension, kids have been passing around digital sexual images without consent for years. According to a 2018 meta-analysis from JAMA Pediatrics, among children in the 12 to 17 age range, “The prevalence of forwarding a sext without consent was 12.0 percent,” and “the prevalence of having a sext forwarded without consent was 8.4 percent.”

In her book, Heitner offers an example in which an eighth-grade girl sent a topless photo to her boyfriend, who circulated it to his friends without her permission. After they broke up, and without her knowledge, “her picture kept circulating, passing from classmate to classmate throughout their middle school,” and then “one afternoon, she opened her school email to find a video with her image with sound effects from a porn video playing with it.”

That kind of situation is already sickening, but the creation of fake nude images adds another layer of transgression. In the Beverly Hills case, according to NBC News, not only did middle schoolers sexualize their peers without consent by creating the fakes, they also shared the images, which can only compound the pain.

“If you’re creating an image of someone else and doing it without their consent,” Heitner told me, “whether it’s real or fake, you are violating that person and violating their privacy, violating their safety.” In these situations, she said, girls may feel that their sense of social acceptance has been lost. They may feel a sense of torturous humiliation from not knowing who among their peers has seen these types of images and who hasn’t. In her book, Heitner describes situations in which girls stop going to school altogether.

But Heitner also cautioned against over-punishing the perpetrators when they are younger children. “It’s important to understand that a 12- or 13-year-old is developmentally different than an adult,” she said. While it may be appropriate to suspend that child or move them to a different school if their victims no longer want to be around them, they shouldn’t be indefinitely barred from all participation in school or cast out of society. They are redeemable; they can make amends and become adults who know better. (It should be noted that in the Beverly Hills case, according to NBC News, the superintendent of schools said that the students responsible could face suspension to expulsion, depending on how involved they were in creating and sharing the images.)

Kids need to be better educated, starting in elementary school, about technology and consent before things like this happen. If you think grammar school is too young to learn about such things, remember that these days it’s typical for kids to get their own cellphones at around 11 or 12, and many kids even younger than that have access to a family iPad with image creation and sharing capabilities. As Heitner writes in her book:

“Teach your child the importance of never sharing an explicit message or photograph of another person — especially without that person’s consent. Explain to them that regardless of how they came across the explicit image or message, passing it on to someone else is unethical, perpetuates that person’s violation, and is very likely illegal in their state (especially if the image is of a minor).”

The relevant laws apply most directly to real photos, though. In some states, A.I.-generated nudes exist in more of a legal gray area. There is no federal law that protects victims of deepfakes, and, according to reporting by Tenbarge and Melissa Chan, “Politicians and legal experts say there are few, if any, pathways to recourse for victims of A.I.-generated and deepfake pornography” — almost all of whom are women, according to a 2019 study. School districts and our legal system need to move quickly to come up with policies that deal with these issues, because they are not going away and they are only going to become more pervasive as technology evolves and proliferates.

Heitner also emphasized the importance of getting to the root of this kind of behavior. “We actually need to lean into teaching kids about empathy and respecting one another’s humanity,” she said, and also look at “the misogyny and homophobia in society that seems to be giving these kids license to bully along these very sort of gendered lines and police one another’s bodies.”

I regularly hear from people who say they’re perplexed that young women still feel so disempowered, given the fact that they’re earning the majority of college degrees and doing better than their male counterparts by several metrics. At a certain level, it’s not that complicated: Girls frequently feel less-than because they know that some of their peers have the impression that they’re allowed to be thoughtlessly degrading. And further, they know that a portion of society values them only as objects. They walk through the world with that weight on their shoulders, and it’s up to all of us to help lift it.
