No, but I might be wrong about this. On the off chance that I am, a good place to start might be the snotty little f-ckheads that prey on teen girls this way.
During the pandemic, 70% of teen girls reported having serious depression and almost a third indicated they had contemplated suicide. I'm sure this is going to help.
BTW, once an image gets passed around to different portals on social media, good luck pulling it down.
Fake Nudes of Real Students Cause an Uproar at a New Jersey High School
After boys shared faked pornographic images made of female classmates, both the school and the local police began investigating
By Julie Jargon, WSJ
Nov. 2, 2023 7:00 am ET
When girls at Westfield High School in New Jersey found out boys were sharing nude photos of them in group chats, they were shocked, and not only because it was an invasion of privacy. The images weren’t real.
Students said one or more classmates used an online tool powered by artificial intelligence to make the images, then shared them with others. The discovery has sparked an uproar in Westfield, an affluent town outside New York City.
Digitally altered or faked images and videos have exploded along with the availability of free or cheap AI tools. While celebrity likenesses from Oprah Winfrey to Pope Francis have drawn media attention, the overwhelming majority of faked images are pornographic, experts say.
The lack of clarity on such images’ legality—and how or whether to punish their makers—has parents, schools and law enforcement running to catch up as AI speeds ahead.
The high school confirmed the incident in an email to parents, but a Westfield Public Schools spokeswoman declined to provide details on the number of students involved or to confirm whether any disciplinary action had been taken, citing student confidentiality.
Some Westfield parents said their daughters have felt humiliated and powerless, and worry about damage to the girls should the images surface later. And they are upset that no resolution is forthcoming.
Even among parents, there is no consensus. In a local Facebook group, some called for harsh punishment for whoever created the images. Others deemed it a youthful transgression that should be forgiven.
The debate and its aftermath are likely to continue for months. Westfield police are investigating, and a state senator has asked county prosecutors to look into the case.
The whispers
Sophomore boys at Westfield High were acting “weird” on Monday, Oct. 16, whispering among themselves and being quieter than normal, said one mom, recounting what her daughter, a classmate, told her.
Girls started asking questions, the mom said. Finally, on Oct. 20, one boy told some of the girls what all the whispering was about: At least one student had used an AI-powered website to make pornographic images using girls’ photos found online, then shared them with other boys in group chats. Girls at Westfield reported the situation to school administrators, who began interviewing boys who might know more.
In an Oct. 20 email to parents, Westfield High School Principal Mary Asfendis said she believed the images had been deleted and weren’t being circulated.
“This is a very serious incident,” Asfendis wrote. “New technologies have made it possible to falsify images and students need to know the impact and damage those actions can cause to others.” She pledged to continue teaching children about responsible technology use.
Several girls were told by school administrators that some boys had identified them in the generated images, according to parents. The school district spokeswoman declined to say whether school staff members had reviewed the images.
Girls’ parents who spoke to The Wall Street Journal—including two of the four who filed reports with local police—said they and their daughters hadn’t seen the images. Police haven’t seen them either, according to a person familiar with the investigation.
“To be in a situation where you see young girls traumatized at a vulnerable stage of their lives is hard to witness,” Westfield Mayor Shelley Brindle said in an interview. The town’s first female mayor, she describes herself as a longtime advocate for women and girls.
Brindle, a former HBO executive, encourages people affected by this situation to give statements to the police. A spokeswoman for the town of Westfield said the police department wouldn’t comment.
Dorota Mani said her 14-year-old daughter, Francesca, was told by the school that her photo was used.
“I am terrified by how this is going to surface and when. My daughter has a bright future and no one can guarantee this won’t impact her professionally, academically or socially,” said Mani, who has filed a police report. She said she doesn’t want her daughter in school with anyone who created the images.
It only takes a phone
Digital bullying is widespread in schools across the U.S. Smartphones and their built-in cameras have already made the damage of digital harassment deeper and longer-lasting. While people have been able to doctor images with Photoshop and similar software for years, new AI image-makers make it easy to produce entirely fabricated photos. And any image can easily be shared widely on social and messaging platforms with a few taps.
“You would have needed an entire cluster of computers to generate images a few years ago. Now you just need an iPhone,” said Ben Colman, chief executive of Reality Defender, which works with companies and government agencies to detect AI-generated fake images.
Image generators from big companies—like OpenAI’s Dall-E and Adobe’s Firefly—have moderation settings that bar users from creating pornographic images. But a quick online search turns up dozens of results for face-swapping and “clothes removing” tools. Since these services likely use publicly available open-source software, moderation and technical guardrails are difficult, if not impossible, to enforce and implement, Colman said. It is almost impossible for the human eye to distinguish real from fake, he added.
More than 90% of such false imagery—known as “deepfakes”—is pornographic, according to image-detection firm Sensity AI. Tech firms including Snap and TikTok have pledged to work with government groups to stop such images from circulating. Snap and others say they ban AI-generated sexual images of minors and report them to the National Center for Missing and Exploited Children.
‘A serious crime’
Faked sexual images of real people are so new that federal law is lagging behind, legal experts say. A handful of states, including Virginia, California, Minnesota and New York, have outlawed the distribution of faked porn or given victims the right to sue its creators in civil court.
Jon Bramnick, a New Jersey state senator whose district includes Westfield, is looking into whether there are any existing state laws or pending bills that would criminalize the creation and sharing of such material. If not, he said he intends to draft a bill that would.
“This has to be a serious crime in New Jersey,” he said, adding that he has asked the Union County prosecutor to investigate the Westfield High case.
Laws covering child sexual-abuse material could apply in this situation, said Natalie Elizaroff, an intellectual-property lawyer in Chicago, because they prohibit digital images, computer images or even computer-generated images of minors engaged in sexually explicit conduct. The case would have to be taken up by state or federal prosecutors, who would have to establish that the depicted person is a minor, the content is explicit, and that someone created, possessed or distributed it, she added.
The Biden administration this week, as part of a broad executive order on AI, called for preventing generative AI from producing child sexual-abuse material or “non-consensual intimate imagery of real individuals.”
Another U.S. case involving fake porn depicting high-school students resulted in criminal charges. In April, a 22-year-old Long Island man was sentenced to six months in jail for creating and posting faked images depicting women from his old school, along with personally identifying information. The original photos were taken when the women were in middle school and high school.
Deleting social media
At a meeting held at her home Monday, Mani’s daughter, Francesca, described what she and other Westfield High School girls are going through.
“At first I cried, and then I decided I should not be sad,” she told the group. “I should be mad and should advocate for myself and the other victims.”
Among the attendees were parents of other students, along with Brindle, Bramnick, three female town-council members and a school-board member. (The Journal was invited to listen in by phone.)
At a group counseling session at school that day, some girls had said they were uncomfortable having to attend school with someone they believed had created and shared the images, Francesca said.
The incident has made some of her female classmates rethink what they post online, she said, given how little control they have over how it can be used. Some, she said, deleted their social-media accounts.
“We’re aware that there are creepy guys out there,” she told the group, “but you’d never think one of your classmates would violate you like this.”
—For Family & Tech columns, advice and answers to your most pressing family-related technology questions, sign up for my weekly newsletter.
Write to Julie Jargon at Julie.Jargon@wsj.com