The circulation of explicit and pornographic pictures of megastar Taylor Swift this week shined a light on artificial intelligence’s ability to create convincingly real, damaging – and fake – images.

But the concept is far from new: People have weaponized this type of technology against women and girls for years. And with the rise of AI tools and increased access to them, experts say it’s about to get a whole lot worse, for everyone from school-age children to adults.

Already, some high school students across the world, from New Jersey to Spain, have reported their faces were manipulated by AI and shared online by classmates. Meanwhile, a well-known young female Twitch streamer discovered her likeness was being used in a fake, explicit pornographic video that spread quickly throughout the gaming community.

“It’s not just celebrities,” said Danielle Citron, a professor at the University of Virginia School of Law. “It’s nurses, art and law students, teachers and journalists. We’ve seen stories about how this impacts high school students and people in the military. It affects everybody.”

But while the practice isn’t new, Swift being targeted could bring more attention to the growing issues around AI-generated imagery. Her enormous contingent of loyal “Swifties” expressed their outrage on social media this week, bringing the issue to the forefront.