Laura Bates has spent more than a decade documenting the quiet, everyday ways sexism shapes women’s lives. In 2012, she launched The Everyday Sexism Project, a website where women around the world could log experiences that might otherwise be dismissed—being talked over in meetings, harassed on the bus, asked about their clothes in professional settings. It was a small idea that turned into a global archive, and eventually a book.

Now, the British feminist author and activist is back with a new warning: sexism hasn’t disappeared, it has just gone digital. Her latest book, The New Age of Sexism: How AI and Emerging Technologies Are Reinventing Misogyny, published this week by Sourcebooks, charts the way biases are being replicated—and often magnified—in online platforms, artificial intelligence, and the metaverse.

Bates has lived the story she’s writing about. She has herself been targeted by deepfake pornography, the growing scourge of nonconsensual sexual imagery created with AI tools. In interviews, she describes how these technologies make it “hugely realistic” and alarmingly easy for anyone—of any age—to create pornographic images of women using nothing more than ordinary pictures scraped from social media. That ease of access, she argues, is lowering the bar for abuse in ways society has not yet reckoned with.

The book is both a study and an alarm bell. Bates interviewed tech developers, women who had been victimized, and even used chatbots and sexbots herself to understand how they operate. What she found was familiar: the same misogynistic assumptions that drive catcalls on the street or harassment in offices are now being baked into the design of artificial intelligence. A virtual assistant that’s coded to sound submissive, or an AI girlfriend that never says no, isn’t just a novelty. It’s a reinforcement of patterns that teach boys and men what to expect from women in the real world.

“I know people will think I sound like a pearl-clutching feminist,” she told WIRED. “But if you look at the top of the big tech companies, men at those levels are saying the same thing.” She points to former OpenAI researcher Jan Leike, who resigned last year after warning the company was prioritizing “shiny products” over safety. The issue, Bates argues, is not fringe; it is recognized even inside the institutions building the technology.

Her concerns extend beyond harassment. Bates highlights how AI reflects the values and biases of the people who build it—and how, without oversight, those biases calcify into code.

In one especially chilling example, she describes teenage boys customizing AI girlfriends with physical traits, personalities, and names—then interacting with avatars programmed to be endlessly compliant, even in scenarios involving violence.

The urgency, she says, comes from timing. “We’re on the edge of a precipice,” she writes. These technologies are not just new; they’re being embedded into the foundations of everyday life while still largely unregulated. And even in the short time since she finished the book, the number of stories about AI-related harms has exploded.

Bates’ project is not fearmongering. It is a reminder that technology doesn’t exist outside culture—it carries culture with it. And unless society confronts the ways misogyny mutates inside these new systems, the promise of innovation could just as easily become another way to replicate the oldest of biases, dressed up as the future.
