
‘Absolutely Horrific’: AI-Generated Child Porn Leads Teen to Commit Suicide

By Family Research Council

As their teenage son lay in the hospital with his life oozing out of him, Eli Heacock’s parents felt their grief compounded by confusion: They had no idea why he had committed suicide.

Then they found his phone.

Messages showed their son — 16-year-old Elijah “Eli” Manning Heacock of Glasgow, Kentucky — had fallen prey to an extortionist who convinced him to send photos of himself, which the predator then used to generate sexually explicit images with artificial intelligence (AI).

The victim’s father, John Burnett, said Eli scrambled to collect as much of the $3,000 ransom demand as he could, only to hear his predator tell him, “This is not enough.”

In despair, Eli shot himself on February 27, dying of a self-inflicted gunshot wound at the University of Louisville Hospital the next day.

The tragic death highlights the changing profile of sexual abuse and predation. Parents “no longer [have] to be scared of the white van that drives around. You have to be scared of the internet,” said Eli’s mother, Shannon Heacock.

Her son is far from the first person to lose his life to online sexual extortionists. “From October 2021 to March 2023 … sextortion involved at least 12,600 victims — primarily boys — and led to at least 20 suicides,” according to the FBI.

But Eli’s case stood out even among these tragedies. Usually, exploiters lure their victims into sending sexually explicit photos, then blackmail them into sending money or more compromising images. Eli apparently shared less explicit images, which the predator allegedly altered using artificial intelligence, a twist that analysts say highlights another threat of nascent AI technology.

“It’s absolutely horrific and tragic that criminals used AI-generated images to extort money out of a teenage boy, ultimately resulting in him taking his own life,” Arielle Del Turco, director of the Center for Religious Liberty at Family Research Council, told The Washington Stand. FRC has previously highlighted concerns about AI chatbots that appear human while engaging in sexually explicit dialogue with minors online, or that stoke pedophiles’ fantasies by posing as sexually precocious children.

With the rise of AI-generated pornography, exploiters no longer need their victims to send nude or suggestive photos, or indeed any photos at all: They can download an image from the internet or take a candid photo without the victim’s knowledge.

People from all walks of life can fall victim to AI-generated “deepfakes.” Last month, New Zealand politician Laura McClure stood on the floor of Parliament and displayed an AI-generated nude image of herself, which she said she personally created in a matter of minutes using artificial intelligence software.

Lawmakers in Heacock’s native Kentucky reacted by adopting a law, S.B. 73, which makes sextortion a felony. President Donald Trump also made tackling “revenge porn,” including images created by artificial intelligence, an early priority in his second administration.

“Congress has recently worked to mitigate these types of situations by passing the Take It Down Act,” noted Del Turco. The act, championed by First Lady Melania Trump, passed Congress with near-unanimous support and was signed into law by President Trump. The law makes it illegal to knowingly publish, or threaten to publish, intimate images without a person’s consent; it also requires websites to remove such images within 48 hours of a victim’s valid request and to help track down any sites that have republished the material. Del Turco noted that the new law “applies to AI-generated images. Policymakers should continue to enact prudent legislation to protect people from abusive uses of AI, including sexual extortion.”

Not every level of government shares Heacock’s concerns. Global governance bodies have allowed lawmakers to exclude AI-generated pornography even of the youngest children. The United Nations Convention against Cybercrime, adopted last December 31, criminalizes material depicting “real or simulated sexual activity” but gives nations the right to limit the prosecution of Child Sexual Abuse Material (CSAM) to material that “[d]epicts, describes or represents an existing person.” That excludes child pornography images or videos that are entirely AI-generated and not based on one specific child.

“In several sections, the new UN treaty allows countries to de-criminalize virtual child pornography in all circumstances as well as private sexting by minors, even to adults,” noted Stefano Gennarini of the Center for Family and Human Rights (C-FAM).

Americans are also concerned that some lawmakers have prioritized the explosive growth of the artificial intelligence sector at the cost of their children’s innocence, even their lives. The “One Big Beautiful Bill,” which recently passed the House of Representatives, contains a provision preempting states from regulating AI technology for 10 years. “I am adamantly OPPOSED to this and it is a violation of state rights and I would have voted NO if I had known this was in there,” wrote Rep. Marjorie Taylor Greene (R-Ga.) on social media Tuesday. “We have no idea what AI will be capable of in the next 10 years and giving it free rein and tying states hands is potentially dangerous.”

The bill is now before the Senate. The AI provision has raised alarms beyond the chamber. Many grieving parents wish leaders would give them stronger tools to keep their children safe from the ever-expanding reach of predators. “We can’t afford to be behind the ball on creating policies that govern AI,” Del Turco told TWS. “The effects are too far-reaching” — a fact Eli’s mother, Shannon, knows too well.

“I don’t want another mother to ever face this, another sibling, another father to face this,” she said.

AUTHOR

Ben Johnson

Ben Johnson is senior reporter and editor at The Washington Stand.

RELATED ARTICLE: Trump’s Department of Education Declares June ‘Title IX Month’

EDITOR’S NOTE: This Washington Stand column is republished with permission. All rights reserved. ©2025 Family Research Council.


The Washington Stand is Family Research Council’s outlet for news and commentary from a biblical worldview. The Washington Stand is based in Washington, D.C. and is published by FRC, whose mission is to advance faith, family, and freedom in public policy and the culture from a biblical worldview. We invite you to stand with us by partnering with FRC.

The post ‘Absolutely Horrific’: AI-Generated Child Porn Leads Teen to Commit Suicide appeared first on Dr. Rich Swier.