A computer. (Tianyi Ma via Unsplash)

As artificial intelligence expands its reach into more aspects of people's lives, Colorado lawmakers want to ensure the state's policies around online harassment and child pornography are keeping pace.


This story was produced as part of the Colorado Capitol News Alliance. It first appeared at cpr.org.

"In a modern world where our identities and our selves exist far beyond our physical bodies and are vulnerable to attack on media, on social media, in text messages… we have to adapt to protect the people in these spaces," Jessica Dotter, sexual assault resource prosecutor with Colorado's District Attorneys Council, told the Senate Judiciary Committee earlier this week.

She was there to testify in favor of Senate Bill 288. The legislation, sponsored by Majority Leader Robert Rodriguez, a Denver Democrat, would expand the existing laws around posting intimate images to include material created by AI.

Senate Majority Leader Robert Rodriguez, D-Denver, speaks at a pro-union rally on the west steps of the Colorado Capitol on Wednesday, March 19, 2025. (Jesse Paul, The Colorado Sun)

The issue exploded into the public conversation last winter, when deepfakes of megastar Taylor Swift emerged online. But the creation of false sexual pictures and videos has torn apart the lives of far less famous people, including a city commissioner in Florida, a teacher in Texas and high school students nationwide.

โ€œLet’s be clear: These images are not harmless. They’re not virtual, they’re violations,โ€ testified Will Braunstein, the head of the Denver Childrenโ€™s Advocacy Center.

According to backers, Colorado is late to tackle the issue. While obscene deepfakes are a relatively new problem, states have been moving quickly to criminalize the practice when it's used to harm someone. Thirty-eight states have made it illegal to use AI to create child pornography, something Colorado's bill would also do.

That provision is raising concerns with both the ACLU of Colorado and criminal defense attorneys, who question the ethics of prosecuting people for possessing or distributing material entirely created by a computer.

โ€œWhen we talk about someone under the age of 18 and it’s not tied to a real person, how do you put an age on that?โ€ James Karbach with the state’s Public Defenders Office asked lawmakers. โ€œHow do you draw lines around that? And if it’s to be criminalized, should you criminalize it the same as something that harms and depicts an actual human?โ€


Bill supporters countered that AI is trained on real images with real victims. And they note that as the technology improves, it's becoming increasingly difficult to separate deepfakes from real images of abuse, which in their view makes it all the more important to treat both the same under the law.

The debate over abusive deepfakes comes as Colorado lawmakers are generally trying to get their arms around emerging technology and its potential impacts on people's lives. Rodriguez is working behind the scenes on even more sweeping legislation: a first-of-its-kind effort to put guardrails around how companies use AI to make consequential decisions about people.

That bill has yet to be unveiled.

In the meantime, Rodriguez's exploitative content bill passed its first committee unanimously and is awaiting a vote of the full Senate.


Megan Verlee is the public affairs editor at Colorado Public Radio.