Some of my earliest memories are of playing with Tonka trucks during local government meetings. My dad was our town’s treasurer, and he’d often take me with him to council and committee hearings.
Those experiences taught me two things. First, government meetings are long and boring. Second, they really matter. Because tucked between complaints about squirrels and streetlights are discussions of laws, regulations and civic projects that can transform businesses’ and communities’ futures.
As an adult, I wanted to extract the important parts of those meetings so citizens and businesses could more effectively work with local, state and federal government — and in 2021, I teamed up with a friend to start an AI-powered company that does exactly that. We’ve helped scores of businesses stay ahead of statutory changes and connect with localities that need their services, and our business is growing steadily.
But I’m worried new AI regulations here in Colorado will throw a wrench into our work.
Our business helps clients understand governments’ needs and intentions as they emerge — often before lawmakers start drafting bills or requesting proposals. At its core is an AI-enabled speech-to-text engine trained on millions of hours of government meetings. The engine quickly transcribes digital recordings of meetings across the country, which are all posted online, scans the transcripts for topics our clients are interested in, then provides them with customized reports. It’s like having a team of really sharp listeners sit through meetings for you, filter out the most important information, then send you actionable intel.
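To make that concrete, here is a deliberately simplified sketch of what the topic-scanning step might look like. Every name and detail below is hypothetical and purely illustrative; it is not the company’s actual system, which is far more sophisticated than keyword matching.

```python
# Toy illustration of scanning a meeting transcript for client topics.
# All names and data here are made up for illustration only.

def scan_transcript(transcript: str, topics: list[str]) -> list[str]:
    """Return sentences that mention any of the client's topics."""
    hits = []
    for sentence in transcript.split("."):
        lowered = sentence.lower()
        if any(topic.lower() in lowered for topic in topics):
            hits.append(sentence.strip())
    return hits

transcript = (
    "The council discussed streetlight repairs. "
    "Staff proposed rezoning parcels for a solar farm. "
    "A resident complained about squirrels."
)
print(scan_transcript(transcript, ["solar farm", "data center"]))
# → ['Staff proposed rezoning parcels for a solar farm']
```

A real pipeline would add speech-to-text up front and report generation at the end, but the core idea is the same: filter hours of civic discussion down to the sentences a client cares about.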
Those AI-enabled insights are really valuable. They have helped our clients do things like find communities with land available for solar farms, identify municipalities open to hosting data centers, and learn which cities need lower-cost waste disposal services. The insights help our clients grow their businesses, and help localities find new and better vendors.
And it’s a win for citizens, whose tax dollars are used more efficiently thanks to the open, competitive bidding process we help foster. We also offer our service to local news outlets at a steep discount so they can be better “watchdogs.”
Unfortunately, heavy-handed new state regulations may soon make it hard for us to continue delivering those insights.
Last summer, Colorado passed Senate Bill 205, also known as the Colorado AI Act, a sweeping new law that requires AI-powered businesses to annually assess, summarize and publicly report how their AI systems risk being used for discriminatory purposes. In addition, they’re required to report how they’re working to mitigate those risks.
The new law, which will take effect in early 2026, is well-intentioned — but it’s unlikely to produce good outcomes. Many innovative AI-powered startups like mine that don’t pose any risk won’t be able to afford the hefty legal and consulting fees associated with compliance. Similar requirements in the European Union typically cost startups like ours tens of thousands of dollars.
In the runup to the bill’s passage, a number of small-business owners and developers expressed worry about the bill’s broad language, rushed passage, and impact on innovation in the state. And in signing the bill into law, Gov. Jared Polis told Colorado lawmakers he was “concerned about the impact this law may have on an industry that is fueling critical technological advancements across our state for consumers and enterprises alike.”
Obviously, we’d never want our technology to be used for discriminatory purposes. But it’s impossible to predict how others might use any technology, and it’s scary to think we might be held responsible for others’ bad actions. That’s like saying Ford should be held accountable if someone uses a Mustang to commit a crime.
In fact, the law could backfire and hit minority-led startups particularly hard. That’s because minority-led small businesses and startups appear to be using generative AI at higher rates than their nonminority peers. If AI becomes more expensive or riskier to use, minority-led businesses would be disproportionately affected.
AI businesses doing things like improving student performance through personalized learning may be forced to close or leave if they can’t afford to comply with the state’s new regulatory requirements, dealing a blow to Colorado’s thriving tech industry and economy. If startups leave the state, AI development will be left in the hands of the biggest players with the deepest pockets, hurting innovation and competition.
As the legislature reconvenes, I’m hopeful the law will be carefully reviewed and revised. I urge state Sen. Robert Rodriguez, who sponsored the original bill, and Colorado’s lawmakers to seek additional input from AI developers and deployers so they can better understand the law’s real-world implications.
In addition, I urge legislators to clarify the law’s key terms (e.g., “algorithmic discrimination” and “high-risk”), and consider refining its impact-assessment requirements accordingly. Critically, lawmakers must also ensure the law can be easily reviewed and updated, given the astonishing pace of AI development.
Above all, I urge lawmakers to strive for balanced legislation that considers not only AI’s risks, but also its incredible potential to do good.
That may require a lot of long, boring meetings — but it would transform Coloradans’ future for the better.
Jeremy Becker, of Denver, is the chief revenue officer and co-founder of Denver-based Cloverleaf AI.
The Colorado Sun is a nonpartisan news organization, and the opinions of columnists and editorial writers do not reflect the opinions of the newsroom.
