
Oregon lawmakers look to regulate AI chatbots


By Robin Linares, Oregon Capital Chronicle

SALEM, Ore. (KTVZ) -- Lawmakers who “missed the boat” on regulating social media now have a chance to address the next emerging technology’s effects on youth, Oregon legislators contend.

Sen. Lisa Reynolds, D-Portland, and the Senate Early Childhood and Behavioral Health committee she chairs are championing Senate Bill 1546 to regulate artificial intelligence chatbots. The committee voted 4-1 to advance the bill with amendments to the Senate floor Thursday.  

The bill would require AI programs such as ChatGPT to remind users more regularly that they are speaking to an artificial intelligence tool, not a human. The legislation follows a recently passed California law and similar legislation introduced this year in New York and Washington.

Reynolds, a pediatrician, described parents’ struggles to manage electronics use among the youth she works with, as kids spend more time on the internet, social media and, now, AI.

“What is coming up for me all the time in my exam room is parents feel like they’re fighting a losing battle,” Reynolds said.

Assessing teen AI use

AI use among teens is increasingly common: 72% of teens have used AI companions, and more than 50% use them regularly, according to the nonprofit Common Sense Media. Other research from the organization found that nearly a third of teens find conversations with AI chatbot companions as satisfying as, or more satisfying than, real-life conversations.

Robbie Torney, the head of AI and digital assessments at Common Sense Media, said teens are using AI chatbots as companions for emotional support or to discuss mental health. 

“Our testing shows that they consistently miss subtle warning signs — and even not so subtle warning signs — that another human being, a parent, a friend or an adult would catch,” Torney said.

There have been several cases over the past few years of AI chatbots, including ChatGPT and Character.AI, contributing to teen suicides, according to parents’ testimony before a U.S. Senate committee last year. 

The Oregon bill proposes additional guardrails for youth access to AI, including requiring programmers to note that the platform may not be suitable for minors, to avoid showing or promoting sexually explicit content or conduct, and to discourage spending extended time interacting with the platform.

Expanding notifications is one way chatbots can promote responsible AI use, said Linda Charmaraman, senior research scientist at the Wellesley Centers for Women at Wellesley College and founder and director of the Youth, Media and Wellbeing Research Lab. She supports educating youth about responsible uses of AI, rather than outright bans.

“Whether it’s adults or for minors, just to remind people that there are limits to the technology and that there’s inaccuracies,” Charmaraman said. “If I could wave a wand, I would love for them to really focus on AI literacy from early ages.”

Suicide prevention focus

Beyond youth access, Reynolds added that the bill’s intent is to protect anyone expressing suicidal tendencies.

The proposed legislation would require programmers who make their tools available to Oregonians to develop protocols for their AI platforms to detect signs of suicidal ideation, self-harm or thoughts of intending either.

In response, these platforms would be required to refer users to a suicide hotline and other crisis-related resources and to immediately interrupt the conversation between the chatbot and the user. The protocol would also have to be publicly available on the AI program’s site.

Reynolds has been in touch with Lines for Life, the Oregon-based suicide and mental health hotline, and its sister hotline YouthLine, about potentially being part of the mental health resources that AI chatbots could offer users. 

Hotline volunteers have already seen how AI has impacted users, according to Dwight Holton, executive director of Lines for Life. Youth volunteers are increasingly reassuring users in crisis that they are speaking to a human, not AI, he said. 

“We know that intervention works,” Holton said. “So, if we can convince our partners in the industry and legislatively, establish guardrails that require that kind of connection to intervention, we will get folks from that path of despair to a path of hope.”

Reynolds has also reached out to organizations like TechNet, which represents a network of tech companies including Google, OpenAI, and Meta.

TechNet officials, while largely supportive of the bill, raised concerns that Oregon could require more frequent notifications than laws in other states. The bill has since been amended to align with other states’ notification provisions.

“I am working with a coalition of companies to try and make sure that we have clear definitions and clear requirements on notifications and guardrails and looking forward to working with the senator and the committee,” said Rose Feliciano, TechNet’s executive director for Washington and the Northwest. 

The bill could face legal challenges if passed because of a December executive order President Donald Trump signed to limit state regulation of AI services. While Reynolds is unsure how the executive order would apply, her plans to address unregulated AI use remain unchanged.

“Social media companies have had the opportunity to make some choices that would have kept kids safe from social media but instead they really double down on doing everything they can to keep their eyeballs on social media content for as long as they can,” Reynolds said. “I see it time and again in my exam room, so I don’t want to wait till it’s too late to put some sideboards on AI tools.”
