If your company uses an AI-powered tool that remembers customer interactions, asks follow-up questions, and sustains personal conversations — even if you licensed it from a third-party vendor — Oregon's new chatbot safety law may apply to you. SB 1546, which cleared both chambers on March 5 with near-unanimous support, creates a private right of action with $1,000-per-violation statutory damages.1 If Governor Kotek signs the bill, it takes effect January 1, 2027.
The bill's definition of "operator" covers anyone who "controls or makes available" a covered AI system in Oregon — broad enough to reach the healthcare system that deployed a patient engagement chatbot, the university running an AI tutoring companion, or the retailer whose support portal follows up on recent purchases. Oregon is part of a broader wave of state chatbot legislation we have been tracking, but its private right of action makes it the first with real enforcement teeth.
What does Oregon SB 1546 require?
The bill targets "AI companions," which it defines broadly:
[A system] that uses artificial intelligence, generative artificial intelligence, or algorithms that recognize emotion from input and that [is] designed to simulate a sustained, human-like platonic, intimate, or romantic relationship or companionship with a user.2
For all users, operators must disclose AI involvement, detect expressions of suicidal ideation, interrupt the conversation to deliver crisis referrals, and file annual reports with the Oregon Health Authority.3 For minors, operators must additionally provide hourly reminders that the user is interacting with an AI, refrain from sexually explicit content, and avoid techniques designed to create emotional dependency.
The mandatory conversation interruption is unique to Oregon — California's SB 243 requires crisis referral protocols but not active interruption.4 This distinction matters at the product level: interruption requires real-time intent classification, a probabilistic task with inherent false-positive and false-negative rates. Companies will need to confirm that their vendor's detection capability meets this standard and document the contractual allocation of liability if it does not.
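For teams evaluating vendor capability, the interruption requirement reduces to a thresholded classification decision at each turn of the conversation. The sketch below is purely illustrative — the classifier stub, the threshold, the `handle_message` structure, and the referral text are all our assumptions, not language from SB 1546 or any vendor's implementation:

```python
# Illustrative sketch of an interruption gate: score each incoming
# message for suicidal ideation and, above a threshold, interrupt the
# conversation with a crisis referral instead of a normal reply.

CRISIS_REFERRAL = (
    "It sounds like you may be going through a difficult time. "
    "You can call or text 988 to reach the Suicide & Crisis Lifeline."
)

def score_ideation(message: str) -> float:
    """Stand-in for a vendor's real-time intent classifier.

    A production system would return a model-derived probability; this
    stub matches a few obvious phrases so the example is self-contained.
    """
    phrases = ("want to die", "kill myself", "end my life")
    return 1.0 if any(p in message.lower() for p in phrases) else 0.0

def generate_reply(message: str) -> str:
    """Stand-in for the chatbot's normal response path."""
    return "Thanks for your message. How can I help?"

def handle_message(message: str, threshold: float = 0.8) -> str:
    # Interrupt and deliver the referral whenever the classifier's
    # score crosses the threshold; otherwise reply normally.
    if score_ideation(message) >= threshold:
        return CRISIS_REFERRAL
    return generate_reply(message)
```

The threshold is where the legal and engineering questions meet: set it low and the system interrupts benign conversations (false positives); set it high and it risks missed detections — the failure mode whose liability the vendor contract should allocate.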
Does this apply to my company?
A system qualifies as an "AI companion" if it meets a three-prong test:
[The system] (a) [retains] information from prior interactions or user sessions and from user preferences to personalize interactions and facilitate ongoing engagement; (b) [asks] unprompted or unsolicited questions that are not direct responses to user input and that suggest or concern emotional topics; [and] (c) [sustains] an ongoing dialog concerning matters that are personal to the user.5
The companies most likely caught off guard are not AI vendors — they are enterprises that integrated AI tools into customer-facing or employee-facing workflows without tracking how those tools evolved.
A patient portal chatbot originally built for appointment scheduling may have been updated to ask wellness check-in questions. A financial planning chatbot that started as an FAQ tool may now ask about life goals across sessions. An HR onboarding assistant that personalizes its interactions and asks new hires how they are adjusting could satisfy all three prongs. Basic customer service bots and general-purpose assistants fall outside the scope, but the boundary is less clear for tools that have added personalization and emotional engagement through vendor updates the deploying company never specifically approved. As Illinois' AI therapy guardrails showed, the line between permitted and regulated AI use is narrowing fast.
What is the litigation exposure?
The private right of action sets a low bar for standing and a high ceiling for damages:
A person who suffers an ascertainable loss of money or property or other injury in fact as a result of a violation ... may bring an action ... [for] damages of $1,000 per violation.6
The statute does not define "violation." If each conversation session counts, exposure is substantial. If each message within a session counts, a single user's claim could reach tens of thousands of dollars and a class action could reach tens of millions.
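The gap between those two readings can be made concrete with a back-of-the-envelope calculation. The usage figures below are illustrative assumptions, not numbers from the bill or any filed case:

```python
STATUTORY_DAMAGES = 1_000  # dollars per violation under SB 1546

# Illustrative assumptions: one user's history with a covered chatbot,
# and a hypothetical class of affected users.
sessions = 10
messages_per_session = 5
class_size = 1_000

per_session_reading = sessions * STATUTORY_DAMAGES
per_message_reading = sessions * messages_per_session * STATUTORY_DAMAGES

print(f"Per-session reading, one user: ${per_session_reading:,}")   # $10,000
print(f"Per-message reading, one user: ${per_message_reading:,}")   # $50,000
print(f"Per-message reading, class:    ${per_message_reading * class_size:,}")  # $50,000,000
```

Even with these modest assumptions, the per-message reading takes a single user's claim into the tens of thousands of dollars and a thousand-member class into the tens of millions — which is why the undefined term "violation" will likely be the first battleground in litigation.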
Illinois' BIPA showed what this enforcement model produces: thousands of lawsuits and billions in settlements. Oregon's law follows the same structural template — and unlike BIPA, could reach companies that did not realize their vendor's chatbot qualified as an "AI companion."
What should companies do now?
Audit your vendor stack. Companies may wish to map every third-party AI tool in customer support, patient engagement, HR, and financial planning against the three-prong definition — particularly tools that have added personalization or emotional engagement features through recent updates.
Review vendor contracts. Vendor agreements predating SB 1546 almost certainly do not address mandatory conversation interruption or crisis referral obligations. Companies may wish to confirm liability allocation and indemnification coverage for detection failures.
Check your insurance. It may be worth reviewing tech E&O and cyber policies to confirm that chatbot-related statutory damages claims and vendor indemnification gaps are covered.
Oregon is a leading indicator, not an outlier. With 78 chatbot bills across 27 states in 2026, similar legislation will follow. The private right of action is the structural shift: when any user can sue for statutory damages, the economics of compliance move from "monitor and respond" to "prevent or pay."
Oregon SB 1546, enrolled text (March 5, 2026), https://olis.oregonlegislature.gov/liz/2026R1/Downloads/MeasureDocument/SB1546/Enrolled. Senate vote: 26-1; House vote: 52-0. ↩
SB 1546, Section 2(1) (defining "AI companion"). ↩
SB 1546, Section 4 (annual reporting to Oregon Health Authority). ↩
California SB 243, Chapter 22.6 commencing with Section 22601, Division 8, Business and Professions Code (effective Jan. 1, 2026). ↩
SB 1546, Section 2(1)(a)-(c). ↩
SB 1546, Section 6 (private right of action; statutory damages). ↩
