Purpose

Document the legal framework, rights, and policies surrounding AI use during medical visits, including when doctors can refuse patient-brought AI assistants and when patients can refuse doctor-initiated AI tools.

Executive Summary

Can doctors refuse AI assistants during visits? Yes, but it depends on who initiated the AI use:

  • Doctor-initiated AI (AI scribes, diagnostic tools): Patients can refuse, but doctors can generally use AI with proper consent
  • Patient-initiated AI (personal recording/assistant): Doctors can likely refuse, though the legal framework is underdeveloped

The legal landscape is rapidly evolving, with state-level regulations emerging in 2025-2026.


Two Distinct Scenarios

Scenario 1: Doctor Uses AI (AI Scribes, Diagnostic Tools)

What it is:

  • AI-powered documentation systems (e.g., Nuance DAX, DeepScribe, Abridge)
  • AI diagnostic support tools
  • Automated transcription services

Legal Framework:

| Level | Requirement | Details |
| --- | --- | --- |
| Federal (HIPAA) | No explicit consent required | AI vendor must sign a Business Associate Agreement (BAA) |
| State Laws | Varies by state | Some states require written consent before recording |
| AMA Guidelines | Informed consent recommended | Promotes trust and transparency |

Patient Rights:

  • Right to be informed about AI use
  • Right to refuse AI tools
  • Right to request alternative documentation methods
  • Right to revoke consent at any time

Doctor Obligations:

  • Disclose when and how AI is used
  • Explain capabilities and limitations
  • Communicate data sharing practices
  • Document patient opt-out in chart
  • Provide alternative documentation if patient refuses

Recent Legal Developments:

  1. California AB 3030 (Effective Jan 1, 2025)

    • Requires notification when GenAI communicates “patient clinical information”
    • Exemption: if the communication is reviewed by a licensed human provider (see the sketch after this list)
  2. Sharp HealthCare Lawsuit (2026)

    • Class action alleging AI scribe use without consent
    • Alleged violations of California privacy statutes
    • Highlights importance of proper consent procedures
  3. Colorado AI Legislation (Effective Feb 1, 2026)

    • First comprehensive state AI law
    • Sets precedent for other states
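
To make the AB 3030 requirement in item 1 concrete, the notification check can be sketched as a simple conditional. This is a hypothetical illustration, not statutory language; the class and function names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class PatientMessage:
    generated_by_genai: bool             # drafted by generative AI?
    contains_clinical_info: bool         # communicates patient clinical information?
    reviewed_by_licensed_provider: bool  # reviewed by a licensed human provider?

def ab3030_disclaimer_required(msg: PatientMessage) -> bool:
    """Hypothetical check mirroring AB 3030 as summarized above: GenAI
    communications of patient clinical information require notification,
    unless a licensed human provider reviewed the communication."""
    if not (msg.generated_by_genai and msg.contains_clinical_info):
        return False
    return not msg.reviewed_by_licensed_provider

# Example: an unreviewed AI-drafted message about lab results triggers the notice.
assert ab3030_disclaimer_required(
    PatientMessage(True, True, reviewed_by_licensed_provider=False)
)
```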

Scenario 2: Patient Brings Own AI Assistant

What it is:

  • Patient’s personal AI note-taking apps (e.g., Otter.ai, voice recorders)
  • Consumer AI assistants (ChatGPT, Claude, etc.)
  • Personal recording devices

Legal Framework:

  • Underdeveloped - no clear federal or state guidance
  • Falls under general recording consent laws (varies by state)
  • HIPAA doesn’t directly address patient-owned AI

Doctor Rights (Likely):

  • Can refuse patient-brought recording devices
  • Can set clinic policy prohibiting patient AI use
  • Can decline to treat if patient insists on using AI

Privacy Concerns:

  • Consumer AI platforms rarely offer BAAs
  • Patient’s AI may not have proper HIPAA safeguards
  • PHI could be exposed to third-party AI companies
  • Doctor cannot control patient AI’s data handling

Notable Case:

  • A clinic called law enforcement when a patient refused to let the doctor use AI for notes
  • Demonstrates the tension created by the absence of clear protocols

Current State:

  • Most medical practices lack policies on patient-brought AI
  • Legal liability questions remain unresolved
  • Hospitals are developing multidisciplinary teams to address these issues

State Recording Laws

One-Party Consent States:

  • Only one person in conversation needs to consent to recording
  • Patient can record without doctor’s permission (in most contexts)
  • Approximately 38 states

Two-Party Consent States:

  • All parties must consent to recording
  • Patient must get doctor’s permission to record
  • Includes: California, Connecticut, Florida, Illinois, Maryland, Massachusetts, Montana, New Hampshire, Pennsylvania, Washington

Important: Even in one-party states, healthcare facilities can prohibit recording via policy.
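
As a minimal sketch, a compliance checklist tool could encode the all-party-consent list above as a simple lookup. The state list mirrors this document; statutes change, so it would need verification before any real use, and facility policy can prohibit recording even where state law allows it.

```python
# All-party ("two-party") consent states as listed in this document.
# Statutes change; verify against current state law before relying on this.
ALL_PARTY_CONSENT_STATES = {
    "CA", "CT", "FL", "IL", "MD", "MA", "MT", "NH", "PA", "WA",
}

def recording_needs_everyones_consent(state_code: str) -> bool:
    """True if every party in the conversation must consent to recording."""
    return state_code.upper() in ALL_PARTY_CONSENT_STATES

# Example: recording a visit in Florida requires the doctor's consent too.
assert recording_needs_everyones_consent("FL")
assert not recording_needs_everyones_consent("TX")  # one-party state
```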

AI-Specific State Laws (Emerging)

| State | Law | Effective Date | Requirements |
| --- | --- | --- | --- |
| California | AB 3030 | Jan 1, 2025 | Notification when GenAI communicates patient clinical info |
| Colorado | AI Consumer Protection | Feb 1, 2026 | First comprehensive AI legislation (details evolving) |

More states expected to enact AI-specific healthcare regulations in 2026-2027.


For Healthcare Providers Using AI

1. Written Consent Forms

  • Specify what AI tool is used and its purpose
  • Explain data handling and sharing practices
  • State patient’s right to refuse
  • Clarify that refusal won’t affect care quality
  • Document consent in patient chart (see the sketch below)

Example consent elements:

☐ I consent to [Practice Name] using AI-powered scribe technology to:
- Record our conversation
- Generate clinical documentation
- Improve accuracy of medical records
☐ I understand:
- The AI tool is [Tool Name]
- My data will be [encrypted/anonymized/etc.]
- I can opt out without affecting my care
- I can revoke consent at any time
☐ I decline the use of AI documentation tools
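
Because consent and opt-outs must be documented in the chart, a practice’s EHR integration could store consent as a structured record. The following is a hypothetical sketch; the field names and the ExampleScribe tool are illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class ConsentStatus(Enum):
    GRANTED = "granted"
    DECLINED = "declined"
    REVOKED = "revoked"   # patients may revoke consent at any time

@dataclass
class AIScribeConsent:
    patient_id: str
    tool_name: str         # the specific AI tool disclosed to the patient
    status: ConsentStatus
    data_handling: str     # e.g., "encrypted", "anonymized"
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Example: documenting an opt-out so staff fall back to traditional
# documentation for this patient.
optout = AIScribeConsent("pt-001", "ExampleScribe", ConsentStatus.DECLINED,
                         data_handling="n/a")
```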

2. Verbal Consent (Less Preferred)

  • Must be documented in patient chart
  • Harder to verify later
  • Better than no consent, but written consent is preferred

3. Patient Education

  • Explain what AI scribe does (listens, transcribes, generates notes)
  • Clarify human doctor still makes all clinical decisions
  • Address common concerns about privacy and accuracy

For Patients Wanting to Bring AI

1. Ask Before the Appointment

  • Contact clinic ahead of time
  • Ask about recording/AI assistant policy
  • Request written policy if available

2. Understand the Risks

  • Consumer AI lacks HIPAA protections
  • Your health data may be stored by AI company
  • Could affect doctor-patient relationship
  • Clinic may refuse service

3. Consider Alternatives

  • Request copy of medical records after visit
  • Take written notes during appointment
  • Ask doctor to use their AI scribe and share notes

Unresolved Legal Questions

1. What Happens When a Patient Refuses the Doctor’s AI?

Current Status: Unclear

Healthcare attorney Kathleen Healy: “Hospitals also must think about what they will do if they ask for consent to use AI tools to support a diagnosis, and the patient refuses.”

Possible Outcomes:

  • Doctor provides traditional documentation
  • Appointment takes longer
  • Patient receives same quality of care (legally required)
  • Practice may need alternative workflows

No reported cases yet of doctors refusing to see patients who decline AI.

2. Can a Doctor Refuse to See a Patient Who Insists on Using Personal AI?

Current Status: Likely yes, but untested

Legal Reasoning:

  • Doctors can refuse non-emergency patients for various reasons
  • Private practice has more discretion than hospital ERs
  • Comparable to patient recording policies

BUT:

  • Cannot refuse based on protected characteristics (race, disability, etc.)
  • Emergency departments have duty to treat (EMTALA)
  • May vary by state and practice type

No clear case law yet on this specific scenario.

3. Who Is Liable if AI Makes an Error?

Current Framework:

  • If AI assists diagnosis → Doctor still responsible for final decision
  • If AI transcription error → Depends on whether doctor reviewed notes
  • Malpractice liability remains with physician

Evolving Questions:

  • When does AI become “autonomous” enough to shift liability?
  • Are AI vendors liable for algorithmic errors?
  • How does informed consent affect liability?

Multiple federal agencies (FDA, HHS, CMS) are developing regulations.


Privacy and Data Protection

HIPAA Compliance

When Doctor Uses AI:

  • AI vendor must sign Business Associate Agreement (BAA)
  • BAA ensures AI company:
    • Protects PHI according to HIPAA standards
    • Reports breaches within required timeframes
    • Allows patient access to their data
    • Deletes data when contract ends
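
A minimal sketch of enforcing the BAA requirement above: an integration layer could refuse to route PHI to any vendor without a signed BAA on file. The vendor registry and names here are hypothetical.

```python
# Hypothetical registry of AI vendors and whether a signed BAA is on file.
SIGNED_BAAS = {"ExampleScribeCo": True, "ConsumerChatBot": False}

def can_send_phi(vendor: str) -> bool:
    """Only release PHI to vendors with a signed Business Associate
    Agreement, per the HIPAA requirement described above."""
    return SIGNED_BAAS.get(vendor, False)

assert can_send_phi("ExampleScribeCo")
assert not can_send_phi("ConsumerChatBot")  # no BAA -> no PHI
```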

When Patient Uses AI:

  • Patient’s consumer AI not subject to HIPAA
  • Patient can share their own PHI however they want
  • But doctor cannot control where patient’s AI sends data
  • Creates potential security/privacy conflict

Common Consumer AI Platforms

| Platform | HIPAA-Compliant? | BAA Available? |
| --- | --- | --- |
| ChatGPT (OpenAI) | No (standard plan) | Enterprise only |
| Claude (Anthropic) | No (standard plan) | Enterprise only |
| Otter.ai | Depends | Business plan only |
| Google Recorder | No | No |
| Apple Voice Memos | No | No |

Key Point: Most consumer AI tools are not suitable for handling PHI during medical appointments.

FTC Health Breach Notification Rule

  • Applies to personal health record (PHR) vendors
  • Requires notification within 60 days of breach
  • Patient-facing AI apps may fall under this rule
  • Adds another layer of regulatory complexity

Regulatory Landscape

Federal Level

No blanket federal rule requiring doctors to inform patients about AI use.

Agencies Involved:

  • FDA: Regulates AI as medical devices (diagnostic tools)
  • HHS: HIPAA enforcement, privacy regulations
  • CMS: Medicare/Medicaid coverage and reimbursement
  • FTC: Consumer protection, unfair practices

AMA Position:

  • Using AI “at any point in the care process requires informed consent”
  • Promotes trust and transparency
  • Not legally binding, but influential

State Level

Rapidly evolving, with states taking the lead in AI regulation:

  • California leading with AB 3030 and privacy lawsuits
  • Colorado enacted first comprehensive AI law
  • Multiple states considering AI transparency bills
  • Patchwork of requirements creating compliance challenges

International Comparison

European Union (AI Act):

  • Requires transparency for AI in high-risk applications
  • Healthcare AI classified as high-risk
  • Stricter consent and disclosure requirements than US

Canada:

  • Artificial Intelligence and Data Act (AIDA) proposed
  • Emphasizes transparency and accountability

The US lags behind international peers in comprehensive AI healthcare regulation.


Practical Guidance

For Patients

If Your Doctor Uses AI:

  1. Ask what AI tool is being used
  2. Ask how your data will be protected
  3. Ask if you can opt out
  4. Request a copy of the consent form
  5. Know you can revoke consent anytime

If You Want to Bring Your Own AI:

  1. Call clinic before appointment to ask about policy
  2. Understand the privacy risks
  3. Consider requesting doctor’s AI-generated notes instead
  4. Be prepared for possible refusal

Your Rights:

  • Right to know when AI is used
  • Right to refuse AI tools
  • Right to access your medical records
  • Right to file complaints with state medical board

For Healthcare Providers

Immediate Steps:

  1. Develop written AI use policy
  2. Create patient consent forms
  3. Train staff on disclosure requirements
  4. Ensure AI vendors have BAAs
  5. Document patient opt-outs in charts

Policy Development:

  1. Form multidisciplinary team (legal, clinical, IT, compliance)
  2. Review state-specific requirements
  3. Establish patient-brought AI policy
  4. Create alternative workflows for patient refusal
  5. Update privacy notices

Risk Mitigation:

  • Always get consent before using AI
  • Document all AI use in patient records
  • Provide opt-out options
  • Review AI outputs before finalizing
  • Stay current on evolving regulations

Future Outlook

Predicted Developments (2026-2028)

  1. More State Regulations

    • Following California and Colorado’s lead
    • Likely focus on transparency and consent
  2. Federal Legislation Possible

    • Growing pressure for national standards
    • May preempt patchwork state laws
  3. Standardized Consent Forms

    • Industry groups developing templates
    • May become standard practice
  4. Clearer Legal Precedents

    • More lawsuits will establish case law
    • Courts will clarify ambiguous rights
  5. Patient-Brought AI Policies

    • Healthcare facilities will develop explicit policies
    • Likely range from prohibition to conditional allowance

Open Questions

  • Will patient use of personal AI become normalized?
  • How will reimbursement models adapt to AI documentation?
  • Will “AI-free” medical practices emerge as a niche?
  • What happens to AI-generated medical records in malpractice cases?

Key Takeaways

  1. Doctors can generally refuse patient-brought AI - though explicit case law is lacking

  2. Patients can refuse doctor’s AI - and must be offered alternative documentation

  3. Consent requirements vary by state - know your state’s recording and AI laws

  4. HIPAA applies to doctor’s AI - but not to patient’s personal AI tools

  5. Legal landscape is evolving rapidly - what’s unclear in 2026 may be settled by 2027

  6. Best practice: transparency and consent - regardless of legal minimums

  7. Privacy concerns are significant - consumer AI platforms lack HIPAA protections

  8. No clear protocols exist yet - for handling patient AI refusal or patient-brought AI


Sources