Can you say no to your doctor using an AI scribe?

  • Written by Saeed Akhlaghpour, Associate Professor of Business Information Systems, The University of Queensland

Doctors’ offices were once private. But increasingly, artificial intelligence (AI) scribes (also known as digital scribes) are listening in.

These tools can record and transcribe the conversation between doctor and patient, and draft structured clinical notes. Some also produce referral letters and admin outputs, and even update medical records[1] – but only after clinician review and approval.

Some estimates suggest about one in four Australian GPs[2] are already using an AI scribe. Major hospitals[3], including children’s hospitals[4], are also trialling them.

The pitch is simple: less typing for doctors, more eye contact with the patient. But what about patients’ privacy?

Until recently, the AI scribe market was largely unregulated. But last month the Therapeutic Goods Administration (TGA) – Australia’s medical device regulator – decided[5] some scribes meet the legal definition of a medical device.

Here’s what this will change, and what patients should know – and ask – about AI scribes in the consult room.

What’s changing

Until now, many AI scribe vendors, from Microsoft[6] to rising Australian startups such as Heidi[7] and Lyrebird[8] – and over 120 other providers[9] – have marketed their tools as “productivity” software.

This means they have avoided the scrutiny applied to medical devices, which the TGA regulates[10].

Now, the TGA has found some AI scribes[11] meet the definition of a medical device, especially if they go beyond transcription to suggest diagnoses or treatments.

Medical devices must be registered with the TGA, shown to be safe and do what they claim, and any safety problems or malfunctions must be reported.

The TGA has begun compliance reviews[12], with penalties possible for unregistered AI scribes.

This follows similar developments overseas. In June 2025, health authorities in the United Kingdom[13] announced that tools which transcribe and summarise consultations will be treated as medical devices.

Although the picture is still evolving, there are signs the United States[14] will move in a similar direction, and the European Union[15] may too.

In Australia, the TGA has only just begun reviewing AI scribes, so patients can’t assume they’ve been tested to the same standard as other medical products.

What patients should know about AI scribes

They can help – but they are not perfect.

Doctors report[16] spending less time on keyboards, and some patients report better conversations.

But tools built on large language models can “hallucinate” – that is, add details that were never said. In one 2024 case study[17], a scribe turned casual remarks about a patient’s hands, feet and mouth into a diagnosis of hand, foot and mouth disease. The potential for such errors means clinicians still need to review every note before it enters your record.

Performance varies.

Accuracy dips with accents, background noise and jargon. In a health system as multicultural as Australia’s[18], errors across accents and languages are a safety issue.

The Royal Australian College of General Practitioners warns[19] poorly designed tools can shift hidden work back to clinicians, who then spend extra time correcting notes. Research has found[20] products’ time-saving claims are often overstated once review and correction time is included, underlining the need for devices to be evaluated independently.

Privacy matters.

Health data is already a target for hackers and scammers, as the 2022 Medibank breach[21] showed. In recent research with colleagues, we found[22] unsecured third-party applications and lax data protection are among the leading causes of health data breaches.

Clinicians need a clear “pause” option and should avoid using scribes in sensitive consults (for example, discussions about family violence, substance use or legal matters).

Companies must be explicit about where the audio and data are stored, who can access it, and how long it is kept. In practice[23], policies vary: some store recordings on overseas cloud servers while others keep transcripts short-term and onshore.

A lack of transparency means it’s often unclear whether data can be traced back to individual patients or reused to train AI[24].

Consent is not a tick box.

Clinicians should tell you when recording is on and explain the risks and benefits. You should be able to say no without jeopardising your care. In one recent Australian case, a patient[25] had to cancel an A$1,300 appointment after they declined a scribe and the clinic refused to proceed.

For Aboriginal and Torres Strait Islander patients, consent should reflect community norms and data sovereignty[26], especially if notes are used to train AI[27].

Five practical questions to ask your doctor

  1. Is this tool approved? Is it the clinic’s standard practice to use this tool, and does it require TGA registration for this use?

  2. Who can access my data? Where is the audio stored, for how long, and is it used to train the system?

  3. Can we pause or opt out? Is there a clear pause button and a non-AI alternative for sensitive topics?

  4. Do you review the note before it goes into my record? Is the output always treated as a draft until you sign off?

  5. What happens if the AI gets it wrong? Is there an audit trail linking the note back to the original audio so errors can be traced and fixed quickly?

Safer care, not just faster notes

Right now, the burden of ensuring AI scribes are used safely rests disproportionately on individual doctors and patients. The TGA’s decision to classify some scribes as medical devices is a positive move, but it is only a first step.

We also need:

  • the TGA, professional bodies and researchers to work together on clear standards for consent, data retention and training

  • independent evaluations of how these tools perform in real consults

  • risk-based rules and stronger enforcement, adapted to AI software rather than traditional devices.

Strong rules also weed out flimsy products: if a tool cannot show it is safe and secure, it should not be in the consult room.

References

  1. ^ medical records (www.racgp.org.au)
  2. ^ one in four Australian GPs (www.abc.net.au)
  3. ^ Major hospitals (www.goldcoast.health.qld.gov.au)
  4. ^ children’s hospitals (www.metrosouth.health.qld.gov.au)
  5. ^ decided (www.tga.gov.au)
  6. ^ Microsoft (www.microsoft.com)
  7. ^ Heidi (www.heidihealth.com)
  8. ^ Lyrebird (www.lyrebirdhealth.com)
  9. ^ over 120 other providers (www.linkedin.com)
  10. ^ regulates (www.tga.gov.au)
  11. ^ AI scribes (www.tga.gov.au)
  12. ^ compliance reviews (www.tga.gov.au)
  13. ^ the United Kingdom (www.healthservicesdaily.com.au)
  14. ^ United States (downloads.regulations.gov)
  15. ^ European Union (www.tandemhealth.ai)
  16. ^ report (doi.org)
  17. ^ One 2024 case study (doi.org)
  18. ^ as multicultural as Australia’s (www.aihw.gov.au)
  19. ^ warns (www.racgp.org.au)
  20. ^ Research has found (doi.org)
  21. ^ the 2022 Medibank breach (www.qld.gov.au)
  22. ^ we found (doi.org)
  23. ^ In practice (avant.org.au)
  24. ^ to train AI (helpcenter.medmehealth.com)
  25. ^ saw a patient (www.smartcompany.com.au)
  26. ^ data sovereignty (www.lowitja.org.au)
  27. ^ if notes are used to train AI (doi.org)

Read more https://theconversation.com/can-you-say-no-to-your-doctor-using-an-ai-scribe-264701
