Our neurodata can reveal our most private selves. As brain implants become common, how will it be protected?

  • Written by: Christina Maher, Researcher, University of Sydney
“Hello world!”

In December 2021, these were the first words tweeted[1] by a paralysed man using only his thoughts and a brain-computer interface (BCI) implanted by the company Synchron.

For millions living with paralysis, epilepsy and neuromuscular conditions, BCIs offer restored movement and, more recently, thought-to-text capabilities.

So far, few invasive (implanted) versions of the technology have been commercialised[2]. But a number of companies are determined to change this.

Synchron is joined by Elon Musk’s Neuralink, which has documented a monkey playing the computer game Pong[3] using its BCI – as well as the newer Precision Neuroscience[4], which recently raised[5] US$41 million towards building a reversible implant thinner than a human hair.

Eventually, BCIs will allow people to carry out a range of tasks using their thoughts. But is this terrific, or terrifying?

How do BCIs work?

BCIs can be non-invasive (wearable) or invasive (implanted). Electrical activity is the most commonly captured “neurodata”, with invasive BCIs providing better signal quality than non-invasive ones.

Most BCIs can be classified by function as passive, active or reactive. All BCIs use signal processing[6] to filter brain signals. After processing, active and reactive BCIs can return outputs in response to a user’s voluntary brain activity.

A signal recorded from any one brain region is really a combination of many tiny signals from multiple regions. So BCIs use pattern recognition algorithms[7] to decipher a signal’s potential origins and link it to an intentional event, such as a task or thought.
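The filter-then-classify pipeline described above can be loosely sketched in code. Everything below is illustrative only: the synthetic “brain signals”, the moving-average stand-in for a real band-pass filter, and the nearest-centroid classifier are assumptions chosen for brevity, not any company’s actual decoding algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def smooth(signal, window=5):
    # Crude moving-average smoothing stands in for real signal filtering.
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

def make_trial(intent):
    # Synthetic one-second "recording": a 10 Hz vs 20 Hz oscillation in noise.
    t = np.linspace(0, 1, 250)
    freq = 10 if intent == "rest" else 20
    return np.sin(2 * np.pi * freq * t) + rng.normal(0, 0.5, t.size)

def extract_feature(signal):
    # Dominant-frequency feature via a discrete Fourier transform.
    spectrum = np.abs(np.fft.rfft(signal))
    return np.argmax(spectrum[1:]) + 1  # skip the DC component

# "Training": record labelled trials, store one centroid feature per intent.
centroids = {
    intent: np.mean([extract_feature(smooth(make_trial(intent)))
                     for _ in range(20)])
    for intent in ("rest", "move")
}

def decode(signal):
    # Nearest-centroid pattern recognition: link the signal to an intent.
    feat = extract_feature(smooth(signal))
    return min(centroids, key=lambda k: abs(centroids[k] - feat))

print(decode(make_trial("move")))  # should recover the intended event
```

Real decoders work with far noisier, higher-dimensional recordings and far richer models, but the shape of the problem is the same: filter the raw signal, extract features, and match them against patterns learned from labelled examples.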

One of the first implanted BCIs[8] treated drug-resistant seizures in some of the 50 million people with epilepsy. And ongoing clinical trials signal[9] a new era for neurologically and physically impaired people.

Outside the clinical realm, however, neurodata exist in a largely unregulated space.

Read more: Elon Musk claims his Neuralink brain chip could 'cure' tinnitus in 5 years. But don't hold your breath[10]

An unknown middleman

In human interaction, thoughts are interpreted by the person experiencing and communicating them, and separately by the person receiving the communication. In this sense, allowing algorithms to interpret our thoughts could be likened to another entity “speaking” for us.

This could raise issues in a future where thought-to-text is widespread. For example, a BCI may generate the output “I’m good”, when the user intended it to be “I’m great”. These are similar, but they aren’t the same. It’s easy enough for an able-bodied person to physically correct the mistake – but for people who can only communicate through BCIs, there’s a risk of being misinterpreted.

Moreover, implanted BCIs can provide rich access to all brain signals; there is no option to pick and choose which signals are shared.

Brain data are arguably our most private data because of what can be inferred regarding our identity and mental state. Yet private BCI companies may not need to inform users[11] about what data are used to train algorithms, or how the data are linked to interpretations that lead to outputs.

In Australia, strict data storage rules[12] require that all BCI-related patient data are stored on secure servers in a de-identified form, which helps protect patient privacy. But requirements outside of a research context are unclear.

What’s at risk if neurodata aren’t protected?

BCIs are unlikely to launch us into a dystopian world – in part due to current computational constraints. After all, there’s a leap between a BCI sending a short text and interpreting one’s entire stream of consciousness.

That said, making this leap largely comes down to how well we can train algorithms, which requires more data and computing power. The rise of quantum computing[13] – whenever that may be – could provide these additional computational resources.

Current BCIs aren’t advanced enough to quickly and reliably interpret a stream of thoughts, but a growth in computational power may allow this in the future. (Shutterstock)

Cathy O'Neil’s 2016 book, Weapons of Math Destruction[14], highlights how algorithms that measure complex concepts such as human qualities could let predatory entities make important decisions for the most vulnerable people.

Here are some hypothetical worst-case scenarios.

  1. Third-party companies might buy neurodata from BCI companies and use it to make decisions, such as whether someone is granted a loan or access to health care.

  2. Courts might be allowed to order neuromonitoring[15] of individuals with the potential to commit crimes, based on their previous history or socio-demographic environment.

  3. BCIs specialised for “neuroenhancement” could be made a condition of employment, such as in the military[16]. This would blur the boundaries between human reasoning and algorithmic influence.

  4. As with all industries where data privacy is critical, there is a genuine risk of neurodata hacking, where cybercriminals access and exploit brain data.

Then there are subtler examples, including the potential for bias. In the future, bias may be introduced into BCI technologies in a number of ways, including through:

  • the selection of homogeneous training data

  • a lack of diversity among clinical trial participants (especially in control groups)

  • a lack of diversity in the teams that design the algorithms and software.

If BCIs are to cater to diverse users, then diversity will need to be factored into every stage of development.

How can we protect neurodata?

The vision for “neurorights[17]” is an evolving space. The ethical challenges lie in the balance between choosing what is best for individuals and what is best for society at large.

For instance, should individuals in the military be equipped with neuroenhancing devices so they can better serve their country and protect themselves on the front lines, or would that compromise their individual identity and privacy? And which legislation should capture neurorights: data protection law, health law, consumer law, or criminal law?

In a world first, Chile[18] passed a neurorights law in 2021, explicitly classifying mental data and brain activity as human rights requiring legal protection. Though a step in the right direction, it remains unclear how such a law will be enforced.

One US-based patient group is taking matters into its own hands. The BCI Pioneers[19] is an advocacy group working to ensure the conversation around neuroethics is patient-led.

Other efforts include the Neurorights Foundation[20], and the proposal of a “technocratic oath[21]” modelled on the Hippocratic oath taken by medical doctors. An International Organisation for Standardisation committee[22] is also developing standards for BCIs.

Read more: Neuralink put a chip in Gertrude the pig's brain. It might be useful one day[23]

References

  1. ^ first words tweeted (www.businesswire.com)
  2. ^ commercialised (www.neuropace.com)
  3. ^ monkey playing the computer game Pong (theconversation.com)
  4. ^ Precision Neuroscience (precisionneuro.io)
  5. ^ recently raised (www.globenewswire.com)
  6. ^ signal processing (theconversation.com)
  7. ^ pattern recognition algorithms (recfaces.com)
  8. ^ first implanted BCIs (www.neuropace.com)
  9. ^ signal (jamanetwork.com)
  10. ^ Elon Musk claims his Neuralink brain chip could 'cure' tinnitus in 5 years. But don't hold your breath (theconversation.com)
  11. ^ may not need to inform users (fpf.org)
  12. ^ data storage rules (www.nhmrc.gov.au)
  13. ^ quantum computing (www.ncbi.nlm.nih.gov)
  14. ^ Weapons of Math Destruction (blogs.scientificamerican.com)
  15. ^ order neuromonitoring (link.springer.com)
  16. ^ military (theconversation.com)
  17. ^ neurorights (www.frontiersin.org)
  18. ^ Chile (neurorightsfoundation.org)
  19. ^ BCI Pioneers (www.bcipioneers.org)
  20. ^ Neurorights Foundation (neurorightsfoundation.org)
  21. ^ technocratic oath (link.springer.com)
  22. ^ committee (www.iso.org)
  23. ^ Neuralink put a chip in Gertrude the pig's brain. It might be useful one day (theconversation.com)

Read more https://theconversation.com/our-neurodata-can-reveal-our-most-private-selves-as-brain-implants-become-common-how-will-it-be-protected-197047
