Amendments to California’s Consumer Privacy Act now include “neural data” under the category of “sensitive personal information,” which also covers biometric data.
This comes at a critical time, when many neurotechnology companies are creating products to read, interpret, and collect neural data. Beyond chip implants like Neuralink’s “Telepathy,” consumer-grade devices are more readily available than ever before. The Muse headband, for instance, uses EEG sensors to read brain activity patterns and is designed to improve the wearer’s meditation practice.
Noninvasive neurotech devices aren’t marketed as medical devices, so they go unregulated and companies are free to collect and sell user data. The new law is intended to protect that data from misuse.
State Senator Josh Becker, a California Democrat, said that the importance of protecting neural data in the state “cannot be understated.”
The bill drew broad support from the American Academy of Neurology and several other medical and privacy-focused organizations.
Some experts say the bill’s amendments weren’t necessary, as neural data was already covered under biometrics, and the change simply makes that clearer.
Other experts argue that neurotech companies already have too much access to neural data.
A report from the Neurorights Foundation, published in April 2024, examined the policy documents of 30 companies. It found that most had unrestricted access to users’ neural data, and that more than half shared this data with third parties.
It’s not just the data collected but the inferences drawn from it that pose a danger, according to experts such as Rafael Yuste, chair of the Neurorights Foundation and a neuroscientist at Columbia University.
Researchers can already decode users’ feelings and thoughts accurately, according to Yuste, so regulating brain data alone isn’t enough. The bill, he argues, needs to go further and prevent neurotechnology companies from making predictions based on users’ thoughts.
Marcello Ienca, professor of ethics of AI and neuroscience at the Technical University of Munich in Germany, said such inferences were “extremely infringing” on privacy rights, regardless of whether they came from neurotechnology, biosensors, facial recognition, or other technology.
According to Ienca, the wiser move would be to regulate the algorithms these companies use to make predictions, rather than the neurotechnology firms themselves or the brain data they collect.
California Follows in Colorado’s Footsteps
California isn’t the first state to pass this type of law.
Back in April, Colorado became the first US state to amend its privacy law to include neural data, and Minnesota is considering a standalone bill to protect brain data. But is action by individual states enough?
Many believe that federal or global regulation is needed to prevent companies from collecting and selling our brain data. The Neurorights Foundation is campaigning for a new international treaty on neurorights, overseen by an international agency to ensure compliance.
That will likely be difficult to achieve, or at the very least a long way off. In the meantime, all neurotech devices could be categorized as medical devices, which would subject them to FDA approval.
Alternatively, US federal law could be changed to recognize brain data as “sensitive health data,” ensuring it’s protected under the Health Insurance Portability and Accountability Act (HIPAA).