Consumer protection: California passes law to protect neural data
After Colorado, California has now also passed a law to protect consumers' brain data.
California has passed a law that expands the California Consumer Privacy Act (CCPA) and classifies neural data as personally sensitive information. This makes California the second US state after Colorado to legally protect data produced with the help of neurotechnology.
Similar laws already exist in Chile and the Brazilian state of Rio Grande do Sul, as the Neurorights Foundation writes on X. Under the California law, neural data is "information that is generated by measuring the activity of a consumer's central or peripheral nervous system and that is not derived from non-neural information". With the law's passage, neural data now receives the same CCPA protection as, for example, genetic data, biometric data, geolocation data and consumer data.
However, the law only covers neural data collected by non-invasive consumer neurotechnologies. Invasive neurotechnologies, which are used in medical contexts, are already subject to the Health Insurance Portability and Accountability Act (HIPAA), writes Legaltech News.
Commercial use of data, vague laws
The CCPA applies to for-profit companies that have annual gross revenues above $25 million, that buy, sell or share the personal information of at least 100,000 California residents or households, or that derive at least 50 percent of their annual revenue from selling or sharing residents' data. Meta's AR glasses "Orion" could fall under this law, but many smaller start-ups would fall below these thresholds.
Nita Farahany, Professor of Law and Philosophy at Duke Science and Society, welcomes the law. In a LinkedIn post, however, she describes the definitions in the California and Colorado laws as too vague. In her view, data collected through EEGs, eye-tracking or wearables could also reveal information about a person's state – for example, stress or emotions. This data, however, is not protected.
In addition, the law excludes anonymized or aggregated data. Yet the individuals from whom such data originates could be re-identified when it is combined with other sources. Farahany and her team discuss this in a paper recently published in the journal Neuron.
(mack)