Brain-computer interface (BCI) technologies are advancing rapidly, and, as with other innovations such as artificial intelligence (AI), regulation often lags, leaving the world to ‘play catch-up’.

In the US, a retroactive approach to regulatory policy has become standard practice for the Federal Trade Commission (FTC), ever since former FTC commissioner Maureen Ohlhausen popularised the concept of ‘regulatory humility’, a view that, in essence, defers regulatory intervention unless or until a market failure arises.

With President Trump back in the White House, the current administration’s disdain for regulation is clear. In January, an executive order was issued asserting that regulation has “massive costs on the lives of millions of Americans, creates a substantial restraint on our economic growth and ability to build and innovate”.

BCI technology represents an altogether new privacy threat given the technology’s generation of neural data and the overall ‘intimacy’ of this data. US Senate Democratic leader Chuck Schumer wants to get ahead of the curve and shore up privacy protections while the technology largely remains at the developmental stage.

In a letter to FTC chair Andrew Ferguson, Schumer pointed out the following: “Unlike other personal data, neural data … can reveal mental health conditions, emotional states, and cognitive patterns, even when anonymised. This information is not only deeply personal; it is also strategically sensitive.”

Schumer urged the FTC to investigate whether companies developing or deploying BCI technologies were “engaging in deceptive or unfair practices” and requested that the watchdog outline how it intends to shore up existing privacy standards and clarify how these will apply to neural data.


The optimal strategy for managing neural data

According to Ravi de Silva, founder of regulatory compliance consultancy de Risk Partners, the world is entering a critical phase in emerging technology regulation, especially when it comes to the handling of neural data.

“Right now, most companies developing or investing in neurotech are operating in a space that has very little formal compliance infrastructure. That creates major blind spots for both regulators and businesses.”

De Silva鈥檚 view is that companies cannot afford to wait for regulators to tell them what responsible data use looks like.

“They need to set internal standards early and stay ahead of potential risks. Neurotech companies should be embedding governance into product development from day one. That means clear policies around what data is being collected, how it’s stored and used, who has access, and what consent looks like.”

The risks associated with neural data misuse

Although BCI technology hasn’t gone mainstream yet, the rapid advancement of AI suggests it could follow a similarly accelerated path of innovation.

Neuralink is in clinical trials and recently netted a further $650m in a Series E funding round. Once the technology is cleared, other companies will likely enter the field, and a technology that sounds like the heady stuff of science fiction could be a regular part of many people’s lives within the next decade.

“One of the dangers of the BCI information generated by companies like Neuralink is that it can be commoditised, monetised, and marketed in a way that is to the detriment of the people whose well-being depends upon the technology,” says Antony K Haynes, partner and head of the cybersecurity, data privacy & AI practice group at law firm Dorf Nelson & Zauderer.

Using Neuralink as an example, Haynes says that in the near future, many people may be reliant on such technology to lead normal lives, and this reliance could put them “at the mercy of the vendor”.

Haynes points out that a vendor could increase the cost of its service or charge users more to avoid having advertisements embedded into their BCI chip.

“The concern here is about the essential commodification of an individual’s identity and their overall ability to function as a human being.

“To me, the value in [Schumer’s] letter to the FTC is to start flagging these issues. There may be a limit to what a regulatory agency can do, but Schumer is saying, do what you can do under the Federal Trade Commission Act.”

The regulatory makeup of the US

The US does not have a nationwide privacy law; privacy policy has historically been managed on a state-by-state basis.

Kurt Osburn, a director in cybersecurity company NCC Group’s risk management and governance team, states that when conducting privacy reviews for clients, the company generally follows the guidelines of California’s Consumer Privacy Act (CCPA), as they are the most restrictive.

On the likelihood of a nationwide law being enacted any time soon, Osburn says: “We have a House of Representatives and a Senate that have to agree before it even gets to the White House for signature into law. Every year there’s something posted, and every year there’s something that runs through Congress, and every year it doesn’t make it to the President’s desk.”

This challenge is likely to be exacerbated further by the Trump administration’s stance on regulation.

“Any regulation for neural data is going to have to come from states. It’s also going to have to be driven by people that want to monitor, like Schumer and the Democrats, or people that do want that data to be protected,” says Osburn.

“I don’t see where the will to get it implemented as a law is going to come from, because it’s ultimately a ‘political ball’.”

Haynes notes that the Biden administration was not moving as quickly on privacy protections for AI, a reasonable bellwether for how the regulation of neural data may unfold, as civil liberty and privacy advocates would have liked.

“But the Trump administration is moving in the opposite direction, trying to reduce regulation and restrictions on the ability of corporations to innovate and integrate new products.”

Without any umbrella privacy regulation like the EU’s AI Act or General Data Protection Regulation (GDPR), Haynes says the reality in the US of a piecemeal, state-by-state approach to privacy regulation makes it “somewhat unstable and chaotic”. And Haynes points out that while some US states have begun to regulate AI, many corporations have lobbied at a federal level to prevent states from regulating AI.

鈥淭here’s this idea that regulation will impede innovation and the ability to create new products and services, putting America at a competitive disadvantage compared to China and other countries,鈥 Haynes explains.

The likelihood of regulatory action

Given the popularisation of regulatory humility at the FTC and the broad favour for less regulation from the Trump administration, Haynes is doubtful the FTC will act on Schumer’s letter.

Schumer’s letter proposed a 30-day window for action, and Haynes’ view has been borne out by the FTC’s lack of response within that timeframe.

“They have chosen not to act. I’m not surprised, given that is consistent with the current sentiment,” he says.

“More attention should be given to protecting data privacy in general, and the data from BCIs. I just don’t think that the current agency leaders, the commissioners in the [FTC], and the current composition of the US House and Senate is supportive of that initiative,” he concludes.

In the near term, regulation remains a ‘political football’ on both sides of the aisle, and the technology is not yet widely available commercially, the some 200,000 cochlear implants in the US that use BCI technology notwithstanding. For now, the regulation of neural data appears to sit firmly in the ‘should’ rather than the ‘will’ column of regulatory inquiry.