Writing laws for when mind meets machine


Stuart Corner
Contributor

In 2019, Elon Musk announced he hoped to implant a two-way communication device into a human brain in 2020. He didn’t. The US Food and Drug Administration won’t let him, just yet.

However, he has put one into a pig and talks about, ultimately, a symbiotic relationship between neurotechnologically enhanced humans and AI. The company behind this is Neuralink, co-founded by Mr Musk in 2016 and into which he has invested $US100m.

If the technology comes anywhere near achieving what Elon Musk is suggesting, it will raise enormous ethical and legal issues. Not least among them is the question of to whom criminal responsibility should be sheeted home for an illegal act undertaken by a person fitted with such an implant.


These thorny issues were discussed in a webinar hosted by the Australian Society for Computers and the Law (AusCL), presented by Dr Allan McCay and Associate Professor Tara Hamilton, and moderated by Tyrone Kirchengast, an associate professor and associate dean at Sydney Law School.

Amongst other roles, Dr McCay is a member of the NeuroRights Network, an international group working towards responsible innovation in neurotechnology, and is the author of Neuro-interventions and the Law: Regulating Human Mental Capacity.

Professor Hamilton is from the UTS School of Electrical and Data Engineering and, amongst other roles, is an associate editor for Frontiers in Neuroscience (Neuromorphic Engineering).

Professor Hamilton offered a summary of some current brain interface technologies and achievements and was quick to put a damper on Mr Musk’s claims, saying his pig demonstration “didn’t present anything that hasn’t been seen or done before,” adding that “we are really not good at connecting with the brain in a way which doesn’t destroy it, and which allows us to have a level of accuracy and sensitivity without actually hurting tissue.”

A whole new kind of ‘thoughtcrime’

In George Orwell’s 1984, a person was guilty of thoughtcrime simply by holding views contrary to those of Oceania’s totalitarian government. But with neurotechnology, Dr McCay canvassed the possibility that someone could commit a crime by controlling technology with their brain. This, he said, would be “quite disruptive for the law.”

He discussed a hypothetical example in which, through a brain-computer interface, an individual – whom he called John – was able to commit the crime of intimate image abuse, better known as revenge porn, without any manual activity.

“A legal question might be: what did John do to make those images available to people on social media? A court might say a mental act, an act of the imagination, was the conduct constituting actus reus [the criminal act as distinct from mens rea, the mental element of a criminal act].”

This would be a radical step for the law, he suggested. “We in criminal law have learned that there’s the mental part, the mens rea, and that’s the guilt. And then there’s the conduct part that’s bodily. Now it seems this kind of distinction between the actus reus and the mens rea is not quite so firm as it was.”

Neurotechnology versus recidivism

As brain-computer interface (BCI) technology evolves, Dr McCay suggested the legal challenges would increase significantly. “What happens if the BCI malfunctions, so it wrongly decodes the signal [that posts the intimate image]? What happens if John’s BCI gets hacked?

“Judges, when they sentence, have to consider deterring the offender. So perhaps the device could be reprogrammed. So, if John were to start looking at intimate images and assembling them, that could trigger a warning.”

This would require the involvement of neurotechnologists, he said. “Could the criminal justice system start to ask people like [Associate Professor Tara Hamilton] to get involved in deterrence and rehabilitation? And should engineers be engaged in that kind of project?”

“BCIs could be disruptive for criminal law in the ways that I described and, quite frankly, in many others.”

Legislators on the front foot

Dr McCay said some countries were already anticipating the potential of neurotechnology with legislation.

“Chile is changing its constitution, and passing some legislation called the Neuroprotection Bill. They want to be on the front foot, guiding responsible development of neurotechnology with an inter-disciplinary panel.”

According to Columbia University’s NeuroRights Initiative, Chile’s constitutional amendment “defines mental identity, for the first time in history, as a right that cannot be manipulated.”

The bill “includes five fundamental principles: the right to personal identity, free will, mental privacy, equitable access to technologies that augment human capacities, and the right to protection against bias and discrimination. … [and] defines all data obtained from the brain as ‘NeuroData’ and applies to them the existing legislation on organ donations, outlawing the commerce of NeuroData. It also applies medical legislation to the future use and development of Neurotechnology.”

Chile seems to be ahead of the game, but moderator Tyrone Kirchengast expressed confidence that, elsewhere, the law would rise to the challenges posed by neurotechnology.

“The principles of criminal responsibility have served us well for centuries. Don’t underestimate their inherent flexibility to adapt to the unknown,” Professor Kirchengast said.

