
Age of machines has huge implications for law

Thursday, August 18, 2016 @ 8:00 PM | By Geoff Kirbyson


The business of law is going back to the future. Driverless vehicles, seek-and-destroy military drones and artificial intelligence doctors — things that were seen as futuristic not that long ago — are here, revolutionizing society and raising all kinds of questions for the legal profession.

At first blush, if a driverless vehicle gets into an accident, is the manufacturer liable for the damages or is there a point at which the driver has an obligation to take control?

For example, the man killed earlier this year in Florida, in what is believed to be the first fatal crash involving a self-driving car, was reportedly watching a Harry Potter movie in the vehicle when the autopilot system failed to detect a white semi-trailer truck against a bright sky. The roof of the car was sheared off as it passed under the trailer, which was crossing the highway.

But there are many other issues to be addressed, such as the values that are reflected or embedded in the design of robots; the legal, ethical and social issues that arise when humans delegate tasks to robots (especially anthropomorphic robots designed to elicit certain responses from humans, such as trust); how robotic technologies and artificial intelligence (AI) undermine the assumptions that justify existing legal frameworks; and what legal reform should look like in response to these technologies and the disruption they bring.

Ian Kerr, Canada Research Chair in Ethics, Law and Technology and a professor at the University of Ottawa, gazed into his crystal ball seven years ago and decided to launch a course called “Building Better Humans,” which focused on the legal ramifications of putting machine parts into humans that not only restored regular function but enhanced it. (His students dubbed the course “Cyborg Law.”)

Think The Six Million Dollar Man and The Bionic Woman television shows of the 1970s, or the real-life athletic success of South African sprinter Oscar Pistorius, who in London in 2012 became the first double amputee to compete at an Olympic Games, thanks to the blades attached to his stumps. (Pistorius was subsequently convicted of murdering his girlfriend, whom he killed in 2013.)

Things are continuing to move at light speed, and that means lawmakers and policymakers need to pick up the pace. Not a believer? Consider, then, that IBM’s Watson, the supercomputer that gained widespread fame when it blew away a series of former champions on the television quiz show Jeopardy! in 2011, has branched out into making medical diagnoses.

“We’re in a world where things that used to be done exclusively by human beings are now being delegated to machines,” Kerr says.

In many ways, Kerr believes robots and AI have the potential to revolutionize the legal profession, just as the Internet has over the past few decades.

“We’re at the very stage that (Microsoft Corp. co-founder) Bill Gates was with computer technology in the 1980s,” he says.

Still not a believer? Maybe you should check in with Siri on your iPhone. The helpful confidante with human attributes and the ability to interact with people receives more than one billion requests per week.

Kerr asks how the law should treat driverless vehicles when they are programmed in ways that make the outcomes they reach unpredictable.

“The whole point of having an automated vehicle is it will sometimes make decisions that (human drivers) won’t make. It will be safer than us, it will take better routes, use less gas and cause less congestion,” he says.

“What makes AI smart is machine learning. The outcomes generated by the machines aren’t predictable by the programmers. They’re sometimes unpredictable by design.”

Kerr is no longer the only academic focused on the cyborg field. Last year, the University of Windsor posted the first-ever position in a North American law school focused on law, robots and society. One of his former students, Kristen Thomasen, accepted the role.

She admits there is no black-and-white answer to what the law must do to catch up with technological advances. She also cautions that not all technologies have the same effect of “getting ahead” of the law, which is often flexible enough to incorporate new technologies by analogy to existing ones.

“But when a technology creates new opportunities for human activity, behaviour, communication or interaction, then the existing law, which is based on now-old assumptions about human capabilities, might no longer be able to address some of the consequences or uses of the technology. In that case, analogizing to existing technologies can problematically obscure some of the important legal challenges raised by the new technology,” she says.

The growth of this emerging field will require new kinds of legal training, too. Thomasen believes that an interdisciplinary approach in law schools would be beneficial for the next generation of lawyers interested in this area.

“The more that future technology lawyers know about the current and potential capabilities of technologies, the better they will be able to think about how the existing law applies or needs to change,” she says.

Robots and AI intersect with so many areas of the law while also raising their own legal issues, such as how to regulate emerging technologies.

“Tort, property, and privacy are quite predominant right now, along with international humanitarian and treaty law, labour law and criminal law. New lawyers would be all the better equipped to counsel their clients when they have a sense of the implications that automated technologies might have for their practice area,” she says.

But Kerr isn’t prepared to let machines take over everything they touch. He is actively involved in a campaign against killer robots because he believes that decisions to launch military strikes in which people are likely to be killed should be made by humans, not delegated to machines.

He readily admits he’s swimming upstream by trying to create a new generation of lawyers who look to the future in what is historically a “very backward-looking” profession.

“I’m trying to respond to typical legal training, which is rooted in history and precedent, and instead look to future and emerging technologies [so lawyers can] try to pull some of that towards them in the present,” he says.