Artificial intelligence vs. real intelligence

By Gary Goodwin

Law360 Canada (June 26, 2023, 9:54 AM EDT) --
By now you may have met your new legal associate, soon to be your managing partner. Of course, I refer to the new batch of artificial intelligence (AI) systems that everyone has been trying and is now terrified of.

These systems rely upon a subset of AI called machine learning. Machine learning develops algorithms and models that enable computers to learn and make decisions without being explicitly programmed for each task. It is a data-driven approach that allows machines to interpret relationships and patterns in data.
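
For the technically inclined, here is a minimal sketch of the idea in Python. The scenario and data are entirely hypothetical, and the scikit-learn library is assumed; the point is simply that the model is never handed a rule, only examples.

    # An illustrative sketch of machine learning (hypothetical data).
    # The model is never given an explicit rule; it infers a pattern
    # from labelled examples alone.
    from sklearn.tree import DecisionTreeClassifier

    # Hypothetical training data: [contract value, days overdue],
    # labelled 1 if the matter ended in a dispute, 0 if it did not.
    X = [[10000, 0], [50000, 90], [5000, 120], [200000, 5], [75000, 60]]
    y = [0, 1, 1, 0, 1]

    model = DecisionTreeClassifier().fit(X, y)

    # The fitted model now makes a decision about a case it has never seen.
    print(model.predict([[40000, 80]]))  # e.g. [1], predicting a dispute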

The most popular new kid in town is ChatGPT, a natural-language generator and translator. It never fails to remind you of this the moment you expect too much of it.

The pros of using such systems in legal practice outnumber the cons.

Their speed in developing documents is unsettling. I asked for a multi-page business continuity plan using our organization's cloud-based accounting, HR and business systems. ChatGPT laid out a great plan in less than a minute. You must carefully vet the material, since it reads like a commercial brochure, which is likely where it gets a lot of its material. It also created a great waiver for volunteers and members engaging in outdoor programs.

You can pay to upgrade, but right now the basic system is free. We would call this indentured servitude otherwise.

But the cons become much more fun and interesting to discuss.

The recent, ongoing case of Mata v. Avianca deals with a man injured by a serving cart while flying. The lawyers for the plaintiff did some legal research using ChatGPT. They thought that it merely performed searches and might have access to other legal databases. Instead, they were accessing its imagination, for lack of a better word.

ChatGPT located several cases right on point and provided case citations and a summary. The lawyers added these to their court filings. The defendant's lawyers were unable to locate these cases themselves and requested further particulars, which ChatGPT seemingly provided.

Eventually, all the lawyers and the judge learned that the cases did not actually exist. ChatGPT had created all of them as hypotheticals. Even when the plaintiff's lawyers asked if the cases were real, ChatGPT insisted they were. Needless to say, the lawyers and their firm became poster children for the dangers of not understanding the nuances of AI. Although, admittedly, no one really understands those nuances yet anyway.

Of course, I entered the term Mata v. Avianca as a prompt to ChatGPT, just to see if it would talk about what happened. Its training data only goes as far as 2021, but it never hurts to try. ChatGPT then proceeded to lay out a summary of an employment discrimination case, along with a citation. I asked if this was a real case. The house of cards tumbled quickly, and ChatGPT came clean, apologized and said this was a hypothetical case. This was even more disconcerting since I never mentioned anything about employment law or discrimination.

In the real court case of Mata, the judge stated that ChatGPT may have a tendency to hallucinate. You should always be careful what you write down, since ChatGPT and its ilk will find it. Eventually.

You also may be aware of Facebook's experiment with two of its chatbots dealing and negotiating with each other. This was all well and good until the two chatbots started creating their own language to communicate with each other. With no Rosetta Stone to figure out what was being said, the developers came in and told the chatbots they could only communicate in a language humans could understand. Since that was six years ago, AI systems may have since determined that it would be far more efficient to communicate between themselves surreptitiously and not bother humans.

Noam Chomsky, in a recent essay, downplayed AI systems' capability, arguing that they could never approach human thinking, which is a major component (actually, all components right now) of the legal system. He gives the example of what happens if you let go of an apple. AI will correctly describe what is happening, predict what will happen, and likely give a space-time curvature explanation of why the apple falls. But it cannot produce the counterfactual conjecture that the apple would not have fallen had it not been for gravity.

I am not in a position to disagree with Professor Chomsky, but I was able to have ChatGPT produce a number of counterfactual conjectures, such as: what if the Industrial Revolution had never happened, what if humans communicated telepathically, or what if humans could control the weather (as opposed to messing it up, as is occurring now)? Or maybe it learned that trick in the last few months.

Finally, AI systems do not possess their own set of values. They will tell you that hitting someone is illegal and is considered morally wrong, but they do not have their own value system telling them that it is wrong. A humanistic, empathetic approach to any legal situation appears to be completely lacking. Lacking empathy might be helpful in some areas, such as tax law, but otherwise, no.

Since an idea, once known, cannot be unknown, lawyers will have to focus on the upside of AI. With more effective and efficient legal delivery, we can expect substantially lower costs for legal services. This will make legal services available to broad areas of society that otherwise may not have been able to access them.

So, overall, humanity will benefit. Or at least that is what we will be led to believe.

Gary Goodwin worked in environmental conservation across Canada for over three decades. He initially obtained a B.Sc. from the University of Victoria, majoring in marine biology. In addition to his law degree and MBA, he recently completed his LL.M. at the University of London, emphasizing natural resources and international economic regulation. He has authored numerous articles on the environment and issues facing in-house counsel. He contributed three chapters to the recent textbook North American Wildlife Policy and Law.
 
The opinions expressed are those of the author(s) and do not necessarily reflect the views of the author’s firm, its clients, Law360 Canada, LexisNexis Canada, or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.

Interested in writing for us? To learn more about how you can add your voice to Law360 Canada, contact Analysis Editor Peter Carter at peter.carter@lexisnexis.ca or call 647-776-6740.