Law Practice News South Africa

Trust me, I'm a robot... even if you think my intelligence is artificial - Part 1

"The wheels of justice turn slowly" and "the law is reluctant to change". These common expressions raise the question: how does the legal fraternity deal with the challenges of Industry 4.0, and what changes may be needed in law degree curricula? It took the South African legislature many years to adopt the Electronic Communications and Transactions Act, 2002, and to change the old Roman-Dutch law definition of rape in the Criminal Law (Sexual Offences and Related Matters) Act, 2007; and we have been waiting for the Cybercrimes and Cybersecurity Bill to come into operation since 2016.

A trust issue

During my first Law of Evidence lecture to my 2019 third-year LLB students, I introduced (with some excitement) the changes to our 2019 curriculum. I mentioned the importance of Industry 4.0, and how new technology, like machine learning and artificial intelligence, already has an influence on the practice of law in South Africa and internationally, and without a doubt will have more. To demonstrate this, I asked a few questions, stated some problems and invited the students to contribute ideas and possible solutions. I emphasised the importance of a solid understanding of old and new concepts in the context of the Law of Evidence.

One of the issues I raised dealt with the concept of trust in relationships. I asked whether they thought there was a difference between trust between human beings and trust between humans and robots. I ended the discussion by referring to the well-known saying commonly attributed to former United States president Ronald Reagan, who often used the phrase "trust, but verify" during negotiations. I told the students that this might become even more important when we start relying on artificial intelligence.

A few days later, two students came to consult with me. In typical lawyer fashion, the first student started: "With all due respect sir, I do not agree with your and the US president's statement 'trust but verify'." According to him, the statement is a contradictio in terminis: you either trust, or you need proof or verification. The other student also disagreed with the phrase, but for different reasons. She asked whether anyone who knows nothing about algorithms, or about how a robot was programmed to produce the required actions and results, could really trust it. These objections were of course valid, and they forced me to revisit the important issue of the role of trust in present and future relationships.

I then realised that we teach our law students to present evidence based on facts in order to prove cases, and that every statement in court must be verified. In that context there is little room for trust without a willingness to provide proof. Accepting facts without the proof needed to ascertain the truthfulness and reliability of the evidence is almost unthinkable. The Law of Evidence goes even further with the rule against self-corroboration, which requires that corroboration come from an outside source.

Another example is the common-law admissibility requirement that a document must be "genuine and authentic". Authenticity is also linked to "new generation" documents in the form of computer printouts, data messages and the like. It is therefore not surprising that confusion may arise when we promote a certain degree of trust in, for example, computers, artificial intelligence, machine learning and new technology such as blockchain.

Is "trust but verify" a contradiction in terms, as my student alleged? Is there a difference between that phrase and "trust and verify", or can we leave out the trust part and simply require verification?

I believe a case can be made for more than one approach. In some relationships, a "trust but verify" approach may be acceptable and even essential going forward. In others, however, it can be detrimental to the relationship.

Risk factor

A possible solution may be to distinguish between two completely different scenarios. If the outcome is fundamentally important and matters more than the relationship, we should use the "trust but verify" approach. Where the relationship matters more than the outcome, we should rather not use it. When Reagan used the expression during the 1980s, he explicitly required that the confirmation be reliable, and that there be transparency regarding the nuclear stockpiles. In that specific situation, the outcome was clearly more important than the relationship between the nations.

In life or death situations, such as surgical procedures, risky transport matters, or in cases of safety or security, we may indeed trust, but also need to verify. We may even negate the trust part and just require verification.

Negotiators to whom long-term working relationships are extremely important know that the future of these relationships, which cannot be predicated on the success of one particular task, depends on unconditional trust. It is, however, when we deal not with human-human relationships but with human-machine interactions that the concept of trust becomes complicated.

In part 2 of this article, Dawie de Villiers will discuss how AI is transforming trust, accountability and the way people develop relationships with such technology.

About Prof Dawie De Villiers

Dawie de Villiers is a professor and Head of the Department of Procedural Law at the University of Johannesburg's Faculty of Law.