Pitfalls and traps for legal practitioners when using ChatGPT

September 1st, 2023

By Prof Michele van Eck

The use of technology is fast changing not only the law but also the practice of law. ChatGPT is a good example of a technological tool used to increase the efficiency and speed of completing tasks. Despite these potential benefits, there are significant risks in using such tools in the legal profession, risks that are closely linked to the ethical and professional duties of legal practitioners. These risks and related pitfalls include, for example, maintaining the privilege and confidentiality of a client’s information when legal practitioners use ChatGPT, the ethics of charging fees for tasks completed by ChatGPT, and ensuring the quality and correctness of any work product produced by ChatGPT.

Maintaining privilege and confidentiality

Section 3.6 of the Code of Conduct for all Legal Practitioners, Candidate Legal Practitioners and Juristic Entities (the Code) requires legal practitioners to ‘maintain legal professional privilege and confidentiality regarding the affairs of present or former clients or employers, according to law’. However, when using chatbots like ChatGPT, privileged and confidential information is at risk. For instance, OpenAI may save and use any information that is submitted to ChatGPT to improve OpenAI’s services (Michael Schade ‘How your data is used to improve model performance’ (https://help.openai.com, accessed 28-7-2023)). In addition, OpenAI notes that ‘conversations may be reviewed by [their] AI trainers to improve [their] systems’ (Natalie Staudacher ‘What is ChatGPT?’ (https://help.openai.com, accessed 28-7-2023)). This means that information provided to ChatGPT is stored and is reviewable by OpenAI. Where legal practitioners use ChatGPT in their legal practice, any information provided to the chatbot may therefore pose a significant risk to the privileged, confidential and proprietary information of clients. In fact, OpenAI specifically warns users against providing sensitive information when using ChatGPT (Staudacher (op cit) at para 8).

To avoid and mitigate such risks, legal practitioners should refrain from disclosing any personal, sensitive, confidential, privileged, or proprietary information of clients or their employer when using ChatGPT. Failure to do so would not only be a potential breach of a legal practitioner’s professional and ethical duties but may also be a breach of their confidentiality obligations towards their clients and employer.

Charging of fees

ChatGPT can complete tasks much faster than the average legal practitioner could ever hope to achieve. A legal practitioner may, under these circumstances, be tempted to leverage such efficiencies and charge the same fee for the work produced by ChatGPT as they would have charged had they completed the task themselves in the ordinary manner. Such conduct may be ethically and professionally questionable. Similar conduct (in the context of standardised documents) was strongly criticised by our courts when legal practitioners embarked on the mass production and use of court applications (effectively recycling standard documents, like affidavits) in the matters of Cele v the South African Social Security Agency and 22 Related Cases 2008 (7) BCLR 734 (D), Sibiya v Director-General: Home Affairs and Others [2009] 3 All SA 68 (KNP), Absa Bank Ltd v Havenga and Similar Cases 2010 (5) SA 533 (GNP), and Tekalign v Minister of Home Affairs and Others and Two Similar Cases [2018] 3 All SA 291 (ECP).

Legal practitioners should avoid the temptation of charging the same fees when less time is spent on the work produced for clients (especially when the only work done is of an administrative nature). This notwithstanding, a legal practitioner is entitled to a reasonable fee for the work produced but is not to overreach or charge fees that are unreasonably high (see paras 3.12 and 18.7 of the Code). Determining what is reasonable would depend on the actual amount of time spent, the type of work done, and the complexity of the matter. Put differently, ethical and professional duties prevent a legal practitioner from charging the same fee for completing standard form documents in a purely administrative fashion as for drafting or amending a document, and this principle may also apply to producing work through technologies like ChatGPT where little or no work is done by the legal practitioner (see, for example, M van Eck ‘Ethical and professional duties in the use of recycled legal instruments: A trio of cases’ (2020) 2 TSAR 354).

Integrity of the work product

The risk with any technology, including ChatGPT, is that the information provided is not always accurate or correct. In fact, OpenAI admits this by stating that ‘ChatGPT will occasionally make up facts or “hallucinate” outputs’ (Staudacher (op cit) at para 13). In addition, OpenAI warns that ‘ChatGPT is not connected to the internet, and it can occasionally produce incorrect answers. It has limited knowledge of world and events after 2021 and may also occasionally produce harmful instructions or biased content’ (Staudacher (op cit) at para 4).

In this regard, a legal practitioner should be cautious in accepting the work produced and answers received from ChatGPT and should scrutinise all output from the chatbot to ensure its accuracy, correctness and appropriateness. Ultimately, a legal practitioner remains responsible for their work product and cannot blame ChatGPT (or any other technology that may have been used) for mistakes or inaccuracies. This principle is found in para 18.14 of the Code, which notes that any work produced by a legal practitioner must be of a ‘degree of skill, care or attention, or of such a quality or standard, as may reasonably be expected of [a legal practitioner]’.

Concluding remarks

Although ChatGPT may be a valuable tool, especially for increasing efficiency and optimising time, the use of such technologies is not without risk. Legal practitioners should remain vigilant and use such technological tools with caution. After all, ChatGPT is not a ‘shortcut’ for legal practice.

Ultimately, regardless of how the work is produced (whether manually or through technology), a legal practitioner’s ethical and professional duties remain intact and must be observed. In fact, one may say that a legal practitioner must be even more vigilant in preserving professional integrity and honesty when using technologies like ChatGPT (see, for example, para 3.1 of the Code).

Prof Michele van Eck BCom (Law) (RAU) LLB LLM (UJ) LLD (UP) BTh (SATS) is an Associate Professor and head of the Department of Private Law at the University of Johannesburg.

This article was first published in De Rebus in 2023 (Sept) DR 11.
