If a business relies on ChatGPT and the tool makes a mistake, who is to blame? In the age of artificial intelligence (AI), businesses are increasingly using tools like ChatGPT to streamline customer service, enhance marketing efforts, and generate content. AI can simplify content production, particularly customer communications, and it can improve efficiency by freeing human resources for other activities. AI tools, and large language models (LLMs) in particular, have huge potential to enhance the way we work, but they must be used judiciously. They should not be a substitute for expertise or originality. But where does liability sit when things go wrong with AI? What must businesses consider regarding content generated by AI models like ChatGPT in customer interactions and public-facing information?
In a very short time, AI-generated content has become an integral part of modern business operations. From chatbots that handle customer inquiries to content generation tools that create blog posts and social media updates, AI is revolutionising the way businesses communicate. ChatGPT in particular has gained attention for its ability to generate human-like text and handle a wide range of conversational tasks. Even technology laggards cannot ignore its impact and potential. Smart organisations will develop a set of guidelines governing the use of AI to ensure it is used consistently, ethically, and responsibly. This will also help manage liability.
AI-generated content in business operations creates several potential areas of liability:
Businesses can mitigate the risks associated with AI-generated content by taking several proactive steps:
Currently, South Africa does not have comprehensive legislation governing the use of AI and generative language tools. Some countries are beginning to publish white papers and consider legislation, and legal precedents are emerging. If a liability issue arises, courts may consider factors such as –
How might a customer suffer harm as a result of AI? The business’s responsibility is no different from the responsibility it bears when providing customers with any information, whatever the source. If a company holds itself out as an authority on a product or service offering, it has a duty to provide accurate information. Here is a theoretical example:
A retailer uses ChatGPT to handle online customer queries. A customer asks about the compatibility of a specific electronic device with their existing set-up. ChatGPT provides inaccurate information, and the customer purchases the device, only to find it is incompatible, incurring additional costs and inconvenience. The business might face liability due to –
The outcome of such a case would depend on various factors, including the business’s efforts to ensure the accuracy of its AI output, its transparency with the customer, and the provisions of the Consumer Protection Act 68 of 2008. A consultation is currently underway in Europe on producer liability for digital products, but the way forward remains unclear, and that process is unlikely to offer a business any defence.
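To make ‘efforts to ensure AI accuracy’ concrete: one practical control is to ground the AI’s answer in the business’s own authoritative records rather than letting the model guess. The sketch below is a minimal illustration only; product_catalogue and generate_reply are hypothetical stand-ins for whatever database and AI service a retailer actually uses.

```python
# Illustrative sketch only: ground AI answers in verified data before they
# reach a customer. All names here are hypothetical stand-ins.

# Authoritative compatibility records; in practice, the retailer's product
# database would play this role.
product_catalogue = {
    ("DeviceX", "SetupY"): False,  # known to be incompatible
    ("DeviceX", "SetupZ"): True,
}

def generate_reply(prompt: str) -> str:
    # Stand-in for a call to a generative AI service such as ChatGPT;
    # it simply echoes the instruction so the sketch runs on its own.
    return f"[AI draft] {prompt}"

def answer_compatibility_query(device: str, setup: str) -> str:
    verified = product_catalogue.get((device, setup))
    if verified is None:
        # No authoritative record: escalate rather than let the AI guess.
        return "We need to confirm this with a specialist and will follow up."
    # Supply the verified fact to the AI, so it only phrases the answer
    # rather than deciding it.
    fact = "is" if verified else "is not"
    return generate_reply(
        f"Politely tell the customer that {device} {fact} "
        f"compatible with {setup}."
    )

print(answer_compatibility_query("DeviceX", "SetupY"))
```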
As businesses increasingly integrate AI-generated content into their operations, understanding and managing liability is essential. They need to balance AI’s potential for efficiency and customer engagement against its risks. Organisations need to raise awareness internally of accuracy, transparency, data privacy, legal compliance, and customer education to reduce their exposure to liability when using AI-generated content. They should review and verify content against a reliable source and be alert to potential biases. Attribution also helps, for example: ‘This content was created by a generative AI tool with respect to [whatever product, service, or instruction applies].’
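Where content is generated programmatically, the review and attribution steps above can be enforced in the publishing workflow itself. The sketch below is written under assumptions, not as a prescribed implementation: release_content, reviewed_by, and the attribution wording are all illustrative.

```python
# Illustrative sketch only: attach an attribution notice and block release
# of AI drafts that no human has reviewed. All names are hypothetical.

AI_ATTRIBUTION = ("This content was created by a generative AI tool "
                  "with respect to {subject}.")

def release_content(draft: str, subject: str, reviewed_by: str | None) -> str:
    """Return publishable text: a reviewed draft plus attribution notice."""
    if reviewed_by is None:
        # Unreviewed AI output goes to a human queue, never to the public.
        raise ValueError("AI-generated draft requires human review first")
    return f"{draft}\n\n{AI_ATTRIBUTION.format(subject=subject)}"

# Example: a verified draft goes out with the attribution attached.
print(release_content(
    "DeviceX works with SetupZ out of the box.",
    subject="DeviceX compatibility",
    reviewed_by="j.smith",
))
```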
The landscape of AI-generated content liability is ever evolving. Businesses looking to harness AI’s benefits must take care to minimise the potential for legal challenges.
Simon Dippenaar BBusSci LLB PG Dip Legal Practice (UCT) is a legal practitioner at Simon Dippenaar and Associates Inc in Cape Town.
This article was first published in SA Lawyer in 2024 (January) DR 2.