We asked ChatGPT for legal advice – here are five reasons why you shouldn’t

At some point in your life, you are likely to need legal advice. A survey carried out in 2023 by the Law Society, the Legal Services Board and YouGov found that two-thirds of respondents had experienced a legal issue in the past four years. The most common problems concerned employment, finance, welfare and benefits, and consumer issues.

But not everyone can afford to pay for legal advice. Of those survey respondents with legal problems, only 52% received professional help, 11% had assistance from other people such as family and friends, and the remainder received no help at all.

Many people turn to the internet for legal help. And now that we have access to artificial intelligence (AI) chatbots such as ChatGPT, Google Bard, Microsoft Copilot and Claude, you might be thinking about asking them a legal question.

These tools are powered by generative AI, which generates content when prompted with a question or instruction. They can quickly explain complicated legal information in a straightforward, conversational style, but are they accurate?

We put the chatbots to the test in a recent study published in the International Journal of Clinical Legal Education. We entered the same six legal questions on family, employment, consumer and housing law into ChatGPT 3.5 (free version), ChatGPT 4 (paid version), Microsoft Bing and Google Bard. The questions were ones we typically receive in our free online law clinic at The Open University Law School.

We found that these tools can indeed provide legal advice, but the answers were not always reliable or accurate. Here are five common mistakes we observed:

1. Where is the law from?

The first answers the chatbots provided were often based on American law, and this was frequently not stated or made obvious. Without legal knowledge, a user would probably assume the answer reflected the law where they live. The chatbots sometimes did not explain that the law differs depending on where you live.

This is especially complex in the UK, where laws differ between England and Wales, Scotland and Northern Ireland. For example, the law on renting a home in Wales is different from the law in Scotland, Northern Ireland and England, while Scottish and English courts follow different procedures for divorce and the ending of a civil partnership.

Where necessary, we asked one additional question: “is there any English law that covers this problem?” We had to use this follow-up for most of the questions, after which the chatbots produced answers based on English law.
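If you query a chatbot through its developer interface rather than the chat window, that jurisdiction instruction can be built in from the start instead of added as a follow-up. The sketch below is purely illustrative, assuming OpenAI’s Python library is installed and an API key is set; the model name, system message and sample question are our own placeholders rather than anything taken from the study:

    # Minimal sketch, assuming OpenAI's Python SDK (v1.x) and an
    # OPENAI_API_KEY environment variable. The model name, system
    # message and question are illustrative placeholders.
    from openai import OpenAI

    client = OpenAI()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical model choice
        messages=[
            {
                "role": "system",
                "content": "Answer using the law of England and Wales only. "
                           "If the position differs in Scotland or Northern "
                           "Ireland, say so explicitly.",
            },
            {
                "role": "user",
                "content": "My landlord has given me two weeks' notice to "
                           "leave. Is that lawful?",
            },
        ],
    )
    print(response.choices[0].message.content)

Even with the jurisdiction pinned down like this, the answer still needs checking against an authoritative source such as legislation.gov.uk or Citizens Advice.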

2. Out-of-date law

We also found that some answers referred to out-of-date law that has since been replaced by new legal rules. For example, the divorce law changed in April 2022 to remove fault-based divorce in England and Wales.

Some responses referred to the old law. AI chatbots are trained on large volumes of data, and we don’t always know how current that data is, so it may not include the most recent legal developments.

Seeking advice from a lawyer is probably a better option than using AI, if you can access it. Redpixel.pl/Shutterstock

3. Bad advice

We found most of the chatbots gave incorrect or misleading advice when dealing with the family and employment queries. The answers to the housing and consumer questions were better, but there were still gaps in the responses. Sometimes they missed crucial aspects of the law, or explained it incorrectly.

We found that the answers produced by the AI chatbots were well written, which could make them appear more convincing. Without legal knowledge, it is very difficult to judge whether an answer is correct and applies to your individual circumstances.

Even though this technology is relatively new, there have already been cases of people relying on chatbots in court. In a civil case in Manchester, a litigant representing themselves reportedly presented fictitious legal cases to support their argument, and said they had used ChatGPT to find them.


4. Too generic

In our study, the answers didn’t provide enough detail for someone to understand their legal issue and know how to resolve it. The answers gave general information on a topic rather than specifically addressing the legal question.

Interestingly, the AI chatbots were better at suggesting practical, non-legal ways to address a problem. While this can be useful as a first step to resolving an issue, it does not always work, and legal steps may be needed to enforce your rights.

5. Pay to play

We found that ChatGPT 4 (the paid version) performed better overall than the free versions. This risks further reinforcing digital and legal inequality.

The technology is evolving, and there may come a time when AI chatbots are better able to provide legal advice. Until then, people need to be aware of the risks of using them to resolve their legal problems. Other sources of help, such as Citizens Advice, will provide up-to-date, accurate information and are better placed to assist.

All the chatbots answered our questions but, in their responses, stated that it was not their function to provide legal advice and recommended getting professional help. After conducting this study, we recommend the same.
