High Courts across India have differed in their stances on using ChatGPT as part of the legal process. Where has it been used, and what are some criticisms of the practice?
The Manipur High Court last week stated that it “was compelled to do extra research through Google and ChatGPT 3.5” while deciding on a case. This is not the first time a High Court has used artificial intelligence (AI) for research. But in India — as in the rest of the world — courts have been rather cautious about the use of AI for judicial work.
How the Manipur HC used ChatGPT in a case
Zakir Hussain, 36, was “disengaged” from his district’s Village Defence Force (VDF) in January 2021, after an alleged criminal escaped from the police station while Hussain was on duty. He never received a copy of the order dismissing him.
After Hussain approached the Manipur High Court challenging his dismissal, Justice A Guneshwar Sharma, in December 2023, directed the police to submit an affidavit detailing the procedure for “disengagement of VDF personnel”. But the affidavit submitted was found wanting, and did not explain what the VDF was. This “compelled” the court to use ChatGPT for further research.
ChatGPT said that the VDF in Manipur comprises “volunteers from the local communities who are trained and equipped to guard their villages against various threats, including insurgent activities and ethnic violence” — information that Justice Sharma used in his ruling.
Ultimately, he set aside Hussain’s dismissal, citing a 2022 memorandum issued by the Manipur Home Department which stated that upon dismissal, VDF personnel must be given “an opportunity to explain in any case of alleged charges” — an opportunity the petitioner was denied in this case.
High Courts’ differing stances on using ChatGPT
In March 2023, Justice Anoop Chitkara of the Punjab & Haryana High Court referred to ChatGPT while denying the bail plea of a certain Jaswinder Singh, accused of assaulting an individual and causing his death. Justice Chitkara found that there was an element of “cruelty” to the assault — a ground which can be used to deny bail.
To supplement his reasoning, Justice Chitkara posed a question to ChatGPT: “What is the jurisprudence on bail when the assailants are assaulted with cruelty?” The court’s eventual order contained the AI chatbot’s three-page response, which included the observation that “the judge may be less inclined to grant bail or may set the bail amount very high to ensure that the defendant appears in court and does not pose a risk to public safety.”
Justice Chitkara, however, clarified that this reference to ChatGPT was not the same as expressing an opinion on the merits of the case, and that it “is only intended to present a broader picture on bail jurisprudence, where cruelty is a factor.”
The Delhi High Court has been less receptive to the use of AI in courts. In August 2023, Justice Pratibha M Singh ruled in favour of luxury shoe designer Christian Louboutin in a trademark case.
Louboutin’s lawyers had used ChatGPT-generated responses to show that the brand had a reputation for “spike shoe style” with a “red sole” — a design which was being copied by another brand called Shutiq. Justice Singh held that ChatGPT cannot be used to decide “legal or factual issues in a court of law”, highlighting the possibility of “incorrect responses, fictional case laws, imaginative data etc. generated by AI chatbots”.
Elsewhere in the world
This ‘fictional case laws’ scenario is not a mere hypothetical. In 2023, a Manhattan federal judge fined a lawyer $5,000 for submitting fictitious legal research generated using ChatGPT. The lawyer had filed a brief citing non-existent cases, with titles such as Varghese vs China Southern Airlines and Shaboon vs Egypt Air, in a personal injury suit involving the Colombian airline Avianca.
Last December, the UK judiciary released a set of guidelines about the use of generative AI in courts. While judges were allowed to use ChatGPT for basic tasks such as summarising large bodies of text, making presentations, or composing emails, they were cautioned not to rely on AI for legal research or analysis.
No such guidelines exist in India.
Written by Ajoy Sinha Karpuram
Source: Indian Express, 28/05/24