The Widening Chasm in AI-Powered Justice
by Sariga Premanad, Chithra Madhusudhanan

NeetiAI is a generative AI-powered legal assistant designed to help individuals navigate the complexities of the Indian legal system with ease. Promising to democratize access to justice, NeetiAI offers legal information, highlights potential pathways, and drafts complaints. Its tiered subscription model, ranging from free to premium plus, lets users choose the level of assistance that matches their needs and financial capacity.
Padma, a domestic worker, opts for the free basic version to address harassment from her employer. As the sole breadwinner for her family, she cannot quit her job. NeetiAI drafts her complaint; however, she soon encounters multiple challenges. Despite choosing speech-to-text input, the tool fails to adequately understand her queries, and she repeatedly hits daily quota limits. Her harassment claims are categorised as 'minor disputes', and she is directed to consult NGOs or governmental legal aid. In contrast, middle-class users access the premium version, which offers more refined advice, while wealthy clients benefit from the premium-plus plan, which combines AI insights with human legal consultation and delivers better outcomes.
As NeetiAI’s adoption grows, courts face a surge of AI-generated filings and backlogs mount. Overburdened advocates prioritize premium clients, leaving users like Padma marginalized. The situation escalates when an NGO files a public interest litigation exposing systemic biases in NeetiAI. The investigation reveals that NeetiAI’s free version consistently generates weaker arguments and lower compensation demands for lower-income users, while its paid versions craft more compelling cases for affluent clients, a disparity traced to biased training data and inadequate representation of marginalized groups.
The end state reveals a painful irony: a tool designed to bridge the justice gap has instead deepened it, underscoring the urgent need for responsible AI deployment.