Maja Daruwala is the chief editor of the India Justice Report (IJR), which regularly evaluates the structural capacity of states to deliver justice and ranks them accordingly. She is also a senior adviser at the Commonwealth Human Rights Initiative, which focuses on police and prison reform and the right to information. She is a board member of the Population Foundation of India, the Media Foundation, and the Centre for Social Justice, and sits on the advisory board of the Global Governance Forum.
In this interview with IDR, Maja talks about data’s engagement with democracy and the role it plays in public discourse and policymaking. She also discusses the evolution and impact of IJR, which is a first-of-its-kind effort using state-level public data to present a holistic view of the justice system to enable policy change.
We live in an unprecedented age of access to information, and misinformation. This can often lead to faults in decision-making—whether it’s by leaders or citizens. As someone who has worked extensively on the public’s right to information, can you talk about the role of data in a democracy?
Data is the bedrock of policy. Accurate data gives you an insight into reality and equips both government and active citizens with the capacity to engage in meaningful dialogues on the basis of objective facts.
Regularly collected and published data helps assess progress as well as chronic and acute problems. It sets the course for short-, mid-, and long-term course correction. Data helps evaluate the distance between reality and benchmarks, and suggests timelines for reaching them. Data is also an important tool in the advocacy quiver of active citizens, whether they are involved in health, education, environment, fighting hatred, or, as I am, in the area of justice.
The government is putting out more data and citizen groups are using it, as with the India Justice Report.
Earlier, few people valued the use of data, and indeed there was not much data available. But with digitisation, more and more data is produced and shared globally. In fact, there are so many national and international indices nowadays that it can be quite bewildering. The government is putting out more data and citizen groups are using it, as, say, with the India Justice Report. Citizen groups, think tanks, nonprofits, and special interest groups are also generating data from their own vantage points. This is a way to contribute to the national project of strengthening democracy and development.
But doesn’t data digitisation create more exclusion, in a country like India where digital literacy still has a long way to go?
We need to popularise the use of data for advocacy right down to the grassroots—through small and focused measurements of what is really going on on the ground. Currently, this is still in the domain of the elite.
Democracy, for me, is characterised by the ability of people to have debates. And to disagree, if need be; to think and say what they want. We live in an age where, across the world, misinformation and disinformation are rife. It is easy to be diverted from basic truths. Good data represents objective truth, at least as much as it can.
Being part of a democracy requires active citizens to participate. But informed participation is possible only when honest, objective information is freely available.
It’s very important to put out objective information, especially where society is being polarised by opinion that poses as fact.
The fierceness of ideology, the lapses of conjecture, and the vagaries of belief and impression can be tempered by data. The objectivity that we gain from data—just as with knowledge of law—can provide a common base for debates and dialogues.
So, it’s very important to put out objective information, especially where society is being polarised by opinion that poses as fact. Now, one may say that data can be ideologically skewed, that it is only as good as its methodology, or that it reflects the biases of the compiler. And all that is true. But questioning methodology, bias, and ideology surely leads to nuance and improvement. All fields—be it medical science or environment or civil liberties—have grey areas that must be navigated. Science itself is always questioning long-held beliefs through new inquiry based on experimentation and innovation, or by interrogating older statistical analyses and studies. Without these battles, there would be no progress.
That said, there is concern about statistics and data around the world: how much is collected, by whom, for what purpose, and how much of it is held close or published. Indeed, there is justified concern over the amount of personal data that government and commerce collect, as well as questions about what they use it for, power accumulation, and loss of privacy.
IJR began as a project to use public data to provide a holistic view of India’s justice system, and thereby point out where improvements can be made. Can you tell us a little bit more about it?
IJR was first published in 2019 and comes out every 15 months or so. It is a collaborative effort of several organisations: Centre for Social Justice, Common Cause, Commonwealth Human Rights Initiative, DAKSH, Prayas, and Vidhi Centre for Legal Policy. Originally born under the aegis of Tata Trusts with a mission to contribute to improving India’s justice delivery system, it measures the structural capacity of states to deliver justice and on this basis, in a first-of-its-kind effort, it ranks states. It relies entirely on the government’s own statistics and ranks the adequacy of budgets, human resources, infrastructure, workloads and diversity of police, judiciary, legal aid, and prisons. Quite uniquely, it brings together in one place a large number of statistics—that normally remain siloed in individual departments—to build a holistic picture of justice delivery, measure trends and improvements, and make visible stubborn bottlenecks that have remained unaddressed over time.
Is there an upcoming publication of IJR? What contemporary issues does it cover and how has it evolved over the years?
We’re getting ready for IJR’s third publication in April 2023. This edition, like its predecessors, seeks to understand whether the police, the judiciary, prisons, and legal aid have the structural capacity to fulfil their mandates. This is its focus, and this has not changed. For instance, if the data tells you that a large percentage of sanctioned judicial posts in a given jurisdiction have remained vacant for several years, you can begin a discussion about how much this level of vacancy contributes to backlog, or whether it does at all.
Some will argue that judge numbers and backlog have little correlation, and that what matters is the rate of disposal of cases. IJR gives statistics for both and encourages deeper dives into an issue that has been debated for decades. It helps bring nuance to the discussion.
Let’s take another example. The Model Prison Manual aspires to shift prisons away from being sites of retribution to being places of rehabilitation that return a more enabled person into society. It’s a progressive move. So, by looking at the statistics on the availability of social workers, psychologists, and equipment provided to prisons, and budgets devoted to skilling up prisoners, one can gauge how far each state has committed itself to turning this intention into reality.
Each report seeks to add depth and nuance via new indicators and pillars. In IJR 2020, for example, we added a section to highlight improvements and shortfalls within each state since 2019. There are many more sub-systems of the justice delivery system yet to be covered—prosecution, forensics, specialised agencies such as human rights courts, commissions, and others. With each addition, a clearer picture will emerge of how far the justice system has equipped itself to fulfil justice needs.
So, while we continue to add these new pillars, our aim has always been to reflect data back—as a mirror—to policymakers and help them see the whole picture in one place, rather than in fragments. It is important to recognise that this reflection is not a confrontation or a complaint. It is a resource to help all stakeholders—policymakers, media, and active citizens—engage with one another with some degree of accuracy. The mirror can only reflect what it sees before it.
Besides the substantive issue of better justice delivery, there is another conversation to be had about the accuracy, regularity, consistency, and granularity of data: what is included, what is left out, and how data can constantly be better collected, standardised, and validated to fit the public purposes for which it is collected.
How have the government and the bureaucracy responded to the report?
We have presented before parliamentary committees, and they have listened to us with great attention. I believe the reason they listen is that they know we have done all the grunt work to put together something that can give them a holistic insight into the justice system. Responses in Parliament have also cited the report. States that have done well on certain indicators do take encouragement from the report and mention it in official documents. Others have on occasion held internal meetings to see what can be improved. We do see the report being used in various educational settings, and this spread in its use augurs well.
Data is built for public purposes and belongs to the public.
That being said, in our experience, there does exist a hesitancy to share data. When we make Right to Information (RTI) requests, sometimes jurisdictions ask who we are and why we need this information. These are irrelevant questions. Data is built for public purposes and belongs to the public. This is something that needs to be repeated again and again. It is collected as a duty by public servants and is paid for by public finances, and there really has to be a strong and reasoned cause to hold it back—especially since the information we ask for should in any case be in the public domain under the RTI Act.
IJR seeks to hasten change for the better. There are many factors that influence change, and it is hard to attribute change to just one event or source. IJR uses data of one kind to show one aspect of justice delivery—the status of specific inputs and trends. There are other facets of delivery and access that need interrogation—procedures, internal subcultures, individual and institutional attitudes, and accountability mechanisms, to mention but a few. Data generated from surveys, for instance—especially satisfaction surveys and social audits of how justice is experienced on the ground—needs to be built into governance and welcomed, whether it is internal or generated by active citizens.
Meanwhile, of course, we believe our consistent work is providing signposts on the path to change. IJR is a constant, quality resource that helps create an environment for reforming the delivery of justice for all. One hopes that as a benchmark for excellence, the report will prompt states to deliver the best justice to people.