In a remote Gadaba tribal village in present-day Manyam district of Andhra Pradesh, 17-year-old Ganesh* was preparing to enrol in a private junior college in Parvatipuram district after completing class 10. He had done everything expected of him—attended school, stayed in a Tribal Welfare hostel away from home, and passed his exams. Yet errors in his Aadhaar card brought his education to an abrupt and unexpected halt.
The only correct detail on Ganesh’s Aadhaar was his photograph. His name, his father’s name, and his date of birth were those of another person. These errors, made at the time of enrolment, went undetected by his family. When he went to apply for college, the authorities stated that Aadhaar was mandatory. Because of the major errors in his card, Ganesh was denied admission.
Ganesh’s story is not an exception. It captures how digital welfare systems convert small administrative errors into life-altering exclusions, and how the burden of correction is shifted entirely onto those least equipped to bear it. When identity systems become rigid gatekeepers rather than enabling tools, access to basic rights of education, food, wages, and dignity becomes conditional.
Through our research and work with communities to improve awareness and access to social welfare in Adivasi regions, we have seen first-hand how digital authentication and database-driven processes increasingly shape access to basic entitlements.
We argue that India’s tech-first welfare architecture systematically produces exclusion by prioritising administrative control over constitutional guarantees of access and dignity.
The technocratic turn in welfare governance
Over the past decade, India’s welfare architecture has undergone a quiet but profound shift. Concerns about welfare leakages, duplication, inefficiencies, and corruption have increasingly been addressed through technology-driven control, instead of administrative reform or institutional accountability. Digitisation has been framed as a neutral, efficiency-enhancing solution, which can fix governance failures by replacing human discretion with automated verification and real-time monitoring.
This has produced a new welfare logic. Identity authentication, continuous surveillance, and database integration are no longer supporting tools; they have become the organising principles of welfare delivery. Systems such as Aadhaar-based authentication, electronic KYC for rations, app-based attendance in rural employment programmes, and Aadhaar-linked payments are justified as technical improvements. In reality, they reflect a deeper transformation, in which compliance, traceability, and administrative convenience override ease of access and correctability.
The failures discussed in this article, including exclusions from food entitlements, loss of wages, and wrongful deletions, are not isolated glitches. As a growing body of research and civil society documentation has noted, they are predictable outcomes of a technocratic approach that treats technology as a substitute for governance rather than as a means of support. When systems are designed primarily to control beneficiaries, without robust correction mechanisms, fallback options, and local discretion, exclusion becomes structural rather than accidental.

A constitutional lens: Welfare, dignity, and the right to life
Welfare delivery in India is bound by constitutional commitments. Rights-based legislations such as the National Food Security Act and MGNREGA (now replaced with the Viksit Bharat-Guarantee for Rozgar and Ajeevika Mission, or VB-GRAM-G) provide explicit statutory guarantees. However, even schemes without formal legal backing fall within the purview of Article 21, which protects the right to life with dignity. Over the decades, the Supreme Court has interpreted this right to include access to food, shelter, health, and livelihood.
Recent policy developments further highlight the stakes of these constitutional guarantees. For instance, the replacement of MGNREGA with the VB-GRAM-G framework has raised concerns about the dilution of the statutory right to work and the growing shift towards administratively controlled welfare systems. In this context, the design of digital welfare infrastructures becomes even more significant, as technological systems increasingly mediate access to livelihoods and basic entitlements.
Digital welfare must therefore meet a constitutional test: Does it enhance access and dignity, or does it obstruct them?
When a ration card holder is denied food because her fingerprint does not authenticate, it is not merely a technical failure. It is a denial of her right to food. When a worker loses wages because an attendance application fails to sync, it directly harms their right to livelihood. Technology does not sit outside constitutional scrutiny. When it mediates access to basic entitlements, it must be judged by constitutional standards, not by administrative convenience.
From efficiency to exclusion: How digital welfare recasts system failure as citizen fault
Technology is not a neutral tool for welfare delivery. The way systems and dashboards are designed determines who is seen, who is excluded, and who bears the cost of system failure. As several commentators have noted, when technology-led welfare systems fail, responsibility is rarely attributed to flawed system design or infrastructure gaps. Instead, failure is reframed as citizen non-compliance or incomplete authentication.
1. Administrative logic versus citizen logic
Most welfare technology is built around administrative priorities such as monitoring, standardisation, and leakage control. Success is measured in authentication rates, uploads completed, and updated dashboards. Citizens, however, value very different things. They seek predictable access, simple processes, the ability to correct errors, and dignity in interaction.
This mismatch produces exclusion. Digital welfare today embodies a suspicion-first design, requiring citizens to repeatedly prove who they are, where they are, and that their documents are valid. This burden rests almost entirely on the poorest.
2. One-size-fits-all technology in an unequal country
Digital systems assume that India is homogeneous in terms of access to technology. The same workflows, biometric authentication, e-KYC procedures, and mobile-based attendance are expected to function equally in urban wards, remote Adivasi hamlets, flood-prone villages, and migrant worksites.
This homogeneity trap ignores differences in geography, connectivity, literacy, disability, gendered access to phones, and migration patterns. Ultimately, digital systems end up working best where people are already well-served and fail where people are most vulnerable.
After Ganesh’s admission was denied, his grandmother, a daily wage labourer, spent several months travelling repeatedly to Aadhaar centres, banks, and mandal offices in nearby towns. She eventually had to visit the Tribal Welfare Department office at the district headquarters, nearly 60 kilometres from her village, and even travel to Visakhapatnam, about 120 kilometres away. Corrections were permitted for only one field at a time, which meant multiple visits, each costing lost wages and borrowed money. To change his name and date of birth, officials eventually demanded a Gazette notification: since such corrections are treated as legal identity updates, they require public notification in the Gazette, a process entirely beyond the reach of the family’s resources.
At the time of writing this article, Ganesh’s Aadhaar is still uncorrected, and his education remains suspended.
This also points to another major gap in digital governance: centralisation.

A centralised system is no silver bullet
Most digital welfare systems operate through large Management Information Systems (MIS) that function in a sequence. In MGNREGA, for example, unless attendance is uploaded correctly, muster rolls do not close. Unless muster rolls close, wage calculations do not proceed. And unless wage files are generated, payments do not begin. A failure at any single stage blocks the entire chain.
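The cascading nature of this design can be illustrated with a minimal sketch. The stage names below are hypothetical simplifications, not the actual MGNREGA MIS workflow; the point is the structure: each stage runs only if the previous one succeeded, so a single failure anywhere halts the entire chain, with no fallback and no local override.

```python
# A minimal sketch (hypothetical stage names, not the real MGNREGA MIS)
# of a strictly sequential welfare pipeline: each stage gates the next,
# so one failure blocks everything downstream.

STAGES = [
    "upload_attendance",
    "close_muster_roll",
    "calculate_wages",
    "generate_wage_file",
    "initiate_payment",
]

def run_pipeline(failed_stage=None):
    """Run stages in order; stop at the first failure."""
    completed = []
    for stage in STAGES:
        if stage == failed_stage:
            # No fallback, no local override: all downstream stages are blocked.
            return completed, f"blocked at {stage}"
        completed.append(stage)
    return completed, "payment initiated"

# A failure at the very first stage means nothing at all is completed:
done, status = run_pipeline(failed_stage="upload_attendance")
print(done, status)  # prints: [] blocked at upload_attendance
```

In such a design, a worker whose attendance fails to sync is indistinguishable, at the payment end, from a worker who never worked at all.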
This design creates several challenges on the ground:
1. Exclusion is automated and instantaneous; correction is centralised and inaccessible
Frontline officials, who understand local realities, are often powerless to fix even obvious errors. Panchayat secretaries cannot override biometric failures. Block officials cannot restore deleted workers. Bank managers cannot reverse Aadhaar-based payment rejections.
The consequences of this design become visible in everyday welfare delivery. In our fieldwork in Bandaveedhi village in the Paderu region of Visakhapatnam, nearly 400 residents were unable to access Public Distribution System (PDS) rations when biometric authentication could not be completed due to a device malfunction at the ration shop. Local officials lacked the authority to override the system. Escalation to state IT systems took months. One device became the single point of failure for an entire community. The absence of non-Aadhaar alternatives in such circumstances only makes the exclusion worse.
2. Deletions travel seamlessly across databases; corrections do not
For instance, if a person is denied MGNREGA work because of an Aadhaar-related error, the same error may also disrupt access to other entitlements such as PM-KISAN. Yet correcting the Aadhaar record or fixing the problem in one programme database does not automatically update records in others. This requires the individual to pursue separate corrections across multiple systems.
In practice, this can involve correcting the Aadhaar record, resolving bank account or NPCI mapping issues (NPCI mapping links an Aadhaar number with a bank account through the National Payments Corporation of India), and then approaching programme offices, such as the local MGNREGA office, to restore the job card. If the record has been deleted, further escalation is required.
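This asymmetry, where an error propagates automatically but a fix does not, can be sketched in a few lines. The schema below is hypothetical, not any real government database; it only illustrates the one-way flow: every linked programme automatically reads a failure flag from the central identity record, but a correction to that record reactivates no one.

```python
# A minimal sketch (hypothetical schema, not a real welfare database)
# of one-way interoperability: deletions propagate automatically,
# corrections must be pursued separately in each programme.

identity = {"aadhaar_ok": True}
programmes = {
    "mgnrega": {"active": True},
    "pm_kisan": {"active": True},
}

def nightly_sync():
    """Exclusion travels automatically: any programme that sees a
    failed identity check deactivates the beneficiary."""
    if not identity["aadhaar_ok"]:
        for record in programmes.values():
            record["active"] = False

def correct_identity():
    """Fixing the central record reactivates nothing: each programme
    office must still be approached separately."""
    identity["aadhaar_ok"] = True

identity["aadhaar_ok"] = False
nightly_sync()       # the error travels to every linked database
correct_identity()
nightly_sync()       # the correction does not travel back
print(programmes)    # both programmes remain inactive
```

The burden of re-propagating the correction, one office at a time, falls entirely on the citizen.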
Inconsistency of information across databases has also resulted in people being excluded from certain entitlements, while still receiving others.
For example, in Parvathipuram Manyam district of Andhra Pradesh, an Adivasi worker named Rama Rao* was marked “dead” in the MGNREGA database after his job card was deleted for failing to comply with Aadhaar-based payment requirements. Yet he continued to receive rations through the PDS, and personally collected them every month from the ration shop. In effect, he remained alive in one government database while being officially “dead” in another.
3. Centralisation bypasses constitutional safeguards
In Scheduled Areas, centralisation also bypasses constitutional safeguards under the Panchayats (Extension to Scheduled Areas) Act (PESA), which require contextual governance and Gram Sabha involvement. Digital interventions have been rolled out uniformly in tribal regions without meaningful consultation, undermining the spirit of constitutional protection.
These outcomes are not accidental. These experiences recur across regions and schemes, indicating structural design failures rather than isolated technological breakdowns. They reveal digital welfare systems that lack resilience, offer no meaningful fallback when technology fails, and leave frontline authorities without the power to respond when exclusion is evident.
Moreover, they point to fundamental asymmetries between the state and citizens when it comes to transparency and accountability.
Surveillance, interoperability, and one-way transparency
Digital welfare has ushered in unprecedented state visibility into citizens’ lives through attendance logs, authentication trails, and linked databases. Yet when citizens seek information or corrections, these systems remain opaque and siloed.
Interoperability flows only one way. It enables deletions and exclusions, not corrections or redress. The result is one-way transparency wherein citizens must constantly prove themselves, and be visible, to the state. This architecture privileges control over accountability and surveillance over empowerment.
What is needed instead is a rights-compatible digital model that distinguishes between what must be centralised and what should remain local. Entitlements and payments can be centrally guaranteed, but authentication fallbacks, corrections, and grievance resolution must be decentralised. While current systems invert this logic, there are examples of digital systems that uphold citizens’ rights and turn transparency into a tool for accountability.
A rights-first counterexample: JanMANREGA shows what is possible
For most MGNREGA workers, basic information, such as the number of days they have worked in a year, the wages they have received, the payments that are still pending, or whether a payment has been processed, often requires multiple visits to the panchayat office, block office, or disbursement agencies. This dependence on intermediaries weakens accountability.
The JanMANREGA app addresses this gap. Launched by the government in 2017, it offers workers real-time visibility of their muster rolls, days worked, pending wages, credited payments, and job card details. The app also includes the entire list of MGNREGA assets, allowing citizens, workers, and social audit teams to verify assets on the ground. The “near me” feature enables anyone, even from a different state, to view nearby worksites. For migrant workers, the app is especially valuable—they can track their wage status and view nearby worksites from wherever they are.
We were also involved in advising and documenting aspects of the app’s design, informed by long-term field engagement with MGNREGA workers. JanMANREGA demonstrates a different digital philosophy: it is voluntary and gives workers direct access to their own records without sitting between them and their wages. Importantly, it introduces no new points of failure. It simply opens up information that strengthens rights.
What good digital welfare requires
The structural gaps and failures highlighted in this article are not an argument against technology, but against designs that prioritise control over access. From design to delivery, digital welfare technology must rest on three pillars:
1) Rights-aligned design
Technology must respect constitutional guarantees, including Articles 21 and 14, as well as specific protections for Scheduled Areas under the Fifth Schedule and PESA.
2) Low burden, context-sensitive systems
Digital tools must reduce, not shift, the burden onto citizens, and must function in low-connectivity, high-diversity settings.
3) Correctability and local authority
Exclusion must never be easier than correction. Local officials must be empowered to act. Digital systems must supplement welfare delivery, not replace offline processes, especially where connectivity, literacy, or documentation barriers make digital access unreliable.
India stands at a crossroads. Digital welfare can either deepen rights or quietly erode them. Success must not be measured by dashboards or authentication counts, but by a simpler question: Does this system make life easier for the most marginalised and vulnerable person?
If it makes life harder, it fails, no matter how efficient it appears. A digital state must serve people, not replace constitutional protections with automated gatekeeping.
*Names changed to maintain confidentiality.
With inputs from BDS Kishore of LibTech India.
—
Know more
- Understand how failures in e-KYC verification of ration cards restrict access to food under the PDS.
- Learn more about how MGNREGA transformed people’s social and economic lives, and what is now at stake with its repeal.
- Read this analysis of the Economic Survey 2025–26 to learn about its implications for digital rights.