Technology adoption becomes difficult when nonprofits work in siloes. Learning in cohorts enables shared problem-solving and faster implementation across the sector.

This is the twenty-first article in a 24-part series supported by Project Tech4Dev. The series seeks to build a knowledge base on using technology for social good.

View the entire series here.


Technology has become integral to how nonprofits design and deliver their work. Across the sector, organisations now rely on digital tools for a variety of tasks, including field data collection and community engagement. At Project Tech4Dev, we worked with approximately 300 nonprofits across ten cohorts last year on tech solutions, including AI, chatbots, and data platforms. From this vantage point, one pattern stands out: a substantial gap persists between adopting a tech tool and making it work effectively in practice.

Part of the reason for this implementation gap is structural. In most nonprofits, technology is not owned by a dedicated team—it gets absorbed into existing roles. A programme manager takes on data systems; a communications professional ends up managing a chatbot; someone from the human resources team becomes responsible for a new field app. This model places tech adoption in the hands of professionals whose primary expertise and responsibilities lie elsewhere.

The result is a pattern that repeats across the sector. Organisations working on similar problems build parallel solutions like custom portals, apps, and platforms without knowing that another organisation in their orbit has already tried something similar, hit the same wall, and found a way through. 


For example, Adventures Beyond Barriers Foundation spent nearly a year working with a vendor to build a chatbot with limited progress. When they joined a cohort where other organisations were working on similar tools, they were able to build and deploy their own chatbot within a day. The difference was not the technology but access to peer learning and shared problem-solving.

This phenomenon mirrors patterns observed in online learning environments, where individual participation without peer support or accountability often leads to incomplete engagement. The difference is that in the social sector, the result of an incomplete journey is not just a half-finished certificate. It comes at the cost of time, money, and the communities who depend on these tools actually functioning.

The real gap is not tools; it is shared learning

There is a common assumption that the reason nonprofits struggle with technology is that they lack access to the right tools. But what many organisations actually lack is access to the experience of others who have used these tools.


Technology decisions are typically context-specific. An organisation working on maternal health may be grappling with data collection in low-connectivity areas, while another working on urban livelihoods is figuring out how to manage a WhatsApp-based helpline. These look like different problems, but both organisations are navigating the same underlying questions: How do we get our field teams to actually use this? What do we do when it breaks down?

There is also a difference between knowing how a tool works and knowing how to make it work for you. A training session may show you how to build a dashboard, but it cannot tell you which data points matter for your programme, how to get your field team to input data consistently, or what to do when the numbers do not match what you are seeing on the ground.


Solutions to these implementation challenges frequently exist within the sector. Prior organisations have tested similar tools, documented failure points, and developed strategies to overcome them. However, in the absence of systematic knowledge-sharing mechanisms, this experiential learning remains confined within organisational boundaries. The result is sector-wide duplication of effort and repeated adoption failures.

Learning together, not just alongside each other

At a recent AI cohort, participants from Avanti Fellows came in looking for ways to use AI in their student-facing programmes. But watching another organisation, Simple Education, demonstrate their WhatsApp-based teacher assistant bot shifted their thinking. They realised they could adapt a similar approach for their teacher training modules, giving educators a space to reflect on classroom dynamics, particularly around engaging girls in STEM subjects. Additionally, Quest Alliance's presentation on creating student personas helped them shift from thinking about what content to recommend to how to personalise recommendations. By the end of the two-day session, their roadmap had expanded well beyond their original scope.

This is what cohort-based learning enables. When organisations grappling with similar challenges come together, the nature of learning changes in meaningful ways—offering insights that no manual or training session can fully replicate. The same principle applies when nonprofits learn about technology together.

There is a difference between knowing how a tool works and knowing how to make it work for you. | Picture courtesy: Project Tech4Dev

First, it moves from one-time training to ongoing peer engagement. Cohorts create spaces where participants validate their thinking in real time through structured exchanges, not just presentations. In a recent AI cohort, for instance, the opening session helped organisations to map their tech use cases and challenges. As they shared and responded to each other’s inputs, participants began identifying overlaps, offering suggestions, and even spotting potential collaborations.

Second, it enables collective sense-making. When organisations share what did not work, such as design assumptions that failed, user engagement challenges, and technical approaches that needed rethinking, others are able to refine their plans before committing resources to them. In the same cohort, two organisations working on machine learning models for maternal health shared their learnings openly, helping each other navigate what to pursue and what to avoid.

Third, it develops decision literacy—the capacity to evaluate not just whether a tool works, but how it fits into broader goals. A workshop on responsible AI pushed participants to look beyond what tools can do and focus on the capacity needed to use them well. Seeing varied approaches—from behaviour-based user personas to emotion-based profiling—helped teams to question and refine their own design choices.

This approach allows an organisation to learn not only what a similarly positioned organisation decided, but also why: the reasoning, the constraints, and the trade-offs it weighed.


Coming together as a cohort can also surface practical insights that simply cannot come from a manual or an instructor. For instance, hearing directly from a peer organisation with a similar field team structure that a data collection app proved too heavy for low-end smartphones offers immediate, actionable learning. Insights like these travel best through conversation, shortening the loop between experimentation and course correction and helping organisations to pressure-test assumptions.

The long-term benefit of this approach is that when organisations think through tech problems together, they are more likely to arrive at solutions that can talk to each other: shared data standards, compatible platforms, and open-source tools that multiple organisations can build on. In contrast, working in silos leads to fragmented tech systems that are hard to scale and often depend on short-term funding.

Making collaboration actually work

Effective cohort implementation requires deliberate design. Without it, sessions become no different from webinars that people attend and leave without anything meaningfully changing. A few elements can make a difference.

1. Cohort composition

Rather than opening cohorts to everyone, it is worth investing time upfront to understand who should be in the room. This will depend on the level of familiarity each organisation has with tech, the kinds of problems they are working on, and what they are hoping to leave with. Mixed-experience cohorts can generate valuable peer learning, but only when composition is strategic. A beginner surrounded by experts may shut down. An expert surrounded only by beginners may disengage. The goal is a room where everyone has something to learn and something to offer.

Getting this composition right is harder than it looks. An application form alone will not do it. It requires research into participants’ contexts, some level of direct interaction before the cohort begins, and careful thought about how different organisations will work together. Over time, cohort design improves through iteration. The Glific cohorts, for example, evolved significantly over the past year, incorporating more hands-on time, better applicant scrutiny to understand knowledge levels and use cases, and a mix of online and offline sessions to create more touchpoints.

2. Hands-on learning

Technology adoption cohorts show higher implementation rates when they are structured around hands-on engagement rather than presentations. Sessions with built-in time for participants to actually work with tools, share screens, hit problems in real time, and troubleshoot together tend to produce far more durable learning.

3. Openness to simpler solutions

The answer to an organisation's problems may not always be technology. Sometimes, the best outcome of a cohort session is the realisation that a simpler solution already exists. In several instances, organisations came to cohorts eager to explore AI for particular challenges. Through discussion with facilitators and peers, they realised that simpler solutions, such as well-structured spreadsheets, would serve them better, saving significant funds and time. Creating space for that kind of honest, problem-first thinking requires a facilitator who can hold the discussion without steering it toward a predetermined answer. It also requires building in conversations about responsible technology use from the start, not just in final documentation.

4. Informal peer learning

Cohorts also create conditions for ongoing peer exchange beyond formal sessions. Many participants stay connected through WhatsApp groups or informal check-ins, continuing to share learnings and troubleshoot problems together. Having worked through challenges in the same room creates a sense of community that extends beyond the programme itself.

Not every organisation will have access to a formal cohort programme. But the underlying principle that peer learning accelerates tech adoption can be acted on in simpler ways. A group of organisations working in the same district or on the same issue could set up a regular informal exchange, even a monthly call where one organisation shares a tech decision they are currently navigating. The critical element is not institutional infrastructure but commitment to transparent, collaborative learning.

From an organisational problem to a sector opportunity

Budgets and technical expertise are important for technology adoption. But just as important, and often overlooked, is access to peer learning networks: spaces where organisations can learn from each other's experiences, avoid repeated mistakes, and adapt solutions to their contexts. When an organisation finds a way through a difficult tech implementation and keeps those learnings internal, the sector loses. The shift that is needed is not just structural, such as establishing more cohorts or collaborative spaces, but cultural: one where admitting, "We tried this and it failed," is seen as a meaningful contribution, not a vulnerability.

What often surprises organisations most is what happens after the formal cohort ends. The sense of community that develops during sessions continues through informal WhatsApp groups and ongoing exchanges. Participants continue learning from each other’s mistakes, troubleshooting problems together, and building on shared solutions, without needing the formal structure of the cohort to sustain it.

The question worth sitting with is not whether your organisation needs to collaborate more on technology. Most do. The more useful question is: What would it take to make such collaboration meaningful? Who would you want in the room? What would you be willing to share? And what might you be surprised to learn?

When organisations learn with and from each other, technology stops being a set of isolated tools and becomes a collective pathway to stronger, more sustainable impact.

Know more

  • Read how nonprofits can build data systems that support learning, not just reporting.
  • Read how organisations navigated building AI solutions together.
ABOUT THE AUTHORS
Deepak Nanda

Deepak Nanda leads communications at Project Tech4Dev, where he shapes storytelling around AI, data, and digital public goods for the nonprofit ecosystem. He has previously led marketing and partnerships at Josh Talks and worked on content and strategic partnerships at CSRBOX. Earlier at RISE Infinity Foundation, Deepak directed communications and collaborated with organisations including UNICEF, Meta, Google, Oxfam, and J P Morgan.
