Community Contribution

Top 3 Myths about Collaborating, Learning, and Adapting Debunked

Jan 25, 2017
Monalisa Salib


Monalisa Salib is the Manager of Organizational Learning & Research on the USAID LEARN contract.

In working with USAID missions and implementing partners, we hear a pretty consistent chorus of misconceptions about collaborating, learning, and adapting (CLA). Unfortunately, these misconceptions often stop staff from integrating CLA approaches—such as testing theories of change, fostering open relationships with local stakeholders and partners, or designing flexible mechanisms—that could ultimately increase their impact as development professionals.

I truly believe—and evidence bears this out—that if we all find ourselves collaborating a little bit more with the right people, learning something new during implementation, and adapting our work as a result, we may also find ourselves more excited to be at work (I know, I know, try not to roll your eyes) AND more effective overall. So consider this my humble attempt to debunk some of the most common misconceptions we have heard, in the hope that it helps you integrate CLA into your work.

Myth #1: Collaborating, Learning, and Adapting takes too much time.

Let’s tackle this myth from multiple angles. First, I hate to break this to everyone, but you’re already collaborating, learning, and adapting. I know, it’s not what you wanted to hear, but it’s the truth. CLA is not something new. All USAID missions and implementing partners are doing it to some extent—the goal, rather, is to be more systematic and intentional about it so we make the most of our development resources.

We all have about 40 hours in our work week; the choice is what to do with them. If we're willing to invest time upfront to be more systematic and intentional in how we do our work, we're more likely to save time (and money) in the long run.

How? Well, if we spend years implementing the wrong intervention or using the wrong approaches, how much of our time and resources are we wasting? And if we realize only years later that we reinvented a wheel that had already been built and found not to work, how much time, energy, and money could we have saved and put toward something with greater promise?

Lastly, how often do you find yourself in—let's admit it—a pointless meeting? Did you really need to be there? Or better yet, could it have been facilitated in a way that enhanced collaboration, generated learning, and led to decisions that produce better development?

Part of debunking this myth is figuring out ways to integrate CLA into our existing work processes. We all need to monitor and evaluate our work, but are we just counting things that don't matter, or are we using that data to inform learning and adapting? USAID missions have to conduct portfolio reviews, but are they fruitful reflections leading to action? We often find ourselves in annual retreats—but are the right people in the room, do we have the right data to inform our next steps, and do we follow through? These are all ways to get a higher return on the time we already know we are investing.

Are you interested in practical examples of how to integrate CLA into your work? Check out the Working Smarter blog series on USAID Learning Lab.

Myth #2: It’s too difficult to figure out what our approach to CLA should be.

We hear this from USAID mission staff often. And now, with the recent changes to USAID's operational policy (ADS 201) that put CLA and adaptive management in the front seat and require a CLA plan in each mission's Performance Management Plan, we expect this frustration to become even more common.

The first step in developing a CLA approach is determining where CLA can have the greatest impact on your team's performance. USAID's Bureau for Policy, Planning & Learning (PPL) and LEARN have developed a series of tools to help you prioritize the CLA approaches most applicable to your context and of greatest benefit to your technical work. First, look at the CLA framework to get a sense of what CLA is and what it looks like at a team or organizational level. Second, get in touch with PPL (via [email protected]) for access to the CLA Maturity Matrix, a tool designed to help missions—and implementing partners, who should work through their USAID counterparts for access—identify areas where a more systematic, intentional, and resourced approach to CLA could most improve their work. Lastly, LEARN will be providing training on how to facilitate the CLA Maturity Matrix self-assessment and action planning process and can provide facilitation guidance to USAID missions. The training is also open to implementing partners, who can contact [email protected] to learn more.


Myth #3: Monitoring and evaluation is separate from CLA.

Out of all these myths, this is perhaps the one that gets under my skin the most. If we continue to separate monitoring, evaluation, and learning from each other, we will continue to do none of them well.

USAID and its implementing partners are guilty of this. Even though we combine our acronyms and put together monitoring, evaluation, and learning (MEL) plans, we so often start with a discussion of basic output indicators without first talking about what is most critical for us to learn to inform decision-making. So we combine MEL on the surface, but not in a truly meaningful way.

How can we flip the script and put learning and adapting at the center of our monitoring and evaluation efforts?

First, we have to see the potential of effective monitoring and evaluation for learning. This is why M&E for Learning is a critical subcomponent of the CLA framework; it looks in particular at whether we are collecting monitoring data relevant to decision-making and whether we are applying learning from our evaluations in the design and implementation of current and future programming. But even beyond this subcomponent, the entire Learning component of the CLA framework requires high-quality monitoring and evaluation—for example, we cannot test theories of change or do proper context monitoring without solid M&E.

So clearly M&E is part of learning and, hence, CLA. A more systematic and intentional approach to CLA can help ensure that our M&E is designed with learning in mind, that data generated from M&E efforts is analyzed and synthesized to inform reflection and decision-making, and that what we learn is shared with those who need to know. This is how we can move beyond a superficial approach to MEL to a more meaningful one.

USAID Learning Lab users, what other CLA myths have you come across? How have you helped debunk them? To our CLA skeptics: we welcome your input as well. If you have experienced other barriers to CLA integration in your work, please share them with us here or via Ask & Answer so we can continue this dialogue.