Resilience Exchange

What have we learned so far?

Evaluating progress and supporting collective learning

This section covers learning from BRACED relevant to the operation of a large-scale resilience-building programme, providing information for those planning resilience programmes and for organisations leading consortia. In particular it reflects on the experience of evaluating progress in resilience building, and collective learning activities to support partners in a programme such as BRACED. All of these activities must take account of the fact that partners’ primary focus is necessarily the delivery of their projects, and at busy times monitoring and reflection can feel burdensome.

Evaluating progress

Measuring the progress of a resilience programme is challenging, complex and resource-intensive. Different evaluation approaches and methods are needed to understand changes in resilience at the intervention, project and programme levels. Evaluations need to be useful to practitioners, their partners and the communities in which they are working. Appropriate timing of evaluations is crucial, with some interventions likely to produce an impact only after the main project activities are concluded – a reality that should be taken into consideration in project, programme and evaluation design, for example by budgeting for impact evaluations after programme completion.1

A significant effort is being made in BRACED to measure progress on building resilience, developing new methodologies and tailoring existing ones. These methodologies must meet the needs of individual projects as well as of those looking at BRACED at the whole-programme level.

The experience has raised a number of questions about measuring and evaluating resilience building. Learning is ongoing, and the next Resilience Exchange will report more fully on solutions BRACED has found to these challenges.

The definition challenge

The first challenge in measuring and evaluating resilience building is to agree on what is to be measured: resilience of whom to what (e.g. household/person/community resilient to drought/flood/conflict/pests)? What needs to change and by how much, and how do these different dimensions interact?

At the individual project level, resilience is understood in different ways, and different working definitions are used for the purpose of project-level measurement and evaluation. For instance, the Myanmar Alliance defines five dimensions of change.

It is not only BRACED that has this difficulty: a recently published systematic literature review concluded: “There was no evidence of a common, agreed definition of community resilience.”2 The result is that it is difficult for many people to understand what resilience projects are, let alone measure their progress.

Collective reflection can help. In a writeshop organised by the BRACED Knowledge Manager, participants were encouraged to interrogate what they saw as innovations in resilience building in their own work. On the surface, many of the innovations identified looked like fairly typical development interventions – from village savings and loans associations to community radio broadcasts. However, as the participants went through a process of guided peer review, reflection and drafting, the resilience dimensions started to emerge.

At the programme level, the 3As concept breaks resilience down into three recognisable capacities of anticipation, absorption and adaptation, which can be tracked to assess progress on interventions aimed at enhancing resilience.3 This is not a means of measuring resilience per se, but rather an analytical lens ultimately adopted by all BRACED partners as a means of usefully organising their interventions and the outcomes they aim to achieve. In reality, most of the BRACED partners have mapped their existing indicators against the three capacities in the 3As framework, rather than designing new ones in response to it. While attributing changes detected in resilience to one of the 3As can be viewed as a somewhat reductionist exercise, this has at least offered some means of comparing outcomes across contexts.
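The mapping exercise described above – assigning existing project indicators to the three capacities rather than designing new ones – can be sketched as a simple lookup. The indicator names and their assignments below are illustrative assumptions, not actual BRACED indicators.

```python
# Illustrative mapping of existing project indicators onto the
# 3As capacities (anticipatory, absorptive, adaptive).
three_as_mapping = {
    "households receiving seasonal forecasts": "anticipatory",
    "early-warning committee established": "anticipatory",
    "households with emergency savings": "absorptive",
    "access to post-shock food reserves": "absorptive",
    "households adopting drought-tolerant crops": "adaptive",
}

def outcomes_by_capacity(mapping):
    """Group indicators by capacity, enabling comparison of
    outcomes across different project contexts."""
    grouped = {}
    for indicator, capacity in mapping.items():
        grouped.setdefault(capacity, []).append(indicator)
    return grouped

for capacity, indicators in outcomes_by_capacity(three_as_mapping).items():
    print(f"{capacity}: {len(indicators)} indicator(s)")
```

Grouping this way supports the cross-context comparison the framework aims for, even though the underlying indicators differ by project.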

The baseline challenge

In resilience building we need to understand progress in relation to the starting point – what would seem small steps or slow progress in one context may be vital foundational change in another. It is difficult to know what indicators to select in a baseline evaluation, since before the project starts it can be hard to know what is most relevant. In addition, BRACED projects do not happen in a vacuum, and in many of the project areas there are a number of other interventions and development efforts happening, with different but (at least partially) overlapping objectives, each making a contribution to overall resilience.

Furthermore, the climatic conditions at baseline may be very different to the conditions at the next data collection point. If we accept that resilience can only truly be measured in the face of climate shocks and stressors, the uncertain frequency and severity of such climate events means that, in many cases, measurement tools will remain untested within the life of a programme, relying instead on assumptions, albeit well-evidenced ones. To partially address this, projects use different climatic indicators in composite indices for measuring resilience to climate shocks. These are context-specific and based on their conceptualisation of resilience and how they aim to strengthen it with their interventions.

It is important to remember that climate shocks and stresses are dynamic and interpreted differently by different groups, including non-governmental organisation project and field staff, evaluators, academics, government officials and households. Ensuring a shared understanding of what climate shocks are and how they are reported is therefore critical at the outset when planning baseline data collection.

Baseline data-collection designs, including sample designs, should consider “shock-driven attrition”. A typical shock response employed by climate-vulnerable households may be relocating and thus leaving the project area. For impact evaluations that use panel surveys – interviewing the same people at baseline and endline – this will make it challenging to find the same respondents. Impact evaluations will need to consider including higher attrition estimates in sample-size calculations or a budget for resources to track respondents who relocate.4
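The sample-size adjustment described above can be sketched as a simple calculation: inflate the baseline sample so that enough panel respondents remain at endline after expected attrition. The attrition rate and sample size below are illustrative assumptions, not BRACED figures.

```python
import math

def adjusted_sample_size(required_n: int, expected_attrition: float) -> int:
    """Inflate a baseline sample so that, after expected attrition
    (e.g. households relocating out of the project area), at least
    required_n panel respondents remain at endline."""
    if not 0 <= expected_attrition < 1:
        raise ValueError("expected_attrition must be in [0, 1)")
    return math.ceil(required_n / (1 - expected_attrition))

# Illustrative: 800 completed endline interviews needed, with 20%
# of respondents expected to be untraceable at endline.
print(adjusted_sample_size(800, 0.20))  # 1000
```

The alternative mentioned in the text – budgeting resources to track respondents who relocate – reduces the attrition rate itself rather than compensating for it with a larger sample.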

The data collection challenge

Due to the contexts within which BRACED works, collecting high-quality data is extremely difficult. Challenges include moving around in project areas affected by conflict; reaching some remote rural areas that do not have all-weather roads during rainy seasons; collecting longitudinal information for the same people, particularly in nomadic pastoralist communities; language; illiteracy; and a range of cultural sensitivities.

Working in Chad on evaluation of the BRICS project, Tufts University learned important lessons:

  • Do not underestimate the importance of sound logistics.
  • Trust and relationships are important. Use an experienced mobiliser to persuade village chiefs to facilitate interviews with women.
  • Throughout the training, ensure your enumerators are invested in the research. Talk to them about what the information will be used for, and share past findings. The more invested your enumerators are, the more care they will take to collect accurate data.
  • Wealth-related questions are sensitive. Asking a woman in Chad how many cows her family has is like asking someone from the global North how much she has in her retirement account. Train your enumerators to challenge responses respectfully, and remind respondents that the information they provide is confidential.
  • Before your enumerators start the survey, have them verify that they are talking to the right woman. Many women have similar names, so equip your team to verify names in a way that is culturally appropriate.

Of course, context matters, and the types of challenges can vary depending on countries, seasons or organisations. These lessons are specific to experience in Chad, but some are likely to be common to other countries.5 

The timeframe challenge

Resilience-building activities need to work across timeframes from short-term (seasonal) to long-term (decadal). Funded programmes such as BRACED have a lifespan, dictated by donor requirements, which does not always allow sufficient time for results to mature. This raises the question: over what timeframe is it realistic and reasonable to expect evidence of increased resilience? How long does it take to strengthen resilience and build a solid evidence base?

The complexity of projects aimed at building resilience requires partners to understand and conceptualise resilience in order to design appropriate interventions. Many have chosen to combine individual interventions into packages to address the multidimensional nature of resilience. Experience in BRACED to date suggests that the timeframes needed to build different capacities vary.

“Building anticipatory and absorptive capacity to deal with current risks and threats is the first step for communities that are highly vulnerable to climate change. As BRACED projects continue in years 2 and 3, it is important to think about how anticipatory and absorptive capacities can be built in ways that provide a solid foundation for building adaptive capacity in the longer term. Understanding how resilience capacities interrelate – and revisiting whether adaptive capacity is more challenging to build in the lifespan of a BRACED project – will be key insights to inform theories for building community-level resilience on the ground.”

From Routes to Resilience: Insights from BRACED year 1, Key Message 4

This means that not only is progress more difficult to measure (with potentially multiple causal pathways), but it may also take more time for changes to be detectable. In the worst case this may result in a “null evaluation”, i.e. not being able to say whether anything has changed. Apart from representing an investment risk, this may be misleading – changes may occur but, given the short time between survey rounds, they may not be detectable.

To tackle this, it is advisable to revisit assumptions about the theory of change or programme logic that are associated with interventions having an effect in the lifetime of the project. Implementation delays and changes can mean these assumptions may not hold true. The importance of revisiting the project logic or theory in light of changes to implementation plans therefore cannot be overstated. This should be done at least annually, but more frequently would be preferable.

It is important to consider not only the amount of time between baseline and endline in order to detect change but also what time of year – month or season – data collection is conducted. For many of the BRACED countries, recurrent annual drought and heavy rains are a part of life and often lead to times of hardship and abundance, respectively. While these periods are becoming increasingly difficult to predict, owing to the effects of climate change already being felt, some effort should be made to align the timing of data collection to avoid introducing a source of potential bias. For example, if the baseline survey is conducted at the end of the dry season when agricultural communities are reliant on the remaining surplus, this may indicate lower levels of resilience. If the endline survey is conducted towards the end of the rainy season or start of the dry season around harvest time, respondents may report higher levels of food security.

In practice it may not always be possible to collect data at the same time of the year. Therefore, any differences should be highlighted as possible influencing or confounding factors when presenting results.

The complexity challenge

As stated at the start of this report, people and communities are not resilient or non-resilient: they may face a variety of shocks and stresses and be more or less resilient to each one. How suitable, therefore, is a simple count of the number of people who have had their resilience built as an indicator for assessing resilience?6 Projects use different approaches to measure resilience, and programme-level evaluation experience suggests that numbers of people cannot be the only indicator used.7

This is further complicated by the fact that there was no “turnkey” methodology at the start of the programme, so individual projects have developed and used different methods and indicators to measure the resilience-building outcomes of particular activities. This has led to the use of multidimensional composite indices or “scorecards” with constituent indicators that reflect project-specific capacities being built. Examples are given in the paper Laying the foundations for measuring resilience.
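A multidimensional composite index or "scorecard" of the kind described above can be sketched as a weighted aggregation of normalised indicators. The indicator names, ranges and weights below are illustrative assumptions, not any project's actual index.

```python
# Illustrative composite resilience scorecard: each indicator is
# normalised to [0, 1] and combined using project-specific weights.
indicators = {
    # name: (observed value, minimum, maximum)
    "months_food_secure": (9, 0, 12),
    "income_sources": (3, 0, 5),
    "access_to_early_warning": (1, 0, 1),
}
weights = {
    "months_food_secure": 0.5,
    "income_sources": 0.3,
    "access_to_early_warning": 0.2,
}

def composite_score(indicators, weights):
    """Weighted sum of min-max-normalised indicators, yielding a
    score between 0 (least resilient) and 1 (most resilient)."""
    score = 0.0
    for name, (value, lo, hi) in indicators.items():
        normalised = (value - lo) / (hi - lo)  # rescale to [0, 1]
        score += weights[name] * normalised
    return score

print(round(composite_score(indicators, weights), 3))  # 0.755
```

Because the constituent indicators and weights are project-specific, scores are comparable over time within a project but not directly across projects – one reason simple headcounts remain tempting at the programme level.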

If simple indicators are not sufficient, how complex does monitoring and evaluation for resilience need to be? Complexity drives issues of resourcing, both on the evaluator side and, just as crucially, on the project partner side.8 The more complex the evaluation system, the more time and funding are required. Different ways to measure and evaluate resilience building each come with their own trade-off in terms of flexibility and resourcing.9

Given the expenditure, it is above all important that monitoring and evaluation be useful, going beyond accountability-driven exercises towards learning-based approaches. This usefulness ideally extends beyond the practitioners, supporting engagement of local communities and using participatory evaluation approaches.

At the local level, resilience building is not seen as a one-shot investment, but rather as a continuous process. People have valuable knowledge of what works and doesn’t work to help them build their resilience. Conventional cost/benefit assessments can miss both of these factors. DCF is therefore seeking alternative methods that also consider the value of natural resources, and how this might increase through better management.

The BRACED Knowledge Manager is also piloting a participatory approach to assessing costs and benefits, working with the Myanmar Alliance project.

Participatory cost/benefit analysis: experience in Myanmar

In Myanmar, a participatory approach for assessing costs and benefits associated with resilience-building activities has been trialled as part of an impact evaluation in Mawlamyine township, the fourth largest city in Myanmar, 300km south-east of Yangon. The main climate shock is flooding, and over time vulnerability to shocks has increased as poorer working people have moved into the township. Unable to afford established areas, they live in bamboo huts in flood-prone areas. Mawlamyine was one of the townships hit by Cyclone Nargis in 2008. Dry-season fires starting in wooden homes have occasionally spread widely. The BRACED Myanmar Alliance project uses community action planning to identify resilience interventions and coordinate with local government. Interventions have included dredging drainage channels and other flood-prevention activities, supporting a local fire service by digging ponds and providing pumps, and microfinance to support income-generating activities.

The approach to cost/benefit analysis uses a mix of participatory and economic modelling techniques. Participatory methods (group and key informant discussions) are used to understand changes that have occurred and to generate evidence that can be checked against other sources. This is combined with formal economic modelling that draws on studies from other countries in the region, for example on the value of a statistical life, disaster risks and the public health costs of dengue fever. Using identified costs and benefits from the past two years, estimates are then made of likely costs and benefits over the next eight years. The value is then expressed in prices at the start of the project, applying a standard 12% discount rate to compare costs and benefits that occur in different time periods (i.e. at the start of the project versus in eight years' time).
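The discounting step described above can be illustrated with a minimal net-present-value calculation. Only the 12% discount rate and the ten-year horizon (two years observed plus eight projected) come from the text; the cash-flow figures below are invented for illustration and are not the Myanmar results.

```python
def npv(cashflows, rate=0.12):
    """Net present value of yearly net benefits (benefits minus
    costs), discounted back to the start of the project.
    cashflows[t] is the net benefit in year t (year 0 = start)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Illustrative: up-front costs, then benefits over a 10-year horizon
# (years 0-1 observed, years 2-9 projected).
net_benefits = [-50_000, 10_000] + [15_000] * 8
print(round(npv(net_benefits), 2))
```

A positive NPV at the chosen discount rate indicates that discounted benefits exceed discounted costs; the benefit/cost ratio reported for the Myanmar interventions is computed from the same discounted streams.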


Benefits were found to be significantly greater than costs. The net present value of the interventions is £206,000, with benefits 4.3 times greater than costs using the most conservative assumptions. Flood-channel dredging (undertaken with community labour, government machinery and project funding for fuel) has produced the single largest benefit. The second largest type of benefit has been the returns on microfinance (small loans) provided by the project.

Collective learning

When designing activities to support learning, a range of approaches is needed to accommodate different learning styles and priorities, and the timing of learning activities is crucial. Flexibility to accommodate serendipitous learning – including budget and incentive structures for doing so – can enable project teams to discover valuable new connections, knowledge and insights.

Within a programme such as BRACED, supporting collective learning is rife with complexity. While learning from one another "on the go" is a tremendous opportunity, it is also evident that what collective learning and collaboration mean to one BRACED project can vary wildly from what they mean to another. Identifying and offering the right incentives for collective learning is vital, as is offering a suitable range of approaches.

A range of approaches to support different learning styles and priorities

The BRACED Knowledge Manager has tested a variety of approaches to facilitate sharing of expertise and experience across and beyond the programme, including digital and face-to-face approaches, as well as traditional and out-of-the-box ones. These have had varying degrees of success. A dedicated Learning & Uptake team within the Knowledge Manager has led this work, offering the approaches, reflecting on the experiences and tailoring them further to fit partners' needs.

Face-to-face learning activities

Face-to-face events have included Annual Learning Events; writeshops; informal "Lunch & Learn" meetings; and "bolt-on" events that have often been informal in nature and have taken place at forums where a critical mass of BRACED partners was already in attendance (for example, a breakfast meeting between partners and Knowledge Manager representatives at the 10th Annual Community Based Adaptation Conference was convened to share stories and updates on latest activities, plans and insights). The methodologies behind our more innovative approaches to knowledge-sharing (learning marketplaces, serious games, Forum Theatre, and solutions-focused reflection sessions, also known as "Doctor-Patient clinics") have been documented in Knowledge Manager Learning Factsheets so that others within and beyond the programme may adopt and adapt them for their needs.

One face-to-face engagement format was the "writeshop", a way of co-creating a publication using a set methodology. A writeshop offers space for technical experts to gather for an intense few days to present drafts of their work, receive feedback from their peers, iterate further and ultimately produce polished, publishable material that enjoys ownership by all.

The first BRACED writeshop focused on the theme of gender; a second writeshop (conducted in French) focused on technology and innovation. On both occasions all participants evaluated the writeshop as useful to their work, in line with the overriding feedback the Knowledge Manager received on the importance of face-to-face meetings for accelerating knowledge-sharing and learning. After experimenting with the writeshop methodology during year 1, the two writeshops to be held in year 3 are entirely demand-based. One of these will be a follow-up to one of the earlier workshops, gathering some of the same individuals from BRACED projects to deepen their reflection and document more lessons learned.

Digital learning activities

Digital means of connecting have included Knowledge Manager-facilitated webinars and online discussion fora as well as internal and external newsletters. To date, over 25 webinars and 15 discussion fora have taken place in English and in French on topics ranging from “How to pitch BRACED stories to the media” to “Unlocking the potential of Africa’s livestock systems for climate-resilient economic development” and “Community Theatre – How does it work?” It is interesting to note that while webinars on resilience existed prior to the BRACED programme, they were used in a somewhat limited fashion to connect producers and users of knowledge and foster connections across large-scale programmes working on climate change and resilience. Within BRACED, the Knowledge Manager originally only offered discussion fora, as these enabled users with low bandwidth to connect and contribute to an online conversation. While both discussion fora and webinars have their place, today webinars are much more common within the programme.

“It was my first experience to share thoughts in that kind of learning platform and I found it very interesting. […] The feedbacks and questions that came out from participants were very useful.”

Mulugeta Worku, webinar presenter working for Christian Aid (BRACED Implementing Partner)

Some lessons on learning

1. Organisational governance and power dynamics, language, technology, and cultural barriers can all challenge collective learning.

Much time was initially spent learning about particular power dynamics and governance structures within the programme, grappling with very practical language, technology and cultural barriers, and testing solutions to address these. While the shared aim of all partners is to learn about what it takes to build resilience, it is important to examine assumptions about collaborative mindsets, a common understanding of learning styles and preferences, and the proclivity to share experiences, including lessons learned from failure. As a result, explorative work focused on "Learning in Consortia" has been introduced, with the objective of unpacking some of these assumptions and analysing which approaches work best for incentivising collective learning in large-scale resilience programmes.

2. It is important to retain flexibility in the programme of learning activities, actively seeking feedback to inform adjustments.

Iterative learning: adjusting activities based on feedback

Active learning and reflection marked the trajectory of the annual learning and reflection gatherings of all partners. Each of the three BRACED Annual Learning Events to date has been modified based on participant feedback and internal Knowledge Manager reflection on the programme’s substance, target audience and accompanying planning process. For example, the first Annual Learning Event was designed by the Knowledge Manager for all direct BRACED partners (implementing partners, Knowledge Manager, fund manager and DFID representatives). The objective was to get to know one another and plan jointly. The second event incorporated feedback received from the previous year, and while it was again principally planned by the Knowledge Manager, it was now open to external partners for one day. For the third event, the full three-day programme was open to BRACED and external partners, based on preferences expressed to broaden the conversation and focus on deepening connections with practitioners working beyond the BRACED programme. In light of feedback from the previous Annual Learning Event, the Knowledge Manager broadened the planning process to include a steering group with representatives of Knowledge Manager research, monitoring and evaluation, learning and communications leads, regional partners, the BRACED partner hosting the event, and the Fund Manager. This boosted ownership and ensured the programme addressed needs and perspectives of the various groups attending the event. The programme thus allowed for a diverse range of cross-cutting sessions hosted by various partners.

3. Support for serendipitous learning is essential

Investing early on in identifying and addressing incentives and barriers to engage in collective learning processes can help plan for maximum participation. At the same time, a budget should be set aside to accommodate new or unexpected opportunities for learning and collaboration, which are otherwise likely to take a back seat to traditional output-focused tasks. The ability to acknowledge and participate in unplanned events or collaborations, and engage with accidentally discovered or even less structured information sources, can lead to highly beneficial learning. Accommodating serendipitous learning can lead to new connections, different perspectives and newfound inspiration.

To this end, in the third year of BRACED, a small “Collaboration Grant” has been created for partners to access up to £2,000 to support budding and unplanned opportunities between partners to exchange knowledge, foster synergistic learning and/or create joint products. The grant seeks to enable project teams to discover valuable new connections, knowledge and know-how that was not foreseen in work plans and logframes.