Guest post: The Efficiency criterion


This guest post was written by Michaela Raab and Jasmin Rocha, both based in Berlin, Germany.

The OECD/DAC criteria have been under review and debate since 2018. As Zenda Ofir explained in her widely discussed blog series, the criteria work when applied to relatively straightforward interventions that come with simple results chains. But they appear inadequate when we frame development as a complex adaptive system, with evaluation as a learning process as opposed to a form of performance audit.

Several panels and paper presentations at the 2018 biennial conference of the European Evaluation Society in Thessaloniki, Greece, were dedicated to this discussion and related issues, such as the role of evaluation in furthering greater equity in development. We presented a paper on efficiency, and why it could still be a useful criterion if used intelligently.

The problem with measuring efficiency in development

The OECD/DAC glossary of evaluation terms (2002) defines efficiency as “a measure of how economically resources … are converted to results”. As Zenda pointed out, the criterion as used today does not capture hidden costs and unwanted results, such as harmful environmental or social impacts of an intervention.

Efficiency is also hard to measure in complex development processes. A paper commissioned by the German Ministry of Economic Cooperation has shown that cost-benefit analysis and the calculation of economic rates of return work only under conditions we hardly ever find in complex development processes. Putting reasonably accurate price tags on such interventions and their effects tends to be costly and time consuming – that is, it can be quite inefficient to try to measure efficiency!
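To illustrate what such calculations demand – this is our own gloss, not a formula taken from the paper cited above – the economic rate of return is the discount rate $r$ that solves

$$\sum_{t=0}^{T} \frac{B_t - C_t}{(1+r)^t} = 0,$$

which presumes that every benefit $B_t$ and cost $C_t$ of an intervention can be priced, attributed and forecast over its whole lifetime $T$. Those assumptions rarely hold in complex development processes.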

As a result, those who commission evaluations – referred to as “clients” in this post – often reduce the efficiency concept to the parsimonious use of donor money. That makes it easier to come up with findings, but it risks turning an evaluation into an audit, sacrificing learning on the altar of compliance and implicitly reinforcing the power imbalance at the heart of international development.

Our case

A couple of years ago, we carried out an evaluation of a worldwide portfolio that included, among other things, more than a hundred grants to a broad spectrum of human rights organisations. The evaluation was to cover all OECD/DAC criteria and dozens of individual evaluation questions designed by our client, who (co-)funded the human rights organisations. Our field studies, which looked at three organisations per country, were to validate and deepen findings from data we had gathered and analysed earlier. The client insisted on adding a closer examination of “efficiency”, possibly with some budgetary analysis attached.

Insisting on “efficiency” in human rights work can create the wrong incentives, encouraging a focus on short-term, easily measurable results - for instance, the number of legal consultations provided by a legal aid clinic, regardless of the benefits that can reasonably be expected from legal solutions in dysfunctional legal systems - and an excessive emphasis on saving.

Also, we have found that an overly parsimonious use of funding can threaten the safety and sustainability of human rights organisations and their staff and volunteers. For instance, some organisations we worked with relied heavily on their teams’ personal resources - cars, mobile phones etc. - and had not budgeted for any kind of benefits package. Thus, staff could not be compensated for the risks to their own and their families’ health resulting from the human rights work they carried out.

A holistic view of costs and effects

Rather than trying to find out where human rights organisations might have wasted resources, we started by learning about the true costs of human rights work in diverse, mostly hostile contexts. That is, instead of developing our own efficiency yardstick and measuring the organisations’ performance against it, we facilitated workshops with human rights activists to find out what “costs” and “effects” meant in their work. We took the perspective of those who ran the projects, attempting to understand how they made their decisions on the best possible use of limited resources.

What we came up with together was a holistic view of the costs and effects that must be taken into account when making strategic decisions about what matters in the long run. Human rights work – especially in shrinking spaces for civil society activism – does not advance in neat three-year project cycles with pre-defined outcomes. On the contrary, one could argue that it never reaches an end.

Of course, money and staff/volunteer time continue to be two key elements in the cost structure of human rights work. The “money” part is not only about project budgets: human rights groups also need reserves, or quick access to funding, to react to unanticipated threats and opportunities. The time and personal engagement people are ready to give depend to some extent on their working conditions, the risks they run and the way in which those risks are managed.

When human rights organisations plan their interventions, they reckon with the cost of inaction as well. Often, human rights work cannot bring about radical changes. But even in extremely difficult situations it may be necessary to continue working, so as to prevent the worst abuses, or at least to prepare for better times. Then there are mundane hidden costs that funders tend to forget, such as the cost of filling in increasingly complicated grant application and reporting forms that are ill adapted to the characteristics of human rights work, and of receiving poorly coordinated streams of visits from funders and evaluators.

On the effects side of human rights work, agreed project outcomes or indicators of progress are usually only milestones in long, complex processes over which a single project has very limited influence. If those milestones don’t display a clear relationship with the project’s overall, longer-term strategies and goals, then it may well be inefficient to invest in “reaching the indicators”. The more important process for human rights organisations is to continuously reassess how likely it is that the approaches and activities chosen for a particular project will help reach their longer-term goals, and whether alternative strategies would be more effective in a changing context.

Adaptability, one of the qualities often proposed as a new DAC criterion, plays a big role here: if a planned project outcome no longer makes sense, or the planned activities would even be counterproductive – for instance due to unanticipated political changes – then it is efficient to abandon the initial plan and go for something more useful.

Our advice: consider efficiency early on, and differently!

Reflection on efficiency can be important throughout strategic planning – not only at the end of a project, when external evaluators typically descend on international development projects. That reflection happens in human rights organisations, anyway – everybody needs to make decisions as to how they allocate their (limited) resources. But that thinking is not necessarily documented and shared with international funders – especially if those are only interested in efficiency calculations in monetary terms.

To change this unfortunate situation, we recommend a partner dialogue that determines at the planning stage of an intervention – or a funding contract – what efficiency means in the project, in its specific context.

Then, determine whether it is important to report on efficiency to the funder. If key aspects of that reflection are covered under other DAC criteria, such as relevance and effectiveness, it may be wise to focus on those, especially where monitoring and evaluation budgets are tight.

And finally, it is a widespread error to assume that all evaluations must cover all OECD/DAC criteria. That was never the intention, as Hans Lundgren, one of the criteria’s creators, has reiterated in recent discussions.

