Julian Barr’s Ten Top Tips for YEE consultants

**6 min read**

Julian is a Non-Executive Director at Itad, one of the highest-profile consulting companies in the UK. This year he is also President of the UK Evaluation Society, one of the most active and innovative associations in our field. His vast experience in many countries around the world is reflected in his own very thoughtful contributions to evaluation practice. I thoroughly enjoy interactions with him, and any opportunity to learn from his insights. His ten Top Tips reflect beautifully his many years of experience as an independent evaluator and as part of a large, very successful evaluation consultancy. We can all learn from these tips.

Others in this series will, I trust, give you useful technical tips on evaluation designs, methods, data collection and analytical techniques – the tools in the evaluator’s toolbox. Here, however, is a set of hard-won and generally practical lessons for using those tools effectively as an evaluation practitioner. I’ve fallen foul of all of them at one time or another. I hope they can help you avoid some of the bumps in the evaluation road.

Starting an evaluation

Top Tip 1. Really work with commissioners to understand why the evaluation is being commissioned. What is it for? Who is going to use it? How is it intended to be used? (If you work as a commissioner, ask yourself if these questions are clearly answered in the TORs!) This will help you shape the evaluation and make it useful. Terms of Reference for evaluations aren’t always clear, so use an inception phase to clarify these points with commissioners and stakeholders.

Top Tip 2. Evaluations have been characterised as speaking truth to power. That makes them seem like they convey an objective truth. However, evaluations always have political dimensions. Take time to understand the underlying reasons for the evaluation, and its use and users. You shouldn’t compromise impartiality, but in engaging with evaluation stakeholders, consider how your evaluation will ‘land’.

Managing an evaluation

Top Tip 3. A common theme in these tips is that evaluators need to pay attention to soft skills as well as hard, technical ones. A major aspect is relationship management – with commissioners, team members and other stakeholders. Recognise and use emotional intelligence in building and maintaining client relationships. Dedicate time to face-to-face interactions with clients and commissioners early in the evaluation. Building social capital at this stage pays dividends later (when the evaluation might get trickier!).

Top Tip 4. Think through very carefully how you will get from data to deliverables. This may seem obvious, but some steps often get overlooked or not given enough time. For example, in a team situation, what are the steps to get from individuals’ field notes to a synthesised report? Workplans often jump from data collection to report writing; a dedicated analysis step is needed. How much time do you budget for this?

Communicating an evaluation

Top Tip 5. Write a report people want to read!

Humans like narratives and stories. Think about structure. Reports that have chapters for Relevance, Efficiency, Effectiveness, Impact and Sustainability are often boring and repetitive. Consider more narrative-driven structures.

Humans don’t read. They skim. Even more so since we have started to read much of our material online. Construct a report for skimmers – use informative headings; use plenty of sub-headings; incorporate a lot of pertinent diagrams, charts, tables, infographics, etc; learn how to use pull quotes; and if possible, work with a designer to improve the look and skim-ability of your report.

Universities don’t teach you to write. At least not in a way that people want to read. By the time most people have laboured through Introduction, Background and Methodology, you’ve lost their interest and used up most of their available time. All these sections may be very interesting to you – you’ve worked hard on the methodology, and you’ve learnt so much about the evaluand that you want to write it all down. But evaluands are usually projects, policies or funding organisations – the people involved already know the background; why waste precious report space telling them what they already know? OK, you need to confirm to them that you have understood your subject, but don’t lead with this. Methodology is very interesting to a limited few. It might become a point of contention if your findings are controversial, but again, don’t lead with this. Unlike this paragraph – start with your conclusion. And write an executive summary from scratch, not cherry-picked, cut-and-pasted sections from the whole report.

Use pyramid writing. As well as structuring your report to bring the conclusion forwards, write so that the first sentence summarises the following paragraph. Likewise, the first paragraph summarises the section. Read about, practise and use pyramid writing to improve the efficiency with which readers engage with your reports. You may also find Barbara Minto’s ‘Mutually Exclusive, Collectively Exhaustive’ (MECE) approach to structuring information helpful.

Don’t swallow a thesaurus. Keep your sentences short and simple. Use simple language. Evaluators and commissioners will usually have a university education and accordingly have a high reading level. However, busy people like simple writing – it’s just easier to comprehend quickly. Also, evaluations should be accessible to all stakeholders.

Top Tip 6. Actually, it isn’t about the report. I’ve over-emphasised reports. The chunky evaluation report has become a norm. But a range of possible evaluation products exists, and most are more accessible to users than a long report. Some agencies require a set of different products for different audiences. Consider using very short reports, PowerPoint slide decks, infographics, websites, etc. Also, consider and discuss with commissioners how social media such as blogs and Twitter can be used before and during an evaluation to engage stakeholders, and afterwards to promote uptake and use of evidence and recommendations.

Top Tip 7. People don’t like surprises. In particular, evaluation commissioners and evaluands don’t like nasty surprises when you report your findings. Communicate regularly at all stages of your evaluation. Set up opportunities to discuss emerging findings and, ideally, workshops at which you and your stakeholders can co-develop recommendations.

Being an evaluator

Top Tip 8. Read. A lot. Keep abreast of trends. A very specific example: as the UK public sector moves away from post-Thatcher New Public Management (NPM) and post-Blair Evidence Based Policy Making (EBPM), evaluators need to learn about evaluating complexity and evaluation in adaptive systems (for example, see CECAN and GLAM).

Top Tip 9. It will be tough. Being an evaluator is not easy. Develop strategies that help when you meet challenges. To over-simplify – any evaluation challenge essentially has only three possible solutions:

Yourself: build resilience and grit, stay healthy, learn how to plan work realistically, pay attention to continuous professional development, and ultimately – get the work done. (Having to ‘suck it up’ is stressful. Being clear on TORs and timeframes, managing your team, working with colleagues, and developing a good client relationship will all reduce this.)

Your network: allocate responsibilities and manage your team well, draw on support from colleagues, develop professional and personal support networks

Your client: build good client relationships from day one so that you have social capital to help negotiate deadlines and deliverables, communicate regularly

Top Tip 10. Join your local/national evaluation society. National evaluation societies, such as the UK Evaluation Society (UKES), or regional ones – for example, the European Evaluation Society (EES) and the African Evaluation Association (AfrEA) – are wonderful resources. They exist to promote evaluation as a practice and profession, and to support evaluators. They hold training events and conferences and publish important guidance, such as UKES’ Guidelines for Good Practice in Evaluation and Evaluation Capabilities Framework.


Julian Barr

Julian is an evaluation consultant and current President of the UK Evaluation Society. Previously he has worked in private sector research, the UK aid programme and academia. Since joining Itad (www.itad.com) in 2001 he has led evaluations and learning contracts for funders including DFID, the World Bank and the Rockefeller Foundation. These have covered topics including bilateral aid, research funding, climate change, social norms, peace building, livelihoods, democratic governance, and value for money. His current interests centre on making evaluation useful for learning.

5 Comments

  1. I am a young evaluator, a member of the International Development Evaluation Association (IDEAS). I must admit these are very insightful and helpful tips. Thanks, sir.

  2. It was a delight to see your top two tips confirm my experience of working in the NHS
    about the politics of evaluation and the motivation.
    NHS evaluators tend to be recruited on their academic skills of research and measurement and judged on their report-writing skills. Commissioning managers can then say they are “doing something”, although as priorities change, so does managerial focus and attention – away from evaluation reports and on to the next pressing issue.

    No one tends to do an audit of evaluation studies and reports across the NHS, safely assuming that money was well spent rather than wasted (having minimal impact)

    Far too many academics are keen to develop tool kits which NHS staff pass on to trainees without any experience or concern with how evaluation can make a real difference and how to make change happen. How does evaluation have an impact on say patient care and service improvement that is sustainable over time. Discrete evaluation studies that are time limited
    have little impact unless they have powerful backers and leverage.

    Many evaluators don’t see themselves as change agents and don’t have the confidence to push for impact assessments as part of the contract or funding of their evaluation.

    Evaluating the readiness of the system to change needs to be incorporated in any evaluation and what the challenges to the status quo are

    • Thanks Edmund,

      I’m glad the points resonated. For the sake of time, the slides were cut down from a longer set. In other settings, I talk more about the consulting angle of evaluation, though I recognise that the NHS is more of a researchy arena. I talk about needing a mix of technical, intrapersonal and consulting skills. My perspective is that too often evaluations can end up with a double let-out: i) the evaluator presents a good evaluation, but is able to walk away at that point, without being part of helping recommendations land, and ii) commissioners tick a box and put the report on a shelf (as you describe), without using it for change.

      I personally like Peter Block’s Flawless Consulting approach. Although this is more about management consulting than evaluation, the points apply. In particular, that consultants should help fix problems so they stay fixed – this involves understanding the system in which change is being recommended. And that they should help develop client commitment to change. Also in a consulting context, I like this HBR article on consulting purposes: https://hbr.org/1982/09/consulting-is-more-than-giving-advice. The additional goals in the hierarchy of consulting purposes all seem to relate to evaluation use.

  3. Hi Julian
    This is a very good list, much better than many I have seen with a similar purpose, and clearly the results of much-digested experience.
    I have one quibble re: “Evaluands are usually projects, policies or funding organisations – the people involved know the background of these; why waste precious report space telling them what they already know?” I have seen quite a few situations where the commissioners actually do not know very much about the portfolio of projects that they are getting evaluated. So quite often I’m recommending that one of the evaluation questions, in fact often the first question, should ask the evaluation team to provide a comprehensive description of the profile of the portfolio’s investments.

    • Hi Rick

      Thanks for this. I know what you mean; there are situations where commissioners and evaluands are not well connected. Nonetheless, there are too many of those reports in which the background section is a long, Wikipedia-like screed of information that is fairly common knowledge. The point is to use this section insightfully, and ideally succinctly.
