Week 71: Two Basic Principles for Reporting for USE

This past week I was asked to review an evaluation report that was in its final stage and needed one last going-over. Here is how that day went:

  • 10:00am - Colleague asks me to review the report and provide comments by 3:00pm
  • 10:10am - Begin reading
  • 12:10pm - Nap, brought on by boredom from reading said report
  • 2:10pm - Finally arrive at the conclusions and decide I can't take it anymore

Now, I don’t mean to sound facetious, but perhaps you have experienced something similar, either as a client of evaluation or as an evaluator. Too often evaluation reports are looooong, technical, poorly structured and ineffective in presenting the data that have been collected. After reading this one, I gave the authors two simple suggestions, which I will share with you here:

  1. Present key findings as subheadings, followed by the evidence that supports each one. This report presented 25 pages (single-spaced) of "findings," organized by the criteria introduced in the introduction, with no clear findings actually stated, leaving readers to extract them on their own, which pretty much guarantees some inconsistency in interpretation. The evaluator is tasked with collecting a body of evidence and extracting from it a series of relevant findings. It is not good practice to present the evidence alone, with no reference to findings, and expect readers to arrive at those findings themselves. By clearly presenting each finding as a subheading and then discussing it alongside all of the relevant data available, you give the reader a clear picture of what was found and how you went about finding it.
  2. Use data to support claims. Seems obvious, I know. But, as evaluators, we often develop a deep understanding of the program or project we are evaluating. We begin to get hunches and gut feelings about it. These should NOT come out in our reports without an evidence base. Sure, they can be helpful for knowing where to look, or even what to look for, but they must not be used in a way that resembles an opinion piece in the Sunday paper. In this particular case, I noticed uneven use of evidence and too many claims with no apparent evidence base. Use the data you have likely spent so much time collecting, analyzing and making sense of. Share the knowledge you have gained transparently, making it clear where each claim comes from and how you arrived at it.

These two things would have made this report far easier to read and far easier to follow. Instead, I struggled through the findings section and emerged weary and unclear about what the “story” was. What was I supposed to take away from this report, as an interested party? I think I knew, but should we operate on the basis that readers will take from a report whatever they will? I don’t think so. I would argue it is imperative that we make clear to the reader exactly what we found during the course of the evaluation, and show them why we believe it to be true.

It was clear to me that the authors of this report had done a substantial amount of data collection and analysis and had a deep understanding of the subject. This is where we often find ourselves when it comes time to write an evaluation report: we have a story in our minds about the program or project we are evaluating, and we need to put that story down on paper so it is told clearly to our readers. Make it clear, make it digestible, and make sure the story comes through smoothly, not in unclear chunks. These tips are nothing fancy; they are simply key principles I believe we must all keep in mind as we produce evaluations for people looking to learn from them.

COREY’S USEFUL TIP: When writing reports, make it clear to the reader what they are reading by chunking out findings, lessons learned, conclusions, and so on. Support each of these with data. Put yourself in the shoes of the reader, who won’t have the deep, evidence-driven understanding of the program that you have as the evaluator.