Week 24: Share results often and in different ways!

I absolutely hate it when evaluation findings are shared formally only once a year in an annual report…a long document full of technical words and tables, with many appendices of data that no one reads, that exists mainly to fulfill a funder requirement. Where is the use in that? Well, I suppose you could say it is useful in satisfying the funder and helping to procure future funding, but that’s not the kind of use that I get passionate about in my work.

Data and evaluation findings should be shared when they are available. Evaluators shouldn’t hold on to data or evaluation findings until the end of a fiscal year. This is one area where research and evaluation drastically differ. Researchers may not want to share findings with their subjects because doing so could invalidate the study by introducing bias and skewing the longitudinal analyses. Evaluators, on the other hand, typically focus more on program improvement, and sharing evaluation findings more frequently supports more responsive improvements and changes in program design and implementation.

Data and evaluation findings should be shared in different ways. Not everyone responds well to a lengthy, technical report. Okay, most people don’t respond well to that! It is the evaluator’s responsibility to work closely enough with the client to understand in what ways the evaluation findings should be shared so they are most easily interpreted and used. Here are some examples of ways to share the data and findings (in the future, we’ll talk more about how to do some of these and connect you with other blogs that do a great job of explaining some of them!):

  • Meeting summaries – instead of taking regular meeting notes, incorporate some on-the-spot analyses of what is going on to help the client, including points of tension, implicit decisions, emerging themes and patterns, and recommendations moving forward.
  • Dashboards – an easy, visual way to share high-level data or findings with community groups, in quick presentations, and with boards.
  • Executive summaries – give a little more detail than dashboards, can still be visually appealing, and can include some recommendations for action.
  • Targeted table of contents – instead of starting the report with a typical table of contents, integrate some key findings and specific suggestions for use within your table of contents.
  • Bite-sized reports – we like to create our evaluation reports so they can be easily pulled apart. That way, the client can copy one page to share at a meeting, and it will tell its own story within that one page. When taken in context with other pages, it tells a richer story, but programs can interpret and use the data in more digestible sections.
  • User-friendly visuals – we always recreate graphics that are created by our statistical software packages, making them more visual and drawing the client’s attention to the important learning from each visual.

DR. TACKETT’S USEFUL TIP: Sharing data and evaluation findings with clients regularly not only helps them improve their programs and shows them the ongoing value of evaluation; it also builds stronger relationships between program staff and evaluators and between program design and evaluation use.