Week 20: I was at the European Evaluation Society Biennial Conference!

I was extremely fortunate to attend and present at the European Evaluation Society Biennial Conference in Dublin, Ireland last week. Not only did I get to hear noted evaluators like Michael Scriven, Bradley Cousins, Jennifer Greene, and Michael Bamberger, I had the opportunity to co-present with an international group of new colleagues including Sara Vaca (Spain), Gillian McIlwain (Australia), Emmanuel Sangra (Switzerland), and Graham Smith (Australia).

Today I want to talk about something Michael Scriven said that resonated with me. In a follow-up session after his conference keynote, Michael said (and I wrote this down word for word), “P-values and statistical significance are bulls***. We need to focus more on unintended consequences of the work and the ethics behind the work.” He went on to say that if evaluators focus only on whether a program did what it was supposed to do, and to what extent, then the evaluation field hasn’t advanced much in the last 50 years. He feels we have a moral responsibility to try to understand the unintended outcomes as well, and how the program influenced those unintended outcomes.

His comments made sense to me on so many levels.

When working with our clients, they rarely care about p-values or any statistical notation, so we rarely share them in reports or presentations. The statistical work is there, in the background, as it’s how we arrive at some of our conclusions and recommendations, but we present it in a way that is more meaningful to the client…using charts, pictures, vignettes, etc. that paint the story of what the data is saying.

Uncovering the unintended consequences of a program or intervention is what we call DSI – Data Scene Investigation (yes, totally stealing that from CSI). Personally, I love digging into the data to uncover what other stories can be told, outside of the stories we were initially looking for (which are critically important, too, of course). Sometimes patterns emerge that pique our interest, so we dig deeper. Other times we just throw out hypotheses and test them with the data to see if they can be supported or refuted. We triangulate our data sources (quantitative and qualitative) when we develop all of our conclusions and recommendations, but it’s those unintended ones that can be so revealing and compelling. Sharing those unintended consequences can sometimes be scary, but I loved that Michael said it was our moral responsibility. I agree! This speaks to my deontologist soul, as I truly believe that I need to do what’s right no matter what – it is my duty to do so. And if working with a client uncovers some unintended information that may upset them, it is still my duty to share that information and help them use it in a practical way.

We also try to be proactive in evolving our evaluation design, implementation, and data analyses. Sure, the work we do is based in theories, some of which may have been developed 50 years ago, but we need to constantly push our work forward to make it more meaningful and useful for the clients. That may involve new ways of designing, analyzing, triangulating, interpreting, or reporting data and findings.

I always love Michael’s down-to-earth, no-holds-barred attitude when sharing his latest words of wisdom. I will share more insights from the conference, including some about my own presentation, in future posts.

DR. TACKETT’S USEFUL TIP: The usefulness of understanding the unintended consequences of an intervention often outweighs the value of uncovering progress towards outcomes, but working on both concurrently helps build trust in and creates value for the evaluation process.