How is program evaluation like tennis?
I’ve been playing tennis on and off for as long as I can remember. I took a long break while my son was younger and my free time was much more limited. Now that I’m an empty nester and working on some fitness goals, it seemed like a good time to pick up my racquet again. But once I got on the court, I discovered that not only was I pretty rusty, but my recall of how to hit basic strokes was off — even though I’d had plenty of practice and repetition in the past.
So how is program evaluation like tennis? It’s the issue of “drift.” With my tennis game, I was hitting my forehand the way I thought I remembered it was supposed to be hit. However, I quickly discovered with a new instructor that I had a lot to re-learn! In evaluation, we design protocols and procedures for collecting data, we train our data collectors, and off they go. However, without regularly checking in with data collectors, doing refresher trainings, and conducting data quality assurance, it’s all too easy for people to drift in how closely they follow data collection procedures. Moreover, if a project has multiple data collectors and each drifts from the protocols in different ways, there is no longer a consistent protocol being followed. This can lead to systematic differences in how data are collected by different people.
How can we minimize data collection drift? Open and frequent communication, monitoring, and periodic training for data collectors. All are important parts of the evaluation process for maximizing data quality. It’s important to ensure that everyone involved has a consistent understanding of how to implement data collection, how to address unexpected situations as they come up, and how to make adjustments so our measurement tools work as intended – just like the tennis player who wants to make sure she hits the ball where she wants it to go instead of into the net.
PS – I recently read a blog post asking how statistics are like knitting (and of course I can’t find the link to share here, sorry). The upshot of that post was that you get better with practice. The same can be said about program evaluation – and tennis.