The Measurement Group

Improving outcomes in health, behavioral health & social services through applied social research

New Life for a Training Evaluation Module: Module 57: Training Evaluation Form: Skills, Attitudes, Comfort

October 19, 2022 By Lisa Melchior

We recently learned that an evaluation module that was part of a cross-cutting evaluation we conducted for the Health Resources and Services Administration (HRSA) has been adapted for use in the evaluation of HRSA’s Nurse Education, Practice, Quality and Retention (NEPQR) – Registered Nurses in Primary Care (RNPC) Training Program.

Module 57: Training Evaluation Form: Skills, Attitudes, Comfort was developed as part of our evaluation of the HRSA Special Projects of National Significance (SPNS) Cooperative Agreement projects in the 1990s. This instrument measures self-reported changes in attitudes, skills, and comfort with training subject matter.
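
Purely as an illustration of what a form like this yields, here is a short Python sketch that tabulates self-reported change scores across the three domains. The items, the 5-point scale, and all of the data below are hypothetical; they are not reproduced from Module 57 itself.

```python
# Hypothetical sketch: tabulating self-reported change on a training
# evaluation form like Module 57. The three domains come from the measure's
# description; the items, 5-point scale, and data are made up.
from statistics import mean

# Each respondent rates skills, attitudes, and comfort with the training
# subject matter before and after the training (1 = low, 5 = high).
responses = [
    {"skills": (2, 4), "attitudes": (3, 4), "comfort": (2, 5)},
    {"skills": (3, 4), "attitudes": (4, 4), "comfort": (3, 4)},
]

for domain in ("skills", "attitudes", "comfort"):
    changes = [post - pre for pre, post in (r[domain] for r in responses)]
    print(f"{domain}: mean self-reported change = {mean(changes):+.2f}")
```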

The citation for this measure is as follows:
Huba, G. J., Melchior, L. A., Staff of The Measurement Group, and HRSA/HAB’s SPNS Cooperative Agreement Steering Committee (1997). Module 57: Training Evaluation Form: Skills, Attitudes, Comfort. The Measurement Group, Culver City, California.

Published work that used this measure:

Panter AT, Huba GJ, Melchior LA, Anderson D, Driscoll M, Rohweder C, Henderson H, Henderson R, Zalumas J. Healthcare provider characteristics and perceived confidence from HIV/AIDS education. AIDS Patient Care STDS. 2000 Nov;14(11):603-14. doi: 10.1089/10872910050193789. PMID: 11155902.

Evaluation, Pandemics, and Running a Small Business (From March 2020)

March 25, 2020 By Lisa Melchior

Greetings from my home office! The last few weeks have brought unprecedented change to our world. We plan to re-launch this blog and share posts on a variety of topics related to our evaluation work, the programs we work with, the populations they serve, as well as assorted random thoughts about who knows what.

The AEA365 blog recently posted about the role of evaluation during a pandemic. Among other things, the piece stressed flexibility and responsiveness. I couldn’t agree more. I’d also stress keeping communication going in these difficult times. Our work is primarily with community-based health, behavioral health, and social service providers. They are having to redesign their services on the spot, working to maintain connections and support for their clients while keeping their staff safe and healthy. Because community programs are focused on responding to the crisis, addressing evaluation concerns isn’t necessarily their highest priority. We are reaching out to our clients to let them know we are here for them and to offer whatever support we can. It’s also an opportunity to check in and evaluate whether we need to adjust our evaluation designs or implementation. For example, are our evaluation protocols working given new modes of service delivery? Do we need to adapt any of our procedures?

In terms of running this small business, we are fortunate: we can continue to do our work remotely without much interruption. But the logistics bring some challenges. Bouncing an idea off one another takes a bit of planning (picking up the phone or messaging) rather than just walking across the hall in the office. On the plus side, though, we get to work surrounded by family and furry companions.

Be well,

Lisa

From the AEA365 Blog: Lessons Learned in Evaluating Cross-Systems Programs

August 21, 2017 By Lisa Melchior

It’s Behavioral Health week on the American Evaluation Association’s AEA365 blog, and I had the opportunity to contribute today’s post on lessons learned in evaluating cross-systems programs. I’m grateful for the opportunity to share some of our experiences. It was an interesting exercise to think about how our work has taken a systems approach over the years.

How is program evaluation like tennis?

April 28, 2016 By Lisa Melchior

I’ve been playing tennis on and off for as long as I can remember. I took a long break while my son was younger and my free time was a lot more limited. Now that I’m an empty nester and working on some fitness goals, it seemed like a good time to pick up my racquet again. But once I got on the court, I discovered that not only was I pretty rusty, but my recall of how to hit basic strokes was off, even though I’d had plenty of practice and repetition in the past.

So how is program evaluation like tennis? It’s the issue of “drift.” With my tennis game, I was hitting my forehand the way I thought I remembered it was supposed to be hit. However, with a new instructor I quickly discovered that I had a lot to re-learn! In evaluation, we design protocols and procedures for collecting data, we train our data collectors, and off they go. But without regular check-ins with data collectors, refresher trainings, and ongoing data quality assurance, it’s all too easy for people to drift in how closely they follow data collection procedures. Moreover, if a project has multiple data collectors and each drifts from the protocol in different ways, there is no longer a consistent protocol being followed. This can lead to systematic differences in how data are collected by different people.

How do you minimize data collection drift? Open and frequent communication, monitoring, and periodic refresher training for data collectors are all important parts of the evaluation process for maximizing data quality. Everyone involved needs a consistent understanding of how to implement data collection, how to address unexpected situations as they come up, and how to make adjustments so that our measurement tools keep working as intended, just like the tennis player who wants to hit the ball where she wants it to go instead of into the net.
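
For evaluators who also want a quantitative screen to back up that monitoring, here is a minimal sketch of one way to flag possible collector drift. This is not a TMG protocol: the record format, field names, and flagging threshold below are all assumptions for illustration.

```python
# Minimal sketch: flag data collectors whose average item score drifts away
# from the group as a whole. Record format, field names, and the flagging
# threshold are hypothetical.
from collections import defaultdict
from statistics import mean, stdev

def flag_drift(records, threshold=1.0):
    """Return (collector_id, z) pairs whose mean score deviates from the
    overall mean by more than `threshold` overall standard deviations."""
    by_collector = defaultdict(list)
    for rec in records:
        by_collector[rec["collector_id"]].append(rec["score"])

    all_scores = [s for scores in by_collector.values() for s in scores]
    overall_mean, overall_sd = mean(all_scores), stdev(all_scores)

    return [
        (collector, round((mean(scores) - overall_mean) / overall_sd, 2))
        for collector, scores in by_collector.items()
        if abs(mean(scores) - overall_mean) / overall_sd > threshold
    ]

# Example: collector "C" rates every item systematically higher.
records = (
    [{"collector_id": "A", "score": s} for s in (3, 4, 3, 4, 3)]
    + [{"collector_id": "B", "score": s} for s in (4, 3, 4, 3, 4)]
    + [{"collector_id": "C", "score": s} for s in (5, 5, 5, 5, 5)]
)
print(flag_drift(records))  # -> [('C', 1.18)]
```

A flag like this is only a prompt to check in and retrain; the numbers can’t say whether the drift comes from the collector, the clients, or the setting.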

PS – I recently read a blog post asking how statistics are like knitting (and of course I can’t find the link to share here, sorry). The upshot of that post was that you get better with practice. The same can be said about program evaluation – and tennis.

Mental health home visiting service model for child abuse prevention

March 23, 2016 By Lisa Melchior

My co-authors Katherine Reuter, Ph.D., and Amber Brink and I recently published an article about a mental health home visiting service model for child abuse prevention. The journal – Children and Youth Services Review – allowed us to share our findings not only by publishing the full article, but also in a brief 5-minute audioslide presentation. Click here to view the presentation and learn more about the model and the results that support its effectiveness in improving family functioning among families with young children who are at risk for child maltreatment.
