The Measurement Group

Improving outcomes in health, behavioral health & social services through applied social research

Helping innovative programs
improve their quality and
document their impact.


Maternal Mental Health Awareness #MMHWeek2020

May 4, 2020 By Lisa Melchior

Fact: Maternal mental health (MMH) disorders impact up to 1 in 5 women; yet most never receive a diagnosis, treatment or support.

We recently teamed up with 2020 Mom to conduct an evaluation of their pilot program to help identify ways in which certified peer specialist training could be adapted to address MMH. This pilot test included collaborations with several community-based partners, including Recovery Innovations (RI), which provided general Peer Support Specialist training.

The study included two components: (1) Certified Peer Support Specialist training, and (2) training specifically addressing maternal mental health. The first component was two weeks of live instruction covering the core skills of providing peer support in a behavioral health setting, the Stages of Change model, and addiction. The second component, provided by Women’s Health Innovations of Arizona, reviewed specific conditions and struggles related to maternal mental health over the course of two days.

The evaluation study demonstrated that those who participated:

  • Increased their confidence in their ability to provide peer support to other women;
  • Increased knowledge of information from training, including MMH-specific content;
  • Reported being satisfied or very satisfied with the training;
  • Reported using the information from the training in their current MMH work (100% of participants); and
  • Gained empathy, compassion and communication skills.

This evidence-based peer support specialist training had a positive impact on MMH peer support work and was beneficial overall to those who took part.

We are pleased to have had the opportunity to work with 2020 Mom to help tell their story! Check out this clip that includes our presentation at the Annual 2020 Mom Forum in February.

Share this post with #MMHweek2020!


Funding for the pilot test and evaluation study was provided by the Hope and Grace initiative. Click here to see the full report.


Filed Under: Behavioral Health, Populations, Program Evaluation Tagged With: #mmhweek2020, 2020 mom, Lisa Melchior, maternal mental health, mental health, Program Evaluation, The Measurement Group

Evaluation, Pandemics, and Running a Small Business

March 25, 2020 By Lisa Melchior

Greetings from my home office! The last few weeks have brought unprecedented change to our world. We plan to re-launch this blog and share posts on a variety of topics related to our evaluation work, the programs we work with, the populations they serve, as well as assorted random thoughts about who knows what.

The AEA365 blog posted recently about the role of evaluation during a pandemic. Among other things, the piece stressed flexibility and responsiveness. I can’t agree more. I’d also stress keeping communication going in these difficult times. Our work is primarily with community-based health, behavioral health, and social service providers. They are having to re-design their services on the spot – working to maintain connections and support for their clients while keeping their staff safe and healthy. Because community programs are focused on responding to the crisis, addressing evaluation concerns isn’t necessarily their highest priority. We are reaching out to our clients to let them know we are here for them and offering whatever support we can. And, it’s an opportunity to check in and evaluate whether we need to adjust our evaluation designs or implementations. For example, are our evaluation protocols working given new modes of service delivery? Do we need to adapt any of our procedures?

In terms of running this small business, we are fortunate – we can continue to do our work remotely without much interruption. But the logistics bring some challenges. Being able to just bounce something off one another takes a bit of planning (picking up the phone or messaging), rather than just walking across the hall in the office. On the plus side, though, we get to work surrounded by family and furry companions.


Be well,

Lisa


Filed Under: Program Evaluation, Uncategorized Tagged With: behavioral health, COVID-19, Lisa Melchior, Program Evaluation, The Measurement Group, TMG

New Post on AEA365 Blog: Lessons Learned in Evaluating Cross-Systems Programs

August 21, 2017 By Lisa Melchior

Credit/Copyright Attribution: Belight/Shutterstock

It’s Behavioral Health week on the American Evaluation Association’s AEA365 blog, and I had the opportunity to contribute today’s post on lessons learned in evaluating cross-systems programs. I’m grateful for the opportunity to share some of our experiences. It was an interesting exercise to think about how our work has taken a systems approach over the years.


Filed Under: Program Evaluation Tagged With: aea365, american evaluation association, behavioral health, Lisa Melchior, systems of care, The Measurement Group

Program Evaluation from Both Sides Now

January 11, 2017 By Lisa Melchior

Today’s post looks at program evaluation from both sides now – from the perspective of Jackie Gelfand, MA, a recent addition to our team here at The Measurement Group. Jackie has worn the hats of both evaluator and executive director of several community-based organizations (although not at the same time).

I have had the opportunity to participate in program evaluation as an evaluator, as well as a program director. The dual role allowed me to see the benefits of conducting program evaluation. It also helped me explain to staff that strangers would be hanging around with surveys, interrupting their work flow, and wanting to speak to “their” clients about intimate issues. Both roles required patience, an understanding of the process of evaluation, and excellent communication skills (to pass on that good understanding and patience).

In my experience as an executive director, the majority of staff come to work, do a good job, and like what they do. They complete required forms, attend required meetings, and grouse about other things. However, I think sometimes they can lack awareness of why they are doing things in a particular way. Do they understand how they have impacted the client? Are the clients improving? Getting better? Getting worse?

Program Evaluation

When unsuspecting staff members’ lives are turned upside down by evaluation teams, they learn how their “interventions” are helping the clients. They learn what staff are doing right for clients, and what they may be doing wrong. Funders want to know how we are impacting clients. How can we tell that what we’re doing is good stuff? The funder also wants to see measurable goals and objectives and how our procedures deliver positive outcomes.

Preparation is an important factor in getting staff ready. People are coming. The evaluation team will talk to you about what you do and what the clients are doing. They’re going to talk to the clients. They will talk to administrators, and the City and County and whoever else is around that deals with your organization and your clients. If you’re dealing with youth, they might even talk to parents (if they’re around). They could also speak with the neighbors near your agency who have never liked you being there. And, over the length of a year (perhaps more), they will process data continually so that they can give you the answers to questions you didn’t even know you were asking.

An Example

To illustrate, an agency I worked with received funds from a foundation to conduct program evaluation for a group home and foster family agency. The organization housed children from birth through their teens, many of whom had medical problems. What we learned from an informal internal evaluation was that many more of our youth ended up in permanent foster care than did youth from other agencies. Our youth received a variety of services, including mental health, 24-hour nursing care, case management, and wellness programs. We were successful in our mental health program. We provided the support clients needed to function at school and sometimes within their families of origin. These youth were getting the assistance they needed to separate from the child welfare system. The Department of Children & Family Services liked what we did. And we had the data to prove it.

Evaluating the Second Program

As an evaluator, this was the dream part of the evaluation. However, it wasn’t the only part that was being evaluated. Our Family Support Services Program was raucous and independent. They appeared to not understand what we were trying to do in this more formal evaluation. Interestingly enough, these particular staff members were more highly educated than the residential staff. They were therapists and therapy interns and case managers.

There were some negative outcomes. Parents complained that they weren’t getting their children back fast enough (this had to do with the Court System and DCFS, not our program). Our new foster family agency had difficulty signing up foster parents. This was true throughout Los Angeles County. However, we had promised to deliver certain numbers, and ultimately we did not meet them. Because of the evaluation, though, we had a place to start in examining why we weren’t meeting our numbers.

I felt the frustration of the evaluators. I experienced some of the same frustration on the evaluator side when I was trying to get organizations to return calls or required forms. Staff arrived for weekly meetings without having completed the tasks they were assigned. How do we report numbers or input them into a database system if we don’t have any to report? Apparently I wasn’t doing that great of a job explaining the concept of evaluation, and the buy-in was tenuous. Where we had success with the group home, we were not doing well in the foster family program.

The Outcome

Despite the good and the bad, I believe that the money from the foundation was well spent. We received information that was helpful to one program and that showed that we were benefiting the youth. The staff ended up questioning whether the new foster family program was viable in the current environment or with the particular population we served.

Without evaluation, we would have not been having discussions about any of this. Conducting the evaluation gave us the opportunity to look carefully at what we were doing. We were able to see how it impacted the clients, the neighborhood, the government entity, the staff, and, I would guess, the evaluators.


Filed Under: Program Evaluation Tagged With: evaluators, Jackie Gelfand, Program Evaluation, program staff, The Measurement Group

How is program evaluation like tennis?

April 28, 2016 By Lisa Melchior

Image courtesy of nixxphotography at FreeDigitalPhotos.net


I’ve been playing tennis on and off for as long as I can remember. I took a long break while my son was younger and my free time was a lot more limited. Now that I’m an empty nester and working on some fitness goals, it seemed like a good time to pick up my racquet again. But once I got on the court, I discovered that not only was I pretty rusty, my recall of how to hit basic strokes was off — even though I’d had plenty of practice and repetition in the past.

So how is program evaluation like tennis? It’s the issue of “drift.” With my tennis game, I was hitting my forehand shot how I thought I remembered it was supposed to be done. However, I quickly discovered with a new instructor that I had a lot to re-learn! In evaluation, we design protocols and procedures for collecting data, we train our data collectors, and off they go. However, without regularly checking in with data collectors, doing refresher trainings, and conducting data quality assurance, it’s all too easy for people to drift in how closely they follow data collection procedures. Moreover, if a project has multiple data collectors and each drifts with respect to protocols in different ways, there is no longer a consistent protocol being followed. This can lead to systematic differences in how data are collected by different people.

How to minimize data collection drift? Open and frequent communication, monitoring, and periodic training for data collectors. All are important parts of the evaluation process to maximize data quality. It’s important to ensure that everyone involved has a consistent understanding of how to implement data collection, how to address unexpected situations as they come up, and to make adjustments to make sure our measurement tools are working as intended – just like the tennis player who wants to make sure she hits the ball where she wants it to go instead of into the net.
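One part of that monitoring can even be automated. Below is a minimal sketch, not from the post, of a data quality assurance check that flags data collectors whose recorded scores drift away from the overall pattern; the collector names, scores, and one-standard-deviation threshold are invented for illustration.

```python
# Hypothetical drift check: flag data collectors whose average recorded
# score deviates from the pooled mean by more than `threshold` standard
# deviations of all scores. Names and values below are made up.
from statistics import mean, stdev

def flag_drift(records, threshold=1.0):
    """records: list of (collector, score) pairs.
    Returns {collector: mean score} for collectors who drift."""
    scores = [score for _, score in records]
    overall, spread = mean(scores), stdev(scores)
    by_collector = {}
    for collector, score in records:
        by_collector.setdefault(collector, []).append(score)
    return {
        collector: mean(vals)
        for collector, vals in by_collector.items()
        if abs(mean(vals) - overall) > threshold * spread
    }

records = [
    ("Avery", 3.1), ("Avery", 2.9), ("Avery", 3.0),
    ("Blake", 3.2), ("Blake", 2.8), ("Blake", 3.1),
    ("Casey", 4.8), ("Casey", 4.9), ("Casey", 4.7),  # drifting high
]
print(flag_drift(records))  # only Casey stands out
```

A check like this only surfaces candidates for follow-up; whether a flagged collector has actually drifted from the protocol is something the refresher trainings and check-ins described above would determine.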

PS – I recently read a blog asking how statistics are like knitting (and of course I can’t find the link to share here, sorry). The upshot of that post was that you get better with practice. The same can be said about program evaluation – and tennis.

Filed Under: Program Evaluation Tagged With: data collection, data quality assurance, Lisa Melchior, Program Evaluation, tennis, training


5757 Uplander Way, Suite 200
Culver City, California 90230
310.216.1800


The Measurement Group (TMG) is a consulting firm specializing in the application of scientific program evaluation methods for health and social services.

Since 1988, TMG has helped programs assure and improve quality and document their impact. TMG focuses on working with innovative programs designed to reach underserved and vulnerable populations.

We are happy to answer any questions that may arise, and we look forward to speaking with you soon.

Lisa A. Melchior, Ph.D.
President

Contact Us

Copyright © 2021