The Measurement Group

Improving outcomes in health, behavioral health & social services through applied social research


We’re Hiring! Senior Evaluation Associate

January 31, 2023 By Lisa Melchior

We are looking to add a Senior Evaluation Associate to our team at The Measurement Group. If you think this position might be for you, please submit a cover letter and resume to info@themeasurementgroup.com.

Description

The Senior Evaluation Associate is a new position on our team. The Senior Evaluation Associate will be responsible for leading evaluation projects and managing multiple projects of varying size, duration, and complexity through all phases of the work – from proposal development through final reporting. We seek an experienced, well-trained evaluation professional with the ability to manage several ongoing projects simultaneously. The Senior Evaluation Associate will report directly to the company President.

Our current project portfolio includes evaluations of comprehensive, coordinated behavioral health care models, peer support programs within the context of maternal mental health, youth substance use prevention interventions, services for people re-entering the community from correctional settings, behavioral health workforce development initiatives, and other related health, behavioral health, and social service programs.

This position is primarily remote; occasional in-person meetings may be required depending on the nature of the work (at our Los Angeles area office and/or at client sites at various US locations). Occasional domestic travel may also be required depending on the needs of a given project. Applicants will need a home office with high-speed internet and a professional setting appropriate for confidential meetings. We will provide a laptop and office supplies for your home office.

Primary Duties

  • Plan, design, and manage evaluation studies within budget and timeline
  • Manage and coordinate project teams and related workflow
  • Write and edit reports and other communications
  • Develop and implement qualitative, quantitative, and mixed method evaluation measures and conduct appropriate analyses
  • Implement existing evaluation tools and protocols as required by project funders

Additional Duties

  • Develop and conduct trainings on evaluation methods and procedures.

Knowledge, Skills, and Abilities

  • Content knowledge/experience in health, behavioral health, or social service discipline
  • Proficiency and experience in program evaluation/applied social research methods
  • Ability to manage multiple simultaneous projects
  • Ability to work as a team member and independently
  • Ability to work with diverse clients and coworkers
  • Attention to detail and a commitment to quality
  • Comfortable with the ambiguities and tensions inherent in creative professional work where differing viewpoints are encouraged
  • Ability to contribute to a cooperative and supportive workplace environment
  • Excellent written and oral communication skills
  • Ability to present findings to diverse audiences, in writing and orally
  • Proficiency with general office productivity tools such as Microsoft Office, Google applications, etc.
  • Skills in design and use of a range of quantitative and qualitative data collection tools, including interviews, focus groups, observations, surveys, and assessments
  • Skills in quantitative, qualitative, and mixed method data analysis and data analysis software such as SPSS, NVivo, etc.
  • Ability to work collaboratively with coworkers and clients

Qualifications

  • Master’s degree or equivalent combination of education and experience in Social Work, Public Health, Psychology, Program Evaluation, or other field related to health, behavioral health, and/or social services
  • At least 5-7 years of related experience (not including education), including prior project management or program development experience, plus solid experience with computer applications for office productivity and quantitative/qualitative/mixed method analysis
  • Fluency in oral and written Spanish preferred
  • Possible occasional domestic travel

Benefits

  • Employer-paid health, dental and vision insurance premiums
  • Vacation and sick time
  • Paid holidays
  • Employer-paid retirement account contribution (after vesting period)
  • Potential for future equity role in organization

Annual Salary

Depends on experience. This is an exempt (non-hourly) position.

To Apply

Submit resume and cover letter to Dr. Lisa Melchior at info@themeasurementgroup.com.


New Life for a Training Evaluation Module: Module 57-Training Evaluation Form: Skills, Attitudes, Comfort

October 19, 2022 By Lisa Melchior


We recently learned that an evaluation module that was part of a cross-cutting evaluation we conducted for the Health Resources and Services Administration (HRSA) has been adapted for use in the evaluation of HRSA’s Nurse Education, Practice, Quality and Retention (NEPQR) – Registered Nurses in Primary Care (RNPC) Training Program.

Module 57: Training Evaluation Form: Skills, Attitudes, Comfort was developed as part of our evaluation of the HRSA Special Projects of National Significance (SPNS) Cooperative Agreement projects in the 1990s. This instrument measures self-reported changes in attitudes, skills, and comfort with training subject matter.

The citation for this measure is as follows:
Huba, G. J., Melchior, L. A., Staff of The Measurement Group, and HRSA/HAB’s SPNS Cooperative Agreement Steering Committee (1997). Module 57: Training Evaluation Form: Skills, Attitudes, Comfort. The Measurement Group, Culver City, California.

Published work that used this measure:

Panter AT, Huba GJ, Melchior LA, Anderson D, Driscoll M, Rohweder C, Henderson H, Henderson R, Zalumas J. Healthcare provider characteristics and perceived confidence from HIV/AIDS education. AIDS Patient Care STDS. 2000 Nov;14(11):603-14. doi: 10.1089/10872910050193789. PMID: 11155902.


Evaluation, Pandemics, and Running a Small Business (From March 2020)

March 25, 2020 By Lisa Melchior

Greetings from my home office! The last few weeks have brought unprecedented change to our world. We plan to re-launch this blog and share posts on a variety of topics related to our evaluation work, the programs we work with, the populations they serve, as well as assorted random thoughts about who knows what.

The AEA365 blog recently posted about the role of evaluation during a pandemic. Among other things, the piece stressed flexibility and responsiveness. I couldn't agree more. I'd also stress keeping communication going in these difficult times. Our work is primarily with community-based health, behavioral health, and social service providers. They are having to re-design their services on the spot, working to maintain connections and support for their clients while keeping their staff safe and healthy. Because community programs are focused on responding to the crisis, addressing evaluation concerns isn't necessarily their highest priority. We are reaching out to our clients to let them know we are here for them and offering whatever support we can. It's also an opportunity to check in and evaluate whether we need to adjust our evaluation designs or implementations. For example, are our evaluation protocols working given new modes of service delivery? Do we need to adapt any of our procedures?

In terms of running this small business, we are fortunate – we can continue to do our work remotely without much interruption. But the logistics bring some challenges. Being able to just bounce something off one another takes a bit of planning (picking up the phone or messaging), rather than just walking across the hall in the office. On the plus side, though, we get to work surrounded by family and furry companions.

Be well,

Lisa


From the AEA365 Blog: Lessons Learned in Evaluating Cross-Systems Programs

August 21, 2017 By Lisa Melchior


It’s Behavioral Health week on the American Evaluation Association’s AEA365 blog, and I had the opportunity to contribute today’s post on lessons learned in evaluating cross-systems programs. I’m grateful for the opportunity to share some of our experiences. It was an interesting exercise to think about how our work has taken a systems approach over the years.



How is program evaluation like tennis?

April 28, 2016 By Lisa Melchior

Image courtesy of nixxphotography at FreeDigitalPhotos.net


I’ve been playing tennis on and off for as long as I can remember. I took a long break while my son was younger and my free time was a lot more limited. Now that I’m an empty nester and working on some fitness goals, it seemed like a good time to pick up my racquet again. But once I got on the court, I discovered that not only was I pretty rusty, but my recall of how to hit basic strokes was off — even though I’d had plenty of practice and repetition in the past.

So how is program evaluation like tennis? It’s the issue of “drift.” With my tennis game, I was hitting my forehand the way I thought I remembered it was supposed to be hit. However, I quickly discovered with a new instructor that I had a lot to re-learn! In evaluation, we design protocols and procedures for collecting data, we train our data collectors, and off they go. However, without regular check-ins with data collectors, refresher trainings, and data quality assurance, it’s all too easy for people to drift in how closely they follow data collection procedures. Moreover, if a project has multiple data collectors and each drifts from the protocols in different ways, there is no longer a consistent protocol being followed. This can lead to systematic differences in how data are collected by different people.

How do we minimize data collection drift? Open and frequent communication, monitoring, and periodic refresher training for data collectors are all important parts of the evaluation process for maximizing data quality. It’s important to ensure that everyone involved has a consistent understanding of how to implement data collection, knows how to address unexpected situations as they come up, and makes adjustments so that our measurement tools work as intended – just like the tennis player who wants to hit the ball where she wants it to go instead of into the net.

PS – I recently read a blog post asking how statistics are like knitting (and of course I can’t find the link to share here, sorry). The upshot of that post was that you get better with practice. The same can be said about program evaluation – and tennis.



Helping innovative programs
improve their quality and
document their impact.


Copyright © 2023 The Measurement Group