My co-authors, Katherine Reuter, PhD, and Amber Brink, and I recently published an article about a mental health home visiting service model for child abuse prevention. The journal, Children and Youth Services Review, allowed us to share our findings not only by publishing the full article, but also in a brief five-minute audioslide presentation. Click here to view the presentation and learn more about the model and the results supporting its effectiveness in improving family functioning among families with young children who are at risk for child maltreatment.
It’s time to resume blogging

I’ve been absent from the blog for some time. I had a death in my family and assorted deadlines pushed blogging down the priority list for a while. But now it’s time to resume blogging. Look for new posts soon!
New publication coming soon

It’s always exciting when we have the opportunity to publish findings from our program evaluations, and we are pleased to announce that we have a new publication coming soon! An article resulting from our work with the Westside Partnerships for Families program has been accepted for publication in Children and Youth Services Review. The paper, “An Intensive Mental Health Home Visiting Model for Two At-Risk Early Childhood Populations,” is authored by Katherine Reuter, PhD, former Program Director at the Providence Saint John’s Child and Family Development Center (CFDC) in Santa Monica, California, and by Lisa Melchior, PhD, and Amber Brink, BA, of The Measurement Group.
Partnerships for Families (PFF) is a child abuse prevention program that was initially funded by First 5 LA in 2006. As part of a county-wide initiative, Providence Saint John’s CFDC implemented the program in the Westside Service Planning Area of Los Angeles County. In the PFF Mental Health Model, mental health professionals work with families in a home visiting setting. The purpose of the program is to reduce risk and build protective factors in families with young children at risk for child maltreatment. The paper describes the PFF Mental Health Model and documents improvements in family functioning among participants. It also presents data showing how caregivers reduced their risk on a number of personal characteristics — particularly those amenable to change through mental health intervention. Results were examined in two groups referred to the program: families with young children (ages 0-5 years) who were referred by the Department of Children and Family Services, and pregnant women who were referred by community service providers due to risk factors for child maltreatment (such as depression, substance use, and/or domestic violence). Positive outcomes were observed at both the individual caregiver and family levels in both groups.
We are pleased to help disseminate the outcomes of this innovative program and contribute to the evidence base for its effectiveness. While the focus of this article is primarily quantitative, we are currently working on another that will present a qualitative analysis of selected case studies to illustrate ways in which program participants demonstrate aspects of family strengthening.
Time flies…or what’s been going on at The Measurement Group

I really didn’t mean for this much time to go by since my last blog entry in May. My colleagues and I at The Measurement Group have been busy with a number of interesting projects. For example:
We recently completed a five-year evaluation of the Los Angeles Integrated Add Us In Consortium, led by the Integrated Recovery Network. This project was funded by the US Department of Labor, Office of Disability Employment Policy (ODEP), to expand employment opportunities for people with disabilities. The Los Angeles Consortium created and implemented a successful model for helping people with disabilities, primarily mental health disabilities, to build “soft skills,” connect with employers, and obtain and maintain jobs that match their abilities and interests. Across the five-year project period, the Los Angeles Consortium helped its clients — many of whom were homeless and dealing with mental illness — successfully obtain 193 jobs! It was an incredibly rewarding experience to work with this group and help them document and share their outcomes.
Since earlier this year, we have been working with Arizona First Things First on the evaluation of an online developmental screening program they have implemented in three regions. We’ve interviewed a number of stakeholders to understand issues that they encounter in providing developmental screening to families with young children (0-5 years). In particular, we’ve been learning about the successes and challenges they’ve had in using an online screening tool. We continue to gather and share information with our colleagues on this project to help inform their future developmental screening strategies.
These are just a few highlights of the work we’ve been doing in recent months. In addition, life got busy with high school graduations, family trips, and my son’s transition to college, so blogging took a back seat for a while. But with lots of news to share, I will be returning to active blogging soon. If not now, then consider it an early New Year’s resolution!
Some of my professional influences in evaluation

An evaluation blog that I follow, aea365.org, had a recent post by Liz Zadnik about her professional influences and inspirations in evaluation. Reading Liz’s post got me thinking about some of my own influences over the years.
First of all, I am grateful for the mentorship of George J. Huba, PhD – my partner at The Measurement Group for 23 years. Back in the day, George started TMG from his kitchen table. Shortly after securing our first contract, I joined him in this exciting venture. Even though we both had professional training and experience in social science research methods (he much more than I), we learned a lot over the years about translating that into doing evaluation in the real world with programs that serve vulnerable populations. I learned from George’s ability to take results from complex analyses and turn them into straightforward, bottom-line recommendations for health, mental health, and social service professionals. George’s high standards set a bar for my work that I continue to strive for. Now a “retired evaluator,” George is currently a social media force to be reckoned with, blogging at www.hubaisms.com and tweeting prolifically as @DrHubaEvaluator.
My list of professional influences would not be complete without Vivian B. Brown, PhD. Dr. Brown, an accomplished community psychologist and the now-retired founder of Prototypes, was an early adopter of program evaluation, back when few community treatment program administrators understood the value of collecting data to demonstrate their outcomes. I have many fond memories of meetings with Vivian and George where we would brainstorm about evaluation questions, ways to disseminate our findings to the field, or how to use the information from our evaluation work to improve services for the women and children in Prototypes’ outreach, prevention, and treatment programs. One of the most important things I take from my work with Dr. Brown is to always be cognizant of the burden of collecting data from program participants, especially with respect to asking sensitive questions and the context in which they are asked.
Before I became an evaluator, I was trained as a research psychologist. So what made me want to do that in the first place? Jonathan M. Cheek, PhD, was my undergraduate advisor at Wellesley College. There was something about doing research with him that sparked my interest; it struck me as something that certainly wouldn’t be boring. I have always appreciated the opportunities I had as an undergraduate to conduct psychological research and to publish it in several articles and book chapters. And although my work in personality research was a long time ago, it’s been exciting to see a renewed interest in the study of traits (for example, Susan Cain’s excellent book “Quiet”). My background in individual and group differences has certainly influenced my approach to evaluation — realizing that it’s not only whether a program is effective, but for whom and in what circumstances.
There are many more influencers and inspirations than I can acknowledge in this blog post. So, I’d like to think of this as the start of an occasional series, and I look forward to giving a shout-out to the rest of you in the future.