By: AIF Staff
The American Idea Foundation worked with Stand Together, the Sorenson Impact Center, and Notre Dame’s Lab for Economic Opportunities to create a user-designed clearinghouse that enables caseworkers to identify evidence-based programs and refer children and families to them.
The clearinghouse, Connect2Impact, was born from a desire to help social service providers identify evidence-based programs more easily. It was designed to fill an information gap in the poverty-fighting space between end-users, those recommending programs, and the researchers who have evaluated these strategies. In short, it is all too difficult for harried caseworkers or parents to identify programs that truly work, so the clearinghouse aims to centralize this information with an emphasis on evidence and data.
The federal government has attempted to address this information gap before. It passed the Family First Prevention Services Act, which required a new clearinghouse showing which poverty-fighting programs work and which do not. The federal government has also mandated other clearinghouses, for education, for welfare programs, and for job-training programs, but they are rarely utilized by people on the ground.
Connect2Impact makes information about evidence-based programs and strategies available to individuals and families. It started with child and family welfare programs and plans to broaden the scope of searchable programs going forward.
Connect2Impact started with child and family welfare programs because of the profound effects the COVID-19 pandemic will undoubtedly have on our nation’s youth. A year of isolation, a prolonged disruption of regular routines, and a lengthy removal from in-person schooling are expected to negatively impact all children. Children at risk of entering foster care are even more vulnerable.
In April 2022, Speaker Ryan convened and moderated a conversation about the thinking behind the Connect2Impact clearinghouse and the importance of promoting evidence-based strategies that make tangible differences in the lives of children. Ryan was joined in conversation by:
- Sara Peters, Vice President of Impact and Evaluation, Stand Together Foundation
- Brendan Perry, Project Design Manager, Wilson Sheehan Lab for Economic Opportunities
- Lilly Myers, Impact Strategy at Sorenson Impact Center
Excerpts from the policy panel, which have been edited lightly for clarity, follow. Video of the panel discussion can be accessed HERE.
Sara Peters on the rationale behind the user-designed Connect2Impact clearinghouse
“Two weeks into COVID-19, I was leading a portfolio practice group and talking to a number of non-profit executive directors who were facing the reality of reduced budgets and an increase in program sign-ups. They were dealing with a lot of sign-ups that they weren’t predicting: people who were anticipating getting laid off or people who were worried and on the precipice of poverty. These new clients increased the needed dosage of programming, and program directors wanted families to receive it, but they didn’t think they were ideally prepared to serve all of those new clients.
“I started to have some conversations about why there isn’t a clearinghouse that is client-focused and customer-focused, with predictive algorithms into which we can load information from randomized controlled trials, external validations, and other programmatic characteristics. It would be a clearinghouse where users can self-select the characteristics that matter and, over time, we can have a recommendation engine that program directors, program leaders, and families can use when looking for services.”
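The recommendation engine Peters describes can be pictured as a simple filter-and-rank step over a catalog of programs. The sketch below is purely illustrative: the program fields (location, evidence_level, populations_served) and the scoring rule are assumptions made for this example, not Connect2Impact’s actual data model or algorithm.

```python
# Illustrative sketch only: a toy version of the kind of clearinghouse filtering
# and ranking described above. The fields and weights are hypothetical, not
# Connect2Impact's actual implementation.
from dataclasses import dataclass, field

@dataclass
class Program:
    name: str
    location: str                 # e.g., "Salt Lake County"
    evidence_level: int           # assumed coding: 0 = none, 1 = quasi-experimental, 2 = RCT
    populations_served: set = field(default_factory=set)

def rank_programs(programs, user_location, user_characteristics):
    """Filter to programs local to the user, then order by evidence strength and by
    how many of the user's self-selected characteristics the program serves."""
    local = [p for p in programs if p.location == user_location]
    def score(p):
        overlap = len(p.populations_served & user_characteristics)
        return (p.evidence_level, overlap)
    return sorted(local, key=score, reverse=True)

if __name__ == "__main__":
    catalog = [
        Program("Family Coaching A", "Salt Lake County", 2, {"single parents", "ages 0-5"}),
        Program("Mentoring B", "Salt Lake County", 1, {"teens"}),
        Program("Job Training C", "Utah County", 2, {"single parents"}),
    ]
    for p in rank_programs(catalog, "Salt Lake County", {"single parents"}):
        print(p.name, p.evidence_level)
```

In a real tool, the evidence codes would come from external evaluations and the characteristics from the user’s own selections, as Peters suggests.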
Brendan Perry on the Literature Review that informed Connect2Impact’s approach
“What we learned was that while a lot of research has been done on child welfare programs, around 500 studies by our count, far too few of those studies conduct rigorous research with sample sizes large enough to get a really clear result.
“And so, after we went through all of these, maybe only 20 or so studies met our highest levels of rigor and sample size. This leaves a landscape with less causal evidence on the effectiveness of programs, which is less than ideal for such an important issue. Broadly, I would say we learned that there’s a need for more rigorous, large-scale studies on child welfare programs that improve our understanding not only of what works, but also of for whom it has worked.”
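The screening Perry describes can be read as a two-part filter over study metadata: study design and sample size. The snippet below is only a sketch of that idea; the design labels and the 500-participant cutoff are assumptions for illustration, not the Lab for Economic Opportunities’ actual review rubric.

```python
# Illustrative only: screening studies by design rigor and sample size, in the
# spirit of the literature review described above. The labels and cutoff are
# hypothetical, not LEO's actual criteria.
RIGOR_RANK = {"rct": 2, "quasi_experimental": 1, "descriptive": 0}

def top_tier(studies, min_sample=500):
    """Keep only randomized trials with a sample at least as large as min_sample."""
    return [s for s in studies
            if RIGOR_RANK.get(s["design"], 0) == 2 and s["n"] >= min_sample]

if __name__ == "__main__":
    studies = [
        {"title": "Home visiting RCT", "design": "rct", "n": 1200},
        {"title": "Small pilot RCT", "design": "rct", "n": 80},
        {"title": "Matched comparison study", "design": "quasi_experimental", "n": 2000},
    ]
    print([s["title"] for s in top_tier(studies)])  # -> ['Home visiting RCT']
```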
Lilly Myers on working with Utah-based providers to make Connect2Impact relevant
“After the initial pilot program, a beta version of the website was created and seeded with evidence-based programs local to Salt Lake County. We were able to sit down with supervisors and practitioners from the Utah Department of Child and Family Services to have them actually use the website, walk them through its features, and get their feedback on what was useful and what they wanted to see from it.
“Overall, the response was incredibly positive. They were very excited about this kind of tool. A lot of times, the way that they find programs for their clients is word of mouth or something even as rudimentary as Googling to find what’s local to them. So, the opportunity to have a tool that combines which programs are actually local to their area, and which they can feasibly recommend for their clients, with the evidence behind those programs was very valuable to those practitioners….
“It was a big endeavor to just map out and characterize the programs offered to children and families within Salt Lake County. But from there, we’ve discussed opening it up to Utah as a whole and characterizing all of the state’s programs. I think it would be tremendously useful and impactful moving into other states and even starting just with larger cities.”
Ryan on how the government has lagged in developing data-driven child welfare strategies
“Until recently, policymakers have ignored the child welfare space. The recent passage of the Family First Prevention Services Act was the first major reform in this area since the early 1980s. This isn’t for a lack of problems in the system: far too many children were taken out of their homes too quickly, while other children were left languishing in really difficult situations. We just weren’t getting it right.
“Thankfully, there are a number of hard-working individuals in this space who are working to provide permanent, safe homes to children. And even more importantly, we are working to prevent the need for youth to enter the foster care system in the first place. The creation of a searchable, accessible website for caseworkers and other people who refer children and families in the child welfare space seemed necessary, and that’s why we created Connect2Impact.”
Brendan Perry on why expanding usage of evidence-based programs requires more research and greater dissemination
“One part of it is getting the research done and one part is getting the study results into the hands of practitioners. I think the Connect2Impact tool is going to be really vital in bridging that gap. And as you said, I think it also revealed to us that there are some important research questions that aren’t adequately addressed in the existing literature, one of which is what the effects of these programs are on long-term outcomes.
“It’s great to know what the effect of program X is on reunification or days in foster care. But it would be even more helpful to know what the effect of program X is on high school completion, college completion, interactions with the criminal justice system, and earnings down the road, and to understand the long-term impact of these programs….”
“In terms of what researchers should be doing to make evidence more usable: obviously, academic papers are a big part of what researchers do and they’re important for validating results, but we need to stop thinking about academic papers as an end product. If a paper evaluating your program is published and then just sits on the shelf of another academic, it’s really not doing what it’s supposed to do, which is to inform the end users: the case managers who are sitting there with clients and the policymakers who are making decisions.
“There is some onus on researchers, and I know that we feel this a lot, but we have to take academic results and then package and disseminate them in a way that can be used by the audiences that really need the information. I think this tool will go a long way in bridging that gap, but there’s certainly a need for more work on both the creation of evidence and how evidence is disseminated to these different groups.”
Sara Peters on how evidence matters, but simple factors may matter more to end-users
“From my discussions with practitioners, [the value of evidence] varied a lot based on their ability to use evidence, or how frequently they used it, in the behind-the-scenes decision-making about the programs that they use. From the evidence standpoint… it’s not as cut and dried as whether there is evidence or there isn’t. All of these trials and studies have some kind of limitations, including how applicable they are to certain sample populations.
“So, we realized the big thing that practitioners look at and care about is the sample population, the groups that the research was done on. That ended up being a strong piece of information that workers on the ground want to see and want to be able to assess for themselves.”
Brendan Perry on how Notre Dame’s Lab for Economic Opportunities helps non-profits utilize data
“It’s definitely a long and sometimes exciting process, but really, it’s all about finding those innovators who are on the ground, who are doing something that they believe is moving the needle on poverty. It’s about finding those groups that are having a positive impact and showing these organizations the basics and the importance of doing impact evaluations.
“And then from there, our goal is really walking hand-in-hand with them to see how we can overlay a research design that’s going to be minimally impactful to their everyday work because we know that doing research is just another thing on their plate sometimes. We want to take as much of the burden from them as possible so that we can design a rigorous study in the least invasive way.
“Once we get it up and running, we work with them to understand the results, and that brings us to how we disseminate results. One thing that we’ve begun to do to increase dissemination is to build engagement plans with our partners about how to use results, whether those results are positive, negative, or neutral. That way we can communicate to their internal teams what the results were and what they mean, and then communicate with other providers in the same space: putting on webinars, making connections to funders, to the media, and to local policymakers, and promoting the result in a way that’s going to improve programming for their current clients and their future clients….
“Then finally, a big part of using evidence is replication, so when we find those all-star, rock-star programs, it’s about making sure that we can package those programs and describe them in a way that makes them easy to scale and replicate in other places.”
This panel discussion on child welfare was part of a quarterly series of policy conversations hosted by the American Idea Foundation to draw attention to evidence-based policies aimed at expanding economic opportunity. Past policy conversations have focused on building a 21st-century workforce, reforming the Earned Income Tax Credit, reducing recidivism and promoting second chances, and properly implementing Opportunity Zones.
###