The Pathways Clearinghouse team wants to make sure you can access the evidence you need! Browse the responses below to common questions about the Pathways Clearinghouse. If you do not see the answer to your question, contact the team.

Or use the filters below to narrow the list of questions and answers:

What is the difference between the Pathways Clearinghouse and the Department of Labor’s Clearinghouse for Labor Evaluation and Research (CLEAR)?

The Pathways Clearinghouse, an investment of the Office of Planning, Research, and Evaluation in the Administration for Children and Families within the Department of Health and Human Services, identifies interventions that aim to improve employment outcomes, reduce employment challenges, and support self-sufficiency for populations who are low income, especially recipients of public programs such as TANF. The Pathways Clearinghouse was designed for use by direct employment service providers and TANF administrators. Potential users also include policymakers and researchers.

The mission of the Department of Labor’s Clearinghouse for Labor Evaluation and Research (CLEAR) is to make research on labor topics more accessible to practitioners, policymakers, researchers, and the broader public so that it can inform their decisions about labor policies and programs. CLEAR reviews literature across a range of labor-related topics, including employment and training, health and safety, worker benefits, and employer compliance.

Given the overlap in content, the two clearinghouses coordinate their efforts. While the scope of the Pathways Clearinghouse includes research that is also eligible for review by CLEAR, Pathways specifically focuses on programs designed to meet the needs of populations who are low income or who face chronic barriers to employment.

How can I stay up to date on new information released by the Pathways Clearinghouse?

We will send occasional mailings to users who subscribe to our email list, alerting them to new website content and open calls for studies. For real-time updates on Pathways Clearinghouse activities, events, and presentations, follow the Office of Planning, Research, and Evaluation (OPRE) on Twitter or Facebook and use the #PathwaysClearinghouse hashtag to launch a conversation. You can also follow Mathematica using @MathematicaNow across all social media platforms.

Where can I find a list of well-supported or supported interventions?

Use the intervention search tool to find the most up-to-date information on the evidence about helping youth and adults with low incomes succeed in the labor force. Begin by going to Find Interventions that Work, and then use the sorting options to show interventions that are rated well-supported or supported by domain at the top of the search results.

How do I submit an employment and training intervention for consideration?

The Pathways Clearinghouse welcomes submissions of research studies that have evaluated a particular employment and training intervention. Researchers or others interested in having the Clearinghouse review studies of specific interventions may submit the intervention name and its associated citations or research studies (see the question below: How do I submit my research for consideration?). You can also submit developmental interventions, which have research on their effectiveness underway but not yet completed, and case studies of interventions using new and promising practices with no research on their effectiveness. The team will log all recommendations. Those interested in submitting such studies should note, however, that the Pathways Clearinghouse does not review interventions on a rolling basis. Instead, we review interventions periodically, as resources allow. As a result, interventions submitted to the Pathways Clearinghouse will not be reviewed immediately upon request. In addition, submitting a recommendation does not guarantee that we will review studies of the intervention.

How does the Pathways Clearinghouse choose research for review?

The Pathways Clearinghouse team uses prespecified keywords to systematically search databases of journal publications, evaluation reports, and unpublished literature (such as working papers). We also incorporate studies cited in literature reviews and those submitted in response to a public call for relevant studies. We screen studies for their eligibility to be included in the review against a set of predefined criteria. Then, we prioritize the eligible studies based on publication date (examining the most recent first and turning to older research as resources allow). For more information, please see the Protocol for the Pathways to Work Evidence Clearinghouse.

How do I submit my research for consideration?

Occasionally, the Pathways Clearinghouse issues a call for studies and invites the public to submit research. When a call is open, we will post a link to more information on the Pathways Clearinghouse home page and will circulate the call to users who subscribe to our email list. Interested stakeholders can submit research at any time, even when no call for studies is posted, and the team will log the submission for consideration alongside submissions from the next call for studies. Submitting research does not guarantee that the study will be reviewed. We decide which research to review using the systematic process outlined in the Protocol for the Pathways to Work Evidence Clearinghouse.

Can I appeal the rating that the Pathways Clearinghouse has applied to my study?

The Pathways Clearinghouse Quality Review Team (QRT) handles any challenges stakeholders make about a review’s findings, the inclusion of a study within the Pathways Clearinghouse, or other individual judgments the Pathways Clearinghouse team makes. The QRT addresses any issues with reviews that stakeholders raise, so long as they are (1) submitted in writing, (2) related to a specific study or well-defined set of studies, and (3) coherently explained (and the inquirer is available to answer any clarifying questions).

When a request is submitted to the QRT, a team member first verifies the request meets the criteria listed above. After this confirmation, the team member examines the study and any related materials, discusses the review with the original study reviewers, and presents a summary of the review and any potential flaws to the QRT. The QRT then determines whether the initial review should be revised, notifies OPRE and the inquirer of its findings and, if necessary, edits any Clearinghouse products to reflect the updated review.

How does the Pathways Clearinghouse identify developmental interventions?

Developmental interventions have a rigorous impact evaluation underway, but no available findings at the time the Pathways Clearinghouse identifies them. Nominations for developmental interventions can come from federal staff, evaluators, and Pathways Clearinghouse stakeholders. If the evaluations meet the eligibility criteria established in the Protocol for the Pathways to Work Clearinghouse: Methods and Standards, they are added to the Developmental interventions page. These evaluations must be randomized controlled trials or comparison-group quasi-experimental designs of an employment or training intervention in the U.S. or Canada that focuses on populations with low incomes.

It is important to note that the Pathways Clearinghouse is not a registry for impact evaluations, and submitting evaluations does not count as registering research. If you have a developmental intervention to share with the Pathways Clearinghouse, please email the name of the intervention, a brief description, and links to any additional publicly available materials (such as websites or reports) to the Pathways Clearinghouse team.

How does the Pathways Clearinghouse estimate the effects of an intervention?

For the Pathways Clearinghouse, the effects shown are the estimated changes in the percentage of low-income adults who are employed, average annual earnings, average annual public benefits received, and the percentage of low-income adults with any education and training credential. To make effects comparable, the Pathways Clearinghouse first expresses each impact in standard deviation units (Hedges’ g effect sizes). It then combines these impacts into domain average effects by averaging across studies and outcomes, giving more weight to studies with larger sample sizes. Finally, it converts these averages into 2018 dollars and percentages by comparing them to the distributions of outcomes for adults with low earnings potential in the 2019 Current Population Survey Annual Social and Economic Supplement.
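The sample-size-weighted averaging step described above can be sketched in a few lines of Python. This is an illustrative simplification with made-up effect sizes and sample sizes, not the Clearinghouse’s actual computation, which also handles the conversion to 2018 dollars and percentages:

```python
# Sketch of a domain-average effect: each study contributes a Hedges' g
# effect size, and studies with larger samples receive more weight.
# Effect sizes and sample sizes below are hypothetical.

def domain_average_effect(effect_sizes, sample_sizes):
    """Sample-size-weighted average of Hedges' g effect sizes."""
    total_n = sum(sample_sizes)
    return sum(g * n for g, n in zip(effect_sizes, sample_sizes)) / total_n

gs = [0.20, 0.05, -0.10]  # Hedges' g from three hypothetical studies
ns = [1000, 250, 250]     # analytic sample sizes of those studies

avg = domain_average_effect(gs, ns)
print(round(avg, 3))  # 0.125
```

Because the first study is four times larger than each of the others, its favorable effect dominates the weighted average even though one smaller study found an unfavorable effect.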

What’s the difference between effect and effectiveness rating?

An effectiveness rating is the assessment of the Pathways Clearinghouse, based on the existing evidence from impact studies, of the extent to which a given intervention improves a specific type of outcome. The effectiveness rating is a holistic assessment of whether an intervention is likely to produce favorable results if faithfully replicated with a similar population. An effect size is a standardized measure that allows us to make direct and meaningful comparisons across different outcomes, settings, and interventions. Both the effectiveness rating and the effect size can be compared across studies in the Pathways Clearinghouse. The effect size is the impact found in the available research, and the effectiveness rating is the likelihood that an intervention would produce a similar, favorable effect if implemented again with a similar population and context.

How can I use effectiveness ratings to know what impact an intervention would likely have if implemented again?

An effectiveness rating assesses whether an intervention is likely to produce favorable results if faithfully replicated with a similar population. Outcome domains with well-supported ratings are those that the evidence suggests are most likely to improve if an intervention were replicated with a similar population. Outcome domains with supported ratings have some evidence that the intervention improves them, but the evidence is less conclusive. Outcome domains that receive a rating of not supported have strong or consistent evidence that the intervention is unlikely to produce favorable effects.

However, because implementation challenges and successes often vary and no two implementations of an intervention are identical, the well-supported and supported ratings do not guarantee success.

The intervention search results show one effectiveness rating per domain, but when I click on the individual intervention, many domains have more than one rating. What does the effectiveness rating by domain on the intervention search page mean?

The four domains for which the Clearinghouse rates effectiveness are earnings, employment, public benefit receipt, and education and training. Within the first three of these domains, the Pathways Clearinghouse reports on short-term, long-term, and very long-term outcomes (the education and training findings are reported at the longest follow-up and are not segmented into these three time periods). To make the intervention search results easier to view and navigate, the effectiveness ratings on the search page represent the highest rating given to the short-term, long-term, or very long-term outcomes for that intervention. For example, if an intervention has a supported effectiveness rating in the long term for earnings, but not in the short term or very long term, we will display the supported icon for the earnings domain. Users can click on the individual interventions to see whether the effectiveness ratings apply to short-term, long-term, or very long-term outcomes.

The Pathways Clearinghouse reports the effects of an intervention for earnings, employment, public benefit receipt, and education and training. Why do some interventions lack effectiveness ratings or effects for some of those domains?

The Pathways Clearinghouse’s ability to report on the effects of an intervention is tied to the existing evidence. Not all interventions have been studied along all the domains on which we report (earnings, employment, public benefit receipt, and education and training). An intervention may have studies that examined effects in some outcome domains but not in others. In other cases, the quality of evidence may vary across outcome domains. If we did not find any studies rated moderate or high quality that examined the intervention’s effect on outcomes in a given outcome domain, that domain receives an effectiveness rating of no evidence to assess support. In addition, we may not have all of the data needed to extract an effect size from the original study (as discussed in the previous FAQ). In those cases, we cannot report on the effects of an intervention for those domains.

What are very long-term outcomes?

Very long-term outcomes are those measured 5 years or more after participants in the study’s intervention group are first offered services.

How can an intervention with a negative overall effect on earnings have a supported rating in earnings?

This has to do with the difference between how we calculate whether or not an intervention has evidence of being effective on a given outcome domain and how we calculate the size of an intervention’s effects on that outcome domain. In order to receive a supported rating, an intervention must have at least one statistically significant, favorable finding and no statistically significant unfavorable findings in the given outcome domain. Effect sizes, on the other hand, are an average of all findings for a given outcome domain, including those that are not statistically significant.

Take, for example, a study that finds three effects on earnings, one of which is statistically significant and favorable, and two of which are statistically insignificant but unfavorable. Because the study identified a statistically significant, favorable effect on earnings and no statistically significant unfavorable findings, the intervention receives an effectiveness rating of supported on earnings. In calculating its overall effect on earnings, however, we average all the findings in this domain: the two unfavorable but statistically insignificant findings, along with the statistically significant, favorable effect. The average of these three findings might result in an overall negative effect on earnings.
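The arithmetic behind this example can be illustrated with a short Python sketch. The effect sizes are hypothetical, and the rating logic is simplified from the criteria described above (the actual Pathways Clearinghouse rating process involves additional standards):

```python
# Three hypothetical earnings findings from one study, expressed as
# Hedges' g effect sizes. Positive g = favorable, negative = unfavorable.
findings = [
    {"g": 0.15, "significant": True},    # favorable, statistically significant
    {"g": -0.10, "significant": False},  # unfavorable, not significant
    {"g": -0.12, "significant": False},  # unfavorable, not significant
]

# Simplified rating rule from the text: at least one statistically
# significant favorable finding, and no statistically significant
# unfavorable findings, yields a "supported" rating in the domain.
sig_favorable = any(f["significant"] and f["g"] > 0 for f in findings)
sig_unfavorable = any(f["significant"] and f["g"] < 0 for f in findings)
rating = "supported" if sig_favorable and not sig_unfavorable else "not supported"

# The reported effect averages ALL findings, significant or not.
average_g = sum(f["g"] for f in findings) / len(findings)

print(rating)               # supported
print(round(average_g, 3))  # -0.023
```

Here the intervention earns a supported rating on the strength of its one significant favorable finding, while the average across all three findings is slightly negative.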

Where can I find information on the cost or cost-effectiveness of an intervention?

We provide cost information for interventions that have at least one supported or well-supported effectiveness rating. Cost information is located on the details page for an intervention, the page which supplies more context about interventions with favorable outcomes. All cost information reported on the website is taken directly from the study authors. Therefore, cost information on intervention pages will vary and might include costs per participant, administrative costs, or other details that authors choose to report. If the study authors conducted a cost-benefit analysis, we will also provide those data. The Pathways Clearinghouse does not factor information on cost into the effectiveness ratings displayed on the website.

Because the cost information is taken directly from the study authors, cost information is not directly comparable across interventions. The types of costs reported across interventions vary, and reports of costs are not adjusted to reflect the same economic year. For example, some costs per participant for one intervention might be reported in 1996 dollars, and others might be in 2013 dollars. However, the cost details provide a useful starting point for thinking about the costs to organizations that implemented the interventions.

During the next year, the Pathways Clearinghouse will continue to add implementation information, including author-reported cost details, for interventions that have a supported or a well-supported effectiveness rating in an outcome domain.

Why aren’t effect sizes reported for some study findings and interventions?

The Pathways Clearinghouse estimates effect sizes for each finding with a high or moderate rating when the study or study authors provide sufficient information to calculate an effect size. The Pathways Clearinghouse team contacts authors to obtain the information necessary to calculate the effect size, but in some cases, sufficient information is not available.

Why does the Pathways Clearinghouse report more information for studies with high or moderate quality ratings than those with low ratings?

Studies with low quality ratings demonstrate little evidence that findings are attributable, in part or in full, to the intervention examined. A low quality rating suggests that there is a high risk of bias. The Pathways Clearinghouse focuses on providing detailed information for studies rated high or moderate quality because these studies have a lower risk of bias, and these interventions are more likely to have contributed to the reported outcomes. In other words, the Pathways Clearinghouse focuses on well-implemented studies because they provide the most relevant, useful information for practitioners and decision makers.

Why do some studies use a different set of race and ethnicity terms?

Whenever possible, the Pathways Clearinghouse follows federal standards for classifying data on race and ethnicity. For some studies—in particular, studies published before 2000—the Pathways Clearinghouse uses earlier classifications of race and ethnicity to be consistent with how studies tended to report this information at the time. The Pathways Clearinghouse may use either set of race and ethnicity terms, depending largely on when a study was published. There are three key differences between the two classification systems:

  1. Some earlier studies asked individuals to select both their ethnicity (Hispanic) and their race (White, Black, Asian, and so on). In practice, this means that, in these older studies, an individual who self-identifies as both Hispanic and White could be identified in the data as both Hispanic and White. By comparison, under current standards the same individual would be identified only as “Hispanic or Latino”.
  2. The language used to describe certain racial classifications has shifted over time. For example, some earlier studies might use an older term, whereas later studies use “Native Hawaiian or Other Pacific Islander.”
  3. For some earlier studies, the Pathways Clearinghouse combined into a single category (1) the cases in which race was unknown or not reported by the study author, and (2) cases in which another category was used that is not aligned with the Pathways Clearinghouse categories. For example, a study author might present a population that is 40 percent White, 40 percent Black, 10 percent Middle Eastern or North African (MENA), and 10 percent unknown. In this scenario, the authors use a race classification, MENA, that is not clearly aligned with a category in the current federal standards. If this study were described using the older race and ethnicity classifications, then the Pathways Clearinghouse would combine the percentage of individuals who identified as MENA with those whose race was unknown, reporting that the study sample was 40 percent White, 40 percent Black, and 20 percent unknown, not reported, or another race. If the study sample were described using the current standards, by contrast, the Pathways Clearinghouse would classify the sample as 40 percent White, 40 percent Black, 10 percent another race, and 10 percent unknown or not reported. The move towards disaggregation is intended to provide practitioners with the most complete information about the demographics of the population served by a given intervention.

Why do the race and ethnicity totals sum to more than 100 percent for some studies?

The Pathways Clearinghouse strives to present a complete set of race and ethnicity data for each study, in accordance with the federal standards for classifying data on race and ethnicity. In practice, for most studies this means that an individual can fall into only a single race or ethnicity category. For example, a person would identify as either Hispanic or Black, but not both. Because of this, most studies present race and ethnicity data that sum to 100 percent. In some cases, however, study authors treated race and ethnicity as non-exclusive categories. For example, an individual could self-identify both as Hispanic (ethnicity) and Black (race). In cases where individuals report on race and ethnicity separately, the sum total for race and ethnicity demographics might be greater than 100 percent.

Why does Pathways Clearinghouse list a portion of the samples for some studies as being of an unknown, not reported, or another race or ethnicity?

Individuals within the study sample could fall into these categories because they chose not to identify their race, because the authors did not collect or report race and ethnicity data, or because the individual identified with a race other than one of the federal categories. Wherever possible, the Pathways Clearinghouse differentiates between cases where race and ethnicity were unknown or not reported, and cases where race and ethnicity were reported but fell into a different category. However, a number of earlier Pathways Clearinghouse reviews of studies published before 2000 did not differentiate between unknown, not reported, and other cases, instead reporting these as a single category.

Why does the Clearinghouse report demographics on biological sex rather than gender?

Most of the studies reviewed by the Pathways Clearinghouse report information using the biological sex categories of male and female. Fewer studies report on gender demographics, including but not limited to transgender and nonbinary-inclusive gender categories. In order to present information consistently across all studies, the Pathways Clearinghouse has, at this time, limited itself to presenting biological sex categories rather than gender.

Why does the Pathways Clearinghouse only report the effects for some study findings?

The Pathways Clearinghouse reports on the effects for study findings that are rated high or moderate quality in our four domains (earnings, employment, public benefit receipt, and education and training). When the quality of a study is high, we can be fairly confident in the study findings because they are attributable to the intervention examined. This rating is reserved for study findings from high-quality RCTs with low attrition of sample members. The Pathways Clearinghouse also reports effects for moderate quality studies, where we can be somewhat confident in the study findings.

There are several reasons that some findings in a study may not be rated. Specifically:

  • Studies may have outcomes that are rated low quality, where we cannot have much confidence in the study findings. Other important factors could have influenced the study findings, and the study did not account for them. The Pathways Clearinghouse does not report outcomes that are rated low quality.
  • If the original study provides findings for multiple outcome measures in a given domain, the Pathways Clearinghouse prioritizes findings to review and report based on the outcome measure, following the prioritization process summarized in Exhibit III.2 of the Protocol for the Pathways to Work Clearinghouse: Methods and Standards. For example, if study authors used both surveys and administrative records to assess earnings, Pathways Clearinghouse reviewers select two sets of earnings findings for review: one measured using survey data and one measured using administrative data.

In addition, the Pathways Clearinghouse may not have sufficient information to report on the magnitude of study findings. We calculate an effect size using study-specific data, if we can obtain from the study authors the information needed to do so.