Incorporate open science standards in identifying evidence-based social programs

Evidence-based policy uses peer-reviewed research to identify programs that effectively address important societal issues. For example, some agencies in the federal government operate clearinghouses that review and evaluate the quality of peer-reviewed research to identify programs with evidence of effectiveness. However, the replication crisis in the social and behavioral sciences raises concerns that research publications may contain an alarming rate of false positives (rather than true effects), due in part to selective reporting of positive results. The use of open and rigorous practices—such as study registration and the availability of code and replication data—can help ensure that studies provide valuable information for decision makers, but information about these practices is not currently collected or included in assessments of research evidence.

To remedy this issue, federal clearinghouses should incorporate open science practices into the standards and procedures they use to identify evidence-based social programs eligible for federal funding.

THE DETAILS

The federal government is increasingly prioritizing the curation and use of research evidence in policymaking and in supporting social programs. In this effort, federal evidence clearinghouses—influential repositories of evidence on program effectiveness—are widely relied upon to assess whether policies and programs in various policy sectors are truly “evidence-based.” As one example, the Every Student Succeeds Act (ESSA) directs states, districts, and schools to implement programs with research evidence of effectiveness when using federal funds for K-12 public education; the What Works Clearinghouse, an initiative of the U.S. Department of Education, identifies programs that meet ESSA’s evidence-based funding requirements. Similar mechanisms exist in the Departments of Health and Human Services (Prevention Services Clearinghouse and Pathways to Work Evidence Clearinghouse), Justice (CrimeSolutions), and Labor (Clearinghouse for Labor Evaluation and Research). Consequently, clearinghouse ratings have the potential to affect the allocation of billions of dollars appropriated by the federal government for social programs.

Clearinghouses generally follow clear standards and procedures to assess whether published studies used rigorous methods and reported positive results for outcomes of interest. However, this approach relies on the assumptions that peer-reviewed research is reliable enough to inform important decisions about resource allocation and is reported with sufficient precision for clearinghouses to discern which reported results represent true effects of programs that can be scaled up. Unfortunately, published research often contains results that are wrong, exaggerated, or irreproducible. The social and behavioral sciences are experiencing a replication crisis: many large-scale collaborative efforts have had difficulty replicating novel findings reported in peer-reviewed publications. This issue is partly attributable to closed scientific workflows, which hinder reviewers’ and evaluators’ efforts to detect practices that undermine the validity of reported research findings—such as undisclosed multiple hypothesis testing and selective reporting of results.

Transparency and openness of research can mitigate the risk of false positives informing policy decisions. Open science practices, such as prospectively sharing protocols and analysis plans or releasing the code and data required to replicate key results, would allow independent third parties such as journals and clearinghouses to fully assess the reliability and replicability of research evidence. Such openness in the design, execution, and analysis of program effectiveness studies is essential to increasing public confidence in the translation of peer-reviewed research into evidence-based policy.

Currently, standards and procedures to measure and encourage open workflows—and to facilitate the detection of harmful practices in research evidence—are not implemented by either clearinghouses or the peer-reviewed journals that publish the program effectiveness research that clearinghouses examine. When these practices go unchecked, incomplete, misleading, or invalid research evidence can threaten the ability of evidence-based policy to fulfill its promise of producing population-level impacts on important societal issues.

RECOMMENDATIONS

Policymakers should enable clearinghouses to incorporate open science into the standards and procedures they use to identify evidence-based social programs eligible for federal funding, and should increase the funds allocated to clearinghouse budgets to allow them to take on this additional work. There are several barriers to clearinghouses incorporating open science into their standards and procedures. To address these barriers and facilitate implementation, we recommend the following:

  1. Dedicated funding should be appropriated by Congress and allocated to federal agencies for clearinghouse budgets so that clearinghouses can better incorporate the evaluation of open science practices into their reviews of research evidence.
    • Funding should support the hiring of additional staff dedicated to collecting data on whether open science practices are used—and, if so, whether they are used well—such as assessing the comprehensiveness of reporting (e.g., checking published results against prospectively registered protocols) and the reproducibility of results (e.g., re-running analyses using study data and code).
  2. The Office of Management and Budget should establish a formal mechanism for federal agencies that run clearinghouses to collaborate on common standards and procedures for reviewing open science practices in program evaluations. For example, an interagency task force could develop and implement updated standards of evidence that include the evaluation of open science practices, consistent with the Transparency and Openness Promotion (TOP) Guidelines for clearinghouses.
  3. Once funding, standards, and procedures are in place, federal agencies sponsoring clearinghouses should create a roadmap for eventually requiring open science practices in program effectiveness studies.
    • Other open science initiatives aimed at researchers, research funders, and journals are increasing the adoption of open science practices in newly published research. As these practices become more common, agencies may introduce open science requirements for evidence-based social programs, similar to the research transparency requirements the Department of Health and Human Services has implemented for the marketing and reimbursement of medical interventions.
    • For example, evidence-based funding mechanisms often have several tiers of evidence that distinguish the degree of certainty that a study has identified true effects. Agencies with tiered evidence funding mechanisms could begin by requiring open science practices at the highest tier, with the long-term goal of requiring that programs at every tier be supported by open evidence.

CONCLUSION

The momentum from the White House’s 2022 Year of Evidence for Action and the 2023 Year of Open Science provides an unparalleled opportunity to connect federal efforts to strengthen the infrastructure for evidence-based decision-making with federal efforts to advance open research. Evidence of program effectiveness would be even more credible if favorable outcomes were found in multiple studies that were prospectively registered, comprehensively reported, and computationally reproducible using open data and code. With the support of policymakers, incorporating these open science practices into clearinghouse standards for identifying evidence-based social programs is an impactful way to connect these federal initiatives and increase the credibility of the evidence used for policymaking.

To learn more about the importance of open science and to read the rest of the published memos, visit the open science policy landing page.
