Expanding the impact narrative to include transparent research practices: A pilot project between DataSeer and Silverchair

Defining the true impact of research is an elusive task. Open science practices such as data sharing, code sharing, protocol and methods sharing, and preregistration can establish pathways toward increased impact. Communicating that impact hinges on easy access to as many elements of the research as possible, along with its surrounding metadata and usage data. It takes a village.

For funders, transparent research maximizes the impact of every charitable dollar spent. That’s because transparent science practices are associated with high-quality, reproducible research—demonstrating the rigor and validity of scientific research, facilitating its reuse, and expanding its reach. For publishers, the ability to identify transparent research practices underpins organizational efforts to support open access, improve research discoverability, and showcase successes. Access to reliable, consistent Open Science Metrics is vital to both stakeholder groups.

Open Science Metrics meet Sensus Impact

In order to craft an engaging narrative around research impact, and the part transparent research plays within it, DataSeer, a provider of artificial intelligence data solutions for research stakeholders, is partnering with Silverchair, a leading independent platform provider for professional and scholarly publishers. The organizations have collaborated on a pilot that aims to display research transparency data at the funder level within Silverchair’s new product, Sensus Impact. This trial combines DataSeer’s Open Science Metrics with Sensus’ dashboards, creating another element in the dynamic, at-a-glance impact narrative that the platform’s funder microsites aim to tell.

Using natural language processing, DataSeer analyzes thousands of published research articles and returns top-level metrics on transparent research practices such as data sharing, code sharing, protocol sharing, preprint posting, repository use, and more. Metrics can be filtered by date, journal/publisher, corresponding author affiliation, and funding source—making it easy to identify trends and track changes over time. With Open Science Metrics, scholarly communications stakeholders can establish meaningful benchmarks, set goals, implement policies and strategies, and quantify outcomes.
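To make the shape of these metrics concrete, here is a minimal, purely illustrative sketch of how per-article transparency flags could be rolled up into the kinds of funder-level and year-over-year rates described above. The file name, column names, and flat CSV export are assumptions made for this example; they do not represent DataSeer’s actual schema, pipeline, or API.

```python
# Illustrative only: aggregate hypothetical per-article transparency flags
# into sample-wide, funder-level, and year-over-year adoption rates.
import pandas as pd

# Assumed columns: article_id, funder, year, shared_data, shared_code, posted_preprint
# (the last three as boolean flags). This schema is an assumption for the sketch.
articles = pd.read_csv("open_science_flags.csv")

# Overall adoption rates across the sample (means of boolean columns give proportions)
overall = articles[["shared_data", "shared_code", "posted_preprint"]].mean()

# Funder-level rates, the kind of slice a dashboard might display
by_funder = (
    articles
    .groupby("funder")[["shared_data", "shared_code", "posted_preprint"]]
    .mean()
    .round(3)
)

# Year-over-year trend for a single practice, e.g. data sharing since 2017
trend = (
    articles[articles["year"] >= 2017]
    .groupby("year")["shared_data"]
    .mean()
)

print(overall, by_funder, trend, sep="\n\n")
```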

Developed in partnership with Oxford University Press (OUP) and informed by nearly 30 industry organizations participating in the Community of Practice, Sensus Impact aims to centralize fragmented measurement and reporting into a single, free-to-access resource focused on demonstrating the impact of research supported by funders. Sensus captures access and usage data from sources including publishing platforms and Digital Science’s Altmetric tool, enabling funders to track article views, downloads, research citations, patent and policy citations, and more.

Together, the reporting tools have the potential to seamlessly integrate usage and impact data with data on transparency and integrity.

“Giving funders the information they need to make up-to-date decisions about their research outputs is a huge step forward for science communication. We’re really pleased to be part of the great work that Sensus Impact is doing, and hope to make a major contribution to the ways they can help funders make good decisions,” said Tim Vines, founder and CEO of DataSeer.

“We’re very excited to have DataSeer’s incredibly vital Open Science Metrics as an element of the data dashboards on the funder microsites selected for this pilot,” said Hannah Heckner Swain, VP of Strategic Partnerships at Silverchair.

“We think that this is a great way to increase the value of Sensus Impact, showcase publisher efforts to encourage the adoption of open science practices within their author base, and paint a picture for funders of the reach and impact of their investments. We look forward to exploring the possibilities of expanding this pilot to include all funders, as well as more publishers as they sign on to participate in Sensus Impact.” 

Results of a trial of transparent research reporting featuring five funders

DataSeer analyzed a sample of articles publicly available through PMC and supported by one of five major funders: the National Science Foundation (NSF), the Bill & Melinda Gates Foundation, the National Cancer Institute (NCI), the Canadian Institutes of Health Research (CIHR), and the National Aeronautics and Space Administration (NASA). All five funders have open science and data-sharing policies in place. CIHR has, since 2015, required grantees to deposit specific types of data in public repositories and to retain original datasets for five years or longer. Gates is a Plan S signatory. NCI, NSF, and NASA are U.S. federal funding bodies affected by the requirements of the Nelson Memo. Long before the Nelson Memo, NCI followed the National Institutes of Health’s 2003 Data Sharing Policy, which requires that research proposals seeking $500,000 or more include a data-sharing plan with their application and release data as described in that plan no later than acceptance for publication.

The dataset included 5000 articles published between 2001 and 2024, 1000 from each funder, with a majority published since 2017. Across the sample, the rate of data sharing was 37%, code sharing 19.83%, and preprint posting 24.32%.

Viewed funder by funder, patterns begin to emerge. Researchers supported by NCI and NASA shared data at similar rates, as did researchers supported by Gates and NSF. Authors funded by CIHR were less likely to share data.

These authors’ use of different data-sharing methods (online, supplemental, or a combination) was broadly similar, with the exception of NASA-funded researchers, who were much more likely to share data online than via any other method.

NCI- and NASA-funded authors continued to exhibit similar behavior when it came to code sharing. Authors funded by Gates were least likely to generate code during the research process; CIHR-funded authors were least likely to share code when it was generated. Authors who shared code did so almost exclusively online, typically on GitHub.

Preprint posting was most common among authors funded by NASA (69%), followed by NSF (24%), Gates (10.8%), CIHR (9.89%), and NCI (7.8%).

When viewed by publication year from 2017 to 2023, all three transparent research practices showed a general upward trend. All three remained relatively stable in the early years, between 2017 and 2020, before ramping up in 2021—likely due to the combined effects of the pandemic and increasingly stringent Open Science mandates.

Looking ahead

We envision a future in which funders have transparent research adoption data at their fingertips alongside other, more established and standardized metrics on article usage. That business intelligence will enable funders to demonstrate the real-world impact of transparent research and empower them to make data-driven policy decisions.