
Evidence for the importance of research software

Author(s)
Alejandra Gonzalez-Beltran (SSI fellow)

Michelle Barker

Daniel S. Katz

Posted on 8 June 2020

Estimated read time: 9 min


By Michelle Barker, Daniel S. Katz and Alejandra Gonzalez-Beltran.

(This post is cross-posted on the URSSI blog and the Netherlands eScience Center blog, and is archived in Zenodo).

First image of a black hole. CC BY 4.0

This blog post analyses work evidencing the importance of research software to research outcomes, so that the research software community can find useful evidence to share with key influencers. The analysis considers papers relating to meta-research, policy, community, education and training, research breakthroughs and specific software.

The Research Software Alliance (ReSA) Taskforce for the importance of research software was formed initially to bring together existing evidence showing the importance of research software in the research process. This kind of information is critical to achieving ReSA’s vision to have research software recognised and valued as a fundamental and vital component of research worldwide.

Methodology

The Taskforce has utilised crowdsourcing to locate resources in line with ReSA’s mission to bring research software communities together to collaborate on the advancement of research software. The Taskforce invited ReSA Google group members in late 2019 to submit evidence about the importance of software in research to a Zotero group library. Evidence could be from a wide range of sources including newspapers, blogs, peer-reviewed journals, policy documents and datasets. 

The submissions to Zotero to date highlight the significant role that software plays in research. We analysed the submissions and tagged them based on how some community research software organisations categorise focus areas. Some of the documents were also tagged by country and/or research discipline to enable users to search from that perspective. This resulted in the following tags, which are explained in the sections below:

  • Meta-research 
  • Policy 
  • Community
  • Education and training 
  • Research breakthroughs 
  • Software

Submission contents

Explanations of each category, and examples of some of the resources, are highlighted below. It should be noted that some of the resources mentioned below are important in more than one category; however, each is cited here under only one.

Meta-research covers resources that analyse how software is developed and used in research.

Policy is used to tag resources that focus on policy related to software, including the need for increased policy focus in this area.

Community considers work on the importance of research software communities in ensuring best practice in software development.

Education and training identifies work that considers issues around skills, training, career paths and reward structures.

Research breakthroughs covers papers on significant research accomplishments that acknowledge their reliance on research software tools.

Software has been applied as a tag to resources that mention particular pieces of software, including when this is part of a broader focus in the resource. This category includes a sample of the tens of thousands of papers that rely on research software and that collectively build knowledge in a field.

Other useful approaches

This analysis has been useful in elucidating some of the ways in which the value of software can be demonstrated. However, other approaches could also be useful. For example, methods to evaluate economic value are providing valuable statistics for research data, but comparable examples for research software are rare. The recent European Union publication, Cost-benefit analysis for FAIR research data, finds that the cost of not having Findable, Accessible, Interoperable and Reusable (FAIR) research data is €10.2bn per year for the European economy. A 2014 Australian study by Houghton and Gruen similarly demonstrates the economic value of data, estimating the value of Australian public research data at over $1.9 billion a year. One of the few economic valuations of research software is a 2017 analysis by Sweeny et al. of the return on investment generated by three Australian virtual laboratories, which provide researchers with access to research software and data. The return was at least double the investment on every measure, indicating that the services had a significant economic and user impact; by one measure, the value of one virtual laboratory was over 100 times the cost of investment. It would be useful if more studies were undertaken to demonstrate the economic benefits of research software using different methodologies.

Conclusion

This summary of evidence of the importance of research software to research outcomes illustrates increasing recognition of that importance. It could also be used to encourage the community to consider where additional work could be useful (such as expanding existing surveys in specific countries and disciplines to achieve a more global scope), and to inspire the recording of more of this information (which could include striking examples that convey the impact of failing to understand the costs and responsibilities of thoughtful software management).

We encourage readers to submit additional resources to the ReSA resources list, which is publicly available:

  • Add it directly to the ReSA Zotero group library (requires Zotero account).
  • Submit an issue in GitHub (requires GitHub account).
  • Email it directly to ReSA.