Software metrics

Regular Institute collaborator Dr. Jeffrey Carver of the University of Alabama is conducting two studies on the way people develop research software. These will help give the community a better understanding of how different practices, including code review and software metrics, are being used in the development of research software.

If you'd like to provide input into these studies, please participate in the following web surveys (each of which will take approximately 15 minutes to complete):

Code review survey (in conjunction with Nasir Eisty of the University of Alabama): https://universityofalabama.az1.qualtrics.com/jfe/form/SV_bBdeMr08ix8YbXL

Software metrics survey (in conjunction with Dr. George Thiruvathukal of Loyola University Chicago): https://universityofalabama.az1.qualtrics.com/jfe/form/SV_darjzw2JlY3OXY1


Your participation is completely anonymous and voluntary. You are free not to participate, or to stop participating at any time before you submit your answers. Both research studies have been approved by the University of Alabama Institutional Review Board.

Choosing a piece of software is difficult. What code should I bet my research on? Will the project producing the software grow or shrink? Is the code base stable or changing? Does the project depend on one organisation or many? Is the community healthy or hopelessly ill?

At the Software Sustainability Institute, we want to ensure that research software is sustainable. One of the ways we can do this is by measuring the general health of the community around the software and developing methodologies and tools for analysing modern software development. With this in place, we can improve the health of projects and make it easier to answer the questions above.

We are therefore delighted that the Software Sustainability Institute is a founding partner in the Community Health Analytics Open Source Software (CHAOSS) project. CHAOSS, officially launched this week, is a new Linux Foundation project focused on creating the analytics and metrics that help define community health.

The aims of the project are to:

  • Establish standard implementation-agnostic metrics for measuring software community activity, contributions, and health, which are objective and repeatable.
  • Produce integrated open source software for analyzing software community development.

Other members contributing to the project include Bitergia, Eclipse Foundation, Jono Bacon Consulting, Laval University (Canada),…


Software metrics

By Neil Chue Hong, Software Sustainability Institute; Daniel S. Katz, University of Illinois Urbana-Champaign; Thomas Kluyver, University of Southampton; David Mawdsley, University of Manchester; Patrick McSweeney, University of Southampton; Geraint Palmer, Cardiff University.

This post is part of the Collaborations Workshop 2017 speed blogging series.

Software is important to research. Whether you think software is a primary product of research or not—or indeed not yet—it’s clear that a lot of researchers rely on a lot of pieces of software. From short, ill-planned, thrown-together temporary scripts to solve a specific problem, through an abundance of complex spreadsheets analysing…
