Can we improve the sustainability and reusability of academic surveys?
Posted by j.laird
on 9 March 2021 - 9:30am
By Gillian Law, Techliterate.
Surveys are used in academia to collect data, investigate research questions, and understand environments in order to drive policy. Can we improve their sustainability and reusability?
The Software Sustainability Institute has used surveys extensively over the years to track the development of research software practices and sustainability. Developing survey projects that run over several years to collect longitudinal data is not straightforward: the data must remain interoperable over time and comply with guidelines such as the FAIR principles.
Making a real impact
We were recently contacted by the following people with questions about a survey they had been commissioned to run on open data use in Africa: Anelda van der Walt, director of Talarify, a South African research support consultancy; Louise Bezuidenhout, research fellow at the Institute for Science, Innovation and Society, University of Oxford, who works as a social scientist examining the evolving Open Data/Open Science landscape and the evolution of data sharing infrastructures, practices and communities; and Thomas Mboa, Professor of Digital Humanities at the Advanced School of Mass Communication in Cameroon, who has strong interests in the Maker Movement, social innovation, open science and scholarly communication.
Their Surveying Open Data Practices project aimed to create an open, interoperable data bank with questions and responses that can be used by different stakeholders to understand open data practices in research. The initial survey was launched in 2020 in eight African countries and in three languages: English, French and Portuguese.
The project’s funding is coming to an end, so the team started looking for new funders and potential hosts for the resources they had created. Bezuidenhout says:
“If our survey is to have a real impact, it’s probably not going to do that from this single data set that we’ve gathered. It’s going to make an impact through being added to, by being cross-referenced to other countries, or longitudinally, so that it can look at changes in practice over time. But because we have a very small amount of funding and no automatic home for the survey, it’s very unlikely that our survey would be reused.
“We’ve done an analysis of the major surveys that have been done in the last five years around data sharing, and the variability in questioning is mind-blowing. And that makes the data sets absolutely non-comparable, because if you’re asking slightly different questions and you’re surveying slightly different populations, you’re going to get data that can’t be compared.
“We’ve done a lot of thinking about reusability and how to lower the entry bar so that more people can use it. Survey questions are made available in machine-readable formats to facilitate uploading into any survey platform. Analysis scripts are developed in a free and open-source tool, R, and shared openly along with an interactive visualisation app. We’ve provided a template for ethics approval, and for the budget you would need. We’ve been thinking a lot about these very pragmatic things that often get overlooked.”
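The question-bank idea can be sketched in a few lines. The format below is hypothetical, not the project's actual schema: it simply assumes stable question IDs and per-language text, which is one way a machine-readable bank could stay platform-independent and keep responses comparable across survey runs and translations.

```python
import json

# Hypothetical machine-readable question bank (NOT the project's real schema):
# each question has a stable ID and translated text keyed by language code.
question_bank = """
[
  {
    "id": "OD-01",
    "type": "single_choice",
    "text": {
      "en": "How often do you share research data openly?",
      "fr": "A quelle frequence partagez-vous ouvertement des donnees de recherche ?",
      "pt": "Com que frequencia partilha dados de investigacao abertamente?"
    },
    "options": ["never", "rarely", "sometimes", "often", "always"]
  }
]
"""

def load_questions(raw, language="en"):
    """Parse the bank and return (id, localised text, options) tuples,
    falling back to English when a translation is missing."""
    questions = json.loads(raw)
    return [
        (q["id"], q["text"].get(language, q["text"]["en"]), q["options"])
        for q in questions
    ]

# Load the French version of the bank for a francophone survey run.
for qid, text, options in load_questions(question_bank, language="fr"):
    print(qid, text, options)
```

Because the question IDs and answer options are fixed while only the display text varies, responses collected in different languages or countries can be pooled and compared question by question.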
Challenges of surveys
Van der Walt approached the Institute based on prior experience with its International Research Software Engineer (RSE) survey. “Our surveys are very different, as are our audiences, but we do have similar problems,” said van der Walt, discussing survey-related challenges including sustainability, interoperability, accessibility, recognition of effort, quality, and internationalisation (translation in terms of both language and context).
Simon Hettrick, deputy director of the Institute, says he immediately recognised the issues faced by van der Walt, Mboa and Bezuidenhout: “You start a survey with volunteer effort and it becomes popular because people want the data. Then changes over time become interesting and suddenly you’re running a longitudinal study. The problem is, your volunteers need to focus on their day jobs. Balancing this need whilst keeping the survey alive is really difficult. It’s a sustainability problem and there’s a lot of overlap with the problems we face with software sustainability.”
Many of the issues relating to surveys chime with Hettrick’s interest in research outputs that are not considered ‘conventional’. Hettrick has set up the Hidden REF competition to raise awareness of the vast number of research outputs that are not recognised. He says, “This lack of support for sustaining research outputs comes up time and again, and part of the problem is down to the fact that we tend to overlook the need to manage research projects, run instruments or run longitudinal surveys, and the people who do this work rarely get recognised.” If people’s efforts with surveys aren’t recognised (or measured), it is hard to formally justify the money, time, infrastructure and other resources needed to sustain them.
“I’d like to hear from other people in the research community who are having the same problem,” Hettrick says. “To be useful, these surveys of communities and their practices need to run for a long time and that takes a lot of effort. The people who begin the surveys are rarely the ones with the contacts and experience of gaining funding for longitudinal surveys. We really need people to start discussing these issues, and I hope the problems faced by Anelda, Thomas and Louise can help to start the conversation. Please get in touch if you recognise this problem, and especially if you have any solutions to propose!”