Policy update: surveys, jobs and diversity

Author(s)
Simon Hettrick
Deputy Director

Posted on 18 August 2015


By Simon Hettrick, Deputy Director and Policy Lead.

This is the third in a series of blog posts taking you behind the scenes at the Institute. Today, we hear about the recent activities of the Policy team.

Thus far in 2015, the policy team's efforts have mainly focused on providing data and support for the Institute's funding bid for a second phase. With this out of the way, July saw the policy team re-focus on research. We've also had time to catch up with our Research Software Engineer campaign and to welcome a new member of staff.

Surveying further

In October 2014, we ran a nationwide survey to determine researchers' views on software. We were keen to analyse and publish these results quickly and get the preliminary message out. Due to staff availability, we ended up conducting this first-pass analysis in Excel. Although we published our analysis, Excel is not the best package for transparency, which is why we rightly received opprobrium from open-data advocates. But we reasoned that our approach would be acceptable as long as we repeated and extended the work in a more transparent manner in the future.

Our new starter this July, Olivier Philippe, arrived with some serious R skills and was immediately tasked with making the survey analysis transparent. He is using knitr to document both the results and the analysis in HTML. We will publish a paper later in the year about the analysis, and support this by publishing the data, the analysis and the results on Zenodo.

Investigating the job market

Our general interest in software careers led us to study jobs.ac.uk - the de facto job site for UK academia - to understand how many people are employed to conduct software development in academia. Our preliminary analysis, run on a subset of posted vacancies, showed that around 4% of all academic jobs have a slant towards software development. Half a year later, it appears that our original work was about right, with about 7% of the job adverts now being related to software development. The average salary of a software developer in UK academia appears to be around £34,000.

The jobs investigation has benefited from the hard work of many of our staff members. Mario Antonioletti has been conducting the initial analysis and has developed R code to analyse the job adverts. He’s also been instrumental in developing the methodology of the analysis. Steve Crouch wrote many of the early scripts that identified jobs related to software. Iain Emsley has been working on a data dashboard that will take the results and automatically present them on our website.

However, the jobs analysis is being held back by an important problem. We are identifying software-related jobs with a simple word-matching text search (e.g. looking for terms like "software developer"). Although simple, this approach appears to achieve a reasonable hit rate, with what we believe to be a low percentage of false positives. However, we want to base the job identification on more solid foundations, so this month we began to investigate text mining techniques. This is a new approach for the policy team, and one that could lead into some exciting areas of research, but it's early days so we don't expect to collect publishable results until the end of the year at the earliest.
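To give a flavour of the word-matching approach described above, here is a minimal sketch in Python. The Institute's actual analysis is written in R and its full term list is not given in this post, so the terms below (and the `is_software_job` helper) are illustrative assumptions, not the real methodology:

```python
import re

# Illustrative term list only: the actual keywords used in the
# jobs.ac.uk analysis are not published in this post.
SOFTWARE_TERMS = [
    "software developer",
    "software engineer",
    "research software",
    "scientific programmer",
]

# One case-insensitive pattern; the \b word boundaries stop terms
# matching inside longer, unrelated words.
PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(t) for t in SOFTWARE_TERMS) + r")\b",
    re.IGNORECASE,
)

def is_software_job(advert_text: str) -> bool:
    """Flag an advert as software-related if any term appears in it."""
    return PATTERN.search(advert_text) is not None

adverts = [
    "Postdoctoral Research Associate in particle physics",
    "Research Software Engineer to develop simulation codes",
    "Lecturer in History",
]
flagged = [a for a in adverts if is_software_job(a)]
```

A search like this is fast and easy to audit, but it misses adverts that describe software work without using the listed phrases, which is exactly the weakness that motivates moving to text mining.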

The RSE Fellowship

The Institute's policy team have been working to support Research Software Engineers since the team was created in January 2013, so we welcomed the RSE Fellowship call made by the EPSRC earlier in the year. The first stage of this call was based on an "intent to submit", which gave a very short overview of the candidate and their plans. In July we began discussions with the EPSRC team to determine how best to extract information from the intents to submit that could be of use to the research community. In particular, we're interested in the geographic spread of the RSEs, the institutions in which they're based, the demographics of RSE applicants and their plans for the Fellowship.

We are working closely with the EPSRC to identify questions that can be answered by the intents to submit. Due to privacy restrictions, the analysis will be conducted by the EPSRC, but we will jointly publish the results with them. Expect that information by the end of the year.

Diversity in the workplace

Finally, in July I took part in a panel discussion at the CASC/HPC-SIG meeting held in Oxford. The meeting brought together HPC researchers and providers from the US and the UK. The panel I sat on was tasked with the question of diversity in software. I was joined by Toni Collis, who has conducted some very interesting research into diversity in HPC, and Curt Hillegas and John Towns representing the US perspective. My role was to discuss careers for software developers in academia and, since the EPSRC’s Susan Morrell had unfortunately become embroiled in the tube strike, I was also asked to provide an overview of the RSE Fellowship.

Although I tried hard to steer the conversation back to careers, I have to admit that the room was much more interested in issues around diversity – with a focus on gender and disability. It was a fascinating discussion which raised an interesting question about evidence. The people on the panel and in the audience were all scientists, so everyone expected an evidence-led debate. It was interesting to see how people reacted to gaps in the evidence.

There has been a study of gender diversity in HPC in the US. As anyone working in the sector would expect, this showed that there are many more men than women working in HPC. These results were extrapolated to the UK, which is when the schism formed. Some people in the room argued that there was no evidence of gender inequality in the UK, so the argument was moot until further evidence could be supplied. Others argued that the US and UK are similar countries, so it was likely that the US evidence would be roughly applicable to the UK. What’s more, people working in HPC in the UK had seen many fewer women working in the sector – as evidenced by the handful of women in the very room in which the debate was taking place.

Personally, like most scientists, I believe that decision making is irrational unless it's evidence-led. But it's simply not practical to investigate only those issues where there is already cast-iron evidence. Where there are good reasons to believe an issue exists, we must deal with that issue, whilst remaining aware that it is not yet proven and that one of our first steps should be the collection of evidence. In fact, this is exactly the situation we found with research software. We had been working with software for years and had collected significant amounts of anecdotal evidence of a heavy reliance on software, but it wasn't until we conducted our software survey that we finally had conclusive evidence of software use.

If you have any questions about the Policy team’s work, or would like to work with us, please get in touch.
