Case studies

By Gillian Law, technology writer

The LUX-ZEPLIN project is building the largest and most sensitive dark matter detector of its type ever constructed. The detector will be built a mile underground in the Sanford Underground Research Facility (SURF) in Lead, South Dakota, and is due to go live in 2020.

Potential detector materials are currently being screened prior to their use in the experiment, and the results are collated and analysed in a 43-sheet Microsoft Excel spreadsheet. The spreadsheet has worked well to date, allowing researchers to share and view data, but moving to a more versatile and robust database will be very useful once the experiment begins, says Dr Alex Lindote, LZ Background Simulations project lead, who is based at the Laboratory of Instrumentation and Experimental Particle Physics (LIP)-Coimbra, Portugal.
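A move from a multi-sheet spreadsheet to a database might look something like the following sketch, which loads a CSV export of one sheet into SQLite so that cross-sheet lookups become queries. The table layout, column names and values here are illustrative assumptions, not the project's actual schema or data:

```python
import csv
import io
import sqlite3

# Hypothetical CSV export of one screening sheet; columns are illustrative.
csv_data = io.StringIO(
    "material,isotope,activity_mBq_per_kg\n"
    "copper,U-238,0.12\n"
    "PTFE,Th-232,0.05\n"
)

conn = sqlite3.connect(":memory:")  # use a file path for a persistent database
conn.execute(
    "CREATE TABLE screening ("
    "material TEXT, isotope TEXT, activity_mBq_per_kg REAL)"
)
conn.executemany(
    "INSERT INTO screening VALUES (?, ?, ?)",
    ((r["material"], r["isotope"], float(r["activity_mBq_per_kg"]))
     for r in csv.DictReader(csv_data)),
)
conn.commit()

# A query replaces a manual cross-sheet lookup, e.g. the most active samples:
rows = conn.execute(
    "SELECT material, isotope, activity_mBq_per_kg "
    "FROM screening ORDER BY activity_mBq_per_kg DESC"
).fetchall()
print(rows[0])
```

Unlike a shared spreadsheet, the database enforces column types on insert and lets many analyses run against one authoritative copy of the data.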

Lindote set up the spreadsheet in late 2015, importing data from a Google spreadsheet that researchers had been using to share their data.

“It was getting hard to track who was making changes and what was happening, so I was asked to start taking care of it. I decided to move it to an Excel file that I could control more easily,” Lindote says.

Once it became clear…

Continue Reading

MONC

By Selina Aragon, Communications Officer, in conversation with Adrian Hill, Met Office

This article is part of our series: Breaking Software Barriers, in which we investigate how our Research Software Group has helped projects improve their research software. If you would like help with your software, get in touch.

Adrian Hill, the project's primary contact, talked to us about the value of the Institute's collaboration with the Met Office and EPCC in promoting the uptake and development of MONC. Adrian especially highlighted the invaluable help he received from Mike Jackson, Research Software Engineer, in laying the groundwork for what has become successful software with unexpected benefits and long-term value, used by researchers as well as PhD and master's students.

Collaborative efforts

In collaboration with EPCC (Edinburgh Parallel Computing Centre) and the Met Office, the Institute helped rewrite the Large Eddy Model (LEM) as its successor, the Met Office NERC Cloud model (MONC). MONC is a complete re-engineering of LEM that preserves LEM's underlying science, and has been developed to provide a flexible community model that can exploit modern supercomputers…

Continue Reading

Practice of Reproducible Research

By Justin Kitzes, University of California, Berkeley

We are very happy to announce the launch of our open, online book The Practice of Reproducible Research, to be published in print by the University of California Press later this year. In short, this book is designed to demonstrate and teach how research in the data-intensive sciences can be made more reproducible. The book centres on a collection of 31 contributed case studies, in which experienced researchers provide examples of how they combined specific tools, ideas, and practices in order to improve the reproducibility of a real-world research project. These case studies are accompanied by a set of synthesis chapters that introduce and summarise best practices for data-intensive reproducible research.

Within the overall context of reproducibility, our book focuses specifically on the goal of achieving computational reproducibility in individual research projects. We define a research project as computationally reproducible if a second investigator can recreate the final reported results of the project, including key quantitative findings, tables, and figures, given only a set of files and written instructions. This focus reflects our belief that computational reproducibility forms a first and most foundational goal for individual investigators interested in the broad goals of reproducible…
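That definition can be exercised mechanically: a minimal sketch, with a toy analysis standing in for a real pipeline, in which a second run from the same inputs and instructions must reproduce the same fingerprint. The function and field names are illustrative, not from the book:

```python
import hashlib
import json
import random

def run_analysis(seed: int) -> dict:
    # Stand-in for a full pipeline: every stochastic step takes an explicit
    # seed, so identical inputs yield identical outputs.
    rng = random.Random(seed)
    samples = [rng.gauss(0.0, 1.0) for _ in range(1000)]
    return {"n": len(samples), "mean": round(sum(samples) / len(samples), 6)}

def fingerprint(results: dict) -> str:
    # Canonical serialisation, so byte-identical results hash identically.
    return hashlib.sha256(
        json.dumps(results, sort_keys=True).encode()
    ).hexdigest()

first = fingerprint(run_analysis(seed=42))
second = fingerprint(run_analysis(seed=42))  # the "second investigator"
print("reproducible:", first == second)
```

Comparing fingerprints rather than eyeballing figures makes the reproducibility claim a testable property of the project.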

Continue Reading

By Selina Aragon, Communications Officer, in conversation with Trung Dong Huynh, University of Southampton

This article is part of our series: Breaking Software Barriers, in which we investigate how our Research Software Group has helped projects improve their research software. If you would like help with your software, get in touch.

From concept to software

Provenance is traditionally the record of ownership of a work of art or an antique, used as a guide to authenticity or quality. Although it originated in the art world, the term is now used in an array of fields ranging from palaeontology to science, where it refers to knowledge of all the steps involved in producing a scientific result, such as a figure: from experiment design, through acquisition of raw data, to the subsequent steps of data selection, analysis and visualisation. Such information is necessary to reproduce a given result, and can serve to establish precedence. The concept carries over to the digital world: data, too, originates from a particular point, and provenance provides evidence of that origin by establishing the data's ownership, custody, and transformations.
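As a toy illustration (not the W3C PROV data model, which standardises these relationships formally), a derivation chain from raw data to a finished figure can be recorded and walked back to its origin. All names here are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    """A data artefact (raw file, cleaned table, figure) and how it arose."""
    name: str
    generated_by: str = ""                       # activity that produced it
    derived_from: list = field(default_factory=list)  # parent artefacts

def lineage(entities: dict, name: str) -> list:
    """Walk derived_from links back to the origin of a result."""
    steps = [name]
    for parent in entities[name].derived_from:
        steps.extend(lineage(entities, parent))
    return steps

# A three-step chain: experiment -> data selection -> visualisation.
entities = {
    "raw.csv": Entity("raw.csv", generated_by="experiment run"),
    "clean.csv": Entity("clean.csv", "data selection", ["raw.csv"]),
    "figure1.png": Entity("figure1.png", "visualisation", ["clean.csv"]),
}
print(lineage(entities, "figure1.png"))
# → ['figure1.png', 'clean.csv', 'raw.csv']
```

Given such a record, anyone can see exactly which data and which processing steps stand behind a published figure.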

Continue Reading

As we complete work with projects, we write up each project as a case study. The case studies are short, focussed on achievements, and easily the best way to see the kind of work that we do - and how we could work with your project.

If you would like to discuss working with the Software Sustainability Institute, please contact us.

Infrastructure and project management

Continue Reading

By Steve Crouch, Research Software Group Leader, talking with Matt Gerring, Senior Software Developer at Diamond Light Source and Mark Basham, Software Sustainability Institute Fellow and Senior Software Scientist at Diamond Light Source.

This article is part of our series: Breaking Software Barriers, in which we investigate how our Research Software Group has helped projects improve their research software. If you would like help with your software, let us know.

Building a vibrant user and developer community around research software is often a challenge, and managing a large, successful community collaboration that is looking to grow presents its own challenges. The DAWN software supports a community of scientists who analyse and visualise experimental data from the Diamond Light Source. An assessment by the Institute has helped the team not only to attract new users and developers, but also to increase DAWN's standing within the Eclipse community.

The Diamond Light Source is the UK's national synchrotron facility, based at the Harwell Campus in Oxfordshire. By accelerating electrons to near light speed, the synchrotron produces light 10 billion times brighter than the sun. Over 3,000 scientists have used this light to study all kinds of matter, from new medicines and disease treatments to structural stresses in aircraft components and fragments of ancient paintings.

Supporting and developing software for such a diverse community presents a number of challenges. The DAWN team already…

Continue Reading