ARCHER

The Software Sustainability Institute is organising the “Workshop: Impact of international collaborations in research software”, taking place on Tuesday 24th April 2018, at the Museum of Science and Industry in Manchester.

We welcome poster submissions from UK-based researchers that demonstrate the impact of computational research / research enabled by software. We're particularly looking for examples of how collaboration has benefited your work and will give priority to EPSRC researchers, though all research domains will be considered. The best examples will also be offered a short presentation slot (5-10 minutes) at the event.

Submissions (of no more than one A4 page) should include a short description of your research and the software used, an example of the impact it has had, and the role that collaboration has played in your work.

Please submit your proposal via this form by 26th March 2018.

Register for the event at http://bit.ly/rseimpact

Further information

Earlier this year, EPSRC awarded the Software Sustainability Institute and EPCC funding to support UK-US RSE collaboration and to run a "Best Use of ARCHER" competition. As part of the activity funded by this grant, this event will showcase the impact of the awards and provide a space to discuss opportunities to build on international collaboration.…

Continue Reading

Most of us recognise that diverse teams are good for productivity and output. But do you know how to improve diversity and build a more inclusive environment? Have you ever heard of unconscious bias, stereotype threat or imposter syndrome? Do you ever feel like you aren’t good enough to be in the community or feel like a ‘fraud’? This WHPC event will discuss the real effects of these three topics on the workplace, providing the audience with an introduction to each theme, how they may affect you and how they impact employers, employees, advisors, managers or your peers.

This event will take place on Wednesday 5th April, 10am-3pm (coffee and registration from 9.30am), and will encourage audience participation through discussions based on case studies. The session aims to generate lively discussion and new approaches to solving some of the challenges we face, and we encourage the audience to discuss challenges they have faced both as hirers/managers and as women working in HPC across all research disciplines.

The session welcomes participation from everyone who is interested in improving diversity in the HPC community, across all disciplines.

This event is organised by Women in HPC as part of the ARCHER Outreach project.

Registration is free at Eventbrite. Places are limited, so register early to avoid disappointment.

Focus groups: Alongside this event we will be running 1 hour focus groups that…

Continue Reading

Clouds

By Mike Jackson, Software Architect.

We're helping EPCC and the Met Office promote the uptake, and ongoing development, of the Met Office NERC cloud (MONC) model within the atmospheric sciences community. We're assessing how easy it is to deploy MONC, helping set up a MONC virtual machine and advising on setting up resources for engaging with and supporting researchers.

At the Met Office, weather and climate are predicted using numerical models, in particular the Unified Model (UM). The UM is run on supercomputers to produce high spatial resolution (1 km) forecasts for the UK. These detailed forecasts are valuable to many public institutions and companies.

A vital tool for the development and testing of the UM is the Met Office Large Eddy simulation model (LEM). The LEM is used to simulate atmospheric phenomena, such as fog, clouds and deep convection, at very high resolutions (tens to hundreds of metres). The LEM was first developed in the early 1990s and parallelised in the mid-1990s. While it can be argued that science undertaken with the LEM underpins many of the atmospheric parameterisations in the UM, the LEM can no longer capitalise on supercomputer enhancements, as the code structure and…

Continue Reading

Clouds

We're helping EPCC and the Met Office promote the uptake, and ongoing development, of the Met Office NERC cloud (MONC) model within the atmospheric sciences community. We're assessing how easy it is to deploy MONC, helping set up a MONC virtual machine and advising on setting up resources for engaging with and supporting researchers.

Modelling clouds for weather forecasting

The UK Met Office uses software to create its weather forecasts. This software simulates the behaviour of weather using complex mathematical models. These models can use information about past weather to forecast future weather. The Met Office's best known weather model is the Unified Model (UM), which generates national and international forecasts down to a scale of 1 kilometre. The Met Office also has a number of other models that concentrate on specific aspects of weather. One of these is the Large Eddy Simulation model (LEM) which models clouds, atmospheric flows and turbulence.

The LEM has been developed over the past 30 years, but it is now showing its age: its performance does not improve significantly when run on more than 512 processes, whereas many modern supercomputers have tens of thousands of processors. A consequence of this limitation is that the UK atmospheric sciences community relies on…
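As a rough illustration of why such a plateau appears (Amdahl's law, with hypothetical numbers rather than measurements of the LEM), even a small fraction of non-parallelisable work caps the achievable speedup, so adding processors beyond a point gains almost nothing:

```python
# Amdahl's law: ideal speedup on p processors when a fraction s of
# the work is inherently serial. Illustrative only -- the 2% serial
# fraction below is a made-up example, not an LEM measurement.

def amdahl_speedup(p, serial_fraction):
    """Ideal speedup on p processors for a given serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / p)

# With 2% serial work, speedup can never exceed 1/0.02 = 50x,
# so going from 512 to 10,000 processors changes very little.
for p in (64, 512, 10000):
    print(p, round(amdahl_speedup(p, 0.02), 1))
```

This is why restructuring the code (reducing the serial and communication-bound fraction), rather than simply adding processors, is what lets a model exploit modern machines.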

Continue Reading

The Software Sustainability Institute in collaboration with ARCHER and Women in HPC is organising the first Software Carpentry workshop for Women in Science and Engineering (WiSE) in the UK. The event will take place at the University of Manchester on 14-15 December 2015.

Software Carpentry's mission is to help researchers get more work done in less time and with less pain by teaching them basic lab skills for scientific computing. This hands-on workshop will cover basic concepts and tools, including program design, version control, data management, and task automation. Participants will be encouraged to help one another and to apply what they have learned to their own research problems.

The course is aimed at female postgraduate students, researchers and engineers who are familiar with basic programming concepts (like loops, conditionals, arrays, and functions) but need help to translate this knowledge into practical tools to help them work more productively.

Apart from learning a set of useful skills, participants will have a unique opportunity to network and discuss their experiences of working in computational research. More details and registration.

The Software Sustainability Institute, ARCHER and Women in HPC are planning to run WiSE events on a regular basis at different sites in the UK.

Motorway at night

By Mike Jackson, Software Architect; Andrew Turner, ARCHER Computational Science and Engineering Support Team Leader; and Clair Barrass, ARCHER training administrator.

In 2013, the DiRAC consortium rolled out the DiRAC driving licence, a software skills aptitude test for researchers wanting to use DiRAC's high performance computing resources.

Now, ARCHER, the UK National Supercomputing Service, is to roll out an ARCHER driving test. Despite their similar names, these tests differ in nature, intent, scale and reward. In this post we compare and contrast these two supercomputer tests.

DiRAC Driving Licence

DiRAC is the UK's integrated supercomputing facility for theoretical modelling and HPC-based research in particle physics, astronomy and cosmology. The DiRAC driving licence was originally developed in 2012 by Mike, Greg Wilson of Software Carpentry, and, from DiRAC, Andy, James Hetherington of the…

Continue Reading

Shoot that poison arrow to my hearrrrr-rrrrt...

By Gillian Law, TechLiterate, talking with Prashant Valluri, University of Edinburgh.

This article is part of our series: Breaking Software Barriers, in which Gillian Law investigates how our Research Software Group has helped projects improve their research software. If you would like help with your software, let us know.

There's a difference between writing code and writing good code, says Prashant Valluri, Lecturer at the University of Edinburgh's Institute for Materials and Processes, laughing as he describes how much he learned while working with the Software Sustainability Institute.

Valluri's team has developed code called TPLS (Two-Phase Level Set), for mathematically modelling complex fluid flows. The code aims to provide much more effective computational fluid dynamics (CFD) analysis for academia and industry, by providing efficient multi-phase models, better numerical resolution and efficient parallelisation.

TPLS uses an ultra-high resolution 3D Direct Numerical Simulation approach combined with the Level-Set method for tracking the developing interface between phases. It employs a 2D Message Passing Interface (MPI) process decomposition coupled with a hybrid OpenMP parallelisation scheme to allow scaling to thousands of CPU cores.
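The 2D MPI decomposition described above can be pictured as splitting the x and y dimensions of the 3D grid across a rectangular grid of processes, with each process keeping the full z extent locally (and OpenMP threads working within each block). The sketch below is a hypothetical Python illustration of that partitioning idea only; the function names are ours and the real TPLS code is not Python:

```python
import math

def process_grid(nprocs):
    """Pick the most nearly square px * py factorisation of nprocs
    (similar in spirit to MPI_Dims_create)."""
    px = int(math.sqrt(nprocs))
    while nprocs % px:
        px -= 1
    return px, nprocs // px

def local_block(nx, ny, rank, px, py):
    """x/y index ranges owned by `rank` on a px * py process grid;
    the z dimension is not decomposed, so each rank holds all of z."""
    ix, iy = rank % px, rank // px
    x0, x1 = ix * nx // px, (ix + 1) * nx // px
    y0, y1 = iy * ny // py, (iy + 1) * ny // py
    return (x0, x1), (y0, y1)

px, py = process_grid(12)              # 3 x 4 process grid
print(px, py)
print(local_block(256, 256, 5, px, py))  # rank 5's x and y ranges
```

Because only x and y are split, each process exchanges halo data with at most four neighbours, which keeps communication patterns simple as the core count grows into the thousands.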

Continue Reading

By Mike Jackson, Software Architect; Iain Bethune, EPCC, The University of Edinburgh; Lennon Ó Náraigh, School of Mathematical Sciences, University College Dublin; and Prashant Valluri, Institute of Materials and Processes, School of Engineering, The University of Edinburgh.

Mathematical modelling of complex fluid flows has practical application within many industrial sectors, including energy, the environment and health. Flow modelling can cover oil and gas flows in long-distance pipelines or refinery distillation columns, liquid cooling of micro-electronic devices, carbon capture and cleaning processes, water treatment plants, blood flows in arteries, and enzyme interactions. Multi-phase flow modelling simulates flows consisting of gases, fluids and solids within a single system, e.g. steam and water or oil and gas within a pipe, or coal dust in the air.

Simulations of this sort are highly computationally intensive, so high-performance computing (HPC) resources are required. However, current commercial computational fluid dynamics (CFD) codes are limited by a lack of efficient…

Continue Reading

Parallel sustainability with TPLS 

A detailed interface structure from a high flow-rate simulation showing both linear and non-linear features, including 'pinch-off' of droplets

Mathematical modelling of complex fluid flows has practical application within many industrial sectors, including energy, the environment and health. Flow modelling can cover oil and gas flows in long-distance pipelines or refinery distillation columns, liquid cooling of micro-electronic devices, carbon capture and cleaning processes, water treatment plants, blood flows in arteries, and enzyme interactions. Multi-phase flow modelling simulates flows consisting of gases, fluids and solids within a single system, e.g. steam and water or oil and gas within a pipe, or coal dust in the air.

Simulations of this sort are highly computationally intensive, so high-performance computing (HPC) resources are required. However, current commercial computational fluid dynamics (CFD) codes are limited by a lack of efficient multi-phase models, poor numerical resolution and inefficient parallelisation features. This severely restricts their application within both academia and industry. Industry, for example, continues to rely on empirical modelling and trial-and-error pilot-scale runs, which incur significant capital costs and delays before commissioning.

TPLS (Two-Phase Level Set) is a CFD code developed by Prashant Valluri of the Institute of Materials and…

Continue Reading

With the launch of its new ARCHER supercomputing service, EPCC is recruiting. Founded at the University of Edinburgh in 1990, EPCC is a leading European centre of expertise in advanced research and technology transfer. We are currently advertising three roles in a variety of areas.

The first role is as an Applications Developer - Computational Science and Engineering, working in the Computational Science and Engineering team, which provides in-depth technical support on the UK national HPC service, ARCHER. The role's main responsibility is to provide programming and software engineering solutions for projects using ARCHER, but it also involves working on other EPCC development projects.

The second role we have available is for an Applications Developer - Data Science. This involves working as a programmer, analyst and technology expert in the broad area of large-scale distributed data systems. It also requires helping domain scientists in a variety of disciplines to learn more from digital data. Our projects range from exploring experimental computing platforms for data-intensive analysis through to “big data” collaborations…

Continue Reading