CW16 Hackday

The winner of the Collaborations Workshop 2016 Hackathon was: Research software sentiment analyser (Carl Wilson, Raquel Alegre [team leader], Sinan Shi, Gary Macindoe, David Perez-Suarez, Olivier Phillipe; Idea, source code, presentation and video). 

The runner-up was MatchMakedemia - research-focussed matchmaking for local academics + RSE speed dating (Olivia Wilson, Angus Maidment [team leader], Anelda van der Walt, Rob Dunne; Idea, source code, presentation and video). 

Research software sentiment analyser team

Other groups

Panoramic view of the room during the Hackday


Paper Hackathon: Quality Assurance Practices in Research

Motivating Idea and video.

Team members:

  • James Hetherington
  • Oliver Laslett
  • Vince Knight
  • James Davenport
  • Steve Lamerton

How to Write a Comparative Software Review

Motivating Idea, draft, presentation and video.

Team members:

  • Ross Mounce
  • Richard Domander
  • Neil Chue Hong
  • Amy Beeston
  • James Baker

Should I cite the software?

Motivating Idea, source code, webpage and video.

Team members:

  • Michael R. Crusoe
  • Louise Brown
  • Dmitrijs Milajevs
  • Iza Romanowska

Research Portfolios driven by ORCID, customized by users + Altmetric Narrative Creator

Motivating Idea, source code and video.

Team members:

  • Niall Beard
  • Raniere Silva
  • Olivia Guest
  • Melodee Beals
  • Jens Nielson
  • Benjamin Laken
  • Heather Ford

Lesson Pre-requisite Manager

Motivating Idea, demo and video.

Team members:

  • Swithun Crowe
  • Larisa Blazic

Crediting closed-community software

Motivating Idea, source code and video.

Team members:

  • Laurence Billingham
  • Mayeul d'Avezac
  • Martin Hammitzsch
  • Steve Harris
  • Craig MacLachlan

Open Source Software Guidelines

Motivating Idea and video.

Team members:

  • Mateusz Kuzak
  • Steve Crouch

You could work all night and all day at the CW16 Hackday, but you might be interested in knowing how you are going to be judged. If so, this page is for you!

Background

Each idea/pitch needs to be presented and registered by the evening of Tuesday 22 March 2016 to be officially part of the CW16 Hackday (HD). When registering your idea/pitch, you will be asked about the team leader, details of what you plan, and the category of your idea/pitch (e.g. Software and Credit, Data/Code Sharing, Reproducible Research, Bring-Your-Own-Data (BYOD), Collaborative Working, etc.). Note that ideas need not be about writing software: they could be standards-related, paper hackathons, or some other research-software-related activity.

Each idea will have a Team leader; the leader could be the idea owner, the pitch author/leader or someone who has decided to form a team around someone else’s idea/open data.

Each team can include a maximum of six people who are not Institute staff. We recommend a minimum of two people per team. There is a limit on the number of teams: if there are more than 15, preference will be given to the larger teams (not to whoever registered first). If a team becomes too big, we may ask you to split into two teams working on the same pitch/idea.
Once your team is formed towards the latter part of the evening of 22nd March, we strongly suggest you have a free and frank discussion with your team about the licensing of the code and data being used, so that it is clear upfront.

Judges

The judges for the HD will be:

  • Robin Wilson, Software Sustainability Institute Fellow, University of Southampton
  • Catherine Jones, STFC
  • Martin Callaghan, University of Leeds
  • Rob Davey, Institute Fellow, The Genome Analysis Centre
  • Simon Hettrick, Software Sustainability Institute
  • Aleksandra Pawlik, Software Sustainability Institute
  • Shoaib Sufi, Community Leader, Software Sustainability Institute

Judges will visit each team during the following day to see how they are doing, offer advice, and ask questions. They may also prod you to start writing up your presentation and thinking about your demo of the HD work, to offset the ‘just one more commit’ urge that can set in as presentation/demo time approaches. Judges may take notes on the teams they visit to help assess teamwork, and may visit you more than once during the day.

All decisions of the judges with regards to marking and prize giving are final and neither they nor the Institute will entertain any appeals.

Criteria

What follows are the criteria for how your HD entry will be judged.

During the five-minute presentation of your HD work, each team must show how they address the criteria. Failure to do so might prevent a good entry from getting a good score during its assessment.

Each category will be scored from 0 (lowest) to 10 (highest); weighting may be applied to the categories, but the judges will decide on this during their meeting on the day of the HD.

  1. Novelty, creativity, coolness and/or usefulness

Can you clearly define the problem being solved and how you are trying to solve it?

Are you doing something new, better, slick or really useful to yourself or others?

Is your solution purely self-serving, or is it enabling in some other way? To get the best marks during assessment, you need to explain how your HD entry benefits a wider community of potential users/developers.

The advice here is indicative; other justifications in this space are welcome (within the constraints of presenting).

  2. Implementation and infrastructure

Are you following research software best practice for the use of infrastructure? Is a source code repository being used? Is there documentation? Are appropriate services and infrastructure being used (e.g. cloud computing, databases)?

If you are building on existing work, it’s essential that you are clear about what was done at the HD in terms of adding features and functionality etc. (If this is not clear you will lose marks).

Does your solution work for the stated purpose - can this be shown during the demo?

If your team is developing a standard, are you using collaborative techniques and tools to allow contribution from the whole team?

For paper hackathons involving presentation of data or analysis, are you using reproducible frameworks for the paper authoring?

For other research software related hacks, is it clear you are using best practice in the construction of the work?

  3. Demo and presentation

Did the presentation and demo show how your hack has fulfilled the judging criteria?

Did your team communicate the essence of why they did what they did and why it was important?

If your team were demonstrating results (e.g. from an analysis), were they appropriate for the data chosen?

  4. Project transparency

Was your source code available in an open repository at presentation time? Teams may choose to work open or work closed; if you decide that you want a publication from this work, for example, you may choose to be open about your methods but not your data. However, being able to build on each other's work during the HD will be viewed favourably.


Ideally your repository should contain a README covering configuration, build and run instructions, along with a brief description of the project and what the software/scripts do, and a license.

These criteria may not be directly relevant for certain categories of entry; in this case other aspects of transparency and openness will be used as decided upon by the judging panel.

  5. Future potential

Was it clear how your work could be taken forward in the future? Could it modify existing work, or become part of a new paper, initiative or bid?

Were ideas of future steps provided?

Was it merely fun, or did the idea show long-term usefulness?

  6. Teamwork

Was your team led well? Was the leader able to involve all interested team members?
Were non-technical members directed towards meaningful contributions, e.g. documentation, testing, usability and logo design in the case of more software-related hacks?

Did your team’s software practices support synchronised working and decrease duplication? Did your team achieve more together than would have been possible separately?

Was your team atmosphere healthy: disagreements are fine, but were they conducted agreeably?

Did it appear enjoyable and/or fun to be part of your team?

 

It's not the winning, it's the taking part that counts! At the Institute we broadly agree with that sentiment; however, there is nothing quite like competitions and prizes to liven things up.

At Collaborations Workshop 2016, the first prize for each member of the winning team was a Raspberry Pi 3 kit containing the official case, official PSU, 16GB NOOBS microSD card, touch screen, touch screen case, camera module and Astro Pi HAT; the second prize was a Raspberry Pi 3 kit with the official case, official PSU and 16GB NOOBS microSD card.