Following the recent Nature article "Computational science: ... Error - why scientific programming does not compute", spawned by the Climategate affair, there's another interesting piece from the Nobel Intent blog on Ars Technica: "Changing software, hardware a nightmare for tracking scientific data". Again, the pace of technological advance, so important for making new discoveries, is also forcing us to question whether we can reproduce our past results.
The author notes the difficulty of keeping a fully reproducible analysis pipeline working: software and hardware obsolescence, data decay, and dependency on services provided by others all contribute to the challenge of producing reproducible research.
Some of the issues touched on, e.g. the potential need to keep outdated hardware running, are part of the work we have recently completed with Curtis + Cartwright on the purposes, benefits and approaches of software preservation. Another is the problem that arises when "the focus tends to be on getting the job done, not writing easy-to-maintain or well-commented code", something the SSI is trying to address through our workshops and guides.
It's good to see responsible software development and software sustainability rising up the agenda. Now, how can we tackle these challenges in a funding climate where the emphasis is on short-term impact and efficiency?