Alexander Struck

By Alexander Struck, Chris Richardson, Matthias Katerbow, Rafael Jimenez, Carlos Martinez-Ortiz, and Jared O'Neal. Software written to solve research questions is increasingly recognized as a research result in its own right, and it requires evaluation in terms of usability as well as its potential to facilitate high-impact and reproducible research. Indeed, in recent times there has been increased emphasis on the importance of reproducibility in science, particularly of results where software is involved. To tackle this problem, there have been efforts to evaluate the reproducibility of research…
By Stephan Druskat, Jurriaan H. Spaaks, Netherlands eScience Center, and Alexander Struck. To enable attribution and credit for Research Software Engineers, and other developers of and contributors to research software, software must be made citable, and it must be cited. One of the obstacles to correct and comprehensive software citation is the lack, or poor discoverability, of relevant metadata. While papers, for instance, expose their metadata quite obviously (e.g., title, authors, containing publication, publication date), software hardly ever does.
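As a rough illustration of the kind of metadata in question, a minimal CITATION.cff file (the Citation File Format is one common way to ship citation metadata alongside a code repository; the names, version, and date below are placeholders, not taken from the post) might look like this:

```yaml
# Minimal CITATION.cff sketch: machine-readable citation metadata
# placed in the root of a software repository.
# All values below are illustrative placeholders.
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
title: "Example Research Software"
authors:
  - family-names: "Doe"
    given-names: "Jane"
    affiliation: "Example University"
version: "1.0.0"
date-released: "2021-08-11"
```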

By Martin Callaghan, University of Leeds, Daniel S. Katz, University of Illinois, Alexander Struck, Cluster of Excellence Image Knowledge Gestaltung, HU-Berlin, and Matt Williams, University of Bristol.

By Daniel S. Katz, University of Illinois Urbana-Champaign, Robert Haines, Research IT, University of Manchester, David Perez-Suarez, Research IT Services, UCL, and Alexander Struck, Humboldt-Universität zu Berlin.
