By Kayla Iacovino, University of Cambridge
Remember research before computers? I sure don’t. But I’m told that back then, people still used slides to give talks. Figures were hand drawn, and references were found after hours upon hours in a brick-and-mortar library going through stacks of paper journals.
These days, we’ve automated a lot of the more mundane tasks involved in doing science. As someone doing their PhD in 2013, I’ve never had it any other way.
Even among us geologists, where one might suspect the research techniques to be as old as the dirt we study, science moves at a high-tech pace. One of the common tools used to analyse rocks is a type of vibrational spectroscopy known as Fourier Transform Infrared (FTIR*): essentially, infrared light is shone through a sample and interacts with the molecules in its path in a specific way. The spectrum of light that comes out the other end can then be analysed for information about the sample’s chemical makeup.
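The link between a spectrum and chemical makeup is commonly made through the Beer-Lambert law, which relates the absorbance of a band to the concentration of the absorbing species. The sketch below is purely illustrative: the numbers are hypothetical, not values from any real sample or calibration.

```python
# Minimal sketch of the Beer-Lambert law: A = epsilon * c * l,
# so concentration c = A / (epsilon * l).
def concentration(absorbance, molar_absorptivity, path_length_cm):
    """Return concentration (mol/L) from a measured peak absorbance."""
    return absorbance / (molar_absorptivity * path_length_cm)

# Hypothetical numbers for illustration only:
c = concentration(absorbance=0.45, molar_absorptivity=63.0, path_length_cm=0.01)
```

In practice the absorbance fed into a calculation like this is the peak height *above the baseline*, which is exactly where the hand-drawn curves come in.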
This technique is extremely popular due to its relative simplicity and low instrumentation cost, and because it can give vital information pertaining to a vast array of studies on volcanoes, crystal formation, deep sea environments, and even lunar geology.
If you trawl through the literature for examples of the technique, you won’t be surprised to find that, as late as the 90s, FTIR spectra were still being analysed by hand, using a French curve as a guide to draw polynomial baselines on printouts and a ruler to measure the curve values. What you might be surprised to find, however, is that this technique is not a thing of the past. In fact, it is still the most common way to analyse geological FTIR data today.
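What the French curve is doing, in effect, is tracing a smooth baseline under the absorption peaks so that peak heights can be measured above it. That step is straightforward to automate: fit a polynomial through peak-free regions of the spectrum and subtract it. The sketch below uses a synthetic spectrum and hypothetical anchor regions; it is not the method from any particular study.

```python
import numpy as np

# Synthetic spectrum: a gently sloping baseline plus one Gaussian
# absorption peak near 3550 cm^-1 (purely illustrative numbers).
x = np.linspace(2000, 4000, 500)               # wavenumber (cm^-1)
baseline_true = 0.1 + 1e-4 * (x - 2000)
peak = 0.3 * np.exp(-((x - 3550) / 60.0) ** 2)
y = baseline_true + peak

# Fit a polynomial baseline through peak-free "anchor" regions,
# mimicking where an analyst would lay the French curve.
anchors = (x < 3200) | (x > 3900)
coeffs = np.polyfit(x[anchors], y[anchors], deg=3)
baseline_fit = np.polyval(coeffs, x)

# Peak height above the fitted baseline.
corrected = y - baseline_fit
height = corrected.max()
```

The point of automating this is reproducibility: given the same anchor regions and polynomial degree, two analysts get exactly the same baseline, which a hand-drawn curve cannot guarantee.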
As you can imagine, this causes a lot of problems. Humans are prone to subjectivity and the ever-present ‘human error’ factor. Who is to say that the curve I draw will be the same as the curve you would draw? And why did I draw it like that anyway? Might I draw it differently in a month’s time? These are tough questions to answer, but no one’s offered up an alternative method, so it’s the best we can do for now.
My colleagues and I are hoping to take a step toward changing that. We are currently developing a software tool that will analyse FTIR spectra in a precise and meaningful way. Not only will it save printer ink and ward off carpal tunnel syndrome, but it will also provide internal consistency within data sets and limit variation between analysts. What’s more, using first principles to help define our curve fitting algorithms should help us to begin to answer the question, “Why did I draw it that way?”
That’s not to say we’re out to revolutionise a well-established technique by any means. We’ve simply found an obvious place where software could be easily implemented to solve a rather mundane problem faced by scientists in our field.
* FTIR is used in many fields and for many applications. Here, I refer to the analysis of solid samples (e.g. glasses and minerals). The analysis of gases, for example, uses well-established, very complex retrieval programs and does not rely on hand-drawn curves.