What's wrong with computer scientists?

By Simon Hettrick, Deputy Director.

Over the last few years, HESA's study of graduate careers has given computer science the unwelcome honour of the lowest employment rate of all disciplines. Last year, about 14% of computer science graduates were unemployed six months after graduation. So what's wrong with computer scientists?

We will soon be attending a strategy meeting on the future of UK computer science degrees, and we want to represent your thoughts on this problem. If you have any ideas or arguments, please comment below, email us or tweet with the hashtag #wwwcs.

Industry has reported unfulfilled demand for computer science positions, which seems odd given the surfeit of computer scientists available. It's not yet clear whether these positions are being offered to graduates from other disciplines or simply left vacant. There have also been complaints from industry that graduates lack the skills needed for the workplace, but these are general statements that do not single out computer science.

Two interesting explanations for the low employability of computer scientists are raised in a response to the HESA study from the Council of Professors and Heads of Computing (CPHC).

Computer scientists are far more likely than other graduates to study at post-92 universities: 64.4% of computer scientists study at post-92 institutions, whereas only 13% study in the Russell Group. In other words, the CPHC argues that the lower employability of computer scientists is actually a symptom of industry choosing graduates from more established universities.

Another important factor is that computer science has been very successful in attracting black and minority ethnic (BME) students: around 56% more BME students study computer science than the average for all courses (see table 6 of the CPHC report). On this basis, the CPHC argues that the lower employability of computer scientists is in part due to an equally troubling problem: the lower than average employment rates of BME students. (This point is convincingly argued in a recent Guardian article.)

The problem of computer science employability has reached the very top of the academic hierarchy. David Willetts, the Minister for Universities and Science, will chair a workshop next week on the future of computer science graduates. It's being attended by Vice Chancellors, CEOs of leading ICT recruiters, and the Software Sustainability Institute. If you think the employability debate is missing a fundamental point, just let us know (by commenting below, emailing us or tweeting with the hashtag #wwwcs) and we will raise the issue at the workshop.

Further information

A summary of the 2011/2012 career study and the data related to employability is available on the HESA website. It should be noted that computer science, as far as the study is concerned, means the amalgamation of courses on computer science, information systems, software engineering, artificial intelligence and the rather ambiguously titled "others in computing science".

Posted by s.hettrick on 31 October 2013 - 1:07pm

Submitted by Anonymous on 31 October 2013 - 3:13pm


It doesn't help much that Comp Sci is very watered down. At a lot of unis it basically consists of design theory, Java and UML. Compare that with Computer Systems Engineering, in which (at the university I went to) you're expected to write assembler for at least three different architectures while you're there, and to build small amounts of hardware from scratch (designing the PCBs and putting it all together). Comp Sci can't compete.

Submitted by Anonymous on 31 October 2013 - 7:56pm


I think most of the problem is that most job posts seem to be written by someone who doesn't know CS. It's most off-putting when they want someone who is highly knowledgeable and they seemingly have no idea themselves. See - http://dawood.in/if-carpenters-were-hired-like-programmers/

Submitted by Anonymous on 1 November 2013 - 11:42am


In my experience of being interviewed for jobs, there is a major focus on experience, even for jobs that I feel don't need it. As a recent graduate, I lost out on a data entry job to someone with more experience. Yet when I was interviewed, the duties described were easier and required far less effort than I had expected, and it was confusing that they thought they needed someone with more experience to do the job.

Submitted by Anonymous on 2 December 2013 - 10:10pm


So I recently graduated with a masters in Computer Science. It included UML, Java, C++, AI and algorithms, so quite a broad sweep of computer science. The issue I found on leaving university is that graduate positions in CS are often limited. More experienced programmers, and people with specialisms, seem to be in much higher demand. For example, if I am interested in going into iOS development, the best way in is to do it as a project and to try to get an internship in that area. Unfortunately, even then this often isn't enough. I found it is rare for a company to take on a graduate. In the startup world there are many programming jobs, but very few for graduates: most want people with five years' experience.

Submitted by Anonymous on 30 December 2013 - 11:48am


I'm a university lecturer in Computer Science; I teach a second-year module in programming. I've spent my Christmas "holidays" marking the end-of-term test I set for my students. Every year it's the same: no more than a third of them show the sort of ability I would want in anyone doing a coding job. One third of them are so poor at programming that you would be surprised to hear they had spent more than a couple of weeks supposedly learning about it, never mind that they are half-way through a degree in it. This is the hidden issue, which academics moan about in private but don't like to say in public. It's not just me or where I teach; the same occurs everywhere. If you really test students on decent programming skills, you get a huge failure rate. In this country it's thought bad to fail students, so mostly we find ways of getting them through even though they don't really have the skills.

My advice to any employer looking for real programming skills: don't just look at the degree class, look at the module grades, and don't take on anyone whose 2:i or 1st has been gained by good grades in other modules balancing weak grades in the programming modules. But even with the programming modules, check how the assessment was done.

One issue does seem to be natural ability: some have got it, some haven't. We've not found a good way to work out who has it when we recruit students, and league-table pressure now means we are forced to recruit the students with the highest A-level grades regardless of anything else, and A-level grades certainly are NOT a good test of natural aptitude for coding. There's some correlation with the grade in A-level Maths, but that's about it.
However, these days all the pressure is to recruit high-grade students because that pushes us up the league tables, so central admin won't let us take on applicants with reasonable A-level Maths and outside interests that suggest genuine coding ability in preference to applicants who simply have higher grades in other subjects.

Another issue is lack of student discipline. Too many of them just don't realise how to learn this stuff properly: you have to practise, practise, practise. Too many are stuck in the mentality that to do well all that is needed is to "revise" in the week before the exam. Unfortunately, we have a culture which gives the impression that tests and exams are "memory dumps". I keep telling my students it isn't like that for our subject; they don't believe me. I see it in their test and exam answers, where you so often find something that completely misses the point, written down because it's something they've memorised and dumped on paper in response to a trigger word in the question.

Note also that if the assessment is more through coursework than exams, a large proportion of what is submitted will not be the students' own work. It may not be actual copying, but it is often "we discussed it together", i.e. one of them did the work, and the others made cosmetic fiddles and claimed they were just helped by the discussion. Or some friend or relative did it for them, and so on. So I would say: if assessment is not largely exam-based, don't take them on even if they do have high grades in the programming modules.

This is also why I have no faith in MOOCs, as most of them seem to have assessment based on memorisation and multiple choice rather than deep analysis of real skills. Sure, you can automate the marking of really simple introductory programming exercises, but beyond first-year coding you can't, because automated marking won't test subtle issues such as good style or solving a problem in a particular way.
I set tests and exams with an emphasis on writing actual code and showing deep understanding, but it doesn't make me popular, and it doesn't lead to a high pass rate. Many others succumb to the pressure and set tests and exams with questions which are multiple-choice and centre on memorising definitions rather than solving coding problems. That makes them MUCH easier to mark, and you get a higher pass rate as well. If I had done that, I wouldn't have my wife nagging me "It's Christmas - why are you doing work?", I wouldn't have my boss nagging me "Why does your module have such a high failure rate?", and I wouldn't have my students nagging me "Why are you so tough on us?".

While I don't have any experience of teaching computer science in schools, I do have experience as a former undergraduate computing student, at the post-graduate level, and professionally in IT. Basically, the education system has made its own circumstances through short-term thinking and trying to maintain its monopoly. In a highly technical industry, where outcomes and quality are constantly assessed, performance is what counts. To be employable, you have to have skills that allow you to perform tasks efficiently and reliably. Training therefore needs to educate students in the theory and practice of performing those tasks, and assessment needs to test students on exactly that. MOOCs are a poor form of assessment: it's easy to achieve good grades on lots of MOOC courses without having a clue, or any deep knowledge. However, MOOCs are a source of learning which, for motivated students, provides the groundwork for a more formal curriculum and assessment. While formal higher education is failing to provide the structure that industry requires, eventually the MOOCs will have the cash resources, and the skills, to implement a parallel (or replacement) learning pathway, away from the hypocrisy and incompetence of the university academic system.

Submitted by OwlHoot on 23 July 2014 - 7:13pm


The basic problem is not an IT skills shortage but a shortage of British people willing to work for a minimum wage. Knowing that this is all they can typically expect has naturally discouraged many practical people from studying Comp Sci. So more and more FTVs (fast-track visa wallahs) are allowed in, which drives down salaries even further in a vicious circle and makes the UK ever more dependent on foreign IT (and associated commercial and technological) skills.

Rubbish. If you are any good at software, you will command much more than minimum wage. If even a raw graduate joins us to program, they will start on £32k basic. Sadly, of the last batch of 52 applicants, only one made the grade. There was one near miss, but the other 50 were frankly rubbish.