In the games industry – especially in the UK – big employers have spent the past 10 years claiming there is a “skills gap”: that not enough people are being “trained by universities” (which shows how little the speakers understood; universities don’t do training, and most never will – it’s against the core principle of a university). Meanwhile I’ve been counter-claiming that they’re making this up, that there’s no “gap”, and that they know this full well – they just want an excuse to artificially pay lower wages than their staff deserve.
Now someone’s published a book on the topic. Unlike my straw-poll arguments, this has actually been researched :), so it may be a lot more convincing. I haven’t read it yet, but this interview with the author has enough juicy details to have me convinced it’ll be a good read.
For instance, here’s a segment on “how does an employer start with 25,000 candidates for a role, and then declare that there exists no-one suitable?”:
“…and the way screening works is you build in a series of typically yes/no questions that try to get at whether somebody has the ability to do this job. And a lot of that ultimately – it’s all you can ask about – is experience and credentials. So you end up with a series of yes/no questions, and you have to clear them all. I think the people building these don’t quite understand that once you have a series of these yes/no questions built in – and the probabilities are cumulative, right? You have to hit them all – then you pretty easily end up with no one who fits.
So say that the odds are 50 percent that the typical applicant will give you the right answer, in terms of what you’re looking for, to the first question, and 50 percent that they’ll give you the right answer to the second question. Well, then, you’re down to one in four people who will clear those two hurdles, and once you run it out to about 10 questions, it gets you down to about one in 1,000 people [ADAM: i.e. on statistics alone – independent of quality etc.!] who would clear those hurdles.
… the first hurdle is usually, What wage are you looking for? And if you guess too high, out that goes, right?
… at the end of the day, you find that nobody fits the job requirement.”
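The arithmetic in the quote above is just repeated multiplication of independent pass rates. Here’s a minimal sketch (the function name and numbers are mine, chosen to match the interview’s example of ten 50/50 questions applied to 25,000 applicants):

```python
# Each independent yes/no screening question with pass rate p multiplies
# the surviving fraction of candidates, so ten 50% filters pass only
# 0.5**10 = 1/1024 of the original pool -- regardless of quality.

def survivors(applicants, pass_rates):
    """Expected number of candidates clearing every screening question."""
    fraction = 1.0
    for p in pass_rates:
        fraction *= p
    return applicants * fraction

# 25,000 applicants, ten questions each with a 50% pass rate:
print(survivors(25_000, [0.5] * 10))  # ~24 candidates survive (1 in 1024)
```

Note that even a pool of 25,000 is whittled down to a couple of dozen by pure statistics – and if any single question (like the opening salary question) has a much lower pass rate, the expected survivors can easily round down to zero.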
6 replies on “There IS no skills gap; employers are lying to themselves”
I was just reading Inc. the other morning and came upon this article all about the problems companies build into their hiring processes: http://www.inc.com/magazine/201111/the-four-worst-hiring-mistakes.html
Ha! I love the tagline: “The problem might be you” :)
For anyone thinking of following that link, a quick summary:
– Without a deliberate hiring strategy, founders often gravitate toward job candidates who share their personality.
– Wonder why it’s so hard to find good people? Maybe you’re asking too much.
– So what if you make a hiring mistake? Here’s how to beat analysis paralysis.
…sounds like a good start to me, along the route to analysing your hiring success/failure so far…
Most of the universities in the UK that teach “computer games” are former polytechnics. They have a tradition of training rather than education. Rather than viewing their students as their customers, they view industry as their customer. This contributes to the sense of entitlement that industry has.
If the games industry wants good programmers (say), it should pay the kind of salaries good programmers can expect elsewhere. You don’t have to love games to program them; you have to love programming. Good software-engineering graduates can expect a starting salary elsewhere that’s 150% of what the games industry pays. Those who nevertheless do apply to the games industry are lost in the mass of applications from chop-shop universities.
I’m not persuaded that the games industry needs “skills” per se from graduates anyway. Yes, applicants do need to demonstrate core competencies (that is, show that they can learn the necessary skills), but every company has different requirements so everyone will need on-the-job training regardless. It’s better that they show interest, enthusiasm, knowledge and imagination than that they have already-outdated skills.
As someone else pointed out: boom/bust cycles in tech “buzzword” skills tend to last about 3-4 years.
So … whatever you decide to teach today, because it’s the Next Big Thing … will be starting to collapse *just* in time for this crop of students to graduate.
OTOH, I remember being very frustrated that my Uni taught almost no skills at all (mostly because the faculty didn’t possess many – they were primarily pure academics, chosen for their illustrious research backgrounds). Of the few they did teach, one was programming (in some depth) in Java – which proved very useful. They chose not to teach design patterns (which hugely surprised me at the time), and I think in general they probably worked TOO hard at avoiding skills-based teaching; they could have achieved a lot with just a little more skills content in their curriculum.
There most certainly IS a skills gap. It’s not with the Universities, though.
UK companies are to blame: they’re notoriously bad at training, especially ongoing training.
And well, one of the reasons I get work as a VL (as a designer) is that they want someone who isn’t a pure academic. The thing is, the age of universities not teaching skills is going out the window in most subjects – it’s not going to be tolerated by people paying as much as they now are.
I think that’s fair. The only gap that exists is an internal gap generated on the *employer’s* side.
There’s no “general” gap.