In the Kansas City Star recently, an educator published an editorial suggesting that all students graduating from college these days should be “job ready.” The educator argued that the current college curriculum was designed for the “gentlemen” of the 19th and 20th centuries. But this is the 21st century: the cost of higher education has skyrocketed, and given the pace of technological change, the shrinking of our job markets, and the new educational options (online courses, etc.), our colleges, universities, and institutes have to radically reduce their costs and produce “job-ready” graduates.
Now, I have to admit that institutions of higher education in the US have many problems. They face much more competition for students, yet they keep raising their prices, driving prospective students to community colleges, for-profit schools, and online courses. Moreover, they have skimped on their most important resource: their faculty. In many large institutions, a large part of the core curriculum is now taught by part-time, underpaid adjunct professors or teaching assistants. But I’ll leave the question of the quality of instruction for another time; right now I want to focus on content.
When I went to school, there were practically no computer science (or even programming) courses offered at my school or, for that matter, at any but the most advanced schools. I was a mathematics major, and while that by no means prepared me for a 40-plus-year career in IT and systems design, my liberal arts education did. And so did my three years in graduate school studying philosophy. What the liberal arts/mathematics/philosophy education taught me was how to think. And that is, I am sure, one of the reasons I’ve been able to survive and prosper in a field that is constantly changing.
Once, at a computing conference, I happened to sit in on a speech on technical education by a researcher. While I’ve forgotten his name, I will never forget what he said. He said that if we are not careful, we will spend all of our training/education efforts producing a kind of “perishable competence.” Over the years, I have used this quote many, many times, and it remains as relevant today as it was two decades ago.
Very few technologies have come so far so fast as computing/software has in the last 70-plus years. Spurred first by defense concerns and then by industrial ones, computing/software/Internet technology has been changing faster than anyone, even the most optimistic scientists, could possibly have imagined. What has transpired over the last few decades has been mind-boggling. Nearly everyone in the developed world carries around, or has access to, technologies that were once science fiction. Moore’s law, a more or less offhand prediction, has been overtaken time and again. And Moore’s law represents only a small portion of what has happened in most of our lifetimes. If we simply teach young people the current technology (e.g., Python, Java, and C++), what will they be programming 15 or 20 years from now? What will the technologies be then and, more than that, what will be the opportunities and challenges?
In the brave new world of the 21st century, how much help will today’s “perishable competence” based on Python or Java be? I’m sure that hacking won’t be enough, unless the hackers can educate themselves to think more broadly, widely, and deeply. And if programmers have been lucky enough to have a background in mathematics, logic, history, philosophy, science, and perhaps even art and music, who knows what they may be able to come up with? What will some new-age Steve Jobs think up? How will we integrate great new ideas into our increasingly complex technological world unless we understand things like “systems dynamics”? I suspect “job ready” may just not do the job.