• September 4, 2015

Decoding the Value of Computer Science


Michael Morgenstern for The Chronicle

In The Social Network, a computer-programming prodigy goes to Harvard and creates a technology company in his sophomore dorm. Six years later, the company is worth billions and touches one out of every 14 people on earth.

Facebook is a familiar American success story, with its founder, Mark Zuckerberg, following a path blazed by Bill Gates and others like him. But it may also become increasingly rare. Far fewer students are studying computer science in college than once did. This is a problem in more ways than one.

The signs are everywhere. This year, for the first time in decades, the College Board failed to offer high-school students the Advanced Placement AB Computer Science exam. The number of high schools teaching computer science is shrinking, and last year only about 5,000 students sat for the AB test. Two decades ago, I was one of them.

I have never held an information-technology job. Yet the more time passes, the more I understand how important that education was. Something is lost when students no longer study the working of things.

My childhood interest in programming was a product of nature and nurture. My father worked as a computer scientist, first in a university and then as a researcher for General Electric. As a kid, I tagged along to his lab on weekends, watching him connect single-board DEC computers into ring networks and two-dimensional arrays, feeling the ozone hum of closet-sized machines. By my adolescence, in the mid-1980s, we had moved to a well-off suburb whose high school could afford its own mainframe. That plus social awkwardness meant many a night plugged into a 300-baud modem, battling other 15-year-olds in rudimentary deep-space combat and accumulating treasure in ASCII-rendered dungeons without end.

Before long I wanted to understand where those games came from and how, exactly, they worked. So I took to programming, first in Basic and then Pascal. Coding taught me the shape of logic, the value of brevity, and the attention to detail that debugging requires. I learned that a beautiful program with a single misplaced semicolon is like a sports car with one piston out of line. Both are dead machines, functionally indistinguishable from junk. I learned that you are good enough to build things that do what they are supposed to do, or you are not.
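The fatal single character is easy to demonstrate. Here is a small Python illustration of the point (my example; the article's languages were BASIC and Pascal, but the lesson is language-independent): `compile()` checks a program's syntax without running it, and dropping one character turns a working program into a dead machine.

```python
# A working one-line program, and the same program with one
# character missing -- the programmer's "misplaced semicolon."
good = 'print("hello")'
bad = 'print("hello"'   # one closing parenthesis dropped

compile(good, "<demo>", "exec")   # compiles cleanly

try:
    compile(bad, "<demo>", "exec")
except SyntaxError:
    print("dead machine, functionally indistinguishable from junk")
```

The broken version is 93 percent identical to the working one, and worth exactly nothing.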

I left for college intending to major in computer science. That lasted until about the fifth 8:30 a.m. Monday class, at which point my enthusiasm for parties and beer got the upper hand, and I switched to the humanities, which offered more-forgiving academic standards and course schedules. I never touched Pascal again.

Fortunately, other students had more fortitude. They helped build and sustain the IT revolution that continues to make America the center of global innovation. But the number of people taking this path is in decline. According to the Computing Research Association, the annual number of bachelor's degrees awarded in computer science and computer engineering dropped 12 percent in 2009. When Zuckerberg started Facebook, in 2004, universities awarded over 20,000 computer degrees. The total fell below 10,000 last year.

This "geek shortage," as Wired magazine puts it, endangers everything from innovation and economic growth to national defense. And as I learned in my first job after graduate school, perhaps even more.

There, at the Indiana state-budget office, my role was to calculate how proposals for setting local school property-tax rates and distributing funds would play out in the state's 293 school districts. I did this by teaching myself the statistical program SAS. The syntax itself was easy, since the underlying logic wasn't far from Pascal. But the only way to simulate the state's byzantine school-financing law was to understand every inch of it, every historical curiosity and long-embedded political compromise, to the last dollar and cent. To write code about a thing, you have to know the thing itself, absolutely.
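To make the shape of that work concrete, here is a deliberately toy sketch in Python of the kind of simulation the job required. The formula, district names, and numbers are all invented for illustration; Indiana's actual statute was, as described above, far more intricate, and the original work was done in SAS.

```python
# A hypothetical, drastically simplified "foundation" funding formula:
# the state fills the gap between a per-pupil spending target and the
# revenue a district is expected to raise locally. This is an
# illustration of the idea, NOT Indiana's actual school-finance law.
def foundation_grant(enrollment, per_pupil_target, expected_local_levy):
    """Return state aid: the funding target minus expected local revenue."""
    target = enrollment * per_pupil_target
    return max(0.0, target - expected_local_levy)

# Two made-up districts: one property-poor, one property-rich.
districts = [
    {"name": "Districtville", "enrollment": 1200, "levy": 4_000_000},
    {"name": "Richton",       "enrollment": 300,  "levy": 2_500_000},
]
for d in districts:
    aid = foundation_grant(d["enrollment"], 5000.0, d["levy"])
    print(d["name"], aid)
```

Even in this toy version, the code is only as good as the policy knowledge behind it: every branch of the real statute, every grandfather clause and political compromise, had to be represented before the simulation meant anything.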

Over time I became mildly obsessed with the care and feeding of my SAS code. I wrote and rewrote it with the aim of creating the simplest and most elegant procedure possible, one that would do its job with a minimum of space and time. It wasn't that someone was bothering me about computer resources. It just seemed like the right thing to do.

Eventually I reached the limit of how clean my code could be. Yet I was unsatisfied. Parts still seemed vestigial and wrong. I realized the problem wasn't my modest programming skills but the law itself, which had grown incrementally over the decades, each budget bringing new tweaks and procedures that were layered atop the last.

So I sat down, mostly as an intellectual exercise, to rewrite the formula from first principles. The result yielded a satisfyingly direct SAS procedure. Almost as an afterthought, I showed it to a friend who worked for the state legislature. To my surprise, she reacted with enthusiasm, and a few months later the new financing formula became law. Good public policy and good code, it turned out, go hand in hand. The law has reaccumulated some extraneous procedures in the decade since. But my basic ideas are still there, directing billions of dollars to schoolchildren using language recorded in, as state laws are officially named, the Indiana Code.

A few years later, I switched careers and began writing for magazines and publications like this one. It's difficult work. You have to say a great deal in a small amount of space. Struggling to build a whole world in 3,000 words, I thought back to stories my father would tell of consulting on the Galileo space-probe project. To make it work with 1970s technology and withstand the rigors of deep space, he and his fellow programmers had to fit every procedure—"take pictures," "fire rockets," "radio NASA back on Earth"—into a machine with 64K bytes of what we now call RAM. That's about 1/60,000th of what you can buy in a cheap laptop at Best Buy today. So every program on Galileo was polished to a gleam.

Good editors know there is no Moore's law of human cognition that would double our ability to retain and process information every 18 months. Instead, the technology-driven surfeit of modern information has made the need for clarity and concision more acute.

My editors rarely said a word about words, in the sense of how to phrase an idea. The real work was in structure, in constructing an unbroken chain of logic, evidence, and emotion that would bring readers to precisely where I wanted them to be. I learned to minimize the number of operating variables (people). I also learned that while some talented writers can get away with recursive scene-setting, it is usually best to start at the beginning and finish at the end. I discovered that skilled writers, like programmers, embed subtle pieces of documentation in their prose to remind readers where they are and where they are going to be.

Writing, in other words, is just coding by a different name. It's like constructing a program that runs in the universal human operating system of narrative, because, as Joan Didion once said, we tell ourselves stories in order to live. Authors adhere to a set of principles as structured in their own way as any computer language. Publications worth writing for insist on Galilean standards of quality and concision.

Computer science exposed two generations of young people to the rigors of logic and rhetoric that have disappeared from far too many curricula in the humanities. Those students learned to speak to the machines with which the future of humanity will be increasingly intertwined. They discovered the virtue of understanding the instructions that lie at the heart of things, of realizing the danger of misplaced semicolons, of learning to labor until what you have built is good enough to do what it is supposed to do.

I left computer science when I was 17 years old. Thankfully, it never left me.

Kevin Carey is policy director of Education Sector, an independent think tank in Washington.


1. 11122741 - November 08, 2010 at 11:35 am

As a cognitive psychologist and someone who teaches cognitive psychology and education, all I can say is: great article, Kevin, you hit all of the nails on the head, and this article is no kludge. There are so many things that I really cannot teach this generation and that they have little understanding of, as they simply have no concept of how things work ...really work ...they do not know things experientially and absolutely; they are slowly becoming the Eloi in Wells's novel The Time Machine. It is one of the real downsides of miniaturization, automation and "appliance" technology in all areas. Programming should be the new Latin.

2. inarchetype - November 08, 2010 at 04:03 pm

Outstanding article. One of the best I've read lately.

Computer programming came easily when I took my first course in it as an undergraduate, because I had first taken a course in the philosophy department called symbolic logic, composed entirely of writing Fitch proofs. I have long believed that everyone should be required to take symbolic logic, for all the reasons you discuss, in that it is more general than computer programming.

But computer programming offers most of the same benefits, in a package more accessible to those who have less tolerance for abstraction and less patience with strictly academic pursuits. It is hands-on, useful, fun, and you get to see what you made working. But if your logic is flawed, your program won't work, so it inherently trains rigor and coherence in a way that less structured activities do not.

So I agree with the anonymous first poster. I used to think that formal logic should be the new Latin, but am now convinced that programming has the edge.

The only problem is that, if taught with any rigor, it seems to be the sort of thing that some people simply can't do; there is a strongly bimodal distribution of performance in programming courses. Typically, those who really aren't cut out for it, whose brains simply aren't wired that way, simply choose other things to do. If training of this sort were made universal, I fear that giving everyone a fair shot at success would require dumbing it down such that it would lose much of its value.

3. fizmath - November 08, 2010 at 08:51 pm

There is no geek shortage. That is industry propaganda to scare Congress into raising the annual cap on H1-B visas. Microsoft and Oracle reject 98% of their applicants. The industry is doing everything it can to offshore IT jobs. Look up the writings of Norm Matloff for a better explanation of this issue.

Other than that, learning how to program is very beneficial, even if you won't get a job with it.

4. robertwgehl - November 09, 2010 at 09:33 pm

I would love to see far more connection between computer science/software engineering and writing. I recently read Fred Brooks's classic The Mythical Man-Month and am struck by how readily his prescription for software design maps onto the ways in which I've been teaching composition. Brooks calls for two stages: architecture and implementation. The architecture is the overall goal of the software - what it does, what it means to the user. The implementation is the nitty-gritty details that bring the architecture to life. It parallels the thesis-support model that so many students seem to struggle with these days.

5. tuxthepenguin - November 10, 2010 at 06:54 am

You studied computer science before Java. Object oriented programming is more interesting to computer scientists, but is not a good introduction to programming. In the old days it was BASIC and Pascal and Fortran that were used in introduction to programming courses. Today it's Java. The object oriented approach is better but just too complicated.

6. 11167997 - November 10, 2010 at 07:06 am

Kevin: stick with writing like this because (a) it touches a lot of us, and (b) in this case, it validates your own journey. I wrote first and discovered programming (and SAS) in my mid-40s. Somewhere in that mid-life interstice, I thought, "haven't I seen this somewhere before?" There it was in a box in the basement: the mimeographed text for my freshman college math course, named something like Math 1: it was finite math, symbolic logic, binary coding. The epiphany enhanced the programming, but combined with the prior writing trajectory, it meant that I could make numbers sing. That doesn't leave you, either, and it's never too late.

7. 3224243 - November 10, 2010 at 07:17 am

I guess I'm a geek. I preferred DOS to Windows, and enjoyed compiling an SPSS routine much more in DOS than using the graphical interface. The challenge of writing a routine that (finally) worked was a pleasure never matched by the visual versions of the same programs.

It's like childhood was - back in my "day," we had to figure out how to get enjoyment by creating from what was around us. Most of today's kids don't create - they consume (without an understanding of where it came from or how it came to be) and compile (from a variety of sources without taking a moment to evaluate the substance of the content). They've lost the curiosity because they're too occupied with their smart phones. They have no time to just think and wonder because their lives have been scheduled to the minute by Mom and Dad.

How sad.

8. csgirl - November 10, 2010 at 08:52 am

I am a geek - but I learned object oriented programming first, and still find it much easier. The only programming language that ever defeated me was Basic, with its twisting morass of spaghetti. So I fundamentally disagree with the poster who thinks it is a better introductory language.

9. csgirl - November 10, 2010 at 08:56 am

I also want to comment on the sad state of computer science. When I was teaching in the '90s, we used to get lots of students who were hobbyists - they had run their own bulletin boards, written their own little games, opened up their computers and tweaked them. They had a creative, can-do spirit and were genuinely interested in the inner workings of almost everything. The students I have now are like lumps. If the slightest thing goes wrong, they throw up their hands and cry "Professor! Come fix this!" In their high school "technology" courses, they were taught to make PowerPoint presentations rather than to program or play with logic circuits. They have no curiosity, and cannot think out of the box. It is very sad, because without that tinkering, creative spirit, I don't know what will become of the U.S.

10. fundmaven - November 10, 2010 at 08:59 am

Beautiful article! It's well-written and carries a poignant, if not fearful, portrait of intellectual malaise in our culture.
However, I am not so sure that the phrase "forgiving standards" accurately depicts the rigor of true academics in the humanities.
I submit that a thorough study of the complexities of economics, history (NOT "social sciences"), language, and literature (including poetry) provides context for understanding markets and the cultures that influence them. It also promotes the ability to express one's ideas fluently and cogently, whether in code, technical prose, or poetry.

11. ronknott - November 10, 2010 at 09:03 am

In the 70s, I was an English major in the same college where my father was an English professor. After I took an introductory computer programming class (Fortran) to meet a core requirement, I flirted with taking a computer science minor, and signed up for advanced computer programming. Part way through the course I announced to my father at the supper table that his English department ought to require all English majors to take the same course. It was all about logic and rhetoric. I got the lowest grade in that course of any during my college career (I got distracted by other college fun). It blasted my GPA and was the end of my computer science minor. But I admired everything about the discipline of thought it required of me. I need to take a similar class again.

12. impossible_exchange - November 10, 2010 at 10:45 am

I am a humanities scholar who dual-boots Linux (my fiance demands that Windows be on all but one of our computers), and I am also an ex-programmer geek from high school and one advanced C++ course as an undergrad years ago.
The idea of narrative or ideology as coding is hardly new; see Clifford Geertz, among many others. The risk of thinking of language in the same way we think of computer language is that code is only denotative, while language and narrative are overflowing with connotative meanings.
Code might be the new Latin, but the old Latin is hardly gone, and in America any language other than English might be the new Latin.
Code has its uses within the machine, but any metaphor that extends it beyond the machine is fundamentally reductive and falls well short of reality. For all the complexity of my dual-core laptop, custom Gnome desktop, and tweaked Linux kernel, it pales before the complexity of Kevin Carey's little article or my half-remembered dream from last night.

So, while the humanities may lack rigor, by which you mean there is little memorization and a lot of open-ended questions and thinking, I am here to say: yeah, no. There are many fruitful ways of looking at code and coding in relation to narrative, language, and writing. But there is more complexity in Hamlet than in the combined complexity of every OS ever written.

So before you science geeks--spoken as one--get all pumped--as I admittedly was when I first read this essay--let me be the call from reality. Writing good code for operating an all-in-one print station is easy compared to writing something as "simple" (seemingly) as this:
A toy-maker made a toy wife and a toy child.
He made a toy house and some toy years.

He made a getting-old toy, and he made a dying

The toy-maker made a toy heaven and a toy god.

But, best of all, he liked making toy shit.
----Russell Edson.

13. onlineasllou - November 10, 2010 at 11:08 am

3224243 (previous post#7): Are you sure you are not me? As I read this article, I knew I just had to add my comments. Then I saw your comment and found "my" thoughts there.

I loved high school math, took symbolic logic as a first-semester freshman "just for fun," and was in grad school working in a research lab the year SPSS introduced its graphical user interface. (I hated it.) I work every day with people who "just don't get it" when it comes to logic, and feel sorry for them.

Thank you, Kevin, for this fine article. It has helped me to put a few things about my life and career into perspective.

14. tuxthepenguin - November 10, 2010 at 11:10 am


I have no doubt that you found OO programming to be a good introduction. You are probably the 1 in 10 or 1 in 100 for whom it works (although we don't actually know that it was better for you; I'm going to say that you'd probably have learned just as easily in an old-school course).

Your criticism of BASIC is a great example of the problem with introductory programming courses. You refer to a twisting mess of spaghetti, but that term describes large programs, not the ones used to demonstrate the fundamentals.

Here's a Hello World program in BASIC:

10 PRINT "Hello World!"

And the same thing in Java:

public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello, World");
    }
}

And it doesn't get better in the second lecture either (you spend the entire first lecture in Java explaining Hello, World).

Introductory programming courses should demonstrate the fundamentals, which are not tied to a particular language. Use Python or some other language with a low barrier to entry. Java is an excellent choice for a second language. I've seen the students who struggle, curse, and rant against programming because they have instructors who think OO programming is the way to go.
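For comparison, the low-barrier first lesson the commenter has in mind might look like this in Python (this sketch is mine, not from the original comment):

```python
# The entire "Hello, World" lesson in Python: no classes, access
# modifiers, or boilerplate to explain away before line one runs.
message = "Hello, World"
print(message)
```

Every token on the screen is doing work a beginner can see, which is the whole argument for starting here and saving Java's scaffolding for a second course.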

15. zagreb - November 10, 2010 at 11:16 am

I would have loved to forward this widely and discuss it with my colleagues and friends, but I can't -- it's such a shame that this article takes that final swerve into childish Two Cultures anti-humanism, when that's so beside the point. Once you've come to the important realization that coding is writing, or at least a form of writing, why take this as grounds to dismiss those who teach and study writing?

16. max_hailperin - November 10, 2010 at 11:42 am

I'm going to ignore all the fascinating ideas in this article and instead comment on one of its assumptions: that there is a "geek shortage". Set aside the equation of computer science majors with geeks, though that is problematic as well. Instead focus on this question: what is the correct lesson to take away from the huge boom and bust cycles in computer science enrollments? The obvious possibilities would be to point to a "geek surfeit" during the boom years, a "geek shortage" during the bust years, or some combination. But instead I would point to a different shortage: a shortage of rational ways for young people to discover the field of study that suits them well. Lacking such means of finding their paths, the young people follow the only guide they have: the volatile trends of fashion, whether the herd is stampeding into the field or out of it. This seems like a much broader concern than anything connected specifically to computer science, though it happens to be made most apparent to us by computer science. Neither the needs of society nor the aptitudes of humans change dramatically over the space of a few years. What causes wild swings in enrollment is the decoupling of enrollment decisions from those fundamentals.

17. m3_vt - November 10, 2010 at 12:54 pm

I truly enjoyed reading this article, and it brought back many memories of the 45 or so years I've been learning, practicing, and teaching computer science and related material.

While I generally agree with almost everything said, it seems to me what is really missing from a significant portion of the students and practitioners I've had contact with is the joy of problem solving and the solving of problems that don't already have answers. To me, this is the science of computer science and most of what we teach has little to do with it.

There are so many good approaches to problem solving that having a "tool box" of them is critical when facing a novel problem. It seems to me we often provide just one approach as the right one to use in a certain circumstance, and that this reduces the play attitude of problem solving. Having only one problem-solving approach constrains thinking.

I'd like to provide an example of how we might need to restructure our whole approach if, in fact, the goal is to produce real problem solvers. Envision a math class where the goal is not to learn [algebra | calculus | topology, ...] but to solve a set of problems in at least five ways, and to study various approaches to problem solving like the Dewey method, Thorndike's approach, Gestalt, dual problem spaces, and so forth.

While good problem-solving methodologies are important, each new computer creation should not be constrained by an inappropriate methodology. Think about Sir Isaac Newton, Ada Lovelace, or George Boole; if they weren't creative problem solvers, how would they have gotten their work done?

18. tolerantly - November 10, 2010 at 07:07 pm

I'm one of those too, and though a lib-arts person, am deeply grateful for the training in logic I had at the hands of my dad's Trash-80 and subsequent machines. It shaped my thinking, shaped my view of how things work, shaped my language, and has been a great help to a person otherwise inclined to the woolly. I have great attachment to those generations of language post-Basic, the network structures, the ingenuity in noticing and accommodating other communicative and functional desires.

One of my favorite parts of The Social Network was the rapid-fire info-mining monologue, as he's lifting faces from facebooks: Yes. Exactly. And the excitement and sense of power in the club scene? That's it.

I only wish some of this could be communicated to my daughter's K12 teachers, who tend to be afraid of the computers and the languages themselves, and are convinced that the kids have to learn some pared-down pushbutton app now in order to be able to compete for jobs in 2025. They miss the point pretty radically.

19. bobster - November 10, 2010 at 07:07 pm

While I agree with many of the details in this article, I can't get over the fact that both the author and Zuckerberg never really studied much computer science in college. It sure sounds like the author picked it up on the job after moving on at age 17. It didn't take Zuckerberg much longer to move on from studying.

Given this, I'm amazed he would decry the fact that the number of full degrees has dropped from 20,000 to 10,000. Maybe more and more kids are following Zuckerberg's lead and skipping the college part of studying computer science.

Don't get me wrong. I like the study of computer science, but at $200,000+ for a degree, I just don't know if it's worth it. All of the advantages he cites make sense, but the cost is becoming prohibitive.

20. richardtaborgreene - November 10, 2010 at 07:59 pm

I went into artificial intelligence programming at MIT in 1966 because that is where the smartest people were gathering around lunch tables thinking really bold thoughts on napkins and trying to understand how the mind did things so they could get codes on machines to do some of the same human-like creative things. That was AFTER writing a LISP 1.5 program in a high school job to control a giant crystal growing machine for the man across the street, writing an APL program to plot new found stars/nebulae on star maps, and writing a Fortran I program that binomially expanded bridge design stress integrals, keeping integer coefficients (on IBM 1401 with 8 digit limit, had to hack my own matrices for simple numbers).

I have noticed over the years that the minds and talk of programmers work faster than those of others. When with someone with real programming skills (not coding skills) I find us talking faster and faster till my tongue cramps. My wife observes the same thing---with programmers I talk at an accelerating rate. I have one exception---a double-PhD MD medical-department chair who talks equally fast. So with my limited data I can say that one programming background equals two PhDs and one MD.

21. fatmarauder - November 11, 2010 at 09:57 am

Over the past four years I've done a lot of work with incoming STEM freshmen. I would emphasize the word "rhetoric". We give them tests to see if they have critical-thinking skills: as measured, they do. But they can't formulate an argument! I don't care how they learn it, they need to be able to think and reason.

22. macdonagh - November 11, 2010 at 12:17 pm

People who want to build and solve things can express that urge through computer programming. But they can also do it through biology or law or writing. What I like to think I recognise here is an echo of my evolving pet thesis: that people are more inclined to want to construct and resolve rather than, as some practitioners and disciplines still do, merely deconstruct, criticise, or flail around in the empty world of "well, your opinion's just as valid as mine." No it isn't. Somebody's more right and somebody's more wrong. Humanities people might say it's a shame, but I think and hope and submit that we are all getting more scientific.

23. dank48 - November 11, 2010 at 02:45 pm

Hear, hear!

Not the least valuable part of my computer science education (halfway to a master's before I realized I needed to reprioritize) was a passage from "Let's Talk LISP" by Laurent Siklossy:

"If all this seems confusing at first, don't worry. Remember: you never learn anything; you just get used to it."

That beats any sentence I can recall from any of the texts for any of the education courses I had to take in order to teach high school in the state of Indiana.

Programming is also a great inoculation against the belief that incomplete, approximate instructions will somehow result in complete, precise results. Once you realize that computers do what you tell them to do, not what you want them to do, and that they just cannot read your mind, you begin to suspect that people are a bit like computers in that way as well. Also, I have yet to see a program that would be improved by the use of "like," "awesome," etc.
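A minimal Python illustration of that literal-mindedness (my example, not the commenter's): ask for "the first three items" while counting from 1, and the machine hands back exactly what you said, not what you meant.

```python
items = ["alpha", "beta", "gamma", "delta"]

# Meaning "the first three items," a newcomer who counts from 1
# writes this -- and gets two items, silently minus "alpha",
# because Python slices count from 0 and exclude the end index:
first_three = items[1:3]
print(first_three)  # ['beta', 'gamma']
```

The program is not wrong; the instructions were. That gap between intention and instruction is the whole lesson.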

24. austinbarry - November 12, 2010 at 07:44 am

A great article! I especially liked the law -> computer program representing law -> simplified computer program -> simplified law example, since I see that sort of thing all the time.

By the way, Java isn't the only Object Oriented programming language, and the "hello world" example isn't a good example of an OO program. In most books, one must read material from at least 3 or 4 chapters to fully explain the "hello world" example.

25. celticmagic - November 12, 2010 at 08:05 am

Nice article.

I think, though, that philosophy would afford superior training in logic and rhetoric.

26. austinbarry - November 12, 2010 at 08:30 am

@celticmagic - I would agree in general, but one can't execute a philosophical argument and see the results. A flawed computer program will either not run at all, or produce results which (to an intelligent human) seem "wrong" (but to the computer, they are exactly what they should be, given the program).

27. squeakers215 - November 15, 2010 at 01:25 pm

While the drop from 20,000 to 10,000 is interesting, it might not tell the whole story. As a TA for an intro CS class (in Python) I'm hearing a lot of students talking about going into Computer Engineering or Electrical Engineering instead of Computer Science. Apparently you make more money if it has engineering in it...? Anyway, the point is that it is possible that rather than a reduction in overall numbers people are simply moving into closely related majors which are not counted as strictly Computer Science.

28. jbode - November 18, 2010 at 03:54 pm

Computer Science isn't for everybody; it requires a certain personality, just like Engineering or Biology or English. I knew plenty of people who were bright and enthusiastic, but simply couldn't *get it* (either the act of programming or the concepts behind it).

I also have to flog my favorite hobby horse: CS is, at its core, a mathematical discipline, which is largely orthogonal to the act of programming. I've known people who were good computer scientists but wretched programmers, and pretty sharp programmers who only knew the most basic theory (mention Markov chains and they'll look at you funny).

There was a surge of interest in computer science and technology because 15 years ago people realized you could get filthy rich by setting up a Web site with no intelligible business plan and venture capitalists would throw piles of money at you. Since that particular ship sailed (and ran aground, and then exploded) interest in the field has waned. I'd suggest we're seeing a return to *reasonable* levels of interest by people who find the subject appealing for its own sake.

29. marcyrw - November 19, 2010 at 11:58 am

The discipline of music, especially composing, seems to rely on principles similar to those of computer science. Logical flow, the mathematics of theory and tempo, articulation, organizing harmony and rhythm for maximum effect, and using crafted musical language that is also replete with emotion to capture the listener are all skills that can be extrapolated into other areas such as writing. As an attorney and a musician, I have found that to be so. (Also, I was lucky enough while I was in law school to have had a wonderful Advanced Legal Writing professor.)

30. strewdoll - November 19, 2010 at 11:27 pm

Yeah! Of course, computer programming has become very much in demand by companies, especially IT companies. We can say it has received official approbation from the IT sector. But overall, the article was very commendable.

31. john_jay - November 22, 2010 at 01:51 pm

It's not just a problem with computer science. The real shortage is of people who can solve problems systematically, whether those problems are with cars or with code or with the economy.

One minor quibble, though. There is not a "geek shortage" in general. There are plenty of perfectly good programmers over 50 who could do what needs to be done for another 15 or 20 years if companies were willing to hire them. The shortage is of young and inexpensive geeks coming up through school now.

32. hariseldon - November 26, 2010 at 02:40 am

We can tell if there is a "geek shortage" by observing the salaries of geek professions. Students certainly have; hence foreign students fill our engineering schools while the best American-born students opt for law, medicine, and finance.

33. mmn13ps3 - November 26, 2010 at 02:19 pm

Just a couple of years ago, I was deeply interested in computer programming. And even though I am still studying ICT in 11th grade, somehow, I don't think I have enough interest in it. Engineering seems like a more lucrative profession now.

So, thank you Kevin Carey for this fine article. It has helped rejuvenate some of my lost attraction towards programming. And hopefully it will help me within a couple of years, when I begin college.

