They claim that this is not strongly correlated with intellectual capacity or with grades in, for example, math. What they present instead is a simple predictor of success in an introductory computer programming course. Even before the course starts, and assuming the students have no prior programming knowledge, you give them a number of problems of the following type:
What are the new values of a and b?
The important thing is how to analyse the answers. The students, never having been taught the actual semantics of the programming language, have several possibilities: either they refuse outright to answer these problems, as they do not know the answer, or they guess. Given the way the question is phrased, they might guess that the equals sign is not a logical operator but some sort of assignment. Even then they do not know exactly how it works, and there are several possibilities: right to left (the correct one), left to right, some kind of move that leaves the originating variable 'empty', or some add-and-assign procedure. It does not matter which possibility the students decide on; what counts (and they are not told this) is that they stick to one interpretation from problem to problem. According to the paper, the final grades of the two groups, the consistent and the inconsistent students, both follow some Gaussian distribution, but with the consistent students in the region of good marks and the inconsistent students in the fail region.
This brings me to my topic for this post: different philosophies of approaching a computer. Last week I had to sit through, for the first time in my life (my previous computer experience being 100% self-study and talking to more experienced users one on one), a computer course that demonstrated the content management system (CMS) that LMU websites have to use. It started out with "you have to double click on the blue 'e' to go to the internet", but it got better from there. The whole thing took three hours and wasn't too painful (though not exactly efficient) and finally got me the required account, so I can now post web pages via the CMS. The sad thing about this course was that this is obviously the way many people use computers and software: they are told in a step-by-step manner how to do things, and eventually they can perform these steps. In other words, they act like computers themselves.
The problem with this approach, of course, is that the computer will always remain some sort of black box, which is potentially scary, and you are immediately lost once something is not as expected.
I think the crucial difference comes once you have done some programming yourself. Of course, it is not essential to have written your own little office suite to be able to type a letter in Word, but very often I find myself thinking "if I had written this program, how would I have done it, and how would I want the user to invoke this functionality?". This kind of question comes in especially handy when determining what information (in the form of settings and parameters) I have to supply to the computer so that it can complete a certain task. Having some programming experience also helps when you need to find out why the computer is not doing what you expect it to do: a generalised version of debugging, dividing the problem into small parts, checking whether they work, trying alternative ways, etc.
This I consider the most basic and thus most important part of IT literacy, much more fundamental than knowing how to convert a table of numbers into a pie chart using Excel or how to format a formula in TeX (although the latter can come close, as TeX is Turing complete... and at least you have to be able to define macros etc.). You cannot start early enough with these skills. While you are still a kid, you should learn to write at least a few simple programs.
20 years ago that was simple: the first computer I had under my fingers (I didn't own one, but my friend Rüdi did; mine came later, as my dad had the idea of buying a home assembly kit for an early 68k computer that took months to get going) greeted you with "38911 BASIC BYTES FREE" when you turned it on. Of course you could play games (and many of my mates did only that), but still the initial threshold was extremely low to start out with something along the lines of
10 PRINT "HELLO WORLD": GOTO 10
With a computer running Windows this threshold is much higher: yes, you have a GUI and can move the mouse, but how can you get the stupid thing to do anything slightly non-trivial?
For Linux the situation is somewhat better: there the prompt comes naturally, and soon you will start putting several commands in a file to execute, and there is your first shell script. Plus, C and Perl and the like come preinstalled, so the way to the first "Hello world" is not that long.
So parents, if you read this: I think you really do your kids a big favour in the long run if you make sure they get to see a prompt on their computer. An additional plus, of course, is that Linux runs much better on dated (i.e. used) hardware. Let them play games, no problem; just make sure programming is an option that is available.
And yes, even C can be a first programming language, although all those core dumps can be quite frustrating (of course, Perl is much better suited for this, as you can use it like the BASIC of the old days). My first C compiler ran on my Atari ST (after my dad was convinced that we wouldn't get very far with the home-built one), which back then (1985) had only a floppy drive (10 floppies in a pack for 90DM, roughly 50$) but 1MB of RAM (much, much more than the Commodore 64s of those days and nearly twice as much as PCs), so you could run a RAM disk. I had a boot disk that copied the C compiler (and editor and linker etc.) into that RAM disk, and off you went with the programming. The boot-up procedure took up to five minutes and had to be repeated every time your code core dumped because you had gotten some stupid pointers wrong. Oh, happy days of the past...