This seems fair enough

Teaching children coding is a waste of time, the OECD’s education chief has said, as he predicts the skill will soon be obsolete.

Andreas Schleicher, director of education and skills at the Organisation for Economic Co-operation and Development, said that the skill is merely “a technique of our times” and will become irrelevant in the future.

“Five hundred years ago we might have thought about pen literacy,” Mr Schleicher said. “In a way coding is just one technique of our times. And I think it would be a bad mistake to have that tool become ingrained.

“You teach it to three-year-olds and by the time they graduate they will ask you ‘Remind me what was coding’. That tool will be outdated very soon.”

Comparing it to trigonometry, he said: “We are going to get into the same dilemma. I think it is very important that we strike a better balance about those kinds of things.

“For example, I would be much more inclined to teach data science or computational thinking than to teach a very specific technique of today.”

It’s important to distinguish between teaching the concepts of something and the techniques. Boyle’s Law is important; being able to fiddle around with an ICE engine isn’t a skill someone born today is likely to need. No, not because electric, but because computerisation.

The concept of coding, sure, but the techniques of javascript? I tend to think that’s all going to go the way of car engines in fact. Time was when any driver had to know how to maintain one, at the very least. Nowadays, just turn it on and drive. It’s a black box that works. Computers are getting there. There’s this group over here, engineers, who code. Then 99.9% of the world who don’t.

At which point it’s not worth teaching the details of what’s not used to everyone, is it?

91 comments on “This seems fair enough”

  1. Whatever programming language you use to teach, it will be obsolete before you’re ever likely to have to work with it. At uni in the early 70s we were supposed to use Algol* for coding (we weren’t ‘taught’ it, just pointed at a text book and expected to get on with it). By the time I started paid programming work a few years later, you struggled to find a compiler for it.

    * I sensibly (or luckily) chose to write in Fortran instead, which is still a useful skill in some specialist areas.

  2. It was at least 30 years ago that I read the first article saying that all programming would be automated within the next 10 or so years. I expect human coding will become obsolete shortly after the first commercial fusion reactors come on line.

  3. “Time was when any driver had to know how to maintain one, at the very least. Nowadays, just turn it on and drive. It’s a black box that works. Computers are getting there. There’s this group over here, engineers, who code. Then 99.9% of the world who don’t.”

    It’s not the same thing. Coding is automating data processing. Now, some of that got replaced by someone writing some code that gives a user a function (e.g. querying data in Excel vs writing a loop or some SQL) but what if someone hasn’t already built the boilerplate function you need? In a lot of companies you have people doing data analysis who are writing code.

    Data science? What does he mean by that? This is such a fashion/buzzword term, and if you’re doing data science you generally need to know some R or some Python.

    I’m not even saying that kids should learn tons of coding. But how is “only a few kids need to learn code” any different from anything else? Beyond 13 year old maths and one subject that people excel at, how much of school is useful? Have I ever made a dovetail joint since, or had to know much about cell structures? 90% of secondary is a waste of time and the only justification for it is discovering the 10% that someone is good at.

  4. “how much of school is useful?”
    It’s not what you learn. Like BoM4 says, 90% of what’s taught at school will be useless. It’s learning how to learn. Then you can learn anything.
    Which is why the current credentialisation mode of education is worse than useless. Produces people superbly qualified to be ignorant.

  5. However much computer crap you attach to a car, it still has suspension, brakes & steering components which DO wear out and need replacing. This applies to EVs just as much as conventionally powered vehicles. Please don’t assume that cars are only repaired by automated processes or “technicians” in clinically clean clothing – there will always be some poor sod getting covered in dirt and grease…

  6. By definition if you are teaching coding to three/five/seven year olds then you aren’t likely to be teaching them in a language designed for real world work, you are going to be using a toy language. That however isn’t a reason not to teach it at all. Teaching coding at that level is basically teaching logic, algorithms, and rational thinking. Those skills are always going to be vital and transferable skills.

    It’s also worth remembering that for decades coding has been taught in primary schools in the UK – probably all children of the 80’s and later in the UK at least learnt turtle graphics or similar.

  7. That’s not my point as you well know.

    Coding as a specialist occupation? Sure, just like car mechanic. As a general societal skill, like being able to boil an egg? No….

  8. Learning Pascal wasn’t entirely wasted, it taught me a lot about the general concepts behind computing, and also helped improve my general logic and rational thinking, as the Mole says.

    However, I have been thinking about this myself in regards to my own kids, and I’ve decided that I’m probably going to steer them away from doing computing as an option. Something like history is much more important. But I’m not totally sure yet, as I still think learning some programming did develop my brain.

    And what about doing a foreign language? Some say don’t do it, you won’t need to know a foreign language in the future as translation units will do it all for you. But even if this is true, isn’t there still a value in learning a foreign language, in regards to getting you to better understand language in general, and also being exposed to a different culture and literature? Is reading a French novel in the original a better experience than reading a translation? I don’t really know, as I dropped languages as soon as I could.

  9. Ha ha ha

    Learning to code is learning how to break a problem down into a series of steps. This is a generally useful skill.

    I was told when I started in ’72 that programming would be automated in a few years. Are we there yet?

    The thing that non programmers don’t get is that it is a creative art. Computers don’t do creative. Let me know when a computer can write its own graphics driver by reading the h/w spec.

    BTW There are more programmers now than there have ever been.

  10. We live in a technological world where the population is a thousand times the non-technology support capability.
    Yet we are breeding a population – and a ruling class – who see the world as magic and miracles.
    Meat comes in a plastic tray from Lidl, not from growing, killing and dismembering an animal. Electricity comes from a socket in the wall, who needs power stations?
    Cars will stop using petrol(ban them!) and run on unicorn farts or something.
    The politicians spout it, the teachers believe it and the children are taught this nonsense until they truant to recite it back.
    So anything that helps teach some basic underlying principles is good. Law of Conservation of Mass/Energy. Falsifiable theories. Boyle’s Law (as you suggest).
    And yes, some basic coding & IT concepts, so that people have even a primitive understanding of what a computer is doing. It isn’t magic, unless they stay (or are kept) ignorant.
    It isn’t vocational, except for a few, but it’s essential for making informed decisions about our world.

  11. And as a nod to our host, the Law of Supply and Demand perhaps?
    Not everyone grows up to become an economist, but some basic understanding of the role of price might be beneficial? 🙂

    But teaching spudda that penal taxes drive people (and tax) away: too much to ask!

  12. My late missus was a COBOLler and wrote automatic programmers. Feed the logic in one end and the code came out the other. Building the machine took months, but code was produced instantly and was generally bug free, because any errors applied across the board and need be ironed out once only.
    Now personally, I am a rotten programmer but managed 25 years without getting caught. Like all trades, some can bodge it, some are experts. The important thing to teach is not “what language to use” but how to think to be able to program. Logical steps with an end product should be the result. In other words teach them how to do maths properly.

  13. Algol vs Fortran?

    Languages of the Algol family every time for me: Fortran is (or was) a horrible botch of a language. I found being obliged to program (as we used to say) in Fortran painful.

    The solution turned out to be to get other poor mugs to do it and for me to supervise their efforts. Such a pity; I had enjoyed programming my own maths though I suppose I might have been ready to grow out of it.

    And talking of hateful things in computerland, does anyone remember JCL? Aargh!!

    Anyway, as part of education – sure, do a bit but opt for a language that puts as few imbecilic conventions in the way of the children as possible. The purpose would be to teach them the application of logical thought. (Is there a good reason why they couldn’t learn some of those lessons by using spreadsheets, though?)

  14. TtheC

    I agree wholeheartedly.
    We live in a world where the Laws of Thermodynamics are under constant attack.

  15. Well, for some reason,

    “you’ve still got the paradigms print gave you, and you’re barely print-literate.”

    popped into my head.

    Anyway, 3yr olds? Personally, I would have thought that they’ve got enough on their plate already, like how not to shit yourself in public.

    But, yeah, you don’t really want to be teaching the details of a specific environment, or the idiot behaviour of the RTE. Coding something up is pretty much about model building and translating the model into something the environment can deal with, hopefully correctly, with any luck(*). So, does kind of depend on what this nerk means by “computational thinking”, but I don’t necessarily have a problem with the general idea.

    Although, “black box” rings some fucking alarm bells.

    (*) Oddly, teaching this using Brainfuck might be interesting, with the added bonus of making teachers up and down the country say the word several times a week to a bunch of teenagers.

    Now, anyone got some Clipper or MUMPS work going?

  16. “…probably all children of the 80’s and later in the UK at least learnt turtle graphics or similar…”

    er…no. I’m afraid they stopped teaching computing in the 90s concentrating on “IT” (basically using MS Office) before bringing it back about 10 years ago. My older kids missed out, my youngest didn’t.

  17. >Learning to code is learning how to break a problem down into a series of steps. This is a generally useful skill.

    Yes, this is the generally useful part of coding that develops your rational thinking. It forces you to understand how a job needs to be tackled in a logical way.

    Is Geography in high school a waste of time now? At Uni level it is, it’s mostly been taken over by the BS mob, but maybe at high school it’s still all right?

  18. Hector: you’ve made the classic mistake of confusing “coding” with “computing”. “coding” is automotive engineering, “computing” is a lot closer to “driving a car”.

    And yes, we shouldn’t be throwing resources at forcing everybody to learn code/be an automotive engineer, but yes we *should* be getting as many people as possible to learn computing, or rather IT, as that is today’s equivalent of “being able to drag a pen across a sheet of paper” – without which you will not be able to function in the workplace.

  19. “Is reading a French novel in the original a better experience than reading a translation? I don’t really know, as I dropped languages as soon as I could.”

    Yes, it is. Translations always lose certain aspects of the culture and meaning of language. I’ve watched French films and seen the subtitles where the translator did about the best job they could, but the use of a particular word or expression has more meaning to the culture than the translated word.

    But is it useful? Worth the investment? I don’t think it is, in general. Would I try and get school time allocated so that kids did it? No. Would I stop someone spending their evenings and weekends reading about 19th century military life and the French language so they can really get under the skin of Cyrano de Bergerac? Again, no.

  20. Trying to teach programming to 5 year olds is dumb.

    Trying to teach programming to 15 year olds is a good idea. They can learn some logic skills, but, more importantly, be exposed to the IT world. It’s good to expose teenagers to many fields, so they can pick a direction they want to pursue.

    Specific tech isn’t all that relevant. E.g., processing loops are relevant. How it’s done in different languages isn’t.
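A minimal sketch of that point, in Python purely for illustration (the numbers and names are invented): the concept of iteration is what matters, while the technique for expressing it varies even within one language.

```python
nums = [3, 1, 4, 1, 5, 9, 2, 6]

# Explicit loop: the concept is "visit each item, accumulate a result".
total = 0
for n in nums:
    total += n

# Same concept, different technique: a built-in does the visiting for you.
assert total == sum(nums)
print(total)  # 31
```

The same idea could be spelled in Fortran, C or anything else; only the spelling changes.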

  21. “The important thing to teach is not “what language to use” but how to think to be able to program.”

    EXACTLY! Programming is programming is programming. I’ve had job enquiries go:
    *What can you program?
    *Computers.
    *No, what languages can you use?
    *Whatever ones you have.

    If asking for a car driver you don’t ask what car they drive, yet employers refuse to employ you on a job using BumbleWee 2.0.1.4j because in your last job you were using BumbleWee 2.0.1.4h.

  22. Yes, TtC. The West is decadent. I once heard Nancy Pelosi talking about something and she made it clear that she thought bread came from the grocery store. A national leader who has no clue where bread comes from. The farms, the transport, the processing. The ignorance of the decadent dumbass Democrats is not keeping them from trying to implement a communist revolution in the U.S.

    See AOC’s Red New Deal.

    Correct, DW. Computerization of automobiles helps with diagnosis, not actual repairs. Reporting a misfire in cylinder 4 does not fix a misfire in cylinder 4.

    ‘being able to fiddle around with an ICE engine isn’t a skill someone born today is likely to need. No, not because electric, but because computerisation.’

    I strongly disagree. But our differences are in our speculation of what the future holds. Hence, neither of us is “wrong.”

  23. While reminiscing on old languages, anyone remember APL?
    Now, any language that cannot be typed on a keyboard is a surefire success!

    There’s also a competition on code obfuscation (or used to be) with some really bizarre C programs, but I lost the link.
    Maybe someone has it?

    Driving a car is much easier and safer, if you have some basic engineering appreciation of friction, skids, gearboxes, torque and so on. Not motorhead, just what they do.

  24. Some rather fascinating comments here, some of which tell me that a part of the population of commenters are in their seventies, or approaching it. Algol v. Fortran? Of course Algol was superior, but Fortran was (and still is) everywhere. The last major revision/standard was – as far as I know – in 2008. Fortran becomes more Algol like with every revision, but like with any evolution, the obsolete features are still there. The appendix, anyone?

    I can remember Algol and Fortran taught to engineering undergraduates. There was a school of thought that Basic, particularly BBC Basic, was being taught in schools, so in due course, no programming needed to be taught because it would be a skill that freshmen joined with. I’d just seen working spreadsheets, and knew that for the listing and tabulating jobs that formed most of the exercises, there was a better way. There isn’t for programming big engineering analyses, but few graduates will do that, let alone people from the general workforce.

    Sometimes we forget the difference between education and training. My father, an artilleryman, could use a slide rule under fire to perform certain computations – he’d been trained – but he never knew how it worked, nor could he do anything else with it.

    I’m afraid that current generations of ‘educators’ don’t know the difference. You train kids to read and write, you educate them into understanding what they have read. You train arithmetic, but algebra, and trigonometry, are education.

    Programming, whatever language is used, is a combination of both.

  25. >Hector: you’ve made the classic mistake of confusing “coding” with “computing”. “coding” is automotive engineering, “computing” is a lot closer to “driving a car”.

    My comments were mainly about coding, but my queries were about Computing, the UK high school subject, the current content of which I am mainly ignorant.

  26. A lot of programmers here it seems. As one myself I think you should all listen when I say that all kids should be forced to learn the Enterprise programming language. When they eventually leave school there will be no end of jobs as Enterprise Developers available to them to pick from at will.

    Joking aside, I’m very much on the side of those that think a particular language, or even certain OOP concepts, are not worth teaching to all kids but a general introduction to BASIC/Scratch/Logo and logic, algorithms and breaking complex tasks down into small logical steps would probably help most students in all sorts of things and probably should be a part of the curriculum.

  27. I haven’t done much coding but what little I did taught me the value of annotating the hell out of things. My memory is bad and I can’t remember how I did something a week later, so I learned to put copious notes and comments beside anything I do.

  28. I’m no programmer but learning a bit of Python from this book called ‘automate the boring stuff’ has been a godsend. Learning a bit of it can help anyone.

    I can’t see how understanding basic stuff won’t be an asset for the foreseeable future.

  29. Teaching kids to think logically? I wonder if our Progressive Establishment has realised quite how dangerous that would be for them. The kids might start using this rational, logical thought process in other areas, such as dismantling the ridiculous arguments they are told in support of the many ideological stupidities forced down our throats these days.

  30. 36 comments and counting. You can tell that our host used to write for The Register.

    Learning the fundamentals of any subject is useful to avoid being bullshitted by experts in that subject. Whether it’s your mechanic telling you that your brake fluid needs replacing after 5k miles, or your builder claiming that he needs two weeks to fit a kitchen, a little knowledge goes a long way in spotting scammers. Our host isn’t a trained economist, but he knows enough to spot a cock & bull story a mile away.

  31. I sometimes claim I mistrust any language designed after I was born. That leaves me Lisp, Fortran and Cobol. Maybe some assemblers.

    Actually spent most of my career using C, which is fine, unlike the abominable C++.

  32. Of course, the first and most obvious question to be asked is whether Andreas Schleicher has the faintest idea of what he’s talking about. Given that he’s a high-level educational bureaucrat, my guess is that he doesn’t. And never will.

    Educators usually haven’t a clue (and don’t want one) about the sort of education that might be appropriate for the real world… either present or future.

  33. Obfuscated C https://www.ioccc.org/

    APL? That was peculiar. Once found a compiler for an Amiga or ST and played with it for about an hour.

    TimN, always write your code as if the person maintaining it will be a psycho with an axe and anger management issues who knows where you live.

  34. DmcD: Many thanks for the link. That’s the site I remember. I will enjoy catching up.

    AT: I once heard that BASIC was invented by the Amalgamated Guild of Code Fettlers to ruin the next few generations of coders, thereby preserving jobs for its members.

    Having been taught BASIC at Uni, I earned my keep writing C for many years. The trick is to use lots of #DEFINE macros so that the GOTO statements are hidden.

    But this thread isn’t an old coders’ convention. The OP was related to how people can cope in a world without any understanding of how things work. Pretty much all of school should be that outline appreciation of everything, and only move into detail once the child has started finding their specialities. That means basic science, arithmetic, economics, communication and computing, to list just a few.

    If the school leaver has no concept of how a computer does what it does, how can they operate in a computer-dependent society? By shamanism, it would seem.

  35. I’m no longer a programmer, but even so I built two wireless routers to piggy-back on the local free WiFi service so I can avoid paying BT Line Rental.

    Just a couple of Raspberry Pis and two WiFi antennas from eBay at £12.95 apiece.

    What is important is not the language you do stuff in (my two homebuilt routers are little more than bash scripts with a bit of curl automation built in), but the form you use. I could have done everything in C, but why bother?

    As for automated code generators, sure, I’ve used them. I’ve even built a couple, but the commercial ones aren’t worth the cost and most of it seems hype and vapourware.

    We were supposed to have full GUI drag-and-drop process automation where any decent business analyst could codify the business rules and it would spit out the software. It’s 2019 and I’m still waiting for that particular piece of vapourware to become real.

    As mentioned above, we’ll get automated computer programming some time after we get self sustaining fusion power with net energy output.

    I suspect I will be long dead before either happens.

  36. TimN, “value of annotating.”

    When people asked me what my job was, I said, “systems archeologist,” as I tried to figure out WTF people were thinking when they wrote the code.

    I used a half dozen languages during my 30+ year IT career. Mainly Fortran for ten years, then Cobol for 20 years.

    “Fortran is (or was) a horrible botch of a language.”

    I ran a factory with it. I rather enjoyed using it. I was rather pissed when forced to use Cobol when I moved to an accounting system.

  37. “I was beginning to wonder if NiV had found a transgender slant to computing, somehow…”

    Are you asking for one?

    Because I re-iterate, I don’t start it. It seems to me people talk more about stuff they’re actually interested in.

    On coding, I expect the ideas will be around for another few generations at least. Technology progresses by automating the unskilled jobs out of existence, making skilled jobs unskilled, and making impossible jobs skilled. We build tools to do jobs we used to do manually – so in the future we need to learn how to use the tools. You need to understand what it is you need to produce, so that you can manipulate the tool to produce it. And while it is true that a lot of jobs that currently require skilled coders will in future be done by the unskilled using templates and application-builders, there will be other jobs, currently beyond even the skilled, that will need a deeper knowledge of code.

    We can’t predict what we will need for the next generation of technology – there’s no point even trying to guess. But what we can do is build the most powerful, most general mental toolkit, that we can apply to whatever comes along. At the moment, that’s coding. So while it’s true that coding as we currently know it will disappear, and the coding to come hasn’t been invented yet, the latter will almost certainly be built on, and be an extension of the former. So I’d say it was well worth learning, anyway.

    On the language wars – my attitude has always been that the best coders are those who know lots of different languages. Those who only know one language tend to mold their methods around its particular quirks and flaws and limitations, and miss solutions that a broader experience would suggest to them. Coders who know only procedural languages don’t see solutions that coders of functional languages find obvious, for example. Coding for parallelism takes a very different mindset to a sequential programming language. Don’t teach kids only one way of thinking, and trap them in a box. Don’t tolerate the parochial attitude that there’s only one right way to code, and anyone who codes differently is insane or incompetent. Teach them to be flexible, open-minded, tolerant of other viewpoints. Coding paradigm diversity is key.

    The benefits of cultural diversity apply to coding too!
    There! How’s that, Julia? 🙂

  38. Old programmers never die. They just…
    – decompile.
    – memory.
    – branch to a new address.
    – can’t C as well
    – recurse.
    – terminate and stay resident.

    …or language specific…

    Old C programmers never die. They are just cast into void*
    Old Java programmers never die. They are garbage collected.

  39. “36 comments and counting. You can tell that our host used to write for The Register.”

    Er… for El Reg wouldn’t that be 100100 & counting?

  40. It turns out that being able to break things down into smallish, comprehensible steps has real value in the real world. Lets you tackle big problems – and lets you get out of big things which shouldn’t be done before you’ve spent money and time on the whole thing.

    But what’s missed is that the things must be correct. It’s no damn good breaking complex task A into a sequence of littler tasks a0, a1, … if the little a’s are wrong.

    That’s why encoding the little things into some notation (‘language’) and trying them out is Very Important. For real world things, you might encode them into some discrete event simulation language, which might let you discover that you’ll take 200 years to get nowhere. Or a spreadsheet, to discover you can’t afford it.

    (As an aside, the real world has a lot of parallelism – understanding that you can break stuff down into littler chunks, many of which can be concurrent, and getting their interactions right, can be a Very Good Thing, too.)

    So, long story short – yeah, breaking down into smaller chunks is good. But VERIFICATION – knowing that your solution is correct enough – is terribly important. And that’s what encoding your chunks into a formal language/system of some nature lets you do. You can think of this as modeling rather than encoding if you want…
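A hypothetical sketch of that idea in Python (the function names, prices and checks are all invented for illustration): break the task into little steps, encode them, then verify each one before trusting the whole.

```python
# Hypothetical decomposition: big task A split into small steps a0, a1,
# each of which can be checked on its own.

def parse_price(text):
    """Step a0: turn a price string like '£12.95' into a number."""
    return float(text.lstrip("£"))

def total_cost(prices, quantity):
    """Step a1: combine the small steps into the bigger task."""
    return sum(prices) * quantity

# Verification: each little 'a' is checked before the big 'A' is trusted.
assert parse_price("£12.95") == 12.95
assert total_cost([parse_price("£1.50"), parse_price("£2.50")], 3) == 12.0
```

The assertions are the point: if a little step is wrong, it fails here rather than somewhere inside the finished whole.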

    Wouldn’t it be a Good Thing if it was an ingrained habit of essentially everyone to verify that a proposed course of action (in any sphere) actually was likely to work as desired?

  41. @ Arthur Teacake
    You’ve forgotten machine code.
    I freely admit that I’ve forgotten 87.5% of the machine code I learned N+n years ago, but #1 son (the bright one) was impressed when I mentioned it.
    @ Roue le Jour
    Very true! Until I started to learn programming I just did sums in my head and was periodically stumped when someone asked me for my working ‘cos there wasn’t any. A useful discipline if one ever wants to explain anything.
    @ everyone
    Fortran was devised by IBM to suit the computers they built. Algol 60 was created by programmers to be a useful language for programming.
    @ Excavator Man
    No, they’re mostly in their 60s – their comments relate to a time after I gave up being a trainee programmer.
    Sorry that there seem to be no female programmers commenting here – two of my mentors (as a teenage trainee) were female, and my big sister took up programming some years after I gave up.
    @ Chris Miller
    Staff Department never let me get near IT because I knew too much about it and should have pointed out what was wrong. By the time you arrived I was in a quiet but valuable backwater with a barrier of Actuaries to protect normal human beings from encountering me.

  42. “Ignorance is Knowledge”
    “Stupidity is Wisdom”

    Our schools – and the politicians that run them – are following Orwell’s textbook thoroughly.

  43. @ Tim the Coder
    George Orwell wrote that as a *Warning* not a textbook.
    Only since the fall of the USSR have our domestic totalitarians taken it as a textbook – before that they read the “Daily Worker” or “Morning Star”.

  44. Ah SNOBOL. I looked at that once but gave it a miss. The ideas live on in Regular Expressions, which are native in many scripting languages and available in libraries for just about all the others. It took me a while to grok REs but I’ve found them most useful for text processing.
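A tiny example of the kind of text processing REs make easy, using Python’s built-in `re` module (the log string and pattern are invented for illustration):

```python
import re

# Pull every ISO-style date out of a piece of free text in one pattern.
log = "built 2019-03-01, shipped 2019-03-04"
dates = re.findall(r"\d{4}-\d{2}-\d{2}", log)
print(dates)  # ['2019-03-01', '2019-03-04']
```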

    Better than programming as a subject, or even CompSci as a high school subject, having them all do physics to GCSE level would be more useful. At the grammar school I went to in the 60s all students had to take physics and chemistry or the single subject ‘physics with chemistry’ to O level on the basis that the arty types would eventually find it useful.

  45. “While reminiscing on old languages, anyone remember APL?
    Now, any language that cannot be typed on a keyboard is a surefire success!”

    I think I have a couple of green-screen serial terminals with APL keyboards here somewhere…

  46. I did a short ‘computing’ course at the RAF civilian tech school back in the day. We had to program a Ferranti Argus 200, which involved plugging ferrite beads into a pegboard: if there was a bead in a hole it was interpreted as a ‘1’ and if a hole was empty it represented ‘0’. After 15 minutes or so of bead plugging you could get the lights on the front panel to flash in a nice pattern.

    The danger of this system was that if you tripped up carrying the tray to the 19” rack all the beads fell on the floor and you had to start again.

    Thankfully I then was able to move up to the Argus 700 which had the advantage of using punched tape!

    Incidentally, Mrs W’s first job was punching Hollerith cards.

    (Next post – how we coped with sweet rationing. )

  47. @Hector Drummond, Vile Novelist
    @Diogenes

    Pascal, Cobol, C and ML were languages we were examined on – BSc Business & Computing 1988

    First post grad job was TI/JMA’s IEM and IEF

    “I dropped languages as soon as I could”

    Lucky you; our school made us do one until O Level >= C achieved or left school

    PS I don’t understand exams today where D or E are classed as passing

  48. I was, on first reading this post, feeling rather displeased with Andreas Schleicher, director of education and skills at the OECD. However, a slant on the teaching issue has made me reconsider.

    I have, for work, programmed seriously in the following high-level languages: FORTRAN IV, ALGOL-60, RTL/2, FORTRAN 77, Basic, Pascal, C, C++ and Python. I have dabbled in CORAL-66, Ada, Algol-68, Modula-2 and others.

    On assembly languages, my work has included: Intel 8080 and derivatives, PDP11, 6502, FPS Array Processors, TMS320 and Transputer; for a university project, PDP8.

    Doubtless others, of my generation and earlier (very likely some of the following ones too) can compete strongly.

    And the ‘formal’ teaching of programming I received: 7 one hour lectures in FORTRAN IV (and a modest project) back in the first term of my first year at uni. [Though admittedly, I also have an MTech in computer science – though that did not involve being taught any programming language – though some teaching on why several existed and were/are the way they were/are.]

    So, I do wonder why much formal teaching is required for ‘coding’.

    And I am with Tim-the-Coder, JGH and many others above. Programming languages develop, live a bit, then die or stagger on. It is the concepts that matter, and anyone with decent programming skills can fairly easily and quickly get up to speed in a new language – even though some new languages do introduce additional concepts that are useful.

    And chief among those concepts (at least for those up to age 16) are: sequence, selection and iteration; also a little on functions/procedures, data types, arrays and data structures.
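    Those three core constructs can be shown in a dozen lines. This is just an illustrative sketch (the grading example and the 50-mark pass threshold are invented, not from any syllabus):

```python
# Sequence, selection and iteration, shown in a made-up score-grading example.

def grade(scores):
    """Return a pass/fail label for each numeric score."""
    labels = []                      # sequence: statements run one after another
    for s in scores:                 # iteration: repeat for each item in the list
        if s >= 50:                  # selection: choose between two branches
            labels.append("pass")
        else:
            labels.append("fail")
    return labels

print(grade([72, 49, 50]))           # ['pass', 'fail', 'pass']
```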

    Good luck to those who think it helps to go further (at a young age) with the likes and formalities of: abstraction, encapsulation, recursion, object-oriented programming, polymorphism, inheritance, parallelism / concurrent programming. Also the language differences between imperative, declarative and functional – to say nothing of the fact that most modern languages are somewhat mixed on these.

    Best regards

  49. Trigonometry will become obsolete?

    Yeah fuk u Euclid !

    Wot have the Greeks ever done for us eh ?

  50. “Trigonometry will become obsolete?”

    I think the point was that it’s a classic case of kids being taught something that 95% of them will never need or use again in their adult lives.

    “Good luck for those who think it helps to go further (at a young age) with the likes and formalities of: abstraction, encapsulation, recursion, object-oriented programming, polymorphism, inheritance, parallelism / concurrent programming.”

    I’d not even consider teaching that sort of stuff. You start with the applications – what can you use it for? So you show people how to use macros to speed up repetitious document editing tasks, how to produce high-quality graphics for presentations and reports, or posters and artwork, how to tabulate data, do statistics, then pull those numbers out and put them in documents, how to automate their email and web activity, like how to scrape data from a website, or set up a mailing list. How to set up a blog, or a web shop. How to control robots and other machinery (all the ‘internet of things’ stuff).

    You can write code to help you do all that. You might dip into some of the computer science theory in the process (efficient algorithms and data structures are particularly beneficial), but no kid is going to be interested in a dry lecture on polymorphism! They want to know how to program their own ideas for games and apps and artwork.

    And while it’s true that by the time they grow up many of those applications won’t need coding any more, and the languages used will be dead, there will be new applications that do and new languages that aren’t. The new always builds on the foundations of the old.
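    The “macros to speed up repetitious document editing tasks” idea above can be sketched in a few lines. The data and the name-flipping task here are invented purely for illustration:

```python
# Automating a boring editing job: reformat a pile of "Surname, Forename"
# entries in one go rather than retyping each by hand.

entries = ["Smith, John", "Jones, Mary", "Patel, Anil"]

def flip(name):
    """Turn 'Surname, Forename' into 'Forename Surname'."""
    surname, forename = [part.strip() for part in name.split(",")]
    return f"{forename} {surname}"

print([flip(e) for e in entries])    # ['John Smith', 'Mary Jones', 'Anil Patel']
```

    Three entries or three thousand, the effort is the same – which is exactly the hook that gets people interested in writing code.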

  51. A few thoughts.

    My dad is a software engineer. With a bit of guidance I wrote a functional (if not very strong) GUI based chess engine in C aged around 8. By the time I was a teenager, I probably could have got a job as a developer somewhere, but didn’t want to.

    These days, most of my computer time at work is creating complex engineering models and drawings in 3d parametric Cad, and my coding principles are actually very applicable, even if I write no actual code.

    While both I and my boss draw similar things, you can tell who created any given model from miles away. My models all essentially use a set of declared variables – e.g. if a plate is 20mm thick, then that gets entered once, and all subsequent references to that plate thickness get pointed back to that dimension. My boss just types 20mm every time – it’s less hassle – until of course some joker decides that actually we should be using 25mm plate. At that point, my model will adjust in 20 seconds; the boss’s won’t.
    This is because it was drummed into me from a very early age that you never, ever enter duplicate data as “hard code” when it should be a variable declared somewhere sensible.
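    The same principle in code form – declare the dimension once and reference it everywhere, rather than scattering the literal 20 about. The names and formulas below are invented for illustration, not from any real CAD system:

```python
# One declared dimension; every derived value points back to it.
# Change this single line and everything downstream follows.
PLATE_THICKNESS = 20  # mm

def flange_offset():
    """Offset to the plate's mid-plane (illustrative)."""
    return PLATE_THICKNESS / 2

def stack_height(n_plates):
    """Total height of a stack of plates (illustrative)."""
    return n_plates * PLATE_THICKNESS

print(stack_height(3), flange_offset())   # 60 10.0
```

    Swap the 20 for 25 and both results update; if the 20 had been typed in each formula, every occurrence would need hunting down by hand.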

  52. BTW, Our lads in the workshop use Trig all the time to mark steel out etc – very much an essential skill in my trade.

  53. There’s something El Tio de Tejas finishes a comment on above bears considering:

    “Wouldn’t it be a Good Thing if it was an ingrained habit of essentially everyone to verify that a proposed course of action (in any sphere) actually was likely to work as desired?”

    Yes, wouldn’t it be wonderful if kids were taught to check their working. Tim often draws our attention to some blindingly obvious balls-up in the dead-trees press. When did people stop checking what they say and what they do against reality? What with being told obvious bollocks, being given stuff not even remotely like what was requested, and having to deal with people’s total fuck-ups for no other reason than their inability to self-verify, I despair.

  54. What I find educational about programming is the way getting a program to work is a bit like working out a new piece of scientific truth. The computer is blind to your feelings, exacting in its demands, and utterly honest. Learning to deal with this is a great preparation for any and every fact-bound line of work or study. Computer languages become obsolete, the work rises to higher levels of abstraction, but the fundamental demand of the subject is the same.

  55. Going back to the original quote, why worry about it eventually becoming obsolete while it’s something very much needed now?

    In primary school we had access to (I think) a single BBC micro and when I say access I’m not sure I ever once used the thing; the only thing I can recall seeing run on it was a game so it was probably not much real use to anyone.

    At secondary school we started with 4–5 machines in the mathematics lessons running a small suite of very dubious educational games. The only one I can recall let you put in several numbers, which would cause things to bounce around at different angles and speeds, or something like that; they were fun only in so much as you had less of a normal lesson when using them.

    Things did improve later – they ended up having actual IT rooms (the one time I missed a lesson, the teacher managed to install a CPU the wrong way around; I was gutted) – but they were used to teach us how to use Windows programs (stuff like Office etc).

    I never begrudged the time spent teaching us how to use word processors etc, it’s something most people are going to use once they go to work but as has been stated repeatedly you can take the things learned from programming and apply them to a wide range of unrelated subjects.

    I would have benefited immensely from them making even a half arsed effort to introduce us to any language especially as I went on to be a web developer and yet they were happy to piss away many hours on all that other nonsense.

  56. bloke in Spain

    “There’s something El Tio de Tejas finishes a comment on above bears considering:”

    Thank’ee sir.

    Verification is the lost art in technology. And, it appears, in society.

    Would that it were different.

    I doubt the usefulness of teaching or learning coding for general-purpose education. Mathematics is more fundamental and a better pursuit. Good mathematicians make good programmers, and good lawyers too – earning high scores on the LSAT, for example. Building Microsoft, Bill Gates was fond of hiring mathematicians. When math is grasped it is the foundation of so many things: engineering, physics, programming, law and more.

    Funny, true story. When IBM/360 mainframes started replacing ledger machines I was hired as a consultant by a city which had recently installed one and was converting from adding machines and ledger machines. An old lady, near retirement and not trusting computers, was causing problems, claiming the computer output was wrong. Nobody in the city could convince her of the accuracy of an IBM/360, and my job, as an outside expert, was to convince her to shut up. The mayor rolled his eyes about her when he hired me to end her foolishness so the city could continue the conversion. She presented her ledgers, her hugely long adding-machine tapes, and the computer reports whose sums over thousands of figures did not match her totals. When I looked at the code, I discovered the programmers were using floating-point arithmetic! That won’t work without adjustment. Floating point does not represent all decimals exactly; e.g. IBM/360 floating point has no exact 0.10 in binary. It is like 1/3 in decimal: you cannot write 1/3 as a finite decimal, it is 0.3333… to infinity. The same goes for the IBM/360 and 0.10, and other decimals that don’t convert to a limited number of binary bits. So, when the computer adds thousands and thousands of numbers in floating point, there is a problem. The old lady was right.

    Note: IBM/360 floating point is not the same as INTEL floating point, but still, not all decimal numbers can be expressed in any binary floating point format.
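    The same effect is easy to demonstrate today. A minimal sketch (using modern IEEE 754 doubles rather than IBM/360 hexadecimal floats, but the principle – 0.1 having no exact binary representation – is identical):

```python
# Summing thousands of 0.1s in binary floating point drifts away from
# the exact total; decimal arithmetic matches the ledger.
from decimal import Decimal

n = 10_000
float_total = sum(0.1 for _ in range(n))
exact_total = sum(Decimal("0.1") for _ in range(n))

print(float_total == 1000.0)           # False: the accumulated drift is real
print(exact_total == Decimal("1000"))  # True: decimal arithmetic is exact here
```

    This is why financial code uses decimal or scaled-integer arithmetic rather than raw floating point.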

  58. “Teaching children coding is a waste of time,”
    and
    “For example, I would be much more inclined to teach data science or computational thinking than to teach a very specific technique of today.”

    How are you going to check your “computational thinking”?
    Easy peasy. You write a program… oh bugger!

    The problem is this word ‘coding’. It goes back right to the early days: a ‘mathematician’ or ‘analyst’ will provide the solution and the ‘programmer’/’coder’ – some rude mechanical – will do the drudge work. It isn’t like that: programming isn’t a mechanical process. The language is secondary to the creation by writing; I am tempted to say ‘digital literacy’, but that’s another term that has been captured by the know-nothings.

  60. When you lose analytical skills and an appreciation of what the code is *actually* supposed to be doing then danger lurks.

    The computer it says….

    Teaching coding skills is wider than just typing in a program listing or even connecting a few bubbles with lines – one needs to *exactly* define the required task, and then the result of the construction process needs testing.

    Broken software is still not regarded as akin to a wheel coming off a car – we’re getting there as more control is handed to code but gender aware STEM isn’t going to do much to improve the situation.

    Little seems to be said about coding failures – would a wider appreciation of reliable principles in data handling be beneficial? Assorted (mostly) public sector catastrophes suggest that it might be worth pursuing – NHS IT and Chinook engine controllers anybody?

    There are not many futurologists who’d care to be paid by correct outcomes.

  61. @ djc
    No, it does *not* go back to the early days – fifty-odd years ago most programmers had a Maths degree (I knew one who was a Chemical Engineer).

  62. “Little seems to be said about coding failures – would a wider appreciation of reliable principles in data handling be beneficial?”

    I’ve long thought that it needs a wider appreciation of the benefits of the theory of ‘evolution’ over the theory of ‘intelligent design’.

    In ‘intelligent design’ they try to plan everything up front, document all the requirements, set out the test plans, write books and books of rules and procedures, with the aim of eliminating error, sit down and build the thing, and then move on, because the product will be perfect in every detail.

    In ‘evolution’ they start with something small and simple, create it with little planning, use it, realise its flaws, make modifications and tweaks, fix bugs, add features, use it, realise the new flaws, fix them, and so on. Every now and then when some bit gets too rococo and unwieldy, you use what you’ve learnt over the past years to start again, but it’s always the start of a new cycle of evolution. The software is never finished. It never makes big leaps, only a sequence of small steps. It’s never perfect, or bug-free. It just needs to work well enough to be useful. It continually adapts to changing circumstances. And its practical use is tightly entwined with its development, guiding and driving it.

    It’s like the difference between a command economy with ‘five year plans’ and a marketplace. The marketplace is messy – a hodge-podge of patched-together systems, full of work-arounds to resolve the many incompatibilities. Wouldn’t it be better and more efficient to plan everything in advance, to build all the components from the ground up to fit together perfectly? Set standards, impose regulations, glorify ‘slogans’ and ‘principles’, drive ‘quality’, don’t tolerate sloppiness, and document, document, document! Planners need lots and lots of paperwork! Does that all sound familiar?

    There are certain patterns of human behaviour that recur endlessly. We learn only very slowly.

  63. ‘When was it people stopped checking what they say & what they do against reality?’

    Late 80s. The consultants convinced businesses that middle management was a waste of money. In the publishing business, that meant getting rid of editors.

  64. Verification is the lost art in technology. And, it appears, in society.

    And not just in programming (user testing? – we’ve heard of it), but also in spreadsheets (complex ones being a simple form of programming). There are entire businesses being run on spreadsheets and about 25% contain egregious errors (~100% contain some error, of course).

  65. @ NiV
    “Elegance” is deemed admirable by mathematicians and elegant solutions in programming, as elsewhere, have markedly reduced risks of unobserved errors.
    There are other short-cuts to getting it right than your un-Intelligent evolution process: for instance the Research Director of my last employer devised a set of spreadsheets, all but one of which had a self-checking cell built in. He also got someone else to check them.
    My elder son’s last job before his current one was supposed to be database administration, but 80-odd% of the time was spent sorting out problems and correcting the code in programs, never properly checked, written by someone who had left the company three or four years before he arrived. The improvement in his health since he moved is, literally, visible.
    The problem with evolution is the non-survival of species that we should like to survive.
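    The “self-checking cell” trick translates directly into code: total the table two ways and insist the answers agree. A minimal sketch with invented figures:

```python
# Cross-footing check: the grand total computed by rows must equal the
# grand total computed by columns. A cheap guard against silent errors.

table = [
    [120, 30, 50],
    [200, 10, 40],
    [ 90, 60, 70],
]

row_total = sum(sum(row) for row in table)
col_total = sum(sum(col) for col in zip(*table))

assert row_total == col_total, "cross-footing check failed"
print(row_total)   # 670
```

    It won’t catch every mistake, but like the Research Director’s spreadsheets it catches a large class of them for almost no effort.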

  66. @theProle February 22, 2019 at 10:19 pm

    This is because it was drummed into me from a very early age that you never ever duplicate enter data as “hard code”, when it should be a variable declared somewhere sensible.

    +1

    Although I, not any tutor, drummed it into myself as logical common sense.

  67. @SimonB February 22, 2019 at 10:58 pm

    I never begrudged the time spent teaching us how to use word processors etc

    Taught? We were told “all assignments must be typed, no hand-written accepted” – available were Apple ][, PC, PET, Sun Unix and Prime machines, all in short supply, so we had to teach ourselves vi, WordStar etc.

  68. ““Elegance” is deemed admirable by mathematicians and elegant solutions in programming, as elsewhere, have markedly reduced risks of unobserved errors.”

    Agreed! As a mathematician, I’ve noticed that myself.

    “There are other short-cuts to getting it right than your un-Intelligent evolution process”

    There’s nothing unintelligent about it. The best people to test a software product are the users – the people who know the topic, what it needs to do, how best to do it. It’s extremely difficult to get a formal specification out of them defining every aspect of ten years experience doing a job, but very easy for them to criticise a tool they’re trying to use to do it. The ideal is when the coders and users are the same people. But the key is to iterate.

    Newton-Raphson can very rapidly solve complicated convoluted non-linear equations that sophisticated analytic methods throw up their hands at, even though there seems to be a lot less ‘intelligence’ to the method. But ‘intelligence’ is really about problem-solving ability, and evolution (fixed-point iteration) is very, very good at problem solving.
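    To make the Newton–Raphson comparison concrete, here is a minimal sketch of the method – iterate x → x − f(x)/f′(x) until the step is tiny – applied to f(x) = x² − 2, whose positive root is √2:

```python
# Newton-Raphson root finding: a simple iterative process that solves
# equations no one step of which looks especially "intelligent".

def newton(f, fprime, x, tol=1e-12, max_iter=50):
    """Iterate x -> x - f(x)/f'(x) until the step size falls below tol."""
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

root = newton(lambda x: x * x - 2, lambda x: 2 * x, x=1.0)
print(round(root, 6))   # 1.414214
```

    Each iteration just corrects the previous guess using feedback from the error, much like the evolve-test-fix cycle described above, yet it converges astonishingly fast.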

  69. “Computerization of automobiles helps with diagnosis, not actual repairs.”

    Actually it doesn’t help with diagnosis much either. Very few cars break down and then have the problem diagnosed by plugging in a computer. What actually happens is the computer gets some erroneous readings from a faulty sensor (or a minor problem that doesn’t actually affect the functioning of the vehicle), upon which it shuts everything down and the car won’t go. You plug it into the computer and get a million error codes, as the initial sensor error has caused a cascade effect through the entire system, and you’re no wiser as to the actual problem.

    I recently had a problem with the brake lights on one of my vehicles. The mechanic plugged it in and got a screed of errors. After a bit of messing about he decided that ‘most likely’ the lighting-system ECU was shot (replacement cost £800+). He was just about to leave when he had a brainwave: he swapped one of the bulb holders for another, turned on the ignition, and voilà! All worked fine. It turned out to be a short in the bulb holder (replacement cost £15); all the error codes were no use at all.

    Personally give me a purely mechanical engine with no computerisation any day, all you have to do is listen to it, if there’s something wrong you can hear it.

  70. @Jim

    Ah yes: my old Citroën C2’s engine-overheat warning always happened going downhill after heavy rain. Wait ten minutes, restart, carry on, nothing wrong – except a tendency to declare ‘lights faulty’ or ‘data inconsistent’ at random moments. The cause: water leaking through the wire entry at the top of the tailgate, causing a short and corrosion in the rear-light bulb holder.

  71. I would no more trust the word of the head of the OECD on trends in information technology than I’d trust a random code monkey on the outlook for global finance. He hasn’t got a scooby.

    The best that can be said is that no-one really knows what’s going to be valuable five years from now. Five years ago I was maintaining a large and unwieldy codebase in PHP. Now the bulk of my development work is in Node.js, but I’m the primary Go developer in my company and the only one who knows Rust, both of which languages I learnt in the last 18 months. I’ve just started a major project in C++, which I started learning in about 1989, when it was four years old. However, it should be noted that I’m using C++17, which is a very different beast from C++98 or prior. But languages are really of only minor importance. A really important skill these days is learning how to operate in the cloud. All my code is designed to run on AWS instances, and understanding the benefits and pitfalls of this is orthogonal to the language used to write the code.
