Codecademy vs. the BBC Micro
July 24, 2019 8:12 AM

Two-Bit History praises how the BBC’s Computer Literacy Project and the legendary BBC Micro demystified computers by explaining the principles that computers rely on to function – in contrast to Codecademy, where “after a course or two … you understand some flavors of gobbledygook, but to you a computer is just a magical machine that somehow turns gobbledygook into running software.” posted by adrianhon (25 comments total) 30 users marked this as a favorite
 
Way to miss the point? Codecademy teaches people how to write software, not circuit design. Although a circuit design course could be added at some point.

Software and hardware are distinct fields, and free resources typically aim at one or the other. This is not actually a problem.
posted by Ahniya at 8:20 AM on July 24, 2019 [14 favorites]


This is a major problem with tech culture, and something that gets exposed very quickly when such "programmers" are tossed out of their comfort zone, such as being asked to use a new tool or language. Teaching someone to "code" is superficial if they don't learn the basics of computation that underpin it all.
posted by NoxAeternum at 8:21 AM on July 24, 2019 [6 favorites]


Illogical, nonequivalent comparison.

I've built two houses and several other buildings without ever knowing, or being bothered by, how to forge a hammer.
posted by humboldt32 at 8:23 AM on July 24, 2019 [11 favorites]


The article is complaining that two wildly different resources with wildly different aims dealing with wildly different types of technology do not cover the exact same ideas, and pontificating about how much better things were in the old days. I didn't know that "old man yells at cloud to get off his lawn" was newsworthy.

There is an interesting discussion to be had about the changes required to transform software development from its current local-witch/maybe-trade-guild approach into a modern profession with standardized requirements. This article is 100% irrelevant to that discussion.
posted by Ahniya at 8:25 AM on July 24, 2019 [3 favorites]


Way to miss the point? Codecademy teaches people how to write software, not circuit design. Although a circuit design course could be added at some point.

There's a reason why the CS curriculum I went through had a number of EE courses in it, including circuit design (which came very early in the curriculum). Trying to write software without some understanding of the underlying hardware is a good way to build bad software, especially if you are working in constrained environments.
posted by NoxAeternum at 8:27 AM on July 24, 2019 [5 favorites]


In my experience, Codecademy has been an effective way to get past the paralyzing terror of knowing nothing and having no idea where to start, to the point where I understand what I don't understand and am capable of asking the right questions. No, a gamified tutorial won't make you an engineer, but it's a place to start (and it's fun).
posted by Reyturner at 8:32 AM on July 24, 2019 [4 favorites]


TLDidR: Education is not the same as Training.

I'm of an age to remember the Computer Literacy Project, but bought a ZX-81, tossed it in disgust as too useless—I still have the Casio programmable calculator I bought to replace it—bypassed the Beeb as too expensive, and only really got into computers with the Amstrad PCW8256, which was actually a useful business machine that outsold the IBM PC and clones for some years in the UK in the mid-1980s.

The CLP was useful, even if you didn't follow it into the world of BBC BASIC programming. And it underpinned a lot of the syllabus later taught in GCE 'O' level and 'A' level computer science in the 80s, which was ditched in the 1990s for "information technology" (aka how to use Microsoft Office). I'm just old enough that I missed out on CS in secondary school, but instead took a night school A-level course after university, dropped out because it wasn't challenging enough, and enrolled in a graduate entry conversion degree instead (which I completed).

But anyway … in the mid-oughties, the UK government woke up again and realized they'd inadvertently neutered the computer literacy syllabus, resulting in a generation who knew nothing but Microsoft products at a user level. Which is where the Raspberry Pi project got its impetus from, and turned out to be much more successful than the competing MIT Media Lab One Laptop Per Child program. (19 million Raspberry Pi boards sold; OLPC claim 3 million shipments over a longer period.)

Significantly … the Raspberry Pi can run BBC BASIC, but mostly it's used with Python and exposes the hardware for tinkering: you can build your own computer around it, unlike the elegant but sealed OLPC.
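(If you want a taste of what that tinkering looks like: the classic first Pi exercise is blinking an LED from Python. A minimal sketch using the RPi.GPIO library; the pin number and wiring are just example assumptions.)

    # Minimal LED-blink sketch for a Raspberry Pi with the RPi.GPIO library.
    # Assumes an LED (plus resistor) wired to BCM pin 18; the pin number
    # is arbitrary, for illustration only.
    import time
    import RPi.GPIO as GPIO

    LED_PIN = 18

    GPIO.setmode(GPIO.BCM)         # address pins by Broadcom channel number
    GPIO.setup(LED_PIN, GPIO.OUT)

    try:
        for _ in range(10):
            GPIO.output(LED_PIN, GPIO.HIGH)  # LED on
            time.sleep(0.5)
            GPIO.output(LED_PIN, GPIO.LOW)   # LED off
            time.sleep(0.5)
    finally:
        GPIO.cleanup()             # release the pins on exit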

There is a moral to this story, and it is that if you want to educate people about computers, they need to know more than programming abstractions.
posted by cstross at 8:38 AM on July 24, 2019 [20 favorites]


This is a major problem with tech culture, and something that gets exposed very quickly when such "programmers" are tossed out of their comfort zone, such as being asked to use a new tool or language. Teaching someone to "code" is superficial if they don't learn the basics of computation that underpin it all.

As someone whose coding is entirely self-taught and who comes from an industry in which, well, we're mostly all self-taught and flying by the seat of our pants, I admit to being a little bit eye-rolly at the whole "oh, well, you haven't got the CONFIDENCE" aspects of the linked post. I literally learned to code by being handed a few MATLAB routines and told to figure out what they did and how they did it, and to tweak them myself until I understood. It was not an optimal introduction. Later I "learned" R via a two-week session in my core course, which I was indifferent to and disliked. I learned Python later via Codecademy, and that was so much better, so much more useful, so much easier to capture in, yes, short little lessons. I actually enjoyed learning that, and I have been slowly doing more and more of my work in Python as it's available to me, in part because Python is familiar and friendly and has good associations for me.

What is missed by this approach? I know what a for loop is for and how it works; once you know how to write one and see one in action, it's pretty self-evident. I don't, day to day, really need to understand either hardware (beyond working out what to purchase in order to execute the routines I'm writing) or binary; I do need to know how to construct the architecture of code, but I'm not entirely sure that the described approaches really do that better than the Codecademy model of encouraging people to try some basic routines and build things they might like on their own. Which it did for me, by the way; it got me the literacy I needed with Python to tackle APIs and write some pretty credible pieces that could interface Todoist and Habitica, and then from there I started using it for statistical analysis, and then there were other things that occurred to me to try as I needed them. I'm not sure how that is any less solutions-oriented and therefore confidence-building than describing looms is.
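(About that for loop, for anyone who hasn't written one: this is roughly the first thing a Codecademy-style Python lesson shows you, and it really is about this self-evident. The list of frequencies is made up for the example.)

    # A toy for loop of the intro-lesson variety: walk a list and do
    # something with each element.
    frequencies_khz = [22.1, 40.0, 55.5]

    for freq in frequencies_khz:
        print(f"playing tone at {freq} kHz")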

I have a whole other discussion in me about the emphasis on LOOMS, HOW DO THEY WORK from the perspective of someone who is moderately fluent in fibercrafts, emphasizing the interactions between coding and fibercraft and the way that they are really all very similar, but: eh. Comment's long enough.
posted by sciatrix at 8:41 AM on July 24, 2019 [3 favorites]


Well, I was taught BBC BASIC at school and again at university and I still didn't have a clue at the end of either. So in your face, Two-Bit History.
posted by biffa at 8:44 AM on July 24, 2019


I think also, fundamentally, I am coming from a perspective that sees having the literacy to tinker with and build things that are entirely based on interfacing software, and the literacy to tinker with and build things that are based on interfacing hardware with software, as equivalently important skills.

Bear in mind that I literally do both things for a living. My sound processing setup I built alongside my PI at the same time as I learned how to analyze acoustic data, from soldering the speaker cables from a huge round of coaxial cable on up to writing the routines in MATLAB to control the damn thing. I've also either built or modified much of the software that controls it, from the Visual Basic routines that the processor runs on to the MATLAB routines that now use ActiveX controls to automatically run and trigger those routines, to the playout files that it produces. This rig is a huge part of my PhD, and I understand it better than probably anyone else in my lab, and I am not formally trained to do any of it. None of us are. Wish we were, but again: I'm a fucking biologist, and all my formal training is in biology.

So I've been spending the past, oh, three or four months trying to troubleshoot an issue that is pure software, working out why MATLAB isn't talking correctly to the ActiveX controls within a routine. That's all a matter of wrapping my head around the software, including languages I don't speak (whatever the hell the ActiveX controls are done in) and those I hate speaking (proprietary Visual Basic is the worst). I have a really hard time understanding why a deeper knowledge of the hardware would make this problem less of a pain in the ass to solve and work out, as opposed to a more generalist approach to reading and processing varying forms of syntax. And I am incredibly glad to have resources like Codecademy to help me with those skills, because not everyone who needs these skills has the privilege to have formal training or support in it.
posted by sciatrix at 8:54 AM on July 24, 2019 [6 favorites]


Teaching someone to "code" is superficial if they don't learn the basics of computation that underpin it all.

I could just as well say that it's superficial that all our mainframe people don't know anything about web accessibility and treat the front end like it must be magic and automatic. The closest to the metal that I care about is the JVM, and I don't even want to care about the JVM. On the other hand, I care a lot about JavaScript and CSS. The tech that people use every day requires a ton of people who are specialists in all kinds of things. It's important for some of those people to understand how their operating system works. It's important for some of those people to understand how their CPU and RAM and hard drive actually work. But for a lot of us, those things are not actually relevant to our daily output, and they are not any more "real" necessities for software development than understanding how Chrome handles DOM events.

The problem with Codecademy in this context is that they sell themselves as getting you career-ready, but what they actually have is basically a 101-level intro program that won't get you a job without serious networking. But the problem is not that they need more "basics of computation"; it's a lack of practical application on projects that resemble real-world situations.
posted by Sequence at 9:02 AM on July 24, 2019 [5 favorites]


coding and fibercraft and the way that they are really all very similar

I always enjoy sharing crochet patterns with non-fiber-arts-enthusiast programmer friends and pointing out that they are a form of assembly language, complete with some of the features they'd expect from languages that are processed by a computer rather than by our brains and hands.
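(To make the analogy concrete, here's a toy Python sketch that "executes" a crochet-style pattern, treating the abbreviations as opcodes and the counts as repeat instructions. The two-opcode instruction set and the pattern are invented for the example.)

    # Toy interpreter for a crochet-style pattern, to make the
    # assembly-language analogy concrete. The "opcodes" (ch = chain,
    # sc = single crochet) and the pattern are invented examples.
    OPCODES = {
        "ch": "make a chain stitch",
        "sc": "make a single crochet",
    }

    def run(pattern):
        for instruction in pattern.split(","):
            count, op = instruction.split()   # e.g. "3 ch" -> repeat ch 3 times
            for _ in range(int(count)):
                print(OPCODES[op])

    run("3 ch, 2 sc, 1 ch")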
posted by asperity at 9:23 AM on July 24, 2019 [7 favorites]


sciatrix, you should be an electrophysiologist.
posted by biogeo at 9:28 AM on July 24, 2019


biogeo: sciatrix, you should be an electrophysiologist.

"Sound processing setup... acoustic data... MATLAB... PhD... fucking biologist"

yup checks out
posted by adrianhon at 9:31 AM on July 24, 2019 [1 favorite]


I came into this lab emphatically declaring I hated brains

what is my life, what are my choices

I will say that it was very validating when we got two new postdocs who had done acoustic work and both of them were slightly terrified of my system
posted by sciatrix at 9:43 AM on July 24, 2019 [1 favorite]


It was necessary for the CLP to explain hardware and electronics for two reasons. When it was conceived, society was far less steeped in electronic devices. Computers might as well have been powered by tiny electrical elves for all the public knew, so the "literacy" part of the CLP had to dispel the myths. Secondly, the hardware was much closer to the user back then. There was no OS managing tasks and mediating access to disks and files on a micro. An entire system disassembly could fit into a paperback. It was possible for one person to fully understand an entire microcomputer, while now … I don't even know what I don't know about computers. The hardware mattered because you could control all of it and you had to work within its severe limitations to get anything done. So point not remotely missed.

If you want to get your BBC BASIC on without retro hardware or having to dabble in the so-weird-it-hurts RISC OS for Raspberry Pi, Richard Russell has released BBC BASIC for SDL 2.0 as a cross-platform free development package. Richard's been involved in BBC BASIC since it was just a spec document. If you've ever used BBC BASIC on anything but an Acorn or ARM machine, Richard wrote the interpreter. He's done amazing things for computing in the UK, but his health is failing now. Somebody please interview him while he's still with us …
posted by scruss at 10:02 AM on July 24, 2019 [4 favorites]


Tutorial: How to set the "kids these days" DIP switch back to off on Geezernix.
posted by srboisvert at 10:41 AM on July 24, 2019 [3 favorites]


I am a CS-degree educated programmer who was taught a lot of the ground up principles mentioned in the article, but like many/most people of my era, I got my start doing things like making Commodore 64 sprites of underpants fly randomly around the screen and general dickering about, with precisely zero understanding of the principles.

My point, such as it is, is that environments for principle-free dickering about, ones that spark curiosity, are, I'd say, more important than being taught the underlying principles of things, because why on earth would you want to learn about them otherwise? And thankfully today's environment provides a wealth of them.
posted by Jon Mitchell at 10:54 AM on July 24, 2019 [6 favorites]


Layers of abstraction can be wonderful for productivity, but it's very powerful to know what's going on a layer or two down from the level you normally work at. That is how you debug things, and how you gauge whether the resources you are using are reasonable and the performance you are getting is fair.

In the 8-bit days that meant knowing things all the way to the metal, but now there are more layers. One or two down is enough to be useful.
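(In Python terms, for instance, "one layer down" might just mean peeking at the bytecode the interpreter actually executes; the standard library ships a disassembler for exactly this.)

    # One layer down from Python source: the bytecode the CPython
    # interpreter actually runs, printed by the standard-library dis module.
    import dis

    def total(xs):
        acc = 0
        for x in xs:
            acc += x
        return acc

    dis.dis(total)   # prints LOAD_FAST / FOR_ITER-style instructions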
posted by w0mbat at 11:38 AM on July 24, 2019 [2 favorites]


sciatrix: Knowing how to work at a lower level ultimately buys you the freedom to stop stitching other people's flawed and incompatible black boxes together and build a single coherent working machine from scratch.
I understand if that is not practical for your project, but for many projects that is the best solution once you know what you want.
posted by w0mbat at 11:45 AM on July 24, 2019 [2 favorites]


Couldn’t Codecademy replace a lesson or two with an interactive visualization of a Jacquard loom weaving together warp and weft threads?
Don't you agree that Grod should have demonstrated how to knap an obsidian knife before he jumped right into bronze working, Thog?

---

A lot of this article boils down to "back in my day, harrumph," but there's a very serious problem here. What people were taught in the '80s was a model of how a computer works. It was a decent approximation of how a machine of that era operated. From a low level, today's machines are just unrecognizable. You can't just boot up and write a byte to memory, because that memory is now dynamic RAM that needs to be refreshed at scheduled intervals. Because you need to pass through multiple cache memories and internal buses to reach that memory. Because you need to communicate with an entirely different processor just to set up that memory. Because the "you" in that sentence can be any one of several different processing cores with different capabilities, pipeline depths, and clock speeds.
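(You can watch one of those layers leak through even from Python: in a C-ordered NumPy array, copying one row is a contiguous read, while copying one column hops across the whole buffer, one cache miss at a time. A rough sketch, not a proper benchmark; exact numbers vary by machine.)

    # Rough demonstration of the memory hierarchy leaking through a
    # high-level abstraction. In a C-ordered NumPy array a row is
    # contiguous in memory but a column is strided across the whole
    # buffer, so copying a column is much slower per element.
    import timeit
    import numpy as np

    a = np.random.rand(4000, 4000)   # ~128 MB, C order: rows contiguous

    row = timeit.timeit(lambda: a[0, :].copy(), number=1000)  # contiguous
    col = timeit.timeit(lambda: a[:, 0].copy(), number=1000)  # strided

    print(f"row copy: {row:.4f}s  column copy: {col:.4f}s")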

A modern computer looks almost nothing like the old '80s model of how a computer operated, but programmers haven't changed, and as a result there is a huge amount of hardware, firmware, and software dedicated to making sure that modern programmers can still pretend that this is "how" a computer works. But of course, it really doesn't work that way, and that's why we still end up blindsided by Spectre, Meltdown, Rowhammer...

I think that teaching the old model of how computing works is still useful, but that's primarily because we're still building these massive emulation layers to make it appear valid. I'd love to see someone out there just hold their nose and try to explain how one of these beasts really works, because it would be a fascinating dive.
posted by phooky at 12:06 PM on July 24, 2019 [12 favorites]


Others have said it more succinctly above, but...
biogeo: sciatrix, you should be an electrophysiologist.
As technology advances and things that were hard become easy, then reliable, then commodity, specializations evolve at each level. Those who helped drive the early technology are often annoyed at this, but it's the world they enabled by standardizing and generalizing lower-level principles.

This is really what we're discussing. It's entirely possible to be a very successful interior designer or carpenter without being an electrician or building contractor. Many successful folks of any role might credit their familiarity with other domains as relevant to their success but others may be completely successful with a deeper specialization in one domain and relying on competent project management to ensure all the pieces come together. Each endpoint of that spectrum thinks the other is misguided.

Still others may imagine their expertise spans all layers, but find that the lack of specialization relegates them to many similarly structured jobs, each with unique challenges totally invisible to the customer (and thus valued less).

All I'm saying is there is not an intrinsic superiority to understanding the full stack of computing, because as often as it guides you away from the fallacy of "don't worry about the resources or performance at scale, that's a problem for the JVM/cluster/cloud/load balancer," it can just as easily guide you to the fallacy of "optimize early like you're running on bare metal, eat the cost of maintainability or readability to increase performance at any cost."

Further, there are software and data abstractions that greatly frustrate electrical engineers, due to the hundreds of layers of abstraction invented (and often perfected, at least to the point of "it basically usually works") in the last five decades. I've known EEs who didn't trust object-oriented programming because it seemed inefficient to keep copies of the executing code with each object instance (you don't), so you never know what's actually running (you do). But at a deeper level, they're kind of not wrong, in that OO has produced a lot of programmers who like to ignore complexities like object lifecycle, to the point that we worked very hard to get good at automatic garbage collection, which is good enough for a huge majority of cases but can still get you in trouble if you aren't aware, at some level, that resources can eventually be overwhelmed in ways you don't expect.
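(That first worry, at least, is easy to dispel empirically. In Python, say, a method's code lives on the class and every instance shares it; a two-line check shows it. The Sensor class here is a made-up example.)

    # Methods are not copied into each object: the code lives on the
    # class, and instances just look it up there.
    class Sensor:
        def read(self):
            return 42

    a, b = Sensor(), Sensor()

    # Both bound methods wrap the very same function object on the class.
    print(a.read.__func__ is b.read.__func__)   # True
    print(a.read.__func__ is Sensor.read)       # True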

Then something like a catastrophic security bug rooted in Intel's branch prediction shows up, and the lowest-level engineers scream "vindication!" (and they're right), while ignoring the decades of problems solved and value created by folks who never really cared about branch prediction as anything other than a component of aggregate performance. And in the end, it is fixable at some aggregate cost, where those working at the higher level just have to throw more cores at the same problems; eventually that gets factored into the cost of business, and on we go.
posted by abulafa at 1:27 PM on July 24, 2019 [6 favorites]


Zachtronics games are pretty great at presenting computer science concepts in an entertaining way.
posted by snuffleupagus at 2:13 PM on July 24, 2019 [2 favorites]


Trying to write software without some understanding of the underlying hardware is a good way to build bad software, especially if you are working in constrained environments.

"especially if" meaning here "only if". Most people don't even have to worry about things like cache lines, much less circuit design.
posted by kenko at 3:12 PM on July 24, 2019 [3 favorites]


It's hard not to read a subtext of class insecurity into this argument: namely, a fear that college-educated professionals will be crowded out by hungry amateurs who have, "in the hour or two they have each night between dinner and collapsing into bed", learned just enough programming to "level up their careers".
posted by Pyry at 1:34 AM on July 25, 2019 [2 favorites]

