Why are you still recommending the Dragon book?
October 29, 2019 10:18 AM

Are you a self-taught software engineer, bootcamp grad, precocious high school student, or do you just have a few wintry months to kill? Time to Teach Yourself Computer Science with an opinionated list of the best autodidactic resources on a range of CS topics!

Many of the resources are available for free online:
Programming: SICP (aka the Wizard Book) and accompanying lectures, How To Design Programs (you're going to want DrRacket for these)
Architecture: The Elements of Computing Systems (aka Nand2Tetris) and accompanying course
Algorithms and Data Structures: lectures for the Algorithm Design Manual
Mathematics: Discrete Mathematics (PostScript file!), Essence of Linear Algebra video series
Operating Systems: Operating Systems: Three Easy Pieces
Networking: Wireshark Labs from Computer Networking: A Top-Down Approach
Databases: Readings in Database Systems (aka the Red Book), UC Berkeley CS 186
Languages and Compilers: Stanford Compilers course, Make a Lisp
Distributed Systems: Distributed Systems
posted by theodolite (38 comments total) 107 users marked this as a favorite
 
I've tried many times to work through some of these books, and I've come to the conclusion that "working through the book" is not a sufficiently motivating goal on its own for me. I think the problem is that these books tend to present their subjects in a bottom-up fashion, which is efficient if you're reading them for a college course and you're fairly confident that you'll cover all the material. For someone like me who's reading them in the hope that I'll get some useful knowledge I can see and apply in the real world, it just delays the satisfaction so long that my initial burst of motivation runs out before I learn anything interesting.
posted by J.K. Seazer at 10:37 AM on October 29, 2019 [10 favorites]


I was half expecting from the title that this would be tutorials for coding on PDPs and Alphas...

I'm glad that CS theory is popping up more, but can someone explain to me the recent resurgence of LISP? What was ground zero for that? It's like waking up one morning and suddenly everyone is going gaga over Forth or something.
posted by phooky at 10:50 AM on October 29, 2019 [2 favorites]


I thought that said "Treat Yourself, Computer Science" and now I am a little sad.
posted by gwint at 10:55 AM on October 29, 2019 [7 favorites]


If you (like me) are a seasoned or semi-seasoned software developer who missed out on the whole Computer Science thing and still don't understand what the hell lambda calculus is, The Impostor's Handbook is an interesting read. It goes into some of these topics in a more top-down fashion, with lots of code samples.
posted by The Lurkers Support Me in Email at 11:00 AM on October 29, 2019 [13 favorites]


It's like waking up one morning and suddenly everyone is going gaga over Forth or something.

there is not a SNOBOL chance in hell of that happening
posted by thelonius at 11:01 AM on October 29, 2019 [9 favorites]


Something I'd add for databases: the CMU Database Group's YouTube channel.

Andy Pavlo is an excellent+engaging professor, and the videos come with slides+notes. I've dipped into most of the lectures from the "Intro to DB Systems" course to refresh my dated understanding of database internals, and I can't recommend it enough.
posted by ripley_ at 11:01 AM on October 29, 2019 [3 favorites]


I'm glad that CS theory is popping up more, but can someone explain to me the recent resurgence of LISP?

Paul Graham (of Y Combinator and Hacker News fame) got rich by selling his LISP-based company to Yahoo!, and he's written some essays about how great it is. So you now have a bunch of 25-year-old "pg-heads" trying to cargo cult their way into being the next AirBnB or whatever.
posted by sideshow at 11:04 AM on October 29, 2019 [11 favorites]


Aargh, missed the edit window, that's imposter. 🤦‍♂️
posted by The Lurkers Support Me in Email at 11:07 AM on October 29, 2019


thanks, sideshow, and ugh, that was a much sadder answer than i anticipated.
posted by phooky at 11:10 AM on October 29, 2019


Did Lisp ever go away? It's lived on in Scheme, Racket, Clojure, and Emacs, at least.
posted by floomp at 11:28 AM on October 29, 2019 [4 favorites]


Sideshow's not wrong, but there's more to it than that. I think there're a couple of things going on:

1. LISP is a genuinely useful set of ideas, and (in part as a result of pg's essays) people rediscovered that fact in, oh, 2005 or so. I'm one of those "pg-heads", or I was: I went through bits and pieces of SICP in high school. My feelings about pg, hn, and startups are a lot more mixed than they were then, but I still use the concepts I learned every day. (I'm a physicist, doing numerical methods research.) And I'm not the only one:

a. LISPy ideas show up all over the place, from Clojure to Julia (my favorite) to Python, and

b. I think the broad cultural acceptance of these tools (which, ime, genuinely make for more effective programming) owes a lot to pg's essays.

2. I've heard that after the death of the LISP machines in, what, the '80s, there was a long dry spell for anything other than straightforward imperative programming: these were the days of C, C++, Java, and Perl. (This is before my time, so I can't speak from experience.) Sort of like---and I suspect not unrelated to---the AI winter.
posted by golwengaud at 11:34 AM on October 29, 2019 [11 favorites]


I'm glad that CS theory is popping up more, but can someone explain to me the recent resurgence of LISP?

In addition to what sideshow said, I also assumed that Clojure's relative ease* of use as a prototyping language for data-crunching problems had something to do with it.

*The preceding remarks in no way represent the author's opinions re: LISP and/or the wisdom of porting it into the JVM
posted by Mayor West at 11:36 AM on October 29, 2019 [1 favorite]


A post about beginner CS resources that turns into a discussion about the relative merits of LISP-- welcome to our world, grasshoppers.
posted by gwint at 11:44 AM on October 29, 2019 [11 favorites]


I always liked The Turing Omnibus, because it gave you a taste of some of the niftier CS concepts without having to actually, y'know, work through any examples or prove you understood anything.
posted by RobotVoodooPower at 12:01 PM on October 29, 2019 [6 favorites]


The article's justification for learning Computer Architecture:
> If you don’t have a solid mental model of how a computer actually works, all of your higher-level abstractions will be brittle.

This seems like a strange justification to me. Are they talking about the programmer's mental abstraction of the machine, or the abstractions used in the program structure? If the latter, I would argue the opposite: knowledge of architecture leads to an emphasis on performance, which can lead to compromises in abstraction.

I still think one should learn architecture, but more for the sake of writing efficient code and understanding things like linkers, ABIs, etc.
posted by scose at 12:05 PM on October 29, 2019 [1 favorite]


but going through this you're gonna end up one of those insufferable functional programming types who write code that no-one on their team understands and writes all their own interface code to talk to the rest of the team's code base because the interfaces aren't "pure" enough. vi config dickering, clicky keyboards and fedorahood will surely follow.

look, I get that you have to do some mythologizing of the space to prove your value, and a little bit of job-security wrenching is fair enough, but writing a system that generates your company's JavaScript website updates in a custom Erlang/Haskell dialect with operators solely in fullwidth Unicode inside a Connection Engine virtual machine maybe doesn't return the best shareholder value.
posted by scruss at 12:15 PM on October 29, 2019 [10 favorites]


The Manga Guide to Databases probably belongs on the list. I picked this up for lulz, but ended up genuinely learning from it. It gave me an "Aha!" moment on a couple of core concepts I'd previously struggled with.

>If you don’t have a solid mental model of how a computer actually works, all of your higher-level abstractions will be brittle.

True, but: You'll never have a solid mental model of how a computer actually works, just increasingly low-level abstractions. Even if your understanding is at the level of, say, electron orbitals -- it's turtles all the way down. Abstractions are useful and necessary; just remember that they are abstractions.
posted by sourcequench at 12:22 PM on October 29, 2019 [2 favorites]


vi config dickering, clicky keyboards and fedorahood will surely follow

Surely, clicky keyboards are the first step (he types, loudly)
posted by supercres at 12:26 PM on October 29, 2019 [3 favorites]


This seems like a strange justification to me. Are they talking about the programmer's mental abstraction of the machine, or the abstractions used in the program structure?

Yeah, I think they're talking about your mental model of the machine. In fact, I would say that a good understanding of architecture is the best antidote for novice programmers addicted to spurious optimizations. I was recently implementing the Set card game in C so I could simulate many games and try to find some useful stats. The most basic game logic is the rule for determining which card needs to be added to a pair in order to form a "set", so I really wanted to make that code fast. After a few hours I had come up with a solution that used six bitwise operations on the binary representations of the cards, which were all packed into single bytes, and I was feeling pretty proud of myself. Then I realized that 81*81 bytes easily fits into L1 cache, and one lookup from cache is probably faster than six operations that can't be parallelized. And indeed, a simple lookup table ended up blowing my fancy bitbashing out of the water.
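
For the curious, a minimal sketch of the lookup-table idea, assuming a simpler encoding of cards as 0..80 (four base-3 digits, one per attribute) rather than the packed bytes described above:

#include <stdint.h>

/* third_card[a][b] is the unique card completing a "set" with cards a and b.
 * 81 * 81 = 6561 bytes, which sits comfortably in L1 cache. */
static uint8_t third_card[81][81];

static void build_table(void)
{
    for (int a = 0; a < 81; a++) {
        for (int b = 0; b < 81; b++) {
            int x = a, y = b, third = 0, place = 1;
            for (int attr = 0; attr < 4; attr++) {
                /* Per attribute: if the two values match, the third matches;
                 * otherwise it's the remaining value. (2*(x+y)) % 3 covers
                 * both cases. */
                third += ((2 * (x % 3 + y % 3)) % 3) * place;
                x /= 3;
                y /= 3;
                place *= 3;
            }
            third_card[a][b] = (uint8_t)third;
        }
    }
}

/* Hot path: one (almost certainly cached) load instead of a chain of
 * dependent bitwise operations. */
static inline int complete_set(int a, int b)
{
    return third_card[a][b];
}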
posted by J.K. Seazer at 12:34 PM on October 29, 2019 [7 favorites]


If you don’t have a solid mental model of how a computer actually works, all of your higher-level abstractions will be brittle.

As a practical matter, you don't and you can't. The architecture is an illusion anyway. Under the covers, modern processors execute speculatively and out of order, memory access is faked by caches, and memory itself has weird problems (e.g., rowhammer). You have to know about these things to produce the most highly performant code, but it's a dark art with a lot of unexplored corners, and very clever programmers have found ways to bust through the architecture abstraction to compromise system integrity.

And that's just CPUs. GPUs are even more obscure.

Anyway, I'm actually agreeing that understanding how a computer actually works is a good thing in many cases. And it's also a good thing that lots of people can be very productive in their domains without understanding architecture. But we're at a point where understanding the actual computer architecture is basically impossible, because it's proprietary and hidden beneath abstractions that are largely fictitious. It's only going to get worse, and everything you know will be wrong in the next generation, if it isn't wrong already.
posted by sjswitzer at 12:35 PM on October 29, 2019 [6 favorites]


Speaking of the Dragon Book....

You used to be able to write a pretty good compiler using only the techniques in the Dragon Book. (As good as hand-written assembly that you didn't slave over.) Sure, you could do better instruction scheduling and register allocation with more sophisticated techniques, but your gains would be on the order of 10% in most cases. That's a couple of months of Moore's Law (since rescinded). And there's a real benefit when your program runs in a way that corresponds closely to the way the source code is written. It's always struck me as peculiar that optimised "release" builds are so different from development "debug" builds: you can't quite symbolically debug the release code you ship to your customers! Is a 10% performance gain worth that? Apparently it is, but debugging customer problems from fragmentary core dumps and poor symbol interpretation is a chore at best and hopeless at worst.

(I've known guys who spent their careers writing optimizations for specialized microarchitectures: If we reorder the instructions just so, it will be faster for this processor version, but not others.)

But now we lean on the optimizer to untangle the mess created by generated code (templates, etc.) and you basically can't write a good compiler without heavy-duty data-flow analysis and rewriting. A cache miss is now so expensive that you can't afford to not hoist variables into registers, whatever cost that may be to debugging or exception handling. I've even written template libraries with extra code to give the compiler enough information to reliably refactor and eliminate that very code! It's a funny way to have to think, but here we are.
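
A tiny made-up example of the debugging cost: in a function like this, the load of *scale is loop-invariant, and at -O2 a compiler will typically hoist it into a register, so the per-iteration read you might want to watch in a debugger simply doesn't exist in the release build:

/* Hypothetical example (not from any real codebase): at -O0 the value
 * *scale is re-read from memory on every iteration and is easy to inspect;
 * at -O2 the compiler sees that the loop performs no stores, hoists the
 * load into a register, and the per-iteration memory access disappears,
 * which is part of why optimized builds debug so differently. */
double sum_scaled(const double *v, int n, const double *scale)
{
    double total = 0.0;
    for (int i = 0; i < n; i++)
        total += v[i] * *scale;   /* loop-invariant load, hoistable at -O2 */
    return total;
}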
posted by sjswitzer at 1:11 PM on October 29, 2019 [7 favorites]


there was a long dry spell for anything other than straightforward imperative programming

I can't speak for industry, but in my college in the early 90s LISP was one of the main teaching languages.

Of course, that really just means "1 or 2 profs preferred to teach in LISP". (The other language was C++, and really the majority of my CS courses had no actual programming, just algorithms and theory).
posted by thefoxgod at 1:15 PM on October 29, 2019 [1 favorite]


Oh, nice. As a schmuck with an art degree who gets trusted with CRM databases for some reason, I appreciate all the resources I can get.
posted by Phobos the Space Potato at 1:28 PM on October 29, 2019 [4 favorites]


vi config dickering, clicky keyboards

Oh

Oh no

I just...just like a full-size mechanical keyboard for the touch-typing feel of it, and I had to mess with my vi config because the default color scheme was rendering comments as dark blue on a black background, and I was going blind. PLEASE DON'T FEDORA ME
posted by Mr. Bad Example at 2:49 PM on October 29, 2019 [7 favorites]


To be fair, you’re not really a good example of this.
posted by Huffy Puffy at 4:01 PM on October 29, 2019 [14 favorites]


"LISPy ideas show up all over the place..."
Case in point: one of the longest-running scripting languages ever, Tcl, was just described as:

"I tend to call it 'bash bitten by a radioactive lisp'."


posted by aleph at 5:49 PM on October 29, 2019 [2 favorites]


As a schmuck with an art degree who is working on retraining into network admin, I also appreciate the Wireshark link.
posted by tautological at 5:55 PM on October 29, 2019


Tcl is csh all growed up into a real language. For better or worse.
posted by sjswitzer at 5:59 PM on October 29, 2019


I think there are still some aspects of architectural understanding that will continue to pay off in the long term, one of the biggest being memory models, atomic instructions and memory fences: although consistency guarantees vary between architectures, understanding the nature of those guarantees and how to build more strictly consistent abstractions on top of them is only going to become more relevant if you're working at that level.
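
As a minimal sketch of what I mean (C11 atomics; the names are made up): a release store paired with an acquire load is enough to publish data safely across threads, whatever the underlying architecture's ordering quirks are.

#include <stdatomic.h>

static int payload;          /* ordinary data being published */
static atomic_int ready;     /* flag; zero-initialized        */

/* Writer: fill in the data, then publish it with a release store. */
void producer(void)
{
    payload = 42;
    atomic_store_explicit(&ready, 1, memory_order_release);
}

/* Reader: once the acquire load observes the flag, it is guaranteed to
 * see every write the producer made before the release store, no matter
 * how weakly ordered the hardware's memory model is. */
int consumer(void)
{
    while (!atomic_load_explicit(&ready, memory_order_acquire))
        ;   /* spin (fine for a sketch; real code would back off or wait) */
    return payload;
}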

Also, I think the Lisp/FP bashing going on in here is unfortunate. It's no different from any other paradigm in terms of suiting how some people think, being ideally applicable to some domains and less applicable to others, or being vulnerable to being written in a hostile way, but it seems to me that the volume of nastiness directed towards it, usually aimed at a Pretentious Lisp Devotee strawman, thoroughly exceeds whatever the actual enthusiasts of that paradigm are putting out.
posted by invitapriore at 6:50 PM on October 29, 2019 [7 favorites]


I just...just like a full-size mechanical keyboard for the touch-typing feel of it

Awesome! But not so much in a shared office, where it is the aural equivalent of spending all day microwaving fish.
posted by thelonius at 7:07 PM on October 29, 2019 [3 favorites]


Unfortunately -- and I say this as a great admirer of the languages -- the straw men in question (always men) are quite real.
posted by ead at 7:45 PM on October 29, 2019


vi config dickering, clicky keyboards and fedorahood

Ah, looks like I'm safe as I don't wear hats and would never spend time on vi configs (too busy writing elisp....)

LISP is interesting, although other than the aforementioned emacs scripting I haven't used it in decades.

Awesome! But not so much in a shared office, where it is the aural equivalent of spending all day microwaving fish.

In the open office I work in (and previous ones), you'd struggle to hear a clicky keyboard above all the talking and other loud sounds. I'd trade for a space where the only sounds were 100 clicky keyboards in a heartbeat.

(The real answer, as it is to almost any workplace problem, is individual offices, which used to be less rare...)
posted by thefoxgod at 8:48 PM on October 29, 2019 [1 favorite]


Paul Graham (of y-combinator and Hacker News fame) got rich by selling his LISP based company to Yahoo!, and he's written some essays about how great it is.

Honestly the one good thing he's done...

There's a little more to it than that, though. Functional programming was coming back a bit anyway (for one thing, it's a good fit for certain kinds of concurrent/distributed applications), and while a lot of FP now isn't Lisp, it does inevitably shine some light on Lisp.
posted by atoxyl at 9:05 PM on October 29, 2019 [3 favorites]


In the open office I work in (and previous ones), you'd struggle to hear a clicky keyboard above all the talking and other loud sounds.

No, they make the situation worse; they don't get covered up by the other noise. The office smelling of Axe body spray doesn't make farting cool.
posted by thelonius at 2:36 AM on October 30, 2019 [1 favorite]


The office smelling of Axe body spray doesn't make farting cool.

Don't say the bad words. Someone will do both things in an attempt to prove you wrong.
posted by Abehammerb Lincoln at 10:50 AM on October 30, 2019 [3 favorites]


Any book suggestions along the lines of "Simple metrics to hopefully improve your company's underfunded and completely fucked IT organization?"
posted by Abehammerb Lincoln at 10:52 AM on October 30, 2019 [2 favorites]


> The office smelling of Axe body spray doesn't make farting cool.

oh this is the bad place...
posted by Reclusive Novelist Thomas Pynchon at 12:46 PM on October 30, 2019 [3 favorites]


Raises hand... never stopped being gaga over Forth. Everybody should give it a shot. Just like they should give every other sort of programming paradigm a bit of a shot. Even if it's just a few days reading through a "Learn You a X"-style introduction. Same goes for architecture. Not necessarily to know something inside and out, but just to get the general idea. I've been watching this dude's (Ben Eater) videos recently. Let's make a shitty graphics card out of breadboards and chips. Or "Hello World" on a 6502.

I sorta still think you need to take a (maybe shallow) tour, dipping toes into different things, just so that when something comes up you can think "oh, this reminds me of that" and you know where to start looking.
posted by zengargoyle at 3:17 PM on November 2, 2019 [1 favorite]




This thread has been archived and is closed to new comments