Let's Not Dumb Down the History of Computer Science
January 28, 2021 10:43 PM

How has mathematics managed to escape this so far? I suppose it's because historians of math have always faced the fact that they won't be able to please everybody. Historians of other sciences have the delusion that any ordinary person can understand it, or at least they pretend so. A 2200-word edited transcript of a talk Donald E. Knuth gave in 2014 about historiography in computer science.
posted by cgc373 (23 comments total) 31 users marked this as a favorite
 
This was a fascinating read. Absolutely love Knuth's perspective on this, as on nearly everything else I've seen him talk about (which has admittedly all been various aspects of computing and computing history).
posted by Dysk at 11:11 PM on January 28, 2021 [1 favorite]


I have a master's degree in computer science and would be thrilled to retrain as a historian and do just this sort of work. But I think the audience for this kind of history is vanishingly small. It would almost certainly be something I did for kicks post-retirement.
posted by potrzebie at 11:42 PM on January 28, 2021


Actually it would be a great PhD project for someone in early retirement
posted by mbo at 11:46 PM on January 28, 2021 [1 favorite]


The skills required to do this kind of work are handsomely paid in industry. I wonder how much that fact has to do with the lack of such work.
posted by valdesm at 12:25 AM on January 29, 2021 [5 favorites]


Wow, OK. I'm going to have to share this with all my history of maths & computing colleagues, because it reads extremely problematically to me, and absolutely does not reflect the field I work and teach in. His reading of 'internal' versus 'external' history is totally off base, and the idea that one is 'dumbed down' is actively offensive. Historical research isn't less worthy than maths research; historical theory and methodology isn't intrinsically 'dumber' than STEM theories and approaches. It's also, as far as I can tell, a rant focusing on one particular (retired, old-school) historian, not on history or sociology of STEM as a diverse, healthy, activist academic discipline.

Part of the conclusion is solid: if you want good history of computer science then you need to fund it, and you need to fund it in such a way that people with training in computer science will be willing to *retrain* as historians and do research (or so that historians can retrain in computer science). But to do that you also have to respect history and other humanities work as fields in their own right, and dismissing them as 'dumbing down' and 'catering for the masses' is a terrible way to start that process.
posted by AFII at 12:44 AM on January 29, 2021 [13 favorites]


Hmm, the link is down for me. But here's the recording of the lecture.
posted by St. Oops at 12:51 AM on January 29, 2021 [2 favorites]


I think there's a clear place for history of science that's more targeted at practitioners of that science than to laypeople. I agree he makes that point very patronizingly, but it absolutely sucks how many of the most fundamental documents of computer science history are locked up in various corporate and academic repositories, never to be seen by people who have the background to understand them at a deep level and write about them for colleagues (or potentially by anyone else before bit rot sets in!). Computer science as a field has amnesia and sometimes it feels like it's by design.
posted by potrzebie at 12:54 AM on January 29, 2021 [3 favorites]


I didn't read him as being dismissive of non-technical history. If I understand correctly, he refers to non-technical history as encompassing points 5 and 6 in his list of reasons why he reads history, all of which he views as important. I think his defensive tone reflects the missing points 1-4 rather than any inferiority of 5 and 6.

I don't read as much history as I probably should, so I can't assess whether his claim about the "dumbing-down" of modern computer-science history is fair, but I'm sympathetic to his point. In a sense, the history being collected is "only" a humanistic history that we might all find relatable. This is good, but what's missing is a scientific history that could help scientists become better scientists. Although we might worry that the audience for such history is too narrow, perhaps we should instead hope that such history could help non-scientists, with some effort, understand the actual process of science.
posted by Alex404 at 1:48 AM on January 29, 2021 [4 favorites]


what's missing is a scientific history that could help scientists become better scientists.

It's not missing, though. There are heaps of dedicated specialists in history and STS departments globally writing this sort of history - and moreover actively teaching it to STEM students. What's grating on me is exactly this assertion that it doesn't exist. It does. It doesn't get the *money* that the CS Department does though, and it's the first module to get cut when budgets get tight (what use is history to a computer programmer? lol, humanities!!).

It's pushing on a bigger sore point - that attitude that 'oh, I'll retire and write a little history, what a lovely project' (when no one in their right mind would say 'when I retire from being a historian I think I'll take up neurosurgery for fun!'), which is so often said by people who haven't actually spent the time figuring out who is already doing this work and where they can read about it, and how they can contribute to the visibility of a struggling profession.

The idea that historians of mathematics and computer science are not writing any history that is for mathematicians and computer scientists, with an active desire to help them learn from the past, is just bollocks. Unfair bollocks at that.
posted by AFII at 2:14 AM on January 29, 2021 [15 favorites]


Historians of other sciences have the delusion that any ordinary person can understand it, or at least they pretend so
Heritage professional and non-academic historical researcher checking in. There’s a distinction to be drawn between computing history and computing heritage; the first can and really should be comprehensible to the lay reader, or at least to anyone affected by computing (i.e. the majority of people on the planet); the second is the profession’s own responsibility to itself, to be interpreted, as a courtesy, for the rest of us. They both happen to involve the past, but that’s only coincidence.

Professional/vocational and business history is old, old; there’s a vast amount of history & heritage work on lots of fields. Take the retail industry, to give one example I know a bit about: selling and buying consumer items is incredibly important, and consumer culture shapes our lives. Is it necessary, to understand retail’s impact on popular culture, consumer culture, and human experience, to explain in detail what an SKU database is in the context of point-of-sale systems? Maybe, or maybe not. Is it therefore dumbed-down?
posted by Fiasco da Gama at 2:26 AM on January 29, 2021 [5 favorites]


[for what it’s worth I once asked an extremely successful computer scientist, and quite a rich man, what background he got in his CS degree in the history or philosophy of science. He shrugged and said they did an ‘ethics’ week where someone told them not to build killbots, and something about the Space Shuttle’s O-rings]
posted by Fiasco da Gama at 2:53 AM on January 29, 2021 [3 favorites]


Actually it would be a great PhD project for someone in early retirement

Believe it or not, research is work, i.e. not the thing you retire to do.
posted by hoyland at 3:30 AM on January 29, 2021 [4 favorites]


Fiasco da Gama: Add on "something something THERAC-25" and you basically have what I got in my CS programme in the mid-90s.
posted by rum-soaked space hobo at 3:50 AM on January 29, 2021 [5 favorites]


...which is so often said by people who haven't actually spent the time figuring out who is already doing this work and where they can read about it, and how they can contribute to the visibility of a struggling profession.

I appreciate your perspective on this, but I think you're being quite uncharitable to Knuth. He is, after all, responding to a historian who tried to figure exactly this out, and apparently celebrated the decline in specialist history. And, apparently, it's the historian (Martin Campbell-Kelly) who suggested that Knuth moonlight as a historian, which Knuth does not appear to have taken very seriously.

I believe you that there's good historical work on exactly these subjects, but this work has been somehow passed over by both the historian Campbell-Kelly, and the computer scientist Knuth (and in my own studies of philosophy and computer science I'm also inclined to agree that good work is hard to come by). Perhaps you have links or cites you could provide for literature that fits the bill?
posted by Alex404 at 4:50 AM on January 29, 2021 [7 favorites]


Thanks for posting this, it's super fascinating. I think Knuth is so tragically out of touch that it's almost cute, were it not for his certain insistence on a diagnosis that so completely misses the mark. I happen to have studied computer science at Stanford, and as far as I can see it, the problem is so clearly the unprecedented reality of so-called computer 'science'. I fear that I may come off as needlessly dismissive here, perhaps even strident, but there's only so much I can say, even with posting under a sock puppet. I've gone on to work in computational linguistics research, and the fact of the matter is: profit over everything, including and very necessarily, science. The biggest names in the field are so obviously anti-science that it's hard to stomach sometimes.... God, the stories I have! Linguistics in the academy, as everyone knows, is a relatively contemporary affair, and it sickens me to know deep down that there won't be a place for beautiful and humanistic---and scientific---work for very much longer, given the ways that capitalism has, for a very long time now, found perfect bedfellows in elite academic institutions.

Alright, it's too early for tears; time for coffee.
posted by sadierose at 7:12 AM on January 29, 2021 [7 favorites]


The history of computer science is how society is affected by it. Dijkstra's source code dumps in DEK's basement and “completely novel and now unknown ideas for software development” are minor footnotes. The purpose of a system is what it does, and I saw the computer industry marginalize women and people from under-represented groups, all in the pursuit of this “science” ferda.
posted by scruss at 9:49 AM on January 29, 2021 [2 favorites]


The history of the steam engine cannot be understood outside the business and cultural changes of its time; from even slight reading, it was as techno-cutthroat as any of the giant startups in Silicon Valley, if not more vicious. Historians would love to have the technical documents that were vital trade secrets of the time.

What would seemingly be possible is archiving live documents from Intel, Google, and all the rest into a hard, time-based, algorithmically locked repository that would become accessible to historians a hundred years in the future. Hmm, interesting research problem: how to prove it's really 100 years in the future and not just a tweaked clock.
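
(One known answer to that last question is the Rivest-Shamir-Wagner "time-lock puzzle" from 1996: hide the key behind t modular squarings that can only be done one after another, so real wall-clock time has to pass no matter how many machines you throw at it. A toy Python sketch of the idea, with made-up tiny parameters that are nowhere near secure:)

    def make_puzzle(secret, t):
        # The puzzle creator knows p and q, so Euler's theorem gives a shortcut:
        # 2^(2^t) mod n equals 2^(2^t mod phi(n)) mod n, cheap to compute.
        p, q = 1000003, 1000033          # toy primes; real ones would be huge
        n = p * q
        phi = (p - 1) * (q - 1)
        e = pow(2, t, phi)               # the creator's cheap exponent
        key = pow(2, e, n)               # = 2^(2^t) mod n
        return n, secret ^ key           # hide the secret by XOR with the key

    def solve_puzzle(n, locked, t):
        # The solver has no shortcut: t squarings, inherently sequential.
        key = 2
        for _ in range(t):
            key = key * key % n
        return locked ^ key

    n, locked = make_puzzle(secret=0xCAFE, t=10_000)
    print(hex(solve_puzzle(n, locked, 10_000)))   # 0xcafe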
posted by sammyo at 11:13 AM on January 29, 2021 [1 favorite]


I was interested in why Leibniz was working on combinatorics and found a discussion in Knuth's Combinatorial Algorithms, Part 1, p. 500:
A Jesuit priest named Bernard Bauhuis had composed a famous one-line tribute to the Virgin Mary, in Latin hexameter:
Tot tibi sunt dotes, Virgo, quot sidera caelo. [roughly: "You have as many virtues, O Virgin, as the sky has stars."]
His verse inspired Erycius Puteanus, a professor at the University of Louvain, to write a book presenting 1022 permutations of Bauhuis's words ... The idea of permuting words in this way was well known at the time; such wordplay was what Julius Scaliger had called "Proteus verses" ... A natural question now arises: If we permute Bauhuis's words at random, what are the odds that they scan? ... Leibniz raised this question, among others, in his Dissertatio de Arte Combinatoria (1666), a work published when he was applying for a position at the University of Leipzig. At this time Leibniz was just 19 years old, largely self-taught, and his understanding of combinatorics was quite limited ...
You can see the tot tibi problem on page 73 of Leibniz's dissertation. The first correct answer was given in 1692 by Jacob Bernoulli, but it was not until the late 20th century that the problem could be understood as an instance of the NP-hard "exact cover" problem.
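
(For anyone who hasn't met it: an exact cover problem asks you to pick, from a family of subsets, some that together cover every element of a universe exactly once; the hexameter question becomes one when word placements are the subsets and metrical slots the elements. A toy backtracking sketch in Python, nothing like Knuth's dancing-links implementation:)

    def exact_cover(universe, subsets, chosen=()):
        # universe: set of still-uncovered elements; subsets: name -> set.
        if not universe:
            return list(chosen)                # everything covered exactly once
        x = min(universe)                      # branch on one uncovered element
        for name, s in subsets.items():
            if x in s and s <= universe:       # covers x, no double-covering
                hit = exact_cover(universe - s, subsets, chosen + (name,))
                if hit is not None:
                    return hit
        return None                            # dead end: backtrack

    subsets = {'A': {1, 2}, 'B': {3, 4}, 'C': {5}, 'D': {2, 3}, 'E': {1, 4, 5}}
    print(exact_cover({1, 2, 3, 4, 5}, subsets))   # ['A', 'B', 'C']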
posted by cyanistes at 2:56 PM on January 29, 2021 [4 favorites]


I appreciate your perspective on this, but I think you're being quite uncharitable to Knuth. He is, after all, responding to a historian who tried to figure exactly this out, and apparently celebrated the decline in specialist history.

Knuth is literally calling that historian’s approach “dumbed-down,” though. This is not a rhetorical strategy likely to inspire a whole lot of charity. And it’s not just a tit-for-tat issue: the title of Knuth’s essay indicates that he either doesn’t accept or doesn’t understand—and in any case doesn’t respect—how historians think about fundamental issues for the work they do.
posted by No-sword at 5:07 PM on January 29, 2021


Knuth is clearly naïve to think that historians would work, of their own volition, on the intellectual history of computer science. This is a project that is primarily of interest to computer scientists, and so computer science is going to have to provide the impetus and funding. Historians are naturally going to work on topics that are of historical interest.

But if historians leave the intellectual history of computer science up to the computer scientists, they are not going to be happy with the results! A lot of whiggish interpretation of the history of mathematics was done by mathematicians before historians became interested in the subject.

It's natural for someone who knows all about the exact cover problem to look back at Leibniz's work and extract the parts that seem to be an early reflection of the modern concept, but this is presentism, and probably leads to misunderstanding what Leibniz was up to. For example, the very first thing in Dissertatio de Arte Combinatoria is a "Demonstratio Existentiae Dei", a logical proof of the existence of god. How are we supposed to interpret this? As far as I can tell, this "proof" is a mostly worthless piece of equivocation, except that postulate 4 says that one can take an infinite number of things and reason about them as a whole ("etiam infinitis, intelligi possit, quod de omnibus verum est ... quod ipsum erit Totum"), which seems (with hindsight) to be groping after the theory of infinite sets. But this is my intellectual bias! I'm motivated to interpret Leibniz as embedding plums of interesting mathematics in a pudding of religion, and I need a historian to set me straight.
posted by cyanistes at 1:59 AM on January 30, 2021 [1 favorite]


Thomas Haigh wrote an interesting response to Knuth's talk in the January 2015 Communications of the ACM. He explained why Knuth's "dumbed down" comment was misdirected:
Knuth's complaint that historians have been led astray by fads and pursuit of a mass audience into "dumbed down" history reflects an assumption that computer science is the whole of computing, or at least the only part in which historians can find important questions about software. This conflated the history of computing with the history of computer science. [...] To call such work "dumbed down" history of computer science, rather than smart history of many other things, is to misunderstand both the intentions and the accomplishments of its authors.
And he explained why Knuth's expectations are unrealistic (unless the field of computer science can change):
Historians judge the work of prospective colleagues by the standards of history, not those of computer science. There are no faculty jobs earmarked for scholars with doctoral training in the history of computing, still less in the history of computer science. [...] Thus the kind of historical work Knuth would like to read would have to be written by computer scientists themselves. Some disciplines support careers spent teaching history to their students and writing history for their practitioners. Knuth himself holds up the history of mathematics as an example of what the history of computing should be. It is possible to earn a Ph.D. within some mathematics departments by writing a historical thesis [...] As Knuth himself noted toward the end of his talk, computer science does not offer such possibilities.
posted by cyanistes at 8:08 AM on January 31, 2021 [2 favorites]


Isn't one huge barrier to researching/writing the type of history that Knuth is calling for that most of the technical details he wants recorded and analyzed are proprietary and not something the individual or company that owns them wants publicly available? Or am I missing something about the timescale he is talking about?

He references "marvelous breakthroughs in techniques about how to render scenes and pack data and do things in parallel and coordinate thousands of online users" in the gaming industry, which, sure, but wouldn't e.g. Activision be pretty ticked off and be dialing their lawyers if someone wrote down and published their server load balancing techniques or whatever?
posted by Wretch729 at 7:31 PM on January 31, 2021


I don't understand his problem.

TBH I haven't checked for half a decade now, but sites like Gamasutra have (or certainly used to have) deeply technical articles (and some humorous ones, too: "No Twinkie!") by team leads and programmers about their games: postmortems on the development process, explanations (including the actual math/code used) of how Freelancer's volumetric dust clouds were done, pathfinding algorithms.

One utterly fascinating page I found somewhere (I think it was even posted here on the blue!) was about the evolution of camera/movement in 2D platform games.

I have, somewhere in my digital caves, a complete archive of a now-defunct forum which was all about early MMOs, with the programmers of Everquest and Runescape discussing everything from server clustering to economies in MMOs.

Programmers LOVE to share. From Carmack's source code (and his use of the Fast inverse square root) to the code NVidia and Intel show off (to be used on their products, to be sure) to RSI/Star Citizen's explanations of their server/gfx/etc code.

To even Stack Overflow, where you can watch a platform develop by comparing answers to the same problem across the years.
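
(That fast inverse square root is itself a nice example of how shareable this stuff is: a bit-level hack that has been picked over publicly ever since the Quake III source was released. A rough Python transcription of the trick as it circulates, just to show the shape of it:)

    import struct

    def fast_inv_sqrt(x):
        # Reinterpret the 32-bit float's bit pattern as a 32-bit integer.
        i = struct.unpack('<i', struct.pack('<f', x))[0]
        # The famous magic constant yields a decent first guess at 1/sqrt(x).
        i = 0x5f3759df - (i >> 1)
        y = struct.unpack('<f', struct.pack('<i', i))[0]
        # One Newton-Raphson step sharpens the guess.
        return y * (1.5 - 0.5 * x * y * y)

    print(fast_inv_sqrt(4.0))   # ~0.499, vs. the exact 0.5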

Go to a conference, Knuth! Read some articles online! As far back as 'The C Programming Language', the Gang of Four, the GPU Gems books, the Graphics Programming Black Book, papers on neural nets/ML/etc ... it's all out there!
posted by MacD at 3:44 AM on February 1, 2021




This thread has been archived and is closed to new comments