Built To Last
September 17, 2020 5:21 PM

When overwhelmed unemployment insurance systems malfunctioned during the pandemic, governments blamed the sixty-year-old programming language COBOL. But what really failed?
Mar Hicks discusses the past and future of COBOL for Logic Magazine.
posted by zamboni (99 comments total) 36 users marked this as a favorite
 
If you drive an unmaintained 60-year-old car around town and it suddenly breaks down, I don't think it's the car's fault.
posted by GuyZero at 5:44 PM on September 17, 2020 [19 favorites]


I really like this article, but I do not buy the argument in the middle about why Cobol is derided. The author positions Fortran as the opposite of Cobol -- complex, terse, high-prestige, associated with masculinity -- but people say all the exact same things about Fortran.

"There's actually still a lot of Fortran/Cobol code around." "Actually, knowing Fortran/Cobol is surprisingly still a useful skill." "You know, modern versions of Fortran/Cobol have a lot of nice features." There are even joke modern web frameworks for both of them -- Cobol on Cogs and fortran.io.

I think the reality is that, while the modern versions of both languages are probably good, old code was written in the old versions, and it's not always possible to make use of new programming language features when extending or maintaining old code. And it is the case that actually, programming languages in the 1970s were just worse and you don't want to use them if you can avoid it.

I think this accounts for some, but not all, of the derision and mockery applied to old languages. (Ask a physicist and you might be able to hear a story about horrible Fortran code that had been floating around their lab for 35 years.)

That section of the piece does have some good psychological insight into the bad parts of programming culture, but I am not completely convinced by it.
posted by vogon_poet at 5:49 PM on September 17, 2020 [5 favorites]


Yes, old languages are derided. That’s not actually the point the article is making. The two languages represent something of a philosophical fork in the road.

Imagine a tech industry that spent the last 50 years trying to improve upon the design goals that COBOL is an attempt to satisfy. More people able to read, understand, and ultimately to build things they need via code.

Instead, we live in a world where tech chose to spend 50 years on a path where your primary source of credibility was an ability to put up with things both obtuse and deliberately obfuscated, and adding more layers was how you got ahead.

Gatekeeping: a viable tactic for making money, if you don’t care about the folks who are shut out. It’s been true in other industries for centuries; today’s tech industry just pours a bunch of denial over the top to hide it.
posted by FallibleHuman at 6:15 PM on September 17, 2020 [25 favorites]


COBOL helped bring about the convergence of Computer Science and Computer Programming. In other words, the "scientists" were able to get into the programming business. So much so that, coming into the '80s, practically the entire "programming" portion of the industry ceased to exist. Yeah, the job name lives on, but the people that use it are wrong.

These types of articles just love to tell us how programming was dominated by women (which it was), but then go into how one day in the '60s a bunch of dudes decided that programming was for them and how they went "down" a level and forced all the women out, which is definitely not true.

Industry changes (of which COBOL was one) allowed a more direct approach, and eventually the "programmers" became redundant and then completely unnecessary. Think of it as how there was a time when an office worker would use a DictaPhone to dictate a memo that would then be typed up by the typist pool, but eventually personal computing came along and the same sort of office worker would type out the memo directly.

In my example, firing the typist pools en masse might be something to talk about, but these articles just love to assume all the office workers went downstairs and took the typists' jobs.

Anyway, false history aside, literally zero engineers have gotten fired because management thought their current staff used a technology that was too easy. That's because the type of management who fires their staff always assumes they can just find new people on every street corner in town, no matter what their stack is. They definitely did not think to themselves "good thing this isn't in C, because any dumbass can do FORTRAN!"

Also, the languages the author alludes to as being "contemporary" (C++ and Java) are 30 and 25 years old respectively.
posted by sideshow at 6:16 PM on September 17, 2020 [11 favorites]


I read Dr. Hicks as saying that FORTRAN was the macho language of the time. It may not be a particularly good example of one now. And the article notes how old C is.
posted by scruss at 6:18 PM on September 17, 2020 [3 favorites]


I was astounded by this article. I thought it was going to be yet another article on how some programmer with esoteric knowledge of an archaic language comes out of retirement to fix a legacy system. And it would be slightly newsworthy since it's pandemic-related.

Instead, there are surprising takes on programming language origins and gender, government maintenance vs. consumer-based obsolescence as well as a surprise root cause of the problem.

I'm an old tools/compiler engineer who's worked with all of the languages mentioned. This gives me a lot to think about...
posted by aworks at 6:20 PM on September 17, 2020 [9 favorites]


It's unfortunate that the blame for the buckling of unemployment systems around the country will probably stick to COBOL. Thanks for posting this article!

Speaking of languages older than C, according to the stats on this GitHub mirror of the R interpreter, 23.3% of it is written in Fortran.
posted by chinesefood at 6:26 PM on September 17, 2020 [1 favorite]


It occurs to me that criticism of Cobol in the '70s/'80s/'90s probably had more to do with these highly gendered "real programmer" attitudes, and limited specifically to that context, the stuff described in the article is a lot more convincing.

The article seems to slip back and forth a bit between modern criticism of the language and past criticism.
posted by vogon_poet at 6:26 PM on September 17, 2020 [3 favorites]


There are multiple ways to force an outcome. One way would be to choose an outcome and actively work to achieve it.

Another is to just remain passive. Ask for more proof. Rationalize. Find reasons (you hadn’t thought of previously) to not do something. Eventually people stop asking for change.

In the meantime, how can problems be your fault? You didn’t do anything! They’re probably just whiners.
posted by FallibleHuman at 6:31 PM on September 17, 2020 [5 favorites]


In order to care for technological infrastructure, we need maintenance engineers, not just systems designers—and that means paying for people, not just for products
Amen; and a built heritage comparison: when Australia was colonised, its architects and builders very quickly discovered that the Sydney basin has a lot of extremely high-quality sandstone. Most of NSW's 19thC official buildings are built from sandstone which is, on one hand, extremely beautiful and honey-coloured, but on the other hand relatively soft (compared to harder stone and brick) and given to wear. When modern building techniques came in in the late 19thC and early 20thC, the labour market for stone masons disappeared, as did the available quarries, which are now underneath some of the most expensive real estate in the world. Stonemasonry, and particularly decorative sculptural masonry, is nowadays a tiny, tiny niche.

The NSW Minister for Finance administers a sandstone stockpile (a strategic reserve of stone), and employs a smallish Heritage Stonework team of masons. Every now and then a clever Treasury wants to sell it off and allow 'the market' to provide the same services; but there's simply no other market for that kind of masonry: who builds anymore in Pyrmont sandstone, when it can't be quarried? The only organisation with the capacity to maintain NSW Government buildings—which actually means 'training new masons' and 'managing finite supply'—is the NSW Government itself.
posted by Fiasco da Gama at 6:46 PM on September 17, 2020 [30 favorites]


As someone who once spent some time as a summer intern in the early 90s attempting to convert a burglar alarm monitoring system that monitored burglar alarms all over the country from COBOL to C, it seemed to me that among COBOL's main flaws is that COBOL is just inescapably miserable to program in, and by comparison, C was a joy. One can become accustomed to anything, I suppose, but I pity the souls accustomed to COBOL.
posted by smcameron at 8:10 PM on September 17, 2020 [12 favorites]


It is an interesting design dichotomy: do you build a programming language for the programmer or for the machine? The latter seems to have largely won out, although one of the traditional arguments for keeping code minimal was to save memory; that has become less and less of a concern as time has gone by.

The first language my son learned to code in was INFORM, which has become more and more natural-language oriented. It's a great way to learn programming techniques using words that mean things when strung together in sentences. But I can also see the value in creating more streamlined grammars than any human language has.
posted by rikschell at 8:14 PM on September 17, 2020 [2 favorites]


I haven’t used INFORM but in my experience the problem with using “natural” language for programming is that the machine still isn’t natural — and will never be natural — so now instead of relying on a person to translate “human” thought into something the computer can run, you have another computer (the compiler/interpreter) to do the job. But that computer is no more “natural” than the one you were dealing with before, so it doesn’t really solve the underlying problem.

And now, instead of talking to a machine, the programmer has to talk to a machine pretending to be a person.
posted by bjrubble at 8:34 PM on September 17, 2020 [6 favorites]


One thing that bugs me is the tendency in modern languages to prefer brevity over clarity. I would rather my code be clear to any random developer than be sleek and lint-compliant.
posted by grumpybear69 at 9:06 PM on September 17, 2020 [16 favorites]


Fewer lines of code often lead to fewer bugs. Shorter programs are easier to debug, and there is less room to hide problematic logic and code smells.

It can go to the other extreme, true, where shorter code leads to obfuscation, either through less-readable or obscure syntax or, say, having to navigate multiple layers of indirection from chained functions.

I think striking the right balance comes down to knowing how to do a thing in the most idiomatic way for the given language, and by copious use of comments, where needed.
posted by They sucked his brains out! at 9:44 PM on September 17, 2020 [3 favorites]


So when I started to learn programming, we were still using different versions of FORTRAN - I learnt the subtle differences between Watfor and Watfiv not to mention Buff40.

Then at university - COBOL - for which there was a STANDARD - which meant that it would work the same way on every machine - or the manufacturer had to warn you when it was NOT standard. Blown away.

You knew who the heavy duty Cobol programmers were in the days of punch cards - they were the ones with long arms to carry the boxes of code. And then there was PL/I, which tried to be all things to everyone.

Then there was the lecturer who considered Pascal a theoretically complete programming language, except that the "do until" command was redundant and should not have been allowed, as it could always be expressed as a "do" command.

Cobol was and is a good language exceptionally fit for purpose.
posted by Barbara Spitzer at 10:21 PM on September 17, 2020 [8 favorites]


scapegoating COBOL can’t get far when the code is in fact meant to be easy to read and maintain.

The word "meant" is doing a bit of work, there. I'm not going to deride COBOL because I have little experience with COBOL, but I'd argue the track record of programming languages that purport to be easy due to their resemblance to natural language or simple feature set is not all that fantastic. Large bodies of text/code are unwieldy and writing things over and over again is no fun. Not to say we should all be using APL but there's a reason mathematicians use symbols.

Yes, old languages are derided.

Old languages are definitely not universally derided - the obvious counterexample being LISP, which is something of a Real Programmer totem (albeit for a different kind of Real Programmer than C). On the other hand neither are "easy"/"readable" languages - I mean, that's Python's whole deal and it is beloved. Of course I think that (and the trend of modern Real Programmers going towards FP and high abstraction) has a bit to do with how many people blew their own extremities off with the "manual memory management or bust" school of Real Programming.
posted by atoxyl at 10:24 PM on September 17, 2020 [7 favorites]


The thing linked makes good points about the fact that there aren't many civil-servant programmers (I mean "software engineers") anymore, and why that's a problem. But the section on the evolution of programming languages is determined to grind an axe whether there's a whetstone or not.

COBOL didn't lose the popularity contest because it's so easy that macho he-man guy programmers couldn't be bothered with it. It lost because it's not very expressive. It's harder to write complicated programs in COBOL than C, once the data you have to work with isn't batches of sales receipts that you have to compute a commission on or something simple like that. The contrived example comparing COBOL to FORTRAN is just that, contrived. COBOL programmers, at the time when they were half the programming workforce, really were semi-skilled compared to people who were working with ALGOL or PL/I. The stuff they were doing then would be mostly handled by a non-programmer using a spreadsheet, today.

The "English-like" verbose syntax that's possible for COBOL was never used in the COBOL systems I've looked at... COBOL doesn't require it, and nobody who's not a novice would voluntarily use it. The managers never did get around to actually reading code.
posted by Aardvark Cheeselog at 10:43 PM on September 17, 2020 [15 favorites]


This is really interesting. Thanks!
TOTAL = REAL(NINT(EARN * TAX * 100.0))/100.0

users of COBOL could write the same command as:

MULTIPLY EARNINGS BY TAXRATE GIVING SOCIAL-SECUR ROUNDED.
I'm incredibly biased. (I grew up using IDL, which is more or less Fortran-like syntax implemented in C, and have worked on two big projects in Fortran long ago. I've never seen a line of Cobol before.) But, the second one seems far less straightforward to me. What takes precedence? I assume "ROUNDED" is a function that gets applied to SOCIAL-SECUR... but, what kind of variable does it produce? How do you know which parts of the line it operates on? I really don't think my reaction is gate-keeping, even subconsciously... it's just an incredibly foreign syntax for someone who learned programming (badly) after spending a lot of time with pen-and-paper equations.

It's good to be reminded that there are other ways of doing things.
posted by eotvos at 10:53 PM on September 17, 2020 [3 favorites]


The skill-set for programming has little to do with specific languages. The problem with trying to produce a system that allows "non-programmers" to program is that non-programmers will then write programs and that often doesn't end well. Programming is really about understanding a problem well enough to be able to describe it completely and unambiguously in simple enough terms for a mindless computer to understand. Natural languages turn out to be very poor tools for this purpose.

The problems with those older languages that are being mocked often come down to the fact that, at the time they were designed, it was necessary for programmers to be able to wring every last bit of performance out of the fantastically expensive hardware they were writing for. These days we have way more computing resources and can devote some of this to providing tools that assist programmers in being more productive. We can afford the bytes for long variable names and the processor cycles for syntax highlighting! (And a mountain of other stuff ofc.)

I'd think the missing civil servant programmers are now writing (e.g.) Excel spreadsheets and, for many use cases, that's probably not a bad thing. Viewed in that light Excel is just a good example of a programming tool ("language") with a giant overhead compared to the tasks it's often used for, and that's OK.
posted by merlynkline at 1:12 AM on September 18, 2020 [10 favorites]


As someone who once spent some time as a summer intern in the early 90s attempting to convert a burglar alarm monitoring system that monitored burglar alarms all over the country from COBOL to C, it seemed to me that among COBOL's main flaws is that COBOL is just inescapably miserable to program in, and by comparison, C was a joy.

How much of that is syntax-based? The long-winded verbiage could be eliminated by coming up with a new syntax for the same semantics; as the ASTs generated are the same, compilers could accept either, and scripts could be written for machine-translating to the new syntax.

This wouldn't help with deeper issues, though. Does COBOL these days have block structure or is it all GOTOs?
posted by acb at 1:54 AM on September 18, 2020


I'll bite?
posted by axiom at 2:05 AM on September 18, 2020 [1 favorite]


I'd think the missing civil servant programmers are now writing (e.g.) Excel spreadsheets and, for many use cases, that's probably not a bad thing. Viewed in that light Excel is just a good example of a programming tool ("language") with a giant overhead compared to the tasks it's often used for, and that's OK.

I tend now to think of Excel as essentially a fancy GUI for Visual Basic For Applications, which is definitely not a joy to write in; and as such, Excel suffers from non-programmers often essentially trying to code without really understanding that's what they're trying to do (nested VLOOKUP, anyone?).
posted by solarion at 2:12 AM on September 18, 2020 [4 favorites]


>It's not always possible to make use of new programming language features when extending or maintaining old code.
It's not always cost-effective, please check you're using a Universal Turing Machine. Snark aside, the refactor paradigm[1] gets you a long way and, crossed with the essential habit of putting boundaries between things with differing responsibilities[2], you can do it. I get that it's work and not always praised or economical, but it can be done. The boundaries mean that I shouldn't care what each component does on your side of the interface boundary and you get to use any fancy new language features you want.

1: Wrapping the old in tests and ensuring that the new passes the test cases in the way that the old did.
2: Avoid locking up your system with components reaching in to other components so that you can't change one without breaking all of it. Segregation of responsibility, with reasonable interface boundaries across which you pass data and enforce that it's formed right to a schema, is essential for maintenance.

>I'd think the missing civil servant programmers are now writing (e.g.) Excel spreadsheets and, for many use cases, that's probably not a bad thing. Viewed in that light Excel is just a good example of a programming tool ("language") with a giant overhead compared to the tasks it's often used for, and that's OK.
Please reconsider this: it causes job-for-life approaches to maintaining your Excel sheets, syphoning business logic out of the body of institutional knowledge into stuff stored on an SMB share or Sharepoint. It keeps cash flowing but is part of a terrible ecosystem (and we should also call out being treated as a fungible replacement part in the business setting, individuated and unable to act in union with colleagues, or as a time-hire contractor not even fully an employee of the business).

>I tend now to think of Excel as essentially a fancy GUI for Visual Basic For Applications, which is definitely not a joy to write in; and as such, Excel suffers from non-programmers often essentially trying to code without really understanding that's what they're trying to do...
If we still had 1990's mindset MS, there would be outrageous prices for cloud execution of your sheets on the older versions of Excel, mainframe-izing the 'computer on every desk and in every home' revolution they wrought. I dunno about 'non-programmer' because I think in terms of workflows, a sequence of steps to get done what you need to get done. So maybe our business activities need us to be explicit about those workflows so that we can move the rote parts into the computer. Instead we could use human smarts to manage them and to respond to the unexpected things the computer isn't so good at handling.
posted by k3ninho at 2:39 AM on September 18, 2020 [2 favorites]


sideshow, your description of the original women programmers as typists rather than highly academically skilled professionals who were doing algorithm design and optimization and inventing new (early) programming languages (i.e. what has come to be called computer science) is "false history", and Grace Hopper would likely have some severe words to say to you on the subject. Also all of the women who safely got the first astronauts to the moon and back.
posted by eviemath at 3:47 AM on September 18, 2020 [28 favorites]


> It is an interesting design dichotomy: do you build a programming language for the programmer or for the machine?

Or, given the ever-increasing prevalence of Javascript, why not neither?
posted by grahamparks at 4:00 AM on September 18, 2020 [16 favorites]


These types of articles just love to tell us how programming was dominated by women (which it was), but then go into how one day in the '60s a bunch of dudes decided that programming was for them and how they went "down" a level and forced all the women out, which is definitely not true.

sideshow, that's a load of crap. It is, most definitely, true.

tl;dr? It's all about gatekeeping.

Historian Nathan Ensmenger wrote about how "Computer Geeks" replaced "Computer Girls". From the article:

As the intellectual challenge of writing efficient code became apparent, employers began to train men as computer programmers. Rather than equating programming with clerical work, employers now compared it to male-stereotyped activities such as chess-playing or mathematics. But even so, hiring managers facing a labor crunch caused by the rapid expansion of computing could not afford to be overly choosy. The quickest way to staff new programming positions was to recruit from both sexes, and employers continued to hire women alongside men.

The masculinization of computer programming

In 1967, despite the optimistic tone of Cosmopolitan’s “Computer Girls” article, the programming profession was already becoming masculinized. Male computer programmers sought to increase the prestige of their field, through creating professional associations, through erecting educational requirements for programming careers, and through discouraging the hiring of women. Increasingly, computer industry ad campaigns linked women staffers to human error and inefficiency.

posted by lemon_icing at 4:27 AM on September 18, 2020 [18 favorites]


For what it's worth, I've got an outside historical view of prejudice against COBOL. I was making and selling buttons with funny sayings in the 80s at science fiction conventions, and I was pleased to pander to any non-pernicious prejudice which would cause people to buy buttons.

One of those prejudices was against COBOL. As far as I could tell (I didn't ask), the issue with COBOL was that it was used for boring business applications. Cool programmers were doing something else. Math? Science? Tools for other programmers? I'm guessing here.
posted by Nancy Lebovitz at 5:10 AM on September 18, 2020 [7 favorites]


The comments here unintentionally support the points of the article: dislike for COBOL is all tied up in gatekeeping and sexism. Maybe pause to think for 10 seconds before making comments about how gatekeeping and sexism are good things?

Protip: if you are a dude and your hot take in a discussion of sexism is "that's certainly not because of sexism," you are being sexist.

I loved the article and learned a lot, thanks.
posted by medusa at 5:15 AM on September 18, 2020 [9 favorites]


I read Dr. Hicks as saying that FORTRAN was the macho language of the time. It may not be a particularly good example of one now.

FORTRAN is the lingua franca of supercomputing. It's still one of the fastest languages there are, if not the fastest, and where every skosh of computational speed matters, it's an excellent choice. As recently as five years ago, Lawrence Livermore, Sandia, and Los Alamos national laboratories started work to create an open-source Fortran compiler designed for Lawrence Livermore's programming infrastructure. It is used by NASA, the DOE, intelligence agencies, and most supercomputing centers I've worked with in my career.

Its applications include computer simulations for nuclear weapons, fusion reactor research, national security, medicine, climate and basic science missions. Anyone who thinks FORTRAN is dead or some sort of anachronistic legacy leftover is simply wrong.
posted by wolpfack at 5:27 AM on September 18, 2020 [5 favorites]


My dislike for COBOL is wrapped up in having to learn it from a man with bad skills as a teacher who looked like televangelist James Bakker and sounded like Elmer Fudd, as well as a tumultuous and eventually bad-breakup-ridden college relationship which colors everything of that timeframe of my life.

(I swear, waiting to go to college until I was 22 might have been a better idea...)
posted by mephron at 5:43 AM on September 18, 2020 [1 favorite]


COBOL is a tool. As a tool, it is good at some things and not so good at others. It is EXTREMELY good at really boring but important things like financial transaction processing. This accessible dive into how COBOL does math helps explain what’s going on and why COBOL’s default approach to doing math works so well for boring but important work.

The core issue that I saw in the article, and that I experience daily in managing the development of enterprise software, is maintenance. It is so hard to convince the money people that it is worth building maintainable code, or doing refactoring and maintenance on that code, because it takes away from feature work. Never mind that the legacy system is a fragile mess and bodging another feature on the side will just make it that much more fragile.
posted by rockindata at 5:58 AM on September 18, 2020 [8 favorites]


I'm by no means a Cobol expert but I have tinkered with it from time to time.

> Does COBOL these days have block structure or is it all GOTOs?

Cobol has gotos but they're not really used, for the same reasons they're not used anywhere else: they make a damn mess! The Procedure Division of a Cobol program consists of a bunch of named paragraphs which you can PERFORM. A PERFORMed paragraph behaves more like a subroutine than a goto: execution jumps to the paragraph, runs all the statements defined in the paragraph and then (usually) execution jumps back to where the PERFORM was made. It's possible to write paragraphs so that execution falls through to the next paragraph (similar to switch statements without a break) but it's generally considered bad form. These are open subroutines in that they don't accept input or produce output but operate in the same global context as the rest of the program and have direct access to all the program data. But still, paragraphs are basically self-contained execution blocks accessed by name, so you can reorder them without problems and each one should do one specific thing like any good procedure or function.
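A stripped-down sketch of that shape, with made-up paragraph names rather than anything from a real program:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. PARA-DEMO.
       PROCEDURE DIVISION.
       MAIN-LOGIC.
           PERFORM READ-INPUT.
           PERFORM CALCULATE-TOTALS.
           PERFORM WRITE-REPORT.
           STOP RUN.
       READ-INPUT.
      *    a real paragraph would OPEN and READ files here
           DISPLAY "READING INPUT".
       CALCULATE-TOTALS.
           DISPLAY "CALCULATING TOTALS".
       WRITE-REPORT.
           DISPLAY "WRITING REPORT".

Each PERFORM runs its paragraph and comes back; the STOP RUN at the end of MAIN-LOGIC is what keeps execution from falling through into READ-INPUT afterwards.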

You can also have subprograms which behave more like actual functions. You can pass parameters to a subprogram and it'll do its job. Subprograms don't really return results like a function, but they can write their results to files or whatever, which can be read from the original program if needed. Subprograms can either be in separate files or embedded within the same source file as the main program.

Finally, as of the 2002 standard, Cobol has included object oriented features. I don't know much about those so can't comment further but OO Cobol is a thing that exists.

>the second one seems far less straightforward to me. What takes precedence?

I think this is really just a matter of what one is used to, as opposed to any weakness in the language or its semantics. Precedence confusion is by no means a problem unique to Cobol. Look at all the memes where people argue about how to evaluate things like 2 + 2 x 4. In this example, GIVING is basically the assignment operator. So, multiply the two numbers, stash the result in SOCIAL-SECUR (which will have been defined previously), and round if necessary. Once you work in Cobol a bit there's no ambiguity. And I would argue that the second version is more understandable to someone who doesn't know either of the languages in question but needs to figure out what's going on.

As to what type of value it produces, Cobol (at least pre-2002 Cobol) really only has two types of data: numbers and alpha-numeric data. And it doesn't have things like int, long and short. It has numbers and you can define what those numbers should look like and they don't all need to look the same. So SOCIAL-SECUR might be defined like this:

01 SOCIAL-SECUR PIC 99999V99.

or this, which is equivalent

01 SOCIAL-SECUR PIC 9(5)V99.

Which admittedly seems a bit weird until you get used to it. The PIC is short for PICTURE. When you defined data in Cobol you essentially provide a snapshot of what the data "looks like." This says that SOCIAL-SECUR is a 7-digit number with 5 digits before the decimal point (the V) and 2 digits after. So if that's how SOCIAL-SECUR is defined, then that's what ROUNDED will give you, a value rounded to the nearest cent. If it's defined differently, say

01 SOCIAL-SECUR PIC 9(5).

then you get a number rounded to the nearest dollar (we didn't specify where the decimal point is, so it's at the end.)
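To make it concrete, here's a complete toy program using those pieces; the values are made up and it's only meant to show the shape, not mirror any real payroll code:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. SOCSEC-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *    V marks the implied decimal point in the stored digits
       01 EARNINGS      PIC 9(5)V99 VALUE 1234.50.
       01 TAXRATE       PIC 9V999   VALUE 0.062.
       01 SOCIAL-SECUR  PIC 9(5)V99.
      *    an edited picture so DISPLAY shows an actual decimal point
       01 SOCIAL-PRINT  PIC ZZZZ9.99.
       PROCEDURE DIVISION.
       MAIN-PARA.
           MULTIPLY EARNINGS BY TAXRATE
               GIVING SOCIAL-SECUR ROUNDED.
           MOVE SOCIAL-SECUR TO SOCIAL-PRINT.
           DISPLAY "SOCIAL SECURITY IS " SOCIAL-PRINT.
           STOP RUN.

1234.50 times 0.062 is 76.549, so SOCIAL-SECUR ends up holding 76.55 after the ROUNDED, and the DISPLAY prints it with a real decimal point via the edited picture.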

Anyway, that turned into more of a Cobol lesson than I intended and I could go on about data declarations and code re-use but I won't. Point being, massive list of reserved words aside, the semantics of Cobol aren't any more difficult to learn than any other programming language. It's different for sure but that doesn't make it bad. Cobol's neat. I don't hate it, I actually kind of like it. And it's always good to look at different ways to solve a problem.
posted by Mister_Sleight_of_Hand at 6:18 AM on September 18, 2020 [18 favorites]


Programming is really about understanding a problem well enough to be able to describe it completely and unambiguously in simple enough terms for a mindless computer to understand.

This is a tangent, but merlynkline's comment captures the biggest issue I've always had with coding as a non-coder - I work in a non-technical role at a software company and there's a lot of "anyone can learn to code, why not give it a try!" messaging from the software engineers. I've done some very basic Python and understood what was going on well enough that I could probably try doing some more basic Python, but the fundamental issue I have is that I just don't have the kind of ideas or problems that it makes sense to create software to service or solve.

I spent my formative years reading and writing a lot rather than trying to make things happen with computers, so now as an adult when I have a creative idea it's usually for a story or an essay rather than for a problem I could solve by coding. I also don't have a deep enough understanding of what a computer can/can't/should do to begin building ideas for things that I could code that might improve my life or other people's lives. So while I don't think there are any particular intellectual barriers to me learning to code, there are absolutely understanding-of-problem-space barriers, and I don't see any of the "anyone can code!" people trying to address this aspect of the experience.
posted by terretu at 6:19 AM on September 18, 2020 [10 favorites]


Here to say that Mar Hicks is a serious historian on this subject, and their book Programmed Inequality is a really enjoyable deep dive into the specific chain of events that led to the masculinization of the English computer industry after WWII, starting with the widespread entry of women into cryptanalysis and early computing for the war effort.

Also, a great Twitter presence at @histoftech
posted by bgribble at 6:24 AM on September 18, 2020 [5 favorites]


There’s an old joke among programmers: “If it was hard to write, it should be hard to read.” In other words, if your code is easy to understand, maybe you and your skills aren’t all that unique or valuable.

Uh, if your code is easy to understand, then it is way less likely that someone who has to maintain it long after you're gone will completely ruin it because they can't figure out WTF is going on. Sadly lots of developers focus on protocol and cleverness and not at all on how the code is going to be interacted with in the long term. Also, often, they don't take into account how it will be invoked by end users.
posted by grumpybear69 at 6:42 AM on September 18, 2020 [4 favorites]


My understanding is that Python has largely filled the role of 'programming language that's readable, easy enough for domain experts to pick up, can solve real problems in it, still cool' without the macho bullshit.

Or maybe I just haven't run across any macho bullshit yet.
posted by Merus at 6:46 AM on September 18, 2020


From the actual article, rather than the comments:

If management thinks the tools you use and the code you write could be easily learned by anyone, you are eminently replaceable.

In order to care for technological infrastructure, we need maintenance engineers, not just systems designers—and that means paying for people, not just for products.

These two ideas are in conflict. The root of the idea that powerful-but-arcane languages make the programmer more "real" is that management is looking for reasons not to have to pay for maintenance, and writing in a reliable, easy-to-understand language means that management can call the project done and fire all of the now-redundant workers that much more easily.
posted by Merus at 7:00 AM on September 18, 2020 [1 favorite]


The problem with trying to produce a system that allows "non-programmers" to program is that non-programmers will then write programs and that often doesn't end well.

I know you didn’t mean it this way, but when I was first learning to code, this comment would have lodged in my brain and I would have felt like I shouldn’t bother the “real programmers” by even trying to code. After all, it just won’t end well.

I didn’t learn to code until age 21. I thought of myself as a “non-programmer” for sure. But within a year of that first C++ class, I had a job where I wrangled a legacy FORTRAN codebase to do scientific data analysis, and now I do data science using R all day long. I don’t meet plenty of people’s standards for a “real programmer” because R isn’t a compiled language, and anyway I’m just making one-off software to solve specific scientific problems, not production software. But… I code all day, for money, and a lot of people think I am very good at it.

Regarding ensuing discussion of Excel:

By the nature of my job, I work with a lot of people who do their analysis using horrendously complicated Excel spreadsheets. Writing those wild formulas is coding. It’s just inefficient and fiendishly difficult to debug. But I think it’s more useful to say “Hey — if you use formulas in Excel, you have already written code! Languages like R, or Python, or C, or FORTRAN — those give you a quicker way to write the formulas, and they’ll run faster, and you can write formulas to do more cool stuff that Excel doesn’t know how to do. But the first stuff that anybody learns in any of those languages? It’s not really any different from Excel formulas.”

Because that gives people the correct idea that hey, coding is not this big mysterious thing that only super-geniuses could ever do. It makes it seem accessible and feasible to learn, which it is. And maybe they will consider trying it out, and then their lives and mine will become much easier when we aren’t trying to debug a gigantic, sluggish, tangled Excel workbook that could have been an R script.

(I’ve considered trying to develop a class that uses Excel as a gentle lead-in to coding in Python or R — drawing parallels between how to do something in Excel and how to do it in Python or R, and showing how tasks that are horrible in Excel can be very straightforward in Python or R.)
posted by snowmentality at 7:05 AM on September 18, 2020 [15 favorites]


Don’t worry, you can write completely inscrutable Python, especially if you do things like refuse to use the standard library and tools like requests and instead implement your own string handling approaches, use single-letter global variables, or copy-paste the same code over and over again because calling a function is “too slow”.
posted by rockindata at 7:33 AM on September 18, 2020 [5 favorites]


This accessible dive into how COBOL does math helps explain what’s going on and why COBOL’s default approach to doing math works so well for boring but important work.

This is true, but it's also not true. The company I work for does taxes (generally in real-time, which includes multiple rounds of recalculating taxes), and payments and processing for nearly every country in the world. We were able to switch from COBOL to modern languages, and it was expensive (every project we work in is expensive) but not dramatically so. We pay lots of taxes to the US federal government, so our taxes must be good enough.

I programmed COBOL in college, and no matter how many times it is said that it was written to be 'human readable' - that was in 1959. By 1998 it was crap, and getting to program in terrible languages like PowerBuilder, MS Access Visual Basic, and C was amazing by comparison.

It's no different than people who crow about how fast old muscle cars were -and they were, for their time. Now the stock versions get burned by midsize cars and minivans.

I still remember going to the library to get COBOL debug logs for a program to take a file input, use a set rate, and compute an output - reams of paper that they give to you in one of those boxes to hold reams of paper.

This paragraph is also crap and a copout:
website through which people filed claims, written in the comparatively much newer programming language Java, that was responsible for the errors, breakdowns, and slowdowns. The backend system that processed those claims—the one written in COBOL—hadn’t been to blame at all.

Yeah, front-end websites do 10X the work to protect backend systems from bad data, which would otherwise kill them and crush throughput. Also - running everything in a state through a few mainframes to support COBOL, which generally was not originally programmed to be geographically distributed, leads to front-end crashes due to volume.
posted by The_Vegetables at 7:50 AM on September 18, 2020 [2 favorites]


Yeah, it definitely wasn't a case of "java bad, COBOL good" that caused the unemployment websites to crash. It was unprecedented demand and the inability of those systems to scale, largely due to infrastructural decisions. There's a reason companies are flocking to AWS and working with CDNs like Cloudfront.
posted by grumpybear69 at 8:14 AM on September 18, 2020 [4 favorites]


> Sadly lots of developers focus on protocol and cleverness and not at all on how
> the code is going to be interacted with in the long term.

This. What I should really have said before: Programming is really about understanding a problem well enough to be able to describe it completely and unambiguously in simple enough terms for a mindless computer and future maintainers (including yourself) to understand.

> you can write completely inscrutable Python

Oh yes. And COBOL. And anything. Including English.

> I don’t meet plenty of people’s standards for a “real programmer”
> because R isn’t a compiled language

Sounds like those people don't meet my standards for a lot of things, including "real programmer" (They're just jealous because your compiler is so fast that you don't have to run it as a separate build step; so fast they apparently can't even tell it's there :) )
posted by merlynkline at 8:25 AM on September 18, 2020 [1 favorite]


Or maybe I just haven't run across any macho bullshit yet.

Eh, there's macho bullshit in every sub-area related to computers. There's just (fortunately) less of it some places than others, at least.
posted by eviemath at 8:26 AM on September 18, 2020 [1 favorite]


Way back in the dawn of time when I was getting my AS in software engineering, I was in the very last year of students at my college who were required to take COBOL and assembly.

By then I'd been coding in various languages for at least 12 years. I started with BASIC and LOGO back when I was 8, and by the time I got to college I'd been writing private projects in C for a few years so I didn't go into any of my classes without prior experience or bias.

I don't dispute that COBOL is a valid language that can do good things. It is and it can.

It's also just absolutely miserable to try to write in. Yes, C fans can get far too obsessed with terseness, but COBOL was built on the idea, understandable for the time but since shown to be wrong, that making the language more verbose would make it more human-readable and more accessible to non-programmers.

It didn't make COBOL more accessible, but it did make programming in it an exercise in annoyance.

It was easier to use than assembly, no argument at all. But that's not saying much.

I'll bet that if we took a group of people who'd never heard of any programming languages and were therefore untainted by any macho anti-COBOL garbage, gave them an intro to programming and taught them both COBOL and virtually any non-joke language invented in the past 30 years, they'd pick the newer language as the friendlier and less awful language to use.

COBOL suffers from the problem a lot of pioneering and groundbreaking things do: it had no prior work to learn from so it made a lot of mistakes. Modern languages aren't better because modern people were mentally superior, they're better because they learned from the mistakes, bad parts, setbacks, and failures, of what went before.

COBOL was an excellent first effort. But it isn't macho or posturing or sexist to say that as a language it's pure misery to program with and we've got better tools now.
posted by sotonohito at 8:36 AM on September 18, 2020 [4 favorites]


One issue I would have programming COBOL is the lack of a (proper) function abstraction. It's such a nice abstraction that its absence would annoy me to no end and drive me crazy.
posted by Monday, stony Monday at 8:37 AM on September 18, 2020


Snowmentality: I am teaching the class you describe. We do a task in Excel, and in R, so that we can figure out what each one is best at. I’m only a week in, but it’s really a lot of fun.
posted by Valancy Rachel at 8:41 AM on September 18, 2020 [4 favorites]


Another factor to consider is, if you grant that programming in Cobol did eventually become more frustrating than newer languages, with less career upside, lower status, etc. -- it would be a classic pattern of discrimination for companies to steer women into these jobs. I wonder if there's any strong evidence of this happening at any companies?
posted by vogon_poet at 9:01 AM on September 18, 2020 [1 favorite]


I'll also add that I think the whole idea of programming for non-programmers is invalid and needs to be abandoned.

Learning to program is learning to see the universe in a way you hadn't before. I suspect it's much like learning how to draw, or compose music, in that it requires you to adjust your mind and change your way of thinking in such a profound way that it never really leaves you and it affects your entire thinking process for the rest of your life. Learning to drive is similar, if not quite so profound in how it changes your mind.

Once a person has changed how they think and they have become a programmer the language they use is just a matter of familiarity, comfort, and convenience. Some languages are better at some things, sure, but ultimately a programming language is a programming language.

But no matter how friendly the language, someone who hasn't learned the programming way of thought won't be able to do anything with it.

As an example, back in my college days in my intro to programming class (which no one in my degree plan was allowed to test out of or skip in any way), early on we were assigned a simple task to help familiarize ourselves with programming and the Java language: have the computer roll a pair of dice 100 times, have it record how often each number came up, and after it had finished rolling 100 times display how often each number had come up. The teacher even had an example of the output he wanted. Clear, straightforward, and dead easy if you had the programming mindset.

I'll note here that I think there is a huge and toxic strain of macho in programming which holds that writing pseudocode, or flowcharting, or whatever before you sit down and start hammering out real code is the only proper way to program. That's pure bullshit and I do think the teacher did us all a grave disservice by not starting this exercise, and all other exercises in the class, by insisting we write some pseudocode or draw a flowchart.

But I knocked mine out quickly and was asked for help by a classmate who hadn't managed to get anywhere.

He wasn't a programmer. He hadn't gotten his head in the right place yet, and again I do think the teacher was failing at the true task of helping us shape our minds into programming minds rather than the surface task of teaching us the basics of Java.

I asked him how he was doing and he answered: "I don't know where to start".

And that, right there, is why programming for non-programmers is pure bullshit.

Because it wouldn't have mattered if we were using Scratch, or another block-based coding language practically invented as a programming-for-non-programmers language: if the person in question doesn't have the mental framework of programming, they won't know where to start.

There's no such thing as programming for non-programmers. At best it's a gentle introduction to how to put your mind in the programming place.
posted by sotonohito at 9:32 AM on September 18, 2020 [7 favorites]


there's a distinction one has to establish when talking about cobol, or inform 7, or other similar "what if we made programming language commands mimic english-language sentences" languages. this distinction is between readability — how difficult it is to understand what a piece of code is doing, how difficult it is to debug that code if it's not doing what you want, how difficult is it to add new code to that piece of code — and the appearance of readability. the appearance of readability concerns how difficult it is for someone looking at a piece of code to think they understand what that code does.

cobol has a high appearance of readability. a line of code like:
MULTIPLY EARNINGS BY TAXRATE GIVING SOCIAL-SECUR ROUNDED.
is something that a non-programmer can glance at and get a feel for what it does. but when you start working on code longer than one-liners, the appearance of readability and genuine readability become decoupled; the english-language mimicry works to obscure rather than reveal what the program is doing.

i remember as a kid working a little bit in inform 7, the text-adventure programming language mentioned upthread. inform 7, another language with a high appearance of readability, was great so long as you weren't doing anything more complicated than "move the green book into the study and wake christina," (a context where you can just say "move the green book into the study and wake christina" and it works). but if you try to write inform 7 to handle (for example) implementing a fire that spread from room to room, or writing a rope that could be both tied around things and cut, the english-like quality breaks down hard, and what remains of the resemblance to english just tricks you into thinking you understand something you don't. whenever i had to do anything tricky like that, i'd first work out the logic in a pseudocode that more closely resembled python, and only then translate it into inform 7. the third or fourth time i had to do that i ended up ditching inform 7 altogether and going back to inform 6.

see this sample code for implementing a matchbook in inform 7 — it will give you a good feel for the limitations of english-mimicry.

i'd argue that the chief value of cobol's english-like character isn't as a tool to help programmers work, but instead as a tool to help managers and executives find the concepts of computers and computer programming more accessible. it is, in short, a marketing tool, and a damned effective one. when hopper developed flow-matic — and it's fair to describe cobol as flow-matic with the serial numbers filed off — one of the big problems that her employers faced was that the executives they were marketing computers to had no idea whatsoever what a computer was, and found both computers and the people who worked on them terrifying. flow-matic let them pretend that computer programs were business memos, and that they could trust a program to do what it says it does just by glancing at it. and thereby flow-matic (for better or for worse) boosted computer adoption in business/computer sales to businesses.

basically: we have to evaluate cobol in terms of both technical engineering and social engineering. as a piece of technical engineering it's got some strong points and some weak points. as a piece of social engineering it was an unparalleled success.
posted by Reclusive Novelist Thomas Pynchon at 9:46 AM on September 18, 2020 [6 favorites]


mar hicks, by the way, is one of the few historians i really trust to handle this topic well. doing serious analysis of the role of programming languages in culture requires having deep knowledge of several very different fields all at once, with each of those fields having its own ferociously difficult technical jargon & set of established practices. there's like maybe at most twelve people in the world qualified to do this work, and hicks is one of them.
posted by Reclusive Novelist Thomas Pynchon at 9:56 AM on September 18, 2020


> Don’t worry, you can write completely inscrutable Python, especially if you do things like refuse to use the standard library and tools like requests and instead implement your own string handling approaches, use single letter global variables, copy paste the same code over and over again because calling a function is “too slow”

Hey, rockindata, can you drop a pin? I don't want to fight, I just want to talk (I want to fight) (not you, but every scientist I've ever worked with).
posted by protocoach at 10:13 AM on September 18, 2020


I'll note here that I think there is a huge and toxic strain of macho in programming which holds that writing pseudocode, or flowcharting, or whatever before you sit down and start hammering out real code is the only proper way to program. That's pure bullshit and I do think the teacher did us all a grave disservice by not starting this exercise, and all other exercises in the class, by insisting we write some pseudocode or draw a flowchart.

I am having trouble parsing this paragraph. Are you pro- or anti-pseudocode?

I'm extremely pro since it allows you to sketch out the problem space before diving into implementation.
posted by grumpybear69 at 10:21 AM on September 18, 2020 [3 favorites]


I just assumed there was an extra/missing Not in there somewhere, as the rule was always "Manly Men code, not talk."

Edit: I know this because I was the manly man who sat silently until the design was laid out in my head, and then started typing. That lasted until I left school and had to work on something larger than my skull...
posted by Cris E at 10:25 AM on September 18, 2020 [2 favorites]


Sorry, I'm obviously writing bad English today.

I am extremely pro pseudocode/flowchart/whatever and consider the idea that a real programmer should just sit down and start coding to be toxic.

I think my intro to programming teacher failed his students because he didn't make us start with pseudocode/flowchart/whatever and I think if he had it would have not only taught good habits to the students but gotten them to the programming mind faster and less painfully.
posted by sotonohito at 10:30 AM on September 18, 2020 [2 favorites]


I'll add that being pro-pseudocode isn't just an opinion; it's backed up by science. Studies show that people forced to start with pseudocode write better, more efficient programs with fewer bugs than people who don't, and that the pseudocoders are FASTER even when you count the time it takes to do the pseudocode. It is the objectively correct way to program.
posted by sotonohito at 10:34 AM on September 18, 2020 [1 favorite]


I’ve been coding professionally for 15 years (including on systems that basically every coder in this discussion has used) and I have maybe run into about 4 of these “macho programmers” out of hundreds of colleagues. Must be a gen x thing?
posted by thedaniel at 11:18 AM on September 18, 2020 [1 favorite]


the macho programmer who sees the matrix and single-handedly hacks the gibson is a trope from media. people who fall for that trope and try to embody it suck at programming and often wash out pretty early. the ones who stick around (typically the type of tall pretty rich straight white cis man whose confidence is unshakable because they have a lifetime of being deferred to and told they are geniuses) lowkey sabotage every team they work on (and are often rewarded for their work regardless).
posted by Reclusive Novelist Thomas Pynchon at 11:27 AM on September 18, 2020 [6 favorites]


Dang do we know each other IRL?
posted by thedaniel at 11:34 AM on September 18, 2020


For people wondering about Python as it relates to the topic of this thread, there was a long-awaited paper that came out in Nature the other day by the authors of NumPy, a library that runs underneath a lot of data science and analysis tools written in Python.

Lior Pachter is a computational biologist known to be a bit outspoken on social media, and he suggested that the absence of any women authors on the NumPy paper — and general lack of diversity in this and other Python projects, like SciPy — might indicate why some women in the data science community prefer contributing to R, as opposed to Python.
posted by They sucked his brains out! at 11:54 AM on September 18, 2020


(In other words, if there is "macho bullshit" in Python, some of it might be baked in by way of who contributes to the libraries in it, as much as the lack of participation by or attribution to those who aren't part of that culture.)
posted by They sucked his brains out! at 11:56 AM on September 18, 2020


anyway a good synecdoche for how appearance of readability and readability drift apart from each other is found in one common feature of inform 7 and cobol: the use of . to mark the end of statements. this makes sense in the context of the appearance of readability, since it means that statements are like english-language sentences in that they end with periods. however, this feature is a disaster for real readability, because a . is easier to forget than something like ;, and because if you do forget one you’re in a world of hurt, because the visual inconspicuousness of . makes it harder to find the missing one.
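a made-up fragment (hypothetical names, not real code) of the kind of thing i mean:

           IF BALANCE IS NEGATIVE
               DISPLAY "OVERDRAWN".
               SUBTRACT FEE FROM BALANCE.

the period after the DISPLAY quietly ends the IF sentence, so the SUBTRACT runs every single time, not just when the balance is negative. good luck spotting that one dot in a few thousand lines.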

i really do think the social engineering aspect of cobol is really important and valuable, and hopper’s realization of the importance of social engineering put her streets ahead of the mathy fortran boys. in terms of impact her work is mixed, of course — all of this was to the benefit of big capitalists, both at the companies that manufactured computers and at the companies that used them to control their business processes. and the folks stuck programming in cobol for the following 50 years had a ton of headaches that weren’t strictly necessary.

ultimately my take on early computer programming language development is a little “neither moscow nor washington” — or i guess “neither mit nor remington-rand.” neither side was working toward “good” — but nevertheless we must acknowledge the smarts of the people involved.
posted by Reclusive Novelist Thomas Pynchon at 12:02 PM on September 18, 2020 [2 favorites]


According to the Wikipedia page COBOL got novel features like "recursion" and "floating-point and binary data types" in the (poorly supported) 2002 standard and proper IEEE 754 floating-point in 2014. That's mind-boggling in several different ways.
posted by ikalliom at 12:04 PM on September 18, 2020 [2 favorites]


It's never about the language, it's always about the ecosystem. For COBOL, that's IBM and their VSAM technology, making it more like an Oracle database than a general-purpose programming language. You could replace COBOL with a standardized subset of Java and accomplish the same tasks. That's what a team at the IRS attempted to do but the project was mothballed.
posted by RobotVoodooPower at 12:05 PM on September 18, 2020 [2 favorites]


replacing rock-solid old cobol code with code in a modern language is always (yeah, i’m going to stand by that use of the word “always”) a bad idea. the cobol codebase has a half-century worth of debugging in it, code that handles tons of rare edge cases, and the new code will inevitably miss some of those.

it’s extra funny when people replace cobol code with java code, since java is without a doubt the cobol of the late 20th / early 21st century.
posted by Reclusive Novelist Thomas Pynchon at 12:11 PM on September 18, 2020 [6 favorites]


COBOL the language is so different from anything we generally work with in modern computing, which I think is why it's "hard".

Record and table lookups, and strange formats for data?
Data types that are alien to our post-ALGOL/C world?
COBOLCASE? And column line formatting stuff?
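
From what I gather, a typical record layout looks something like this (names invented), which gives a sense of how alien it is if you're coming from structs and strings:

      * level numbers nest fields inside a record; PIC clauses fix the
      * exact width and type of every field, byte for byte
       01  CUSTOMER-RECORD.
           05  CUST-ID          PIC 9(6).
           05  CUST-NAME        PIC X(30).
      * S = signed, V = implied decimal point, COMP-3 = packed decimal
           05  CUST-BALANCE     PIC S9(7)V99 COMP-3.
           05  CUST-LAST-ORDER.
               10  ORD-YEAR     PIC 9(4).
               10  ORD-MONTH    PIC 9(2).
               10  ORD-DAY      PIC 9(2).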

I know that there are more modern IDEs for COBOL, but between the verbose syntax, the all caps, the column formatting, and the weird data types, despite all the "it's just another language" talk it really isn't the same as coding in Python or C or Java or any modern language. Sure you CAN learn it, but the difficulty of moving from a modern language to COBOL is real.

IDK, I guess I'm one of those people who think aesthetics matter. Python seems to be the opposite of COBOL in this respect.

Then again, I am a loser who's never really done anything useful with programming in my life and maybe a real programmer has a better grasp on what is valid to say.

Just because almost all languages are Turing Complete and as valid as any other language doesn't mean all are equally easy to learn or desirable to learn.

I'm not saying the COBOL programmers who end up writing these articles are wrong, but since they already know COBOL and have likely been programming in it for years, it's quite easy for them to just wave off the learning curve and say how easy COBOL is.

But I do agree that the framing around the COBOL issue really distorts the reality of what's going on: the market, the pricing/value of COBOL devs, and what it would really take to replace COBOL properly and safely, which is a ginormous undertaking.
posted by symbioid at 12:18 PM on September 18, 2020 [2 favorites]


> According to the Wikipedia page COBOL got novel features like "recursion" and "floating-point and binary data types" in the (poorly supported) 2002 standard and proper IEEE 754 floating-point in 2014. That's mind-boggling in several different ways.

it’s common in a lot of shops for programmers to be disallowed from using recursion at all, due to how easy it is to do it wrong. and standard floats are terrible for representing money, which is the main type of decimal number these old systems work with. like, as far as i know no one was ever writing scientific simulations in cobol.
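
here’s a toy sketch of the fixed-point decimal approach (field names invented; it should compile under something like gnucobol):

       IDENTIFICATION DIVISION.
       PROGRAM-ID. PENNIES.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * the V is an implied decimal point: exact decimal cents, no
      * binary floating point anywhere
       01  WS-DIME     PIC 9V99     VALUE 0.10.
       01  WS-TOTAL    PIC 9(3)V99  VALUE ZERO.
       01  WS-SHOW     PIC ZZ9.99.
       PROCEDURE DIVISION.
       MAIN-PARA.
           PERFORM 10 TIMES
              ADD WS-DIME TO WS-TOTAL
           END-PERFORM
           MOVE WS-TOTAL TO WS-SHOW
      * displays "  1.00" exactly, not 0.9999999-something
           DISPLAY WS-SHOW
           STOP RUN.

ten dimes come out to exactly a dollar, which is the whole point.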
posted by Reclusive Novelist Thomas Pynchon at 12:27 PM on September 18, 2020 [7 favorites]


flow-matic let them pretend that computer programs were business memos, and that they could trust a program to do what it says it does just by glancing at it. and thereby flow-matic (for better or for worse) boosted computer adoption in business/computer sales to businesses.

I think this is an astute observation: if you're an executive who doesn't know anything about computers you can still look at a piece of COBOL code and every bit of it SHOUTS BUSINESS at you (it has to, because COBOL is all caps all the time). In natural language writing, punctuation is mostly irrelevant to understanding the text, so people who aren't familiar with the conventions of programming tend to assume that "punctuation" is likewise decorative and just gloss over it. An exec who sees a piece of code in a terse, symbol-heavy language will tend to think their programmers are wasting time (it took two weeks to produce a page of mostly punctuation?!), whereas a verbose language that is substantially words, like COBOL, gives the impression that the programmer is doing a great deal of work, all of it directly business-related, with a minimum of ornamental parentheses and braces and whatnot.
posted by Pyry at 1:06 PM on September 18, 2020 [2 favorites]


Sure, COBOL is highly domain-specific and possibly very good at what it is for, but COBOL-85 compilers apparently already have stuff like fractional exponents so you can compute a square root, like you can on a pocket calculator. I would think that quantities of goods are sometimes convenient to express in floating point even when money isn't. The reason for late IEEE 754 compliance is probably the fact that the CPUs which run COBOL programs might not have IEEE 754 floating point units. But that it took until 2002 to get access to a fairly basic feature of the underlying hardware is what I don't really understand.
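
Something like this, assuming the compiler actually accepts a fractional exponent (names made up):

      * square root, pocket-calculator style
           COMPUTE WS-SIDE ROUNDED = WS-AREA ** 0.5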

I definitely have some bias, my first programming course was in FORTRAN77.
posted by ikalliom at 1:15 PM on September 18, 2020 [1 favorite]


if you dig through the records of the codasyl meetings that prepared the first draft of the cobol spec, you’ll see that hopper’s team1 was absolutely fanatical about trying to disallow all mathematical symbols altogether — no “*”, only “multiply” — because if math symbols were allowed at all then programmers would always use them instead of the cumbersome english-language-word version... and if everyone was using + and * and whatever the code would look like math instead of business. at one point they threatened to pull out of codasyl altogether over this.
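
for flavor, here are the two styles side by side (field names invented); it’s not hard to see why they assumed everyone would reach for the symbols if given the choice:

           MULTIPLY WS-HOURS BY WS-RATE GIVING WS-GROSS-PAY
      * versus the mathy version they wanted kept out:
           COMPUTE WS-GROSS-PAY = WS-HOURS * WS-RATE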


1: hopper herself wasn’t there, but imo it’s pretty safe to say that the remington rand/sperry rand team was under her direction in one way or another.
posted by Reclusive Novelist Thomas Pynchon at 1:21 PM on September 18, 2020 [2 favorites]


ikalliom: oh that’s super interesting about how it used a competing / antiquated standard for floats for so long. although i’m mid-key obsessed with early cobol i don’t spend enough time thinking about language features from later in the language’s lifespan.

the thing about the “very good for one specific domain” nature of cobol is that the domain it was in was the domain of at least 2/3rds of all computing during the language’s heyday1... but it’s a really really boring awful domain that no one wants to think about. cobol: the programming language of choice for folks who treat humans as interchangeable, controllable labor-production units.

1: if i’m making a joke i say that most of the other 1/3 consisted of designing and implementing guidance systems for nuclear missiles.
posted by Reclusive Novelist Thomas Pynchon at 1:30 PM on September 18, 2020


(this is where i typically go on my soapbox rant about how you can tell the nature of a culture by its programming languages, that cobol is a language that could only happen under capitalism, and that because english is the language of programming worldwide, modern programming languages reveal that most cultures are heavily colonized)
posted by Reclusive Novelist Thomas Pynchon at 1:32 PM on September 18, 2020 [1 favorite]


So COBOL is like the complement of APL, which is mathematical (and possibly other) symbols only.
posted by ikalliom at 1:47 PM on September 18, 2020 [3 favorites]


COBOL is bureaucracy-as-code, yes?
posted by grumpybear69 at 2:05 PM on September 18, 2020 [3 favorites]


proper IEEE 754 floating-point in 2014.

standard floats are terrible for representing money

lol I was just going to say COBOL discouraging the use of floats for money is, no kidding, probably one of its stronger points
posted by atoxyl at 2:14 PM on September 18, 2020 [5 favorites]


On the other hand, a point against COBOL occurs to me looking at that code example - not only is verbosity not always good, but verbosity combined with hard limits on the actual text length of things like variable names can be kind of a nightmare.

Another thing worth mentioning is that it’s not all about the language either way, though. Some of those old COBOL programs are rock solid because the systems they run on are rock solid: the quality of software and hardware engineering, as engineering, has by no means monotonically improved.
posted by atoxyl at 2:21 PM on September 18, 2020 [2 favorites]


yeh, in a lot of cases the quality of software corresponds to the length of time the software has been in use / was actively maintained. time exposes flaws (and opportunities to fix those flaws) that nothing else can. replacing old cobol code with java is a little bit like ripping out old-growth forests to plant eucalyptus trees.
posted by Reclusive Novelist Thomas Pynchon at 2:27 PM on September 18, 2020


If you want a good “women in computing” shout-out, remember that Margaret Hamilton literally named “software engineering” and did a much better job treating it as a real thing than most of us do now.
posted by atoxyl at 2:27 PM on September 18, 2020 [5 favorites]


and also remember that academic snob / formal verification of code enthusiast edsger dijkstra referred to software engineering as “how to program when you cannot.”

most of the time people have to write code for the fallen world, and the academics who pretended that code written for the world instead of for formal proofs didn’t matter... well, they did no one any favors. and wow a lot of their contempt was driven by misogyny.
posted by Reclusive Novelist Thomas Pynchon at 2:31 PM on September 18, 2020


(Well, okay, a few people probably used those words together and more than one of them worked on Apollo, even, but it was a milestone in engineering software for reliability and she is generally credited with the vision for how they approached that.)
posted by atoxyl at 2:35 PM on September 18, 2020 [1 favorite]


sigh i should probably go do work now. it’s just, the best part of my day was finding out that mar hicks has a piece about cobol!
posted by Reclusive Novelist Thomas Pynchon at 2:37 PM on September 18, 2020


This was a fairly enlightening example of COBOL doing what it's good at.
posted by ikalliom at 2:39 PM on September 18, 2020 [2 favorites]


I'm sort of on the opposite end of this argument. I did my taxes this year in dc...
posted by jim in austin at 2:43 PM on September 18, 2020


Is COBOL a weakly structured Turing-complete language like BASIC and FORTRAN, or an archaic-looking DSL for setting up directed acyclic graphs of record-processing units? Because if the latter, then it's a lot less inelegant than it sounds.
posted by acb at 2:59 PM on September 18, 2020 [2 favorites]


COBOL discouraging the use of floats for money is, no kidding, probably one of its stronger points

But then how do you make change?

:P
posted by eviemath at 3:43 PM on September 18, 2020


with your point fixed like stars in the firmament
posted by atoxyl at 4:18 PM on September 18, 2020 [2 favorites]


hey so i’m going to reach upthread to talk about something said by one of our resident fight-pickers

> These types of articles just love to tell us how programming was dominated by women (which it was), but then go into how one day in 60's a bunch of dudes decided that programming was for them and how they went "down" a level and forced all the women out, which is definitely not true.

it would be profitable for anyone who fell for this particular line to look into reading hicks’s programmed inequality, which is immaculately researched and nicely documents how precisely what our friend from upthread claimed didn’t happen did happen in the u.k. in the 1960s and 1970s, and links that event to the subsequent collapse of the british tech industry. i really hope the person who i quoted here takes a look and, perhaps, adjusts the things they say about the tech industry’s development in the mid-20th century to account for the evidence that hicks presents.
posted by Reclusive Novelist Thomas Pynchon at 4:48 PM on September 18, 2020 [11 favorites]


the english-language mimicry works to obscure rather than reveal what the program is doing

I find that this happens as well, and it's the main reason I'd far rather script in bash than in AppleScript, and why during my programming formative years I disliked COBOL on sight. It's not about being a Real Man, it's about not having to feel like I'm trying to type while wearing boxing gloves. Writing and reading COBOL is just tedious.

And even though something like COBOL or AppleScript superficially resembles English, I find that when I sit down and try to figure out how to express what I want to do in either of these languages, the resemblance to English actively gets in my way. I find myself spending a lot of time drifting beyond the bounds of the computer language and into actual English, writing something more akin to pseudocode that expresses perfectly clearly what an English-speaker would want the machine to do but doesn't actually conform to the syntax of the compiler I'm actually talking to. And of course that doesn't work.

The essence of natural language is fluidity: the ability to bend it and shape it and adapt it until what you're trying to communicate comes across clearly. Computer languages don't, and shouldn't, have that; computer languages need to be above all precise. Which means that an English-mimicking computer language is pretty much forced to use a tiny tiny slice of idiomatic English. And having forced English into that Procrustean bed, it ends up being both irritating and distracting to have to use a long sequence of words to express what is conceptually a tightly-specified machine operation.

To my way of thinking, the computer language that most resembles a natural language in spirit is Perl. The absolutely foundational There's More Than One Way To Do It ethos that infuses that language yields endless opportunities for creating expressive idioms. But there's precious little resemblance between almost any Perl function and a paragraph of English text, and the fact that there are so many different ways to say the same thing means that comprehending Perl written by somebody whose idiomatic style is different from your own can be very time-consuming.

To me, the distinction between programmers and non-programmers is not a matter of skill or even aptitude, but of attitude. If you want to know how to write a sequence of instructions that break down a process into steps that a non-mind can follow, you're a programmer whether you can actually do that yet or not. If you don't want to know anything about how IT works under the hood, and your main concern is herding cats or moving product, then you're a non-programmer.

As a programmer, it's really fucking annoying to be restricted to working with tools whose primary design goal has always been to fool Management (or, worse, Marketing) into thinking they understand stuff that they are almost always demonstrably clueless about because they want to be. Managers don't read and audit computer programs; they read executive summaries and progress reports. Sweating the details at the level of coding statements is simply not their job.
posted by flabdablet at 8:38 AM on September 19, 2020 [1 favorite]


To my way of thinking, the computer language that most resembles a natural language in spirit is Perl.

Well that’s what Larry Wall said he was trying to do, right?

(I don’t even like the TMTOWTDI-ness of Ruby, let alone Perl, personally.)
posted by atoxyl at 10:17 AM on September 19, 2020


Once upon a time, I taught COBOL. We still had punch cards. Blows my mind.
posted by Goofyy at 1:15 PM on September 19, 2020 [1 favorite]


It's just possible that some of my gut-level revulsion on first exposure to the language might have had something to do with the fact that our approved coding process involved writing our programs out on coding sheets, submitting those over the counter at the I/O centre, then collecting our first-run printouts and punch card decks for editing and debugging the following day.

Expressing ideas in COBOL involves a lot more time spent carefully inscribing tidy block caps onto coding sheets than expressing the same ideas in FORTRAN. And when the punch card editing facilities available to engineering students in 1981 consisted of a choice between 12-key totally manual punches and huge whizz-clunk things with a coding wheel on the left and a big punch button at the front, re-punching a card that the I/O centre operator has passive-aggressively rendered as IDENTIFICATION DIVIS1ON. because your serifs aren't quite to their liking is kind of enraging.

I was so happy the day I learned that all it took was a tram ride into the city to get me access to several proper Type 29 keypunches at RMIT. Strictly speaking only RMIT students were supposed to use those, but there were no real access controls imposed so I'd just walk in there as if I owned the place and sit down and get punching. This probably counts as my first IT security breach via social engineering.
posted by flabdablet at 5:35 AM on September 20, 2020 [2 favorites]


You had punch cards? We would have given anything for punch cards.
posted by thelonius at 6:55 AM on September 20, 2020 [3 favorites]


What was your poison? Mark-sense cards, TTY paper tape, or something even worse?
posted by flabdablet at 7:40 AM on September 20, 2020


...AND WE LIKED IT!
posted by rhizome at 6:13 PM on September 20, 2020 [1 favorite]


I used to use kitchen tongs to individually drop electrons onto the silicon. Every day I would arise and give thanks for those tongs.
posted by GCU Sweet and Full of Grace at 8:02 PM on September 20, 2020 [2 favorites]


Have you noticed how hard that is to do with those new-fangled silicone-rubber-sleeved tongs they sell in all the stores these days? Electrons are such slippery damn things. Without the traditional tin plated tongs it takes forever to pick them up.
posted by flabdablet at 4:29 AM on September 21, 2020


As a programmer, it's really fucking annoying to be restricted to working with tools whose primary design goal has always been to fool Management (or, worse, Marketing) into thinking they understand stuff that they are almost always demonstrably clueless about because they want to be. Managers don't read and audit computer programs; they read executive summaries and progress reports. Sweating the details at the level of coding statements is simply not their job.

As an aside, this is why the history of COBOL strikes me as true - it was designed to be programmed by women and then a male manager would visually 'check it out' (LOL) to make sure it was good. Lining up your stuff and counting to 80 was women's work; reading the compute statements in sort-of English was managerial work.
posted by The_Vegetables at 7:53 AM on September 21, 2020 [3 favorites]


Coder:
000304 PERFORM UNTIL EOF 
000305    PERFORM C0200-READ-CUST-RECORD 
000306    IF SUCCESS 
000307       ADD +1 TO WA-IN-REC-CNT 
000308       IF CREDIT-TRANS 
000309          ADD +1 TO WA-CR-REC-CNT 
000310          MOVE WW-CUST-NUM TO WW-CUSTO-NUM 
000320          MOVE WW-CUST-NAME TO WW-CUSTO-NAME 
000330          PERFORM C0300-WRITE-OUT-REPORT 
000340       ELSE 
000350          ADD +1 TO WA-DB-REC-CNT 
000360       END-IF 
000370    END-IF 
000380 END-PERFORM 
Manager: I got every word in that sentence! You're a Straight Talker!
posted by flabdablet at 12:24 PM on September 21, 2020


As an aside, this is why the history of COBOL strikes me as true - it was designed to be programmed by women and then a male manager would visually 'check it out' (LOL) to make sure it was good. Lining up your stuff and counting to 80 was women's work; reading the compute statements in sort-of English was managerial work.

This explains why, in the early 80s, my mother (who had no computer experience and had hitherto worked in journalism/publications) briefly had a job working with (from what I understand, proofreading/correcting) programs in COBOL. The experience, in particular the frustration of things not working because of missed punctuation, left her with a loathing of computers that lasted for a decade or two, ending only when she got a Mac and realised you could edit video with it.

I'm guessing that some if not most of the suck in COBOL comes from the alienation of programming labour that is implicit in its architecture and use cases.
posted by acb at 2:02 PM on September 21, 2020 [2 favorites]




This thread has been archived and is closed to new comments