Avoiding the data-driven wave
August 6, 2019 5:50 AM

 
From the article:
... this emphasis on data minimization and privacy, use and users of information, community informatics, civil liberties and the human dimension of informational creation and consumption has been steadily eroded in favor of the same harvesting, hoarding, mining and manipulation that were once the exclusive domain of computer science programs.

As LIS [Library and Information Science] schools boost their hiring of computer science graduates, this transition is accelerating. At some schools, LIS scholarship traditions have been relegated to specialty tracks, with core programs looking almost indistinguishable from “light” computer science curriculums.
Devoid of context, data-driven tools can cause great harm.
posted by ZeusHumms at 6:49 AM on August 6, 2019 [2 favorites]


Even with context they can. But then (mostly) at least it was intended.
posted by aleph at 7:44 AM on August 6, 2019 [2 favorites]


Devoid of context, pretty much any kind of tools can cause great harm. While computer science curricula may well have become more dismissive of privacy and eager to collect data on everyone and everything since I was in school, I'm not convinced the change there has been greater than the same change that's happened in society in general. There are surveillance cameras in the school hallways, not just in elementary schools where the students are deemed to require constant adult supervision, but in universities too. You know who we don't blame? People teaching optics and electronics design.

Some good points are made, but I feel like I've read this same article many times over the years, with anthropology, literature, or even political science taking the place library science does here, and they made some sense too. Too narrow an education is harmful in any field, and the people who use the actual computer science they learned to write the actual code are not usually the ones causing the problems.
posted by sfenders at 7:53 AM on August 6, 2019 [5 favorites]


While computer science curricula may well have become more dismissive of privacy and eager to collect data on everyone and everything since I was in school, I'm not convinced the change there has been greater than the same change that's happened in society in general.

I do, for one simple reason - Alphabet, Facebook, et al. hide the details of their data gathering. They do this because (as we've seen) people do not want this level of data collection.
posted by NoxAeternum at 8:40 AM on August 6, 2019 [7 favorites]


Thanks for this post.
posted by carter at 8:56 AM on August 6, 2019


Do computer science programs often have ethics and professional standards components? Most other people who get to enter the workforce with a title of engineer tend to get at least a token overview of that sort of thing. I'm assuming there's not much for the computer science side.

If a structural engineer takes a punt on a calculation and a building falls down as a result, there are (usually) repercussions for that person. I'm guessing not so much for whoever writes (or signs off on) the code that causes an autonomous vehicle to drive into a barrier, or incorrectly identifies somebody as a criminal.
posted by Jobst at 9:53 AM on August 6, 2019 [1 favorite]


Jobst: If a structural engineer takes a punt on a calculation and a building falls down as a result, there are (usually) repercussions for that person.

From the article, I'm not sure that's the best analogy. It's more like an ethics course for engineers which asks them whether they really want to be part of the military-industrial complex, whether they really want to work on devices which kill and maim in service of the state, how to avoid letting the state use their skills for destruction. I don't think engineers get courses in that, though I could be wrong. Not many disciplines do. It sounds like Library Science was one of the few.
posted by clawsoon at 10:08 AM on August 6, 2019 [4 favorites]


Alphabet, Facebook, et al. hide the details of their data gathering.

That's a business decision that has very little to do with computer science. Facebook's entire operation has less to do with computer science than you might think, really. We sometimes call it a "technology company", but virtually any large enterprise you can think of is thoroughly reliant on technology, from General Motors to Monsanto. Facebook is based on computer science in the same way that United Airlines is based on aerodynamics. Yeah they need it, but that's not really to blame for most of what people complain about.

Although of course they don't shy away from using the latest stuff, and they're large enough to do some real research, it doesn't even require any particularly fancy technology to implement the basic Facebook business model. It took time for the Internet to get pervasive enough, but now that it has done so our social relationships could be effectively monetized and exploited with 1990s tech without too much difficulty. No interesting computer science need be involved.

Mind you I'm all in favour of better education for computer science students, who are more vulnerable than most to being enraptured by the Silicon Valley madness. I don't think they make a particularly great scapegoat though, and I suspect that their education is likely to be somewhat less deficient than is suggested here.
"The idea that data should be minimized to protect privacy was not even a concept. Secure systems design emphasized how to safeguard data from unauthorized access, but never the concept of how to safeguard the users whose data that was from harm."
Preventing unauthorized access to data is a definition of privacy in computer science terms. It is central to a lot of things that might be covered, from steganography to side-channel attacks to Tor. Hidden biases in ML algorithms are not some kind of secret everyone wants to ignore, they're an active area of research. People use Facebook all the time and so it, along with the other giant monsters, becomes the "face" of the tech industry, but I follow enough current research to know that it doesn't yet dominate academia to the extent you might imagine reading this stuff.
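To make the "data minimization" idea concrete (since the article claims it "was not even a concept"), here's a toy sketch of what it looks like in practice: keep only the fields the application actually needs, and pseudonymize the identifier before storage. The field names, whitelist, and salt are all made up for illustration — this is not anyone's real schema.

```python
import hashlib

# Hypothetical whitelist: the only fields this application actually needs.
NEEDED_FIELDS = {"city", "signup_year"}

def minimize(record: dict, salt: str = "example-salt") -> dict:
    """Return a stored record containing a salted hash of the email
    plus only the whitelisted fields; everything else is dropped."""
    pseudonym = hashlib.sha256((salt + record["email"]).encode()).hexdigest()[:12]
    kept = {k: v for k, v in record.items() if k in NEEDED_FIELDS}
    return {"id": pseudonym, **kept}

raw = {
    "email": "alice@example.com",
    "name": "Alice",          # dropped: not needed for the application
    "photo_url": "...",       # dropped: not needed for the application
    "city": "Toronto",
    "signup_year": 2019,
}

stored = minimize(raw)
# 'stored' now holds only a pseudonymous id, city, and signup_year --
# no name, photo, or raw email ever reaches the database.
```

The point of the sketch is that minimization is a design decision made before any data is written, which is exactly the kind of thing an ethics-aware curriculum would teach alongside access control.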

By all means give students some better education, but don't expect them to change the course of fucking Facebook by going to work for it slightly less ignorant of the choice they're making. It's the people in the board room who can do something there, not a bunch of fresh graduates of whom there will always be enough who are eager to sell out. There is, of course, also no shortage of people who'd rather work on their own federated privacy-respectful social media projects instead, but that doesn't pay nearly as well.
posted by sfenders at 10:55 AM on August 6, 2019 [3 favorites]


Do computer science programs often have ethics and professional standards components?

I graduated in 1998 with a Computer Science degree from a medium-sized Catholic university (Univ of Dayton) and we definitely studied ethics as they related to technology.
posted by mmascolino at 12:44 PM on August 6, 2019


I feel like asking "do CS majors study ethics" is an odd question, as even if CS majors did study ethics (and from what I can tell, most CS programs at least have/had an ethics course), the things that come up in this study of ethics are often grounded in yesterday's ethical concerns. I know my program basically talked about how important bullet-proof engineering was in the case of obviously mission-critical programming, like for medical systems (the Therac-25 bug being the ur-example there) or avionics or other things where lives could be lost due to software bugs, and that hacking is bad, mm-kay? (Cliff Stoll's The Cuckoo's Egg was required reading.)

Little to no attention was paid to the effects of massive datasets and datamining and all the various privacy issues surrounding all of that, because it just wasn't a thing people were thinking about at the time. Google was still in their could-do-no-wrong "don't be evil" phase of existence. Facebook was still something you had to have a .edu email account to join. Microsoft was still struggling to get Vista out the door. Amazon still mostly sold books. Apple had only just released the iPhone before I graduated, and the era of ubiquitous computing was still a glimmer on the horizon. The only movement I remember seeing on this front was concern about how governments were beefing up surveillance practices post-9/11, and since the extent of it was hidden from the public, it was couched in a lot of 'what-ifs' based on the little we did know.

I say this not to excuse how ethics were taught in CS courses, just that y'know, hindsight is 20/20 and these concerns didn't have the sort of traction then that they do today. I feel like people underestimate just how ill-prepared society as a whole and CS as a discipline has been to deal with the rapidly changing landscape of technological progress at that time when it comes to ethics.
posted by Aleyn at 2:54 PM on August 6, 2019 [3 favorites]


Do computer science programs often have ethics and professional standards components? Most other people who get to enter the workforce with a title of engineer tend to get at least a token overview of that sort of thing. I'm assuming there's not much for the computer science side.

What clawsoon said, basically. There is some sort of "don't hurt people through negligence" ethics, though it's not as developed as it is in other engineering disciplines because CS is relatively young and relatively... not disciplined. The only time someone will try directly to dissuade you from a career building things to hurt people on purpose is if a particular professor wants to make a point about it (which actually did happen a bit in the course of my education but just a bit). I would guess that is true of other engineering disciplines, because that's always been one side of what engineering does.
posted by atoxyl at 1:14 AM on August 7, 2019 [1 favorite]


I say this not to excuse how ethics were taught in CS courses, just that y'know, hindsight is 20/20 and these concerns didn't have the sort of traction then that they do today. I feel like people underestimate just how ill-prepared society as a whole and CS as a discipline has been to deal with the rapidly changing landscape of technological progress at that time when it comes to ethics.

Except that part of what taught Mark Zuckerberg that he was beyond being held accountable was that Harvard softpedaled the ethics concerns over his initial version of Facebook and how it used the images of Harvard students. HIPAA, which focuses heavily on the securing of sensitive data, was signed into law in the mid-90s. This is why I don't buy the argument that CS moved faster than ethics did - the evidence shows that ethical concerns on data existed, but just got ignored.
posted by NoxAeternum at 6:22 AM on August 7, 2019 [1 favorite]


Suffice it to say that this is a constant tension in parts of my job, and I'm glad to see it being discussed.

Not exactly apropos, but in my grad school the Information Management and Library and Information Science programs were under the same school but almost totally siloed, and it was IMO detrimental to both groups. Even just looking at demographics, the LIS people were almost all twenty-something white women and the MIMs were almost entirely slightly younger South Asian and Chinese men.
posted by aspersioncast at 9:59 AM on August 7, 2019


This is why I don't buy the argument that CS moved faster than ethics did - the evidence shows that ethical concerns on data existed, but just got ignored.

I feel like you're making roughly the same point I was making, but from a different angle. I didn't claim these concerns didn't exist, I'm claiming that they were thought unimportant or overblown if they were considered at all; this data hadn't been abused yet, so was it really a problem that Facebook had the name, photo, city and email address of all of its users? The phone company printed out books with all that information and more (save the photo) for their subscribers, so why is this data set more of a problem than that? HIPAA is one thing; the value of medical data is obvious on its face, but this was more like phone-directory information plus a posting history and photos. I think it felt a lot different at the time, and the scale and reach of the internet was still being understood. We were still in the stage of 'lol, people posting about their meals, how boring and stupid' thinking about social media.

My point is more that given the state of things at the time, it's not surprising to me that CS ethics courses didn't treat these topics, and I was addressing the specific question of "are CS majors even taught ethics?" (to which the tl;dr answer is "generally yes, but it wouldn't have helped anyway") not "what should be taught in a hypothetical CS ethics course?"
posted by Aleyn at 4:27 PM on August 7, 2019 [1 favorite]



