Once we searched Google, but now Google searches us.
January 20, 2019 11:06 AM   Subscribe

Shoshana Zuboff has published a new book welcoming us to the age of surveillance capitalism. (LA Times)
The Guardian's tech editor John Naughton has a ten-question Q&A session with her.
Once we thought of digital services as free, but now surveillance capitalists think of us as free.
posted by adamvasco (13 comments total) 26 users marked this as a favorite
 
I find that kind of gross, actually, because her early-oughts book The Support Economy read as a gung-ho cheer squad for shifting most of the economy to gigging support work, with nothing but handwaving to explain why the reintroduction of a servant class would go well this time. I bounced off it hard, and many of the failings of the system were obvious risks even then.
posted by clew at 11:18 AM on January 20 [3 favorites]


(To clarify -- servants famously know everything about their masters, more perhaps than the masters understand about themselves; but the servants remain servants and the masters masters. That knowing is what The Support Economy proposed as the new kind of work. We have seen many examples since then of how much human labor makes up the surveillance economy and how bad it is for the laborers.)
posted by clew at 11:28 AM on January 20 [1 favorite]


sort of like with rachel botsman :P

also btw...
-Apple's CEO, Tim Cook, calls for a new federal privacy law while attacking the 'shadow economy' in an interview with TIME
FUCKING FINALLY, YES, A PRIVACY REGULATION PROPOSAL THAT MAKES SENSE

Tim Cook is calling for a federally-regulated clearinghouse for the (now) completely unregulated and Wild West third-party market in user data. This is where the real data villains live, let me explain... [threadreader]
-The Internet Is A Privacy Disaster. But We Still Don't Know How To Talk About It.
-After 3 Decades, Privatization Has Been Proven a Failure. Let's Bury It for Good.
posted by kliuless at 11:33 AM on January 20 [3 favorites]


Yeah, right. Leaving folks' privacy unharvested is like leaving the coal, oil, and gold in the ground. Good luck with that.

"We are property." -- Charles Fort
posted by hank at 11:36 AM on January 20 [2 favorites]


"don't be evil" ---> "gotcha"
posted by a humble nudibranch at 1:30 PM on January 20


The Internet Is A Privacy Disaster. But We Still Don't Know How To Talk About It.

Who is this "we?" Because there are companies doing the privacy-collecting who certainly have a well-developed language around the activities of the industry, and have been for a long time. Seriously, probably at least 1/10th of all employees in internet-oriented companies, and certainly advertising-driven ones, know exactly what's going on and how to talk about it.

It's just that people would shit bricks if they learned how they talk about it. But don't pretend like the language doesn't exist.
posted by rhizome at 1:40 PM on January 20 [4 favorites]


I'm not sure what the Rachel Botsman¹ ref implies -- from the short interview at the link,
we’ve got to hang on to the fact that trust is a very human process and that we should not outsource this to bots and algorithms. We have to take personal responsibility for in whom and where we place our trust.


Who is responsible for being trustworthy? Or at least for earning the trust of the people society claims to protect? --
kids and artificial intelligence, and how this will change their capacity to trust, and their decision-making processes. Currently, we depend on technology to do something, whereas they will depend on it to decide for them.
Ohhhhkay, full creepy.

¹ nominative determinism?!??
posted by clew at 1:49 PM on January 20




kids and artificial intelligence, and how this will change their capacity to trust, and their decision-making processes. Currently, we depend on technology to do something, whereas they will depend on it to decide for them.

Even if not nominative determinism, here we see technocratic determinism. Not only that, but written by someone who is calling these future people "kids," which is nicely paternalistic. I'm also not sure what she means by trust "changing," except that it means things you used to be able to trust no longer merit it, and new things will have to become trustworthy. But I don't think that's what she's saying. Regardless, the point is posed badly and deterministically, as if the kids of the future won't resist having decisions made for them (perhaps an allusion to insurance? only way I can square it).
"In general, I try and distinguish between what one calls the Future and “l’avenir” [the ‘to come]. The future is that which – tomorrow, later, next century – will be. There is a future which is predictable, programmed, scheduled, foreseeable. But there is a future, l’avenir (to come) which refers to someone who comes whose arrival is totally unexpected. For me, that is the real future. That which is totally unpredictable. The Other who comes without my being able to anticipate their arrival. So if there is a real future, beyond the other known future, it is l’avenir in that it is the coming of the Other when I am completely unable to foresee their arrival."
--Jacques Derrida
posted by rhizome at 2:07 PM on January 20 [1 favorite]


In a podcast interview, Sam Harris asked Derren Brown (a mentalist magician) whether or not he was surprised to find he could get people to, for instance, rob an armored car through suggestion techniques. He wasn't surprised at all. Derren said he laid down the behavioral track, so to speak, for these people to follow, and it wasn't surprising in the least that they followed it.

I think laying down behavioral track is a lot of what the internet, and social media in particular, is currently doing. Unfortunately, what's monetized is attention, and the people who spend the most time on the internet are conspiracy theorists; therefore, everyone must be turned into a conspiracy theorist. Just as print advertising preys on the consumer's anxieties, online media preys on the consumer's doubts concerning reality.

I just thought that up, and the idea probably has holes but also sounds like it has some merit to me.
posted by xammerboy at 11:51 PM on January 20


> ...the people that spend the most time on the internet are conspiracy theorists, therefore, everyone must be turned into a conspiracy theorist.

All of us have become Richard Nixon
posted by CheapB at 9:34 AM on January 21


This led me to read some more about Facebook's contagion experiment mentioned in the Guardian interview, where Facebook sought to intentionally manipulate the emotions of over 600,000 people.

This article in Research Ethics goes beyond the obvious concern that these Facebook users were treated as human subjects to address problems with Facebook's selective presentation of content on the timeline:
The bottom line is that while users can control the content of their disclosures in mediated environments, they cannot control how their disclosures are presented to others. Users cannot control the appearance and timing of their disclosures or any surrounding disclosures. This study sought to exploit that inability and demonstrate the significant impact the manipulation of presentation can have on the perceptions and effect of disclosures. This means that both disclosers and recipients of information are vulnerable to intermediaries. We believe our policy and norms should better reflect such power.
...
...Facebook tries to keep users’ morale high by minimizing the likelihood that when they use the service they will become anxious about the company’s motivations or procedures. As with community standards, this means certain triggers need to be minimized. If users had a felt sense that they were being actively experimented on, some would quit. The same profit-impeding result would occur if some users had a keen awareness that their virtuous identities were being converted into experimental collaborators whose behavior might harm others.

Think of it this way. When users interact with one another through Facebook’s interface, they are only given highly selective information. Essentially they get to see what they post, some of what their friends share, and a non-overwhelming amount of advertising. Some of what remains invisible, therefore, is strategic. Crucially, users are not given windows into how Facebook’s algorithms work (which is a trade secret) or how and why they get updated. Moreover, they only get the flimsiest peek into the scope of Facebook’s advertising agenda.
posted by exogenous at 12:56 PM on January 23


Shoshana Zuboff, writing in today's FT:
A new economic logic has taken hold, in which businesses quietly poach private human experience for data. It’s time to fight back.
posted by adamvasco at 1:51 AM on January 25 [1 favorite]

