Expanding such capabilities is only a matter of refining algorithms
September 27, 2012 6:30 PM   Subscribe

The Guardian Projects The Social Dystopia. Or, Big Brother wants to be your friend.
posted by Diablevert (43 comments total) 12 users marked this as a favorite
 
Yikes. I suppose it's already too late for us to edit ourselves into better jobs in the future.

The irony for me is Person of Interest is on. Finch's machine probably already knows about my migraines, etc.
posted by dragonplayer at 6:43 PM on September 27, 2012


Jihad, Butlerian: (see also Great Revolt) — the crusade against computers, thinking machines, and conscious robots begun in 201 B.G. and concluded in 108 B.G. Its chief commandment remains in the O.C. Bible as "Thou shalt not make a machine in the likeness of a human mind."
posted by Grimgrin at 6:49 PM on September 27, 2012 [2 favorites]


The best part of the link was the Zenni Optical ad at the bottom that was scrolling the exact frames I've been looking at, including the ones I just ordered.
posted by sourwookie at 6:50 PM on September 27, 2012 [4 favorites]


Then people (especially healthy, successful ones) learn that they can earn a few extra bucks by working as ghost tweeters to inject noise into the system.
posted by cosmic.osmo at 6:51 PM on September 27, 2012 [2 favorites]


Or some sort of Markov-chain algorithm that re-words your posts before publishing.
posted by sourwookie at 6:54 PM on September 27, 2012
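A minimal, purely illustrative sketch of such a reworder (a word-level Markov chain trained on the very text it scrambles; every name here is invented for the example):

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the text."""
    words = text.split()
    chain = defaultdict(list)
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def reword(text, length=12, seed=0):
    """Generate a plausible-but-scrambled variant of the input text
    by random-walking the chain of observed word transitions."""
    random.seed(seed)
    chain = build_chain(text)
    word = random.choice(list(chain))
    out = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:
            break
        word = random.choice(followers)
        out.append(word)
    return " ".join(out)
```

Every output word comes from the original post, so the result still sounds vaguely like you while carrying none of your actual sentence structure.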


This stuff, like Google searches and the processes of democracy, only works as long as people openly pursue their actual objectives. As soon as people start gaming the systems to artificially get undeserved results, the data ceases to be reliable. Cosmic.osmo is on the right track but I believe grossly underestimates the sheer scale of it. It won't be people acting as ghost tweeters here and there; it'll look more like the SEO industry.

I predict that by 2018, HR departments will have gone through several years of messing about with idiotic software sold to them by bottom-feeding scammers with the intention of weeding out "problem employees". The results of using this type of software, like SEO today, will be uniformly terrible and it will become impossible to post any job anywhere in the manner we do today without being deluged by thousands of automated applications from (illusory) perfect geniuses. We will be back to the old reliable pre-internet method: social proof (nepotism and networking, in other words). It is not what you know that matters, but who knows that you know it, and who is prepared to vouch for you.

I predict that job ads will almost completely disappear as a concept by 2022; instead, current employees will be tasked with bringing in potential interviewees, and as part of that, will be tasked with networking to be able to do that, as a KPI'd part of their existing positions. As an introvert, I'm completely OK with this. We socialize with other introverts around board games and knitting circles, we develop introvert-style relationships, and we assess the competence and character of our fellow introverts to a depth that extroverts don't usually bother going into. We are in fact better at this than extroverts are, because we are less likely to instantly become fast friends with someone over six rounds of beer in four different karaoke bars on a Saturday night.

So if Tina acquired her job because a friend who works there got it for her, the hiring manager is quite likely to know and accept the fact that she has migraines from time to time. (Just as an aside, the article author seems to be unfamiliar with the entire industry of disability employment placement.) The boss will have made a judgement call based on Tina's apparent competence for the task, her ability to fit in with the existing workforce, and any downsides of employing her.

Of course this has its own set of issues and is certainly open to discriminatory hiring practices. However, the nightmare dystopia will be worse, and the SEO-style system-gaming that it will evolve to predate upon it will be worse still.
posted by aeschenkarnos at 7:06 PM on September 27, 2012 [9 favorites]


This might not have happened yet, but it's already happening a little bit, which is surely why it's interesting. Already people are reluctant to be political and weird once they know they're being watched online. Which is worse: the discrimination, or the boredom of it all, trying hard to project a vibrant, fake personality on Twitter?
posted by kettleoffish at 7:14 PM on September 27, 2012 [1 favorite]


It's super silly because there really is nothing to worry about. The company still has to hire someone, and will still hire people who get migraines. Transparency is good. If there's a medical condition and you haven't disclosed it to your insurance firm, that just raises everyone else's premium; it's not a victimless crime.
posted by wilful at 7:20 PM on September 27, 2012


Hopefully I can use the same service to weed out law firms.
posted by Brocktoon at 7:54 PM on September 27, 2012


Wow. This will be unfortunate indeed.

On a side note, if anyone is listening, I am 100% healthy, with no history of migraines, sore throat, or bad hair days. I only keep aspirin in the cabinet in the event a dinner guest has a heart attack. Also, 0% chance of getting pregnant, by ANY model!
posted by cacofonie at 7:57 PM on September 27, 2012 [1 favorite]


If there's a medical condition and you haven't disclosed it to your insurance firm, that just raises everyone else's premium, it's not a victimless crime.

THIS IS SPARTA!
posted by srboisvert at 8:20 PM on September 27, 2012 [4 favorites]


Just make your Facebook private (and don't use applications) and censor yourself on Twitter a bit. I've been doing it for years now and it works just fine.
posted by Defenestrator at 8:22 PM on September 27, 2012 [1 favorite]


But Facebook itself exists to reap your data and sell it to data mining companies.
posted by agropyron at 9:14 PM on September 27, 2012 [1 favorite]


Defenestrator: "Just make your Facebook private (and don't use applications) and censor yourself on Twitter a bit. I've been doing it for years now and it works just fine."

The USG has backdoor interfaces on sites like Facebook.
posted by dunkadunc at 9:16 PM on September 27, 2012


The Facebook has backdoor interfaces on sites like UNITED STATES WORLD GOVERNMENT. Strangely they still just use it to advertise skin care solutions for local moms. The future is strange and helpful.
posted by passerby at 9:28 PM on September 27, 2012


passerby: "The Facebook has backdoor interfaces on sites like UNITED STATES WORLD GOVERNMENT. Strangely they still just use it to advertise skin care solutions for local moms. The future is strange and helpful."

That is idiotic.

If you want to do proper signals intelligence, you use Facebook to gather data and augment your databases. Who knows who? Who contacts whom? Who is a fan of what? Who posts what kind of links? What were they doing a year ago? Five?
posted by dunkadunc at 9:37 PM on September 27, 2012 [2 favorites]


Ok, what are the bets on a small counter-culture rising up around or in rejection to this? A cultural rebound from this. People posting absurd, purposefully galling, balls-out, novel and unique stuff all over their social networks.

A counter culture the specific and conscious purpose of which is to be a counter culture.
posted by sendai sleep master at 9:39 PM on September 27, 2012 [1 favorite]




I predict that in the future, transparency will be the reason to hire someone, in most industries.

However, some professions will still be terrified of hiring anyone with anything in their past that is deemed problematic. This will lead to:
Stage 1: The profession becoming filled with extremely boring people.
Stage 2: An explosion of 3rd party services designed to do the actual interesting aspects of the work that require creativity.
Stage 3: The 3rd party services become mainstream and start to get held to the same standards as the previous profession.
Stage 4: Repeat from Stage 1, ad infinitum.

Expanding such capabilities is only a matter of refining algorithms, setting up the right data hoses
This statement has elements of weaselism. The writer is saying "Watch out! Improvement is only a matter of solving the problems that are not solved yet!"
posted by niccolo at 10:56 PM on September 27, 2012 [2 favorites]


The article reads like science fiction to me. I love SF, but this doesn't sound like a particularly likely future.

For one thing, all this data the inquisitors want to mine may or may not still be there. Given its size and eight-year lifespan, I can sort of believe that Facebook will still be around in 2020, but that's not guaranteed. Eventually something else will come along and Facebook will disappear, taking all its user submitted data with it if the users themselves don't migrate everything to the next big deal. The Internet is a perishable medium. Everything is stored on servers, and those servers will eventually fail and/or be turned off. If the big social media companies have the staying power of Coca Cola or IBM, then that data may be continuously migrated to newer servers, but there's no guarantee this will be the case.

And for another, there isn't a lot of really valuable information in the everyday blather that we generate online. Status updates and blog entries are social constructions, and if some semi-sentient algorithm of the future chewed through millions and millions of them it would probably find a tremendous amount of similarity. It would discover a lot about how we want to be perceived, but comparatively little about what we're really like.
posted by Kevin Street at 11:23 PM on September 27, 2012 [2 favorites]


It's like that expression "the map is not the territory." They can know all sorts of things that you've told them, like who your friends are and the broad sketch of events that form your life, but they still don't know how you felt when you wrote the words they're studying. All they can do is infer based on statistical analysis, and that's a shadowy land of giant error bars that can never be as precise as the future imagined in the article.
posted by Kevin Street at 11:41 PM on September 27, 2012 [1 favorite]


Facebook will disappear, taking all its user submitted data with it if

Perhaps more likely is its investors break up the data and sell it off.
posted by Blazecock Pileon at 11:41 PM on September 27, 2012 [1 favorite]


Kevin Street: It's like that expression "the map is not the territory." They can know all sorts of things that you've told them, like who your friends are and the broad sketch of events that form your life, but they still don't know how you felt when you wrote the words they're studying. All they can do is infer based on statistical analysis, and that's a shadowy land of giant error bars that can never be as precise as the future imagined in the article.

And yet, CVs aren't exactly a running joke, and the norm is to include a 'personal statement' thing. That it isn't perfect doesn't mean that nobody will think it good enough.
posted by Dysk at 1:09 AM on September 28, 2012


Is CV short for Curriculum Vitae? From the Wikipedia page it sounds horrific.
posted by Kevin Street at 1:16 AM on September 28, 2012


I don't know if the photo caption is a joke. "Facebook: could posting on the social network mean saying goodbye to jobs in teh future?"
posted by howfar at 1:22 AM on September 28, 2012


Also, there's a reason this Guardian story has to be set in the US. The activities of Narrative Data's client, in discriminating against Tina on the basis of her migraine, which as described appears to potentially fit the definition of a disability for the purposes of the Equality Act 2010, would be a legal minefield in the UK. There are arguments that the employer could make if challenged, but such challenges would become a cost that would have to be factored into the system, along with safeguards in the recruitment process to substantiate any such defence.

Robust anti-discrimination laws mean that employers would need genuinely compelling arguments for adopting this kind of technology. This article does little to show that they will get them. Speaking vaguely of Moore's Law, as if more computing power makes all computing problems trivial, is not sufficient to bear the weight of the argument.
posted by howfar at 2:01 AM on September 28, 2012 [1 favorite]


If this would turn out to be a real practice, I'd also have to wonder about cases where they'd find extremely little or nothing about you.
posted by rudster at 2:04 AM on September 28, 2012


@dunkadunc

if it's cool with you i'm just going to restate something that sounds like what you said, but dumb and clumsy on purpose. i'm doing this to dismiss your concerns for some reason. you might see this being a common tactic for a certain genre of person, but it's not. trust me.
posted by This, of course, alludes to you at 3:36 AM on September 28, 2012 [2 favorites]


And for another, there isn't a lot of really valuable information in the everyday blather that we generate online. Status updates and blog entries are social constructions, and if some semi-sentient algorithm of the future chewed through millions and millions of them it would probably find a tremendous amount of similarity. It would discover a lot about how we want to be perceived, but comparatively little about what we're really like.

Who cares about what you're really like, or what you truly feel? If I'm an employer, I want to know things like: will they show up on time and work hard? If you're consistently posting to your wall at, say, 3 AM only on Fridays and Saturdays, then maybe you're the kind of hard-partier we don't want at Nose To The Grindstone, Inc.

You seem to think that the content of the posts themselves is particularly important, as if they're trying to deduce someone's psychological makeup from elevator chit-chat. But there are lots of things you can tell from patterns that are irrelevant to the content, and yet which might be revelatory as regards your personality and habits: how many friends have you got? How often do you talk to them? Do you know them in real life or only online (are you tagged in each other's pictures)? Are you close with your family (is there a large subset of friends in the network with the same last name who you frequently speak to)? Are you a night owl? A late sleeper? Do you post a lot during work hours?

All you need to figure out those things is the metadata; you don't even need to read the actual posts. Of course, if you can read the actual posts, there's much more to be gleaned. Hell, with a couple more decades of data one could probably pick out disease clusters from condolence posts.
posted by Diablevert at 4:29 AM on September 28, 2012 [1 favorite]
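The metadata-only inference described above could be sketched (hypothetically; the threshold and the "night owl" framing are invented for illustration) as:

```python
from datetime import datetime

def late_weekend_share(timestamps):
    """Fraction of posts made between midnight and 5 AM on a Saturday
    or Sunday, i.e. late on a Friday or Saturday night."""
    if not timestamps:
        return 0.0
    late = [t for t in timestamps
            if t.hour < 5 and t.weekday() in (5, 6)]  # Sat=5, Sun=6
    return len(late) / len(timestamps)

def looks_like_night_owl(timestamps, threshold=0.3):
    """Crude screen: flag a profile whose late-weekend share of
    activity exceeds an arbitrary cutoff."""
    return late_weekend_share(timestamps) > threshold
```

Note that no post content is read anywhere; the signal comes entirely from when the account is active.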


I think it's a mistake to dismiss the idea (of using software to automatically filter job candidates or insurance applicants based on social networking profiles) for the reason that it won't actually work very well. I agree that it won't actually work very well, in the sense that it would reject good candidates for silly reasons and that it would make decisions based on questionable information. But I think that alone isn't likely to prevent its adoption.

If you're an HR hiring drone under deadline to fill 10 positions, for which you've received 1,000 applications, your ostensible concern ("How can I find the ideal needle in this haystack?") is likely to morph into something slightly different ("How can I make this task faster, which is the only aspect of my performance my boss can really keep track of?") If software can winnow your list for you (because Facebook shows some of them "partying too hard") you're not only going to be tempted to do it, you may be facing institutional pressure to do it. The fact that the software rejects a couple of good candidates won't bother you, because you'll still have 500 others to choose from.

Job postings already have absurdly long and specific requirements lists, to the point where it's probable that no human being exists who could meet them all, for similar reasons. Polygraph tests, although crude and disreputable, were a pretty common feature of hiring until they were regulated in the 1980s, yet are still required for some types of job applications in the US. It's not clear that either of those practices works very well either.

In order to catch on, a new practice or technology doesn't have to work very well. It only has to benefit a few of the right people without hurting the organization that adopts it too much.
posted by Western Infidels at 5:29 AM on September 28, 2012 [2 favorites]


is the problem facebook or that corporate HR is evil?
posted by ennui.bz at 5:43 AM on September 28, 2012 [1 favorite]


Ethical Credit Scores are coming.
Is pureasthedrivensnow.com taken?
posted by fullerine at 6:38 AM on September 28, 2012




You could remove all Facebook tracking widgets from non-Facebook pages with these Adblock Plus rules:

||facebook.*$domain=~facebook.com|~127.0.0.1
||fbcdn.*$domain=~fbcdn.com|~facebook.com|~127.0.0.1

posted by jeffburdges at 6:50 AM on September 28, 2012


Oh no! Whatever shall I do to keep my private life private while still publishing it on the global information network?

Fools.
posted by General Tonic at 8:50 AM on September 28, 2012


Anyone who thinks this is not going to happen - and happen simultaneously in the banking, credit card and insurance industries - is not familiar with the history of risk management and that discipline/industry's desperate need to quantify.
posted by digitalprimate at 8:56 AM on September 28, 2012 [1 favorite]


Tina Porter is unqualified, all right, and not for specious data-mined reasons, either. Her resume is bogus--Colorado State University has no law school. It's not even in Denver. And you could look that up yourself, without a super-inferring bot or anything.
posted by Kylio at 10:03 AM on September 28, 2012 [1 favorite]


is the problem facebook or that corporate HR is evil?

the dimensions of the kyriarchy are all encompassing
posted by This, of course, alludes to you at 1:31 PM on September 28, 2012


Frankly, I'm more worried about places refusing to hire people without a social media presence. Apart from a neglected LinkedIn profile and a MySpace page I haven't visited in a year and which features a photograph of a plush Cthulhu as my profile picture, I don't have a social media presence.
posted by infinitywaltz at 2:12 PM on September 28, 2012


I think it's a mistake to dismiss the idea (of using software to automatically filter job candidates or insurance applicants based on social networking profiles) for the reason that it won't actually work very well. I agree that it won't actually work very well, in the sense that it would reject good candidates for silly reasons and that it would make decisions based on questionable information. But I think that alone isn't likely to prevent its adoption.

There are already a zillion different "personality profile" tests which are administered to potential job applicants, and which purport to be able to filter out the good applicants from the bad based on what kind of statements the applicant agrees or disagrees with. There's very little actual evidence that these kinds of things do anything but provide the hiring agent with a smaller pool of applicants to work with: no studies which show that the applicants are actually better at the jobs, that the tests actually measure anything meaningful, nothing like that.

Mostly, I think it's just a way for someone to have a smaller pool to choose from without having to actually make their own judgements to winnow it down, either because they don't trust their own decision making processes or because they are lazy.

This sounds like more of the same, only you don't have to sit in a cold room with 30 other people and circle A for Agree or D for Disagree about 80-odd peculiarly worded statements.
posted by hippybear at 2:17 PM on September 28, 2012 [1 favorite]


hippybear: "Mostly, I think it's just a way for someone to have a smaller pool to choose from without having to actually make their own judgements to winnow it down, either because they don't trust their own decision making processes or because they are lazy."

I think you've hit the nail on the head right there. My vote is that it's mostly the latter.
posted by InsertNiftyNameHere at 9:20 PM on September 28, 2012


either because they don't trust their own decision making processes or because they are lazy

Seems pretty obvious that cowardice, incompetence and laziness are the reason for the vast majority of evil in the world, with malice coming very distantly behind. One of the tasks of labour unions and employment law is to make it more trouble than it's worth to pull this sort of shit.
posted by howfar at 3:33 AM on September 29, 2012


Anyone remember handwriting analysis? Back in the day, when I was applying for shitty retail jobs in highschool, it used to be common to have to give a handwriting sample. Some company would "analyze" it and determine whether you were likely to start stealing out of the till, I guess.

The fact that something doesn't fucking work at all doesn't mean that companies won't actually implement it, at least for a while until they figure that out for themselves.

That's how I think "social network analysis" is going to go. It's going to be a fad, and some companies are going to buy into it, because hiring people is hard and fraught with difficulty already, and there is honest desperation at some places, trying to find reliable ways of separating "good" candidates from "bad", as though they are two discrete and non-overlapping groups. Many of the companies that will pay for such a service will be riddled with problems internally. And eventually the fad will burn itself out and desperate companies will turn to a new solution du jour for hiring screens.
posted by Kadin2048 at 5:55 AM on October 1, 2012




This thread has been archived and is closed to new comments