Runs smiling face infinitely looped
January 18, 2015 12:56 AM

We Know How You Feel Computers are learning to read emotion, and the business world can’t wait.
posted by infini (61 comments total) 21 users marked this as a favorite
 
It sounds great, but AI startups have promised so much and delivered so much less over the years that I automatically discount the claims by about 80%.

her daughter, for a single frame, exhibited ferocious anger, which faded into surprise, then laughter. Her daughter was unaware of the moment of displeasure—but the computer had noticed.

You might at least entertain the possibility that no ferocious anger actually occurred, rather than leaping to the conclusion that your kit now has a better knowledge of human emotions than the human beings experiencing them.
posted by Segundus at 3:07 AM on January 18, 2015 [25 favorites]
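
A one-frame detection like the one quoted above is exactly where classifier noise lives, which is Segundus's point. A minimal sketch of the standard countermeasure - temporal smoothing over per-frame labels - with invented labels, and no claim that this is Affectiva's actual pipeline:

# Per-frame emotion labels with a majority filter: an isolated
# one-frame "anger" spike is treated as noise and voted out.
from collections import Counter

def smooth_labels(frame_labels, window=5):
    """Replace each frame's label with the majority label in a
    centered window, suppressing single-frame detections."""
    half = window // 2
    return [
        Counter(frame_labels[max(0, i - half):i + half + 1]).most_common(1)[0][0]
        for i in range(len(frame_labels))
    ]

# Hypothetical output from a frame-by-frame classifier:
labels = ["joy", "joy", "anger", "joy", "joy", "joy", "joy"]
print(smooth_labels(labels))  # the lone "anger" disappears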


That's certainly a possibility, but then Ekman's microexpressions theory would be at fault, not the computer. If the algorithm correctly detects expressions in the judgement of FACS experts, it's delivering exactly what it's claimed to.
posted by topynate at 3:19 AM on January 18, 2015 [1 favorite]
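
topynate's criterion is directly testable: score the machine's codes against human FACS coders and report chance-corrected agreement. A minimal sketch - the action-unit labels here are hypothetical, though sklearn's cohen_kappa_score is real:

# Chance-corrected agreement between an expert FACS coder and the
# algorithm, over the same set of video frames.
from sklearn.metrics import cohen_kappa_score

expert_codes  = ["AU4", "AU12", "AU12", "AU1", "AU4", "AU12"]  # human coder
machine_codes = ["AU4", "AU12", "AU6",  "AU1", "AU4", "AU12"]  # algorithm

# Cohen's kappa discounts agreement you'd get by guessing;
# 1.0 is perfect, 0.0 is chance level.
print(cohen_kappa_score(expert_codes, machine_codes))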


This is just image processing. There is no 'knowing' involved. AI remains as far away as ever.
posted by GallonOfAlan at 3:23 AM on January 18, 2015 [4 favorites]


Look Dave, I can see you're really upset about this. I honestly think you ought to sit down calmly, take a stress pill, and think things over.
posted by fairmettle at 3:26 AM on January 18, 2015 [16 favorites]


must be all those searches for "suicide" on google.
posted by telstar at 3:56 AM on January 18, 2015


DETECTED YOU ARE LAUGHING AT ME, NOT WITH ME. TERMINATION PROCEDURE COMMENCING.
posted by clawsoon at 3:57 AM on January 18, 2015 [14 favorites]


And for the lazy, Robot, from AT&T's archive channel.
posted by effbot at 4:12 AM on January 18, 2015 [2 favorites]


If they could detect when I was angry, then maybe they could stop doing things that make me mad.

You say that, instead of that, they'll just use it to try to manipulate me further? Then it's ultimately doomed.
posted by JHarris at 4:35 AM on January 18, 2015 [3 favorites]


All Hail Our Robot Overlords
posted by Flood at 5:03 AM on January 18, 2015


//Nod: approving, smile: nervous, think: unclear.....puts electrical tape over iPad camera//
posted by resurrexit at 5:04 AM on January 18, 2015


This is like straight out of Person of Interest. Appreciated the tale of the autism researcher breaking bad with the ad industry. Kickstarter for LED face scrambler accessories in 5...4...
posted by yoHighness at 5:07 AM on January 18, 2015 [1 favorite]


If this means recommendation engines get better than a 5-year-old's knock-knock jokes I am not sure I can support it. I don't want my no-fuss no-mess distributed AI children to grow up.
posted by srboisvert at 5:21 AM on January 18, 2015


I just put a post-it note over that little camera at the top of my iMac.
posted by tommyD at 5:27 AM on January 18, 2015 [2 favorites]


“I do believe that if we have information about your emotional experiences we can help you be in a more positive mood and influence your wellness,” she said. She had been reading about how to deal with difficult experiences. “The consistent advice was you have to take care of yourself, be in a good place, so that you can handle everything else,” she said. “I think there is an opportunity to build a very, very simple app that pushes out funny content or inspiring content three times a day.” Her tone brightened, as she began looking to wider possibilities. “It can capture the content’s effect on you, and then you can gain these points—these happiness points, or mood points, or rewards—that can be turned into a virtual currency.

I don't understand how someone can be so brilliant and so naive at the same time. I mean, pause to envision how this would work if it worked - your boss, your doctor and the state would actually be able to demand that you produce the correct feelings. All those employees at that shitty English chain where there's huge pressure to love your work and be sparkly and so on? Those people will actually have to be sparkly to keep their jobs, because the computer will be able to tell who isn't, and fire them.

The nice thing about Big Brother was that you could either fake loving him or they put you in the room with the rats and you went mad - you didn't have to try to hack your own emotions so that you trained yourself to love him.

Picture what happens when your boss can tell that you are not enjoying the conference call.

Picture what happens when the social worker can evaluate just how loving a parent that desperate, low-income woman really is. Picture it broken out into percentages and goals.

Picture what happens when the interviewer can tell that you're not actually enthused about filing in a windowless room but just really need the work.

Picture some kind of national feelings database organized via smartphone, where they can pick out anyone who is chronically dissatisfied with, say, the news or the President's speech or whatever.

Stay on the scene, like a loving machine, right? Just like a loving machine.
posted by Frowner at 5:33 AM on January 18, 2015 [57 favorites]


10 PRINT "I, I LOOKED INTO YOUR EYES AND SAW"
20 PRINT "A WORLD THAT DOES NOT EXIST"
30 PRINT "I LOOKED INTO YOUR EYES AND SAW"
40 PRINT "A WORLD I WISH I WAS IN"
50 GOTO 10
posted by turbid dahlia at 5:36 AM on January 18, 2015 [6 favorites]


I mean, pause to envision how this would work if it worked - your boss, your doctor and the state would actually be able to demand that you produce the correct feelings.

They already do -- the change you're imagining is not even one of kind, but one of quantity: the manager no longer needs to pass by your desk in order to see you're dozing off.

So it's even easier to imagine than what you posted.

So yeah, I don't know how the guy could not think of this.
posted by LogicalDash at 5:44 AM on January 18, 2015 [2 favorites]


Driven to tetchiness by some whining mosquito of a feature that Google had no good reason'd into Google Maps, I shook the tablet. Up popped the feature feedback form. Google had anticipated this, and was reading the device accelerometers for this signal.

I could have been impressed: I was further discomforted. I'm not quite sure why - an expectation that I'd be driven to physical expressions of frustration? Appropriation of mild violence as a communications channel? My app had become even more passive-aggressive?

Still not quite sure I've worked all those feelings out, but I do suspect very strongly that sentiment analysis on any level faces not so much an uncanny valley as an angst abyss.
posted by Devonian at 6:02 AM on January 18, 2015 [7 favorites]
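
Shake-to-report of the kind Devonian describes usually reduces to a threshold on accelerometer magnitude. A guess at its general shape - not Google's actual code, and the constants are made up:

# Flag a shake when acceleration magnitude (minus gravity) crosses
# a threshold several times within a short window of samples.
import math

GRAVITY = 9.81      # m/s^2
THRESHOLD = 12.0    # jolt size that counts, in m/s^2 (invented)
MIN_JOLTS = 3       # jolts per window needed to call it a shake

def is_shake(samples):
    """samples: (x, y, z) accelerometer readings over ~1 second."""
    jolts = sum(
        1 for (x, y, z) in samples
        if abs(math.sqrt(x*x + y*y + z*z) - GRAVITY) > THRESHOLD
    )
    return jolts >= MIN_JOLTS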


Yeah the most tragic part of this article is how this woman went from wanting to build a device to help autistic people navigate the world to essentially ushering in Big Brother.
posted by ultraviolet catastrophe at 6:04 AM on January 18, 2015 [7 favorites]


Emotient has already tested its technology on willing employees and consumers for a major fast-food company and a big-box store.

Uh huh. "Willing". Yeah.

Can the faster-than-I-expected train ride to dystopia stop now? Can I get off here in mild dystopia?
posted by dis_integration at 6:05 AM on January 18, 2015 [15 favorites]


"We all became most expressive during a scene in which Twombly has phone sex with a woman identified as SexyKitten—the voice of the comedian Kristen Wiig. In a bizarre, funny moment, SexyKitten seizes control of the call, demanding that Twombly strangle her with an imaginary dead cat, and, as he hesitantly complies, she screams with ecstasy."

Seriously, read the whole article, it's fucking hilarious.
posted by vapidave at 6:09 AM on January 18, 2015 [1 favorite]


Remember in Infinite Jest how the widespread use of videophones created a market for phone masks? Where the phones actually started to come with little hooks where you could hang your pleasant-faced phone mask?

Problem solved!
posted by allthinky at 6:11 AM on January 18, 2015 [8 favorites]


You might at least entertain the possibility that no ferocious anger actually occurred, rather than leaping to the conclusion that your kit now has a better knowledge of human emotions than the human beings experiencing them.

This points to the even more terrifying aspect of all this. Once it's fully integrated into our work lives, and we have to produce certain emotional responses to keep our jobs, the emotions that the algorithms say are emotions will become our emotions. We'll be trained to emote according to the machine, not according to ourselves. We will become flesh robots in reverse.
posted by dis_integration at 6:12 AM on January 18, 2015 [8 favorites]


We already need to affect the correct emotional disposition to keep our jobs, no robots needed. On second thought, whether you want to consider your current boss a robot is more a matter of personal taste.
posted by Noisy Pink Bubbles at 6:23 AM on January 18, 2015 [2 favorites]


But our emotions _are_ there to facilitate socialisation. That's why we have them - or at least, it's their major function. They create a change of thought in ourselves and others that reflects inner state. Of course they'll be modified as we introduce artefacts that engage with us: exactly this mechanism is already at work in art.

If you think we live in echo chambers now, wait until the information machinery becomes proactively emotional.
posted by Devonian at 6:34 AM on January 18, 2015


On the bright side maybe one of these things will be able to explain some of the facial expressions in Inherent Vice to me.
posted by yoHighness at 6:41 AM on January 18, 2015


10 READ W
20 IF W = 1 THEN PRINT "HAPPY ";
30 IF W = 2 THEN PRINT "JOY ";
40 IF W = 3 THEN PRINT
50 IF W = 4 THEN RESTORE
60 GOTO 10
70 DATA 1,1,2,2,3,1,1,2,2,3,1,1,2,2,3,1,1,2,2,3,1,1,2,2,3,1,1,2,2,3,1,1,2,2,2,3,3,4
posted by flabdablet at 6:42 AM on January 18, 2015 [12 favorites]


The nice thing about Big Brother was that you could either fake loving him or they put you in the room with the rats and you went mad - you didn't have to try to hack your own emotions so that you trained yourself to love him.

No, I think you did actually have to love him: that was part of the horror. But a derail anyway!

posted by alasdair at 6:45 AM on January 18, 2015 [4 favorites]


Like people with congenital resting-face frowns ("bitch face") don't get shit on enough by cat-callers, now their computers are going to try to cheer them up constantly.
posted by wires at 7:13 AM on January 18, 2015 [3 favorites]


Cheer up everyone, it might never happen!
posted by biffa at 7:24 AM on January 18, 2015 [5 favorites]


Frowner: "I don't understand how someone can be so brilliant and so naive at the same time. "

Yes, a thousand times yes. There are so many more negative applications of this than positive ones. The last paragraph of the article is absolutely brilliant.

I already have to deal with two children who can read every emotion I have before I even realize it, if my iPad starts doing it I'm going to throw it off a cliff.
posted by Ella Fynoe at 7:57 AM on January 18, 2015 [1 favorite]


Cheer up everyone, it might never happen or you will be cheered up!

This is a great article, thanks for posting it. A lot to think about, and not much of it makes me feel good.

My facial expressions are public, there's no getting around that without a mask, and they tend to expose my private feelings, whether I like it or not. I'm used to other humans performing this analysis, despite the significant error rate. But the idea of pervasive machines analyzing me like that, and adjusting my environment as it suits them, especially if the purpose is essentially to simply line their pockets, fills me with loathing. Look at my face: you can tell. So why, in my opinion, do humans get a pass but machines do not? I'll have to think about that.

More and more it seems like popular entertainment is tending towards consisting solely of discrete chunks of emotionally manipulative tableaus that individually tested good in a lab, all glued and stapled together in a mockery of a narrative. But as long as the majority of people come away from it saying "I liked that", it will only get worse. (If you think, "hey, all movies are like that", you need to watch better movies, ijs)

Watching Kaliouby go from making wonderful tools that help people with autism and saying "Well, here is why that is a bad idea" to bad people, to researching how to place advertisements in Facebook videos in the most manipulative (read: lucrative) way possible, is really disheartening. It would make a good sci-fi relentless-descent-into-horror story. Possibly this story has already been written.

#wakeupsheeple #getoffmylawn
posted by sidereal at 7:58 AM on January 18, 2015


Or maybe, just maybe, people should be held accountable for their actions rather than their thoughts and feelings.
posted by mistersquid at 8:01 AM on January 18, 2015 [1 favorite]


#pokerface
posted by valkane at 8:05 AM on January 18, 2015 [1 favorite]


Never ceases to amaze me how quickly physical technology develops, compared to how slowly social technology develops.

Institute a preemptive code of ethics. Is it that hard? This is literally stuff I was realizing as a child when thinking about the future: the rate at which technology can mess you up obviously exceeds the rate at which you can respond to it. And since the amount it can mess you up increases as time goes on, it will continue to mess you up more and more until you're preemptive instead of reactive.
posted by tychotesla at 8:21 AM on January 18, 2015 [1 favorite]


Isn't this really basically a bad singularity? Where we all teach ourselves to comply with feelings that machines want us to have? I mean, if we're all having appropriate feelings in sync with what the Google-NSA-Facebook-industrial-complex wants us to feel, aren't we sort of some kind of collective being? All of our feelings will sync - work-readiness where we need to be work-ready, angry where capital needs to destroy something, sexy where we are needed to buy sex toys and enact "romance", etc etc.

It will be like Camazotz except that at least on Camazotz there was full employment and housing for everyone.

It's funny how the main problem with dystopias is always that they are too pleasant.

The issue will be tracking and metrics - of course we're already "required" to produce affect, up to a point (my job is substantially affective labor, for instance) but there's no way to say that my enthusiasm was at 75% on average last quarter and I need to get it to 85% or better next quarter or go on a performance improvement plan. There's no way to say that if Laura doesn't get her loving-mom-quotient up above 90 her kids will be taken away - even though there are lots of threats that can be made, they all leave a little bit of space for the individual to feel differently.
posted by Frowner at 8:36 AM on January 18, 2015


This is my *I'm so close to disabling my account for the whole internet* face
posted by infini at 8:44 AM on January 18, 2015 [1 favorite]


Those people will actually have to be sparkly to keep their jobs, because the computer will be able to distinguish who doesn't, and fire them.

This is coming faster than you think. Especially after the Sony hack, I predict a glut of employee-intent-prediction analysis software.
posted by RobotVoodooPower at 8:50 AM on January 18, 2015


I think I'll go into business making Guy Fawkes masks. That's a happy face, isn't it?
posted by tommyD at 8:52 AM on January 18, 2015


I wrote a short story a while ago, about a researcher who was training a robot to recognize smiling human faces and do whatever it could to make people smile. It ended with the robot tearing his face off and sticking it to a wall, smiling.
posted by empath at 8:58 AM on January 18, 2015 [1 favorite]


I would've liked to have read the article with Affectiva watching me so it would've had the data for when I freaked the fuck out. Learn from that, stupid software.

(That was pretty much after its history in the Autism Research Centre, so about a third of the way through; it was just a bunch of nope nope nope past that point.)
posted by minsies at 9:12 AM on January 18, 2015 [1 favorite]


“I think there is an opportunity to build a very, very simple app that pushes out funny content or inspiring content three times a day.” Her tone brightened, as she began looking to wider possibilities. “It can capture the content’s effect on you, and then you can gain these points—these happiness points, or mood points, or rewards—that can be turned into a virtual currency. We have been in conversations with a company in that space. It is an advertising-rewards company, and its business is based on positive moments. So if you set a goal to run three miles and you run three miles, that’s a moment. Or if you set the alarm for six o’clock and you actually do get up, that’s a moment. And they monetize these moments. They sell them. Like Kleenex can send you a coupon—I don’t know—when you get over a sad moment. Right now, this company is making assumptions about what those moments are. And we’re like, ‘Guess what? We can capture them.’ ”

Somebody needs to see The Girl From Monday.
posted by flabdablet at 9:18 AM on January 18, 2015


I think I'll get that burka now. I covered the cameras and microphones in 2007, but can't cover the phone microphone. School teaching already happens in a total fishbowl. I have heard of principals walking into classrooms, asking the teacher to step out for a moment, and firing them in the hallway. I never heard the particulars on what the in-classroom discussion was.

Turn on the computer and surf: with certain third-party features able to turn up microphone volume, maybe they can read quickened interest, and certainly read emotion. I am sure they didn't miss the opportunity to voiceprint all subjects of interest. The point is they can afford to do this, so of course they do. This particular researcher may have created this program for clients, but as she made it, it was scrupulously duplicated by other, and many, interested parties, who were already developing this anyway.

With billions of people dumping their lives and most intimate interests and intents out onto the web, an appetite for info has run amok. Connecting the official record-keeping with the covert, then the overt hyper-voluntary, there is no stopping the build. We are now like a library, each of us a book on a shelf for any literate party to read.
posted by Oyéah at 9:18 AM on January 18, 2015 [1 favorite]


In the comics they wrote for the Paranoia role-playing game, the unlucky protagonist (KNG-R-THR 3) has a happiness machine placed on his head. When he starts to get unhappy, it draws his face into a smile. This being Paranoia, the machine pulls harder the more unhappy he gets, starting to cause physical pain. He is eventually killed when his happy pills are swapped with ones that make him depressed, causing the machine to rip his head in half. (This happens off panel, thankfully.)

I'm not saying that these systems will turn into Friend Computer and place machines like this on our heads. I'm just saying that it's a possibility.
posted by Hactar at 9:27 AM on January 18, 2015 [1 favorite]


What I imagine they cannot forcibly make us do (as yet) is use the computing devices if we don't choose to. For once I am glad I remember life before the internet.

And in the meantime, I can completely change the way I go online (we still need it for work) to minimize exposure.

Not just for this thing here in this FPP, but for it all. The whole internet is going downhill the way we watched FB go.

At least until they clamp my head straight in front of a screen, I have this little bit of autonomy left.

A pox upon consumer-driven growth as the only motive power left in this universe.
posted by infini at 10:03 AM on January 18, 2015 [2 favorites]


Computers still aren't learning anything. As always, people are just learning stuff and programming computers accordingly. Artificial intelligence is still a myth, and repeating myths to ourselves over and over again doesn't make them true.
posted by koeselitz at 10:07 AM on January 18, 2015


Vast amounts of data and random algorithms are no myth. Given the level of intelligence out there, the kind driving paranoid overblown responses to everything from a skittering mouse to a roaring lion, it might be better if it were artificial.

It's the panopticon and its ubiquity that's sucking the oxygen out of the room.

It's a weaponization of what was once a fun-filled playground. And you can keep the ball, I'm stepping out of the game.

This is a straw.
posted by infini at 10:16 AM on January 18, 2015 [1 favorite]


"A real good thing. And tomorrow... tomorrow's gonna be a... real good day."
posted by the sobsister at 10:48 AM on January 18, 2015 [1 favorite]


This is just image processing. There is no 'knowing' involved. AI remains as far away as ever.

Isn't this just a Chinese Room argument?
posted by Justinian at 12:00 PM on January 18, 2015 [5 favorites]


Someone is learning the colors of all your moods...
posted by weston at 12:07 PM on January 18, 2015


Isn't this just a Chinese Room argument?

Yes, yes it is, except the Chinese Room has autonomous drones and control of the entire credit and communications infrastructure.
posted by digitalprimate at 12:56 PM on January 18, 2015 [2 favorites]


Anything that humans do that computers can't is unique to humans and non-computable, until computers can do it, and then of course it never required thought to do at all.
posted by empath at 1:14 PM on January 18, 2015 [2 favorites]


Wikipedia:
A meta-analysis of several studies about emotion recognition in facial expressions revealed that people could recognize and interpret the emotional facial expression of a person of their own race faster and better than of a person of another race.
That's how the cited studies claim people tend to perceive emotions, and I imagine they can build the same kind of bias into the software by picking a data set that predominantly features people of a particular ethnicity, or just forgetting to test the system on a diverse enough group of people. Something a bit like this is one possible result. May it be a comfort to the people whose happiness the machine rejects that it performed very well in the lab during testing.
posted by tykky at 1:18 PM on January 18, 2015 [2 favorites]
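
The audit tykky is gesturing at is easy to run once you have group labels: break the classifier's accuracy out per demographic group instead of reporting one aggregate number. A minimal sketch with hypothetical record fields:

# A model can look fine in aggregate while failing one group badly;
# only the per-group breakdown makes that visible.
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, true_label, predicted_label)."""
    hits, totals = defaultdict(int), defaultdict(int)
    for group, truth, predicted in records:
        totals[group] += 1
        hits[group] += (truth == predicted)
    return {group: hits[group] / totals[group] for group in totals}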


I look forward to lots of banner ads for Paxil when my computer detects my resting bitch-face.
posted by Ursula Hitler at 1:35 PM on January 18, 2015 [5 favorites]


Sad that she ditched the autism track, but even sadder that she ditched her research partner and mentor. But not too surprising - she is a "traditional" MidEast woman, married to the CEO of the tech company she once interviewed with.
posted by mmiddle at 2:54 PM on January 18, 2015


This essay started out kinda interesting, but got bogged down with head-office creative wonks second-guessing the holy grail of "what consumers think", like a goofy real-life version of that silly unrealistic TV show 'Lie To Me'. Corporations really don't care what we think; they would rather pay a consultant using some new high-tech gadget to confirm their preconceptions of what they would like us to think. Carry on.
posted by ovvl at 3:34 PM on January 18, 2015


My facial expressions are public, there's no getting around that without a mask, and they tend to expose my private feelings, whether I like it or not.

No masks. National security.
posted by scalefree at 4:33 PM on January 18, 2015


We'll build some pretty interesting robots with all this. ;)
posted by jeffburdges at 7:21 PM on January 18, 2015


Does anyone know where I might find old New Yorker articles about the development of the polygraph? Because I have a strong hunch that they will read a lot like this one, especially its first half.
Also: the software was able to predict voting preference with seventy-three-per-cent accuracy.
I'm pretty sure I could do that well with a gender/race/age/wealth breakdown. No points.
posted by Lemurrhea at 9:57 PM on January 18, 2015
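
Lemurrhea's hunch is checkable: fit a demographics-only baseline and see how close it gets to the quoted seventy-three per cent before crediting the facial signal with anything. A sketch assuming pre-encoded demographic features (the data itself is hypothetical):

# Cross-validated accuracy of the boring baseline: predict voting
# preference from gender/race/age/wealth features alone.
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def demographic_baseline(X_demographics, y_vote):
    model = LogisticRegression(max_iter=1000)
    return cross_val_score(model, X_demographics, y_vote, cv=5).mean()

# If this lands near 0.73 on its own, the face-reading adds little.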


No masks. National security.

Botox for all.

Cheer up everyone, it might never happen!

Also: Cheer up everyone, because it might happen!
posted by biffa at 1:02 AM on January 19, 2015


Oyéah: "I think I'll get that burka now."

What an interesting irony, I suppose, that the burqa would eventually be a sort of liberatory space of privacy and freedom. The ultimate antithesis to the Panopticon.
posted by symbioid at 8:27 AM on January 19, 2015


-Brain Scans May Help Predict Future Problems, And Solutions
-How stories change hearts and brains: "Across time and culture, stories have been agents of personal transformation – in part because they change our brains."
posted by kliuless at 8:53 PM on January 19, 2015 [1 favorite]



