“We built voice modulation to mask gender in technical interviews.”
July 11, 2016 1:05 AM   Subscribe

interviewing.io is a platform where people can practice technical interviewing anonymously and find jobs based on their interview performance. Women historically haven’t performed as well as men—specifically, men were getting advanced to the next round 1.4 times more often than women, and had a 20% higher average technical score from interviewers. In an attempt to erase this difference, interviewing.io added voice modulation to their online interviews.
posted by Rangi (15 comments total) 9 users marked this as a favorite
 
So, I'm confused about a couple things (it is pretty late for me though).

First, they told the interviewers and interviewees about the voice masking feature and allowed an opt-in. Does that not affect the results somehow? I thought a more scientific approach would be to do a blind study.

Next, the voice modulation affects the pitch of the voice, but not necessarily things like inflection or word choice. So a lot of women may not sound like typical men; they might sound like effeminate men. Maybe it doesn't really affect the interviewers' opinions after all, but it's something to consider.
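To make that concrete, here's a toy sketch in Python (invented numbers, and not interviewing.io's actual signal processing): a uniform pitch shift multiplies every fundamental-frequency (F0) value by a single constant, so the *shape* of the intonation contour, the rises and falls that carry gendered prosody, comes through unchanged.

```python
# Toy illustration (invented numbers, NOT interviewing.io's actual DSP):
# a uniform pitch shift scales every F0 value by one constant, so the
# contour's shape -- i.e. the intonation pattern -- is preserved.

def shift_pitch(f0_contour, semitones):
    """Scale a fundamental-frequency contour (Hz) by a number of semitones."""
    factor = 2 ** (semitones / 12)
    return [f0 * factor for f0 in f0_contour]

def relative_contour(contour):
    """Each value as a ratio of the first, i.e. the contour's shape."""
    return [round(f0 / contour[0], 3) for f0 in contour]

# Hypothetical female-range contour with a question-style final rise.
female_f0 = [210.0, 200.0, 195.0, 205.0, 240.0]
masked_f0 = shift_pitch(female_f0, -8)  # drop ~8 semitones toward a male range

# Absolute pitch lands in a male-typical range, but the relative contour --
# and with it the intonation pattern -- is identical.
assert relative_contour(masked_f0) == relative_contour(female_f0)
```

So even if the masking perfectly hides absolute pitch, anything carried by the contour shape (or by word choice, which isn't touched at all) survives.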

Finally:
...men treat rejection in dating very differently than women, even going so far as to say that men “reported they would experience a more positive than negative affective response after… being sexually rejected.”
I find that... surprising? But then I never really thought about it. Maybe the only men-being-rejected memories I have are the ones who take rejection badly, because they stand out more. Though it doesn't surprise me at all that women in tech have lower confidence in their abilities.
posted by picklenickle at 2:42 AM on July 11, 2016 [2 favorites]


The point you only get by reading the article, and the way it's been framed in the tech-bro blogs, is that women actually did worse in this environment. As picklenickle points out, the opt-in really breaks the whole experiment, and I could see it breaking in exactly that way (I know that, in the place I work, masking your voice would be seen as "this person is dishonest" - an oldschool bank, to be fair, but still... perception remains a lot of the problem).

That having been said, the thesis of the piece seems to be, essentially, "the best women in tech are more easily discouraged than the men, and end up giving up voluntarily" - something that I wish I could say were not true, but it matches my experience. But I also think that is systemic - women are more likely to stay knocked down in tech because it's happened over and over, and what's the point of fighting an unwinnable game?

I know quite a few junior level women in IT, and I know quite a few women who used to be senior level women in IT, before saying fuck this entire industry and going and doing something else. And it rigs the stats - some friends of mine really do try and frame this as "women can only be junior because they're not good", rather than "this industry is toxic as shit, and is pushing top talent away, and we really should fix that".
posted by jaymzjulian at 3:04 AM on July 11, 2016 [10 favorites]


Wow, I was not expecting those results. I'm not sure how much the advance notification actually affected the outcome (I agree it sounds as if it would have), but if this can be replicated, attrition is a much easier problem to fix than bias. Assuming you can get the interview in the first place.
posted by Mchelly at 3:27 AM on July 11, 2016


men “reported they would experience a more positive than negative affective response after… being sexually rejected.”

The difference in dating, AFAICT, is that the men I've seen have been socialized to blame women (or society) for rejection, while women are socialized to blame themselves.

Also, this leads to extinction bursts, which is an issue.
posted by steady-state strawberry at 3:30 AM on July 11, 2016 [7 favorites]


I don't see how the opt-in breaks the experiment. If there were an effect along the lines of thinking anyone masking their voice must be less trustworthy, then their controls (males modulated to sound female, males and females modulated without gender switching, all compared with opted-in people unmodulated) should have picked that up.

It definitely seems plausible that word choice/speech patterns could still be a factor, of course, as they made no attempt to measure that. But this at least seems to falsify the idea that being visibly (...well, audibly) feminine is deliberately penalised by interviewers in their environment, as opposed to more subtle effects of language choice.
posted by metaBugs at 4:53 AM on July 11, 2016


TBD ... not enough data to comment on the role of self-perception. Footnote 3.
posted by oheso at 4:53 AM on July 11, 2016


The last time we sent someone in wearing a scramble suit, it turned out his best lead to the top of the cartel was just another undercover narc. They spent so much effort trying to root out the corruption and the tendency for narcotics agents to abuse their positions of privilege that they lost the ability to actually get anything done, and poor Bob Arctor was just chasing his own shadow.

*turns slowly to stare at the camera goggle-eyed*
posted by Mayor West at 6:03 AM on July 11, 2016 [6 favorites]


"the best women in tech are more easily discouraged than the men, and end up giving up voluntarily" - something that I wish I could say were not true, but it matches my experience

I don't think that is as true as one might suspect. The women I know in tech take so much shit and get so little support compared to the men. If they are eventually discouraged, that does not mean they are more easily discouraged; the comparison makes no sense.
posted by dame at 9:49 AM on July 11, 2016 [5 favorites]


I don't know if there's a way to compensate for the fact that vocal pitch is not the only heavily-gendered speech trait.
posted by rmd1023 at 10:46 AM on July 11, 2016 [1 favorite]


For anyone who didn't RTFA: they did an opt-in experiment where they modulated voice pitch to mask gender during mock interviews. All the experiments were mock interviews, but the interviewing.io platform is used for real hiring as well as practice. About 15% of site users are women.

The opt-in experiment had three groups: unmodulated voices, modulated voices with pitch change, modulated voices without pitch change. There were 234 total interviews with an unspecified number of participants, roughly 2/3 male interviews and 1/3 female interviews. The number of participants by gender is also not stated.

They provide us with two sound clips: one of an unmodulated female voice, and one of a modulated female voice with pitch change. There is no sound clip of a modulated female voice without pitch change, and no sound clips of any male voices with or without modulation. They link to more sound clips on NPR and FastCompany, but all of those are also female-to-male. So we don't know what a male-to-female modulation actually sounds like from the information they've supplied.

They compared this experimental data with existing data of "over 1000" real interviews.

What they found:
- Women performed worse than men in the experiment;
- Women performed worse in the experiment than women in the real-world interviews;
- Women in real-world interviews performed worse than men in real-world interviews

where the performance measure is that the interviewer rated them as 3 or more stars out of 4 and/or would put them through to the next round of interviews.

========

So what the experiment appears to show is that once you remove gender bias in STEM interviewing (assuming, arguendo, that changing voice pitch = removing gender bias), women perform worse, and are therefore worse candidates for STEM jobs than men.

The rest of the article appears to be a lame attempt to justify why women aren't worse candidates for STEM jobs than men.

In the middle of the article, the author goes, "but wait!!!" They went back to the data from the >1000 real interviews, and noticed that dramatically more women leave the platform after 1 or 2 failed real interviews than men. If you control for participants who dropped out after 1 or 2 failed real interviews, men and women have the same success/failure rate. In other words, if you count only those interviewees who did 3 or more real interviews, the gender disparity disappears completely and women have an equal success rate to men in the real, non-experimental, non-voice-modulated interviews.
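To spell out that "control for early dropouts" step, here's a toy sketch in Python with made-up numbers (nothing from the article's actual dataset): a raw pass-rate gap driven by disproportionate early attrition in one group can vanish entirely once you condition on candidates who stuck around for 3+ interviews.

```python
# Toy numbers, invented for illustration -- NOT the article's data.
# Shows how disproportionate early attrition by one group can create a
# raw pass-rate gap that vanishes among candidates with 3+ interviews.

from collections import namedtuple

Candidate = namedtuple("Candidate", ["gender", "interviews", "passes"])

# Hypothetical cohort: women disproportionately quit after one failed interview.
cohort = [
    Candidate("M", 5, 3), Candidate("M", 5, 2), Candidate("M", 1, 0),
    Candidate("F", 4, 2), Candidate("F", 1, 0), Candidate("F", 1, 0),
]

def pass_rate(candidates):
    total = sum(c.interviews for c in candidates)
    return sum(c.passes for c in candidates) / total

def of_gender(candidates, gender):
    return [c for c in candidates if c.gender == gender]

# Raw rates show men ahead (5/11 vs 2/6)...
raw_gap = pass_rate(of_gender(cohort, "M")) - pass_rate(of_gender(cohort, "F"))
assert raw_gap > 0

# ...but among persisters (3+ interviews), the rates are identical (both 1/2).
persisters = [c for c in cohort if c.interviews >= 3]
assert pass_rate(of_gender(persisters, "M")) == pass_rate(of_gender(persisters, "F"))
```

Which is exactly why the attrition question matters so much: in a cohort like this, the entire measured gap is an artifact of who quits early, not of how anyone interviews.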

They also say of the experimental data "for technical ability, it appeared that men who were modulated to sound like women did a bit better than unmodulated men and that women who were modulated to sound like men did a bit worse than unmodulated women" but also "these trends weren’t statistically significant".

Needless to say, the majority of commenters take this to mean that men are better and more deserving of being hired into STEM jobs than women. This is what they want to hear, but also, the article makes it sound like that's what they're saying. I'll go into that later.

=====

Meanwhile, there are some questions I think the article leaves unanswered.

1. Of the 234 experimental interviews, what was the total number of participants? Total number of males? Total number of females?
1a. Same for the >1000 real interviews.
2. What was the distribution of number of experimental interviews per participant? per male participant? per female participant?
2a. Same for the >1000 real interviews.
3. What does a modulated female voice without pitch change sound like? Do they all still sound like women or do any of them end up sounding like men?
4. What does a modulated male voice with pitch change sound like?
5. What does a modulated male voice without pitch change sound like? Do they all still sound like men or do any of them end up sounding like women?
6. How many interviews were in each subgroup (unmodulated, modulated/pitch, modulated/no pitch) of the 234 total interviews?
7. Did the same participant have the same voice modulation in every interview or did that change from interview to interview?
8. How many men were modulated to sound like women?
9. How many women were modulated to sound like men?

Above all, the fact that they didn't provide any sound clips of modulated voices without pitch change, or of modulated male voices, seems like a significant omission.

They say "as much as possible, I corrected for people leaving the platform because they found a job [...], were just trying out the platform out of curiosity, or they didn’t like something else about their interviewing.io experience." It would have helped to have some numbers for this and some details about how they got that feedback (since people get hired from real-world interviews on their platform, I assume it comes from hiring stats, but I'd like to know if any of them left because they got a job outside of the platform, or if any of the feedback came from post-questionnaires, or what).

The article includes a graph purporting to show a gender gap in "attrition events" throughout the STEM career pipeline, which they say is highly speculative and based on "not that much data". This is an understatement, since it's not clear what those attrition events are, whether the definition of an attrition event is based on any data, and whether their rate of occurrence is based on any data; or whether the entire graph is basically a thought-experiment that the author pulled out of their ass. To me, it looks like calling that graph "speculative" is overgenerous and what it actually is, is an X-axis and a Y-axis and a red line and a blue line neither of which mean anything at all on their own. There's an "interactive graph" you can click, which turns out to be equally meaningless and unexplained.

As for those men/women who leave the platform after 1 or 2 bad interviews, I would like to know not only the distribution of those who left because they got a job/were just doing it for fun/didn't like the platform, but also what real-world interviewing experiences they'd had that led them to the platform. Did they join the platform for mock interviews after they'd failed 20 real-world interviews in a row? Were they currently employed and just testing the waters? If the experimenters don't have that data, I'd like to know that too.

=====

tl;dr I suspect this entire article is disingenuous. We only hear samples of unmodulated female voices and voices modulated from female to male. They say they included pitch changes from male to female, but we never get to hear those.

It's highly relevant that gender disparity in real-world interviews disappears for candidates who persist for 3 or more interviews. But the way the article states that, it seems to go out of its way to make it sound like a feeble apologetic "even though our experiment shows that women are bad candidates for STEM jobs, here's my weak excuse for why they aren't really!" Except that part isn't a weak excuse, but they then go on for half a page to add a bunch of weak excuses plus a graph unsupported by any apparent data.

They're selling a service, they are not an independent research body. So maybe the service they're selling is "hey brogrammers! use our service to get out of hiring women because we all know that female candidates are shit and now we have proof!" Because that's certainly how the commenters seem to have taken it.
posted by tel3path at 12:53 PM on July 11, 2016 [5 favorites]


I can't emphasize enough, these are not academics doing a study with the lofty goal of advancing human knowledge, they're a software site promoting their own services.

For all I know, they do have data to support the conclusions they make such a charming show of shying away from. But they haven't shown us that data.
posted by tel3path at 1:03 PM on July 11, 2016 [1 favorite]


Thanks for that summary, tel3path. I didn't read the comments (took an automatic nope on that), but my reading was that they were saying there was no difference between male and female interviewees once they passed the 3+ interview threshold, so the problem to solve for is one of confidence or persistence -- not that women are lesser candidates, but that something about having a bad interview is being internalized or self-discouraging in a way that doesn't happen with men. And that's a social or possibly cultural issue, not a scientific-literacy issue, and has nothing to do with fitness for the job. Or am I being too generous?
posted by Mchelly at 1:04 PM on July 11, 2016


A bit too generous, yeah.

Disproportionately many women leave THEIR PLATFORM after 1 or 2 failed interviews. That is just as likely to say something about their platform as it is about the women who leave it.

Now, maybe those findings are generalizable to the STEM job market as a whole, and I'm not criticizing them for only having data about their own company. That's fair enough.

What's not fair enough is the inference that if women are ditching their site in disproportionate numbers, that means something is wrong with the women and not with the site.
posted by tel3path at 1:14 PM on July 11, 2016 [4 favorites]


Please note that I'm NOT saying that anything IS wrong with their site/interviewing platform.

What I AM saying is that it's a non sequitur to go from "gee, after 1 or 2 bad interview experiences, women are disproportionately likely to quit the site altogether compared to men" to "why do women give up so easily?!?" instead of "are we doing something to turn women off our site [1]?!?"

In fairness, they did say that they controlled as much as possible for those who left because they "didn't like something else about their interview experience". But it's also clear that the author doesn't know why so many women leave at such an early stage.

Also, I get why they're not asking "are our interviewers driving women candidates away?" because the interviewing companies are their customers. But it is a fair question, even if it's obviously not in their best interests to ask it.

It's also not a question which is answered by a non-blind study where the interviewers (who as companies don't want to expose themselves to liability or bad publicity) know what kind of bias is being studied and how.



[1] something like oh say publishing blog articles that slyly imply that our customers have no gender bias it's just that female STEM candidates suck, and gee gosh we're as surprised as you are?
posted by tel3path at 4:05 PM on July 11, 2016


"the best women in tech are more easily discouraged than the men, and end up giving up voluntarily" - something that I wish I could say were not true, but it matches my experience

I don't think that is as true as one might suspect. The women I know in tech take so much shit and get so little support compared to the men. If they are eventually discouraged, that does not mean they are more easily discouraged; the comparison makes no sense.
posted by dame at 9:49 AM on July 11 [5 favorites]


I think you are right - my phrasing on this was poor, since what I meant to communicate is that I believe that women leave because the industry drives them out, something that does not happen to men in the same way.
posted by jaymzjulian at 9:58 PM on July 11, 2016 [1 favorite]




This thread has been archived and is closed to new comments