the application seems to have been an afterthought
February 7, 2019 9:16 AM   Subscribe

Along with jet packs and hover boards, a machine to translate from any language to any other is so appealing as a fantasy that people are willing to overlook clunky prototypes as long as they can retain the belief that the future promised by science fiction has, at last, arrived. One particularly clunky subspecies of the universal language translator has a rather dismal history: the sign-language glove, which purports to translate sign language in real time to text or speech as the wearer gestures. For people in the Deaf community, and for linguists, the sign-language glove is rooted in the preoccupations of the hearing world, not the needs of Deaf signers.
posted by sciatrix (23 comments total) 15 users marked this as a favorite
 
I love the picture of the guy demonstrating his ASL glove while sitting in front of an open laptop. It's like, sure, I can communicate with you by gesturing with my hands in the air while software completely misinterprets what I'm saying, or... I could maybe... type? On the keyboard? Which is, like, right here, ten inches away?
posted by phooky at 9:24 AM on February 7, 2019 [4 favorites]


To be totally fair, there are Deaf people fluent in ASL who are not literate in English. ASL and British Sign Language (which is a distinct language in its own right) are not transliterated English--that's a major point in the article.
posted by sciatrix at 9:28 AM on February 7, 2019 [9 favorites]


Key parts of the grammar of ASL include “raised or lowered eyebrows, a shift in the orientation of the signer’s torso, or a movement of the mouth,” reads the letter. “Even perfectly functioning gloves would not have access to facial expressions.”

Clearly, the answer is to encase the signer in a complex body suit so we don’t have to look at them while the computer tells us what they are trying to say!

That’s not off-putting at all!
posted by GenjiandProust at 9:57 AM on February 7, 2019 [2 favorites]


The various devices described in the article sound like textbook examples of engineer's disease (a point the article basically makes). That said, I do wonder if you could build transcription devices for both Deaf and hearing users that would be useful for each community (although I think they'd have to be separate devices).

One person interviewed for the article says "a dominant fantasy among her friends is for glasses that would auto-caption everything that hearing people say." That seems well within the realm of practicality, either now or within the next few years: it's fundamentally a good speech-to-text algorithm, a good microphone array, and some kind of augmented-reality glasses. Too bad the Intel laser glasses haven't become a real thing yet; they seem like the platform you'd need to make this work.
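Just to make "a good speech-to-text algorithm" concrete: here's a rough desktop-toy sketch of the captioning loop in Python, using the off-the-shelf SpeechRecognition package (the microphone-array hardware, the display, and the noisy-room problem are the actual hard parts, and the specific parameters here are just plausible guesses):

import speech_recognition as sr  # pip install SpeechRecognition (plus PyAudio for mic access)

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)   # a crude stab at the noise problem
    while True:
        audio = recognizer.listen(source, phrase_time_limit=5)  # grab ~5 seconds of speech
        try:
            caption = recognizer.recognize_google(audio)  # cloud speech-to-text
            print(caption)  # in the fantasy version, this goes to the glasses' display
        except sr.UnknownValueError:
            pass  # couldn't make anything out; skip this chunk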

Building the reverse--AR glasses with an outward-facing camera and image-based gesture recognition to transcribe sign languages--seems similarly within the realm of possibility. People are already working on the core computer vision problem here (recognizing signs in video), at least.
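Here's the shape of the pipeline I'm imagining, using OpenCV for camera capture and MediaPipe for hand tracking (both real libraries); the classify_sign stub is the part that doesn't exist, and--as the article points out--even a perfect version of it would still be ignoring the facial and torso grammar:

import cv2              # pip install opencv-python
import mediapipe as mp  # pip install mediapipe

def classify_sign(landmark_sequence):
    # Hypothetical stand-in: turning a sequence of hand landmarks into a sign gloss
    # is the open research problem, not a library call you can import today.
    return "???"

hands = mp.solutions.hands.Hands(max_num_hands=2)
cap = cv2.VideoCapture(0)          # stand-in for the glasses' outward-facing camera
sequence = []

while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))  # MediaPipe wants RGB
    if results.multi_hand_landmarks:
        sequence.append(results.multi_hand_landmarks)
    if len(sequence) >= 30:        # roughly a second of video at 30 fps
        print(classify_sign(sequence))
        sequence = []

cap.release()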
posted by Making You Bored For Science at 10:13 AM on February 7, 2019 [5 favorites]


In fact, while Intel has given up on their smart glasses, they sold all the IP on them to North, who just released their first smart glasses, which seem a lot like what Intel was talking about last year. Maybe these sorts of devices are closer than I think...
posted by Making You Bored For Science at 10:19 AM on February 7, 2019 [1 favorite]


yeah the pieces are in place for real-time automatic audio transcription. not sure how well they would cope with noisy environments, though; that's still surprisingly difficult for everything but humans.

Apple has had hundreds of engineers toiling away in secret for years on augmented reality technology. Presumably they are trying to figure out how to make glasses that don't make you look like this. They also make their own low-power chips for neural networks. Pieces coming together.

the article points out the many issues with computerized translation of sign language though. as I understand it, there is no widely-accepted way to write ASL (or any other sign language?), which would definitely make it pretty hard to apply existing translation techniques. does anyone know why no proposal has caught on? at the very least being able to show people "here is what written ASL looks like" would make it very clear that it's not just a weird version of English.
posted by vogon_poet at 11:37 AM on February 7, 2019


“We can't get decent access to communication when we go to the doctor. Why bother with silly gloves when we still need to take care of the basic human-rights issues?”

Forgive me if I'm missing something, but isn't that the point of the gloves? It's an attempt at finding a way for Deaf people to communicate with someone that doesn't know ASL. It's an imperfect attempt with plenty of limitations, granted, but it doesn't seem that much worse than, say, the Google Translate app on my phone. It doesn't change my English into fluent, error-free Italian, but it gets the point across.

a dominant fantasy among her friends is for glasses that would auto-caption everything that hearing people say

Put this new Google app in Google Glass (is that still a thing?) and you are most of the way there.
posted by Rock Steady at 11:41 AM on February 7, 2019


Sign languages are useful, and yet they are also beautiful. As someone who was born with life-impacting hearing loss, I wish I had been taught how to sign as a child, and I wish more people learned, because really really really there are people out there using sign languages to create stunningly beautiful works of performance art.
posted by nikaspark at 11:55 AM on February 7, 2019 [7 favorites]


Forgive me if I'm missing something, but isn't that the point of the gloves? It's an attempt at finding a way for Deaf people to communicate with someone that doesn't know ASL.

I think the point being made is that deaf people have plenty of ways to communicate with people who don't understand sign language (including the keyboard on the laptop these gloves are attached to), but that their doctors don't bother to support them with even pen and paper. Then, even once basic communication is established, doctors don't pay attention to the communication they do get from deaf people (much the same complaint you hear from women and from people with other disabilities, such as chronic pain).

If the people who are supposed to be helping someone won't bother with basic human rights issues like listening to and believing their truth, a technological solution is not what's needed. These gloves aren't for deaf people. They're for non-deaf people to feel better about their hearing privilege. Any hearing person thinking this technology would be helpful to them would benefit more, and help more people, by just learning ASL.
posted by Revvy at 11:59 AM on February 7, 2019 [8 favorites]


It's an attempt at finding a way for Deaf people to communicate with someone that doesn't know ASL.

That's kind of a partial truth, in my view, because it misses vital context. I'd say it's an attempt at finding a way for Deaf people to communicate with someone who doesn't know ASL, in a manner that places the minimum burden on that person, and on society, to adapt to and accept the real needs of Deaf people. Finger spelling, in a medical consultation, is not a substitute for a qualified medical interpreter. We would not expect a person who didn't speak English to have a medical consultation through Google Translate, and we shouldn't expect Deaf people to communicate with a similarly limited technology.

Basically, a lot of technology "designed for" disabled people is actually designed at disabled people: yes, the intention is to make our lives less inconvenient, but that still leaves the question "less inconvenient for whom?".
posted by howfar at 12:01 PM on February 7, 2019 [14 favorites]


Final point FTA:
And, Kolb added, technology could create ways to encourage hearing people to use ASL and become multimodal as well as multilingual.

“That would open up the possibilities of communication for all of us,” she said.


one million times this.
posted by nikaspark at 12:09 PM on February 7, 2019 [2 favorites]


I mean "a hearing person who didn't speak English" and I reveal my own biases even as I criticise those of others
posted by howfar at 12:10 PM on February 7, 2019


If the people who are supposed to be helping someone won't bother with basic human rights issues like listening to and believing their truth, a technological solution is not what's needed.

I don't disagree with this at all, and yet I'm not sure we should reject options to ease the technical difficulties, unless they don't work (this particular iteration of the technology doesn't sound like it does, but what about a camera system that actually captured all the relevant data?) or otherwise do more harm than good (which is always a possibility). Excepting practices/businesses that have a significant number of Deaf patients, the recommendation to "just learn ASL," as if it was something you could pick up in a month, is ridiculous. You can learn to finger-spell in an afternoon, but ASL is a whole other fully-fledged language! It must take just as long to become fluent in as any of the many, many other languages that patients/clients/etc. may speak. (And then there are the regional and racial dialects within the U.S. and the other sign languages of the world.) Everyone should have a qualified medical interpreter. Not everyone does, and I am talking about all groups that don't speak or understand spoken English readily--if you think hearing non-English speakers don't have this problem, you are, unfortunately, wrong.

Is research prioritizing the right technology to get the maximum benefit from the limited resources available? Are they even consulting the right people to make informed decisions on that point? Totally fair questions, and ones that the article addresses thoughtfully. But technology's imperfect solutions can still sometimes constitute advances.
posted by praemunire at 12:21 PM on February 7, 2019 [3 favorites]


if you think hearing non-English speakers don't have this problem, you are, unfortunately, wrong.

Yeah, that's entirely true. I phrased it badly. The point I was making was normative, about what "we" (broadly speaking) believe as a group, rather than how society actually treats non-English speakers. Really, the difficulty is not that disabled people are more subjected to the negative effects of marginalisation than other marginalised groups (and this isn't a competition), but that the discourse around disability in abled communities is still exceptionally effective at concealing that marginalisation and its methods, at a time when we are starting to have more open conversations, generally, about marginalisation and suppression of agency.
posted by howfar at 12:45 PM on February 7, 2019 [2 favorites]


I'm kind of interested in looking at the basic haptic and UI design of computing from a less ableist point of view entirely and seeing what we can discover.
posted by nikaspark at 12:47 PM on February 7, 2019 [2 favorites]


Excepting practices/businesses that have a significant number of Deaf patients, the recommendation to "just learn ASL," as if it was something you could pick up in a month, is ridiculous.

The actual expectation for practices is that they hire a medical interpreter for ASL, usually someone who is already fluent in ASL and at least proficient in medicese, to translate as necessary for D/deaf people who need care. That someone can be paid on an hourly, as-needed basis, but the hospital ought to have a contract with such a specialist and know where to find one when needed. That's it.

No one is expecting all doctors to learn ASL. In fact, in some quarters there's a little bit of defensiveness around doctors learning a little bit of ASL to get by, because medical interpreters are hard enough to push into use even though they are a major accessibility issue. People worry that a smattering of ASL training picked up once by indifferent doctors will be used as an excuse not to retain interpreters who can explain what is happening to a Deaf person who might be under extreme medical stress or being asked to provide informed consent through a language barrier.
posted by sciatrix at 1:44 PM on February 7, 2019 [8 favorites]


And if I sound like I'm jumping down your throat or putting arguments in your mouth, I'm probably responding to the echo of arguments I have heard about whether it's necessary to hire these translators.

The thing is, this isn't only a language issue. It's a disability issue. There is no nation that uses ASL as a national language, and there are educational barriers that even prevent people with hearing impairment from going into healthcare as a profession. That's not the case for other languages. This--access to communication without having to bear all the cost of that communication on the back of a Deaf person who is dealing with a medical issue--is an access issue. And that means it plays by different rules, in a different context, with very differently sized markets for both hardware and software truly aimed at increasing ease of communication.
posted by sciatrix at 1:55 PM on February 7, 2019 [6 favorites]


That someone can be paid on an hourly, as-needed basis, but the hospital ought to have a contract with such a specialist and know where to find one when needed. That's it.

And thanks to the internet it can be done remotely via video. Practices in small and rural areas can offer it, as can emergency rooms where it's not practical to retain local 24/7 coverage.

For fixed locations such as clinics and hospitals, this seems like something that could be largely solved through a 24/7 remote interpretation service. The interpreters wouldn't even necessarily need to be in the same country.
posted by jedicus at 2:23 PM on February 7, 2019 [3 favorites]


The interpreters wouldn't even necessarily need to be in the same country.

The only good thing about having to rely on telephone interpreters as much as I do (and we're losing the only Polish speaker in the office tomorrow oh god no!) is wondering what country I'm speaking to; there's still something romantic about it.

I say the only good thing because telephone interpretation seems to me to be a good example of how there's no necessary service (or at least no necessary service for marginalised groups) that capitalism can't manage to do in the most barely functional way. The service we use (which, by all accounts, is no better or worse than any other) is juuuust good enough to be better than nothing, but only just. It's deeply frustrating to have no idea of the quality of interpreter you will get from call to call. You get occasional diamonds, who are clearly professional career interpreters making a bit of extra cash between jobs, but you get a lot more people who have clearly taken whatever is the easiest accreditation to pass and are struggling badly. I don't blame them, they're just trying to do a job they've been told they are qualified to do, but I do hate how the profit motive always reduces the things that marginalised people actually need to the most meagre of standards. And yes, regulation can and does help, but there are perverse incentives at work in most markets for necessities.

All of which is to say that I feel like challenging market principles where they are harmful is an important part of delivering good services to everyone. Which might be an obvious point depending on who you are, and is also something of a tangent, but I've written it now.
posted by howfar at 2:48 PM on February 7, 2019 [3 favorites]


There is no nation that uses ASL as a language, and there are educational barriers that even prevent people with hearing impairment from going into healthcare as a profession.

NZSL is actually one of the official languages of New Zealand (the other two are English and Māori). Sadly the educational barriers still exist, both for people with hearing impairment entering the healthcare professions and for hearing people to learn NZSL.
posted by tumbling at 3:19 PM on February 7, 2019 [3 favorites]


Clearly, the answer is to encase the signer in a complex body suit so we don’t have to look at them while the computer tells us what they are trying to say!

ahem
posted by axiom at 9:47 PM on February 7, 2019


we are all just temporarily embarrassed George Jetsons...
posted by quonsar II: smock fishpants and the temple of foon at 10:02 AM on February 9, 2019


As someone who teaches Deaf students at times, I can tell you that computer transcription is still really, really far from being reliable, and that we will still need ASL interpreters and CART providers for the foreseeable future, especially in academia, where human providers can prepare for the specialized vocabulary and computers as yet cannot.
posted by hydropsyche at 4:54 AM on February 10, 2019 [2 favorites]




This thread has been archived and is closed to new comments