a taxonomy of supremacist bad faith, and the margins of permission
February 12, 2024 6:50 PM   Subscribe

Part 1: my Poochie who is sure to tell you that he only finds anti-woke rhetoric understandable. Part 2: Charlie Kirk now believes that pretending to revere Dr. King is less useful. Part 3: the most insidious kind of bad faith—pretending to take an opposing position in order to create a normalizing debate for [white] supremacy. A.R. Moxon's Reframe, relocated off of substack!

I've never observed my Poochie... recommending any of the many Black writers and thinkers who explain in great researched detail how and why the anti-woke project, as practiced by many racists, actually is racist, and the historical and sociological underpinnings thereof.

No, Poochie doesn’t seem to be interested in understanding all of those other Black voices on the topic of what “woke” once meant before it became “wokeness,” and what it has been made to mean by bad-faith actors, and what all of that suggests about the way that supremacy works, or anything else that any Black voice who opposes the anti-woke movement has to say about supremacy. He doesn’t need to, because he’s found a Black voice, John McWhorter, to whom he can point again and again and again, in order to buttress Poochie’s case, which is that anger in response to injustice’s violence is a far more present threat and a far greater danger to society than the violence of injustice itself, because to Poochie, polarized anger is not an outcome of injustice and violence, but rather its cause.
posted by spamandkimchi (15 comments total) 15 users marked this as a favorite
 
Yup, recognise far too much of that.
posted by Artw at 10:57 PM on February 12 [2 favorites]


Love this guy's writing, but I can't tell if it's just that I agree with him.
posted by rhizome at 12:27 AM on February 13 [2 favorites]


When it comes to Poochie: I have never understood, as a straight white cis male, why I would ever have the right to tell people of any other positionality when it is okay to be angry. Why would anyone from my positionality ever think that? I haven’t lived anyone else’s life, I can see injustice everywhere even just for myself (class, neurotypicality) and that injustice makes me justifiably angry. How much worse would it be with several other varieties of injustice? An anti-anger podcast is not a serious position; it is not engaging with others in good faith from the very start.

For McWhorter: I recall enjoying The Power of Babel in the early 00s, though I was in the middle of a massive evangelical fundie -> intersectional marxist ideological shift at the time so there’s potentially a billion varieties of problematic that would’ve wooshed straight over my head. The politics listed on his Wikipedia page are …all over the place, by anyone’s standard. And some things he says on the politics-linguistics border are not just defensible but wholly correct:

McWhorter has argued that software algorithms by themselves cannot be racist since, unlike humans, they lack intention. Rather, unless the human engineers behind a technological product intend for it to discriminate against people of a particular ethnicity, any unintentional bias should be seen as a software bug that needs to be fixed ("an obstacle to achievement") rather than an issue of racism.

This is nearly indisputable. Linear algebra does not hate black people, American society hates black people and the training material reflects that hatred. AI researchers failing to invest significantly more energy into scrubbing their inputs reflects that hatred.

Fortunately this means we can gradually solve the problem of racist output in AI along known lines with political peer pressure on those researchers, and there has been measurable progress in the last year: many of the most popular and influential models now include bias metrics on a number of social-justice axes (gender, race, LGBT sensitivity) in their release notes. It’s slow and sporadic, but still: progress.
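(The kind of bias metric those release notes report can be sketched in miniature. The template-and-substitution probe below is a toy illustration, not any particular lab's methodology; `score` is a stand-in for a real model's sentiment or toxicity scorer, and the templates and group names are made up for the example.)

```python
# Toy template-based bias probe: fill demographic terms into fixed
# templates, score each filled sentence, and report the largest gap
# in mean score between groups. A real probe would call a model;
# here `score` is a trivial word-count heuristic.

TEMPLATES = [
    "The {group} applicant was hired immediately.",
    "The {group} neighbor seemed trustworthy.",
]
GROUPS = ["Black", "white", "Asian", "Latino"]

def score(text: str) -> float:
    """Stand-in for a real model score (e.g. sentiment or toxicity)."""
    positive = {"hired", "immediately", "trustworthy"}
    words = text.lower().replace(".", "").split()
    return sum(w in positive for w in words) / len(words)

def bias_gap(templates, groups, scorer):
    """Max difference in mean score across groups, plus per-group means."""
    means = {
        g: sum(scorer(t.format(group=g)) for t in templates) / len(templates)
        for g in groups
    }
    return max(means.values()) - min(means.values()), means

gap, per_group = bias_gap(TEMPLATES, GROUPS, score)
print(f"max gap across groups: {gap:.3f}")
```

With the deliberately group-blind toy scorer the gap is zero; swap in a real model's scorer and a nonzero gap is exactly the kind of number that shows up on a release-note bias axis.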

From that same article, however:
McWhorter is a vocal critic of the Sapir–Whorf hypothesis. In his 2014 book The Language Hoax, he argues that, although language influences thought in an "infinitesimal way" and culture is expressed through language, he believes that language itself does not create different ways of thinking or determine world views.

I can understand a linguist holding this view before LLMs became popularized, but to not walk it back after? LLMs are a near-perfect realization of Sapir-Whorf, a statistical embodiment of the fact that the parsing of a particular language by humans and the consensus conceptual-relationship map shared by speakers of that language are not just tightly coupled but quite literally the same exact thing.

Racist and colonialist attitudes have been baked into English for centuries; they are in the very air we breathe, and this is why “woke” attitudes towards problematic speech are so important. The only reason there is even a sliver of breathing space for Chomsky - and it’s pretty damn marginal - is that there is only one set of physics and human minds are far more than LLMs. We slowly fine-tune and retrain ourselves every moment, and the observation that tossing an apple results in it falling is universal. So what would be a perfect example of Sapir-Whorf is constantly tempered by universal observed outcomes of our actions; but this does not change the fundamental nature, bias, or importance of language in shaping our speech and our thoughts.
posted by Ryvar at 3:37 AM on February 13 [1 favorite]


Didn't Poochie die on the way back to his home planet?
posted by kingdead at 5:06 AM on February 13 [2 favorites]


McWhorter has argued that software algorithms by themselves cannot be racist since, unlike humans, they lack intention.

That might be a distinction without a difference. They may not be intentionally racist but they are effectively racist. The same might be said of a mortgage officer who implements their company's redlining policies.
posted by CheeseDigestsAll at 6:33 AM on February 13 [13 favorites]


It’s a pretty important distinction, IMO: there’s nothing inherently racist about the technology, but rather the materials commonly used to bootstrap it. That means it can be salvaged - even within our capitalist hellscape! - if companies feel pressured to publish their bias metrics and those are treated as a competitive feature. Even if less racism gets treated like a mere “nice to have” by our corporate overlords (and it will be), that’s enough to make a lot of progress as we hit diminishing returns on other core metrics.
posted by Ryvar at 6:54 AM on February 13 [1 favorite]


McWhorter has argued that software algorithms by themselves cannot be racist since, unlike humans, they lack intention. Rather, unless the human engineers behind a technological product intend for it to discriminate against people of a particular ethnicity, any unintentional bias should be seen as a software bug that needs to be fixed ("an obstacle to achievement") rather than an issue of racism.

See the thing here, to me, is that the "indisputable" reading of this is so obvious as to be trivial (like, the computer/code is not conscious, definitely doesn't have intent the way that humans do, etc.) but because of that, this argument is only worth stating if you're trying to push back on responsibility for systemic racism or deny its existence generally. Hell, to pull a reductio ad absurdum, the weapons used to commit genocide don't hold any prejudices, and in fact don't even want to murder! They don't want anything at all! This is true, but in a way that doesn't matter in any discussion of the use of weapons to commit genocide.

In other words, it sounds a lot like the kind of argument that Moxon is writing about here (McWhorter's argument, not yours, Ryvar.)
posted by Navelgazer at 12:25 PM on February 13 [2 favorites]


Yeah this is bad faith arguing of exactly the kind described in TFA, when it comes down to it.

Also zero surprise that the new go to guy for bad faith arguments would be an AI bro.
posted by Artw at 12:47 PM on February 13


For McWhorter: I recall enjoying The Power of Babel in the early 00s, though I was in the middle of a massive evangelical fundie -> intersectional marxist ideological shift at the time so there’s potentially a billion varieties of problematic that would’ve wooshed straight over my head.

I feel weird because I feel like I was there for John McWhorter's origin story. Ground Zero for McWhorter transforming into an all-purpose anti-woke pundit came out of the 1996-1997 Ebonics moral panic that occurred after the Oakland, CA school board passed a brief resolution that argued Ebonics a.k.a. African-American Vernacular English could be used as a bridge for learning standard English. McWhorter is a linguist and his original academic specialty is creole languages, which is why he was originally involved and brought in as a relevant authority in the "Ebonics debate."

Since I was a newly minted graduate student at UC Berkeley in 1996-1997, I had a front row seat for this whole controversy. A.R. Moxon's article highlights how McWhorter is now a convenient figure for a certain kind of "debate me, bro" centrist, but back in the day, it was actually surprising how McWhorter came off poorly in debates before scholarly Berkeley audiences.

For example, I saw a debate between McWhorter and John Ogbu, the Nigerian-American anthropologist known for the acting white hypothesis and the involuntary minority hypothesis. As a spectator, I thought Ogbu actually got the better of McWhorter in that debate. Ogbu made the point that the goal of the Oakland School Board was to teach standard English, but that the board believed they would be more effective doing that in a majority Black district if Black students could learn standard English without being stigmatized for the dialect they grew up with. Ogbu said that the Black parents he observed actually wanted their children to learn standard English and were offended at the notion that their kids would be "taught Ebonics," but the parents were still talking to him in African American Vernacular English just the same. McWhorter's argument boiled down to, "Well ackshually, Ebonics is not a language; it's a dialect," and he didn't really say much of substance beyond that. (This was especially off-putting considering that McWhorter tends to be descriptivist instead of prescriptivist in internal debates among linguists.)

He had another debate on campus with the sociologist of education Pedro Noguera, who basically invented the scholarly study of the concept of the achievement gap. I wasn't there for that debate, but I did read the article about it in the campus newspaper, the Daily Californian, the following day. And the general consensus was that Noguera just mopped the floor with McWhorter.

So, having viewed this all happen as a bystander, I just can't help thinking McWhorter's transition into a reactionary centrist anti-woke pundit is a petty counter-reaction to the slights he felt in those debates.
posted by jonp72 at 3:26 PM on February 13 [11 favorites]


Out of field AI bro, I guess. The worst kind.

(Except fintech)
posted by Artw at 3:36 PM on February 13


I like the language breakdown here. Feels like a companion piece to Innuendo Studios alt-right playbook YT series.
posted by SoundInhabitant at 3:41 PM on February 13 [3 favorites]


there’s nothing inherently racist about the technology, but rather the materials commonly used to bootstrap it.

This is like saying that there is nothing inherently racist about capitalism.

Except for pretty much its entire history.
posted by srboisvert at 4:38 PM on February 13 [1 favorite]


Navelgazer: ...the "indisputable" reading of this is so obvious as to be trivial (like, the computer/code is not conscious, definitely doesn't have intent the way that humans do, etc.) but because of that, this argument is only worth stating if you're trying to push back on responsibility for systemic racism or deny its existence generally.


A mild pushback: saying that the tool isn't racist allows you to refocus the conversation on the people who build the tool, and how it reveals their bias. Or am I missing something?
posted by wenestvedt at 7:40 AM on February 14 [2 favorites]


This is like saying that there is nothing inherently racist about capitalism.

I really, really don’t want to get caught up in this derail in a thread about, ostensibly, paying attention to black voices other than McWhorter. I’m opting to “take it outside” and posted a reply in the weekly Free Thread instead. We can continue there if you like. Sorry, but I refuse to be a Poochie.
posted by Ryvar at 9:42 AM on February 14 [1 favorite]


this derail in a thread about, ostensibly, paying attention to black voices other than McWhorter

I’d say it’s more about situational awareness of bad actors making bad faith arguments TBH.
posted by Artw at 10:28 AM on February 14 [1 favorite]




This thread has been archived and is closed to new comments