The way to a nation's brain is through weaponized feel-good tweets
December 3, 2019 11:59 AM

On August 22, 2019, @IamTyraJackson received almost 290,000 likes on Twitter for a single tweet. Put in perspective, the typical tweet President Trump sends to his 67 million followers gets about 100,000 likes. That viral tweet by @IamTyraJackson was innocent: an uplifting pair of images of former pro football player Warrick Dunn (Wikipedia) and a description of his inspiring charity work building houses for single mothers. For an anonymous account that had existed for only a few months, “Tyra” knew her audience well. [...] For “Tyra,” however, inspiring messages like this were a tool for a very different purpose. That Uplifting Tweet You Just Shared? A Russian Troll Sent It (Darren Linvill & Patrick Warren for Rolling Stone)

Here’s what Russia’s 2020 disinformation operations look like, according to two experts on social media and propaganda.
We’ve spent the past two years studying online disinformation and building a deep understanding of Russia’s strategy, tactics, and impact. Working from data Twitter has publicly released, we’ve read Russian tweets until our eyes bled. Looking at a range of behavioral signals, we have begun to develop procedures to identify disinformation campaigns and have worked with Twitter to suspend accounts. In the process we’ve shared what we’ve learned with people making a difference, both in and out of government. We have experienced a range of emotions studying what the IRA has produced, from disgust at their overt racism to amusement at their sometimes self-reflective humor. Mostly, however, we’ve been impressed.

Professional trolls are good at their job. They have studied us. They understand how to harness our biases (and hashtags) for their own purposes. They know what pressure points to push and how best to drive us to distrust our neighbors. The professionals know you catch more flies with honey. They don’t go to social media looking for a fight; they go looking for new best friends. And they have found them.

Disinformation operations aren’t typically fake news or outright lies. Disinformation is most often simply spin. Spin is hard to spot and easy to believe, especially if you are already inclined to do so. While the rest of the world learned how to conduct a modern disinformation campaign from the Russians, it is from the world of public relations and advertising that the IRA learned their craft. To appreciate the influence and potential of Russian disinformation, we need to view them less as Boris and Natasha and more like Don Draper.
Darren Linvill is an associate professor of communication at Clemson. His work explores state-affiliated disinformation campaigns and the strategies and tactics employed on social media (Science Direct, highlights only).

Patrick Warren is an associate professor of economics at Clemson. Dr. Warren’s research focuses on the operation of organizations in the economy such as bureaucracies, political parties (Oxford Academic, abstract only), for-profit and non-profit firms, armies, and propaganda bureaus.
posted by filthy light thief (71 comments total) 42 users marked this as a favorite
 
Hey, nobody let the Russians find out about cats.
posted by Huffy Puffy at 12:20 PM on December 3, 2019 [2 favorites]


Too late, they're already decades ahead on cat memes.
posted by Reyturner at 12:23 PM on December 3, 2019 [13 favorites]


This is just one of many reasons why you can't trust anything that purports to be good news.
posted by Faint of Butt at 12:50 PM on December 3, 2019 [5 favorites]


Libby Watson's response on Twitter covers most of my own feelings about this. What is the thesis of this piece? What action are they calling for people to take? Should we stop talking about or caring about injustice? The act of talking about police misconduct, or racism, or sexism is not what's undermining American faith in their institutions. Misconduct and bigotry is what's undermining those institutions. People don't mistrust cops because of a Russian psy-op, they mistrust cops because cops lie constantly and murder people on a regular basis and have done so forever. If these are convenient fault lines for foreign actors to exploit to create division in American society, the responsibility for fixing that doesn't fall on people seeing the messaging from those foreign actors, it falls on our institutions to improve.
posted by protocoach at 12:52 PM on December 3, 2019 [55 favorites]


they are framed to serve Russia’s interests in undermining Americans’ trust in our institutions.

Wow, that's like shooting fish in a barrel that are already shooting each other indiscriminately anyway.

“Joe Biden doesn’t deserve our votes!”

The outrage treadmill is exactly why I gave up on HuffPost, then MSNBC and eventually Facebook. It's weird that it's being used both to sell ads and to undermine the US. And yet some of the actual content is not wrong...
posted by Foosnark at 1:00 PM on December 3, 2019 [2 favorites]


The obvious solution is to stop sharing any kind of tweets.
posted by signal at 1:19 PM on December 3, 2019 [11 favorites]


Libby Watson's response on Twitter

While she has some good points, she really loses me here: "how weak the case for worrying about russian "disinformation" online is. when they say "disinformation" what they mean is "information I do not like""

No. When they say "disinformation" they mean using (stolen) Facebook data to serve ads lying about Hillary Clinton to voters targeted as likely to believe them.
posted by dnash at 1:28 PM on December 3, 2019 [39 favorites]


What is the thesis of this piece?

It has no stated thesis as far as I can tell. What's the implied thesis? I don't really know but the general line of thought with these new-red-scare pieces is that by fixating on domestic issues, Americans spend less effort pressing the political class to do foreign policy stuff. The press isn't covering Syria, politicians aren't pushing the president to act on it, they're all focussed on domestic virtue signaling. I guess the implied thesis is that domestic social issues are "not real" and by extension foreign policy is "real." I guess. I have no clue what the author's actual position is. But it seems a lot like he's saying that "whataboutism" is real. Although there's not much of a call to action to that.

This is not unlike actual Chinese diplomats going after US policy on Twitter.

posted by GuyZero at 1:42 PM on December 3, 2019 [3 favorites]


If the world ends, it'll end not with a bang or a whimper, but a like and a share.

Who could have imagined that the state-of-the-art in asymmetric warfare now is to tie up a whole society by turning it into a large-scale bickering-laden episode of Seinfeld? Monsters are due on Maple Street, yadda yadda yadda... Was that wrong? Should they have not done that?

The feel-good tweets are to get you to trust and empathise... Kinda like meat tenderizer for your brain. They learned that from the white supremacists.

As usual, Captain Kirk has a solution.
posted by zaixfeep at 1:56 PM on December 3, 2019 [5 favorites]


It looks like the thesis is buried near the end of the article: "...we need to learn to question our own and others’ biases on social media. We need to teach — to individuals of all ages — that we shouldn’t simply believe or repost anonymous users because they used the same hashtag we did, and neither should we accuse them of being a Russian bot simply because we disagree with their perspective. We need to teach digital civility. It will not only weaken foreign efforts, but it will also help us better engage online with our neighbors, especially the ones we disagree with."

I think it's very ungenerous to frame this article as "ignore social injustices" when the author is very clearly saying that the professional trolls are very cleverly using real information - including social injustices - to promote other unrelated agendas.
posted by ElKevbo at 2:01 PM on December 3, 2019 [29 favorites]


That's the damning response to Libby Watson. This is not neutral propaganda - it's designed to make certain politicians less viable in the general by damaging them in the primary. It fractures support for the candidate they oppose and thereby props up the propagandist's candidate, who in all likelihood doesn't give two shits about the underlying "critique."

But if the critique is legitimate in its own right, how much do the propagandist's motives matter? And if it's not legitimate - well even there there's a sense in which it doesn't matter so much who it's coming from, because political lies are political lies regardless of whether they are coming from Russia or Fox News.

The point is that the implication of a lot of the "foreign trolls" discourse is that Americans are being divided by issues on which we ought to stand together. I think there's a strong case that we are well past that, for reasons that have nothing to do with Russia. And, on the other side, the actual lessons to learn here about trusting what you read online also apply just as much to domestic sources. There are issues to which the Russian source is obviously relevant - anything about Ukraine, say - but what's the actual message of vague warnings about "sowing division" except that dissent is dangerous?
posted by atoxyl at 2:57 PM on December 3, 2019 [2 favorites]


Do you want to win or do you want to be right and lose?
This seems to be going very vague to avoid getting tied up in specifics, but on something where specifics exist and matter. Hopefully I'm not making too much of a reach, but based on what's said above and the context of the thing: "winning" here means ignoring or silencing abuses of power so we can better unite behind that power, and "losing" (but being right) means calling attention to abuses of power, so that there's less unified support behind said power and it's less effective at countering other groups for bad reasons?

Alternately, why don't we just all unite around breaking apart abusive power? This is being framed as "don't listen to those people trying to divide us by saying true things", when it could just as easily be neutered by embracing the critique. If someone's pointing out we have a growing neo-Confederate armed rebellion brewing, saying the only answer is "stop saying that, we need their guns to fight the Russians" seems suspiciously self-serving.
posted by CrystalDave at 3:43 PM on December 3, 2019 [7 favorites]


But if the critique is legitimate in its own right, how much do the propagandist's motives matter?

It matters a lot. There are likely to be many positive and negative facts that you don't know about most things. Any person presenting those facts to you can choose either positive or negative ones to present, depending on what they want to convince you of.

If you think the process by which you get information is equally likely to generate a fact whether it's positive or negative, you can learn a lot about the overall valence of things by looking at the information you get. (e.g. This study was pre-registered and nobody knew how it would come out, so I believe it.)

If you think it's biased towards positive or negative facts, but you understand the nature of the bias, you can still learn a lot by adjusting how seriously you take the information you get. (e.g. The local news is bought and paid for by the factory, so if it reports positive things about the factory, they count for very little, but if it reports about the weather, I believe it.)

But if you think it's neutral, or biased in some way, and it's secretly biased in a different way, you will adjust incorrectly and draw wrong conclusions.
posted by value of information at 3:51 PM on December 3, 2019 [12 favorites]
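The adjustment value of information describes can be sketched numerically. This is a hypothetical illustration (all numbers invented): a topic has a true mix of positive and negative facts, a biased channel relays every negative fact but only a fraction of the positives, and a reader who knows that selection rule can reweight, while a reader who assumes neutrality concludes the opposite of the truth.

```python
import random

random.seed(0)

# True valence of the topic: 60 positive facts (+1), 40 negative (-1),
# so on balance mildly positive (+0.2). These numbers are invented.
facts = [1] * 60 + [-1] * 40
true_valence = sum(facts) / len(facts)

# A biased channel forwards all negative facts, but only ~10% of positives.
relayed = [f for f in facts if f < 0 or random.random() < 0.1]

# A naive reader treats the channel as neutral and just averages.
naive_estimate = sum(relayed) / len(relayed)

# A reader who knows positives were sampled at 10% can reweight:
# each observed positive stands in for roughly ten.
pos = sum(1 for f in relayed if f > 0) / 0.1
neg = sum(1 for f in relayed if f < 0)
corrected_estimate = (pos - neg) / (pos + neg)

print(true_valence, naive_estimate, corrected_estimate)
```

The naive reader comes away strongly negative about a mildly positive topic; the reader who models the bias lands much closer to the truth. The hidden-bias case in the comment is the one where no such correction is possible, because the reader does not know the sampling rule.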


People are being bizarrely naive about this. This is a technique that was used on Tumblr a lot; there's a good writeup somewhere about it, but the gist is that the cute animal pictures and the woke retweets are how they hook you in and get followers. They present themselves as True Progressives (or depending on the audience, True Conservatives) and build an audience of thousands who think they are listening to a fellow traveller. Then they start sprinkling in the lies and spin. The fact that they may tweet some true things does not make them any the less dangerous propagandists.

It's not the only political bot technique, some are just used to amplify hashtags and are mostly true robots, but it's a well-known one.
posted by tavella at 4:04 PM on December 3, 2019 [36 favorites]


Do you want to win or do you want to be right and lose?

People have a different idea of how to win, I suspect.

People are being bizarrely naive about this.

I would guess most everybody arguing about this stuff believes this of the people on the other side.
posted by atoxyl at 4:58 PM on December 3, 2019 [2 favorites]


If you want a non-political example, look at someone like David "Avocado" Wolfe. He gets lots of followers with feel-good affirmations then turns it into a channel for new age quackery and monetizing his followers.
posted by CheeseDigestsAll at 5:01 PM on December 3, 2019 [6 favorites]


A more wide-reaching effect of this seems to be that it's giving people a handy excuse to dismiss any critique of power they find uncomfortable with "this must be a bot". And it's driving a few high-profile people positively off the edge, like Jen Kirkman tweeting that AOC was a "Russian tool".
posted by Space Coyote at 5:16 PM on December 3, 2019 [3 favorites]


And it's damaging to places like Twitter, where anyone with fewer followers than whatever threshold the reader picks can be dismissed as a bot without further comment.
posted by sneebler at 5:24 PM on December 3, 2019 [1 favorite]


Boy. Sounds like it’s working.
posted by orange ball at 5:28 PM on December 3, 2019 [2 favorites]


Alternately, why don't we just all unite around breaking apart abusive power?

Russia, which embraced neo-fascism a decade ago, counts as an abusive power here. Just because Russian disinformation happens to be posting things you agree with does not make them the good guys.
posted by Merus at 5:29 PM on December 3, 2019 [17 favorites]


It's weird how the centrists calling for compromise and unity never seem to be offering any compromise themselves. They never seem to think THEY had better get ready to carry some water for someone they don't prefer in order to save the republic.

I guess it's because "compromise" only points rightward in the mind of a centrist.

Heck, if I were a foreign strategist trying to sow discord, and I was trying to keep the center alienated from the left, I would be telling the center, "You're right. I'm controlling them and poisoning them against compromise with you! Only you see the threat I pose! They are practically my foreign agents, repeating the words I tell them muahaha! They will deny it, but that's because they're delusional!"
posted by fleacircus at 5:45 PM on December 3, 2019 [7 favorites]


Yes, that's what I took away from this article too: the real problem is the centrists.
posted by chuntered inelegantly from a sedentary position at 6:03 PM on December 3, 2019 [4 favorites]


Perhaps it's time to call on the big social media companies to ban all Russian accounts immediately.
posted by Philemon at 6:14 PM on December 3, 2019 [2 favorites]


There's an apocryphal quote often attributed to Lenin that goes something like "the capitalists will sell us the rope we use to hang them," but hey presto, it looks like it was right after all.
posted by Gelatin at 6:38 PM on December 3, 2019 [4 favorites]


If your immediate reaction to this article is to get defensive then you are serving as a perfect example of why these techniques work so well.

If your first thought about psy-ops is that it could not possibly work on you, then guess what, you are exactly the type of person being targeted.

This shit only stops working when we are rigorously self-aware and willing to question our own doctrines--not just those of other people. You have to be willing to interrogate yourself because this shit targets the lizard brain, the gut, and it is extremely difficult to stay abreast of how your lizard brain is influencing your behavior.
posted by Anonymous at 6:43 PM on December 3, 2019


But if the critique is legitimate in its own right, how much do the propagandist's motives matter?

Rather than reiterate something I've argued before, I'll just link to my previous comment about how factual, true, and arguably relevant information can absolutely be weaponized in order to cause bad decisions by bad actors.
posted by Justinian at 6:47 PM on December 3, 2019 [5 favorites]


Russia, which embraced neo-fascism a decade ago, counts as an abusive power here. Just because Russian disinformation happens to be posting things you agree with does not make them the good guys.
I mean... yes/agreed? What would've given the indication otherwise here?
posted by CrystalDave at 6:48 PM on December 3, 2019 [3 favorites]


What if some Russian doesn't work for the IRA but makes fun of Elizabeth Warren or whatever just inspired by the regular personal troll impulse, is that allowed?
posted by save alive nothing that breatheth at 6:58 PM on December 3, 2019


The point of the article is not "stop caring about the things you care about." It's "stop amplifying the influence of automated accounts, even if you agree with their posts." Every like, retweet, and follow gives a bot more influence and its controller more reach for spreading any message they like, including ones you might not agree with.

When you see a post that resonates with you, ask these questions:
"Can I independently assess the accuracy of the content?"
"Is this someone I know personally?"
"Does this account look like it's backed by a real person?"
"Do this account's followers look like real people?"

If you can't answer at least two or three of those questions affirmatively, don't bless them with a like, a follow, a retweet, or a share.

DON'T use follower count as a proxy for assessing legitimacy. It's ridiculously cheap to buy 1,000 or 10,000 artificial followers.
posted by simra at 7:08 PM on December 3, 2019 [20 favorites]
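simra's checklist can be read as a simple threshold rule. A minimal sketch, assuming invented field names and an invented pass threshold of two checks (real vetting is a manual judgment, not a score):

```python
from dataclasses import dataclass

@dataclass
class Account:
    """Hypothetical record of the four checks in the comment above."""
    content_verifiable: bool      # "Can I independently assess the accuracy?"
    known_personally: bool        # "Is this someone I know personally?"
    looks_like_real_person: bool  # profile history, photos, writing style
    followers_look_real: bool     # spot-check of who follows the account

def worth_amplifying(a: Account, threshold: int = 2) -> bool:
    """Return True only if at least `threshold` of the checks pass."""
    checks = [a.content_verifiable, a.known_personally,
              a.looks_like_real_person, a.followers_look_real]
    return sum(checks) >= threshold

# An anonymous account passing only one check vs. a known friend.
anon = Account(True, False, False, False)
friend = Account(True, True, True, True)
print(worth_amplifying(anon), worth_amplifying(friend))  # False True
```

Note that follower count appears nowhere in the checks, matching the comment's warning that it is trivially cheap to fake.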


At the risk of sounding flippant, the thesis of the piece and the actions it is calling on people to take are the following:

1. Don't RT randos.
2. Don't waste your time getting into stupid fights on the internet.

That's it.
posted by capricorn at 7:35 PM on December 3, 2019 [8 favorites]


How do y'all feel about calling Kamala Harris a cop?
posted by Reyturner at 7:39 PM on December 3, 2019 [1 favorite]


>1. Don't RT randos.
>2. Don't waste your time getting into stupid fights on the internet.


Also:

* Russian propagandists aren't just tweeting out fake news bits aimed at ultraconservatives, the way you likely imagine they are. Their methods are quite a bit more subtle and are aiming to decenter and destabilize American politics and society by aiming at quite a few different targets.

* Their methods are far more informed by marketing and P.R. methods than "fake news" per se.

* Know your enemy; we need to understand their methods and their objectives even if--or especially if--they run opposite to our expectations.

* 90% or maybe even 99% of what they spew out is, by design, either innocuous populist stuff designed to get likes and retweets, or perfectly true ideas or facts, also designed to gain your trust, likes, and retweets. These things, even though true or innocuous, are not the purpose of the propaganda. They serve the purpose of building other users' trust in the accounts, and helping build followers. But they themselves are not the intended propaganda message.

* That in itself helps explain why it is harmful to follow, like, retweet, etc, these type of accounts even for things that are perfectly true or that you perfectly agree with.

* The purpose of the 99% of innocuous messages is to prime you for the 1% that twists the knife. One example from the article is the "my cousin ... polled over 1,000 conservative Christians. 'What would you do if you discovered that your child was a homo sapiens?'"

That OF COURSE never really happened; it's playing straight into the stereotypes many on the left have about conservative Christians (both prejudiced AND dumb--yeeargh!!!11!); it's reinforcing those (untrue) stereotypes; it's sowing political discord.

That's their purpose.

They are subtle, not always direct, about it.

And if I may say so: People who act like that need to be shunned, whether or not they are Russian trolls.
posted by flug at 8:22 PM on December 3, 2019 [23 favorites]


> it's sowing political discord.

Putting into MeFi terms: On Metafilter, a discussion like the "homo sapiens" example would be shut down PDQ by the mods. We all appreciate that because we understand that kind of discourse never leads to productive conversation or mutual understanding. It leads to anger, frustration, and division. And, of course, it is grossly insulting to a large segment of the population.

Instead of shutting this kind of discourse down, like the MeFi mods do, the Russian trolls' objective is to seek out exactly those kind of incendiary topics, pour gasoline on the fire, and stand around fanning the flames.

The Russian trolls are the anti-Metafilter mods.

Their objective is to turn the entire internet into anti-Metafilter.
posted by flug at 9:01 PM on December 3, 2019 [14 favorites]


How do y'all feel about calling Kamala Harris a cop?

Incredibly good, especially seeing her suspend her campaign.

It is pretty fascinating seeing American liberals, who have been fine with their political icons interfering with the governments of other countries for decades, suddenly cry foul when the shoe is on the other foot. The chickenhawks have come home to roost, or something.
posted by Ouverture at 10:02 PM on December 3, 2019 [11 favorites]


No. When they say "disinformation" they mean using (stolen) Facebook data to serve ads lying about Hillary Clinton to voters targeted as likely to believe them.

You wonder why they bother when Fox News does it for free.
posted by MartinWisse at 11:06 PM on December 3, 2019 [1 favorite]


How do y'all feel about calling Kamala Harris a cop?

I mean, she (in)famously liked to describe herself as California’s “top cop.”
posted by Beware of the leopard at 1:33 AM on December 4, 2019 [2 favorites]


Sorry, calling this “a new red scare” is ridiculous. Do you not believe the Russian government is interfering in elections in western democracies using these means? Do you think it’s actually the Ukrainians, as Devin Nunes (who appears to be working for Russia) claims?

Russia is not communist and has no interest in imposing communism. “Red” is a clueless reference. They are fascist, and their interest is in sowing conflict and disruption. And it’s been proven over and over again that they are doing this on a massive scale.

Progressives who trivialize this as a “new red scare” are ignorant of history, using a totally upside down argument, and acting as always as useful idiots for Russian despots. And you’re making common cause with reality-denying Russian agents, or otherwise useful idiots, in the GOP.

If you don’t really believe Russia is a threat to western democracies you’re in denial.

Also I don’t see any blacklists or show trials.
posted by spitbull at 5:04 AM on December 4, 2019 [16 favorites]


If you go to a protest and the Bob Avakianites give you a flyer, are you going to make a jillion copies and send them to all your friends? No? Same principle. The number of people objecting to basic concepts of digital literacy and knowing who benefits from your doing something before you do it is kinda depressing.
posted by PMdixon at 5:23 AM on December 4, 2019 [1 favorite]


Kamala Harris is an interesting case for a lot of reasons.

The "Kamala is a cop" meme appears to have genuinely gotten in their heads. And when addressing it, her campaign tried to spin it as a foreign disinformation campaign despite that same campaign's desire to lean into her tough on crime image. "Justice is on the Ballot" was literally a slogan they used.

She's also a great example of why "foreign influence" as a primary vector of misinformation is hard for a lot of people to take seriously (or at least more seriously than the harm that media bias in favor of capital and the status quo already poses). In this very thread her support for Sanders' M4A bill was cited as evidence that the Democrats as a party are ideologically to the left despite Harris spending the intervening time walking that support back with conditions and 10 year timelines and assurances to the insurance industry that they'll "still have a role".

So, I don't think any but the tankies of tankies deny that The Russians are trying to influence politics in the west because of course they are, they'd be fools not to. But certain parties trying to imply that they're the biggest or even only problem, and if it weren't for them things would be normal, presents as equally foolish.
posted by Reyturner at 6:21 AM on December 4, 2019 [5 favorites]


In this very thread her support for Sanders' M4A bill was cited as evidence that the Democrats as a party are ideologically to the left despite Harris spending the intervening time walking that support back with conditions and 10 year timelines and assurances to the insurance industry that they'll "still have a role".

Literally all the statement you're referring to said was that looking at the recent history of D primary fields and platforms, this one is to the left of where it has been. That is obviously true. You are arguing against a claim that was not made. Can you help me understand why?
posted by PMdixon at 6:42 AM on December 4, 2019


It got in their heads because it's so effective, short and simple but as Reyturner's link says it's clearly referencing a broad range of decisions made and people hurt.

It also worked for a bunch of memes which need multiple characters, and it's interesting to see those characters develop as the primary continues. Bernie as chill and cool with things in contrast to Kamala as a cop is something I've seen a lot of.
posted by Acid Communist at 6:43 AM on December 4, 2019


Can you help me understand why?

As soon as you help me to understand why the purported support for those positions by most of these candidates is not obviously cynical pandering with no intention to follow through and how assertions to the contrary are totally bizarre.
posted by Reyturner at 6:53 AM on December 4, 2019 [1 favorite]


*aren't totally bizarre.
posted by Reyturner at 7:13 AM on December 4, 2019


As soon as you help me to understand why

Intentional or not, this is an excellent example of a propaganda technique called tu quoque, or ‘whataboutism’, which has been used very, very effectively by Russians since the days of the USSR.

(To be clear: I am not calling anyone a Russian troll; however, I do notice that the disingenuous, disruptive rhetorical techniques used by Russian trolls have seeped into social discourse on a much larger scale. Whether or not we realize that we’re being influenced, our behavior toward one another is being influenced by this. Russian propaganda and disinformation is very successfully influencing and reshaping American public discourse, and likely a lot of private conversation, as well.)
posted by LooseFilter at 7:42 AM on December 4, 2019 [11 favorites]


Calling something "Russian propaganda" can of course itself be a form of propaganda. Even when it's true.

I have no doubt the Internet Research Agency is doing these things. And also no doubt that there are American groups also working in a similar manner, many of them for profit.
posted by Foosnark at 7:51 AM on December 4, 2019


Whataboutism is toxic to discourse whether it's a calculated attack or not. It should be consistently rejected rather than treated as a useful contribution to a conversation.
posted by PMdixon at 7:59 AM on December 4, 2019 [2 favorites]


Going back to the article, it's worth clicking on the link for Digital Civility. It'd be better phrased as Digital Civics. They're not calling for tone policing (at least in the linked article, it felt that there was a little of that in the RS piece), but instead to be aware of (and teach) how to spot bots and trolls, especially sympathetic trolls. (A favorite of mine that a friend posted was that the March of Dimes was so named because they gave a dime out of every dollar they received, rather than the truth, which is that they were organized in the 40s and asked for dime donations.)

I'm not a fan of Biden. I'm not going to vote for him in the primaries, although I'm at the Yellow Dog stage in voting against Trump. And I agree with most of the criticisms about him. That said, facts should be checked before memes are shared. That's really what it boils down to. That's the thesis of the piece. There's some bits about being less divisive, something I think came much more from the right than the left (read the piece about Trump and evangelicals that's to the side for a decent account of that), but I think it's more important to focus on the fact checking than anything else.
posted by Hactar at 8:08 AM on December 4, 2019 [5 favorites]


Pointing out that a candidate doesn't support a policy that they obviously don't support, and then pointing out that citing that candidate as evidence of broad support for that policy is strange, isn't whataboutism, tho.

I didn't go into how that's also true of the rest of the named candidates because I was specifically talking about Harris.
posted by Reyturner at 8:10 AM on December 4, 2019 [1 favorite]


Responding to a statement that is explicitly stated in relative terms as if it were stated in absolute terms is not good faith engagement.
posted by PMdixon at 8:15 AM on December 4, 2019


Ok, I will concede that, due to popular pressure, many candidates, including Harris, have made statements supporting leftist causes that they have since walked back and equivocated into meaninglessness, which is more than can be said about previous campaigns.
posted by Reyturner at 8:32 AM on December 4, 2019 [1 favorite]


Yes, it is. And that matters.
posted by PMdixon at 8:35 AM on December 4, 2019


I mean, declining support for M4A correlates with not knowing what it is, and there is a lot of domestic money and power invested into keeping people ignorant.

And the more candidates muddy the waters with their fake support, the more harm is done to our ability as a society to address structural injustice. And dismissing criticism as divisive foreign agitprop is absolutely a tool being used and abused for that purpose.

At the end of the day, the potential harm of Russian "chill Bernie" memes strikes me as less urgent than the actual harm being done by capital and the power establishment in an effort to protect profits at the expense of the most vulnerable people.

Foreign influence is a problem, but not the biggest by a long shot.
posted by Reyturner at 9:17 AM on December 4, 2019


Foreign influence is a problem, but not the biggest by a long shot.

It’s not a competition, though, and we don’t talk only about [what you think are] the biggest problems, especially in a discussion about the topic of this post.
posted by LooseFilter at 9:38 AM on December 4, 2019 [3 favorites]


It’s not a competition

Yeah politics is not some kind of contest of powers trying to spend their resources and mobilize their supporters to exercise political will.
posted by Space Coyote at 10:07 AM on December 4, 2019 [5 favorites]


But the polls also suggest that voters are unaware of single-payer’s most popular features. When Navigator asked respondents to pick their top three priorities for health-care policy, reducing out-of-pocket costs, premiums, deductibles, and drug prices were by far the most commonly cited. Meanwhile, “ensuring that you can keep your existing insurance coverage” ranked next to last

Which is to say, voters ostensibly care a lot more about cutting their costs than they do about avoiding changes to their existing coverage (note that even keeping one’s current doctors did not rank as a high priority).


But go off.
posted by Reyturner at 12:04 PM on December 4, 2019 [2 favorites]


Well yeah, if by "maybe you could pitch M4A this way" you mean "maybe you could tell people what it actually does in plain language".
posted by Reyturner at 12:19 PM on December 4, 2019


But if the critique is legitimate in its own right, how much do the propagandist's motives matter?

It seems like it would depend on how the critique is framed and supported, and I think that does flow from the propagandist's motives.

If I, a propagandist, want to support a cause directly, I'm going to try to be enticing and persuasive, and if I'm using a legitimate critique, presumably I'm supporting a "there is truth and there is falsehood and they can be established by study and reasoning" model. So if I'm putting forward a legitimate critique, I'm putting it forward in a way that emphasizes its truth and the possibility of truth. Here, I would argue that any old propagandist telling the truth, even if it's an ugly truth being told for ulterior reasons, is on pretty firm ground. "Don't tell people about the evil doings of the United States, we should keep those quiet for the greater good" is not something we should accept.

If I, a propagandist, want to fuck with people and create a sense of instability, etc, I'm going to frame my critique in ways that cut against "there is truth and we can establish it through study and reasoning" - in fact, I'd probably rather advance a weak case that will get people het up and make them uncertain than advance a strong one. The propagandist in this case doesn't care about the effectiveness or morality of what they're sharing and in fact wants to make people feel that all news is unreliable and that it's basically impossible to get actionable information. I, such a propagandist, would be just as happy to spread racist rumors as truthful information.

Like, consider badly formatted, partially incorrect, misleadingly framed tweets by people I agree with. Those really make me nervous because I feel like I have to do a lot of drilling down to establish what's actually going on and yet it's tempting just to trust them because they're people I agree with. And that's not any kind of bot, it's just someone who lets point-scoring get ahead of their common sense. Those tweets make things worse.

So I guess I'd say that the outcome of propaganda depends on the goals of the propagandist, and that anyone who thinks that any state with a big security apparatus is somehow operating on the side of the common people instead of amorally serving its own interest is going to be in for a pretty big shock down the road.
posted by Frowner at 1:42 PM on December 4, 2019 [2 favorites]


Eliminates private insurance companies? 58-37 against.
I'm trying to imagine the people who would rather break a bone and go bankrupt than tear down those ghouls.

Raises taxes? 60-37 against.
If we accept that "raises taxes" is an automatic disqualification, regardless of actual cost to the consumer going down, literally nothing can be done.

And the Republicans are going to call anything that any Democrat does communist fascism, so who gives a fuck what they think.
posted by Reyturner at 3:10 PM on December 4, 2019


I'm trying to imagine the people who would rather break a bone and go bankrupt than tear down those ghouls.


The polls in this case demonstrate that for whatever reason people do not feel it is a choice between bankruptcy and tearing them down. You can disagree with that opinion, but the existence of these people in overwhelming numbers is a fact, and I'm not sure what the point is of burying your head in the sand.

Republicans are going to call everything communism, but the effectiveness of that critique is heavily dependent on policy chosen and the breadth and depth of propaganda. I think it is ridiculous to pretend like the Russian intelligence apparatus doesn't heavily weight the propaganda efforts in the Republicans' favor.

It is really disturbing how many people would rather clap their hands over their ears and close their eyes because their preferred candidate happens to be the beneficiary of this work. Do you really think that's happening because the Russians just think they're the best candidate?
posted by Anonymous at 6:01 PM on December 4, 2019


The Russian social media subversion campaign is sophisticated. I often wonder at the provenance of memes and factoids on twitter, imgur, reddit -- any of these semi-public sites (the semi-private ones, like discord, facebook, I've entirely given up on, as they're just ludicrously susceptible to entryism). Remember the woman who bleached manspreaders?

I'm sure it's not just the Russians btw. Recently I saw somebody post a video where a lone police officer in Hong Kong gets kicked around by people wearing masks, gloves, alongside a bit of defiant / triumphant text / commentary. I seem to recall the clip starts out kind of exciting with a spectacular bit of bravery by one of the masked persons, then as the cop falls to the ground and more people move in to start kicking him, the footage rapidly becomes quite unpleasant. I can see how somebody might have posted that clip out of an authentic desire to share a powerful moment. But I also see how it portrays supporters of the HK protests as thugs who celebrate violence. False flag? Different perspectives? Fog of war? Who knows?

The one thing that's certain is controversy draws lots of angry comments from lots of angry people, stirs a lot of bad blood between people. When you build something, you have to carefully make sure that every single part fits together. But when you want to destroy something, you generally only need to mess with two or three parts before the whole thing comes tumbling down.

That's why I think in the final analysis we ought to include an unflinching look at our own vanities. I'm not fond of the word "performative" but it's difficult to deny that likes & views & RTs convey social value which tends to accrue to the most timely or the most popular or the most outrageous user.
posted by dmh at 6:49 PM on December 4, 2019 [3 favorites]


In comparison the democratically countenanced efforts at information laundering and counter-influencing seem to suffer from an embarrassing attitude of ineptitude: A Muslim online lifestyle platform targeting British teenagers is discreetly funded by the Home Office’s counter-extremism programme, the Observer has learned.
posted by dmh at 7:10 PM on December 4, 2019


And the smear campaign spins up again..
HRC to Stern, re Russians: “Basically, they were like, hey, let’s do everything we can to elect Donald Trump... They also said Bernie Sanders, but that’s for another day.” #VastRussianConspiracy https://twitter.com/maxblumenthal/status/1185342091351777286
Anyway, looking forward to being told that better things aren't possible because, while you might be passing along this piece of content about Medicare For All because you think it's funny and has a good moral message, one can never discount the possibility that it's Russia making you do it.
posted by Space Coyote at 7:01 AM on December 5, 2019


Anyway, looking forward to being told that better things aren't possible because, while you might be passing along this piece of content about Medicare For All because you think it's funny and has a good moral message, one can never discount the possibility that it's Russia making you do it.

Man, I'm frustrated seeing this rejection of this piece! And I think I'm especially frustrated because the people I'm following who are most concerned by this propaganda initiative are leftist progressives who are deeply worried about discord sowed by these kinds of inflammation resulting in the breakup of coalitions that might otherwise achieve our goals.

I hang out on Tumblr a bunch, right? And I've watched these accounts spread. I've watched the kinds of inflammation and discord they leave in their wake. I've even followed one or two by mistake. I've paid close attention to the kinds of specific intra-progressive fights that these accounts spread, too--in part because at least one of them is an existing fault line within queer communities I've been watching for some time. Centrists are not the only people who are concerned about this, and they are also not the only people who should be concerned about this kind of propaganda.

Here is the person whose writing on counter-propaganda and monitoring Russian influence I follow most closely. She's an anarchist herself, and you can see her own political opinions very clearly in the specific examples of the stuff she is citing and talking about and the examples she uses.

These discussions are targeting marginalized people and trying to sow despair. They are trying to get you to decide that we can't really do better by each other. These trolls want you to think there is no hope, they want you to fight each other based on kneejerk categorizations without stopping to evaluate what specifically is being said, and they want you to be mistrustful of anyone who doesn't say things you already believe.

It's not about your political position. It's about your ability to trust people who disagree with you on some things and your ability to form coalitions with them anyway. It's about the ability to ramp down aggression and find meaningful agreement with people, at least in the short term. It's about fact-checking and being skeptical of information that looks too good to be true, and it's very much about trying to trust each other to be trying to find connection even when we are scared and angry. That is not necessarily a centrist position or even a liberal position: it's a position on how we as humans should interact with each other. You don't need to be convinced by everything another person believes in order to try to communicate and find common ground with them on specific topics--indeed, making coalitions with people with whom you disagree on other items to achieve specific goals is how effective politics happens at all. But that doesn't mean that you have to change your values or your opinions on what can and should be achieved over a long term. And it certainly doesn't mean that you shouldn't be trying to influence the people you are building coalitions with over to your side, too.

I just--we have to take this shit seriously. We have to be kinder to each other. I am not saying we need to go sing Kumbaya with a bunch of Nazis, but I am saying that within the left we need to be careful about throwing babies out with the bath water. If we are going to build a better world with better things, we need to convince both ourselves and people who don't necessarily agree with us on first principles: and that means being careful with our contempt and our rage, and generous with our compassion and willingness to de-escalate as much as we possibly can.
posted by sciatrix at 7:36 AM on December 5, 2019 [20 favorites]


Don't listen to sciatrix, did you know that Hillary Clinton and the DNC desperately want to keep you from finding out that you have a chance to win an even money bet on snake eyes coming up on a pair of dice of my choosing?
posted by PMdixon at 7:44 AM on December 5, 2019


I mean. That kind of biting "this is obvious, you've got to be foolish if you don't agree" comment is kind of.... what I'm talking about as not being helpful. Right? If someone has legitimate concerns ("taking this seriously feels like it is going to be used to shut down my attempts to organize to make a better world"), the best way to grapple with those concerns is to hear them out fairly and then explain why they're wrong. The best way to counter these trolls is to decrease inflammation in both leftist and liberal spaces. That means that you want to de-escalate conflicts and provide room for people to find points of agreement, not leave people fuming over a well-pointed gotcha.

Leaving those comments feels super good--god knows, I left one myself yesterday--but they don't serve to help inoculate communities against the kind of discord that Russian users are sowing. We need to decrease inflammation as much as we possibly can, especially in the presence of paid propagandists who are explicitly trying to increase it. The best way to do that is to honestly listen to people's concerns and take them seriously. That means you have to enforce those principles as much as possible within your community spaces. You work on encouraging people to take deep breaths and work through fear and anger to de-escalate. You eject people from community spaces who refuse to do that work or who actively incite inflammation in others. You work on inclusivity and accessibility, including and especially to people who don't intuitively understand you right off the bat, and you listen as much as you speak.

When I left that short, bitey "well, gotcha!" comment, I thought about it for a few minutes... and then I flagged a mod to delete it. (The comment it was responding to was also deleted.) I didn't need to humiliate the person I was responding to, who is a decent community member and human being that said something dumb. I didn't need to have my own biting, immediate wit recognized, either. I just needed the original dumb thing to be either not present or pushed back against, and there are ways to do that that cause less inflammation than the sharp, sarcastic little quip I put in place to begin with.
posted by sciatrix at 8:13 AM on December 5, 2019 [10 favorites]


And the smear campaign spins up again

Two tweets up from that one this same guy is ranting about how Jill Stein has been unfairly smeared. He don’t seem so credible.

You know what is actually true, though? I remember seeing the worst of the misogynist smears against HRC, the most misleading memes, get laundered through the Sanders subreddit. Like someone would share something from a tumblr or Facebook or whatever, and suddenly it’s now on reddit, and then it’s out in the broader ecosystem. The SandersforPresident subreddit was moderated, for a long time, by people affiliated with the Sanders campaign. Tad Devine, who was with the Sanders campaign, was interviewed by Mueller and had to testify in a case against Paul Manafort, because they worked together in Ukraine in that shady election that presaged the 2016 clusterfuck in the US.

And even if that doesn’t make you suspicious, given that we’re in the worst timeline and far less crazy things have happened regularly in the last 3 years, the Russian support for Sanders is a part of, I think, several indictments? (Some of the same indictments that have named Jill Stein, so maybe that explains why that ranting tankie is not having it.) Like it is now part of the public record.

Sanders hasn’t commented publicly on it, he just likes to benefit privately from it. And it fucking sucks. I have no idea why anyone thinks you should trust a guy who does this.
posted by schadenfrau at 9:13 AM on December 5, 2019 [3 favorites]


And it's driving a few high-profile people positively off the edge, like Jen Kirkman tweeting that AOC was a "Russian tool".

I'm sure there's other examples, but Kirkman was a low-key 9/11 truther back in the day. She's been off the edge for over a decade.
posted by Lentrohamsanin at 1:28 PM on December 5, 2019




Yeah, on that, can anyone clarify, is any of the information being cast into doubt? All the articles call it a disinformation campaign, and link it to previous occurrences where the information was false, but they're less clear on whether there is actually any reason to doubt the documents themselves.

Which would be part of why I'm suspicious that attempts to rein in Russian interference are going to conveniently undermine the left.
posted by Acid Communist at 8:05 PM on December 6, 2019


Well there's that word: disinformation.

"Misinformation" is falsehoods.
"Disinformation" can include truths your superiors would rather remain undisclosed, obscure, or neglected.
posted by save alive nothing that breatheth at 8:19 PM on December 6, 2019 [1 favorite]


truth mixed with lies ... and it's like mustard gas in WW1. All sides are using it, sometimes inadvertently attacking their own side. Fog of war and all that.
posted by philip-random at 8:54 PM on December 6, 2019




This thread has been archived and is closed to new comments