Reddit is Being Manipulated By Big Financial Services Companies
February 27, 2017 10:11 AM   Subscribe

Reddit is being regularly manipulated by large financial services companies with fake accounts and fake upvotes via seemingly ordinary internet marketing agencies.

“I work with a number of accounts on Reddit that we can use to change the conversation. And make it a bit more positive.” This was the startling admission of a professional-looking marketing agency that, in a phone call with me, openly bragged about manipulating conversations on Reddit. This wasn’t a one-off, nor was it the result of weeks of plumbing the depths of the dark web looking for shilling services. Finding this agency, and several others, took less than a few hours of basic Googling.

The business of Internet shilling - posing as a genuine forum user while in the employ of a corporation paid to promote its work - is booming. And it has been for a long time. From fake Amazon reviews to the U.S. Army astroturfing social media, comment manipulation is as old as the very concept of internet forums. Because fake comments and fake conversations are hard to spot, especially when they're made by specialist agencies, shilling is big business. Nowhere is this more apparent than on Reddit: as the world's 22nd most popular website, and the 7th in the U.S., it attracts hundreds of millions of eyeballs every month, which makes it a popular target.
posted by Bella Donna (60 comments total) 34 users marked this as a favorite
 
Meanwhile, my Reddit account was shadowbanned some time ago without explanation, and with no replies to any of my attempts to reverse the decision. So now I just read and don't attempt to contribute at all. More room for the marketers to take over, I guess.
posted by Servo5678 at 10:14 AM on February 27, 2017 [3 favorites]


Figures.

Gaming the system: it's the American Way.
posted by ZeusHumms at 10:19 AM on February 27, 2017 [2 favorites]


The Reddit discussion on this is depressingly cynical. A couple of hours after it was posted, the conversation was basically "well duh, everyone knows that". Where's the appropriate outrage?

The big question is whether Reddit, Inc is in any way cooperating with the shills. At the end of the discussion, the top comment is calling out Reddit's owners, saying they really need to comment on this. So far no answer, although in past controversies they sometimes take their time and then write a thoughtful response days later.

I tend to think Reddit, Inc is probably on the right side of this. Fighting manipulation is hard. There are also all sorts of shades of manipulation. /r/leagueoflegends, for instance, feels totally artificial to me. But in that case I don't think it's manipulated by professional marketers so much as by "influencers" who are just very good at working Reddit's algorithms.

See also vote manipulation by Donald Trump supporters.
posted by Nelson at 10:20 AM on February 27, 2017 [5 favorites]


The big question is whether Reddit, Inc is in any way cooperating with the shills.

There is no way the people running Reddit aren't aware of this. The methods described in the article aren't super-secret hacker exploits; they're very simple tricks that could be easily defeated if management cared to do so. They either don't care or are actively profiting from allowing this.
posted by Sangermaine at 10:24 AM on February 27, 2017 [9 favorites]


Vote brigading and other similar techniques may be relatively simple to combat, but I dunno what you do about 9-year-old accounts with thousands of karma being used to insert occasional shill shitposts into otherwise normal account activity. The worst part is that shilling is believed to be so widespread that any strong argument or opinion is automatically dismissed as the work of a shill. When I was advocating for HRC prior to the expulsion of /r/the_Donald-type posters on /r/politics, I was regularly accused of being a paid CTR shill. Now many pro-Trump posts are dismissed as Russian disinfo campaigning. The existence of fake conversations does real harm to legitimate expression, because people behave increasingly antisocially and/or retreat further into echo chambers.

I find myself longing for the old days, when internet commerce was derided on Usenet forums where we shared our favorite gopher sites and IRC channels.
posted by xyzzy at 10:37 AM on February 27, 2017 [18 favorites]


They either don't care or are actively profiting from allowing this.

Unsurprisingly, this is also why they're totally cool with becoming the world's largest white supremacist site.
posted by zombieflanders at 10:39 AM on February 27, 2017 [38 favorites]


I dunno what you do about 9-year-old accounts with thousands of karma being used to insert occasional shill shitposts into otherwise normal account activity.

Simple: you ban them. It's really that easy.
posted by NoxAeternum at 10:40 AM on February 27, 2017 [11 favorites]


First you have to identify the shill shitpost.
posted by Nelson at 10:46 AM on February 27, 2017 [3 favorites]


If the activity is obvious, I agree that banning would work. Is it really that obvious in most of these cases?
posted by soelo at 10:46 AM on February 27, 2017


FTC in theory should be able to go after companies selling or commissioning astroturfing. State AGs too: e.g. NY. I could maybe see a claim under a state unfair competition or consumer remedies act.
posted by snuffleupagus at 10:46 AM on February 27, 2017 [4 favorites]


Nothing like that could happen here, though, right?
*opens a delicious bottle of ice cold Pepsi Blue...mmm, refreshing!*
posted by sexyrobot at 10:47 AM on February 27, 2017 [33 favorites]


Simple: you ban them. It's really that easy.
Obviously, but how do you identify them in an efficient manner? It says right in TFA that mods usually give old accounts with varied activity the benefit of the doubt, and no one modding a huge sub like /r/politics has time to do extensive research into the provenance of every account posting something that doesn't quite pass the smell test. So it would be up to reddit to develop a tool to make this easier. And I don't know what that tool would do. Sentiment analysis? What?
posted by xyzzy at 10:47 AM on February 27, 2017 [5 favorites]
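The tool xyzzy is imagining wouldn't necessarily need sentiment analysis; a cruder first pass is to compare an account's recent posting mix against its long-term history and flag sharp divergence. A minimal sketch in Python, purely illustrative (the metric, names, and data are my assumptions, not anything Reddit is known to run):

```python
from collections import Counter

def topic_shift_score(history, recent):
    """Compare an account's recent subreddit mix against its long-term
    history; a high score means recent activity looks unlike the past.
    Both arguments are lists of subreddit names, one entry per post."""
    hist = Counter(history)
    rec = Counter(recent)
    hist_total = sum(hist.values()) or 1
    rec_total = sum(rec.values()) or 1
    # Total variation distance between the two posting distributions,
    # ranging from 0.0 (identical mix) to 1.0 (completely disjoint).
    subs = set(hist) | set(rec)
    return 0.5 * sum(
        abs(hist[s] / hist_total - rec[s] / rec_total) for s in subs
    )

# An old account that suddenly pivots to one brand subreddit scores high.
old = ["AskReddit"] * 50 + ["gaming"] * 30 + ["pics"] * 20
new = ["SomeBrandSub"] * 9 + ["AskReddit"]
score = topic_shift_score(old, new)
```

Total variation distance is only one of many plausible choices; the point is that an old account pivoting hard into a single brand subreddit is cheap to surface automatically, even if a human still has to make the final call.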


Discovery of who the astroturf vendors have paid? (By Reddit.)
posted by snuffleupagus at 10:49 AM on February 27, 2017


Pepsi Blue, surf the rainbow!
posted by sexyrobot at 10:50 AM on February 27, 2017 [2 favorites]


> no one modding a huge sub like /r/politics has time to do extensive research into the provenance of all accounts posting something that doesn't quite pass the smell test. So it would be up to reddit to develop a tool to make this easier.

This comment from an fpp a couple of years ago has an interesting (horrifying?) breakdown of the number of mods per sub. It must be outdated by now, but I can't imagine that the situation has improved.

One way of identifying and banning shills is to have human beings do it. If you don't have enough eyes on a sub, then you need more. And you probably need to actually pay them in money, not in good feelings and company scrip. But that would cut into the bottom line, and allowing rampant shilling isn't costing them more than they're willing to pay, so it's unlikely to change.
posted by rtha at 10:56 AM on February 27, 2017 [13 favorites]


For a giant site like Reddit, how big a problem is this? A paid contributor is still contributing, so as long as they aren't dominating the posting (which could, in theory, be noticed and looked into), what they say is just another take on whatever subject is involved. If it's transparently empty support for something, readers should be able to recognize that just as they would any empty claim from an unpaid contributor.

If that is the case, which it very well may not be, then it's really the upvoting that is the problem, since that isn't as transparent. Wouldn't that be the more important element to address?
posted by gusottertrout at 11:02 AM on February 27, 2017 [1 favorite]


That assumes that the shills and bigots can't influence or harass moderators with the implicit consent of the admins and owners, which they currently can:
“There have been multiple iterations of the_donald,” a Reddit spokesperson, who did not wish to be identified by name, told Gizmodo in a phone call. (CEO Steve Huffman previously said that The_Donald’s team of top mods had turned over at least four times over the year.) “The [moderator teams] that I’ve been involved with for the last six months or so, we’ve actually had a very close working relationship with. We share a Discord channel with them—their private chat. It’s been highly responsive when we need to ask them to take things down that are probably rule violations,” Reddit contended. (The moderation team of r/The_Donald did not respond to multiple requests for comment.)

If Centipede Central is the chat Reddit is referring to—a chat room within the Slack-like Discord program, the one linked in the sidebar of The_Donald and one of the largest servers on Discord—its users have encouraged the harassment of other moderators, artificially inflated the vote count on posts, rigged off-Reddit polls, and posted John Podesta’s personal Netflix login information for the chat’s 1000+ members to use at will.
[...]
Reddit has been conspicuously silent in shouldering blame for giving an audience and a recruitment center to this groundswell of bigotry, despite harboring its most prominent community—one which openly mocks the leadership and rules of the very platform that allowed it to proliferate. Reddit needs to contend with its own shrieking tide of ignorance and hatred now, and outgrow the notion that fewer rules is the shortest distance to greater authenticity.

“At this point, I think reddit is a lost cause because of the admins inability to take action on the group while simultaneously being overwhelmed with dealing with the individual,” a moderator told us. “No other subreddit has been able to be used [as] a platform for harassment for this long in Reddit’s history. And it’s likely going to be what kills it.” Said another: “The social experiment has run its course.”
Now you could say that's just about /r/the_donald, but given their pseudo-libertarian stances on money and speech, it would seem that Reddit would be more than happy to be paid directly by corporate hacks not to take action.
posted by zombieflanders at 11:07 AM on February 27, 2017 [6 favorites]


Well, there are two different questions here: is this a problem for reddit vs is this a problem in general.

It's not a problem for reddit from their POV unless the site gains a reputation for being full of shills and users start leaving. All reddit cares about is that people keep coming.

It's a problem in general because since reddit is so huge, manipulating something to the top can help spread your brand/disinfo and influence the greater narrative online.
posted by Sangermaine at 11:09 AM on February 27, 2017 [3 favorites]


The funny thing about operating on venture capital while you try to find a viable business model is that, sometimes, the viable business model finds you.
posted by radicalawyer at 11:10 AM on February 27, 2017 [13 favorites]


The "in Soviet Putin's Russia" joke is more relevant than ever now.
posted by Sangermaine at 11:14 AM on February 27, 2017 [2 favorites]


wait, you say there's unscrupulous marketeers ruining the internet? has anyone told laurence canter and martha siegel about this?
posted by entropicamericana at 11:15 AM on February 27, 2017 [10 favorites]


I read that as "Reddit is being mansplained by big financial services companies", and I thought "How is that even possible?".
posted by blue_beetle at 11:33 AM on February 27, 2017 [8 favorites]


has anyone told laurence canter and martha siegel about this?

Thanks for the flashback. And wow, man, have we come a long way since those days.
posted by JoeZydeco at 11:34 AM on February 27, 2017 [3 favorites]


Pepsi Red-dit?
posted by zachlipton at 11:59 AM on February 27, 2017


Reddit would be more than happy to be paid directly by corporate hacks not to take action.

Reddit has had considerable difficulty finding reliable revenue streams, despite its popularity. Lately it seems to be breaking even more often than not, but it's not profitable. Perhaps "shill bribes" will become part of the business model.

Though, I guess that's already in place, partially. "Reddit Gold" gives a post a slightly increased amount of visibility and legitimacy.
posted by honestcoyote at 12:32 PM on February 27, 2017


I wrote an article in 2004 about companies paying marketing agencies to infiltrate usenet groups plus other forms of stealth marketing. A local beloved coffee place, Peet's, was one example I used. As recently as 2014 Jezebel noticed a CraigsList ad in Los Angeles looking to pay Yelpers for false reviews. I've seen similar CL ads for the SF Bay Area. People are always, always, always trying to game the system. It drives me nuts but apart from exposing it, I'm not sure what else can be done.
posted by Bella Donna at 12:39 PM on February 27, 2017 [2 favorites]


I would have assumed that vote-ring and brigade detection would be pretty evolved by now.
posted by rhizome at 12:40 PM on February 27, 2017


I'd love to see further analysis of the data presented in this FiveThirtyEight article about language use on Reddit. They don't comment on it, but almost all the graphs of political memes and phrases show massive, literally vertical spikes that collapse just as immediately. This is quite unlike the more natural curves seen in the non-political language, and I think clear evidence of botnet-driven campaigns throughout the election season.
posted by Rhaomi at 12:41 PM on February 27, 2017 [11 favorites]


More than a few subreddits would be better off if financial services companies took full editorial control. Sure we'd hear more about how great Credit Suisse Group is, but at least the irrelevant noise would for the most part be more focused on the one topic and easier to mentally filter out.
posted by sfenders at 1:03 PM on February 27, 2017 [2 favorites]


*opens a delicious bottle of ice cold Pepsi Blue...mmm, refreshing!*

Sadly, the liquid inside is not nearly as delicious as the bottle.
posted by Sys Rq at 1:11 PM on February 27, 2017 [3 favorites]


A while back I created one new reddit account a day for (most of) an entire year, intending to always have a "cakeday" account to post from. I got bored of the project and abandoned it, and gave my accounts to a friend who works for a small pop culture website so that he could use them to promote material from said site. Unfortunately said friend flew a little too close to the sun somehow, and, long story short, any reddit account created from any machine I own is now immediately shadowbanned.

Pointless story aside, I really do think that we should think of participation on (effectively) unmoderated websites like reddit as not being about discussion so much as tactical advantage. If your idea of "using" reddit is posting your thoughts there, you're not using reddit. Really using reddit (or facebook, or twitter, or wikipedia, or whatever) means figuring out exploits that allow you to elevate your content and content favorable to you while suppressing the content generated by people you oppose. It's not a conversation, it's a game.
posted by You Can't Tip a Buick at 1:13 PM on February 27, 2017 [15 favorites]


I'd love to see further analysis of the data presented in this FiveThirtyEight article about language use on Reddit. They don't comment on it, but almost all the graphs of political memes and phrases show massive, literally vertical spikes that collapse just as immediately. This is quite unlike the more natural curves seen in the non-political language, and I think clear evidence of botnet-driven campaigns throughout the election season.

I don't know. For one thing, the point of branding is to build long-term relationships, and spikes are not really what you want for that. You want people to put their babies in Coca-Cola and Harley-Davidson shirts (even parents who don't own any product other than the shirt). I'd say Star Wars is one of the most successful brands of my generation, because have you tried finding adult pajamas that are not branded with iconography from 30 years ago?

The frequencies on the left side of the graphs are not terribly impressive. We're talking about analysis of the long tail of a Zipf distribution. The magnitudes of those scales differ from graph to graph. Most are about what you'd expect from something that hit national news and then faded out pretty quickly. "Nasty woman" and "bad hombre" almost certainly peak just after the debates where they were uttered.

That's not to say that there are not botnet political trolls in operation, just that I'm not convinced that spikes in the ass end of a Zipf distribution are strong evidence.
posted by CBrachyrhynchos at 1:22 PM on February 27, 2017 [1 favorite]


I'd take payment in lobster to shill for Metafilter.
posted by Sphinx at 1:27 PM on February 27, 2017


This is quite unlike the more natural curves seen in the non-political language, and I think clear evidence of botnet-driven campaigns throughout the election season.

I think it's just a matter of comparing these to other news-based memes. For example, "left shark."

Drumpf spiked immediately after John Oliver's segment. Others spiked after speeches, for sure.
posted by explosion at 1:28 PM on February 27, 2017 [1 favorite]


I used to do this back in the day so I'm weirdly pleased to see it still going on. Mine wasn't on Reddit, just forums where fanboys would hang out, but we'd create hundreds of accounts and then dive into the threads. As the article indicates, the most important thing to do is blend in with the site culture. Sites with off topic groups are a goldmine because you can walk into a political argument and leave with hundreds of posts in a couple days.

The irony was the difference between an actual fanboy and a paid shill was microscopic. As long as you were a fanboy for other things and trolled the hell out of competitors, you were pretty much undetectable unless the admins cared to look.

Most of them don't because more users equal more pageviews equal more ads equal them getting more money.
posted by Ghostride The Whip at 1:32 PM on February 27, 2017 [8 favorites]


I finally cut the cord with reddit, after hanging on far too long thinking it was useful to see what was influencing my (college-aged) students. But it's really just seeming like a lost cause, and I couldn't handle wading through all that crap anymore.

But it was fascinating, in December and January, seeing the same reactions from posters again and again in political threads. "Wait, I can still see my comment!" "Wow, the tenor has changed so much" and then, over and over again, watching regular users realize that outside manipulators had botted the everloving shit out of their conversations for months.
posted by range at 1:36 PM on February 27, 2017 [6 favorites]


Is there another source besides Forbes? I'm running noscript and don't really feel like exempting them…
posted by klangklangston at 2:49 PM on February 27, 2017


It's not a conversation, it's a game.

So true. It killed the Internet for me - I'm only recently participating as 'myself' again, after years of personal silence and professional gamesmanship. I'd find real communities and lurk like a voyeur, just marveling at people's ability to have genuine conversations inside of a marketing and surveillance machine. Between paralyzing cynicism and an abusive ex with Google Fu, I couldn't bring myself to participate unless hiding behind a puppet or brand identity.

I lurked here for fourteen years.

Something flipped around the time of the election and I just don't give a fuck anymore. I'm ready to party in the sausage factory.
posted by BS Artisan at 3:55 PM on February 27, 2017 [10 favorites]


Given the recent advertiser-friendly changes to the site's homepage, I'd be stunned if Reddit Inc. isn't on board with this.
posted by peppermind at 4:39 PM on February 27, 2017


Not limited to just social media. I don't trust Amazon reviews. I certainly don't trust magazine reviews. Come to think of it, I don't trust any reviews.
posted by Beholder at 5:14 PM on February 27, 2017 [1 favorite]


CBrachyrhynchos: "That's not to say that there are not botnet political trolls in operation, just that I'm not convinced that spikes in the ass end of a Zipf distribution are strong evidence."

I don't know a Zipf distribution from a Zapf dingbat, but is it not weird to see these massive spikes/collapses *and* more natural movement in the same chart? Look at "low energy" in the first graph, for example. You see a lot of modest, granular rises and falls throughout the summer, then it suddenly doubles overnight on November 15th (???), fluctuates just as gently for a few weeks, only to collapse by the same amount on December 6th. To me, that looks like a baseline of genuine usage that is suddenly boosted by automated activity.

And it's not just Trump spam -- check out "I'm with her." You see a very modest increase during the DNC in late July (when you'd expect everybody to be saying it), then usage skyrockets more than tenfold on October 1st, hovers up and down at that elevated level for a while, then plummets back to the previous baseline on October 22nd.

Compare with a less charged phrase like "yuge," which never sees these vertical changes. Or non-political phenomena like "oculus" or "pokemon". Even "button", referring to an old April Fool's Day social experiment they ran, doesn't show such overnight growth/collapse despite the surprise, viral nature of the thing.
posted by Rhaomi at 5:37 PM on February 27, 2017 [2 favorites]
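The "vertical spike" pattern described in the comment above is easy to operationalize: flag any day on which a phrase's frequency jumps or collapses by a large multiple overnight, something gradual news-driven curves rarely do. A toy sketch with synthetic data (the 3x threshold and the series are arbitrary assumptions for illustration):

```python
def overnight_jumps(series, factor=3.0):
    """Return indices of days where a daily frequency series jumps or
    collapses by more than `factor`x relative to the previous day."""
    flagged = []
    for i in range(1, len(series)):
        # Clamp to a tiny positive value so the ratio is always defined.
        prev = max(series[i - 1], 1e-9)
        cur = max(series[i], 1e-9)
        if max(cur / prev, prev / cur) >= factor:
            flagged.append(i)
    return flagged

# A gradual, news-driven curve vs. a bot-like step function.
organic = [1, 2, 4, 7, 9, 8, 6, 4, 2, 1]
botlike = [1, 1, 1, 10, 10, 10, 1, 1, 1, 1]
```

On these series only the step function trips the detector; the organic ramp never changes by more than 2x day over day. Real data would need smoothing and a baseline-noise estimate, but this is the qualitative distinction being pointed at.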


I don't know a Zipf distribution from a Zapf dingbat, but is it not weird to see these massive spikes/collapses *and* more natural movement in the same chart?

Not especially. I'm willing to bet that "La La Land" is spiking pretty hard this month. "Midnight" and "button" would have the same problems of words that are not terribly distinctive. And we're talking about tokens that jump from 1/3000 to 1/2000. There was a pretty aggressive marketing campaign around "I'm With Her" in October, including social media. Just as one point of data, I saw the campaign pass out hundreds of professionally produced stickers with that phrase at Savannah Pride in mid-October of last year. That appears to drop off on election week, just as you'd expect a campaign slogan to do.

So yeah, a buzzphrase spiking just at the moment it's hitting peak news and marketing saturation isn't necessarily a sign of automated activity.
posted by CBrachyrhynchos at 6:16 PM on February 27, 2017


Just for comparison, Rogue One, Captain America, Deadpool, Ghostbusters, and Suicide Squad got big spikes on release week, which is about what you expect from big entertainment media.
posted by CBrachyrhynchos at 6:45 PM on February 27, 2017


So do the post-release Overwatch characters, Ana and Sombra.
posted by CBrachyrhynchos at 6:46 PM on February 27, 2017


I look at Reddit less and less, because everything comes back to some product, or some sneaky, manipulative marketing. There's occasionally interesting stuff, like when there was a guy riding a tractor with GPS who did an AMA because he was bored (apparently the tractor was steering itself). It does still have the potential to reach interesting people all over the world, with a diversity that a site like this just can't have -- I mean, how many threads are there here with sewer maintenance workers cracking inside jokes with each other, or with truckers complaining about new drivers?

And that's what really sucks about Reddit now, that all the truly colorful, diverse, actually human stuff, has just been covered up by algorithms and posting strategies that manipulate the conversations towards what marketers want them to be. And it's subtle -- there was a post showing "my teacher's funny coffee cup" with a comment asking "where can I get one?" and a helpful Amazon link. And I just got that same mug for my birthday, and it feels like everything around me is being shaped by gentle nudges from marketing departments.

I read a lot of old newspapers for research, and occasionally I'll see an article and be like "wait, this is an ad for Liebig's Meat Extract," or something. So I don't want to make it sound like marketing hasn't always been kind of shitty and manipulative. But at least the ads were always in the same place. Now I'm just becoming somehow even more paranoid and withdrawn than I was before, because it's horrible when you realize there's no way to know how many times you've been played by a bot or a paid account that wanted you to buy something, when you thought you were actually reaching out to someone. It sucks.
posted by shapes that haunt the dusk at 6:50 PM on February 27, 2017 [6 favorites]


I read that as "Reddit is being mansplained by big financial services companies", and I thought "How is that even possible?".
Actually....
posted by fullerine at 6:52 PM on February 27, 2017 [4 favorites]


Nothing can out-mansplain Reddit
posted by Stonkle at 7:11 PM on February 27, 2017 [1 favorite]


It is not easy to stop shills. If you allow posts, account creation costs next to nothing, and you have a reasonable amount of traffic, then anything you can think of will work for only a brief moment before your adversaries route around it, and by "route around" I mean go to insane levels of subterfuge.
posted by zippy at 7:20 PM on February 27, 2017


So true. It killed the Internet for me - I'm only recently participating as 'myself' again, after years of personal silence and professional gamesmanship. I'd find real communities and lurk like a voyeur, just marveling at people's ability to have genuine conversations inside of a marketing and surveillance machine.

and

It is not easy to stop shills. If you allow posts, account creation costs next to nothing, and you have a reasonable amount of traffic, then anything you can think of will work for only a brief moment before your adversaries route around it, and by "route around" I mean go to insane levels of subterfuge.

With no way of knowing the people behind the mask, it is easy for people to game the system -- shilling for certain interests, banding together to shame, drown, attack, and censor dissenting voices. A lot of trolls smearing and bullying people have a vested interest in quashing alternative ideas, and I am aware of it...

And I post under my own name for that reason. It's a risk for me, but I have no filter (*sobs*) and express myself out in the open, even at the risk of getting abused -- and make no mistake, I do get abused -- not by people who are intolerant of the notion that different people experience things differently because they have different life requirements, but by people who have more dubious and calculated motives for doing so. I know people have handles and I don't blame them for using them, but I just feel more comfortable posting under my own name.
posted by Alexandra Kitty at 8:19 PM on February 27, 2017


With no way of knowing the people behind the mask, it is easy for people to game the system

There's one weird trick for protecting against this. If you're interested, I could connect you with the folks who discovered it.
posted by dersins at 9:01 PM on February 27, 2017 [1 favorite]


Clearly the solution to the wider problem of ads, ad-blocking, and ad-blocker-blocking is for (more) sites to fully monetize this practice for themselves.

What we have here is a model for advertiser-supported content: original articles subsidized by sponsored discussion threads that are guaranteed to be interesting and engaging (or infuriating) enough to prompt people to read and participate in them and be subtly (or not) swayed on topics related (and unrelated) to the content. Algorithms will personalize the tone (and grade-level) of the commentary for specific readers based on a profile constructed from their internet usage tracked across all the sites they visit, and monitor for and prevent unauthorized derailments.

It seems sadly almost inevitable when you think about it.
posted by Pryde at 9:05 PM on February 27, 2017 [2 favorites]


Reddit is just people. Dumb people, mean people, interesting people, people trying to make a buck, scammers, pickpockets, rogues and thieves.

It always amazes me that people expect the internet to be different than real life. It's always been worse because it removes the motivators to stay nice that we have in real life: shame and fear of retribution.
posted by fshgrl at 9:36 PM on February 27, 2017 [3 favorites]


@You Can't Tip a Buick: Wikipedia is not unmoderated and I really don’t see how it’s comparable to social media in this sense.
posted by koavf at 10:10 PM on February 27, 2017


klangklangston, the Forbes piece is a video, which is also available here. Forgot to note that in the original posting. Apologies. The video has captions available. Yay!
posted by Bella Donna at 10:40 PM on February 27, 2017 [1 favorite]


Oops. The captions are kind of crappy. Better than nothing but definitely some wrong info.
posted by Bella Donna at 10:41 PM on February 27, 2017


What's funny is that it makes me feel like I'm a paid shill on the three occasions a year that I want to earnestly recommend a product or website. Despite making thousands of random comments on other posts, I still feel like, "how do I know that I'm not getting paid to recommend this??"
posted by salvia at 10:44 PM on February 27, 2017 [2 favorites]


I still feel like, "how do I know that I'm not getting paid to recommend this??

Odds are, you have been manipulated by marketers, to some degree, into recommending it. You're just not getting paid for your services. One could almost argue that the smarter people are the ones accepting cash in return for posting their manipulated opinions. That's a depressing thought to start the day on!
posted by mantecol at 3:38 AM on February 28, 2017 [1 favorite]


Koavf: Wikipedia is a great source of backlinks, for one, but it's arguably better for content marketing if you can get your material in there. Leads who are researching a problem, product/service, or company will often believe what they read on Wikipedia more readily than what they pick up in other channels or directly from a company itself, so it's a good place to set bait.
posted by BS Artisan at 6:19 AM on February 28, 2017


There's one weird trick for protecting against this.

Yes, not posting at all, or at least remembering that some people have mule accounts and too much free time on their hands. I don't need to know any more than that.
posted by Alexandra Kitty at 8:01 AM on February 28, 2017


Unsurprisingly, this is also why they're totally cool with becoming the world's largest white supremacist site.

Yup, reddit only cares about raw user engagement numbers and pageviews.

They're going to dig themselves into a 4chan-like hole with this, where even the sketchy offshore dildo companies don't want to advertise there for fear of tarnishing their brand by association.

Hands-off "frozen peach" moderation will destroy your site AND your brand; I don't understand why so many companies refuse to understand or acknowledge this. When you pander to the lowest common denominator pepe-spamming nerdonazi, you not only drive away everyone else but become inextricably intertwined with that trash.
posted by emptythought at 2:05 PM on March 2, 2017




This thread has been archived and is closed to new comments