Wikipedia and state censorship
August 6, 2014 8:18 AM

Under the new “right to be forgotten” law, Google is forced to remove search results for certain pages. Of the 328,000 links that Google has so far been coerced into removing, more than 50 were to Wikipedia (*nyt). The Wikimedia Foundation has created a dedicated page where they will be posting notices about attempts to remove links to Wikimedia. They include Gerry Hutch, Tom Carstairs in concert (image), with the rest being Italian and Dutch articles. Also: Wikipedia refuses to delete a photo as a 'monkey owns it'. A new front in the Wikipedia deletion wars has opened. Wikipedia swears to fight 'censorship', and Wales calls the law 'deeply immoral'. posted by stbalbach (83 comments total) 10 users marked this as a favorite
 
Has Barbra Streisand yet requested anything be taken down?
posted by Thing at 8:23 AM on August 6, 2014 [1 favorite]


"The result is an internet riddled with Orwell's 'memory holes' – cases where inconvenient information simply disappears."
We should stop referring to the ruling as the "right to be forgotten." This is a much better term.
posted by ChurchHatesTucker at 8:31 AM on August 6, 2014 [5 favorites]


MetaFilter enforces a "right to be forgotten" with the "Brand new day" policy and a strong mod-enforced norm against researching users' post and comment history. Is this different? How?
posted by the man of twists and turns at 8:34 AM on August 6, 2014 [1 favorite]


saying a monkey owns a copyright to something seems very disingenuous.
posted by garlic at 8:36 AM on August 6, 2014 [2 favorites]


MetaFilter enforces a "right to be forgotten" with the "Brand new day" policy and a strong mod-enforced norm against researching users' post and comment history. Is this different? How?

No government forced MetaFilter to implement a “Brand new day” policy over mathowie's stringent objections.
posted by Holy Zarquon's Singing Fish at 8:39 AM on August 6, 2014 [18 favorites]


MetaFilter enforces a "right to be forgotten" with the "Brand new day" policy and a strong mod-enforced norm against researching users' post and comment history. Is this different? How?

Plus I'm pretty sure if I do a search on metafilter for handles of people who were banned and then took advantage of the brand new day policy, I'd still find the old comments/posts that got them banned in the first place.
posted by nushustu at 8:45 AM on August 6, 2014 [1 favorite]


Wales is the last person who should be throwing rocks here, considering his attempts to rewrite the history of Wikipedia to push the other founders out of the picture, not to mention his comments on deletion of Wikipedia articles.

And I find it funny how people want to privilege information over people.
posted by NoxAeternum at 8:55 AM on August 6, 2014 [1 favorite]


saying a monkey owns a copyright to something seems very disingenuous.

The people at Commons who debate these things are usually pretty clued in. And they have support from WMF's legal counsel. I haven't seen the discussion, but I imagine it was informed, not just a stick-it-to-the-man, monkeys-rule decision.
posted by stbalbach at 8:55 AM on August 6, 2014 [1 favorite]


stbalbach: "Also: Wikipedia refuses to delete a photo as a 'monkey owns it'."

What implications does this have for those "elephant painting" or "why cats paint" animal paintings?
posted by boo_radley at 9:02 AM on August 6, 2014


The bizarre thing about the monkey picture is that by that logic, unless they have the permission of the monkey, they don't actually have the right to use it.
posted by Sequence at 9:07 AM on August 6, 2014 [7 favorites]


And I find it funny how people want to privilege information over people.

It's people all the way down.
posted by Kadin2048 at 9:10 AM on August 6, 2014


Of course, if the monkey owns the copyright, then there is no way they could possibly have permission to post the copyrighted photo. If the photographer knew how to find the monkey again, perhaps he could leave out some unsigned takedown notices and hope the monkey "marked" one...
posted by surlyben at 9:12 AM on August 6, 2014 [1 favorite]


the man of twists and turns: MetaFilter enforces a "right to be forgotten" with the "Brand new day" policy and a strong mod-enforced norm against researching users' post and comment history. Is this different? How?

First, isn't this better discussed in MetaTalk?

Second, "Brand New Day" has a pretty limited application. It permits users to start fresh on MetaFilter, but other users can still search for and link back to past comments and posts from the user's old persona.

The Memory Hole option that the EU has put into place is an attempt to make the past disappear by making it unsearchable, or actually removing the offending item from Wikipedia.
posted by filthy light thief at 9:15 AM on August 6, 2014


boo_radley: What implications does this have for those "elephant painting" or "why cats paint" animal paintings?

There is a LOT of animal-created art, but I don't see anything around about copyrights of those works. Maybe this case will set some interesting precedent.
posted by filthy light thief at 9:18 AM on August 6, 2014


Pretty sure I don't like the RTBF memory holes, but then I don't really have anything to hide.

As a sysadmin in the US, we do get requests to remove things. They range from incomprehensible ('why would you want to delete that?') to deranged ('this page shall serve as a warning to those who follow, on who you were and what you did'). That latter one might need some explaining. The local Linux User's Group had a gentleman request his posts to the mailing list be removed. Apparently he had made some kind of bomb threat during an argument about first amendment rights to a LUG two hours away, and after being banned from the listserv where the threat was made, he appealed his ban on our mailing list instead.

A choice excerpt (emphasis mine):
There is collateral damage unfortunately because the list is archived and other Linux communities may connect me with what is in those archives. There is no telling what opinion people will have after reading what's on there. I am already losing the ability to ask Linux related questions on a different Linux list. Because of the nature of the Internet, there is no limit to where the archive can be accessed from let alone the trouble it can cause me. If I am to be banned forever, the attacks on me and anything that is damaging to me should be purged. Archiving discrimination and fighting is problematic and should not be done.


My concern, then, is that RTBF would be abused by such people as a reputational airbag, encouraging more of the behavior we don't want by protecting people from the consequences of said behavior.
posted by pwnguin at 9:27 AM on August 6, 2014


What implications does this have for those "elephant painting" or "why cats paint" animal paintings?

The Wikicommons discussions are diffuse in various places and ongoing, but someone left this comment:
David Slater has on many occasions stated that he did not take the image. He is by his own admission not the creator of the image in question. The Macaca cannot own copyrights, hence its valid classification as a public domain image. The debate about the deletion of this image might come up again once Slater changes his story about the circumstances of the creation of this image.
Slater says he plans on bringing the case to court to determine the rightful owner; the dispute goes back to 2011, when a news agency began selling the image.

In terms of animal-created art, Wikicommons considers it public domain. Other self-portraits of animals on Commons.
posted by stbalbach at 9:34 AM on August 6, 2014 [2 favorites]


Pretty sure I don't like the RTBF memory holes, but then I don't really have anything to hide.

This really sums up the core flaw in the anti-RTBF arguments. The whole point of the "right to be forgotten" is that stigmatization makes reintegration difficult, if not impossible, and that constantly dredging up the past helps to reinforce stigmatization. A lot of the opposition seems to be centered on the worry that someone might "get away" with something because of this, which I find to be telling.
posted by NoxAeternum at 9:45 AM on August 6, 2014 [4 favorites]


constantly dredging up the past helps to reinforce stigmatization.

Memory Hole, meet Streisand Effect. I'm sure you'll have a lot to talk about.
posted by ChurchHatesTucker at 9:50 AM on August 6, 2014 [1 favorite]


the man of twists and turns: "MetaFilter enforces a "right to be forgotten" with the "Brand new day" policy and a strong mod-enforced norm against researching users' post and comment history. Is this different? How?"

It's just a new account. The norm against dirt-digging is a policy against being a jerk, not a policy against accessing information. So, in response to your questions:

1) Uhh, yes, very.
2) Because no content from the user's former identity is removed from Metafilter when they go Brand New Day.
posted by desuetude at 9:57 AM on August 6, 2014 [1 favorite]


Actually, it is a policy against accessing information, because people generally don't just go accessing information in a targeted manner on a lark.
posted by NoxAeternum at 10:01 AM on August 6, 2014


Ah yes, the Streisand Effect - how the Internet punishes people who do something it doesn't approve of. I really tend to see it more and more as a sort of mob justice. /
posted by NoxAeternum at 10:05 AM on August 6, 2014 [2 favorites]


This comes up so often in such discussions, but: Metafilter is not a state actor. It is not required to uphold the same standards as state actors. Metafilter can limit speech in any way it wants. There is a difference between saying "a state actor should not be able to enact laws that say people can't call each other assholes" and a private website deciding to set a community standard that it's not okay to call each other assholes.

The people who really need protecting here aren't the giant celebrities who're subject to the Streisand Effect, though. I mean--this is an area where I am incredibly conflicted and I still haven't totally figured out how I feel about it. But who they're trying to protect are ordinary people, and we're getting to the point with news archiving that, without something like this, there's no reasonable expectation that the world will ever forget what you did. Simultaneously, it has become commonplace for people to Google prospective hires. If you can't have an expectation of getting legitimate employment after leaving incarceration--well, it is in the best interest of everybody to make sure that people who've served their time can get jobs. This isn't just about making people feel better about their past errors or helping them escape the judgment of random strangers on the internet; it's about stuff like whether a casual Google search by a hiring manager is going to lose you a job on the basis of something you did twenty years ago, because you haven't been generating enough indexable content with your name in it in the meantime.
posted by Sequence at 10:10 AM on August 6, 2014 [2 favorites]


Huh? No, it really isn't a policy against accessing information. You can do all the digging you want in someone's posting history. There is nothing stopping you. What is typically considered rude, to the point of being an enforced social norm, is discussing it when it's not relevant, and particularly quoting someone's history in order to create a "gotcha!" situation, along with other general asshat behaviors. But the information is all there if you want to pull it up.

The analogy to the EU directive would be if some posts were somehow invisible in a user's history, although still present in the discussions. (I think, actually, I've seen other forums do weird stuff like that, or just prohibit/disable the view-posting-history tool.)

The real-world analogy to MetaFilter's policy, I think, would be if you didn't try to somehow memory-hole things on the Internet, but just prohibited (say) discrimination on the basis of anything that anyone did online before they were 18, or before they legally changed their name, or some other clear line-in-the-sand that they could draw. That might be difficult to enforce in practice, although not really any more than other anti-discrimination rules, which doesn't mean it's a bad idea.

Rather than trying to make things disappear from the Internet, which is never going to be successful anyway, the better tactic would seem to be to focus on what sort of shitty things people are actually going to do to each other with the information that's (always going to be) available. E.g., if we'd prefer people's teenage mirrorshots and college selfies to not be grounds for employment discrimination later, then we should legislate that, rather than trying in vain to make the pictures go away while still leaving any that slip through fair game for hiring decisions.
posted by Kadin2048 at 10:17 AM on August 6, 2014 [1 favorite]
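
A toy illustration of the distinction Kadin2048 draws above (purely hypothetical Python, not modeled on any real forum's code): a post flagged as hidden from the per-user history tool still appears in the thread it was posted to, which is a very different thing from deleting the content itself.

    from dataclasses import dataclass, field

    @dataclass
    class Post:
        author: str
        thread: str
        text: str
        hidden_from_history: bool = False  # hypothetical "hide from history" flag

    @dataclass
    class Forum:
        posts: list = field(default_factory=list)

        def thread_view(self, thread):
            # Threads always show every post, flagged or not.
            return [p for p in self.posts if p.thread == thread]

        def history_view(self, author):
            # The per-user history tool skips flagged posts.
            return [p for p in self.posts
                    if p.author == author and not p.hidden_from_history]

    forum = Forum([
        Post("alice", "cats", "I love cats"),
        Post("alice", "flameout", "regrettable rant", hidden_from_history=True),
    ])

    print(len(forum.thread_view("flameout")))  # 1 -- the rant is still in its thread
    print(len(forum.history_view("alice")))    # 1 -- only the cats post shows up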


This isn't just about making people feel better about their past errors or helping them escape the judgment of random strangers on the internet; it's about stuff like whether a casual Google search by a hiring manager is going to lose you a job on the basis of something you did twenty years ago, because you haven't been generating enough indexable content with your name in it in the meantime.

In this case, shouldn't the onus be on the hiring manager to disregard that information? We rightfully forbid employers from making hiring decisions based on, e.g., race or gender, so why not include criminal history, or indeed any history unrelated to employment?

Or am I just being a hopelessly naïve pollyanna here?
posted by Faint of Butt at 10:21 AM on August 6, 2014 [3 favorites]


Except that the information isn't being made to disappear from the Internet, unless you consider search engines to be infrastructure. And if you do, then there are a whole lot of other discussions we need to be having.
posted by NoxAeternum at 10:23 AM on August 6, 2014


We rightfully forbid employers from making hiring decisions based on, e.g., race or gender, so why not include criminal history, or indeed any history unrelated to employment?

And that's the problem. We forbid it, and it happens all the time. Now, with things like gender and even race, at least for a larger employer, if it's a systemic problem then it's got some chance of coming up often enough that you catch it. But the mere fact that it's patently obvious that racial and gender biases continue to exist in hiring should suggest that just telling people not to do something doesn't stop them from doing it. Erecting some barrier against them doing it does. I'm, again, not saying I'm really in favor of this particular tactic; it leaves a bad taste in my mouth, but I can see why they think it's necessary.
posted by Sequence at 10:28 AM on August 6, 2014


I wouldn't say hopelessly. But scientia est potentiam, so it's really hard to ask employers to just ignore that information, especially when it's very difficult for someone negatively impacted to actually get a remedy. This also ignores that employment is just one of the many problems that this causes.
posted by NoxAeternum at 10:29 AM on August 6, 2014


the man of twists and turns: "MetaFilter enforces a "right to be forgotten" with the "Brand new day" policy and a strong mod-enforced norm against researching users' post and comment history. Is this different? How?"

Possibly because Metafilter tells us our comments here remain our own (see small print notice in lower right hand corner of the page), whereas the Google ruling forces people to remove content that the person or entity requesting removal does not own.

tl;dr: MeFi allows us to delete our own mistakes. Google is forcing us to delete our own work to hide other people's mistakes.
posted by caution live frogs at 10:30 AM on August 6, 2014 [1 favorite]


A lot of the opposition seems to be centered on the worry that someone might "get away" with something because of this, which I find to be telling.

Spoken like someone who's never been on the receiving end of abuse and infuriated that the abuser keeps getting away with it.
posted by straight at 10:32 AM on August 6, 2014


OK, OK, for the greater good, I will out myself.

I am that monkey. I explicitly gave Wikipedia permission to publish it.
posted by Flunkie at 10:32 AM on August 6, 2014 [4 favorites]


Why is Wikimedia publishing images of the takedown emails? Why not post their contents, so that the link to the page being suppressed is clickable?
posted by Flunkie at 10:42 AM on August 6, 2014


Spoken like someone who's never been on the receiving end of abuse and infuriated that the abuser keeps getting away with it.

No, it's spoken like someone who realizes that cutting off your nose to spite your face is counterproductive.
posted by NoxAeternum at 10:43 AM on August 6, 2014


tl;dr: MeFi allows us to delete our own mistakes. Google is forcing us to delete our own work to hide other people's mistakes.

Not really. The only thing Google does is remove search results for specific queries; e.g. "Martin Wisse is a big meanie head." The original source of this search result is still there.

Note btw that this so-called right to be forgotten is the result of years of civil lawsuits and jurisprudence, drawing on some very basic human rights principles and taking into account Google's unique position and power, as well as their refusal to cooperate before.

Now on my own blogs I've gotten two requests to take down old posts for some reason or other, from people who'd been in the news a decade or so ago for something bad, possibly by mistake, and who asked to have those posts taken down. I complied in both cases, but is that giving in to censorship?
posted by MartinWisse at 10:45 AM on August 6, 2014 [5 favorites]
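
For anyone wondering what "remove search results for specific queries" means mechanically, here is a minimal sketch (purely hypothetical Python, not Google's implementation; every name and URL is made up): a delisted page is filtered out only when the query contains the requester's name, while the page stays in the index and still turns up for other searches.

    # Toy query-scoped delisting: each request pairs a name with a URL to suppress.
    DELISTED = {
        ("john doe", "https://example.org/old-scandal"),
    }

    # A tiny stand-in for a search index: URL -> page title.
    INDEX = {
        "https://example.org/old-scandal": "John Doe fined in 1998 court case",
        "https://example.org/gardening": "John Doe's prize-winning roses",
    }

    def search(query):
        """Return (url, title) pairs matching the query, minus delisted ones."""
        q = query.lower()
        hits = [(url, title) for url, title in INDEX.items()
                if all(word in title.lower() for word in q.split())]
        # Suppress a hit only when the query itself names the delisted person.
        return [(url, title) for url, title in hits
                if not any(name in q and url == bad_url
                           for name, bad_url in DELISTED)]

    print(search("john doe"))         # the scandal page is filtered out
    print(search("1998 court case"))  # the same page still appears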


From the Wikipedia article (ironically, I guess) on the right to be forgotten, the EU law states "...individuals to whom the data appertains are granted the right to ‘obtain from the controller the erasure of personal data relating to them and the abstention from further dissemination of such data..." which to me sounds like deletion requests. In the case of Google, they delink rather than delete, as they don't own the content but they do own the index. But by my reading, the law allows deletion requests for the content itself. So yes, technically I should not have explicitly stated Google in the last sentence. But the end result is the same - actually, are you in the EU? I should send a request. Your content must go so that my mistakes can be hidden.
posted by caution live frogs at 10:52 AM on August 6, 2014


So, here's a couple of questions for the people opposed to the "right to be forgotten":

Do you consider the near-universal policy of withholding the name of someone filing a rape charge to be censorship?

Regardless of your prior answer, do you consider this to be a good policy?

The reason I ask is because ultimately, the two policies stem from the same balancing act - weighing the public's right to know against the individual's ability to function in society. You might disagree about where the balance point is, but you cannot dismiss that a balance is what is being evaluated.
posted by NoxAeternum at 10:54 AM on August 6, 2014 [1 favorite]


Note btw that this socalled right to be forgotten is the result of years of civil lawsuits and jurisprudence, drawing on some very basic human right principles and taking into account Google's unique position and power as well as their refusal to cooperate before.

Last year the advocate-general of the European Court of Justice, Niilo Jaaskinen, stated clearly that there is no such right under European data and privacy laws and that the courts cannot require Google to remove links under such a claim.
posted by ChurchHatesTucker at 10:57 AM on August 6, 2014


The reason I ask is because ultimately, the two policies stem from the same balancing act - weighing the public's right to know against the individual's ability to function in society.

I don't really see how that's analogous. You're comparing a police organization choosing which information to disclose vs. a court attempting to make previously legally published information unfindable.
posted by zixyer at 11:06 AM on August 6, 2014 [1 favorite]


Last year the advocate-general of the European Court of Justice, Niilo Jaaskinen, stated clearly that there is no such right under European data and privacy laws and that the courts cannot require Google to remove links under such a claim.

So, was that a binding ruling from the court, or was it just the advocate-general doing the legal version of "this is what I think"?
posted by NoxAeternum at 11:18 AM on August 6, 2014


The reason I ask is because ultimately, the two policies stem from the same balancing act - weighing the public's right to know against the individual's ability to function in society. You might disagree about where the balance point is, but you cannot dismiss that a balance is what is being evaluated.

The balance was already decided when the information was published. The ruling doesn't say the public no longer has a right to know. The information is still public. The ruling (as I understand it) simply says that the information must be harder to discover. That's an incredibly bad principle: "you have the right to this information, but only if you really really want it."
posted by Thing at 11:26 AM on August 6, 2014 [1 favorite]


So, was that a binding ruling from the court, or was it just the advocate-general doing the legal version of "this is what I think"?

The court ruled the other way, or we wouldn't have this post. The AG's opinion shows it wasn't at all an obvious conclusion based on EU law.

There is a LOT of animal-created art, but I don't see anything around about copyrights of those works.

They're not persons under the law.
posted by ChurchHatesTucker at 11:27 AM on August 6, 2014


Ah yes, the Streisand Effect - how the Internet punishes people who do something it doesn't approve of. I really tend to see it more and more as a sort of mob justice. /

I don't think that the Streisand Effect is supposed to describe the ethical course of action in a set of circumstances. It's a descriptive term, not a normative one: it describes what is apt to happen when someone tries to suppress information and the attempt at suppression itself leads to increased dissemination of the information targeted. While the concept can certainly be problematic, and there is a certain amount of schadenfreude involved, I think it's more akin to victim blaming than any kind of mob justice.
posted by delegeferenda at 11:40 AM on August 6, 2014


I think it's more akin to victim blaming than any kind of mob justice.

The people who fall afoul of the Streisand Effect are rarely the victims. You have to be active for anyone to notice.
posted by ChurchHatesTucker at 11:54 AM on August 6, 2014


The court ruled the other way, or we wouldn't have this post. The AG's opinion shows it wasn't at all an obvious conclusion based on EU law.

Honestly, it comes across as a tad Halbiggian - he puts a lot of stock in "well, it doesn't say anything explicitly..."
posted by NoxAeternum at 11:55 AM on August 6, 2014


The people who fall afoul of the Streisand Effect are rarely the victims. You have to be active for anyone to notice.

The "rarely" part of your argument is precisely why it's problematic to treat the Streisand Effect as something prescriptive rather than as a value-neutral description of a phenomenon.
posted by delegeferenda at 12:10 PM on August 6, 2014


I never said it was prescriptive. Unless by that you mean "predictable."
posted by ChurchHatesTucker at 12:17 PM on August 6, 2014


If the 'deleted links' are still searchable by DuckDuckGo and other search engines, I see this as nothing more than "the right to be forgotten by lazy researchers". Unless you accept the argument that Google is more powerful than any national government. Which, maybe they are.
posted by oneswellfoop at 12:33 PM on August 6, 2014


If the 'deleted links' are still searchable by DuckDuckGo and other search engines, I see this as nothing more than "the right to be forgotten by lazy researchers".

But it could also be construed as, "government frustrates your rights." Why make a right to know something harder than it needs to be? What other rights should the government put people off from exercising?
posted by Thing at 12:53 PM on August 6, 2014 [1 favorite]


Why make a right to know something harder than it needs to be?

Because, as the saying goes, your right to swing your fist ends at my nose. This is another flaw in the anti-RTBF argument: it concerns itself solely with the impact on the searcher, and any others involved are dismissed.
posted by NoxAeternum at 1:04 PM on August 6, 2014


Ah, inconvenient truths.
posted by ChurchHatesTucker at 1:06 PM on August 6, 2014


I am conflicted about the law, but I'm certain that there should be a cultural shift. Police departments post mugshots of people who have merely been arrested. Enterprising companies suck up the names and photos and, with some Google juice, prominently display the mugshot to the world. They will remove your listing for an exorbitant fee.

I've seen some people call out these assholes but I haven't seen anyone indict the prevalent attitude of "people who've been arrested deserve what they get". I don't know if you have a right to be forgotten but you ought to have a right not to be exploited.
posted by Monochrome at 1:47 PM on August 6, 2014


Your house catches fire and burns to the ground. This embarrasses you so you ask people going by to not look at the smoldering remains.

Right.
posted by tommasz at 1:56 PM on August 6, 2014


So, should that house fire hang over your head forever?
posted by NoxAeternum at 2:14 PM on August 6, 2014


Ah yes, the Streisand Effect - how the Internet punishes people who do something it doesn't approve of. I really tend to see it more and more as a sort of mob justice.

Of course it is. Just like when sexist or racist or otherwise 'objectionable' (gay, female) people get their lives messed up for doing, saying or being things the mob doesn't like.

The Internet is the greatest facilitator of mob justice in human history. Though, at least, torches or pitchforks aren't generally involved.

More topically, here's a blog entry from New Zealand's Privacy Commissioner on the RTBF, and here's his speech on the same topic (disclosure: I wrote the speech).
posted by Sebmojo at 2:23 PM on August 6, 2014


And not just over your head, but the heads of your children, and their children, and so on. And before you say that's a crazy example, let me point to the fiasco with Google Maps when they created an overlay of a shogunate-era map of Tokyo - and inadvertently empowered people who were looking for "untouchable" heritage in prospects for employment, marriage, etc. I remember there were quite a few people arguing that Google should have kept the overlay available, no matter how it was affecting the real lives of people.
posted by NoxAeternum at 2:27 PM on August 6, 2014


Of course it is. Just like when sexist or racist or otherwise 'objectionable' (gay, female) people get their lives messed up for doing, saying or being things the mob doesn't like.


Conflation is not an argument. It's a tactic.

I'm starting to see how the NZ government was so easily convinced that Kim Dotcom was a huge threat that required extra-judicial handling.
posted by ChurchHatesTucker at 2:31 PM on August 6, 2014


The reasoning people are using around this issue is still stuck in the realm of ideals, where idiots spend their time thinking about things at such a distance they can't possibly expect to really accomplish anything useful.

IMO, in practice, it's not going to be possible for people to adopt an "I don't care what people know about me" policy in general, because of the kinds of social abuse and criminal behavior that the widespread adoption of this attitude will enable. Me personally, I'm in the middle of a long-term art project that involves blabbing about all sorts of personal stuff on the internet, but that's a sacrifice I'm making voluntarily.

Privacy is often one of the most valuable tools victims of crime and abuse have at their disposal to protect themselves from abusers. People on all sides of these issues are going to find, as years go by, that regardless of what they believe is or should be ideal, reality is not going to work out as they would like or expect.

I suspect this is one of those cases where thinking and arguing about the problem in terms of abstract principles and ideals is counterproductive and will only lead to more confusion. Privacy's role in society is much more fundamental and practical than people who are still banging the free information drum have any interest in understanding. That said, I don't know about this particular rule. In the end, if it's the law, and it's a legitimate law with the support of the public, then it's a business requirement for the system, convenient or not.
posted by saulgoodman at 2:39 PM on August 6, 2014 [3 favorites]


Because, as the saying goes, your right to swing your fist ends at my nose. This is another flaw in the anti-RTBF argument: it concerns itself solely with the impact on the searcher, and any others involved are dismissed.

The impact on the searcher is important because the Right to be Forgotten neither extinguishes the right to know nor offers any definite protection for the individual whose information is public. You can destroy somebody's life with old information and still abide scrupulously by the Right to be Forgotten. When this ruling offers nothing to one party, then the harm to the other is always greater. If you want to reestablish privacy over certain information, then do that, but don't pretend that the Right to be Forgotten is anything more than a terrible hash.
posted by Thing at 2:44 PM on August 6, 2014


Conflation is not an argument. It's a tactic.

What do you mean? An argument is a tactic for demonstrating the higher quality of your ideas, so I'm not seeing your distinction.
posted by Sebmojo at 2:49 PM on August 6, 2014


What do you mean? An argument is a tactic for demonstrating the higher quality of your ideas, so I'm not seeing your distinction.


Conflating unlike things, like outrage over censoring truthful information vs. mob attacks on unpopular people, is not 'demonstrating the higher quality of your argument.' Rather the opposite.
posted by ChurchHatesTucker at 2:55 PM on August 6, 2014


The impact on the searcher is important because the Right to be Forgotten neither extinguishes the right to know nor offers any definite protection for the individual whose information is public. You can destroy somebody's life with old information and still abide scrupulously by the Right to be Forgotten. When this ruling offers nothing to one party, then the harm to the other is always greater. If you want to reestablish privacy over certain information, then do that, but don't pretend that the Right to be Forgotten is anything more than a terrible hash.

Ah, the good old "all or nothing" fallacy. Either perfect protection is afforded or there is no protection at all.
posted by NoxAeternum at 2:56 PM on August 6, 2014


Creating a false sense of security can often be worse than not having one at all.
posted by Kadin2048 at 2:57 PM on August 6, 2014


A group of people is a group of people, dude, they're not categorically different because they believe different things.
posted by Sebmojo at 2:58 PM on August 6, 2014


At the same time, we should not be letting the perfect become the enemy of the good.
posted by NoxAeternum at 2:58 PM on August 6, 2014


Specifically: mob justice is a social phenomenon that can be exercised in both good causes and bad; this seems inarguable.
posted by Sebmojo at 3:02 PM on August 6, 2014


A group of people is a group of people, dude, they're not categorically different because they believe different things.

wat

You just identified the category in which they are different.

Otherwise you could just group "Ukrainian separatists" and "teenage babysitters" to justify stopping payments to those little terrorists.
posted by ChurchHatesTucker at 3:09 PM on August 6, 2014


Ah, the good old "all or nothing" fallacy. Either perfect protection is afforded or there is no protection at all.

You seem happy to erode the right to freedom of speech for an undefined and arguable benefit.
posted by Thing at 3:13 PM on August 6, 2014


You seem happy to erode the right to freedom of speech for an undefined and arguable benefit.

And you seem happy to inflict real pain for an ideal that we don't acknowledge except in the breach. I find it amusing that the standard bearer for this is a man who has been pro-censorship, especially when it comes to himself.
posted by NoxAeternum at 3:22 PM on August 6, 2014


Can you give me a real life example of "real pain" which the Right to be Forgotten will stop?
posted by Thing at 3:30 PM on August 6, 2014


> They're not persons under the law.

They should incorporate.
posted by jfuller at 6:54 PM on August 6, 2014


They should incorporate.

Well, there's an arguable Catch-22...
posted by ChurchHatesTucker at 7:00 PM on August 6, 2014


scientia est potentiam

Science is potatoes?
posted by Purposeful Grimace at 7:18 PM on August 6, 2014


Does Latin even have a word for potatoes? What with them being a New World crop and all . . .
posted by Carillon at 8:02 PM on August 6, 2014


It's hard to believe that any government power to suppress or obscure information is going to be used primarily to protect the weak and innocent.
posted by straight at 2:42 AM on August 7, 2014 [1 favorite]


It's hard to believe that any government power to suppress or obscure information is going to be used primarily to protect the weak and innocent.

The fact that a debt cannot be held against you by a reporting agency after 10 years is due to government regulations. So is HIPAA, which protects your medical records. Not to mention the practice of sealing juvenile records to allow for a fresh start. I find the attitude that the government does not look out for the weak to be a rather corrosive one.

Furthermore, it's important to note that the organizations opposed to the right to be forgotten aren't disinterested observers. Google is built on data analysis, and these rulings act as restraints on how they can operate. And as I've pointed out through this thread, Wales has a checkered history when it comes to censorship - not only has he engaged in whitewashing his own past aggressively, he's also been supportive of censorship of Wikipedia, stating that deletion is just as important as creation.
posted by NoxAeternum at 9:38 AM on August 7, 2014 [1 favorite]


Does Latin even have a word for potatoes? What with them being a New World crop and all . . .

The Vatican updates the language regularly.
posted by ChurchHatesTucker at 9:47 AM on August 7, 2014


Nox, those examples all work because they apply across the board to everyone automatically. It's this ad hoc suppression of information at the request of individuals that seems more likely to be abused to help powerful people hide misdeeds that ought to be public.
posted by straight at 9:55 AM on August 7, 2014


But if your concern is for the suppression of information, why wouldn't they be applicable?
posted by NoxAeternum at 10:53 AM on August 7, 2014


Ha! How awesome. It's my son's birthday and some fuckwit just wiped out our bank account and changed all our security verification info. Way to stick it to the little guy. How timely.
posted by saulgoodman at 11:14 AM on August 7, 2014


The fact that a debt cannot be held against you by a reporting agency after 10 years is due to government regulations.

That's the ability to act on information.

So is HIPAA, which protects your medical records. Not to mention the practice of sealing juvenile records to allow for a fresh start.

That's information that should never be distributed in the first place.

It's telling that they're not going after the sources of the information (yet). Rather, they're just trying to obfuscate it. The fact that these complaints generate new stories would be humorous if it weren't so serious.
posted by ChurchHatesTucker at 11:23 AM on August 7, 2014


Actually, it's both, because the regulations say that the reporting agencies have to expunge records of old debt - the fact that the debt was incurred doesn't vanish, but it is no longer allowed to be disclosed. As for the sealing of records, remember that court rulings are public records, so sealing a record removes it from the public. You have somewhat of a point with HIPAA in theory, but the reality was that prior to HIPAA, medical records were much more leaky.

I find it interesting that in our discussion on the NSA, it's regularly brought up that the protection afforded by complexity has been eroded through technology, and thus actual regulation is needed - yet people seem reluctant to perform that analysis on the private sector.
posted by NoxAeternum at 12:12 PM on August 7, 2014


(Phew! False alarm. Our bank's just incompetent.)

I don't know. Europeans are still generally a lot more protective of their privacy rights than many Americans (Londoners possibly excepted, living in the Panopticon and all). Maybe Google just has to suck it up and comply with the law because Europeans do have these expectations. You go to market with the consumers you've got, not the ones you wish you had, don't you?
posted by saulgoodman at 3:48 PM on August 7, 2014


I think that's very true, especially in Germany (I realize that I'm generalizing a lot).

This law seems like an overreach to me as an American because I think we tend to value free speech a lot more, and are willing to live with more consequences before giving the government the power to censor things.

I really think it's just a difference in values. But it still really isn't fair to put all the onus on the search engines to vet the RTBF requests for validity. The requests should be going to the courts before they reach Google or Microsoft or Yahoo.
posted by zixyer at 4:30 PM on August 7, 2014


> Well, there's an arguable Catch-22...

Form a corporation to own your elephant as an asset, then sign ownership of the corporation over to the elephant. I don't see any problem with that except infinite recursion (corporation owns elephant, elephant owns corporation, etc.) I honestly don't think a black hole or rift in spacetime will occur. Even if that strategy doesn't work there's still bound to be a way--and a country of incorporation where it's legal.


> The Vatican updates the language regularly.

The Vatican can update ecclesiastical Latin but not even God can update the language of Cicero.
posted by jfuller at 4:56 PM on August 7, 2014




This thread has been archived and is closed to new comments