Wikipedia’s porn filter DOA, and a proposal

Warning: this post has links to pages that are definitely not safe for work or school. I’ll warn you which ones those are with “NSFW.”

The post has two parts. The first is about the availability of porn on Wikipedia and Wikimedia Commons, which for most people reading this is probably old news; but they’ve reached some new lows, such as actual pornographic films. The second part contains what I think is real news: that the much-debated porn filter they were developing is no longer in development and looks likely to be dropped.

I

There are, as many people reading this know very well, stupendous amounts of explicit imagery on Commons as well as Wikipedia itself; simply search for any fetish, porn industry term, or body part, and you'll likely find it illustrated literally ad nauseam. Users, whether they like it or not, can be exposed to all sorts of inappropriate, explicit content when doing perfectly innocuous searches. This degree of smut is obviously inappropriate for an Internet resource touted as "educational" and promoted for classroom use.

Almost two years ago, I reported the Wikimedia Foundation to the FBI (as I was required to by law) on grounds that Wikimedia Commons was hosting two image categories, labeled “Pedophilia” and “Lolicon,” which featured depictions of child sexual abuse. I tracked the fallout in two posts. Recently, FoxNews.com followed up on their coverage, reporting that little had been done since then. The Fox News reporter did a good job, I think. But some more developments have come to light.

The pervy categories are still there, and include whole hierarchies of categories under the titles “Erotic images of children” (NSFW) and “Child sexuality” (NSFW). The garbage by Martin van Maele, who drew many illustrations of children being sexually abused in the early 20th century, is still there, aggressively and proudly defended by the powers-that-be on Wikimedia Commons as “historical” and “educational.” To give you an idea of the attitude of the pedophilia sympathizers on Commons, who clearly feel themselves to be put-upon and wronged, consider that there is a so-called “Hate for pedophiles” category which has existed, unmolested, since May 2010 (which, come to think of it, is the month when my FBI report made news). Consider also (as was recently pointed out to me) that the activists-for-free-porn on Commons have been awarding each other the new, outrageously gross, “Hot Sex Barnstar” (NSFW!) for their efforts. There are clearly some (to me) extremely unsavory characters involved who have made it their mission to make Commons, and Wikipedia as well, as free as possible to host the most explicit sorts of imagery on this tax-exempt, non-profit 501(c)(3) website.

Recently I received an email from someone who follows this issue. He called a few things to my attention. One item: a convicted child pornographer has apparently been prominently involved in curating adult pornography. It seems he is one of those who loves to use Commons to post pervy naked pictures of himself–discussion here. He is probably not the only one. Another item: Commons is now hosting an antique video (really, really NSFW) which I am told (I didn’t watch the whole thing) shows a dog fellating a woman (in a nun’s habit) and a man.

The Wikipedia community’s more prurient tendencies are, so far from being reined in and moderated, exercised more boldly than ever.

II

My correspondent also directed me to this extremely interesting discussion on the Wikimedia Foundation mailing list (Foundation-L). Read both pages–here is page 2. As I write this, discussion is ongoing.

This discussion has revealed two pieces of news so far.

First, the powers-that-be at the WMF have directed their programmers to stop working on their opt-in “controversial content” (including porn) filter. They have higher priorities, we are told.

This needs some background. The very least that Wikipedia could do, on this issue, is to let people turn on a filter so that they (or the children using their computers) would not be shown porn and other grossly inappropriate content. In fact, my understanding is that the porn would merely be “collapsed,” meaning that the user could still easily display it by “uncollapsing” the image. This, as sane people can agree, sounds both welcome and completely uncontroversial. This is what the WMF’s consultant recommended in 2010, and it was widely assumed, after a referendum indicated general support (if lukewarm), that it would be implemented soonish. After all, the tool would simply let people turn on a personal filter. (It wouldn’t be turned on automatically–users would have to turn it on in user settings.) And the filter would only hide “controversial” images, not completely block them. But, no. There’s no compromise on porn in Wikipedia-land, despite this being an “educational” project constitutionally committed to consensus and compromise. They want their commitment to free speech so loudly proclaimed that two full-color vulvas greet you at the top of the page (with a variety of others further down), should you have the temerity to look up the subject on Wikipedia. There has been such a groundswell of loud opposition to the opt-in filter idea that the project was never implemented.

This leads me to the second piece of news. It appears that two Wikimedia Foundation Board members, Kat Walsh and Phoebe Ayers, have both changed their positions. The Board was sharply divided on the need for this filter (which is just as amazing and silly as it sounds) last fall, but things have become even sillier since then. There is more community opposition, and so Ms. Walsh and Ms. Ayers no longer support it. They strongly imply that the earlier decision to build a filter is now a dead letter.

This says something very disappointing and even disturbing about the Wikimedia Foundation as an institution. It certainly looks as though they are in thrall to anarchist porn addicts whose scorn for the interests of children–the group of users that stands to gain the most from a high-quality free encyclopedia–is constrained only by the limits of the law, and maybe not even that.

Eighteen months ago, after speaking at length to both WMF Executive Director Sue Gardner and the consultant she hired, Robert Harris, I had the distinct impression that the WMF might be capable of prevailing on Wikipedia and Commons to the extent of, at least, installing a completely innocuous opt-in filter system. So color me disillusioned.

I don’t wish any grief or embarrassment upon Wikipedia’s more sensible managers, like those I’ve mentioned–Gardner, Ayers, and Walsh. They are clearly right that politically they’re in a “damned-if-you-do, damned-if-you-don’t” situation. But given the choice, I’d rather be damned for doing the bare minimum needed to meet the needs of children, or at least trying to do that, than be more richly damnable for not doing anything. I’d suck it up and remind myself that there are quite a few more important things than power and status. Since such a complete no-brainer as an opt-in filter is currently politically impossible, Gardner and other sane adults among the Wikimedia managers face a dilemma: maintain some degree of power in the organization while implicitly supporting what is only too clearly a deeply dysfunctional and irresponsible organization; or resign. If I were Gardner, or a member of the Board, I would seriously consider resigning and making a clear and pointed public statement.

Ultimately, the WMF, which has legal responsibility for the project–and which is supposed to be the grown-up, after all–has shown, through its inability to act on this issue, that it cannot avert a truly spectacular scandal or a really serious lawsuit. The potential trouble isn’t that the government might shut Wikipedia down, or slap it with judgments it can’t pay. Rather, the biggest potential trouble is a mass exodus of profoundly disillusioned contributors, which is surely the Achilles’ heel of a project with tens of millions of articles and files to manage.

If she really wanted to take serious leadership on this issue, what Gardner might do is spearhead a brand new project to use some of the many millions they’ve amassed and start a serious Wikipedia for young people, one that K-12 teachers can be proud to use with their students. It could be a curated version of Wikipedia. It would not only have obvious controls regarding age-appropriate content, it would also have reviewed versions of articles, Citizendium-style. They could brag that they have finally adopted “flagged revisions,” which the media has been repeatedly led to believe is “right around the corner.”

I do not think the WMF needs to ask the Wikipedia rank-and-file to vote on this. Or, if they do ask those people, they should ask their readers to vote on it as well. The WMF has to ask itself: who are we serving, the Wikipedia rank-and-file, which is dominated by anarchist porn addicts, or readers? Are they sensible enough to answer this question correctly?

As an added bonus, if a WMF-supported responsible version of Wikipedia existed and were growing healthily, then I would shut up about the X-rated version of Wikipedia. Maybe.


Please do dive in (politely). I want your reactions!

12 responses to “Wikipedia’s porn filter DOA, and a proposal”

  1. Andreas K.

    [Editor note: the following are NSFW links, which means, don’t click on them if you are underage or don’t want to see some pretty disgusting stuff.]

    I do not understand how on earth the Wikimedia Foundation board can defend there being no filter on any Wikimedia website – not even a voluntary one.

    For example, the dog sex video is the first thing that comes up on French Wikipedia for users entering “devoirs” (homework) or “vacances” (holiday) as their search term.

    http://fr.wikipedia.org/w/index.php?title=Sp%C3%A9cial%3ARecherche&profile=images&search=devoirs&fulltext=Search&searchengineselect=mediawiki

    Searching for images of a toothbrush on English Wikipedia brings up an image of a woman masturbating with a toothbrush (!!).

    http://en.wikipedia.org/w/index.php?title=Special%3ASearch&profile=images&search=toothbrush&fulltext=Search

    This has been known for half a year. Yet nothing is being done.

    http://tch516087.tch.www.quora.com/Why-is-the-second-image-returned-on-Wikimedia-Commons-when-one-searches-for-electric-toothbrush-an-image-of-a-female-masturbating

    The mind boggles. What are they thinking?

    1. Andreas K.

      I’ve asked the Wikimedia Foundation board members and Executive Director for an official statement on the bestiality video:

      http://lists.wikimedia.org/pipermail/foundation-l/2012-March/072463.html

      Specifically, I have asked them whether in their opinion, Wikimedia projects should continue to offer users unfiltered and unfilterable search hits, up to and including bestiality porn, in response to innocuous search terms like “homework”, “toothbrush” and “holiday”.

      1. Nicely put, but it seems unlikely that anyone will respond.

        1. Andreas K.

          Quite possibly so. But that in itself will be a response.

  2. Eric Barbour

    Thanks Larry. You might not get any action from the WMF or the media over this, but these things have a way of adding up slowly. It appears one can’t fight a monstrosity like Wikipedia quickly — patience is a virtue (something Wikipedians are noticeably short on).

The Commons people who support this stuff are a very mixed bag of nuts. Some of them are random eccentrics, like the Russian child-porn anarchist, and some of them are well-known and well-regarded academic people, with histories of publication and all. Ever wonder how many of those folks have lost their jobs over their Wikipedia obsession?….I’ve already been able to prove that a number of working folks, including a few American and UK politicians, edit Wikipedia during the work day — instead of actually working. Either their employers and constituents don’t know about this activity, or else they approve of it.

  3. Dr. Sanger, it’s quite impressive to me how you’re able to frame the root problem in such a concise and coherent way. Now, however, brace yourself for the onslaught of Free Culture zealots who won’t hesitate to tell you that the world would be a much better place if we simply showed more pornography and deviant sex imagery to younger and younger children. That’s the mentality of those on the Free Culture side of the coin. I’ve grown weary of trying to engage in debate with them — they’re just too far off on the spectrum.

  4. Dear Larry,

    Do you have updates on the FBI’s investigation? Any news?

    Since it has been nearly two years since your report to them, I’d assume that they should have reached some conclusions. After all, Wikimedia Commons is an open site, and, as you rightly point out, its content is accessible through search engines; the FBI should have no difficulty locating possibly illegal content if it exists.

1. I don’t know, but I thought the FBI dropped any investigation they had going back in 2010. I’m sure the FBI must choose its battles carefully, and they surely knew that any move they made regarding Wikipedia would require a huge investment of labor due to the PR implications. They probably figured that the problems were not worth the effort–they have bigger problems to chase down. Frankly, that was my guess even as I was writing my letter to the FBI! I wrote it anyway, because I had already made a public statement indicating awareness of the offending page, and then I read the law that required me to report it… I was surprised that a journalist from Fox News could tell me that the FBI asked for more time to comment before they went to press…which, apparently, Fox News did not give them…

  5. Are Wikipedia and Wikimedia Commons appropriate for K-8 students to use?…

    Considering that Wikipedia and Wikimedia Commons are decidedly “adult” projects, and yet unlike Google Image Search, Flickr, and many others, stubbornly refuse to install even a weak opt-in filter (see https://larrysanger.org/2012/03/wikipedias-porn-…

  6. Andreas K.

    Also note two recent articles by Jack Stuef on Buzzfeed:

    It’s Almost Impossible To Get Kiddie Porn Off Wikipedia
    Wikipedia’s self-policing isn’t working.

    http://www.buzzfeed.com/jackstuef/its-almost-impossible-to-get-kiddie-porn-off-wiki

    The Epic Battle For Wikipedia’s Autofellatio Page
    In the underbelly of Wikipedia is an exhibitionist subculture dedicated to one thing: Ensuring that their penis is the visual definition of penis. Meet Jiffman, one such exhibitionist. (This article is very probably NSFW.)

    http://www.buzzfeed.com/jackstuef/inside-the-seedy-world-of-wikipedia-exhibitionism

  7. […] March of 2012, Board members dropped hints that work had stopped on the filter and that they, like others, no longer supported it. I began […]
