Warning: this post has links to pages that are definitely not safe for work or school. I’ll warn you which ones those are with “NSFW.”

The post has two parts. The first is about the availability of porn on Wikipedia and Wikimedia Commons, which for most people reading this is probably old news; but they’ve reached some new lows, such as actual pornographic films. The second part contains what I think is real news: that the much-debated porn filter they were developing is no longer in development and looks likely to be dropped.

I

There are, as many people reading this know very well, stupendous amounts of explicit imagery on Commons as well as Wikipedia itself; simply search for any fetish, porn industry term, or body part, and you’ll likely find it illustrated literally ad nauseam. Users, whether they like it or not, can be exposed to all sorts of inappropriate, explicit content when doing perfectly innocuous searches. This degree of smut is obviously inappropriate for an Internet resource touted as “educational” and promoted for classroom use.

Almost two years ago, I reported the Wikimedia Foundation to the FBI (as I was required to by law) on grounds that Wikimedia Commons was hosting two image categories, labeled “Pedophilia” and “Lolicon,” which featured depictions of child sexual abuse. I tracked the fallout in two posts. Recently, FoxNews.com followed up their coverage, reporting that little had been done since then. The Fox News reporter did a good job, I think. But some more developments have come to light.

The pervy categories are still there, and include whole hierarchies of categories under the titles “Erotic images of children” (NSFW) and “Child sexuality” (NSFW). The garbage by Martin van Maele, who drew many illustrations of children being sexually abused in the early 20th century, is still there, aggressively and proudly defended by the powers-that-be on Wikimedia Commons as “historical” and “educational.” To give you an idea of the attitude of the pedophilia sympathizers on Commons, who clearly feel themselves to be put-upon and wronged, consider that there is a so-called “Hate for pedophiles” category which has existed, unmolested, since May 2010 (which, come to think of it, is the month when my FBI report made news). Consider also (as was recently pointed out to me) that the activists-for-free-porn on Commons have been awarding each other the new, outrageously gross, “Hot Sex Barnstar” (NSFW!) for their efforts. There are clearly some (to me) extremely unsavory characters involved who have made it their mission to make Commons, and Wikipedia as well, as free as possible to host the most explicit sorts of imagery on this tax-exempt, non-profit 501(c)(3) website.

Recently I received an email from someone who follows this issue. He called a few things to my attention. One item: a convicted child pornographer has apparently been prominently involved in curating adult pornography. It seems he is one of those who loves to use Commons to post pervy naked pictures of himself–discussion here. He is probably not the only one. Another item: Commons is now hosting an antique video (really, really NSFW) which I am told (I didn’t watch the whole thing) shows a dog fellating a woman (in a nun’s habit) and a man.

The Wikipedia community’s more prurient tendencies are, so far from being reined in and moderated, exercised more boldly than ever.

II

My correspondent also directed me to this extremely interesting discussion on the Wikimedia Foundation mailing list (Foundation-L). Read both pages–here is page 2. As I write this, discussion is ongoing.

This discussion has revealed two pieces of news so far.

First, the powers-that-be at the WMF have directed their programmers to stop working on their opt-in “controversial content” (including porn) filter. They have higher priorities, we are told.

This needs some background. The very least that Wikipedia could do, on this issue, is to let people turn on a filter so that they (or the children using their computers) would not be shown porn and other grossly inappropriate content. In fact, my understanding is that the porn would merely be “collapsed,” meaning that the user could still easily display it by “uncollapsing” the image. This, as sane people can agree, sounds both welcome and completely uncontroversial. This is what the WMF’s consultant recommended in 2010, and it was widely assumed, after a referendum indicated general support (if lukewarm), that it would be implemented soonish. After all, the tool would simply let people turn on a personal filter. (It wouldn’t be turned on automatically–users would have to turn it on in user settings.) And the filter would only hide “controversial” images, not completely block them. But, no. There’s no compromise on porn in Wikipedia-land, despite this being an “educational” project constitutionally committed to consensus and compromise. They want their commitment to free speech so loudly proclaimed that two full-color vulvas greet you at the top of the page (with a variety of others further down), should you have the temerity to look up the subject on Wikipedia. There has been such a groundswell of loud opposition to the opt-in filter idea that the project was never implemented.

This leads me to the second piece of news. It appears that two Wikimedia Foundation Board members, Kat Walsh and Phoebe Ayers, have both changed their positions. The Board was sharply divided on the need for this filter (which is just as amazing and silly as it sounds) last fall, but things have become even sillier since then. There is more community opposition, and so Ms. Walsh and Ms. Ayers no longer support it. They strongly imply that the earlier decision to build a filter is now a dead letter.

This says something very disappointing and even disturbing about the Wikimedia Foundation as an institution. It certainly looks as though they are in thrall to anarchist porn addicts whose scorn for the interests of children–the group of users that stands to gain the most from a high-quality free encyclopedia–is constrained only by the limits of the law, and maybe not even that.

Eighteen months ago, after speaking at length to both WMF Executive Director Sue Gardner and the consultant she hired, Robert Harris, I had the distinct impression that the WMF might be capable of prevailing on Wikipedia and Commons to the extent of, at least, installing a completely innocuous opt-in filter system. So color me disillusioned.

I don’t wish any grief or embarrassment upon Wikipedia’s more sensible managers, like those I’ve mentioned–Gardner, Ayers, and Walsh. They are clearly right that politically they’re in a “damned-if-you-do, damned-if-you-don’t” situation. But given the choice, I’d rather be damned for doing the bare minimum needed to meet the needs of children, or at least trying to do that, than be more richly damnable for not doing anything. I’d suck it up and remind myself that there are quite a few more important things than power and status. Since such a complete no-brainer as an opt-in filter is currently politically impossible, Gardner and other sane adults among the Wikimedia managers face a dilemma: maintain some degree of power in the organization while implicitly supporting what is only too clearly a deeply dysfunctional and irresponsible organization; or resign. If I were Gardner, or a member of the Board, I would seriously consider resigning and making a clear and pointed public statement.

Ultimately, the WMF, which has legal responsibility for the project–and which is supposed to be the grown-up, after all–has shown, in its inability to act on this issue, that it cannot head off a truly spectacular scandal or a really serious lawsuit. The potential trouble isn’t that the government might shut Wikipedia down, or slap it with judgments it can’t pay. Rather, the biggest potential trouble is a mass exodus of profoundly disillusioned contributors, which is surely the Achilles’ heel of a project with tens of millions of articles and files to manage.

If she really wanted to take serious leadership on this issue, what Gardner might do is spearhead a brand new project to use some of the many millions they’ve amassed and start a serious Wikipedia for young people, one that K-12 teachers can be proud to use with their students. It could be a curated version of Wikipedia. It would not only have obvious controls regarding age-appropriate content, it would also have reviewed versions of articles, Citizendium-style. They could brag that they have finally adopted “flagged revisions,” which the media has been repeatedly led to believe is “right around the corner.”

I do not think the WMF needs to ask the Wikipedia rank-and-file to vote on this. Or, if they ask those people, they should also ask their readers to vote on it. The WMF has to ask itself: who are we serving, the Wikipedia rank-and-file, which is dominated by anarchist porn addicts, or readers? Are they sensible enough to answer this question correctly?

As an added bonus, if a WMF-supported responsible version of Wikipedia existed and were growing healthily, then I would shut up about the X-rated version of Wikipedia. Maybe.