On the moral bankruptcy of Wikipedia’s anonymous administration

Larry Sanger

I announced, named, and launched Wikipedia way back in January of 2001. My originating role in the project was acknowledged by Jimmy Wales later on in 2001, when he wrote, “Larry had the idea to use wiki software…” Virtually all of the news articles about the project before 2005 identified me as one of the two founders of the project, as did the project’s first three press releases, all of them approved by Jimmy, of course. I managed it as “instigator” and “chief organizer” for the project’s seminal first 14 months. To give you an idea of what role I had in the project, Jimmy declared, a few weeks before I left the project, that I was “the final arbiter of all Wikipedia functionality.”

Since then, I’ve become better known as a critic of Wikipedia. But this is mostly because I am defending myself against repeated attacks on my reputation and pointing out inconvenient truths that a more responsibly-managed organization would try to fix. Contrary to what some have said, I bear no grudges–once I have defended myself, I let matters drop. And I am not trying to damage Wikipedia. Rather, having inflicted it on the world, I am trying to improve it, because it has become one of the most influential websites in existence. I feel some responsibility for it, even though I’m long out of its administration.

I’ve been reading draft chapters of a fascinating book, written by some online friends of mine, about the history and conduct of Wikipedia and its administration. I knew that Wikipedia’s administration is screwed up and somewhat corrupt, but these writers have opened my eyes to episodes and facts that I had not been tracking. However useful Wikipedia might be–and its usefulness is something I have always affirmed–the sad fact is that Wikipedia’s administration has been nothing but one long string of scandal and mismanagement. The saga of Wikimedia UK and its chair is only the latest. Did you know that the deposed chair, Ashley van Haeften, continues to sit on the Wikimedia UK board, and continues to head up the Wikimedia Chapters Association? This is despite the fact that van Haeften has been banned from editing Wikipedia, for various violations of policy such as using multiple “sockpuppet” accounts (anonymous, fake accounts), something truly egregious for a high-ranking editor. What kind of Internet organization allows its leadership to continue on in positions of authority in spite of being banned (for excellent reasons, mind you) from the very institution it is promoting? Wikipedia defenders, consider what you are defending.

But again, this is only the latest in a long, long series of scandals, which included things like Jimmy Wales telling The New Yorker, of all things, that he didn’t have a problem with someone lying about his credentials on Wikipedia, the hiring of a deputy director with rather dodgy views on child-adult sexual relations, and the hiring of a COO who turned out to be a convicted felon.

Let’s not forget the problems associated with the many, many questionable editorial decisions made by Wikipedia administrators. Like the rank-and-file, they can be and often are completely anonymous. You read that right: the people who make editorial decisions about what is taken to be “probably pretty much right” by a lot of gullible Internet users do not even have to reveal their own identities. There are all too many Wikipedia administrators who self-righteously pride themselves on insisting that the full, ugly truth be revealed about the targets of their sometimes quite biased Wikipedia biographies; yet those very same administrators bear no personal responsibility for their actions, which can be quite consequential for people’s careers and personal lives, insofar as they remain anonymous.

No other journalistic or scholarly enterprise would tolerate such unaccountability. The reason that journalists are prized in our society, the reason they are in their positions of power and influence, is that they have committed themselves to high journalistic standards and put their personal reputations on the line when they make claims that can damage their targets. Wikipedia, like it or not, enjoys a similar level of credibility, but without the personal accountability. The system has been ripe for abuse, and indeed far too many Wikipedia administrators do routinely abuse the authority they have obtained. I look forward to the above-mentioned book because it will really blow the lid off this situation.

Wikipedia administrators bear a heavy moral burden to make their identities known. If you make serious decisions that affect the livelihoods and personal relationships of real people, or what students believe about various subjects, the price you pay for your authority is personal responsibility. Without personal responsibility, it is simply too easy to abuse your authority. Why should anyone trust the decisions of anonymous Wikipedia administrators? They could easily be personally biased, based on ignorance, or otherwise worthless. Worse, aggrieved parties–whether they are persons whose reputations have come under attack or scholars who are seriously concerned about the misrepresentation of knowledge in their field–have no recourse in the real world. If someone writes lies about you, there is no way you can name and shame the liar, or at least the Wikipedia admin who permits the lie. Instead, you have to play the stupid little Wikipedia game on its own turf. You can’t go to the real world and say, “Look, so-and-so is abusing his authority. This has to stop.” In this way, by remaining anonymous, Wikipedia’s decisionmakers insulate themselves from the real-world responsibility that journalists routinely bear for their statements and publishing decisions.

If you were a Wikipedia administrator, wouldn’t you feel absolutely bound to make your identity known? Wouldn’t you feel cowardly, craven, to be standing in judgment over all manner of important editorial issues and yet hiding behind anonymity? I know I would. Why shouldn’t we hold Wikipedia responsible for making its administrators’ identities known? A Wikipedia administrator who refuses to reveal his or her identity is morally bankrupt, because unaccountable authority is morally bankrupt. Members of democratic societies are supposed to know this.

Even the so-called “bureaucrats,” the people who are responsible for conferring adminship on an account, can be anonymous. In fact, from a glance at their usernames, most of them are anonymous.

It is a little strange that journalists, who are trained to understand the importance of taking responsibility for published work, have given Wikipedia a pass for this appalling state of affairs. It’s one thing for Wikipedia authors to be anonymous, a situation journalists often remark on with bemusement. It is quite another for its administrators to be anonymous as well, a fact that journalists have hardly noticed at all.

Indeed, why is the fifth most popular website in the world, which shapes what so many people believe on all sorts of subjects, controlled by a cadre of mostly anonymous administrators? Isn’t that fact, all by itself, scandalous? Why don’t we as a society demand more accountability? I don’t get it.

Wikipedia, wake up. We, the undersigned (let’s make a petition out of this), demand that all administrators be identified by name.

The Saga of Wikimedia UK and its Chair

Larry Sanger

The following story is very instructive about the sort of people who populate the Wikipedia universe, and what sort of people actually run things on the sixth most popular website online.

In case you didn’t know, there is an organization, Wikimedia UK, that is legally independent of the Wikimedia Foundation headquartered in San Francisco. Wikimedia UK has a separate budget of £1 million, and is currently headed up by someone who calls himself “Fae” (among many other names) on Wikipedia, and whose real name is Ashley Van Haeften.

Van Haeften is a charming character. Among his many exploits, he is reputed to have posted pornographic pictures of himself in bondage gear to Wikimedia Commons, although any evidence has by now been deleted; we have only copies like this. While serving as a high-profile administrator on Wikipedia, he routinely lobbed personal attacks at those who dared to criticize him. And much else.

So the High Court of Wikipedia, the Arbitration Committee, declared on July 20: “For numerous violations of Wikipedia’s norms and policies, Fæ is indefinitely banned from the English Language Wikipedia. He may request reconsideration of the ban six months after the enactment of this remedy, and every six months thereafter.”

In other words, Van Haeften, the head of a £1 million charity devoted to the promotion of Wikipedia, has been banned from Wikipedia itself, and for violating Wikipedia’s own policies!

Now that is, I’m sure you’ll agree, just appalling. It speaks volumes about the Wikipedia community at present that Van Haeften attained the position he holds. But it gets even worse.

On July 26, Wikimedia UK held a closed-door meeting in which the Board declared that they are “united in the view that this decision does not affect his [Van Haeften’s] role as a Trustee of the charity.”

In other words, the board that manages a £1 million budget, devoted to promoting Wikipedia, supports its chair even if the chair has been banned from editing Wikipedia itself. One has to wonder: how can the Wikimedia UK Board pretend that Van Haeften can continue to be a credible chair of a well-funded Wikipedia charity if the judicial body of Wikipedia has deliberately excluded him from the website for violating Wikipedia’s own policies?

It is a stunning revelation of just how huge a pass the mainstream media has given Wikipedia that this story was nowhere to be heard, outside of online forums and blogs, until this morning. Eleven days after Van Haeften, head of Britain’s £1 million Wikipedia charity, was banned from Wikipedia, and five days after he was unaccountably supported by Wikimedia UK, a single story came out in the mainstream media.

This morning, the Daily Telegraph came out with a pitch-perfect and (as far as I can tell) factually accurate report:

Ashley Van Haeften is chairman of Wikimedia UK, a charity with a £1m annual budget funded by donations from Wikipedia visitors and dedicated to promoting the website among British museums and universities.

Despite his volunteer role at the head of the charity he is now banned indefinitely from contributing to Wikipedia because of “numerous violations of Wikipedia’s norms and policies”.

Mr van Haeften’s punishment exposes a deep rift among Wikipedia contributors over the mass of explicit material in the online encyclopaedia, at a time when the Government is developing new controls on internet access to protect children online.

The story goes on to discuss Wikipedia’s problem of unfiltered porn, readily available to the school children who use it, and includes my YouTube video about the problem, and the following quote from yours truly: “Some things are worth going to the mat over and this is one of them. It goes to the sense of seriousness of the whole project. Wikipedia can’t command respect if it regards itself as above the norms of wider society.” The story was also followed up by an excellent report in CivilSociety.co.uk. (Update: And on August 1, FoxNews.com.)

I hope the Wikimedia Foundation (WMF) is paying attention. They will lose credibility by being associated with Wikimedia UK (WMUK). They should not allow Wikimedia UK to use Wikipedia.org for any further fundraising. Nor should WMF cooperate with WMUK in any other way. The WMF should also release a statement condemning WMUK’s recent action. They should continue this non-cooperation until the WMUK Board has been replaced. If the WMF continues to act as if nothing has happened, they will become complicit in the appalling behavior of Ashley Van Haeften and his colleagues on the WMUK Board who supported him.

If Van Haeften had any decency, he would have resigned on July 20. If the WMUK Board had any sense, they would have fired him as soon as Van Haeften made his defiance clear.

By the way, if you want to get into the sordid details, some places to start are this long Wikipediocracy thread and this Wikipedia Review thread. Frankly, I haven’t read much of either one.

This story’s front page thumbnail is from a screen capture of this Google Images search–note, the fifth search result for the image search is taken from this article. (Update: that search is made when SafeSearch is “off.” As it turns out, Google’s optional filter excises Van Haeften’s self-pornography.)

UPDATE (8/2/2012): Van Haeften has finally resigned.

Wikimedia Foundation Board Officially Rejects Porn Filter

Larry Sanger

Last Wednesday, the Wikimedia Foundation board quietly voted, in person, 10-0 in favor of repealing the “personal image hiding feature”–in other words, a very weak, opt-in porn filter. “Quietly,” I say, because the resolution was not posted publicly until the middle of the weekend. Note that the page mistakenly states that Jimmy Wales voted against it: “That page is wrong,” Wales clarified on his user talk page, “I voted yes.”

This is certainly news. A brief recap of some related events will help put it in essential context. (Here’s another recap.) You may not know that funding for the early years of Wikipedia came from Bomis, Inc., which made much of its money from what Wikipedians have called “softcore porn.” I’ve always said that Bomis was the fertilizer on which Wikipedia was built. Jimmy Wales was CEO and one of the three partners of Bomis. I started Wikipedia for Bomis, which paid my paycheck. Anyway, I’m not sure when Wikipedia first started hosting what most people would call porn, but it may have been around 2003. Over the years, there have been many proposals to rein in or filter the “adult content,” all of which have failed. In March 2008, Erik Moeller, who had recently been appointed Deputy Director of the Wikimedia Foundation (WMF), came under heavy fire for what Mashable called “his continued self-defense of statements generally indicating that pedophilia is something that’s less than evil.” Moeller continues to hold the post. In December, 2008, Wikipedia was temporarily blacklisted by the British Internet watchdog, the Internet Watch Foundation, for hosting “images of child pornography.” The site continues to host the offending image, as well–an album cover featuring a nude, and very sexualized, picture of a pre-pubescent girl.

Things really began to heat up in 2010. In March, I reported the WMF to the FBI because they hosted graphic depictions of child sexual molestation on Wikimedia Commons–and they still do. At the time, I also strongly urged the WMF to install a pornography filter. In the fallout, Wales and others started purging porn from Commons, but Wikipedians summarily swatted down the erstwhile “God-King” and reinstated much of the porn that had been deleted. There was also ongoing concern about Wikipedia’s pedophilia problem. In reaction, the WMF commissioned a report, which recommended installing an opt-in porn filter. In May, 2011, the WMF unanimously approved a “personal image hiding feature.” Matters were far from settled, however. In September, 2011, Wikipedians came out strongly in favor of allowing minors to edit pornography articles right alongside adults, and the German Wikipedians voted 86% against even a weak, opt-in porn filter.

In March of 2012, Board members dropped hints that work had stopped on the filter and that they, like others, no longer supported it. I began conferring with some colleagues about what to do; I had been largely silent on the issue since the WMF demonstrated some commitment to tackle it responsibly. I was surprised to learn that the amount of “adult” content on Wikimedia servers had grown substantially since 2010. With the help of those colleagues I carefully wrote and posted this explanation of the problem, which got quite a bit of exposure. As I put it via Twitter: “Wikipedia, choose two: (1) call yourself kid-friendly; (2) host lots of porn; (3) be filter-free.” Jimmy Wales responded via Twitter, stating clearly and unequivocally that he supported the filter. My impression is that members of the public who recently commented on the issue online have been overwhelmingly supportive, many expressing surprise and even shock at the amount of “adult” content that Wikipedia hosts. This video of mine may help clarify the trouble.

That takes us up to today. On the issue of a weak, opt-in filter, the WMF perfectly reversed itself, going from unanimous support to unanimous rejection. “Unanimous” rejection assumes that Jimmy Wales voted yes on Wednesday’s resolution, as he said on his Wikipedia user talk page, and contrary to what the resolution page says, as of this writing. He has further clarified (if that is the right word) that, despite his apparent “yes” vote for Wednesday’s resolution, he continues to “strongly support the creation of a personal image filter.” If I were cynical, I would say that he and the WMF had deliberately left his views unclear, so that he could speak out of both sides of his mouth. Anyway, if he still strongly supports the creation of a “personal image filter,” voting to rescind the resolution that would create the filter is a mighty strange way to show his support.

However that may be, the filter is now officially and overwhelmingly rejected. Unless they make another 180° change and actually get to work, publicly, on a filter, I believe a boycott may well be in order.

UPDATE: Jimmy Wales is now hosting a discussion (talk) of how the filter should be written. Let’s see if anything constructive comes out of it.

Dad & Junior find porn via a Wikipedia “Schools Gateway”

Larry Sanger

I thought I’d illustrate one of the “what could possibly go wrong?” scenarios, for people who (1) don’t want to click on the nasty links and/or (2) lack imagination.

Script:

Opening shot: article about unfiltered porn on Wikipedia

Voiceover: The adult imagery is blurred in this video. Still, don’t watch it with your kids.

Man (voiceover): Wikipedia has porn? Surely not.

(switches to Wikipedia)

Man: What did he say, “fisting”? I don’t even know what that is.

(types in fisting)

Man: Oh! Boy! They have multimedia!

(clicks on magnifying glass in search box, then clicks “Multimedia”)

Man: Let’s see here… Oh my gosh. OK, yeah, well, it’s a real-life phenomenon, I guess. So that’s what fisting is. Wikipedia, you have disappointed me. How can I tell Junior to “go look it up on Wikipedia” now? But wait a second, didn’t I once see something called “Simple Wikipedia”?

(types in simple.wikipedia.org)

Man: Yeah…here we go…yeah, it says here, “The Simple English Wikipedia is for everyone! That includes children.” So Junior shouldn’t be able to see any fisting in here. Oh, look at this, they have a “Schools Gateway.”

(clicks on “Schools Gateway”)

Man: Yeah, see, they wouldn’t have such a page if they feared the wrath of schools. I mean, they could get in serious trouble, if they invited schools in and also hosted adult content here. (sounding uncertain) Right? Well, whatever.

Man (turning away from microphone): Junior! C’mere!

Junior (could be same person doing voiceover, but preferably turned away from microphone and in a boy’s voice): What?

Man: Here’s a place where you can search for images for those reports you were working on!

Junior: That’s cool.

(clicks on magnifying glass icon in search box, clicks on Multimedia)

Man: It’s called “Simple English Wikipedia.” They say it’s for children, so, you could probably bookmark this one. Didn’t you make a list of things to search for? What’s the first one?

Junior: Yeah, OK. We’re growing different plants in science, and I’m growing a cucumber plant. So I need a cucumber picture for the title page of my lab report.

Man: Huh.

(types in “cucumber”; gets results)

Man: These look OK. You could use any of those.

Junior: No, I just want one cucumber by itself.

Man: Huh, well, OK.

(scrolls down; sees the nude lady)

Man: Oh my gosh.

(quickly backs up)

Junior: You didn’t see any cucumbers by themselves?

Man: Uh…no. (to himself) Well, that was probably just a fluke. It’s Wikipedia. You can’t expect perfection. (exhales) OK, so what else have you got?

Junior: In health, my group is doing a powerpoint presentation about dental hygiene. I’m supposed to find the pictures.

Man: All right, let’s get to work. What’s the first item on the list?

Junior: “Toothbrush.”

Man: Oh…kay.

(types in “toothbrush”)

Man: (not noticing the bad pic) So, there’s some toothbrushes…

Junior: What is that? Is that somebody using a toothbrush on his arm?

Man: Whoa!

(quickly backs up)

Man: Wow. OK. Well, maybe those were just flukes. Let’s just try one more.

Junior: What was that?

Man: Never mind. No more dental hygiene pictures. Anything else?

Junior: You could search for “Poseidon.” I’m supposed to write a report about him.

Man: OK, “Poseidon.” That sounds safe.

(types it in, hits enter; picture of naked lady is blurred)

Man: There we go.

Junior: Wow! Is that a naked lady?

Man: What the?

(quickly goes back)

Man: Oh, my god.

Junior: I don’t think that was Poseidon, Dad.

Man: You know what? Go away, Junior.

Junior: Whatever! I’ll just get on Mom’s computer.

Man: Wait, wait, do not do that yet. Just go play.

Junior: Good! OK, bye!

Man: (innocent-sounding encouragement) Yeah, you go have fun. (after a short pause) I wonder…

(types in “fisting”)

Man: (disgusted) Those are the same results I saw on the main Wikipedia search. Ugh. What kind of people run this website? God…

(Does the following, ad-libbing a voiceover:)
  • goes to the main page
  • clicks on “Wikipedia:Schools”
  • clicks on “talk”
  • types in: Porn on Simple English Wikipedia
  • types in: Shouldn’t you at least tell schools that you’ve got a lot of porn here, so they can make an informed decision? — Concerned Dad
  • saves

Jimmy Wales reiterates support for Wikipedia porn filter

Larry Sanger

Over on Twitter, I’ve been having the first conversation, of sorts, I’ve had in years with Jimmy Wales. First, I wrote (pointing to my post, “What should we do about Wikipedia’s porn problem?”):

Jimbo replied:

Hmm, I thought:

Jimbo had a couple of replies to this, too:

I found it implausible that the God King could do nothing:

And that’s as far as it’s gone, as of this writing.

As much as I appreciate Jimmy Wales’ willingness to state his views publicly, it is hard to believe him when he says he “strongly” supports a filter. He is not only on the Wikimedia Board of Trustees, he is the only member of the board that, if I’m not mistaken, has a seat made especially for him. He surely knows the situation with the filter: he surely knows that the Wikipedia community has come out against it, and that the Board he sits on has let the matter drop. So why doesn’t he ask the Board to take it up again?

The trouble for Jimbo, Sue, the Board, and the grown-ups in the community generally, is that there are a lot of very loud filter opponents who will scream bloody murder if work on the filter continues, and even more if it is finished and installed. I’m sure they wish the whole thing would just go away. But that is the weak way out. That is why it is important that we not just let it go away. Jimmy Wales has boldly declared that he “strongly” supports the filter. I will believe him when he takes decisive action to help it come into being. Until then, I must conclude that he “weakly” supports it.

Some other Tweeps have shared “What should we do about Wikipedia’s porn problem?” as well, including Robert Scoble (261,146 followers) and TJ Manotoc (128,146 followers), who says “this is a bit of a shocker.” Several high-profile journalists (in order of response, Dan Murphy, Declan McCullagh, Andrew Lih, and Jason Walsh) have retweeted, expressed support, or told me they were reading the post.

What should we do about Wikipedia’s porn problem?

Larry Sanger

I want to start a conversation.

I. Problem? What problem?

So, you didn’t know that Wikipedia has a porn problem?

Let me say what I do not mean by “Wikipedia’s porn problem.” I do not mean simply that Wikipedia has a lot of porn. That’s part of the problem, but it’s not even the main problem. I’m 100% OK with porn sites. I defend the right of people to host and view porn online. I don’t even especially mind that Wikipedia has porn. There could be legitimate reasons why an encyclopedia might want to have some “adult content.”

No, the real problem begins when Wikipedia features some of the most disgusting sorts of porn you can imagine, while being heavily used by children. But it’s even more complicated than that, as I’ll explain.

(Note, the following was co-written by me and several other people. I particularly needed their help finding the links.)

Here is the short version:

Wikipedia and other websites of the Wikimedia Foundation (WMF) host a great deal of pornographic content, as well as other content not appropriate for children. Yet, the Wikimedia Foundation encourages children to use these resources. Google, Facebook, YouTube, Flickr, and many other high-profile sites have installed optional filters to block adult content from view. I believe the WMF sites should at a minimum install an optional, opt-in filter, as the WMF Board agreed to do [*] in 2011. I understand that the WMF has recently stopped work on the filter and, after a period of community reaction, some Board members have made it clear that they do not expect this filter to be finished and installed. Wikipedians, both managers and rank-and-file, apparently do not have enough internal motivation to do the responsible thing for their broad readership.

But even that is too brief. If you really want to appreciate Wikipedia’s porn problem, I’m afraid you’re going to have to read the following.

Here is the longer version:

The Wikimedia Foundation (WMF) and its project communities have recently stopped work on an optional, opt-in filter that the Foundation’s Board approved [*] in 2011. “Opt-in” means the filter would be switched on only for users who choose to turn it on. It would hide certain content behind a warning, and even then, the content would still be accessible to all users. It is accurate to call this proposed filter “weak”.  Nevertheless, after a period of community reaction, some Board members have made it clear that they do not expect this filter to be finished and installed. WMF director Sue Gardner implicitly endorsed their description of the situation at the end of this discussion [*] (at “I wish we could’ve talked about the image filter”).

Yet, Wikipedia and its image and file archive, Wikimedia Commons, host an enormous and rapidly growing amount of pornographic content. This includes (or did include, when this petition was drafted):

WARNING, THE FOLLOWING ARE EXTREMELY EXPLICIT
• articles illustrated with pornographic videos (“convent pornography” [*], “The Good Old Naughty Days” [*], “A Free Ride” [*])
• videos of male masturbation [*] and of ejaculation in two [*] formats [*]; pictures as well: ejaculation [*]
• illustrated articles about various extreme and fetishistic topics (cock and ball torture [*], hogtie bondage [*], fisting [*], autofellatio [*], pearl necklace [*], hentai [*])
• photo categories for the “sexual penetrative use of cucumbers” [*] and other vegetables, practices like scrotum inflation [*], and pictures of penis torture [*]
(Note: [*] indicates a link to an archived version of a page, for reference in case the page is edited.) Some searches produce unexpected results [*]. For example, an image search for “male human” [*] in the “Simple Wikipedia” (touted as a children’s version: “The Simple English Wikipedia is for everyone! That includes children and adults who are learning English”) shows dozens upon dozens of pornographic and exhibitionist images. Almost all of the most frequently viewed media files on Wikimedia servers [*] are sexually explicit files, which puts the lie to the oft-repeated claim that pornography is rarely viewed on Wikipedia.

Many parents and teachers are unaware of the adult content on Wikipedia sites, unaware that it is accessible to school-age students, and unaware that this content is in fact quite popular.

With so much adult content, so often accessed, you might think that Wikipedia is adults-only, and that children don’t use it. But of course, they do. We are told that today’s children are “Generation Z” who get much of their information online. Even pre-teen children are heavy users of Wikipedia, which is often ranked in the top five of all websites in terms of traffic. In fact, 25% of the contributors to Wikipedia are under the age of 18, according to a 2010 survey, and about 12% of both readers and contributors said they had only a primary education.

Youth readership is something that the Wikimedia Foundation appears to condone, at the very least. For example, Jimmy Wales has addressed audiences of school children about Wikipedia, and one of their Wikipedian in Residence programs is at the Children’s Museum of Indianapolis [*]. Wales expressed a common attitude about children’s use of Wikipedia in an interview in which he said that if “a 10-year-old is writing a little short paper for class, and they want to say that they got some information from Wikipedia, I think we should be just glad that the kid’s writing and actually thinking about giving credit — due credit — to people who have helped. And I think that’s wonderful.” (Libertyfund.org, at the 20:19 mark; cf. this BBC story)

If it is meant to be used with children, you might wonder whether Wikipedia and its sister projects really intend for their service to include pornography. Of that, there is no doubt. Wikipedia declares officially that it is “not censored” [*] (originally, this was labeled [*] “Wikipedia is not censored for children”) and its official policy page [*] on “offensive material” also makes it clear that pornography is permitted. To learn about the attitudes of many Wikipedians in the trenches, see the “Wikipedia:Pornography” [*] page and follow the links, or just try this search.

Moreover, in case there were any doubt, the Wikipedia community actively permits children to edit such content. The issue came up last year when a user who said he was 13 years old joined a Wikipedia volunteer group, WikiProject Pornography [*]. This raised eyebrows; someone proposed to restrict editing of articles about pornography to adults. Wikipedians discussed the matter at great length, took a vote, and a solid majority rejected the proposal [*].

This might look like a liberals vs. conservatives issue, at first glance; but I believe it is nonpartisan, more of an adolescent-minded-young-men vs. grownups-with-children issue. Nobody thinks of Google as being conservative just because they have SafeSearch (which is opt-out, i.e., turned on by default).

The WMF is a tax-exempt nonprofit organization with an educational mission. The presence of enormous amounts of unfiltered adult content, the “educational” purpose of which is questionable for anyone, directly conflicts with the goal of supporting the education of children.

That is Wikipedia’s porn problem.

II. Is inaction acceptable?

The official Wikipedia position on this problem appears to be: do nothing, and heap scorn upon anyone who suggests that something needs to be done. That also seems to suit many techno-libertarians, especially young males without children, who are the most enthusiastic consumers of porn, and who often dominate conversations about Internet policy.

I think inaction will prove unacceptable to most parents. At the very least there should be a reliable filter available, which parents might turn on if their younger children are using Wikipedia. I know that I would use it with my 6-year-old; then I might let him look at Wikipedia, if it were reliable. It’s hard to look over your children’s shoulders every moment they’re online. Wikipedians often glibly advise parents to do just this: if Junior is using Wikipedia to view porn and learn all about creative sexual fetishes, it’s your fault. You should be monitoring more closely. This obviously doesn’t wash, when it is well within Wikipedia’s power simply to add a filter that parents could turn on.

Inaction is also unacceptable to most teachers and school district technology directors. How, really, can you defend giving kids access to a website with so much porn, when it is so obviously counter to CIPA rules, and when their parents would in many cases object (if they knew of the problem)?

What about you? If you agree, I’m going to make it easy for you to comment. I know that some Wikipedians might want to respond in a very hostile fashion–I’m no stranger to such disputes, myself–and this would put off a lot of people from commenting. But since this is my blog, happily, I can make up the rules, and so I will. I particularly encourage participation by parents, teachers, and women generally. I would especially like to hear from people who support the idea that Wikipedia tackle this problem. If you are opposed, that’s fine, but I will post your contribution only if you are polite and well-reasoned. I will not post anything that is personally insulting, and I also reserve the right not to post “flame bait” and merely silly or stupid remarks (and on such matters, my judgment is final). I will also pull the plug on any opponents who attempt to dominate the conversation. We already know there will be significant opposition, namely, from some Wikipedians and some of Wikipedia’s supporters. The purpose of this post is to get people talking about whether Wikipedia should be doing something about this problem.

III. What should be done?

There are a few things we might do.

First, we might blog, tweet, and post on Facebook about the problem. For better or worse, we’re all connected now, and getting the word out there is simply a matter of using social media. One person’s comment won’t amount to much–even this one won’t, probably. But a lot of people together can create a groundswell of support. So add your voice.

Second, we might contact leading Wikipedians, including Sue Gardner and other members of the WMF Board of Trustees. And don’t forget the many leading members of the Wikipedia project itself, such as the “checkusers” and the active administrators. If these people hear from readers not in the community, it can really make a difference. If enough of us write, Wikipedians might finally get the idea that there are a lot of readers out there who want a voice in determining what options are available to users.

A few months ago, I repeatedly (just to be sure) mailed Wikimedia chief Sue Gardner about Wikipedia’s porn problem. In 2010, she and I had a very productive and polite exchange, by both email and phone, about these issues. But recently, she has not responded. That was disappointing, but I believe I understand. My guess–it is only a guess, and I will correct this if I learn differently–is that Sue has been beaten down by her dysfunctional community. She has given up. I think she wants a filter installed, but it is politically impossible, and she fears for her job if she takes a hard-line stand. That’s my guess. If I am right, then external pressure will wake up the Wikipedia community and make it easier for her to insist that the community do the right thing.

Third, talk to the press. If you know reporters, or people who have lots of followers online, ask them to report about this story. It’s a big story. Why isn’t it big news that Wikipedia has given up its 2011 commitment to install a porn filter? Surely it is. It’s time to ask the Wikimedia Foundation, as well as the leading Wikipedians, some hard questions. (And reporters, do be sure to ask questions of leading Wikipedians; I say that because the WMF does not control Wikipedia or Commons. If they did, they would be legally liable for a lot more than they are now. The people really making the decision, arguably, are the adolescent-minded Wikipedia admins who see nothing wrong with the current situation–not necessarily WMF employees.)

The fourth option is the “nuclear” option: we might boycott Wikipedia. Now, I’m not calling for a boycott–yet. If anything, I’d like to kick off a public discussion about whether we should boycott Wikipedia. I have been talking about this with some online acquaintances, and I am honestly torn. I don’t want this to be a mere publicity stunt: I want to call for a boycott only if it could possibly have a positive effect. I also don’t want to call for a boycott unless I know there will be a significant groundswell of popular support. And I don’t want this to be about me. I want it to be all about making Wikipedia more responsibly managed and more useful for children–kids are some of its most important users, even if Wikipedians loudly insist that it is not “censored for children.”

But if Wikipedia and the Wikimedia Foundation do not take decisive action between now and end-of-year fundraising time, I might very well call for a boycott. For now, let’s get the word out, start a conversation, and see if we can make a difference without taking such drastic action.

Please feel free to repost this online.

UPDATE: in a response to me, Jimmy Wales has reiterated his support for a Wikipedia porn filter. But this wouldn’t be the first time Jimbo has supported a Wikipedia proposal that never saw the light of day. Let’s make him put his money where his mouth is.

UPDATE 2: I made a video vignette, “Does Wikipedia have a porn problem? Dad investigates.”

Wikipedia’s porn filter DOA, and a proposal

Larry Sanger

Warning: this post has links to pages that are definitely not safe for work or school. I’ll warn you which ones those are with “NSFW.”

The post has two parts. The first is about the availability of porn on Wikipedia and Wikimedia Commons, which for most people reading this is probably old news; but they’ve reached some new lows, such as actual pornographic films. The second part contains what I think is real news: that the much-debated porn filter they were developing is no longer in development and looks likely to be dropped.

I

There are, as many people reading this know very well, stupendous amounts of explicit imagery on Commons as well as Wikipedia itself; simply search for any fetish, porn industry term, or body part, and you’ll be likely to find it illustrated literally ad nauseam. Users, whether they like it or not, can be exposed to all sorts of inappropriate, explicit content when doing perfectly innocuous searches. This degree of smut is obviously inappropriate for an Internet resource touted as “educational” and promoted for classroom use.

Almost two years ago, I reported the Wikimedia Foundation to the FBI (as I was required to by law) on grounds that Wikimedia Commons was hosting two image categories, labeled “Pedophilia” and “Lolicon,” which featured depictions of child sexual abuse. I tracked the fallout in two posts. Recently, FoxNews.com followed up their coverage, reporting that little had been done since then. The Fox News reporter did a good job, I think. But some more developments have come to light.

The pervy categories are still there, and include whole hierarchies of categories under the titles “Erotic images of children” (NSFW) and “Child sexuality” (NSFW). The garbage by Martin van Maele, who drew many illustrations of children being sexually abused in the early 20th century, is still there, aggressively and proudly defended by the powers-that-be on Wikimedia Commons as “historical” and “educational.” To give you an idea of the attitude of the pedophilia sympathizers on Commons, who clearly feel themselves to be put-upon and wronged, consider that there is a so-called “Hate for pedophiles” category which has existed, unmolested, since May 2010 (which, come to think of it, is the month when my FBI report made news). Consider also (as was recently pointed out to me) that the activists-for-free-porn on Commons have been awarding each other the new, outrageously gross, “Hot Sex Barnstar” (NSFW!) for their efforts. There are clearly some (to me) extremely unsavory characters involved who have made it their mission to make Commons, and Wikipedia as well, as free as possible to host the most explicit sorts of imagery on this tax-exempt, non-profit 501(c)(3) website.

Recently I received an email from someone who follows this issue. He called a few things to my attention. One item: a convicted child pornographer has apparently been prominently involved in curating adult pornography. It seems he is one of those who loves to use Commons to post pervy naked pictures of himself–discussion here. He is probably not the only one. Another item: Commons is now hosting an antique video (really, really NSFW) which I am told (I didn’t watch the whole thing) shows a dog fellating a woman (in a nun’s habit) and a man.

The Wikipedia community’s more prurient tendencies, far from being reined in and moderated, are exercised more boldly than ever.

II

My correspondent also directed me to this extremely interesting discussion on the Wikimedia Foundation mailing list (Foundation-L). Read both pages–here is page 2. As I write this, discussion is ongoing.

This discussion has revealed two pieces of news so far.

First, the powers-that-be at the WMF have directed their programmers to stop working on their opt-in “controversial content” (including porn) filter. They have higher priorities, we are told.

This needs some background. The very least that Wikipedia could do, on this issue, is to let people turn on a filter so that they (or the children using their computers) would not be shown porn and other grossly inappropriate content. In fact, my understanding is that the porn would merely be “collapsed,” meaning that the user could still easily display it by “uncollapsing” the image. This, as sane people can agree, sounds both welcome and completely uncontroversial. This is what the WMF’s consultant recommended in 2010, and it was widely assumed, after a referendum indicated general support (if lukewarm), that it would be implemented soonish. After all, the tool would simply let people turn on a personal filter. (It wouldn’t be turned on automatically–users would have to turn it on in user settings.) And the filter would only hide “controversial” images, not completely block them. But, no. There’s no compromise on porn in Wikipedia-land, despite this being an “educational” project constitutionally committed to consensus and compromise. They want their commitment to free speech so loudly proclaimed that two full-color vulvas greet you at the top of the page (with a variety of others further down), should you have the temerity to look up the subject on Wikipedia. There has been such a groundswell of loud opposition to the opt-in filter idea that the project was never implemented.
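To make the mechanism concrete, here is a minimal, purely illustrative sketch of what such an opt-in, collapse-style filter could look like in a browser. This is not MediaWiki code, and every name in it (the preference object, the data-content-category attribute, and so on) is a hypothetical stand-in for whatever the developers would actually use.

```typescript
// Purely illustrative sketch (not MediaWiki code) of an opt-in, collapse-style
// image filter, as described above. All names here are hypothetical.

interface FilterPrefs {
  enabled: boolean;              // off by default: the reader must opt in
  hiddenCategories: Set<string>; // categories the reader chose to collapse
}

const defaultPrefs: FilterPrefs = {
  enabled: false,
  hiddenCategories: new Set(["controversial"]),
};

// Collapse each flagged image behind a notice; the image stays one click away.
function applyImageFilter(root: ParentNode, prefs: FilterPrefs): void {
  if (!prefs.enabled) return; // opt-in: do nothing unless the reader turned it on

  root.querySelectorAll<HTMLImageElement>("img[data-content-category]").forEach((img) => {
    const category = img.dataset.contentCategory ?? "";
    if (!prefs.hiddenCategories.has(category)) return;

    const notice = document.createElement("button");
    notice.textContent = "Image hidden by your filter settings – click to show";
    notice.addEventListener("click", () => {
      img.style.display = ""; // "uncollapse": the content is never blocked outright
      notice.remove();
    });

    img.style.display = "none"; // "collapse" the image behind the notice
    img.insertAdjacentElement("beforebegin", notice);
  });
}

// Example: a reader who has opted in sees flagged images collapsed on page load.
applyImageFilter(document, { ...defaultPrefs, enabled: true });
```

The point of the sketch is simply how little is at stake technically: the filter does nothing unless the reader turns it on, and even then the image remains one click away.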

This leads me to the second piece of news. It appears that two Wikimedia Foundation Board members, Kat Walsh and Phoebe Ayers, have both changed their positions. The Board was sharply divided on the need for this filter (which is just as amazing and silly as it sounds) last fall, but things have become even sillier since then. There is more community opposition, and so Ms. Walsh and Ms. Ayers no longer support it. They strongly imply that the earlier decision to build a filter is now a dead letter.

This says something very disappointing and even disturbing about the Wikimedia Foundation as an institution. It certainly looks as though they are in the thrall of anarchist porn addicts whose scorn for the interests of children–the group of users that stands to gain the most from a high-quality free encyclopedia–is constrained only by the limits of the law, and maybe not even that.

Eighteen months ago, after speaking at length to both WMF Executive Director Sue Gardner and the consultant she hired, Robert Harris, I had the distinct impression that the WMF might be capable of prevailing on Wikipedia and Commons to the extent of, at least, installing a completely innocuous opt-in filter system. So color me disillusioned.

I don’t wish any grief or embarrassment upon Wikipedia’s more sensible managers, like those I’ve mentioned–Gardner, Ayers, and Walsh. They are clearly right that politically they’re in a “damned-if-you-do, damned-if-you-don’t” situation. But given the choice, I’d rather be damned for doing the bare minimum needed to meet the needs of children, or at least trying to do that, than be more richly damnable for not doing anything. I’d suck it up and remind myself that there are quite a few more important things than power and status. Since such a complete no-brainer as an opt-in filter is currently politically impossible, Gardner and other sane adults among the Wikimedia managers face a dilemma: maintain some degree of power in the organization, while implicitly supporting what is only too clearly a deeply dysfunctional and irresponsible organization; or resign. If I were Gardner, or a member of the Board, I would seriously consider resigning and making a clear and pointed public statement.

Ultimately, the WMF, which has legal responsibility for the project–and which is supposed to be the grown-up, after all–has shown, by its inability to act on this issue, that it cannot avoid a truly spectacular scandal or a really serious lawsuit. The potential trouble isn’t that the government might shut Wikipedia down, or slap it with judgments it can’t pay. Rather, the biggest potential trouble is a mass exodus of profoundly disillusioned contributors, which is surely the Achilles’ heel of a project with tens of millions of articles and files to manage.

If she really wanted to take serious leadership on this issue, what Gardner might do is spearhead a brand new project to use some of the many millions they’ve amassed and start a serious Wikipedia for young people, one that K-12 teachers can be proud to use with their students. It could be a curated version of Wikipedia. It would not only have obvious controls regarding age-appropriate content, it would also have reviewed versions of articles, Citizendium-style. They could brag that they have finally adopted “flagged revisions,” which the media has been repeatedly led to believe is “right around the corner.”

I do not think the WMF needs to ask the Wikipedia rank-and-file to vote on this. Or, if they do ask those people, they should also ask their readers to vote on it. The WMF has to ask itself: who are we serving, the Wikipedia rank-and-file, which is dominated by anarchist porn addicts, or readers? Are they sensible enough to answer this question correctly?

As an added bonus, if a WMF-supported responsible version of Wikipedia existed and were growing healthily, then I would shut up about the X-rated version of Wikipedia. Maybe.

Wikipedia’s proposed legal policies

Larry Sanger

It looks like Wikipedia might be, finally, accepting its legal obligations.

Geoff Brigham, General Counsel of the Wikimedia Foundation (which is the legal owner of Wikipedia), has posted to the Foundation-l mailing list a link to a draft document described as Office of General Counsel “staff policies.”  The document is here: meta.wikimedia.org/wiki/Legal/Le…

When I saw it, one of my first reactions was: it’s totally amazing that  it has taken Wikipedia ten years to draft this document.  Better late than never, though.

Note that these are proposed policies of the Wikimedia Foundation. They are described as “staff policies.” I don’t believe the average Wikipedia editor is considered “staff.”  Nevertheless, the policies would seem to impact Wikipedians directly, and they essentially serve to conclude certain contentious issues in a way that is sure to upset some of the louder idiots on Wikipedia.  For example, the Wikimedia Foundation would be placing the whole sordid “child pornography” mess in the ambit of “office actions.”  Office actions, for those who aren’t familiar with this term of wiki-governance, are content decisions that the foundation takes without consulting or debating with the Wikipedia community.  They are, in short, rare end runs around the collaborative process, rare bows to the fact that the enterprise takes place in a broader societal (especially legal) context.

In particular, the page addresses child pornography on Wikipedia (and Wikimedia Commons).  It is worth quoting the section:

5. Child Pornography. Child pornography must be removed from the site immediately. Generally speaking, child pornography constitutes a photograph or other visual image of a child engaged in sexually explicit conduct.[3]

It is important to note that depictions such as drawings, cartoons, sculptures, or paintings that represent children in sexually explicit conduct may run afoul of certain obscenity statutes if the depictions lack certain cultural or social value. See 18 U.S.C. 1466A.

Relevant federal statutes on child pornography – with corresponding definitions — may be found here:

missingkids.com/missingkids/serv…

State laws may also be applicable.[4]

As soon as possible, the Office of the General Counsel or the Reader Relations unit should report any discovery of child pornography (as described above) to the National Center for Missing and Exploited Children (800-843-5678).

Community members who find child pornography on the site may delete and report it. Community members are asked to notify Readers Relations or the Office of the General Counsel about any child pornography found on the site to ensure it is properly reported to the authorities.

Even when reporting, Community members are advised not to send images of child pornography through any means, including the email.

First let me point out that “child pornography” is the term used by the WMF–extended to include “drawings, cartoons, sculptures, or paintings.”  The text cites precisely the statute under which I reported the WMF to the FBI (18 U.S.C. 1466A), and under which, at my request, a Senator and Representative of mine referred the matter to the Congressional FBI liaison.  But for this, I was publicly excoriated, as you can see here and here.  Nevertheless, it led to reportage in The Register and FoxNews.com and others.  It also had many other indirect effects, including the appointment of a consultant, Robert Harris, to write a report about how the community should deal with “controversial content.”  Also, I’m not sure that this is related, but the WMF’s counsel during the child pornography hullabaloo, Mike Godwin, left the foundation I think last fall.  He had nothing but loud, blustery, and quite unprofessional contempt for my report to the FBI.  It seems that Geoff Brigham, the new counsel who is apparently responsible for these “staff policies,” would not have the same reaction.  If he is willing to underscore the Foundation’s commitment to 18 U.S.C. 1466A, going so far as to require staff members to report violations to the National Center for Missing and Exploited Children, the Foundation’s legal management certainly does seem to have changed for the better.

Congratulations to the WMF would be a little premature at this point, because we need to see how things will shake out–and whether the WMF will actually act on its own policies.  Still, I’m feeling vindicated.

But maybe more interesting in the long run is the fact that the WMF has–as was inevitable, because it is surely the WMF’s legal obligation to do so–taken certain new powers upon itself, in a way that is possibly unprecedented.  Observe a new element of Wikipedia’s governance taking shape before your eyes.

25 Replies to Maria Bustillos

Larry Sanger

In a recent essay in The Awl (“Wikipedia and the Death of the Expert”), Maria Bustillos commits a whole series of fallacies or plain mistakes and, unsurprisingly, comes to some quite wrong conclusions.  I don’t have time to write anything like an essay in response, but I will offer up the following clues for Ms. Bustillos and those who are inclined to nod along approvingly with her essay:

1. First, may I point out that not everybody buys that Marshall McLuhan was all that.

2. The fact that Nature stood by its research report (which was not a peer-reviewed study) means nothing whatsoever.  If you’ll actually read it and apply some scholarly or scientific standards, Britannica‘s response was devastating, and Nature‘s reply thereto was quite lame.

3. There has not yet been anything approaching a credible survey of the quality of Wikipedia’s articles (at least, not to my knowledge).  Nobody has shown, in studies taken individually or in aggregate, that Wikipedia’s articles are even nearly as reliable as a decent encyclopedia.

4. If you ask pretty much anybody in the humanities, you will learn that the general impression that people have about Wikipedia articles on these subjects is that they are appalling and not getting any better.

5. The “bogglingly complex and well-staffed system for dealing with errors and disputes on Wikipedia” is a pretentious yet brain-dead mob best likened to the boys of The Lord of the Flies.

6. It is trivial and glib to say that “Wikipedia is not perfect, but then no encyclopedia is perfect.”  You might as well say that the Sistine Chapel is not perfect.  Yeah, that’s true.

7. It is not, in fact, terribly significant that users can “look under the hood” of Wikipedia.  Except for Wikipedia’s denizens and those unfortunate enough to be caught in the crosshairs of some zealous Wikipedians using the system to commit libel without repercussion, nobody really cares what goes on on Wikipedia’s talk pages.

8. When it comes to actually controversial material, the only time that there is an “attempt to strike a fair balance of views” in Wikipedia-land is when two camps with approximately equal pull in the system line up on either side of an issue.  Otherwise, the Wikipedians with the greatest pull declare their view as “the neutral point of view.”  It wasn’t always this way, but it has become that way all too often.

9. I too am opposed to experts exercising unwarranted authority.  But there is an enormous number of possibilities between a world dominated by unaccountable whimsical expert opinion and a world without any experts at all.  Failing to acknowledge this is just sloppiness.

10. If you thought that Wikipedia somehow meant the end of expertise, you’d be quite wrong.  I wrote an essay about that in Episteme. (Moreover, in writing this, I was criticized for proving something obvious.)

11. The fact that Marshall McLuhan said stuff that presciently supported Wikipedia’s more questionable epistemic underpinnings is not actually very impressive.

12. Jaron Lanier has a lot of very solid insight, and it is merely puzzling to dismiss him as a “snob” who believes in “individual genius and creativity.”  There’s quite a bit more to Lanier and “Digital Maoism” than that.  Besides, are individual genius and creativity now passé?  Hardly.

13. Clay Shirky isn’t all that, either.

14. Being “post-linear” and “post-fact” is not “thrilling” or profound.  It’s merely annoying and tiresome.

15. Since when did the Britannica somehow stand for guarantees of truth?  Whoever thought so?

16. There are, of course, vast realms between the extremes of “knowledge handed down by divine inspiration” and some dodgy “post-fact society.”

17. The same society can’t both be “post-fact” and thrive on “knowledge [that] is produced and constructed by argument,” Shirky notwithstanding.  Arguments aim at truth, i.e., to be fact-stating, and truth is a requirement of knowledge.  You can’t make sense of the virtues of dialectical knowledge-production without a robust notion of truth.

18. Anybody who talks glowingly about the elimination of facts, or any such thing, simply wants the world to be safe for the propagation of his ideology by familiar, manipulable, but ultimately irrational social forces.  No true liberal can be in favor of a society in which there are no generally-accepted, objective standards of truth, because then only illiberal forces will dominate discourse.

19. Expert opinion is devalued on Wikipedia, granted–and maybe also on talk radio and its TV spin-offs, and some Internet conversations.  But now, where else in society has it been significantly devalued?

20. What does being a realist about expertise–i.e., one who believes it does exist, who believes that an expert’s opinion is, on balance, more likely to be true than mine in areas of his expertise–have to do with individualism?  Surely it’s more the romantic individualists who want to be unfettered by the requirements of reason, including the scientific methods and careful reasoning of experts, who are naturally inclined to devalue expertise per se.

21. Wikipedia does not in any plausible way stand for a brave new world in which competing arguments hold sway over some (fictional) monolithic expert opinion.  There have always been competing expert views; Wikipedia merely, sometimes, expresses those competing expert views when, from some professors, you might hear only one side.  Sometimes, Wikipedia doesn’t even do that, because the easy politicization of collaborative text written without enforceable rules makes neutrality an elusive ideal.

22. Um, we have had the Internet for more than 20 years.

23. The writing down of knowledge is more participatory now, and that’s a fine thing (or can be).  But knowledge itself is, always has been, and always will be an individual affair.  The recording of things that people take themselves to know, in Wikipedia or elsewhere, and findable via Google, does not magically transfer the epistemic fact of knowledge from the recorder even to those who happen to find the text, much less to all readers online.  Knowledge itself is considerably more difficult than that.

24. Ours is an individualistic age?  Don’t make me laugh.  People who actually think for themselves–you know, real individualists–appear to me to be as rare as they ever have been.  It is a delight to meet the few who are out there, and one of the better features of the Internet is that it makes it easier to find them.  The West might be largely capitalist, but that doesn’t stop us from being conformist, as any high school student could tell you.

25. The real world is considerably more complex than your narrative.

Looong interview with me by Dan Schneider in Cosmoetica

Larry Sanger

Off and on, for the last 2.5 years, I have been answering questions from poet and critic Dan Schneider, who has conducted a series of long, interesting interviews.  My interview, posted a few hours ago, is #27 in the series; Schneider himself gives the interview four stars (out of five).  That should tell you something about Schneider: he’s the kind of guy who asks questions that take hours and hours to answer, and then has the audacity to rate the answers.  The questions cover my life, Wikipedia, Citizendium, philosophy, and my reactions to various idiosyncratic puzzles that Schneider has come up with.  If you were to ask why I agreed to do an interview that ended up being 40,000 words long, without any compensation or anything, I’d say that I didn’t know it was going to be that long, and Dan Schneider was very persistent.  And maybe this reveals just how vain I really am.