What should we do about Wikipedia’s porn problem?

I want to start a conversation.

I. Problem? What problem?

So, you didn’t know that Wikipedia has a porn problem?

Let me say what I do not mean by “Wikipedia’s porn problem.” I do not mean simply that Wikipedia has a lot of porn. That’s part of the problem, but it’s not even the main problem. I’m 100% OK with porn sites. I defend the right of people to host and view porn online. I don’t even especially mind that Wikipedia has porn. There could be legitimate reasons why an encyclopedia might want to have some “adult content.”

No, the real problem begins when Wikipedia features some of the most disgusting sorts of porn you can imagine, while being heavily used by children. But it’s even more complicated than that, as I’ll explain.

(Note, the following was co-written by me and several other people. I particularly needed their help finding the links.)

Here is the short version:

Wikipedia and other websites of the Wikimedia Foundation (WMF) host a great deal of pornographic content, as well as other content not appropriate for children. Yet, the Wikimedia Foundation encourages children to use these resources. Google, Facebook, YouTube, Flickr, and many other high-profile sites have installed optional filters to block adult content from view. I believe the WMF sites should at a minimum install an optional, opt-in filter, as the WMF Board agreed to do [*] in 2011. I understand that the WMF has recently stopped work on the filter and, after a period of community reaction, some Board members have made it clear that they do not expect this filter to be finished and installed. Wikipedians, both managers and rank-and-file, apparently do not have enough internal motivation to do the responsible thing for their broad readership.

But even that is too brief. If you really want to appreciate Wikipedia’s porn problem, I’m afraid you’re going to have to read the following.

Here is the longer version:

The Wikimedia Foundation (WMF) and its project communities have recently stopped work on an optional, opt-in filter that the Foundation’s Board approved [*] in 2011. “Opt-in” means the filter would be switched on only for users who choose to turn it on. It would hide certain content behind a warning, and even then, the content would still be accessible to all users. It is accurate to call this proposed filter “weak.” Nevertheless, after a period of community reaction, some Board members have made it clear that they do not expect this filter to be finished and installed. WMF director Sue Gardner implicitly endorsed their description of the situation at the end of this discussion [*] (at “I wish we could’ve talked about the image filter”).
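To make the “weak, opt-in” design concrete, here is a minimal sketch of the intended behavior. This is purely my own illustration: the filter was never built, and none of the names below come from real MediaWiki code.

```python
# Illustrative sketch only: the proposed filter was never implemented,
# and all names here are invented for illustration.

def render_image(image_is_flagged: bool, user_opted_in: bool) -> str:
    """Model the 'weak, opt-in' semantics: content is hidden only for
    users who turned the filter on, and even then only behind a
    click-through warning, so it remains accessible to everyone."""
    if user_opted_in and image_is_flagged:
        return "hidden behind warning (click to reveal)"
    return "image shown"
```

Under these semantics the default experience is unchanged: a reader who never opts in sees everything, which is why the proposal is fairly described as weak.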

Yet, Wikipedia and its image and file archive, Wikimedia Commons, host an enormous and rapidly growing amount of pornographic content. This includes (or did include, when this petition was drafted):

WARNING, THE FOLLOWING ARE EXTREMELY EXPLICIT
• articles illustrated with pornographic videos (“convent pornography” [*], “The Good Old Naughty Days” [*], “A Free Ride” [*])
• videos of male masturbation [*] and of ejaculation in two [*] formats [*]; pictures as well: ejaculation [*]
• illustrated articles about various extreme and fetishistic topics (cock and ball torture [*], hogtie bondage [*], fisting [*], autofellatio [*], pearl necklace [*], hentai [*])
• photo categories for the “sexual penetrative use of cucumbers” [*] and other vegetables, practices like scrotum inflation [*], and pictures of penis torture [*]
(Note: the [*] marks are links to archived versions of pages, for reference in case the live pages are edited.) Some searches produce unexpected results [*]. For example, an image search for “male human” [*] in the “Simple Wikipedia” (touted as a children’s version: “The Simple English Wikipedia is for everyone! That includes children and adults who are learning English”) turns up dozens upon dozens of pornographic and exhibitionist images. Almost all of the most frequently viewed media files on Wikimedia servers [*] are sexually explicit, which puts the lie to the oft-repeated claim that pornography is rarely viewed on Wikipedia.

Many parents and teachers are neither aware of the adult content on Wikipedia sites, nor that it is accessible to school-age students, nor that this content is in fact quite popular.

With so much adult content, so often accessed, you might think that Wikipedia is adults-only, and that children don’t use it. But of course, they do. We are told that today’s children are “Generation Z” who get much of their information online. Even pre-teen children are heavy users of Wikipedia, which is often ranked in the top five of all websites in terms of traffic. In fact, 25% of the contributors to Wikipedia are under the age of 18, according to a 2010 survey, and about 12% of both readers and contributors said they had only a primary education.

Youth readership is something that the Wikimedia Foundation appears to condone, at the very least. For example, Jimmy Wales has addressed audiences of school children about Wikipedia, and one of their Wikipedian in Residence programs is at the Children’s Museum of Indianapolis [*]. Wales expressed a common attitude about children’s use of Wikipedia in an interview in which he said that if “a 10-year-old is writing a little short paper for class, and they want to say that they got some information from Wikipedia, I think we should be just glad that the kid’s writing and actually thinking about giving credit — due credit — to people who have helped. And I think that’s wonderful.” (Libertyfund.org, at the 20:19 mark; cf. this BBC story)

Since it is meant to be used by children, you might wonder whether Wikipedia and its sister projects really intend for their service to include pornography. Of that, there is no doubt. Wikipedia declares officially that it is “not censored” [*] (originally, this was labeled [*] “Wikipedia is not censored for children”) and its official policy page [*] on “offensive material” also makes it clear that pornography is permitted. To learn about the attitudes of many Wikipedians in the trenches, see the “Wikipedia:Pornography” [*] page and follow the links, or just try this search.

Moreover, in case there were any doubt, the Wikipedia community actively permits children to edit such content. The issue came up last year when a user who said he was 13 years old joined a Wikipedia volunteer group, WikiProject Pornography [*]. This raised eyebrows; someone proposed to restrict editing of articles about pornography to adults. Wikipedians discussed the matter at great length, took a vote, and a solid majority rejected the proposal [*].

This might look like a liberals vs. conservatives issue, at first glance; but I believe it is nonpartisan, more of an adolescent-minded-young-men vs. grownups-with-children issue. Nobody thinks of Google as being conservative just because they have SafeSearch (which is opt-out, i.e., turned on by default).

The WMF is a tax-exempt nonprofit organization with an educational mission. The presence of enormous amounts of unfiltered adult content, the “educational” purpose of which is questionable for anyone, directly conflicts with the goal of supporting the education of children.

That is Wikipedia’s porn problem.

II. Is inaction acceptable?

The official Wikipedia position on this problem appears to be: do nothing, and heap scorn upon anyone who suggests that something needs to be done. That also seems to suit many techno-libertarians, especially young males without children, who are the most enthusiastic consumers of porn, and who often dominate conversations about Internet policy.

I think inaction will prove unacceptable to most parents. At the very least there should be a reliable filter available, which parents might turn on if their younger children are using Wikipedia. I know that I would use it with my 6-year-old; then I might let him look at Wikipedia, if it were reliable. It’s hard to look over your children’s shoulder every moment they’re online. Wikipedians often glibly advise parents to do just this: if Junior is using Wikipedia to view porn and learn all about creative sexual fetishes, it’s your fault. You should be monitoring more closely. This obviously doesn’t wash, when it is well within Wikipedia’s power simply to add a filter that parents could turn on.

It is also unacceptable to most teachers and school district technology directors. How, really, can you defend giving kids access to a website with so much porn, when doing so is so obviously counter to CIPA rules, and when their parents would in many cases object (if they knew of the problem)?

What about you? If you agree, I’m going to make it easy for you to comment. I know that some Wikipedians might want to respond in a very hostile fashion–I’m no stranger to such disputes, myself–and this would put off a lot of people from commenting. But since this is my blog, happily, I can make up the rules, and so I will. I particularly encourage participation by parents, teachers, and women generally. I would especially like to hear from people who support the idea that Wikipedia tackle this problem. If you are opposed, that’s fine, but I will post your contribution only if you are polite and well-reasoned. I will not post anything that is personally insulting, and I also reserve the right not to post “flame bait” and merely silly or stupid remarks (and on such matters, my judgment is final). I will also pull the plug on any opponents who attempt to dominate the conversation. We already know there will be significant opposition, namely, from some Wikipedians and some of Wikipedia’s supporters. The purpose of this post is to get people talking about whether Wikipedia should be doing something about this problem.

III. What should be done?

There are a few things we might do.

First, we might blog, tweet, and post on Facebook about the problem. For better or worse, we’re all connected now, and getting the word out there is simply a matter of using social media. One person’s comment won’t amount to much–even this one won’t, probably. But a lot of people together can create a groundswell of support. So add your voice.

Second, we might contact leading Wikipedians, including Sue Gardner and other members of the WMF Board of Trustees. And don’t forget the many leading members of the Wikipedia project itself, such as the “checkusers” and the active administrators. If these people hear from readers not in the community, it can really make a difference. If enough of us write, Wikipedians might finally get the idea that there are a lot of readers out there who want a voice in determining what options are available to users.

A few months ago, I repeatedly (just to be sure) mailed Wikimedia chief Sue Gardner about Wikipedia’s porn problem. In 2010, she and I had a very productive and polite exchange, by both email and phone, about these issues. But recently, she has not responded. That was disappointing, but I believe I understand. My guess–it is only a guess, and I will correct this if I learn differently–is that Sue has been beaten down by her dysfunctional community. She has given up. I think she wants a filter installed, but it is politically impossible, and she fears for her job if she takes a hard-line stand. That’s my guess. If I am right, then external pressure will wake up the Wikipedia community and make it easier for her to insist that the community do the right thing.

Third, talk to the press. If you know reporters, or people who have lots of followers online, ask them to report about this story. It’s a big story. Why isn’t it big news that Wikipedia has given up its 2011 commitment to install a porn filter? Surely it is. It’s time to ask the Wikimedia Foundation, as well as the leading Wikipedians, some hard questions. (And reporters, do be sure to ask questions of leading Wikipedians; I say that because the WMF does not control Wikipedia or Commons. If they did, they would be legally liable for a lot more than they are now. The people really making the decision, arguably, are the adolescent-minded Wikipedia admins who see nothing wrong with the current situation–not necessarily WMF employees.)

The fourth option is the “nuclear” option: we might boycott Wikipedia. Now, I’m not calling for a boycott–yet. If anything, I’d like to kick off a public discussion about whether we should boycott Wikipedia. I have been talking about this with some online acquaintances, and I am honestly torn. I don’t want this to be a mere publicity stunt: I want to call for a boycott only if it could possibly have a positive effect. I also don’t want to call for a boycott if I don’t know that there will be a significant groundswell of popular support. And I don’t want this to be about me. I want it to be all about making Wikipedia more responsibly managed and more useful for children–kids are some of its most important users, even if Wikipedians loudly insist that it is not “censored for children.”

But if Wikipedia and the Wikimedia Foundation do not take decisive action between now and end-of-year fundraising time, I might very well call for a boycott. For now, let’s get the word out, start a conversation, and see if we can make a difference without taking such drastic action.

Please feel free to repost this online.

UPDATE: in a response to me, Jimmy Wales has reiterated his support for a Wikipedia porn filter. But this wouldn’t be the first time Jimbo has supported a Wikipedia proposal that never saw the light of day. Let’s make him put his money where his mouth is.

UPDATE 2: I made a video vignette, “Does Wikipedia have a porn problem? Dad investigates.”


Comments

Please do dive in (politely). I want your reactions!

134 responses to “What should we do about Wikipedia’s porn problem?”

  1. Andreas Kolbe

    I absolutely believe that Jimmy Wales has always been in favor of the image filter, just as he was in favor of flagged revisions (a system whereby anonymous edits to Wikipedia biographies and other articles would only have become visible to the general public after they had been vetted by a contributor with a known track record of good edits). I actually believe he would even be in favor of an opt-out filter, where explicit material is hidden by default, and the user has to switch the filter off, like in Google and Flickr, to see it.

    The problem is that Jimmy Wales is just one board member among many these days, and that the others do not necessarily share his views. Two board members up for re-election publicly withdrew their support for the image filter earlier this year, characterising the board’s decision to implement even a weak, opt-in image filter as a mistake.

    Implementation of the image filter was supposed to begin in January 2012. Instead, it has been indefinitely postponed, and no plan or new schedule to implement one has been announced to date. Even that information only became public knowledge because people asked questions about it on the Wikimedia mailing list.

    As it stands, the image filter is, to all intents and purposes, dead.

    1. We might see whether it is really and truly dead when they make a public statement, which one journalist tells me they have promised him.

  2. Taymon

    I favor the image filter as it was proposed, and really don’t understand why so many people don’t like it. I’d like to understand, but I’ve been unable to find any explanation from its opponents.

    I don’t think they’ll go further than that, though, because Wikipedia’s philosophy has always been that it’s the user’s decision whether to see something—not their parents’, school district’s, or government’s. I can’t honestly say that I disagree.

    1. I’m glad you support the filter proposal. But I wanted to comment on your remark that “Wikipedia’s philosophy has always been that it’s the user’s decision whether to see something.” Most adults, and particularly most parents, understand the following and hardly need it explained, but I’ll explain it anyway. It is part of the responsibility of parents to shield their children from seeing certain things until they’re ready for it. That is why most of us don’t take four-year-olds into horror films, or eight-year-olds into porn films. For a whole variety of reasons, we expose our children to ever-expanding circles of media. (And in this regard, schools act in loco parentis.) This need not reflect a moralistic attitude (not that there is anything wrong with having a moralistic attitude; I am a moralist myself), nor need it deserve the epithet “censorship,” at least not in any condemnatory use. It is merely responsible parenting. If you’re not a parent, you might not understand.

      What portion of active Wikipedians are parents? It’s hard to say, but this 2010 survey shows that only 14.72% of Wikipedia readers and users had children.

      Explains quite a lot, I think.

  3. Matthew Butler

    The *only* circumstance in which I’d ever support an image filter on wikipedia is if it was controlled by the community. There would need to be an agreed upon threshold for flagging images and the option to not view flagged images if you are easily offended or if you don’t want kids looking. Again, it needs to be community-driven in the spirit of wikipedia. Pictures of body parts are *not* porn in many people’s opinion and the last thing we need is a moral crusade against knowledge.

    1. I ask myself, as a father, whether I would want my son to study the pictures on Wikipedia’s “vulva” article. On balance, I think not. Now, am I terribly worried that he would somehow stumble across that page or those images? No; but on the other hand, those are just examples. There are many, many other examples of not obviously sexual, but still “adult,” topics and media. A parent who opts not to have his little boy or girl look at such things need not represent a “moral crusade against knowledge.” It would be if we were trying to stop adults from seeing such things, but that’s not the point, obviously.

      The Wikipedia community represents an extremely narrow and not at all representative cross-section of its readership. Because it is so much dominated by young men, especially young men of a libertarian-progressive bent, the choices made by that community would not necessarily reflect the needs, desires, and choices of the readers who most want and need a filter. Indeed, the Wikipedia community is so completely blinkered to the needs of children, not to mention the sensibilities of women and older readers, that if the decision is left entirely to them, without external input, there would be no filter at all. The bottom line is that Wikipedia, both the WMF and the community as a whole, needs to wake up and realize that it is part of a bigger world. If you want to serve that world, and do the maximum amount of good, then give your users some real choice!

      1. Keith Lawless

        Given the demographics of the Wikipedia contributors that you mention, would we not end up with a “flagging” problem instead of a “filtering” problem? If an opt-out image filtering system were in place, someone would still have to flag an image as “adult”. The logical candidate is the image contributor – do you feel there is enough incentive for this person to voluntarily flag an image? Since this person was motivated (for whatever reason) to upload the image in the first place, I’m skeptical that they would feel the social responsibility to “self-censor”.

        1. ix

          They would not be self-censoring (since the filter is optional and most adults don’t use it), and I think you are blinded by your own low opinion of people who upload pornographic content. I’m pretty sure most of these people actually feel that the picture merits addition to the article. A case can be made (and I’ll make it here) that a video showing an ejaculation on an article about ejaculation exactly does what it is supposed to: provide a clarifying illustration. This might be something a very small subset of people actively seek out, because the text description alone doesn’t quite cut it (say, a woman that has never had sex).

          In any case, if you allow that people uploading this content may often feel this way, I do not think it is incompatible with that belief system to also flag this content as something that is “generally experienced as pornographic”. Not only that, any parent coming across the file in question can add that tag, and it would only affect people using the image filter, meaning their addition to a file should be uncontroversial (by WP standards, i.e. often still hotly debated) in most cases.

          Adding that tag makes sense in any case. If somebody is going out of their way to upload a picture to wikipedia, surely checking the box that says “possibly pornographic” will not be beyond them. And for those cases they miss, someone will see and tag it.

  4. Gabe Kneisley

    I wholeheartedly agree with filtering the content, and I feel that it should be opt-out. Beyond kids, wikipedia is often used for work purposes. Who the hell wants to learn more about a cucumber just to see it shoved in a vagina?

    My only caution here is against calling out the pitchforks too soon. The fact that Dungeons and Dragons is still identified with Satanism, murder, and psychosis is a good example of the long life of a falsehood.

    I don’t think any of us want Wikipedia labeled as a pornography site. I think what we need to do is just change the demographic of Wikipedia. In the end, it really is ours. Just like neutrality is a moving community standard, viewing standards and preferences need to be.

    As a society, we consider some material actively harmful to the social development of children. Among those things are pornography, racist literature, enticement to torture and murder, etc. It really is irresponsible to think that parents are just going to tolerate this. Wikipedia needs parents, educators, and the goodwill of the community a hell of a lot more than they need Wikipedia. It does have to change, but there may be more constructive things to ask for, like increased participation by people who care. As an occasional editor, I will definitely be using my voice to argue against this crap, and I hope others will too.

    1. John Lilburne

      Whilst I would tend to agree with not getting out the pitchforks too soon, the difference between this issue and D&D is that the porn is real, the lack of filtering is real, and the WMF’s lack of spine to do anything about it is real too. Perhaps the vision of pitchforks and flaming torches heading in their direction will induce them to evolve a spine.

    2. Funnily enough, I was speaking about cucumbers at London Wikimedia Meetup Number 56 🙂 I managed to make a few people angry 🙁 I am so happy that you think the same. I hope very much that Wikipedia will change for the best. Thank you for the point.

  5. Interesting. Years of internet use and abuse hadn’t brought this to my attention at all. I’m kind of surprised, because I’m a big fan of both.

    I think Matt is on the right track; if there were a set of clear guidelines, the community can probably police itself with image tagging. We accept the possibility of vandalism causing the spread of inaccurate information temporarily; I think this is a reasonable benchmark for the acceptable probability of people seeing porn who shouldn’t or don’t want to see it. An important technical feature will be supporting opt-in filtering for entire networks. At first blush that looks to me like the only technical problem. Coming up with the guidelines is surely the harder problem, but allowing every country in the world to dictate their own requirements with penalties for non-compliance is certain to be substantially worse.

    It’s unfortunate that Wikimedia hasn’t addressed this themselves yet, because it will be a bloodbath when the mainstream news finds out.

  6. Patrick Wyatt

    Perhaps concerned users could “tag” each page with a code word or phrase that could be recognized by net filtering (“net nanny”) software – something that wouldn’t be found on “non-objectionable pages”. That puts the burden on the people who care about the issue rather than upon the folks who edit Wikipedia content, and therefore might be acceptable to all parties.

    There might be some fighting over which pages should be tagged by different members of the community (i.e. tagging of subjects like evolution), but since it is opt-in I think that it might still be workable.
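    The marker-word idea in the comment above can be sketched in a few lines. This is purely hypothetical: the marker string and the page-checking logic below are invented for illustration, and no existing net-filtering software actually recognizes them.

```python
# Hypothetical sketch of the marker-word proposal: editors embed an
# agreed marker string in objectionable pages, and third-party filter
# software blocks any page containing it. The marker is invented here.

MARKER = "wikipedia-adult-content-marker"

def should_block(page_html: str, filter_enabled: bool) -> bool:
    """Opt-in: block a page only when the user's filter is on AND the
    page carries the agreed marker; everyone else is unaffected."""
    return filter_enabled and MARKER in page_html

tagged_page = f'<html><meta name="{MARKER}"></html>'
plain_page = "<html><p>Cucumber (vegetable)</p></html>"
```

    The appeal of this design is exactly what the comment says: the tagging burden falls on those who care about filtering, while untagged pages and users without the filter are untouched.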

  7. The interesting part of this discussion is the bit about what happens if you take this up on Wikimedia Commons. Wikipedians argue that they have a free site, anyone can edit, etc. The fact is, however, that there is a powerful and vocal contingent on the Commons, and on Wikipedia itself, who will oppose any attempt to impose what they call ‘censorship’ on their project.

    It poses an interesting question for libertarianism itself. If people are allowed absolute freedom, almost instantly some clique or other will take power and impose their views and ideology on others. In this case, we have a small group of libertarians taking power and imposing their extreme view on others.

  8. Ambrose

    Seems like a no-brainer to me, as a parent of five children. I wasn’t aware of this problem before, but now Wikipedia is on my “kids don’t use it without parental supervision” list. Thanks, Wikipedia. We parents have nothing else to do…

    1. Mike

      Whoever lets his kid surf the Internet alone has plenty of problems other than Wikipedia.

      The Internet is not child-safe, and it is the central point of an encyclopedia to describe every single lemma thoroughly.

      Most of the links given above can only be found by searching expressly for them.

      If your kids are looking for terms like “hogtie” and “fisting,” well, maybe you really should have a talk.

  9. @Matthew Butler

    I’m a logician so I am suspicious of the use of ‘power words’ in any argument. ‘Power words’ are terms that are loaded with emotional signification for the user, or for other people. The word ‘community’ is such a word, at least for Wikipedians. Not for ordinary people, I think. In the sentence “it needs to be community-driven in the spirit of wikipedia” you manage to get three such words in one sentence, namely “community-driven”, “spirit”, “Wikipedia”. Similarly in “moral crusade against knowledge” we have ‘moral’, ‘crusade’, ‘knowledge’.

    Doesn’t really add up to an argument, does it? I find it rather scary. There is this ‘community’ which has certain ‘needs’, and says that things ‘need to be’ or ‘have to be’ governed by it. Why? Why can’t ordinary people have a say as well? As soon as you bring power words like ‘knowledge’ into it, you are invoking the whole community, not just the ‘Wikipedia community’.

    This ‘Wikipedia movement’ sounds like a form of emerging Fascism. Oops, power word.

  10. Larry, thank you so much for the article. Thank you 🙂
