What should we do about Wikipedia’s porn problem?

I want to start a conversation.

I. Problem? What problem?

So, you didn’t know that Wikipedia has a porn problem?

Let me say what I do not mean by “Wikipedia’s porn problem.” I do not mean simply that Wikipedia has a lot of porn. That’s part of the problem, but it’s not even the main problem. I’m 100% OK with porn sites. I defend the right of people to host and view porn online. I don’t even especially mind that Wikipedia has porn. There could be legitimate reasons why an encyclopedia might want to have some “adult content.”

No, the real problem begins when Wikipedia features some of the most disgusting sorts of porn you can imagine, while being heavily used by children. But it’s even more complicated than that, as I’ll explain.

(Note, the following was co-written by me and several other people. I particularly needed their help finding the links.)

Here is the short version:

Wikipedia and other websites of the Wikimedia Foundation (WMF) host a great deal of pornographic content, as well as other content not appropriate for children. Yet, the Wikimedia Foundation encourages children to use these resources. Google, Facebook, YouTube, Flickr, and many other high-profile sites have installed optional filters to block adult content from view. I believe the WMF sites should at a minimum install an optional, opt-in filter, as the WMF Board agreed to do [*] in 2011. I understand that the WMF has recently stopped work on the filter and, after a period of community reaction, some Board members have made it clear that they do not expect this filter to be finished and installed. Wikipedians, both managers and rank-and-file, apparently do not have enough internal motivation to do the responsible thing for their broad readership.

But even that is too brief. If you really want to appreciate Wikipedia’s porn problem, I’m afraid you’re going to have to read the following.

Here is the longer version:

The Wikimedia Foundation (WMF) and its project communities have recently stopped work on an optional, opt-in filter that the Foundation’s Board approved [*] in 2011. “Opt-in” means the filter would be switched on only for users who choose to turn it on. It would hide certain content behind a warning, and even then, the content would still be accessible to all users. It is accurate to call this proposed filter “weak”.  Nevertheless, after a period of community reaction, some Board members have made it clear that they do not expect this filter to be finished and installed. WMF director Sue Gardner implicitly endorsed their description of the situation at the end of this discussion [*] (at “I wish we could’ve talked about the image filter”).
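
To make the proposal concrete, here is a minimal sketch, entirely my own illustration and not the WMF’s actual design, of what an opt-in, click-to-reveal filter could look like in code. The category names, field names, and functions below are hypothetical; the only point is that the logic is trivial: the filter does nothing unless a user switches it on, and even then the hidden content remains one click away.

    // Hypothetical sketch of an opt-in, click-to-reveal image filter.
    // Not the WMF's design; names and categories are invented for illustration.

    interface MediaItem {
      url: string;
      categories: string[]; // e.g. ["Nudity", "Sexual acts"]
    }

    interface UserPrefs {
      filterEnabled: boolean;        // opt-in: defaults to false
      hiddenCategories: Set<string>; // categories the user chose to hide
    }

    // The filter does nothing unless the user has switched it on.
    function shouldHide(item: MediaItem, prefs: UserPrefs): boolean {
      if (!prefs.filterEnabled) return false;
      return item.categories.some((c) => prefs.hiddenCategories.has(c));
    }

    // Render either the image or a placeholder the reader can click through,
    // so the content stays accessible to every user.
    function render(item: MediaItem, prefs: UserPrefs): string {
      return shouldHide(item, prefs)
        ? `<a class="filtered" href="${item.url}">Image hidden by your filter settings (click to view)</a>`
        : `<img src="${item.url}">`;
    }

Nothing is removed from the site under such a scheme; readers who never touch the preference see no change at all.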

Yet, Wikipedia and its image and file archive, Wikimedia Commons, host an enormous and rapidly growing amount of pornographic content. This includes (or did include, when this petition was drafted):

WARNING, THE FOLLOWING ARE EXTREMELY EXPLICIT
• articles illustrated with pornographic videos (“convent pornography” [*], “The Good Old Naughty Days” [*], “A Free Ride” [*])
• videos of male masturbation [*] and of ejaculation in two [*] formats [*]; pictures as well: ejaculation [*]
• illustrated articles about various extreme and fetishistic topics (cock and ball torture [*], hogtie bondage [*], fisting [*], autofellatio [*], pearl necklace [*], hentai [*])
• photo categories for the “sexual penetrative use of cucumbers” [*] and other vegetables, practices like scrotum inflation [*], and pictures about penis torture [*]
(Note: the [*] marks indicate links to archived versions of pages, for reference in case these pages are edited.)

Some searches produce unexpected results [*]. For example, an image search for “male human” [*] in the “Simple Wikipedia” (touted as a children’s version: “The Simple English Wikipedia is for everyone! That includes children and adults who are learning English”) shows dozens upon dozens of pornographic and exhibitionist images. Almost all of the most frequently viewed media files on Wikimedia servers [*] are sexually explicit, which puts the lie to the oft-repeated claim that pornography is rarely viewed on Wikipedia.

Many parents and teachers are neither aware of the adult content on Wikipedia sites, nor that it is accessible to school-age students, nor that this content is in fact quite popular.

With so much adult content, so often accessed, you might think that Wikipedia is adults-only, and that children don’t use it. But of course, they do. We are told that today’s children are “Generation Z” who get much of their information online. Even pre-teen children are heavy users of Wikipedia, which is often ranked in the top five of all websites in terms of traffic. In fact, 25% of the contributors to Wikipedia are under the age of 18, according to a 2010 survey, and about 12% of both readers and contributors said they had only a primary education.

Youth readership is something that the Wikimedia Foundation appears to condone, at the very least. For example, Jimmy Wales has addressed audiences of school children about Wikipedia, and one of their Wikipedian in Residence programs is at the Children’s Museum of Indianapolis [*]. Wales expressed a common attitude about children’s use of Wikipedia in an interview in which he said that if “a 10-year-old is writing a little short paper for class, and they want to say that they got some information from Wikipedia, I think we should be just glad that the kid’s writing and actually thinking about giving credit — due credit — to people who have helped. And I think that’s wonderful.” (Libertyfund.org, at the 20:19 mark; cf. this BBC story)

Given that it is meant to be used by children, you might wonder whether Wikipedia and its sister projects really intend for their service to include pornography. Of that, there is no doubt. Wikipedia declares officially that it is “not censored” [*] (originally, this policy was labeled [*] “Wikipedia is not censored for children”), and its official policy page [*] on “offensive material” also makes it clear that pornography is permitted. To learn about the attitudes of many Wikipedians in the trenches, see the “Wikipedia:Pornography” [*] page and follow the links, or just try this search.

Moreover, in case there were any doubt, the Wikipedia community actively permits children to edit such content. The issue came up last year when a user who said he was 13 years old joined a Wikipedia volunteer group, WikiProject Pornography [*]. This raised eyebrows; someone proposed to restrict editing of articles about pornography to adults. Wikipedians discussed the matter at great length, took a vote, and a solid majority rejected the proposal [*].

This might look like a liberals vs. conservatives issue, at first glance; but I believe it is nonpartisan, more of an adolescent-minded-young-men vs. grownups-with-children issue. Nobody thinks of Google as being conservative just because they have SafeSearch (which is opt-out, i.e., turned on by default).

The WMF is a tax-exempt nonprofit organization with an educational mission. The presence of enormous amounts of unfiltered adult content, the “educational” purpose of which is questionable for anyone, directly conflicts with the goal of supporting the education of children.

That is Wikipedia’s porn problem.

II. Is inaction acceptable?

The official Wikipedia position on this problem appears to be: do nothing, and heap scorn upon anyone who suggests that something needs to be done. That also seems to suit many techno-libertarians, especially young males without children, who are the most enthusiastic consumers of porn, and who often dominate conversations about Internet policy.

I think inaction will prove unacceptable to most parents. At the very least there should be a reliable filter available, which parents might turn on if their younger children are using Wikipedia. I know that I would use it with my 6-year-old; then I might let him look at Wikipedia, if the filter proved reliable. It’s hard to look over your children’s shoulders every moment they’re online. Wikipedians often glibly advise parents to do just this: if Junior is using Wikipedia to view porn and learn all about creative sexual fetishes, it’s your fault. You should be monitoring more closely. This obviously doesn’t wash, when it is well within Wikipedia’s power simply to add a filter that parents could turn on.

Inaction is also unacceptable to most teachers and school district technology directors. How, really, can you defend giving kids access to a website with so much porn, when it is so obviously counter to CIPA (Children’s Internet Protection Act) rules, and when their parents would in many cases object (if they knew of the problem)?

What about you? If you agree, I’m going to make it easy for you to comment. I know that some Wikipedians might want to respond in a very hostile fashion–I’m no stranger to such disputes, myself–and this would put off a lot of people from commenting. But since this is my blog, happily, I can make up the rules, and so I will. I particularly encourage participation by parents, teachers, and women generally. I would especially like to hear from people who support the idea that Wikipedia tackle this problem. If you are opposed, that’s fine, but I will post your contribution only if you are polite and well-reasoned. I will not post anything that is personally insulting, and I also reserve the right not to post “flame bait” and merely silly or stupid remarks (and on such matters, my judgment is final). I will also pull the plug on any opponents who attempt to dominate the conversation. We already know there will be significant opposition, namely, from some Wikipedians and some of Wikipedia’s supporters. The purpose of this post is to get people talking about whether Wikipedia should be doing something about this problem.

III. What should be done?

There are a few things we might do.

First, we might blog, tweet, and post on Facebook about the problem. For better or worse, we’re all connected now, and getting the word out there is simply a matter of using social media. One person’s comment won’t amount to much–even this one won’t, probably. But a lot of people together can create a groundswell of support. So add your voice.

Second, we might contact leading Wikipedians, including Sue Gardner and other members of the WMF Board of Trustees. And don’t forget the many leading members of the Wikipedia project itself, such as the “checkusers” and the active administrators. If these people hear from readers not in the community, it can really make a difference. If enough of us write, Wikipedians might finally get the idea that there are a lot of readers out there who want a voice in determining what options are available to users.

A few months ago, I repeatedly (just to be sure) emailed Wikimedia chief Sue Gardner about Wikipedia’s porn problem. In 2010, she and I had a very productive and polite exchange, by both email and phone, about these issues. But recently, she has not responded. That was disappointing, but I believe I understand. My guess–it is only a guess, and I will correct this if I learn differently–is that Sue has been beaten down by her dysfunctional community. She has given up. I think she wants a filter installed, but it is politically impossible, and she fears for her job if she takes a hard-line stand. That’s my guess. If I am right, then external pressure will wake up the Wikipedia community and make it easier for her to insist that the community do the right thing.

Third, talk to the press. If you know reporters, or people who have lots of followers online, ask them to report about this story. It’s a big story. Why isn’t it big news that Wikipedia has given up its 2011 commitment to install a porn filter? Surely it is. It’s time to ask the Wikimedia Foundation, as well as the leading Wikipedians, some hard questions. (And reporters, do be sure to ask questions of leading Wikipedians; I say that because the WMF does not control Wikipedia or Commons. If they did, they would be legally liable for a lot more than they are now. The people really making the decision, arguably, are the adolescent-minded Wikipedia admins who see nothing wrong with the current situation–not necessarily WMF employees.)

The fourth option is the “nuclear” option: we might boycott Wikipedia. Now, I’m not calling for a boycott–yet. If anything, I’d like to kick off a public discussion about whether we should boycott Wikipedia. I have been talking about this with some online acquaintances, and I am honestly torn. I don’t want this to be a mere publicity stunt: I want to call for a boycott only if it could possibly have a positive effect. I also don’t want to call for a boycott if I don’t know that there will be a significant groundswell of popular support. And I don’t want this to be about me. I want it to be all about making Wikipedia more responsibly managed and more useful for children–kids are some of its most important users, even if Wikipedians loudly insist that it is not “censored for children.”

But if Wikipedia and the Wikimedia Foundation do not take decisive action between now and end-of-year fundraising time, I might very well call for a boycott. For now, let’s get the word out, start a conversation, and see if we can make a difference without taking such drastic action.

Please feel free to repost this online.

UPDATE: in a response to me, Jimmy Wales has reiterated his support for a Wikipedia porn filter. But this wouldn’t be the first time Jimbo has supported a Wikipedia proposal that never saw the light of day. Let’s make him put his money where his mouth is.

UPDATE 2: I made a video vignette, “Does Wikipedia have a porn problem? Dad investigates.”


Comments

Please do dive in (politely). I want your reactions!

  1. milw1818

    Mr. Sanger — I do not participate in the Wikipedia community and I have no stake in this debate. However, after reading several of the comments on this page I felt the need to respond.

    I am struck by the manner in which you conspicuously demonstrate your adherence to the principles of logical argument, particularly in your responses to the comments shown on this web page. This seems to be somewhat ironic because the entire basis of your argument appears to be an emotional presupposition — that pornography is harmful to children.

    Now, let me be clear, I am not commenting on the effect of pornography on children. I am far from an expert on the subject. I am merely observing an incongruity in your argumentative standard.

    You are frequently dismissive of the arguments of certain commenters on the basis that the logical construction of their argument is defective. But you have not lived up to your own standard because you have failed to establish the logical basis from which the validity of your entire argument flows.

    I challenge you to produce for review the evidence which supports your implicit assertion that pornography is harmful to children. Perhaps it will be a simple task, but until you have done so you cannot claim that you have substantiated your argument.

    1. Have a look: http://www.youtube.com/watch?v=TFw1Cnuq9jc Many people don’t want to know about it, but this woman knows what she is talking about. Dr. Sharon Cooper worked as a pediatrician for more than three decades; she is the lead author of the most comprehensive text on child sexual exploitation & Internet crimes against children.

      “Adult pornography normalizes sexual harm & provides children visual examples of lack of emotional commitment, unprotected sexual contact, & visual examples of violent rape in many cases.”

    2. Carl Gombrich

      The ‘there is no evidence that it does harm’ argument for allowing children to view pornography is among the most depressing. It shows a misunderstanding (a) of what constitutes evidence, and (b) of what should be evidenced and what need not be.

      The points are related because the second misunderstanding often leads to saying silly things with regard to the first.

      For example, as far as I know there is nothing we could really call evidence to show that slavery is bad, either collectively or for individuals kept as slaves. Are those refusing to move on restricting the access of children to pornography therefore in favour of legalising slavery until we have ‘evidence’ (presumably a longitudinal study over many years involving several hundred people, control groups etc) to show that slavery is harmful? Specifically that it is so harmful to individuals that it should therefore be outlawed? If they do not advocate such a move, why don’t they? That is the logic of the position: no evidence, no move.

      But the important point is that slavery is bad, and the argument that it is bad was successfully made on moral grounds by previous generations in the West. That is why it is outlawed in many countries.

      Now ask: is it better or worse for children to come across hardcore pornography? We are talking about children, not adolescents searching out of curiosity or for arousal, but children, for whom sexuality is a very different thing. I would like to know the libertarian answer to this question. If you think it is better that children do not see hardcore pornography, then we should do something about the fact that, increasingly, many of them do.

      The other points here about libertarianism are well made: the tent in which everything goes is the tent into which, eventually, nobody goes.

      I’m not sure I’m allowed to do this, but as I have no connection with the author may I recommend Jonathan Haidt’s latest book: The Righteous Mind. A superb, highly evidenced but popular account of why it is so hard to get agreement on these sorts of moral issues.

    3. Another video http://www.youtube.com/watch?feature&v=cYPMI3k2F1Y 🙁 The children suffer in our world now, that’s how it is.

      What kind of evidence do you want?

  2. Missing

    I mean it’s not like Wikipedia has porn in articles that you wouldn’t expect to find porn in. If a kid is looking up porn by searching ejaculation on Wikipedia, blocking the article won’t stop his or her curiosity.

    1. And therefore, what…there is no point to having filters for younger children? That wouldn’t follow…

  3. Leto

    I have read your article with interest.
    I agree with you that some filtering system (one that can be turned off by the user to avoid even the idea of censorship) should be installed.

    But the funny thing in my point of view is that you see this in a very American way: many of the images that you give as an example would not be seen as a problem in continental Europe (some would, though).

    So if such a filter system is created, which morals will be the standard to rate images and videos? What about violence? What about religious symbols?

    So don’t just talk about porn when you talk about filtering, because what is seen as porn and what is not depends a lot on your cultural background. Filtering is a very complex issue, and maybe sometimes it is better to have no filter than a filtering system that is only appropriate in one cultural context.

    So much for my contribution to this discussion,

    Have a good night,

    Leto

  4. […] after a year, the filter is not in place — and in Sanger’s opinion, Wikipedia has a real porn problem with no progress being made to enact the […]

  5. Brian Longamere

    Larry, trying to take on control of pornography will probably become a lose-lose situation. Even the Supreme Court has driven itself batty over the decades dealing with it, which they classify as a form of “obscenity.” Cases about it relating to children are endless, and as we can see from the 21st century, it’s getting worse. Generation-Zers, supersaturated with free porn, would be unable to even define “obscenity,” much less give an example. And Wikipedia is merely riding the ultra liberal wave, not even asking for a fake user name or password to let anyone, anywhere, intellectual or mindless zombie, write stuff and upload their favorite x-photos under the cover of “education.”

    In college, a good friend of mine was clerking for the nation’s leading 1st Amendment attorney, a man who had won a number of cases in the Supreme Court. He was in town defending the right of a local publisher to reprint a large government document about sexual relations in America, which the publisher had decided to load with porn photos under the pretext of “illustrating” the topics. I sat through most of the case and even got to know the attorney and the defendants. I was young, but I was still amazed that such a blatant, in-your-face pretext for publishing porn became such a big case. It was amusing watching that attorney have the judge and each jury member turn to each and every porn image, trying to super-analyze it for its “educational” value. The publisher’s profit motive was transparently obvious (it also put out other sensationalistic books), but still the judge and jury stared at those images and listened to rational arguments, kind of scratching their heads.

    My suggestion would be to simply present hard facts about Wikipedia’s porn-related material. A simple chart comparing the average number of daily viewers of both the articles and their photos with those of general non-porn, “educational” articles would speak volumes. So would listing the countless porn-related articles on “notable” people, such as the more than 200 “Actors in gay pornographic films” or the dozens of “Transsexual female pornographic film actors.” Wikipedia’s definition of “notable” and “actors” is interesting. There are many such subjects, categories, and lists, all under the transparent facade of “education,” which Wikipedia has built.

    Visitor numbers say a lot: for example, over the last 30 days there were over 151,000 visits to the article on “erection,” well above the average. Even famous movie directors, like Stanley Kubrick, draw fewer.

    In fact, if Wikipedia simply splits off the massive amount of porn-related material into a new site, such as “Wikipornia,” they could start charging memberships and probably avoid needing donors for their “educational division.” Their credibility among educators would definitely go way up.

  6. Anne

    The Internet is, simply, not for the faint of heart. I’ve been using the Internet since 1995 when I was 10 years old. Pornographic websites were abundant, and people were having “cybersex” in otherwise friendly IRC servers. It was also easier then to be tricked into clicking a link that directed you to porn. We’re fortunate these days to be issued with 18+ warnings.

    My parents never felt the need to monitor my activities online throughout my adolescent years, despite being so glued to the computer. I wasn’t looking at porn – I was learning HTML and sharing files on Napster. Pornography was never relevant to my interests because I knew it was about sex, and my parents had already issued ‘the talk’. I knew about the consequences and dangers of engaging in sexual activity when one isn’t ready.

    I am not against censorship. I just don’t think it’s very effective. One, you can’t censor everything. If kids can’t find it on Wikipedia, they will find it elsewhere – which I think is even more dangerous. In my experience, Wikipedia conveys an academic tone, no matter what the content is. It is objective. It doesn’t say, “oh Sadomasochism is good/bad… you should engage in it!” If one were to use another source for the same keywords, who knows what they might find.

    When something is restricted, it’s hard to tell how people will respond to “this page contains restricted material.” If I were a kid who knew nothing about sex and knew that the blocked page was about sex, I would be very curious as to why I am so forbidden. I’ll get my information elsewhere.

    I know this sounds dramatic, but I don’t think parents should be giving their kids full freedom to use the Internet unless they’re confident that their kid is well-adjusted enough to be let out into the world. Here and now, the Internet is inherently dangerous for kids, much in the same way as it is dangerous to let your kid out into the streets without supervision. The whole world and its thoughts are in it. I don’t think it’s fair to condemn Wikipedia and leave all responsibility to them by protesting for content filters. Wikipedia isn’t a business with steady monetization; it’s not that easy to implement new features for a non-profit website. People could ask nicely, like what this article is doing, but we can’t expect Wikipedia to come up with filters anytime soon.

    (I’m sorry for the long comment. I’m not trying to dominate the conversation… just sharing my personal experience on why I think censorship wouldn’t really work in this case)

    1. First of all, Anne, it’s different with boys. Unlike many girls, they are highly interested in porn, and porn can seriously damage them, especially if they consume a lot of it at a young, impressionable age.

      Second, unlike you, I am against censorship; but also unlike you, I don’t think that an opt-in filter, or any parental filter, counts as “censorship” (and indeed, that’s obvious; if you disagree, you need a good argument to make the case), and asserting that it is censorship without any argument really tips your hand. I think, moreover, that our discourse cheapens the importance of real censorship–the kind that sovereign power does, and the dangers that involves–when we call filtering for children censorship.

      Third, you’re making the argument that Wikipedia is a better place to learn about various perversions and obscene topics than actual porn sites. Perhaps, in some sense. But you should also consider the problems: an impressionable child of the wrong age who sees an “objective”-sounding article about what most people would regard with disgust or horror will come to the false, and quite possibly damaging, opinion that it is “normal.” The wiki is “flat,” meaning information about fisting is in an article that, considered just in how it is linked up with other articles, is on a par with any other topic, say, Einstein or World War II. A child, no matter whether he or she has received “the talk” or is relatively unsheltered, very probably has no context, no way of knowing, that any number of extreme “sexual practices” are indeed extreme. At least if the information is behind a filter, then when Junior figures out how to get around it, he’ll know that it’s something to be wary of. The fact that social mores are strong enough to permit filtering certain topics from children is a powerful and necessary hint to children that they are playing with fire.

      By the way, I’m all in favor of having several different sets of filters, or ways of filtering, precisely because there are different sets of social mores. Identifying the different sets requires that very different communities respect each other enough to leave them the freedom to define their boundaries for themselves. That would be true neutrality.

      Finally, Wikipedia receives $20 million a year. The software for filtering would not be that expensive, and the labor of managing it would be minimal. They can easily afford it. The reason they don’t install a filter is that they don’t want to, and they don’t want to for essentially puerile reasons.

  7. Nihiltres

    Hi Larry. I think you’re somewhat wrong, but at least only somewhat wrong, in your characterization and judgement of the issue.

    Here are some logical points I think are worth taking into account, made in such a way that I think most people will agree that they are reasonable:

    1. It’s reasonable for Wikipedia to have articles and even media documenting sex, its practices, and its tools. Wikipedia’s neutral point of view policy, one of its “Five Pillars”, concerns not expressing judgements in articles, but simply presenting information as objectively as possible and letting people make their own decisions. People objecting to this point will be those who are objecting to porn *per se*, and completely, yes, censoring those topics from the encyclopedia is unreasonable given that there’s a significant proportion of people who would like to learn about those topics.

    2. It’s unreasonable for Wikipedia itself to decide what specifically ought to be filtered. Different people have different mores. Is seeing bikinis immoral? Seeing nipples? Seeing an image purported to display Muhammed? The widely different mores mean that choosing one standard is difficult even without the mandate of a neutral point of view. I’m aware of the problems of moral relativism, but there’s essentially nothing universal about filtering any particular set of media.

    3. It’s reasonable for parents to decide what media their children consume, to some degree. We may disagree as to what degree this may be generally true, but I think in most cases we can justify saying that it’s reasonable for them to filter sexual topics, for example, from their own children given their belief that those topics are obscene to children. (I personally have somewhat less tolerance for this concept when the child has begun puberty, since at this point the child’s agency is more relevant, and the child should be learning about sex anyway, but let’s call that a grey area.)

    4. Most age-verification systems online are trivially defeated. Sites may introduce registration requirements that require users to be 13 or up, and these are trivially defeated by changing the year of birth. Sites may require a click-through to confirm that the user is 18; click “Yes, I’m 18”, and the system is defeated. The only remotely secure and universal one I’ve seen is a credit-card verification system, which is both a) not universal; many people don’t have credit cards, and b) not reasonable information for many websites to request; some users might refuse just on the basis of the privacy of that information.

    5. The primary goal of a filter is to avoid surprises. If a person searches for content about cucumbers, or toothbrushes, perhaps, there isn’t an expectation that they will find sexual content. If a person searches for masturbation, fisting, or fellatio, there is a strong expectation that they should find sexual content. If a person is searching for this content, it is reasonable to assume that they will find it, whether on Wikipedia, or elsewhere, unless there is something, like a watching parent or “net nanny” software, stopping them.

    6. An objective treatment of sex topics is relatively desirable. I see that you, Larry, have already responded to this point, and I think that your response is unsatisfactory. It can easily be argued that a fairly dry description of a topic, with occasional descriptive images, is not anything near the titillation the images or video of a porn site would attempt. On this I think we agree. However you argue that an objective treatment would (and you said “will”, not “could”) lead a child to think that the practice was normal. For a moment, give some credit to the intelligence of the child. If an article is being objective, and the child is not already aware of the societal mores saying explicitly or implicitly that the practice is unusual, the child will still have a judgement to contribute. The concept of taking sexual pleasure from humiliation (can happen in BDSM), for example, sounds odd, and a child is likely to make the same judgement unless either a) they are intrinsically interested in it anyway or b) they don’t realize that most people don’t enjoy humiliation.

    I could be wrong on this. It could be that I am attributing more rationality to a child than is reasonable in most cases, and perhaps your argument stands up well. In any case, I think it’s at very least reasonable to say that, given all non-parentally-approved informational resources on sex, Wikipedia is probably one of the least harmful.

    ~~

    Given all of the above, I do support having some opt-in filter available, given points 1, 3, and 5. However, given point 2, I think that the filter categorization should be developed externally to the community, but available on Wikipedia as an example of what some people think ought to be filtered. It breaks the neutral point of view spirit to normatively say that some image is appropriate or not appropriate for viewing. However, it is entirely acceptable to say informationally that the Net Nanny Association of Exampleland, or Wikipedia user Example, or for that matter the Church of Exampledom, believes that its eponymous filter, available at a click of a button, will keep children or adults safe from nefarious images. However, given points 4, 5, and 6, I think that the filter should be very clear about its action and soft in its implementation: a search should mention that some number of results were omitted from the results by some particular filter(s), for example. A filtered image should offer the option of revealing its contents without excessive intermediary “warning” clicks. Why? It’s better, as I argued in point 6, that a child satisfies their curiosity on Wikipedia than elsewhere—and they will, if determined enough, satisfy their curiosity elsewhere based on points 4 and 5. The key is simply point 5: people shouldn’t come across objectionable things they didn’t expect.

    I respect your opinion, more or less. (It’d help if you’d unblock me on Twitter as though you aren’t afraid of my opinion—it irks me that I cannot automatically follow your tweets on my main account.) However, I think you’re going about it the wrong way. You *will not* get anywhere with a broad campaign like this appears to be (you’ve argued this one several times). By making a broad campaign, you are sending the message to most Wikipedians that you disapprove of things that have probably been put there out of genuine interest in advancing encyclopedic coverage of sex or other topics. You are calling it “porn”, a word connoting material intended to titillate, when much of it is not “porn” but merely “sexually explicit”. Some of it *is* porn or excerpts thereof, but it’s present in the context of the history of pornography, where it’s fairly reasonable to include it.

      People hear “Larry Sanger wants the ‘porn’ cleaned up”, and they think “Forget that tripe, this is about legitimate documenting of the world”, and call it censorship and other unfair labels for what you truly intend. They see the “family-friendly” policy of Citizendium and think that you must be a puritanical conservative. They see you saying that it’s an “adolescent-minded-young-men vs. grownups-with-children issue”, and they see only that you have insulted them as “adolescent-minded”, *no matter* how sensible your argument and weak theirs. Remember: few humans are properly rational, myself included.

    To get somewhere on this, it needs to be brought back to Wikipedia, time and time again. Stress that you respect Wikipedians (even if you don’t), that you are advocating *personal choice*, not censorship, and demonstrate these things with ideas like the ability to bypass the filter. Only then will people open their ears.

    I’d like there to be a filter. I’d never use it, I wouldn’t use it on my children had I any, but I can’t conscientiously deny people the ability to self-determine if it does not disturb others.

    [TL;DR: I support a weak filter on the basis of self-determination.]

    1. Anyone want to take a crack at this? I’ve spent too much time on this lately and responding to this would take me a long time.

      I will say one thing. I reject the notion that Wikipedia’s neutrality policy commits it to lacking a filter. The argument seems to be that any determination of what should be filtered is a position with which people will take issue, and therefore it is “neutral” not to filter anything. This is a logical howler. Indeed it is an example of puerile reasoning that is common in Wikipedia-land. Since it is patently the case that most parents would like something to be filtered, the entire lack of a filter is itself a position that others (like me) clearly take issue with.

      In fact, the lack of a filter is easily one of the most extreme positions one can take on the question “what should be filtered for young children.” Remember, we are not talking about censorship, according to which certain information would not be published, period. We are talking about a parent being able to turn on a filter so that certain information can’t be seen on that computer by his or her children. It is not correct to apply the word “censorship” to the practice of parents preventing their children from seeing certain things. Hence it is possible to be firmly in the free speech camp, like me, while also being firmly in the “let’s have some sane filters for kids” camp.

      Anyway, back to the point at issue: if Wikipedia really wanted to take a neutral position on the issue of what information should be made available to children (or, more precisely, on the issue of what information parents or teachers should be empowered to prevent their children from seeing), it would offer a variety of filters. One can imagine a set of categories that could be used to mark up media; different filters would permit or disallow the display of those media based on common preferences. For example, I can imagine a “Christian conservative” filter not showing any nudity of any sort before the age of 13, say, whereas a “progressive” filter would not block any merely nude pictures, but only if they had a sexual context. And if people disagreed about whether a certain image “had a sexual context,” then there ought to be something in the database to mark that. Why set up the system so that ideological and religious foes have to do battle? Anyone who actually understands what neutrality means knows that that really isn’t necessary.
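
      To illustrate what I mean (and this is only my own sketch, with invented tag and profile names, not a concrete proposal), the markup-plus-profiles idea amounts to nothing more than tagging media with descriptive categories and letting each filter profile declare which tags it hides:

        // Hypothetical sketch of tag-based media markup with multiple filter
        // profiles; tag and profile names are invented for illustration.
        type Tag = "nudity" | "sexual-context" | "disputed-sexual-context";

        interface FilterProfile {
          name: string;
          hides: Set<Tag>;
        }

        // Two illustrative profiles; real ones would be defined by the
        // communities that want them, not imposed by Wikipedia itself.
        const profiles: FilterProfile[] = [
          { name: "conservative", hides: new Set<Tag>(["nudity", "sexual-context", "disputed-sexual-context"]) },
          { name: "progressive",  hides: new Set<Tag>(["sexual-context"]) },
        ];

        // A media item is hidden under a profile if it carries any tag that
        // the profile hides; disputed judgments simply become additional tags.
        function isHidden(mediaTags: Tag[], profile: FilterProfile): boolean {
          return mediaTags.some((t) => profile.hides.has(t));
        }

      The expensive part is the tagging; the profiles themselves are cheap, which is why offering several of them is, in principle, no harder than offering one.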

      The fact that so many Wikipedians have used this puerile argument shows me how common the failure to understand neutrality has become–not that this surprises me, considering that I was doing battle with what were later called “POV pushers” even while I was involved. Some people just instinctively interpret their own position as the “neutral” one, and stubbornly fail to understand that neutrality involves tolerance for views that one strongly disagrees with.

      OK, one other thing. On your #6, you use the word “objective” as if its approbatory force clearly, unproblematically applied to texts that lacked any moral judgment of the matter. There is a reason that I chose the word “neutral,” instead of “objective,” when devising Nupedia’s and Wikipedia’s neutrality policy; objectivity implies a single, nonjudgmental perspective that has the best claim to being the correct one. A neutral article, by contrast, is one that is open to, or tolerant of, many different points of view, some of them positive and some of them negative. It is much easier for a group of people to agree on an article that aspires to neutrality than on one billed as “objective.” Neutrality is still very difficult and it still requires that the group decide which views are to be presented in what proportion. The latter is something that most Wikipedians have never learned properly.

      Wikipedia’s articles on sexual fetishes and perversions (having seen what I’ve seen on Wikipedia and Commons lately, I’m happy to stand by this word) might be objective, in the sense that they are nonjudgmental. But they are not neutral, not in the sense that their emphases and presentation reflect attitudes toward the practices in approximate proportion to their representation in the population of native English speakers. Indeed, according to that formula, if you were to write a truly neutral article on, say, fisting, one of the very first things that you’d have to say about it–indeed, given the nature of the subject matter, you’d have to say it in the first sentence–is that most people would regard it with disgust and even horror. You might hasten to say that there are some people who, of course, do not regard it that way, and are happy to practice it as often as possible; but you would be obligated, also, to say that those people make up a minuscule portion of the populace. (And let the idiots add a [citation needed] tag after that, if no studies could be found.)

      Now consider, if you dare, Wikipedia’s article on fisting (I won’t bother linking it). I read it quickly (for my sins) and did not spot the slightest hint of moral judgment of the practice. There was only a clinical-sounding “risks” section. There are also two extremely disgusting pictures, which even I can’t look at.

      This isn’t neutral. It may be “objective” in the sense that it is nonjudgmental and fact-stating, but it is very decidedly not neutral in the sense that I defined and promulgated and which held sway for some years after I left, at least. It is precisely because it lacks any reportage of the moral or emotional reaction of most people to the practice, and because it shows those ridiculously disgusting pictures, which are hardly necessary to convey the concept–it is because of those things that the article is not neutral. You think the pictures are somehow neutral? Of course they aren’t necessary to convey the concept; it involves sticking a whole hand in either that orifice or else that one. Yeah, I got the idea. Once I got the idea, I can quickly decide that I do not want to see it illustrated. But the article’s authors express a clear lack of proper moral and indeed cultural sensitivity by simply showing the pictures without so much as inquiring whether the reader wants to see them; indeed, the callousness (I think that’s the right word) of the display is so extreme, yes of course also for adults, that for that reason it is obvious that the presence of the pictures is non-neutral.

      Now, suppose a child of just the wrong age–maybe 13, a kid who is prepared to question everything, but whose parents and teachers have failed to give him the conceptual, factual, and critical faculties for doing so properly–comes across the article. While many kids would look at it with disgust or horror, as I and most adults do, and would never return, there are others who would feel a sort of obligation to look on it nonjudgmentally. (That, in their juvenile minds, would prove their maturity.) The article’s clinical, non-judgmental tone reflects and so encourages this (on my view) mistaken attitude, so widely rejected by the vast majority of quite critical-thinking adults.

      There are some 13-year-olds who have to be told–they won’t actually understand, and will resist accepting–that certain practices are widely regarded as disgusting or horrifying. They have to be acculturated explicitly, and if they aren’t, they might well fail to be acculturated, to that extent, at all.

      By the way, Nihiltres, it is bad form to announce to the world that I have blocked you from subscribing to my Twitter feed. You’re free to shout it from the rooftops, but who I block really is my own business, isn’t it, and hence it is obviously bad form to attempt to shame me by saying so. I don’t feel the slightest bit shamed by you. I forget why I blocked you, but I assume that you deserved it.

      1. Nihiltres

        The start of your reply makes a straw man of my argument. Ironically, you criticize me for making an argument that I did not and am not making (that no filter is the only neutral filter), then suggest that the best thing is the thing I did, in fact, suggest (offering a variety of filters). My point 2 is intended to mean that any *particular* filter or filter set cannot be truly neutral. Offering a variety, and then letting people choose, is my solution to that problem.

        I’m sorry for my choice of words in point 6. You’re right in indicating that “neutrality” would probably be a better word, and that noting the distribution of public opinion on the topic would add to the article and help make it more neutral.

        I can’t believe, however, that the mere presence of pictures is somehow not neutral. It doesn’t follow. For example, is it non-neutral to have a picture of a clown on the “Clown” article? Most people will answer “no”. However, to someone with coulrophobia, those images would inspire fear and possibly loathing. It’s unreasonable to suppose that the non-neutrality arises with the images—it’s a property of the viewer in the example you give. On the contrary, if someone terribly enjoyed fisting, or was amused by clowns, then they might have a positive reaction to an article image.

        I do find your example of the 13-year-old interesting, but again it falls short. It doesn’t actually counter my argument, for one—you’ve mainly reiterated your original argument. My argument includes the possibility that the child might self-acculturate through reactions based on instinct and existing acculturation, but a failure of the article to acculturate the child in any particular way can’t be said to be non-neutral. On the contrary, as you yourself just put it: “a neutral article […] is one that is open to, or tolerant of, many different points of view” and by extension, different cultures. If the rest of the article is neutral, and the child from Exampleland chooses to ignore a statement along the lines of “foobar is widely considered distasteful in Examplelandian culture”, or something else along the same lines, has the article itself suddenly failed because it has not pushed the child towards their native culture? It is again, not a property of Wikipedia, but a property of the child. It’s a reasonable criticism that articles might not be expressing the distribution of views on a subject, but that is not the same by any means as outright encouraging support for an unpopular topic. Essentially, you seem to be saying “if you’re not with us, you’re against us”.

        I apologize for my bad form, but I would appreciate it if you’d find it in you to unblock me on Twitter. It’s an annoyance, and, because it was not preceded by any objection to my behaviour, it feels unwarranted.

        I’d also like to point out that, at the actionable levels of this discussion, I often *agree* with you, and I don’t seek to upset you. My intention in commenting here is to hone the ideas behind the rationale supporting a filter system, not simply to tear down your arguments. My ideal outcome would be to find a way, together, to convince Wikipedians that a filter system is desirable. That is why I criticized your approach in my first comment: it is constructive criticism that offers an approach that I think might be more effective for reaching your goal.

        If you’re not interested in engaging with Wikipedians, then you’re not serious about your stated goal: to “get the word out, *start a conversation*, and see if we can make a difference without taking such drastic action” (emphasis added). I’m offering you a step towards the second part of that goal.

        1. I wrote a reply to this but it was lost and I’m not going to piece it together. I’ll simply summarize the highlights:

          (1) On my first point, I wasn’t attacking a straw man but instead essentially agreeing with you.

          (2) If you are going to make the point that all illustrations of concepts are inherently neutral, you will have to say that of graphic, real photographs of murder-rapes, incest, infant molestation, disembowelment, torture, or just plain murder–or whatever horrible but “real life phenomenon” that Wikipedia has articles about. You’d have a snuff film on the “snuff film” article, which is an abhorrent idea.

          (3) You say, “An objective treatment of sex topics is relatively desirable.” I explained why, using an example of an impressionable 13-year-old. Your reply is to say that the 13-year-old might “self-acculturate,” but that doesn’t contradict my point that the article in its present form might well give some 13-year-olds ideas very much contrary to their parents’ preferences and which are by most people’s notions very much mistaken. It is out of concern for the latter, not the former, that we ought to write articles a certain way and that we should have a filter to make it clearer that the topics are dangerous for “our” (whatever it might be) culture.

          (4) I wanted to start a conversation with the wider world, not with Wikipedians. I’ve butted heads with Wikipedians for over 10 years. I’m tired of it and have lost patience for their sad, twisted, insular world. They won’t listen to me, anyway; but they might listen to an outraged public and declining funds when I organize a boycott in the fall–which I am likely to do if they take no further action on a filter.

        2. Brian

          I have to agree with Nihiltres in their #5 rationale above:

          “5. The primary goal of a filter is to avoid surprises. If a person searches for content about cucumbers, or toothbrushes, perhaps, there isn’t an expectation that they will find sexual content. . . “

          However, by that very logical rationale, Wikipedia should be extremely filtered. The “encyclopedia,” after all, is very old. Encyc. Britannica started over 240 years ago and has been a standard reference in libraries and many homes. Other encyclopedias have also been standards until recently. I think a typical survey of the general public would find that they would be very surprised, if not shocked and dumbfounded, to discover those encyclopedias included the countless sex-related topics and their supporting photos.

          If Wikipedia considers itself an “encyclopedia,” then many of its readers are in for many surprises, even without searching. In any case, “searching” is not necessary to visit most of the articles as they have tons of links from everywhere. And a search of most of the terms mentioned, fisting, masturbation, erection, etc. naturally has Wikipedia as the #1 search result.

          I also agree with Nihiltres’s comment, “If a person is searching for this content, it is reasonable to assume that they will find it, whether on Wikipedia, or elsewhere . . “

          That’s the same argument used for legalizing drugs. It’s also the same argument used by our gun lobby and military arms dealers trying to sell to many of the world’s dictators: If we won’t sell them, they’ll just get them somewhere else.

        3. Brian

          There are an awful lot of references to “adult” content being what should be filtered to keep kids out. However, that implies that looking at obscene or porn images is OK for most adults. But I don’t think that is the problem.

          One way to look at Wikipedia as a “free” encyclopedia would be to compare it to having a giant public library near your home. Upon entering, you may see arrows pointing to the various sections: magazines, newspapers, science, biographies, fiction, reference, etc. Imagine seeing an arrow pointing to the “porn” or “adults only” department. At the very large “adults only” section there is a librarian checking IDs. That librarian would be the “filter.”

          Asking for such a “filter” implies that most “adults” visiting that “public” library see no problem with having an “adults only” section. However, I’d guess that 99% or more of Wikipedia’s reference materials, paid for by cash donations and thousands of volunteer editors, is of general “all ages” material.

          One also has to wonder if Wikipedia feels its mission would be undermined if the “adults only” section were simply minimized enough to keep it “general.” Consider that readers visited the article on “president of the U.S.” 220,000 times over the last month, versus 467,000 visits for “masturbation.” They read about “kissing” 69,000 times, but “erection” 151,000 times; “love” drew 317,000 visits, versus 627,000 visits to “sexual intercourse.”

          The massive sizes of the “adult” articles are also relevant, with “sexual intercourse” nearly twice the size of “love.” And the very large article about “President of the United States” is the same size as the one for “masturbation.”

          Maybe we should ask Mr. Webster to redefine “encyclopedia” for the modern age: Greek enkyklios (“encircled”) + paideia (“education, child rearing,” from paid-, pais, “child”). But in the meantime we shouldn’t imply that “adults” have no problem with most of the porn-style images and lengthy text. Most “adults” don’t need to see 8 detailed blow-up images of every stage of an “erection,” along with 1,200 words, to convey the meaning.

        4. I certainly don’t need to see all that information. It smacks, as I say, of management by an adolescent mind-set, which, in a top-10 website, is more offensive to me than the imagery.

          Still, I’m trying to pick my fights carefully. Complaining about the crassness of Wikipedia’s approach to adult topics might be fun, but it simply isn’t as important as a filter.

  8. […] the Good Old Naughty Days, A Free Ride et toute sorte de réjouissances porno publiées sur Wikipédia. Merci […]

  9. […] create misinformation, started to level the pornography charges against Wikipedia late last month in a blog post that stated that “the real problem [is] when Wikipedia features some of the most disgusting […]

  10. […] features some of the most disgusting sorts of porn you can imagine,” Sanger states in a blog post on his website, “while being heavily used by […]
