How to crowdsource videos via a shared video channel

I got to talking to one of my colleagues here at Everipedia, the encyclopedia of everything, where I am now CIO, about future plans. I had the following idea.

We could create an Everipedia channel--basically, just a YouTube account, but owned by Everipedia and devoted to regularly posting new videos.

We could invite people to submit videos to us; if they're approved, we put branding elements on them and post them. We share some significant amount of the monetization (most of it) with the creator.

We also feature the videos at the top of the Everipedia article about the topic.

Who knows what could happen, but what I hope would happen is that we'd get a bunch of subscribers, because of all the connections of the video makers (and Everipedia--we collectively have a lot of followers and a lot of traffic). And the more people we got involved, the greater the competition and the better the videos would be.

There are still huge opportunities in the educational video space--so many topics out there simply have no good free videos available.

Others must have organized group channels like this before, but I can't think of who.

What do you think?

On intellectual honesty and accepting the humiliation of error

I. The virtue of intellectual honesty.

Honesty is a greatly underrated epistemic virtue.

There is a sound reason for thinking so. It turns out that probably the single greatest source of error is not ignorance but arrogance, not lack of facts but dogmatism. We leap to conclusions that fit with our preconceptions without testing them. Even when we are more circumspect, we frequently rule out views that turn out to be correct because of our biases. Often we take the easy way out and simply accept whatever our friends, religion, or party says is true.

These are natural habits, but there is a solution: intellectual honesty. At root, this means deep commitment to truth over our own current opinion, whatever it might be. That means accepting clear and incontrovertible evidence as a serious constraint on our reasoning. It means refusing to accept inconsistencies in one's thinking. It means rejecting complexity for its own sake, whereby we congratulate ourselves for our cleverness but rarely do justice to the full body of evidence. It means following the evidence where it leads.

The irony is that some other epistemic virtues actually militate against wisdom, or the difficult search for truth.

Intelligence and cleverness, while in themselves an obvious benefit, become a positive hindrance when we become unduly impressed with ourselves and the cleverness of our theories. This is perhaps the single biggest reason I became disappointed with philosophy and left academe; philosophers are far too impressed with complex and clever reasoning, paying no attention to fundamentals. As a result, anyone who works from fundamentals finds it child's play (I thought I did, as a grad student) to poke holes in fashionable theories. This is not because I was more clever than those theoreticians but because they simply did not care about certain constraints that I thought were obvious. And it's easy for them in turn to glibly defend their views; so it's a game, and to me it became a very tiresome one.

Another overrated virtue is, for lack of a better name, conventionality. In every society, every group, there is a shared set of beliefs, some of which are true and some of which are false. I find that in both political and academic discussions, following these conventions is held to be a sign of good sense and probity, while flouting them ranges from suspect to silly to evil. But there has never yet been any group of people with a monopoly on truth, and the inherent difficulty of everything we think about means that we are unlikely to find any such group anytime soon. I think most of my liberal friends are—perhaps ironically—quite conventional in how they think about political issues. Obviously conservatives and others can be as well.

Another virtue, vastly overrated today, is being "scientific." Of course, science is one of the greatest inventions of the modern mind, and it continues to produce amazing results. I am also myself deeply committed to the scientific method and empiricism in a broad sense. But it is an enormous mistake to think that the mere existence of a scientific consensus, especially in the soft sciences, means that one may simply accept what the consensus says is true. The strength of a scientific theory is not determined by a poll but by the quality of evidence. Yet the history of science is the history of dogmatic groups of scientists having their confidently held views corrected or entirely replaced. The problem is a social one; scientists want the respect of their peers and as a result are subject to groupthink. In an age of scientism this problem bleeds into the general nonscientific population, with dogmatists attempting to support their views by epistemically unquestionable (but often badly constructed and inadequate) "studies"; rejecting anyone's argument, regardless of how strong, if it is not presented with "scientific support"; and dismissing any non-scientist opining on a subject about which a scientist happens to have some opinion. As wonderful as science is, the fact is that we are far more ignorant than we are knowledgeable, even today, in 2017, and we would do well to remember that.

Here's another overrated virtue: incisiveness. Someone is incisive if he produces trenchant replies that allow his friends to laugh at the victims of his wit. Sometimes balloons need to be punctured, and sometimes there is nothing left of them once deflated—of course. But problems arise when glib wits attack more complex theories and narratives. It is easy to tear down and hard to build. Fundamentally, my issue is that we need to probe theories and narratives that are deeply rooted in facts and evidence, and simply throwing them on the scrap heap in ridicule means we do not fully learn what we can from the author's perspective. In philosophy, I'm often inclined to a kind of syncretistic approach which tips its hat to various competing theories that each seem to have their hands on different parts of the elephant. Even in politics, even if we have some very specific policy recommendation, much has been lost if we simply reject everything the other side says in the rough and tumble of debate.

I could go on, but I want to draw a conclusion here. When we debate and publish with a view to arriving at some well-established conclusions, we are as much performing for others as we are following anything remotely resembling an honest method for seeking the truth. We, with the enthusiastic support of our peers, are sometimes encouraged to think that we have the truth when we are still very far indeed from having demonstrated it. By contrast, sometimes we are shamed for considering certain things that we should feel entirely free to explore, because they do contain part of the truth. These social effects get in the way of the most efficient and genuine truth-seeking. The approach that can be contrasted with all of these problems is intellectual honesty. This entails, or requires, courageous individualism, humility, integrity, and faith or commitment to the cause of truth above ideology.

It's sad that it is so rare.

II. The dangers of avoiding humiliation.

The problem with most people laboring under error (I almost said "stupid people," but many of the people I have in mind are in fact very bright) is that, when they finally realize that they were in error, they can't handle the shame of knowing that they were in error, especially if they held their beliefs with any degree of conviction. Many people find error to be deeply humiliating. Remember the last time you insisted that a word meant one thing when it actually meant something else, when you cited some misremembered statistic, or when you thought you knew someone who turned out to be a stranger. It's no fun!

Hence we are strongly motivated to deny that we are, in fact, in error, which creates the necessity of various defenses. We overvalue supporting evidence ("Well, these studies say...") and undervalue disconfirming evidence ("Those studies must be flawed"). Sometimes we just make up evidence, convincing ourselves that we just somehow know things ("I have a hunch..."). We seek to discredit people who present us with disconfirming evidence, to avoid having to consider or respond to it ("Racist!").

In short, emotional and automatic processes lead us to avoid concluding that we are in error. Since we take conscious interest in defending our views, complex explanatory methods are deployed in the same effort. ("Faith is a virtue.") But these processes and methods, by which we defend our belief systems, militate in favor of further error and against accepting truth. ("Sure, maybe it sounds weird, but so does a lot of stuff in this field.") This is because propositions, whether true or false, tend to come in large clusters or systems that are mutually supporting. Like lies, if you support one, you find yourself committed to many more.

In this way, our desire to avoid the humiliation of error leads us into complex systems of confusion—and, occasionally, into patterns of thinking that can be called simply evil. ("The ends justify the means.") They're evil because the pride involved in supporting systematically wrong systems of thought drives people into patterns of defense that go beyond the merely psychological and into the abusive, the psychologically damaging, and the physical. ("We can't tolerate the intolerant!" "Enemy of the people." "Let him be anathema.")

What makes things worse is that we are not unique atoms each confronting a nonhuman universe, when we are coming to grips with our error. We are members of like-minded communities. We take comfort that others share our beliefs. This spreads out the responsibility for the error. ("So-and-so is so smart, and he believes this.") It is much easier to believe provably false things if many others do as well, and if they are engaged in the same processes and methods in defending themselves and, by extension, their school of thought.

This is how we systematically fail to understand each other. ("Bigot!" "Idiot!") This is why some people want to censor other people. ("Hate speech." "Bad influence.") This is how wars start.

Maybe, just maybe, bad epistemology is an essential cause of bad politics.

(I might be wrong about that.)

It's better to just allow yourself to be humiliated, and go where the truth leads. This is the nature of skepticism.

This, by the way, is why I became a philosopher and why I commend philosophy to you. The mission of philosophy is—for me, and I perhaps too dogmatically assert that it ought to be the mission for others—to systematically dismantle our systems of belief so that we may begin from a firmer foundation and accept only true beliefs.

This was what Socrates and Descartes knew and taught so brilliantly. Begin with what you know on a very firm foundation, things that you can see for yourself ("I know that here is a hand"), things that nobody denies ("Humans live on the surface of the earth"). And as you make inferences, as you inevitably will and must, learn the canons of logic and method so that you can correctly apportion your strength of belief to the strength of the evidence.

There is no way to do all this without frequently practicing philosophy and frequently saying, "This might or might not support my views; I don't know." If you avoid the deeper questions, you are ipso facto being dogmatic and, therefore, subject to the patterns of error described above.

On the Purposes of the Internet

February 28, 2009
Monterrey, Mexico


I am going to begin by asking a philosophical question about the Internet. But I can hear some of you saying, “Philosophy? What does that have to do with the Internet? Maybe I will have a siesta.” Well, before you close your eyes, let me assure you that the question is deeply important to some recent debates about the future of the Internet.

The question is: what is the purpose of the Internet? What is the Internet good for? Perhaps you had never thought that something as vast and diverse as the Internet might have a single purpose. In fact, I am going to argue that it has at least two main purposes.

To begin with, think about what the Internet is: a giant global information network. To ask what the Internet is for is about the same as asking what makes information valuable to us, and what basic reasons there might be for networking computers and their information together.


The two purposes of the Internet: communication and information

I think the Internet has at least two main purposes: first, communication and socialization, and second, finding the information we need in order to learn and to live our daily lives. In short, the Internet is for both communication and information.

Let me explain this in a simple way. On the one hand, we use the Internet for e-mail, for online forum discussions, for putting our personalities out there on social networking sites, and for sharing our personal creativity. These are all ways we have of communicating and socializing with others.

On the other hand, we are constantly looking things up on the Internet. We might check a news website, look up the meaning of a word in an online dictionary, or do some background reading on a topic in Wikipedia. These are all ways of finding information.

I want to explain an important difference between communication and information. Communication is, we might say, creator-oriented. It’s all about you, your personal needs and circumstances, and your need for engagement and recognition. So communication is essentially about the people who are doing the communicating. If we have no interest in some people, we probably have no interest in their communications. This is why, for example, I have zero interest in most MySpace pages. Almost nobody I know uses MySpace. MySpace is mainly about communication and socialization, and since I’m not actually communicating or socializing with anybody on that website, I don’t care about it.

Information, on the other hand, is not about the person giving the information but about the contents of the information. In a certain way, it really does not matter who gives the information; all that matters is that the information is valid and is of interest to me. And the same information might be just as interesting to another person. So, we might say, communication is essentially personal, and information is essentially impersonal.

I say, then, that the Internet’s purposes are communication and information. In fact, the Internet has famously revolutionized both.

The Internet is addictive largely because it gives us so many more people to talk to, and we can talk to them so efficiently. It allows us to compare our opinions with others’, to get feedback about our own thinking and creative work. In some ways, the Internet does this more efficiently than face-to-face conversation. If we are interested in a specific topic, we do not need to find a friend or a colleague who is interested in the topic; we just join a group online that has huge numbers of people already interested, and ready to talk about the topic endlessly.

Online discussions of serious topics are often a simplistic review of research, with a lot of confused amateur speculation thrown in. We could, if we wanted to, simply read the research—go to the source material. But often we don't. We often prefer to debate our own opinions, even when we have the modesty to admit that our opinions aren't worth very much. Many people prefer active discussion over passive absorption. Who can blame them? You can't talk back to a scientific paper, and a scientific paper can't respond intelligently to your own thoughts. The testing or evaluation of our own beliefs is ultimately what interests us, and this is what we human beings use conversation to do.

But the Internet is also wonderfully efficient at delivering impersonal information. Search engines like Google make information findable with an efficiency we have never seen before. You can now get fairly trustworthy answers to trivial factual questions in seconds. With a little more time and skilled digging, you can get at least plausible answers to many more complex questions online. The Internet has become one of the greatest tools for both research and education that has ever been devised by human beings.

So far I doubt I have told you anything you didn’t already know. But I am not here to say how great the Internet is. I wanted simply to illustrate that the Internet does have these two purposes, and that the purposes are different—they are distinguishable.

How the Internet confuses communication and information

Next, let me introduce a certain problem. It might sound at first like a purely conceptual, abstract, philosophical problem, but let me assure you that it is actually a practical problem.

The problem is that, as purposes, communication and information are inherently confusable. They are very easy to mix up. In fact, I am sure some of you were confused earlier, when I was saying that there are these two purposes, communication and information. Aren’t those just the same thing, or two aspects of the same thing? After all, when people record information, they obviously intend to communicate something to other people. And when people communicate, they must convey some information. So information and communication go hand-in-hand.

Well, that is true, they do. But that doesn’t mean that one can’t draw a useful distinction fairly clearly. Here’s a way to think about the distinction. In 1950, a researcher would walk into a library and read volumes of information. If you wanted to communicate with someone, you might walk up to a librarian and ask a question. These actions—reading and talking—were very different. Information was something formal, edited, static, and contained in books. Communication was informal, unmediated, dynamic, and occurred in face-to-face conversation.

Still, I have to agree that communication and information are indeed very easy to confuse. And the Internet in particular confuses them deeply. What gives rise to the confusion is this. On the Internet, if you have a conversation, your communication becomes information for others. It is often saved indefinitely, and made searchable, so that others can benefit from it. What was for you a personal transaction becomes, for others, an information resource. This happens on mailing lists and Web forums. I myself have searched through the public archives of some mailing lists for answers to very specialized questions. I was using other people’s discussions as an information resource. So, should we say that a mailing list archive is communication, or is it information? Well, it is both.

This illustrates how the Internet confuses communication and information, but many other examples can be given. The Blogosphere has confused journalism, which used to be strictly an information function, with sharing with friends, which is a communication function. When you write a commentary about the news, or when you report about something you saw at a conference, you’re behaving like a journalist. You invite anyone and everyone to benefit from your news and opinion. Perhaps you don’t initially care who your readers are. But when you write about other blog posts, other people write about yours, and you invite comments on your blog, you’re communicating. Personalities then begin to matter, and who is talking can become more important to us than what is said. Information, as it were, begins to take a back seat.

Moreover, when news websites allow commenting on stories, this transforms what was once a relatively impersonal information resource into a lively discussion, full of colorful personalities. And, of course, online newspapers have added blogs of their own. I have often wondered whether there is a meaningful difference between a newspaper story, a blog by a journalist, and a well-written blog written by a non-journalist. That precisely illustrates what I mean. The Internet breaks down the distinction between information and communication—in this case, the distinction between journalism and conversation.

Why is the distinction between communication and information important?

I’ll explore more examples later, but now I want to return to my main argument. I say that the communication and information purposes of the Internet have become mixed up.

But—you might wonder—why is it so important that we distinguish communication and information, and treat them differently, as I’m suggesting? Is having a conversation about free trade, for example, really all that different from reading a news article online about free trade? To anyone who writes about the topic online, they certainly feel similar. The journalist seems like just another participant in a big conversation, and you are receiving his communication, and you could reply online if you wanted to.

I think the difference between information and communication is important because they have different purposes and therefore different standards of value. When we communicate, we want to interface with other living, active minds and dynamic personalities. The aim of communication, whatever else we might say about it, is genuine, beneficial engagement with other human beings. Communication in this sense is essential to such good things as socialization, friendship, romance, and business. That, of course, is why it is so popular.

Consider this: successful communication doesn’t have to be particularly informative. I can just use a smiley face or say “I totally agree!” and I might have added something to a conversation. By contrast, finding good information does not mean a significant communication between individuals has taken place. When we seek information, we are not trying to build a relationship. Rather, we want knowledge. The aim of information-seeking is reliable, relevant knowledge. This is associated with learning, scholarship, and simply keeping up with the latest developments in the news or in your field.

Good communication is very different from good information. Online communication is free and easy. There are rarely any editors to check every word you write before you post it. That is not necessary, because these websites are not about creating information; they are about friendly, or at least interesting, communication. No editors are needed for that.

These communities, and blogs, and much else online, produce a huge amount of searchable content. But a lot of this content isn’t very useful as information. Indeed, it is very popular to complain about the low quality of information on the Internet. The Internet is full of junk, we say. But to say that the Internet is full of junk is to say that most conversations are completely useless to most other people. That’s obviously true, but it is irrelevant. Those who complain that the Internet is full of junk are ignoring the fact that the purpose of the Internet is as much communication as it is information.

Personally, I have no objection whatsoever to the communicative function of the Internet. In fact, it is one of my favorite things about the Internet. I have had fascinating conversations with people from around the world, made online friendships, and cultivated interests I share with others, and I could not possibly have done all this without the communicative medium that is the Internet.

But, as I will argue next, in making communication so convenient, we have made the Internet much less convenient as an information resource.

Communicative signal is informational noise

You are probably familiar with how the concept of the signal-to-noise ratio has been used to talk about the quality of online information and communication. A clear radio transmission is one that has high signal and low noise. Well, I’d like to propose that the Internet’s two purposes are like two signals: the communication signal and the information signal. The problem is that the two signals are sharing the same channel. So I now come to perhaps the most important point of this paper, which I will sum up in a slogan: communicative signal is informational noise. That is at least often the case.

Let me explain. The Internet’s two purposes are not merely confusable. In fact, we might say that the communicative function of the Internet has deeply changed and interfered with the informative function of the Internet. The Internet has become so vigorously communicative that it has become more difficult to get reliable and relevant information on the Internet.

I must admit that this claim is still very vague, and it might seem implausible, so let me clarify and support the claim further.

The basic idea is that what works well as communication does not work so well as information. What might seem to be weird and frustrating as information starts to make perfect sense when we think of it as communication.

Let me take a few examples—to begin with, Digg. In case you’re not familiar with it, it’s a website on which people submit links for everyone else in the community to rate by a simple “thumbs up” or “thumbs down.” This description makes it look like a straightforward information resource: here are Internet pages that many people find interesting, useful, amusing, or whatever. Anyone can create an account, and all votes are worth the same. It’s the wisdom of the crowd at work. That, I assume, is the methodology behind the website.

But only the most naïve would actually say that the news item that gets the most “Diggs” is the most important, most interesting, or most worthwhile. Being at the top of Digg means only one thing: popularity among Digg participants. I am sure most Digg users know that the front page of Digg is little more than the outcome of an elaborate game. It can be interesting, to be sure. But the point is that Digg is essentially a tool for communication and socialization masquerading as an information resource.
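To see how little such a scheme actually measures, here is a minimal sketch of the one-account-one-vote ranking just described. Digg's real ranking code was never public, and the submissions below are invented, so treat this only as an illustration of the principle:

```python
# A toy model of equal-vote, anyone-can-vote ranking of submitted links.
# Digg's actual ranking code was never public; the link names below are
# invented purely for illustration.

from collections import defaultdict

votes = defaultdict(int)  # net "thumbs up" minus "thumbs down" per link

def vote(link, thumbs_up):
    """Anyone can vote, and every vote counts the same: +1 or -1."""
    votes[link] += 1 if thumbs_up else -1

vote("funny-cat-video", True)
vote("funny-cat-video", True)
vote("funny-cat-video", True)
vote("obscure-but-excellent-analysis", True)
vote("obscure-but-excellent-analysis", False)

# The "front page" is simply the most popular submissions, nothing more.
front_page = sorted(votes, key=votes.get, reverse=True)
print(front_page)  # ['funny-cat-video', 'obscure-but-excellent-analysis']
```

Nothing in that computation speaks to importance, reliability, or relevance; the ordering that falls out of it is pure popularity.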

YouTube is another example. On its face, it looks like a broadcast medium. Because it allows anyone to have a YouTube account, carefully records the number of video views, and gives everyone an equal vote, it looks as if the wisdom of the crowd is harnessed. But the fact of the matter is that YouTube is mainly a communication medium. Its ratings represent little more than popularity, or the ability to play the YouTube game. When people make their own videos (as opposed to copying stuff from DVDs), the videos are frequently conversational. They are trying to provoke thought, or get a laugh, or earn praise for their latest song. They want others to respond, and others do respond, by watching videos, rating videos, and leaving comments. I suspect that YouTube contributors are not interested, first and foremost, in building a useful resource for the world in general. They are glad, I am sure, that they are doing that too. But what YouTube contributors want above all is to be highly watched and highly rated—in short, to be a success within the YouTube community. This is evidence that they have been heard and understood—in short, that they have communicated successfully.

I could add examples, but I think you probably already believe that most of the best-known Web 2.0 websites are set up as media of communication and socialization—not primarily as impersonal information sources.

But what about Wikipedia and Google Search? These are two of the most-used websites online, and they seem to be more strictly information resources.

Well, yes and no. Even Wikipedia breaks down the difference between a communication medium and an information resource. There has been a debate, going back to the very first year of Wikipedia, about whether Wikipedia is first and foremost a content-production project or a community. You might want to say that it is both, of course. That is true, but the relevant question is whether Wikipedia’s requirements as a community are actually more or less important than its requirements as a project. For example, one might look at many Wikipedia articles and say, “These badly need the attention of a professional editor.” One might look at Wikipedia’s many libel scandals and say, “This community needs real people, not anonymous administrators, to take responsibility so that rules can be enforced.” Wikipedia’s answer to that is to say, “We are all editors. No expert or professional is going to be given any special rights. That is the nature of our community, and we are not going to change it.” The needs of Wikipedia’s community outweigh the common-sense requirements of Wikipedia as an information resource.

Please don’t misunderstand. I am not saying that Wikipedia is useless as an information resource. Of course it is extremely useful as an information resource. I am also not saying that it is merely a medium of collaborative communication. It clearly is very informational, and it is intended to be, as well.

Indeed, most users treat Wikipedia first and foremost as an information resource. But, and this is my point, for the Wikipedians themselves, it is much more than that: it is their collaborative communication, which has become extremely personal for them, and this is communication they care passionately about. The personal requirements of the Wikipedians have dampened much of the support for policy changes that would make Wikipedia much more valuable as an information resource.

Why do we settle for so much informational noise?

Let me step back and try to understand what is going on here. I say that Web 2.0 communities masquerade as information resources, but they are really little more than tools for communication and socialization. Or, in the case of Wikipedia, the community’s requirements overrule common-sense informational requirements. So, why do we allow this to happen?

Well, that’s very simple. People deeply enjoy and appreciate the fact that they can share their thoughts and productions without the intermediation of editors or anything else that might make their resources more useful as information resources. And why is it so important to so many people that there be no editors? Because editors are irrelevant and get in the way of communication.

The fact that Web 2.0 communities are set up for communication, more than as information resources, explains why they have adopted a certain set of policies. Consider some policies that Wikipedia, YouTube, MySpace, and the many smaller Web 2.0 websites have in common.

First, on these websites, anyone can participate anonymously. Not only that, but you can make as many accounts as you want. Second, when submissions are rated, anyone can vote, and votes are (at least initially, and in many systems always) counted equally. Third, if there is any authority or special rights in the system, it is always internally determined. Your authority to do something or other never depends on some external credentials or qualification. University degrees, for example, are worth nothing on YouTube.

The result is that, on a website like Wikipedia, a person is associated with one or more accounts, and the performance of the accounts against all other accounts is all that the system really cares about.

To Internet community participants, this seems very rational. A person is judged based on his words and creations alone, and on his behavior within the system. This seems meritocratic. People also sometimes persuade themselves, based on a misinterpretation of James Surowiecki’s book The Wisdom of Crowds, that ratings are an excellent indicator of quality.

But these systems are not especially meritocratic. It is not quality, but instead popularity and the ability to game the system, that wins success in Web 2.0 communities. High ratings and high watch counts are obviously not excellent indicators of quality, for the simple reason that so much garbage rises to the top. There is no mystery why there is so much time-wasting content on the front page of YouTube, Digg, and many of the rest: it’s because the content is amusing, titillating, or outrageous. Being amusing, titillating, and outrageous is not a standard of good information, but it can be a sign of successful communication.

The less naïve participants, and of course the owners of these websites, know that Internet community ratings are largely a popularity contest or measure the ability to play the game. They don’t especially care that the websites do not highlight or highly rank the most important, relevant, or reliable information. The reason for this is perfectly clear: the purpose of these websites is, first and foremost, communication, socialization, and community-building. Building an information resource is just a very attractive side-benefit, but still only a side-benefit, of the main event of playing the game.

The attraction, in fact, is very similar to that of American Idol—I understand you have something similar called “Latin American Idol,” is that correct? Well, I have been known to watch American Idol. It is a television competition in which ordinary people compete to become the next Idol, who earns a record contract, not to mention the attention of tens of millions of television viewers. The singing on American Idol, especially in the early weeks, is often quite bad. But that is part of its entertainment value. We do not watch the program to be entertained with great singing—that is, of course, nice when it happens. Instead, we watch the program mainly because the drama of the competition is fascinating. Even though the quality of the singing is supposed to be what the program is about, in fact quality is secondary. The program’s attraction stems from the human element—from the fact that real people are putting themselves in front of a mass audience, and the audience can respond by voting for their favorites. The whole game is quite addictive, in a way not unlike the way Internet communities are addictive.

But let’s get back to the Internet. I want to suggest that the information resource most used online, Google Search itself, is also a popularity contest. Google’s PageRank technology is reputed to be very complex, and its details are secret. But the baseline methodology is well-known: Google ranks a web page more highly if it is linked to by other pages, which are themselves linked to by popular pages, and so forth. The assumption behind this ranking algorithm is somewhat plausible: the more that popular websites link to a given website, the more relevant and high-quality the website probably is. The fact that Google is as useful and dominant as it is shows that there is some validity to this assumption.
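Since the baseline methodology is easy to state, here is a minimal sketch of that kind of link-based ranking: the textbook power-iteration formulation of the idea, applied to an invented three-page graph. Google's actual algorithm is, as I said, secret and far more complex, so this is only an illustration of the principle:

```python
# A textbook power-iteration sketch of link-based ranking, in the spirit
# of PageRank. This is NOT Google's actual (secret) algorithm; the tiny
# link graph below is invented purely for illustration.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start everyone equal

    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # A page with no outlinks shares its rank with everyone.
                for other in pages:
                    new_rank[other] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# C, which both other pages link to, ends up ranked highest, regardless
# of what C's page actually says.
demo = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
for page, score in sorted(pagerank(demo).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```

Notice that nothing in this computation ever looks at the content of a page: rank flows entirely from who links to whom, which is exactly why popularity and link-gaming can crowd out quality.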

All that admitted, I want to make a simple point. Google Search is essentially a popularity contest, and frequently, the best and most relevant page is not even close to being a popular page. That is a straightforward failure. But just as annoying, perhaps, is the prevalence of false positives. I mean the pages that rank not because they are relevant or high-quality, but because they are popular or (even worse) because someone knows how to game the Google system.

Does this sound familiar? It should. I do not claim that Google is a medium of communication. Clearly, it is an information resource. But I want to point out that Google follows the same policies of anonymity, egalitarianism, and merit determined internally, through links and algorithms that machines can process. As far as we know, Google does not seed its rankings with data from experts. Its data is rarely edited at all. Google dutifully spiders all content without any prejudice of any sort, applies its algorithm, and delivers the results to us very efficiently.

I speculate—I can only speculate here—that Google does not edit its results much, for two reasons. First, I am sure that Google is deeply devoted to the same values, values that favor a fair playing field for the communication games that many Web 2.0 websites play. But, you might say, this is a little puzzling. Why doesn’t Google seek out ways to include the services of editors and experts, and improve its results? An even better idea, actually, would be to allow everyone to rate whatever websites they want, then publish their web ratings according to a standard syndication format, and then Google might use ratings from millions of people creatively to seed its results. In fairness to Google, it may do just this with the Google SearchWiki, which was launched last November. But as far as I know, SearchWiki does not aggregate search results; each individual can edit only the results that are displayed to that user.

So there is, I think, a second and more obvious reason that Google does not adjust its results with the help of editors or by aggregating syndicated ratings. Namely, its current, apparently impersonal search algorithm seems fair, and it is easy to sell it as fair. However much Google might be criticized because its results are not always the best, or because the results are gamable or influenced by blogs, at least it has the reputation of indeed being mostly fair, largely because PageRank is determined by features internal to the Internet itself—in other words, link data.

Google’s reputation for fairness is one of its most important assets. But why is such a reputation so important? Here I can finally return to the thread of my argument. Fairness is important to us because we want communication to be fair. In a certain way, the entire Internet is a communicative game. Eyeballs are the prize, and Google plays a sort of moderator or referee of the game. If that’s right, then we certainly want the referee to be fair, not to prefer one website over another simply because, for example, some expert happens to say the one is better. When it comes to conversations, fairness means equal consideration, equal time, an equal shot at impressing everyone in the room, so to speak. Communication per se is not the sort of thing over which editors should have any control, except sometimes to keep people polite.

The fact that Google has an impersonal search algorithm really means that it conceives of itself as a fair moderator of communication, not as a careful chooser of relevant, reliable content. And a lot of people are perfectly happy with this state of affairs.


In this paper I have developed an argument, and I hope I haven’t taken too long to explain it. I have argued that the Internet is devoted both to communication and information. I went on to say that communication and information are easily confused, and the Internet makes it even easier to confuse them, since what serves as mere communication for one person can be viewed later as useful information for another person. But what makes matters difficult is that we expect communication, and the websites that support online communication, to be as unconstrained and egalitarian as possible. As a result, however, the Internet serves rather well as a communication medium, as a means to socialize and build communities, but not nearly as well as an information resource.

I can imagine a reply to this, which would say: this is all a good thing. Information is about control. Communication is about freedom. Viva communication! Should our alleged betters—professors, top-ranked journalists, research foundations, and the like—enjoy more control over what we all see online, than the average person? The fact is that in the past, they have enjoyed such control. But the egalitarian policies of the Internet have largely removed their control. In the past, what those experts and editors have happened to say enjoyed a sort of status as impersonal information. But all information is personal. The Internet merely recognizes this fact when it treats allegedly impersonal information as personal communication.

This is the common analysis. But I think it is completely wrong.[1] First, the elites still exert control in many ways, and there is little reason to think the Internet will change this. Second, the radical egalitarianism of Internet policies does not disempower the elites so much as it disempowers intelligence, and empowers those with the time on their hands to create and enjoy popular opinion, and also those who care enough to game the system.

If more people were to emphasize the informative purpose of the Internet, this would not empower elites; it would, rather, empower everyone who uses the Internet to learn and do research. We would have to spend less time sorting through the by-products of online communication, and could spend more time getting solid knowledge.

In fact, I think most people enjoy the Internet greatly as an information resource—at least as much as they enjoy it as a communication medium. But most of the people who create websites and Internet standards—the many people responsible for today’s Internet—have not had this distinction in mind. I think it is a very fruitful and interesting way to think about the Internet and its purposes, and—who knows?—perhaps it will inspire someone to think about how to improve the informational features of the Internet.

In fact, if my fondest hope for this paper were to come true, it would be that those building the Internet would begin to think of it a little bit more as a serious information resource, and a little bit less as just a fun medium of communication.

[1] As I have argued in a recent paper: “The Future of Expertise after Wikipedia,” Episteme (2009).

Why study higher mathematics and other stuff most people don't use in everyday life?

This video was posted in a Facebook group of mine here:

I find it ironic that some of the most listened-to speakers about education explain that the cure to our educational ills is to point out that education is unnecessary. I call this educational anti-intellectualism. Here's another representative sample and another.

It is possible to make the argument, "X isn't going to be necessary for most students in life, therefore X should not be taught," for almost everything that is taught beyond the sixth grade or so. After that, we should be taught "critical thinking" and vague "analytical abilities" and "reading comprehension" and other such claptrap; that seems to be the natural consequence of this commentator's thinking, and sadly, he is not alone.

The fact that educated people like this teacher, and all the people who approve of this stuff, cannot answer the question is very disappointing. It's not surprising, perhaps, because it's philosophy and philosophy is very hard. Moreover, there are a variety of sort-of-right answers that subtly get things wrong and might end up doing more damage than good.

In the latter category I might want to place E.D. Hirsch, Jr., one of the most prominent education traditionalists alive. (He just published a book I got today called Why Knowledge Matters, and he might have updated his views on this; I'll find out soon.) Hirsch's argument is that we ought to learn classics and, essentially, get a liberal arts education, because this is the knowledge we use to interact with other educated adults in our culture. It is "cultural literacy" and "cultural capital" and this is something we desperately need to thrive as individuals and as a civilization.

That's all true, I think. If Hirsch made the argument as, essentially, a defense of Western (or just advanced) civilization—that we need to educate people in Western civilization if we are to perpetuate it—then I'd be fully on board. But Hirsch, as I understand him, appeals particularly to our individual desire to be a part of the elite, to get ahead, to be able to lord it over our less-educated citizens. This is a very bad argument that won't convince many people. If Hirsch or anyone makes it, I would put it in the category of arguing for the right conclusion for the wrong reason.

The argument I'd give to this math teacher is the same I'd give to someone who says we shouldn't memorize history facts or read boring, classic literature or learn the details of science or what have you. Of course you don't need that stuff to get through life. Most people are as dumb as a box of rocks when it comes to academic stuff (yes, in all countries; some are worse than others).

The reason you get an education, and study stuff like higher math, is more along the following lines. Education trains the mind and thereby liberates us from natural prejudice and stupidity. This is the proper work for human beings because we are rational creatures. We are honing the tool that comes more naturally to us than to any other animal. One must realize, as people like this educated fool and so many others seem not to, that education, such as math education, is not merely a tool in the sense of "abilities." The content, or what is known, is a deeply important part of the tool; in fact, as Hirsch does argue correctly and convincingly, any "analytical abilities" brought to a text will be very poor without relevant subject knowledge. If you want an analogy: it is a poor argument to say that a course in logic sharpens your wits, that you want sharp wits, and that therefore you should study "critical thinking"; the heft or substance of your wit's ax is all the rest of the knowledge behind the cutting edge. Getting an A in a logic class (a course I taught many times) without knowledge of math, science, history, literature, etc., gives you about as much heft and effectiveness as a sharp-edged piece of paper: capable of paper-cuts.

The core of the argument for knowledge is that academic knowledge forms a sort of deeply interconnected system, and the more deeply and broadly that we understand this system, the more capable we are in every bit of life. This is true of us as individuals and also as a society or civilization. It is completely and literally true that the fantastic structure of modern civilization as we know it, all of the historically unprecedented developments we have seen, is a direct outgrowth of the deep commitment of our society's leaders—since the Enlightenment—to education in this system.

The system I refer to is deeply connected, but that doesn't mean it isn't also loosely connected in the sense that one can learn bits here and there and benefit somewhat. That's absolutely true. This is why it's possible for the math teacher to say, "Well, you don't really need to know higher math in order to live life." Some people are geniuses about literature but don't remember anything about any math they learned beyond the sixth grade.

But as everybody with higher education knows, in fact it is absolutely necessary to learn higher math if you are going to learn higher science—both the hard sciences and the social sciences, both of which require heavy calculation—and deal intelligently with statistics and probabilities, as is necessary in politics, or the financial side of business, or parts of programming, etc.

This is because the "deep structure" of reality is mathematical. To declare that "you don't really need to know it" is to declare that you don't need to know the deep structure of reality. Sure, of course you don't. The birds of the air and the fish of the sea don't. But do you want our children to be more like them or more like fully rational, aware, human creatures?

Against language arts and social studies textbooks

Here's a little argument against language arts and social studies (e.g., history and geography) textbooks. We need to get rid of them. Period.

Prima facie, we don't need textbooks to teach a subject. Other pedagogical methods include chapter (trade, library) books, short readings, computer software, videos, lectures, worksheets, projects, etc. So what are textbooks for?

Well, consider what they are: Textbooks are systematic, book-length presentations of information for purposes of introducing students to a subject, systematically covering every aspect at some level. All information that is needed is presented. Modern texts include supplementary media, not just photos and charts but also, for example, videos and interactive widgets. Texts often have accompanying exercises and workbooks. In short, a modern textbook system is an end-to-end multimedia introduction to a subject at a certain level.

Textbooks make perfect sense for certain subjects, including—especially—math, science, foreign language, grammar, and programming. These subjects are suitable for textbook presentations because it is deeply important, first, that students of those subjects learn certain topics adequately before moving on to other topics and, second, that all the basic topics be covered in adequate depth. The textbook method lends itself very nicely to both requirements. First, textbook readings, accompanying media, and exercises all structure information in a logical fashion so that the more fundamental information is mastered before moving on to the more derivative information. Second, textbooks marshal all the relevant information within chapters, and can cover the whole subject by simply making the book longer.

Most textbooks are so darned meaty and substantial-looking, it seems hard to argue against them, especially if you are someone—like me—who believes that absorbing a lot of knowledge is what school is primarily about. But actually, it's easier than it might look. You see, there are excellent reasons why certain subjects lend themselves to textbook presentation, while others do not.

There are a couple of very good reasons why math, science, foreign language, grammar, and programming lend themselves to textbook presentation. It is because the information in these fields lends itself to a logical, bottom-up structuring. You cannot learn certain things about math—and upper division science, and foreign language, and advanced grammar, and programming—before you have mastered certain other things. You'd better not tackle the subjunctive in Latin before mastering the indicative, or division before multiplication, or subordinate clauses in English grammar before adjectives and prepositions. Moreover, at a given level of mastery, we can agree that certain topics must be included, or the method is simply incomplete. If you have learned Latin noun declensions but not verb conjugations, you haven't learned Latin. If you have learned about processing loops but not about data storage, you haven't learned programming. If you haven't learned the Circle of Fifths, crack open that music theory book again. Textbooks seem necessary because they help guide the student (and the teacher!) so that the information is presented in the right order, and all of it (at a certain level) is presented.

Assuming there's a real phenomenon here, we may for shorthand refer to math, science, foreign language, grammar, and programming as structured subjects. (A couple other structured subjects are music theory and economics.) And we may also, for shorthand, refer to the logical dependency of one topic on another within structured subjects as their foundational structure, and to the tendency of certain topics to be needed for a complete presentation of a subject as the subject's completeness.

In short, then, my proposal is that textbooks are particularly useful for structured subjects, because such subjects exhibit a foundational structure and completeness (within a level of mastery), and the textbook approach can (if well executed) elegantly mirror the foundational structure and completeness of those subjects. All well and good. You don't have to use just a textbook, but I won't argue with you much if you do.

I now come to my point:

Subjects that do not exhibit foundational structure or completeness are very bad candidates for textbooks (dammit!).

Such subjects include:

Science at the elementary level. It manifestly does not matter what order you teach little kids science in or how much of each subject they learn (as long as they learn certain basics before they get to more advanced science).

Reading and writing. There is nothing less structured than literature. There is nothing less foundational or complete than writing. These are not bodies of knowledge to master. Literature is made up of narratives and great language to come to grips with, not logical structures. And reading and writing are both skills to practice, not to study in the systematic way one studies math or foreign language. Literature does not exhibit completeness. It does not matter whether you read certain books, although I think a good education will be heavy on the classics. "Reading comprehension," spelling and vocabulary exercises, integrated grammar, directed writing, and all the other claptrap that makes up a modern "English Language Arts" textbook-based program—it all positively obscures the beauty and appreciation of actual literature. It is decidedly not required. The only thing that is really required, I think, is copious reading and writing. All that textbook drivel is much more effectively and efficiently learned simply by reading and occasionally discussing great books, and writing copiously about anything that strikes your fancy (and sometimes about what you read) and getting occasional feedback on your work.

History. Now, it is true that history exhibits a kind of completeness; to be fully educated you have to have some exposure to, say, Roman history and the Renaissance and (in this country) the War of Independence. But it does not—not really—have any foundational structure. It doesn't matter what order you go in, or in what depth you cover various subjects. Again, I think that the more of it you cover in considerable depth, the better—but history is, in short, pretty much the opposite of a structured subject.

Geography. Same analysis as history. You'll want to cover certain basic topics for sure, but what order you go in, how much depth you go into, etc., it's all arbitrary.

Many other subjects are not structured subjects, either, including the rest of the topics that go under the heading "social studies" in U.S. schools, art history, art and music appreciation, general computer literacy, etc.

Textbooks and textbook programs are, at best, necessary evils. Why? Because, especially for children, they are boring, unmotivating, and therefore less efficient than reading real books and other methods of teaching. Why? Let's see:

A single source. You read all year from one source, who or (worse!) which has one style, however brilliant, one point of view or bias, etc. That gets old before too long.

Human brains, while capable of great rationality, enjoy randomness. I am one of the biggest rationalists (depending on the sense of "rationalism" you mean) you'll find. But learning minds, especially young ones, love to leap from topic to topic. If you want to keep a student's motivation up, you have to change things up.

Texts are totalitarian. Of course I'm being facetious, but I do have a point. Because texts are careful, orderly, and complete, students are forced to study certain things in a certain order. This is necessary (to some extent) for structured subjects, especially as one gets into the higher and more technical aspects of a subject. It is decidedly not necessary for unstructured subjects.

Texts are often badly written, by committee. Enough people have complained about this that I don't have to.

Various educational practices delight or irritate me to various extents, but a special place in my personal hell is reserved for the practice of inflicting lame language arts texts on students through the eighth grade. In addition to turning off generations of school kids to reading and leaving them poorly prepared in their own language, the worst thing about such textbooks is the opportunity cost. Ironically, too much time is spent reading about the reading, doing busywork exercises, and studying for and taking exams whose point is to make sure one has understood everything taught so far. That all seriously cuts into the time spent actually, you know, reading something worthwhile.

I wish I could hear back from some language arts teacher or curriculum designer. Explain this to me, please. Let's suppose your poor students spend, in and out of class, 100 hours reading your groan-inducing textbook (sorry, but that really is how I feel) in a school year. At an average of, let's say, 6 hours per book (faster readers might finish them faster), those students could read about 17 great children's books. So, do you really think reading your textbook all year long will teach and engage your students better than 17 shorter, more interesting chapter books?

The problem with history texts is different. History becomes seriously interesting only when one studies the narratives that make it up in some depth. Textbooks consist of, basically, a series of Cliffs Notes versions of historical narratives, cut so short as to be incomprehensible. Students should be reading chapter books and long meaty history books—not textbooks—in order really to appreciate and get something out of history. The whole idea, after all, is supposed to be to understand how human nature and society operates through the study of examples. If you don't study the examples closely enough, if you're just memorizing names and dates willy-nilly, you'll both forget them and fail to appreciate the purpose of the subject.

The notorious co-founder of Wikipedia interviews the notorious co-founder of Genius

I am spending a few days with the energetic and charming young crew of Everipedia at their offices in sunny L.A. I got to know Everipedia through Mahbod Moghadam, the 30-something but youthful and “thug” (this, apparently, is a good thing) co-founder of Genius, whom I met last year when I was still working on Infobitt (which, alas, is still in mothballs). Mahbod is not the CEO but is certainly one of the leading lights of this approximately one-year-old company; he and the other guys are very friendly, easygoing, smart, and hard-working, as far as I can tell. Anyway, Mahbod likes to be interviewed, and he is a “character,” so I thought it would be fun to interview him. After all, people have interviewed me a lot, but I can’t remember ever interviewing anybody else. So the tables are turned! For this blog’s very first interview, here is Mr. Moghadam. This will be a fairly wide-ranging email interview, so here goes.

Everipedia is the project you’re now working on. What exactly is the vision, at present, behind Everipedia? What are you trying to achieve?
Everipedia is the baby of Sam Kazemian and Tedde Forselius — they are my sons. It is, in short, a better version of Wikipedia. There are lots of differences, but the biggest one is that you can make a page about ANYTHING. I’ve had a Wikipedia page written about me before — several times — and Wikipedia kept taking it down! It was heartbreaking, especially because it has always been my dream to have a Wikipedia page about me. I’m sure there are millions of people who feel this way. Sam showed me my Everipedia page when I was giving a talk at UCLA — I was over the moon! I went home and immediately started making pages for all of my friends, my friends’ companies…everything I think is cool! Adding pages on Everipedia is really easy — it’s like posting on Facebook. No complexities or weirdo markup language like Wikipedia.

You say you want Everipedia to be the encyclopedia of everything, covering not just the topics in Wikipedia, not just the topics snootily deemed not important enough to include, but topics far, far outside the mainstream of what is considered “encyclopedic.” Things like: Every person in the world (including me and you!). Every street in the U.S. All the products currently for sale. All the species in the world. Every chemical compound (!). Every gene (!!). Every episode of every lame TV show. Every website (!!!). Etc. First of all, are you frigging insane?
I think it’s insane to have a strict notability requirement! The cool thing about the Internet is there is so much bandwidth — everyone can have their piece. Even if you are a shitty photographer, you can have an Instagram. Even the WORST rappers annotate their lyrics on Rap Genius. (TRUST ME) So why shouldn’t everyone have a Wiki?

OK, setting aside issues about feasibility, maintainability, etc., there’s a more basic question: Why is it important to have an encyclopedia of everything? Aren’t you basically just trying to replicate the Internet, or what eventually will be on the Internet?
Yeeee! One of our nicknames for Everipedia is “Crowdsourced Google” — the same way that Google gives you information about any subject, we want Everipedia to give you the info, except humans are doing the sorting, summarizing and rating of the sources instead of a machine.

Right now the site actually reminds me no small amount of the early days of Wikipedia — same youthful enthusiasm, same friendly welcoming atmosphere, same lack of f’s given if someone starts work on a topic with a very lame article. But Wikipedia sort of grew up (not entirely) and became huge, with long, meaty articles. How are you going to “get from here to there” and avoid burnout or seeming irrelevant?
Hopefully we can steal a lot of users from Wikipedia! On Everipedia you get IQ for your contributions. Contributors get credit and recognition for their accomplishments, they are not simply working in a void. College students can be appointed “Everipedia Campus Representative” if they earn it, and celebrities can contribute via Verified Accounts. Wikipedia won’t even let Snoop Dogg contribute to his own page! That ain’t right…on Everipedia, Snoop can even cite himself as a source! Not to mention anyone can cite his Instagram posts, hit tweets…anything that has cool information.

Why should somebody work on Everipedia when they can work on Wikipedia and have a better chance of having their words read by people on the #7 website in the world?
Because on Everipedia you get rewarded for your work. On Wikipedia, you get no recognition, contributions are pretty much anonymous. Maybe that appeals to some people — but I know, personally, I would never want to spend time working on something without getting credit for it. I think I’m a very good writer, and I want to be recognized for my work. I’m sure there are a lot of talented writers who feel the same way I do!

You have sometimes called Everipedia the “Thug Wikipedia.” Come on, dude, isn’t “Thug Wikipedia” likely to be off-putting to people who are, you know, working on an encyclopedia? And what does this mean, anyway?
Haha, yeah, we should probably stop saying that. What I mean by “thug,” in this case, is that there aren’t a bunch of unnecessary rules. You might think rules are great, but look at the result. Wikipedia’s notability requirement results in systematic discrimination against women and minorities, which is truly shameful. The top-performing pages on Everipedia are often black actresses, like Mariah Lynn from “Love and Hip-hop,” who are massively popular but face “Wikipedia Discrimination.” Everipedia made a page for Sabrina Pasterski — known as the “Female Einstein.” Wikipedia scraped our article and didn’t cite us! So I think that symbolizes the different focus of Everipedia and Wikipedia. Maybe we should change “Thug Wikipedia” to “Feminist Wikipedia.”

You and your buddies started Genius, originally RapGenius, which is one of the coolest collaborative websites online. I put it up there with Wikipedia, Quora, and a very few others that feature open collaboration among equals in order to develop a resource that is of use to everyone. This is what I love, and you and I both agree people ought to make more of these sorts of sites. So what is your top advice for entrepreneurs or community organizers (so to speak) who want to organize other people to create awesome resources that are useful to everyone?
It is bizarre. Every wiki site blows up. Even WIKIFEET gets a ton of traffic. But nobody wants to make encyclopedias. Everybody wants to make “The Next Snapchat.” I think this is because making a social media app is sexier than making an encyclopedia. Also, if you succeed, it’s a lot less work. You don’t have to sit there and use your own product, add a bunch of cool pages, etc. But I don’t think it’s an accident that I am 2 for 2 on successful startups and both are encyclopedias. There is such a thirst for robust software to disseminate information. It is the future of media! And nobody is doing it…personally, I think Quora sucks, and even Quora is blowing up…

OK, I gotta ask. You’ve been asked this ad nauseam, I’m sure, and I’m sure you’re annoyed by it, but I gotta ask. (Remember, this question is coming from a guy who thinks we are falling into a moral abyss. I may be a libertarian but I am also a moralist.) In November 2014 you wrote an article ill-advisedly titled “How To Steal From Whole Foods.” First of all, WTF? What were you thinking? You know that stealing is wrong, right?
The article was meant as a joke, the sole purpose was to make people laugh. The title is paying homage to Tao Lin’s classic tome “Shoplifting from American Apparel.” Lames like Mark Suster took my words literally, because they have the minds of sheep. A lot of people also told me they loved the article — those were the smart folks. I don’t steal, but personally, I don’t think there is anything wrong with stealing. You certainly can’t compare it to murder or rape, not even close. Stealing food, especially, strikes me as a morally neutral activity.

Being around you and the other Everipedia guys has introduced me to several items of slang that are completely new to me, because I don’t watch TV, don’t spend any time around teenagers or college students, and work from home in an exurb of Columbus, Ohio. I’m a bit cloistered, to tell the truth, but that’s how I like it. You meanwhile are the man about town, living in L.A. and hip to the scene (which shows how unhip-to-the-scene I am, since kids these days do not use the phrase “hip to the scene”). So I require brief, Urban Dictionary-type but Mahbod-crafted definitions of the following terms of art of the thug life. I give you…

The Mahbod Moghadam Lexicon
thug (not in the brutal ruffian sense):
Did you know this comes from the Hindi word “Thuggee”? I use it in homage to 2PAC — my favorite human who ever existed. (He had “Thug Life” tattooed on his chest.) It is a synonym for “disrupt.”
pimp (not in the employer-of-whores sense): If you’re a pimp, that means you’re charismatic! You can get others to serve you.
janky: Means “sucks.”
yeeeee: One of my Persian friends got me saying “yeee”! It is a refreshing alternative to “yasssss!” which is very popular with Hillary Clinton supporters…
hooooo: Short for “HOOOOOLY SHIT!” — we say this a lot at Everipedia HQ because we are constantly amazed and bewildered by our own product! It is changing the world. It is our catchphrase.
blowin up: This is what Everipedia is presently doing! YEEEEE
ewoking: Ah, my favorite word! This means “contributing to the site” — it is derived from the username of the TOP-IQ EDITOR OF RAP GENIUS, Monsieur William Goodwin aka EwokABDevito. He is one of only 2 users who have a higher IQ than I do.
shhhhht: This is the companion of “HOOOOO!” (See above.)
bae: I use “bae” sarcastically — “bae” is a word the kids say these days, it means “baby”/”babe” — I think it sounds ridiculous, which means I’m getting old! So I imitate them.
jag: “Jagh” means “masturbate” in Persian, my native language. This is pretty much the only non-work activity we are allowed to do at Everipedia HQ. (We’re also allowed to go to the gym once a day.)
swag: This is my favorite word of all time. The eccentric rapper Lil B “The Based God” popularized it. It is a nonsense word, similar to Kurt Vonnegut’s “Ho Hum”…it can mean whatever you want it to mean! It is the best word.
dope: Dope means good, like drugs.
chill: Currrrr! (Sorry I got cold for a second there!) Chill means you’re icy, which indicates a state of jewel-encrusted repose.

Now for a microaggression. Where are you from? No, where are you really from?
I’m from the Barrio vato! Barrios weyyyy! Pinche cabrón! (I’m from the San Fernando Valley — Encino to be exact — via Iran.)

At this juncture I would like to inform our readers that you have a B.A. in History from Yale and a J.D. from Stanford Law, and that you were a Fulbright scholar. You also helped Genius go viral. So, in short, you are clearly pretty goddamned brilliant. And yet if a reader reads your answers so far, these revelations might seem surprising. I hate to, you know, lift the curtain on the mystique (although I suspect that’s not really possible in your case), but can you comment on why, particularly at age 33 (you know — when your friends have become boring adults), you affect a “thug” attitude?
I loathe snobbery and propriety — I am against society. I was making wikis for Merrick Garland and his family today — he is a Jew trying to be a WASP, very “Ivy League” — he makes me want to throw up. I consider myself to be a UCLA alum, not a Yale alum. UCLA is where I will be donating my money, it is a school where they teach you actual knowledge, instead of propagating bullshit yuppie culture.

What are your favorite topics in history? The law?
My specialty in college was French colonial history! I am obsessed with all things French — I don’t know why — it is embarrassing! My favorite legal subject is tax, by far. I had an amazing professor for several tax courses, Joe Bankman — he is my Rabbi, basically. He taught me the most about ethics and the way the world works. I love him.

I noticed you play piano pretty well — I think I heard some Bach. Did you have lessons or what?
I am OBSESSED with Bach! That is what I am first and foremost — a Bach performer. His music is so intellectual, and yet so emotional! He is the greatest artist of all time. Hopefully Everipedia will get really big within a year or so and I can leave the company and return to my REAL full-time job — learning the complete keyboard works of Bach. I took lessons from age 15–17 with a lovable Persian guy named Arjang Rad, who is now a famous composer.

Last question, back to Everipedia: Given the choice between Everipedia and Wikipedia, or some other similar online knowledge-sharing pursuit (e.g., Quora, Medium, etc.), why should people check out and start writing for Everipedia today, in March 2016? Is it ready for people to get involved?
Everipedia will give you recognition. You get IQ, badges, and top users get equity in the company. This company will be worth billions of dollars someday — and it will not only belong to the founders and investors — it will belong to everyone who helps build it. We have already awarded equity to top users.

Some unpopular opinions

Here are some unpopular opinions, for your outrage or delight.

1. One of the biggest but least recognized reasons that the American school system sucks—and it most certainly does—is that so many teachers and education professors are just as anti-intellectual as most parents. This is why we homeschool.

2. A large contingent of geekdom is actually anti-intellectual, too, as paradoxical as that might sound. Not all; certainly not my friends.

3. The most important purpose of education is not vocational education, but to train and liberate the mind, to create fully competent and responsible free citizens of a free republic. This, contrary to the much-celebrated Sir Ken Robinson, is not "boring stuff." We've got to adopt the right educational goals, lest we continue to suffer great opportunity costs of various inefficient educational methods. It's a goddamned shame that national treasures like Marva Collins have not been listened to and learned from.

4. Knowledge—which is a key element of the mission of education—involves no small amount of memory work. No, it doesn't matter that research is updating our knowledge base very regularly. If we could only jettison our distaste for memory work, we might learn the tremendous advantages of spaced repetition.

5. Television is mostly a friggin' waste of time. You're better off without access to broadcast and cable TV. You can watch the good stuff on your own time via Netflix, Amazon Prime, etc.

6. Latin and Greek are still good languages for kids to study.

7. Yes, babies can read. Robert Titzer (of Your Baby Can Read fame) was badly misunderstood and unjustly attacked. At least, babies can start to learn to read. By the time they're preschoolers, they can read well. This doesn't require pressure in any way. It's fun. Maybe you just didn't know this. Try to keep an open mind.

8. Joyful, disorganized early education can generally do great things for little kids. It's a completely avoidable national disgrace that so many kids exit first grade without knowing how to read.

9. All that just goes to show you that experts can be really friggin' dogmatic, or so I find, as much as I do respect them. They're highly susceptible to groupthink, and we must not confuse devotion to science and scholarship with uncritical acceptance of whatever trends happen to be in the ascendancy among the current generation. Follies are frequently collective, even among smart, well-educated people. Sad, but all too true.

10. Another example of dogmatic experts: yes, we do have free will, properly understood. Oh-so-clever science students stupidly assume that science alone can establish the contrary. They pretend not to be doing philosophy, when that is exactly what they are doing (albeit badly). They are annoying in their stubborn failure to understand the issues. Compatibilist free will is the only sort of freedom we need.

11. Our university system is broken, but it's a huge mistake to conclude that college is a waste of time. I propose that we pop the education bubble by creating a new, more independent and modular system of higher education, with degrees by examination among other things.

12. It makes no sense to use reason to call into question the use of reason. "He must either be a fool, or want to make a fool of me, that would reason me out of my reason and senses," said one of my heroes, Thomas Reid. It is per se rational to begin our reasonings from the principles of what philosophers like Reid and G. E. Moore called "common sense."

13. An objective morality does exist. Relativism is dangerous and wrong. It is not the case that, if God is dead, everything is permitted. As Aristotle knew, life itself is the basic good that underlies our moral judgments; so our basic duty is to live well.

14. While in some ways Western civilization has never been more powerful and enlightened, it has also become morally and intellectually arrogant, sclerotic, and stunted. This can't end well.

15. More specifically, I am appalled and saddened by how cynical and morally bankrupt so many people can be today when acting as part of governments, bureaucracies, parties, corporations, schools, social cliques, the dating scene, gangs, law enforcement, publishing, etc., etc.—and when our supposed intellectual leaders mostly avoid moral judgment of the contemptible behavior that takes place in these social contexts. Corruption and cynicism are not OK; it doesn't matter if "everybody's doing it." Someday I'll write an essay, or a book, about this.

16. We've lost our moral and intellectual bearings. Religion is no longer a unifying force, of course. Even the formerly unifying ideals of western civilization—knowledge, freedom, dignity, excellence, self-control, etc.—have come under attack by much of our intelligentsia. Ideology is no substitute; no, nothing substantial is in its place. As a society, we're sleepwalking. It's alarming. Again, it can't end well.

17. Goddamned Hollywood is a morally depraved hot mess. They have got to get their house in order. They generally don't deserve our attention beyond any worthwhile entertainment they happen to produce.

18. I'm sorry if this offends, and I'm not saying this about my many liberal friends, who are generally very original and brilliant, but I'm going to say it anyway: conventional, dull, social-climbing, ambitious people are now mostly liberal or progressive Democrats. Being a lefty is no evidence that you are a smart nonconformist, not that it ever was. There are still plenty of dull, conventional conservatives too, of course. But at some point we've got to start talking about big-government left-wingers in this country as "conservatives," just as unreconstructed communists in the old Soviet Union were called "conservatives." Then I'll ask for the good old word "liberal" back.

19. I am particularly appalled by the illiberal hostility that certain left-leaning students, and some older people as well, are showing toward the fundamental American ideals of free speech and intellectual tolerance. In the Facebook alumni group for my alma mater, the uber-liberal Reed College, a lot of older liberals share my consternation at these trends; no, they aren't conservative or even libertarian.

20. Jonathan Chait is correct that there is a new political correctness. We have become too sensitive and rely far too much on dismissive arguments regarding how people have allegedly broken new social norms that not everyone shares. We ought instead to engage on issues of substance. That we don't is really screwing up our civic culture.

21. Speaking of political incorrectness, I have some guilty pleasures on YouTube that aren't quite politically correct for me to admit to liking. I admire their outspokenness, their intellectual courage in an increasingly censorious age, and their thoughtfulness. Let me introduce you to them:

Pat Condell. In-your-face atheist, old-fashioned liberal, vociferous defender of free speech. I might not always agree with him—actually, I often do—but in any case, I admire his spirit.

Karen Straughan. I'm really going to catch it for endorsing her, so let me just say first that I'm not convinced that her general take on feminism is right—it's a lot to process and I need to think her views through more (a book would help). Still, I love that she's a bisexual single mother and yet has the courage to come down, hard, against the bigger stupidities of radical feminism. She comes across as remarkably articulate and intelligent, and she frequently shows she's done a lot of research; it's hard to believe she doesn't even have a college degree. She's going to be famous in 10 years if not sooner.

I also like the brand of feminism of my fellow philosopher Christina Hoff Sommers; I have ever since reading her Who Stole Feminism? back in the 1990s.

"Rockin' Mr. E." He's sort of a long-haired Greek-Welsh cross between Pat Condell and Karen Straughan. Again, I don't always agree, often because his arguments would require research and thought to evaluate properly—but I often do find myself inclined to agree, anyway. I appreciate his nonconformist, independent spirit, anyway. And his chops on the electric guitar.

Let the sneering begin!

I'm sure I've managed to piss off everybody to some extent. I swear this isn't my intention. I'm not a troll because I actually believe what I say and think it actually important to say. I do own up to being a gadfly and possibly a pretentious, annoying git. But a troll, no.

Why is spaced repetition not better known?

Suppose a method let you remember things with a 95% success rate--in other words, whatever information you've put into a system, you'd have a 95% chance of recalling it--and this effect is permanent, as long as you continue to use the method. That would be quite remarkable, wouldn't it?

Well, there is such a method, called spaced repetition. This is the method used by such software as Supermemo, Anki, Mnemosyne, and Memrise.

The figure, 95%, is very impressive to me. I've been thinking about it lately, as I delve into the world (it is a whole world) of spaced repetition. Ordinarily, we demand much less of our study methods; 95% is practically a guarantee. With just 15 or 30 minutes a day, adding maybe 20 questions per day, you can virtually guarantee that you will remember the answers.
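For the curious: most of these programs descend from SuperMemo's old SM-2 scheduling algorithm, which decides how long to wait before showing you an item again. Here is a minimal sketch, in Python, of an SM-2-style update rule; the function name and exact numbers here are my own illustration, and real programs such as Anki tweak the details.

    def sm2_update(quality, reps, interval, easiness):
        # quality: self-graded recall, 0 (total blank) to 5 (perfect)
        # reps: consecutive successful reviews of this item so far
        # interval: days to wait before the next review
        # easiness: growth factor for the interval (new items start at 2.5)
        if quality < 3:
            return 0, 1, easiness  # failed recall: the item starts over tomorrow
        if reps == 0:
            interval = 1
        elif reps == 1:
            interval = 6
        else:
            interval = round(interval * easiness)
        # Easy recalls nudge the growth factor up; hard ones pull it down.
        easiness += 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02)
        easiness = max(easiness, 1.3)  # a floor keeps intervals from collapsing
        return reps + 1, interval, easiness

The thing to notice is that the intervals stretch out--1 day, 6 days, then multiplying by the easiness factor each time--which is why daily review time stays modest even as the stack of questions grows into the thousands.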

In particular, I am wondering why spaced repetition is not used more widely in education. Of course, I'm not the first to wonder why. The answer is fairly simple, I think.

The more I read from and interact with educationists and even homeschoolers, the more I am struck by the fact that many of them hold knowledge in contempt (q.v.). Of course, they will cry foul if you call them on this (q.v.), but that doesn't change the fact (q.v.). So naturally I expect them to sneer at me when I express amazement at the 95% recall figure. I can hear the "arguments" already: this is "rote memorization" (not if you understand what you're memorizing); education is not about amassing mere facts (not just that, no); it suffices that you can just look answers up (wrong); we should be teaching critical thinking, not mere memorization (why not both?).

I am not going to defend the value of declarative knowledge (again) here. I simply wanted to observe what teachers (including homeschooling parents) could do with spaced repetition, if they wanted to. They could spend a half hour (or less) every day adding questions to their students' "stack" of questions; then assign them to review questions (both new and old) for a half hour.

Imagine that you did that, adding 20 questions per day, five days a week, 36 weeks per year (the usual U.S. school year), for six years. This is not impossible to manage, I gather, and would not take that long per day. Yet by sixth grade, your students would have 21,600 facts in recall with about 95% accuracy. These would merely be the sorts of facts contained in regular textbooks.

Next, consider an exam that drills on a random selection of 100 of those facts. The students who used spaced repetition faithfully would probably get an A on the exam. That, I suspect, is much better than could be expected even from top students who used ordinary methods of study.
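If you want to check my arithmetic, here it is spelled out; the only real assumption is that the 95% recall figure holds up.

    # Facts accumulated under the scenario above.
    questions_per_day = 20
    days_per_week = 5
    weeks_per_year = 36   # the usual U.S. school year
    years = 6
    total_facts = questions_per_day * days_per_week * weeks_per_year * years
    print(total_facts)    # 21600

    # An exam of 100 facts drawn at random, each recalled with probability
    # 0.95, has an expected score of 95 out of 100--a solid A.
    print(100 * 0.95)     # 95.0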

Would students who spent 30 minutes out of every class day on this sort of review benefit from it?

I think the answer is pretty obvious.

What I dislike about experts: dogmatism

Since I started Citizendium, which invites experts to be "village elders wandering the bazaar" of an otherwise egalitarian wiki, and since I am well known for criticizing Wikipedia's often-hostile stance toward experts, I am sometimes held up as an example of someone who places too much trust in experts.

In fact, I have quite a bit less trust in experts than most people have.  When I learn that something that strikes me as, at least, capable of being reasonably doubted is the overwhelming majority opinion of experts, I become very suspicious.  Moreover, this has long been my attitude--not just recently, but since before Citizendium, or Wikipedia for that matter.  Let me explain why, and remove the puzzlement these claims must provoke.

First, however, let me explain why I respect and honor experts.  If they really are experts, and not just "the most knowledgeable person in the room" on a subject, it is because they have so goddamn much knowledge about their subject.  Even if I disagree with an expert's views on controversial issues, I stand in awe when it is clear that they can explain and evidently understand so much.  Knowledge per se is deeply important to me, and not just correct memorized information, which computers can ape, but deep understanding.  It is extremely satisfying to have demystified something that was previously puzzling, or to have come to a more complex understanding of something that had seemed simple but was not.  A person who has grasped much of what is, to me, still mysterious and complex about a subject has my respect.

Still, my respect only goes so far, because I am aware of a certain problem with expertise and especially with the social nature of contemporary research.  People are sheep--even very smart people, trained in critical thinking.  When there is a "consensus" or "broad agreement" in many fields, it becomes politically difficult to express disagreement.  If you do, you seem to announce yourself as having some serious personal flaw: stupidity, ignorance of your own field, not being current on the literature, possessing poor judgment, or being ideologically motivated, dishonest, or unbalanced.  This is true not just in obviously controversial or politically charged debates; it is also true about completely abstract, apolitical stuff that no one outside of a discipline gives a rat's patoot about.

Thus, due to the lemming-like conformity among many researchers, academic agreement tends to feed on itself.  An attitude becomes the only one worth expressing, even if, on the more objective merits of the evidence itself, such confidence is not warranted at all.  Such biases can swing 180 degrees in one generation (think of behaviorism in psychology).

I don't know enough about intellectual history to say for sure, but I suspect things weren't always quite as bad as they are now.  I suspect that academic conformity has been growing at least since I was in college myself, anyway.  There have been intellectual trends or "schools of thought" for millennia, of course, and when scholarship was dominated by the Church and religion--in medieval times and to a lesser extent until the 20th century--certain points of doctrine were held with easily as much dogmatism as one can find anywhere in academe today.  But in the last century, some causes of academic conformity have certainly grown more powerful: academic success is gauged based on how much one has published and in high-ranking journals, while researchers are expected to build upon the work of other researchers.  There is, therefore, an economic incentive to "play it safe" and march in lockstep with some particular view of the subject.  This situation has become even more dire both due to the extreme competition for jobs in academe and research, and due to the literal politicization of some fields (i.e., the devotion of whole disciplines to political goals).

This problem has become so pronounced that I find it is impossible really to evaluate the state of knowledge in a new field until I have come to grips with the leading biases of researchers--how professional conformity or political dogma might be giving an aura of certainty or consensus to views that ought, in fact, to be controversial and vigorously discussed.

I could cite several instances of unwarranted confidence in academic dogma from the fields of philosophy, psychology, and education, but frankly, I don't want to offend anyone.  Academics, of course, don't like to be called sheep or dogmatists.  Besides, I think my point will be more effective if I let people supply their own examples, because you might disagree with mine.  Care to discuss some in comments?

Let me conclude with a prediction.  Contrary to some, the Internet is not going to limit the prerogatives of experts; they have important roles to play in society, and we cannot function at our best without our most knowledgeable people in those roles.  But one of the more interesting and often delightful aspects of the Internet is that it provides a platform for people with nonstandard views.  It will also--it does not yet, but it will--provide a way to quickly compare current views with views from the past.  These two comparison points, nonstandard and historical opinion, were not so readily available in the past as they are or will be.  The easy availability of these dissenting views will make it increasingly obvious just how dogmatic academe has been.  Indeed, this has already started, and is one reason why experts and academics as a group have taken some hits to their credibility online.  Finally, I observe that, for all the ovine nature of researchers, youth often loves to smash idols, and new "education 2.0," degree-by-examination, badge, and other schemes might make such nonconformist idol-smashing a better career option.  I suspect we will see a crop of younger researchers making careers on the newly-viable fringes of academe by pointing out just how ridiculously overblown certain academic dogmas really are--and students eager to save on tuition and get a broader perspective will flock to tutorials with such independent scholars.

The future, according to Kathy Sierra

Kathy Sierra blogged earlier today six years ago (!) that "The future is not in learning"; the future lies, instead, in "unlearning."  This sounds awfully like another example of the geek anti-intellectualism that I love to hate; we'll see about that.  Since that's how the post comes across--beginning with the title--it has already gotten a lot of attention.  Geeks just love to hear that, in the future, they won't have to learn things.  They just love to talk about how they'll be able to upload their memories to the Internet, how Internet search makes memorization a waste of time, how they just can't make themselves read books anymore, how the intellectual authority of experts is passe, and how the liberal arts and college generally are a waste of time.  For all the world it seems they really hate learning.  So when Kathy Sierra says that the future is not in learning, they start salivating.

In fact, Sierra's main message is excellent, one that is not at all anti-intellectual.  She's saying that since the times they are a-changin' so much, we have to roll with them faster and faster, as change accelerates.  This is itself very old news, but it's always nice to be reminded of such perennial wisdom.

Too bad that, as a premise, it hardly supports the post's dramatic title or its opening pseudo-historical timeline.  Her timeline asserts that in the 1970s, the question was (somehow--who knows what this means?) "how well can you learn?"  In the 1990s, it was "how fast and how much can you learn?"  But today we have evolved!  Now it's about how quickly you can unlearn!

If we take the latter claim in any general sense, the argument is fallacious, of course.  It is true that in the 1970s education theorists talked a lot about how well we learn; they still talk about that.  It's also true that there was a movement to accelerate education, especially among young children, which had its height in the 1990s and in the heyday of NCLB.  But when Kathy Sierra next points out the homey, perfectly old-fashioned truth that we must change our habits to keep up with the times, she is changing the subject.  The following argument contains a fallacy:

1. We should unlearn habits that do not conform to new developments.
2. New developments are now coming fast and furious.
3. Therefore, the important new virtue is not learning, but unlearning.

The premises (1 and 2) do not support the sweeping conclusion (3).  I do not contradict myself when I maintain that we should still learn quickly and a lot (allegedly the "1990s" virtue--I thought it was an ancient virtue, but maybe that's just me), even while maintaining that we should change our habits as they become outdated.  The premises do support a much more modest conclusion, that being fast and flexible in how we change our habits is a new virtue.  But to say so entails neither that "the future is unlearning, generally" nor that "the future is not learning, generally."  So this is a plain old logical fallacy.  I leave it as an exercise for the reader to name the fallacy.

Lest I be accused of misconstruing Kathy Sierra, let me add this.  I know that she spends most of her post explaining how we should unlearn certain outdated habits.  I agree that this is excellent and timely advice.  But that does not stop her from titling her post "The future is not in learning..." and contrasting the new virtue of "how fast you can unlearn" with the old virtues of "how well you can learn" and "how fast and how much you can learn."  But the fact of the matter is that unlearning outdated habits is a very, very different kind of learning (or unlearning) from learning facts.

Besides, even if you say that what we should unlearn is certain facts, the facts tend to be about narrow, practical, and necessarily changeable fields, viz., technology and business.  Just because technology and business are changing quickly, that doesn't mean a lot of our knowledge about other topics is becoming uselessly outdated.  If that's the argument, it too is obviously fallacious.

So does Kathy Sierra deserve to be called an "anti-intellectual" for this argument?  Well, on the one hand, one can't take her argument at all seriously as an argument against "learning."  On the other hand, she does seem to have something of a disregard for logic, and if she doesn't literally believe her title, she does seem to pander to the anti-intellectual sentiments of certain geeks.  I hate to be uncharitable, and I wouldn't want to accuse her of encouraging people to stop learning so much, but look--the anti-intellectual sentiment is in the title of her post. Yes, maybe she is merely gunning for traffic by pandering to geek anti-intellectualism.  But why would she want to do that if she didn't share their biases against learning?

UPDATE: see below.  Kathy Sierra responds to point out that this is a six-year-old post.  I don't know quite why I thought it was posted today!  But I've already made a fool of myself, and I'm not one to stop doing so after I've done it publicly, especially at someone else's expense.