Could God have evolved?

1. How a common argument for the existence of God failed—or did it?

As a philosophy instructor, I often taught the topic of arguments for the existence of God. One of the most common arguments, called the argument from design or teleological argument, in one formulation compares God to a watchmaker.

If you were walking along a beach and found some complex machine that certainly appeared to be designed by someone, which did something amazing, then you'd conclude that it had a maker. But here we are in a universe that exhibits far more complexity and design than any machine we've ever devised. Therefore, the universe has a maker as well; we call it God.

This is sometimes called the Watchmaker Argument—since the mechanism our beachcomber finds is usually a watch—and is attributed to William Paley. Variations on this theme could be the single most commonly-advanced argument for God.

The reason the Watchmaker Argument doesn't persuade a lot of philosophers—and quite a few scientists and atheists generally—is that all the purported signs of design can be found in the biological world, and if biological complexity and appearance of design can be explained by natural selection, then God is no longer needed as an explanatory tool.

Some skeptics go a bit further and say that all the minds we have experience of are woefully inadequate for purposes of designing the complexity of life. Therefore, not only are natural mechanisms another explanation, they are a much better explanation, as far as our own experience of minds and designing is concerned.

But here I find myself skeptical of these particular skeptics.

2. Modern technology looks like magic

Recently, probably because I've been studying programming and am understanding the innards of technology better than ever, it has occurred to me very vividly that we may not be able to properly plumb the depths of what minds are capable of achieving. After all, imagine what a medieval peasant would make of modern technology. As lovers of technology often say, it would look like magic, and we would look like gods.

We've been working at this scientific innovation thing for only a few centuries, and we've been aggressively and intelligently innovating technology for maybe one century. Things we do now in 2017 are well into the realm of science fiction of 1917. We literally cannot imagine what scientific discovery and technological innovation will make available to us after 500 or 1000 years. Now let's suppose there are advanced civilizations in the galaxy that have been around for a million years.

Isn't it now hackneyed to observe that life on Earth could be a failed project of some super-advanced alien schoolchild? After all, we already are experimenting with genetic engineering, a field that is ridiculously young. As we unlock the secrets of life, who's to say we will not be able to engineer entirely different types of life, every bit as complex as the life we find on Earth, and to merge with our inventions?

Now, what havoc should these reflections wreak on our religious philosophy?

3. Could an evolved superbeing satisfy the requirements of our religions?

The scientific atheist holds the physical universe in great reverence, as something that exists in its full complexity far beyond the comprehension of human beings. The notion of a primitive "jealous God" of primitive religions is thought laughable, in the face of the immense complexity of the universe that this God is supposed to have created. Our brains are just so much meat, limited and fallible. The notion that anything like us might have created the universe is ridiculous.

Yet it is in observing the development of science and technology, and in thinking about how we ourselves might be enhanced by that science and technology, that we might come to an opposite conclusion. Perhaps the God of nomadic tent-dwellers couldn't design the universe. But what if there is some alien race that has evolved for millions of years past where we are now? Imagine that there is a billion-year-old superbeing. Is such a being possible? Consider the inventions, computing, genetic engineering, and technological marvels we're witnessing today. Many sober heads think the advent of AI may usher in the Singularity within a few decades. What happens a million years after that? Could the being or beings that evolve create moons? Planets? Suns? Galaxies? Universes?

And why couldn't such a superbeing turn out to be the God of the nomadic tent-dwellers?

Atheists are wrong to dismiss the divine if they do so on grounds that no gods are sufficiently complex to create everything we see around us. They believe in evolution and they see technology evolving all around us. Couldn't god-like beings have evolved elsewhere and gotten here? Could we, after sufficient time, evolve into god-like beings ourselves?

What if it turns out that the advent of the Singularity has the effect of joining us all to the Godhead that is as much technological as it is physical and spiritual? And suppose that's what, in reality, satisfies the ancient Hebrew notions of armageddon and heaven, and the Buddhist notion of nirvana. And suppose that, when that time comes, it is the humble, faithful, just, generous, self-denying, courageous, righteous, respectful, and kind people that are accepted into this union, while the others are not.

4. But I'm still an agnostic

These wild speculations aren't enough to make me any less of an agnostic. I still don't see evidence that God exists, or that the traditional (e.g., Thomistic) conception of God is even coherent or comprehensible. For all we know, the universe is self-existing and life on Earth evolved, and that's all the explanation we should ever expect for anything.

But these considerations do make me much more impressed by the fact that we do not understand how various minds in the universe might evolve, or might have evolved, and how they might have already interacted with the universe we know. There are facts about these matters about which we are ignorant, and the scientific approach is to withhold judgment about them until the data are in.


On intellectual honesty and accepting the humiliation of error

I. The virtue of intellectual honesty.
Honesty is a greatly underrated epistemic virtue.

There is a sound reason for thinking so. It turns out that probably the single greatest source of error is not ignorance but arrogance, not lack of facts but dogmatism. We leap to conclusions that fit with our preconceptions without testing them. Even when we are more circumspect, we frequently rule out views that turn out to be correct because of our biases. Often we take the easy way out and simply accept whatever our friends, religion, or party says is true.

These are natural habits, but there is a solution: intellectual honesty. At root, this means deep commitment to truth over our own current opinion, whatever it might be. That means accepting clear and incontrovertible evidence as a serious constraint on our reasoning. It means refusing to accept inconsistencies in one's thinking. It means rejecting complexity for its own sake, whereby we congratulate ourselves for our cleverness but rarely do justice to the full body of evidence. It means following the evidence where it leads.

The irony is that some other epistemic virtues actually militate against wisdom, or the difficult search for truth.

Intelligence or cleverness, while in themselves an obvious benefit, become a positive hindrance when we become unduly impressed with ourselves and the cleverness of our theories. This is perhaps the single biggest reason I became disappointed with philosophy and left academe; philosophers are far too impressed with complex and clever reasoning, paying no attention to fundamentals. As a result, anyone who works from fundamentals finds it to be child's play (I thought I did, as a grad student) to poke holes in fashionable theories. This is not because I was more clever than those theoreticians but because they simply did not care about certain constraints that I thought were obvious. And it's easy for them in turn to glibly defend their views; so it's a game, and to me it became a very tiresome one.

Another overrated virtue is, for lack of a better name, conventionality. In every society, every group, there is a shared set of beliefs, some of which are true and some of which are false. I find that in both political and academic discussions, following these conventions is held to be a sign of good sense and probity, while flouting them ranges from suspect to silly to evil. But there has never yet been any group of people with a monopoly on truth, and the inherent difficulty of everything we think about means that we are unlikely to find any such group anytime soon. I think most of my liberal friends are—perhaps ironically—quite conventional in how they think about political issues. Obviously conservatives and others can be as well.

Another virtue, vastly overrated today, is being "scientific." Of course, science is one of the greatest inventions of the modern mind, and it continues to produce amazing results. I am also myself deeply committed to the scientific method and empiricism in a broad sense. But it is an enormous mistake to think that the mere existence of a scientific consensus, especially in the soft sciences, means that one may simply accept what science tells us is true. The strength of a scientific theory is not determined by a poll but by the quality of evidence. Yet the history of science is the history of dogmatic groups of scientists having their confidently-held views corrected or entirely replaced. The problem is a social one; scientists want the respect of their peers and as a result are subject to groupthink. In an age of scientism this problem bleeds into the general nonscientific population, with dogmatists attempting to support their views by epistemically unquestionable (but often badly-constructed and inadequate) "studies"; rejecting anyone's argument, regardless of how strong, if it is not presented with "scientific support"; and dismissing any non-scientist opining on a subject about which a scientist happens to have some opinion. As wonderful as science is, the fact is that we are far more ignorant than we are knowledgeable, even today, in 2017, and we would do well to remember that.

Here's another overrated virtue: incisiveness. Someone is incisive if he produces trenchant replies that allow his friends to laugh at the victims of his wit. Sometimes balloons do need to be punctured, and sometimes there really is nothing there once they are deflated—of course. But problems arise when glib wits attack more complex theories and narratives. It is easy to tear down and hard to build. Fundamentally, my issue is that we need to probe theories and narratives that are deeply rooted in facts and evidence, and simply throwing them on the scrap heap in ridicule means we do not fully learn what we can from the author's perspective. In philosophy, I'm often inclined to a kind of syncretistic approach which tips its hat to various competing theories that each seem to have their hands on different parts of the elephant. Even in politics, even if we have some very specific policy recommendation, much has been lost if we simply reject everything the other side says in the rough and tumble of debate.

I could go on, but I want to draw a conclusion here. When we debate and publish with a view to arriving at some well-established conclusions, we are as much performing for others as we are following anything remotely resembling an honest method for seeking the truth. We, with the enthusiastic support of our peers, are sometimes encouraged to think that we have the truth when we are still very far indeed from having demonstrated it. By contrast, sometimes we are shamed for considering certain things that we should feel entirely free to explore, because they do contain part of the truth. These social effects get in the way of the most efficient and genuine truth-seeking. The approach that can be contrasted with all of these problems is intellectual honesty. This entails, or requires, courageous individualism, humility, integrity, and faith or commitment to the cause of truth above ideology.

It's sad that it is so rare.

II. The dangers of avoiding humiliation.
The problem with most people laboring under error (I almost said "stupid people," but many of the people I have in mind are in fact very bright) is that, when they finally realize that they were in error, they can't handle the shame of knowing that they were in error, especially if they held their beliefs with any degree of conviction. Many people find error to be deeply humiliating. Remember the last time you insisted that a word meant one thing when it meant something else, when you cited some misremembered statistic, or when you thought you knew someone who turned out to be a stranger. It's no fun!
 
Hence we are strongly motivated to deny that we are, in fact, in error, which creates the necessity of various defenses. We overvalue supporting evidence ("Well, these studies say...") and undervalue disconfirming evidence ("Those studies must be flawed"). Sometimes we just make up evidence, convincing ourselves that we just somehow know things ("I have a hunch..."). We seek to discredit people who present us with disconfirming evidence, to avoid having to consider or respond to it ("Racist!").
 
In short, emotional and automatic processes lead us to avoid concluding that we are in error. Since we take conscious interest in defending our views, complex explanatory methods are deployed in the same effort. ("Faith is a virtue.") But these processes and methods, by which we defend our belief systems, militate in favor of further error and against accepting truth. ("Sure, maybe it sounds weird, but so does a lot of stuff in this field.") This is because propositions, whether true or false, tend to come in large clusters or systems that are mutually supporting. Like lies, if you support one, you find yourself committed to many more.
 
In this way, our desire to avoid the humiliation of error leads us into complex systems of confusion—and, occasionally, into patterns of thinking that can be called simply evil. ("The ends justify the means.") They're evil because the pride involved in supporting systematically wrong systems of thought drives people into patterns of defense that go beyond the merely psychological and into the abusive, the psychologically damaging, and the physical. ("We can't tolerate the intolerant!" "Enemy of the people." "Let him be anathema.")
 
What makes things worse is that we are not unique atoms each confronting a nonhuman universe, when we are coming to grips with our error. We are members of like-minded communities. We take comfort that others share our beliefs. This spreads out the responsibility for the error. ("So-and-so is so smart, and he believes this.") It is much easier to believe provably false things if many others do as well, and if they are engaged in the same processes and methods in defending themselves and, by extension, their school of thought.
 
This is how we systematically fail to understand each other. ("Bigot!" "Idiot!") This is why some people want to censor other people. ("Hate speech." "Bad influence.") This is how wars start.
 
Maybe, just maybe, bad epistemology is an essential cause of bad politics.
 
(I might be wrong about that.)
 
It's better to just allow yourself to be humiliated, and go where the truth leads. This is the nature of skepticism.
 
This, by the way, is why I became a philosopher and why I commend philosophy to you. The mission of philosophy is—for me, and I perhaps too dogmatically assert that it ought to be the mission for others—to systematically dismantle our systems of belief so that we may begin from a firmer foundation and accept only true beliefs.
 
This was what Socrates and Descartes knew and taught so brilliantly. Begin with what you know on a very firm foundation, things that you can see for yourself ("I know that here is a hand"), things that nobody denies ("Humans live on the surface of the earth"). And as you make inferences, as you inevitably will and must, learn the canons of logic and method so that you can correctly apportion your strength of belief to the strength of the evidence.
 
There is no way to do all this without frequently practicing philosophy and frequently saying, "This might or might not support my views; I don't know." If you avoid the deeper questions, you are ipso facto being dogmatic and, therefore, subject to the patterns of error described above.

Modern education and culture, or, what did you think would happen?

I. Modern education and culture

Look at where we are in education and culture today. Let's catalog the main issues, shall we?

School children are often not taught to read properly, and too many fall behind and grow up functionally illiterate. Yet students are kept in schools practically all day and are made to do endless amounts of busywork, and then they have to do even more busywork at home. The efficiency of the work they do is appalling, as their textbooks and assignments are all too often ill-conceived, being repetitious, deadly dull, and designed without any consideration for what individual children already know (or don't know). Generally, they aren't taught classics (but more on that below). So despite all that work, despite graduating at rates as high as ever, the average child emerges into adulthood shockingly ignorant. The educational process is regimented; little humans have essentially become cogs in a giant, humorless, bureaucratic machine. The whole process is soul-killing.

Growing up in these bureaucratized intellectual ghettos, it's no wonder that rebellion has become de rigueur, that everyone calls himself an individualist although few really are. Popular culture with each passing generation is more dumbed-down, delivering entertainment that can be effortlessly consumed by maleducated conformist rebels, increasingly avoiding any scintilla of intellectualism, any uncool and boring reference to any of the roots of Western culture. On TV, in popular music, and on the Internet—the ubiquitous refuges of the young from the horrors of the educational machine that dominates their young lives—one can navigate content of all sorts without any exposure to the classics of literature and the arts, or the root ideas of Western religion and philosophy. If a few lucky students are exposed to these things at their more academic high schools, most are not, and the taste for "the best which has been thought and said" is ruined by the presentation in a system that "critiques" and renders dull as much as it celebrates and usefully explains. It's a wonder if any students emerge with any taste for the classics of Western literature, art, and thought at all.

A problem about Western culture, for the modern world, is that it is intensely critical and challenging. The classics are beautiful, but hard—both difficult to appreciate and presenting lessons that require us to take a hard, critical look at ourselves. Although the classics can be profoundly inspiring and sublime in beauty, they require time, attention, intelligence, seriousness, and sincerity to appreciate. In the context of today's soul-killing schools, students are too exhausted and overworked to meet these challenges. Many students are also too narcissistic—having been told by their parents and teachers that they are already brilliant, having been idolized by popular culture for their cool, attractiveness, and cutting-edge thinking about everything—so the classics require a kind of self-criticism that is wholly foreign to many of them. It is no wonder the classics simply do not "speak to" the youth of today.

Moreover, almost all of the classics were created by white Western men. Spending much time on them is politically regressive, or that is what school teachers are trained to believe. Instead, the left at universities has been building a new kind of more critical culture, at once holding up the grievances of historically marginalized groups as a new gospel, while actually revering popular culture. Teachers and administrators marinate in this left-wing culture of criticism at universities for six or more years before they make the choices of what pieces of culture are worth exposing to children. So, again, it's a wonder if any students emerge with any taste for the classics.

At the college level, matters have become dire in other ways. Everyone is expected to go to college, and at the same time universities have become corporatized, so that the students are now treated as "customers" whose evaluations determine how professors should teach. So, naturally, grades have inflated—as was necessary to coddle the "self-esteem," or narcissism, of youth—and the courses themselves have been dumbed down, at least in the humanities. But who needs the humanities? Degrees in the liberal arts generally are held to be a waste of money, especially since college has become so expensive, and fewer people are pursuing such degrees. Even if one believed the knowledge gained through liberal arts degrees to be valuable enough to warrant spending $60,000/year, one spends much of the time, in most of the humanities, marinating in that same left-wing critical culture that produces our schoolteachers—so one wouldn't be exposed to the classics in the way that would incline a student to sign up for one of these degrees in the first place. So it's no wonder if students and their parents are finding it increasingly plausible to skip college altogether. This is a sad mistake, considering that young adults today, navigating a rapidly-changing world, are more in need of the wisdom and intellectual skills inculcated by a liberal arts education than ever before. And most recently, the consequences of our failure to pass on two of the ideals essential to Western thought—free speech and freedom of inquiry—have led to thoroughly illiberal efforts to "shut it down," i.e., prevent politically unpopular ideas from getting a hearing on campus at all. This is all in the name of intersectionality, empowering the disempowered, tearing down bad old ideas, and protecting the sensitive feelings of coddled students.

II. The once-radical ideas that got us here

Our education is degraded, and we are falling away from Western civilization. So how did it come to this? I put it down to a perfect storm of terrible ideas.

(1) To be effective in a fast-changing society, we need up-to-date know-how, not theory. American society developed out of a frontier mentality that placed a premium on a "can-do" attitude, an ability to get things done, with theorizing and book-reading being a waste of time. While that might be understandable for the pioneers and peasants of a frontier or pre-industrial society, it is a terrible idea for the complexities of industrial and post-industrial societies, in which wisdom, trained intelligence, and sensitivity to nuance are essentials. Nevertheless, American parents and teachers alike generally seem to agree that practical knowledge and know-how are more important than book-larnin'. You would think that this might have changed with more people than ever going to college. But it has not.

(2) Books are old-fashioned in the Internet age. When, in the 2000s, the Internet came into its own as the locus of modern life, we began to ask, "Is Google making us stupid?" and to "complain" that we lacked the ability to read extended texts (long articles were "tl;dr" and books boring, old, and irrelevant). I think many of us took this to heart. Educated people still do want their children to read, but the habits of adults are slowly dying; you can't expect the children to do better.

(3) Western civilization is evil. "Hey, hey, ho, ho, Western civ has got to go," chanted those Stanford students in 1988, in what became a watershed moment in the development of Western culture. At the time, it might have seemed a bit of left-wing excess, and just one side of the complex Culture War. But, in fact, it proved to be a taste of things to come. Many Western civilization requirements are long gone. What was once the province of the newly-created Women's Studies and Black Studies departments, and a few left-wing professors, gradually became the dominant viewpoint in all of the humanities. Why study the classics when the classics simply represent the point of view of the oppressor?

(4) Social justice is the new religion. Hand-in-hand with criticism of Western civilization came an increasing respect (which is good), then celebration (which is fine), and finally a veneration (which is undeserved) of everything that has traditionally been set in opposition to Western civilization, especially the usual identity groups: women, races other than white, ethnicities other than Western, religion other than Christianity, sexual orientation other than straight, etc. At universities, making these identity groups equal to straight, white, male, Christian Europeans has become nearly the only thing—apart from environmentalism and a few other such causes—that is taken really seriously. For many academics, intersectionality has replaced both religion and any apolitical ethics to become an all-encompassing worldview.

(5) Psychology is more scientific, accurate, and credible than philosophy and religion, and self-esteem must be cultivated at all costs. The gospel of self-esteem came into being in the 1970s, right around the time when the self-help publishing industry became fashionable. With the collapse of traditional (especially Christian) belief systems, people cast about for general advice on how to live their lives, and psychology delivered. As self-esteem was a key element of much self-help psychology, it was only natural that the parents of Generations X and Y would pull out all the stops to protect the feelings and sense of self-worth of their precious darlings.

We have changed. Despite their education, too many of our children cannot read well, and fewer and fewer of us read books. Whatever we do teach or read, it is rarely classical literature. Classics have become an unexplored country, dull and reviled, to many of us. Recent generations are the first in centuries in which the upper echelons of society are quite shockingly ignorant of their own Western heritage. And here I don't just mean books, I mean also basic Western principles, ideas, and values. For many young people, social justice, psychology, and especially popular culture have replaced religion and wisdom literature. Popular culture may be a crass wasteland, yet it guides our youth more than ever, as being the only kind of culture that most of them have preparation and taste for.

We have declined. In past generations, this analysis would have sounded like scaremongering. Today, the analysis has come true; it is a postmortem.

But—and here I speak to the older generation, especially educated old liberals—what did you think would happen? This is precisely what some people did predict in decades past, because society's leaders were teaching a certain set of ideas to the leaders of the next generation:

European civilization colonized and exploited the world; it is irredeemably racist and the main source of the suffering in the world today.

Inequalities are deeply unfair, and white men have the best of everything; so we should celebrate everyone else and take white men down a peg or two.

We must avoid saying anything that might even be thought to be offensive to disadvantaged identity groups.

Christianity is completely irrational and doesn't deserve a role in public life.

Science, and psychology in particular, studies all we need to know to live and be happy; philosophy and religion are based on muddle-headed superstition.

The self-esteem and sensitivities of young people are precious and must be protected from the buffets that life threatens to give them.

Even today, some of these ideas might sound ridiculous to some of us. But if you've been paying attention, you can't deny that these once-radical ideas have become increasingly mainstream.

III. The radical ideas that might guide our future

The desperate state of education today is predictable, given former trends and earnestly-expressed convictions. It was called scaremongering to say that these ideas were hacking away the roots of Western civilization—and yet they did. So one wonders: What can we predict about the future, based on ideas now growing in popularity, ideas that it is quite reasonable to believe will guide the education and enculturation of the next generation?

Here are some controversial ideas that are in vogue at universities today: 

Free speech is a dangerous idea, and it certainly doesn't include hate speech and harmful speech.

What determines whether speech is harmful is whether it causes its listeners to react with emotional pain.

But we can disregard the pain of "privileged" people—"male tears," "white tears," and all that.

Those who are really plugged in know that books aren't really what's important. Know-how is what's important. You can just look up things online that you need to know.

Popular culture is worth careful academic study, at least as much as "the classics" or "high culture."

Higher education isn't important, except as a credential for becoming a corporate drone and except in a few fields.

Grave inequalities persist, and our very civilization is racist. We ought to tear down and malign all the productions of white men.

White society, and white people (whether they know it or not), are all racist, and all men (whether they know it or not) perpetuate a sexist patriarchy.

Religion isn't just irrational and wrong, it's evil, and we should take steps to stamp it out and perhaps prohibit it.

Reproducing does great harm to the world. Life is an evil. Babies are not to be celebrated. We should stop having them.

All of these ideas have plenty of adherents on campus today. They might well shape the next generation. If so, what might our brave new world look like? Let's listen in to the monologue from a typical, center-left future student, shall we?

"It's 2047. The way some people talk, you'd think it was, I don't know, 2017 or something. Check this out. I heard someone, and I don't care if she was a black woman, actually citing the Bible in class? That triggered a lot of people, and she was kicked out, of course. I doubt they'll let her back in. It just goes to show you how many people still believe that superstitious bullshit, even though it's revolting hate speech. But you know what, I was kind of impressed about what she was reading, before I realized what she was reading. It sounded like Old English. Who reads crap like that these days? Well, I guess she can. But it's still bullshit. You don't have to be able to read it to know that.

"It's not just superstitious bullshit, it's totally irrelevant. Books are so lame! My favorite professors don't teach books, they teach modern media. When I started this major, I swear, I had no idea pop music and movies were so deep. Seriously! So why do we require students to read so many books at all? Last year I was required to read three books for required Communications courses. Everyone knows that books aren't really what's important; knowledge is free for the taking online. Everything's there, instantly! Besides, the most influential thoughts of the last forty years are all in the form of briefer texts online. I'm thinking I might want to drop out. Half of my friends didn't even go to college and are just being trained by their employers. But you know, I think those tend to be the more conservative people, you know? So...

"Anyway, at the very least, it's time to stop requiring that we read any books written before 1970, or maybe 2000, especially if they were written by white men. I mean, of course white people and men are still welcome at our universities, it is perfectly fair that they wait their turn in classroom discussions. I hate it when some white man just starts talking first. You can hear some people hissing when they do. After all, everyone knows that less privileged people have more valid and relevant perspectives, and hearing white people and men—and on some issues, let's face it, hearing ignorant, insensitive white men at all—causes the marginalized great pain. We can't forget that white Western civilization persists even today, despite our best efforts. We renamed the state of Washington, but not the capital of our country—it continues to be named after the very embodiment of a white, slave-owning, breeding patriarch! That pisses me off so much!

"And speaking of breeders...don't get me started on the breeders. We had to fight tooth and nail against the misogynist, patriarchal society just to make it possible to license parents. But now we're allowing almost everyone to be licensed. What's the point? Surely we've got to prevent so many people from breeding. We don't let just anyone drive, right? We need to start imposing some restrictions. I know it's a little simplistic, but sometimes, simple is the best way: we could just, for a while, restrict the number of children white people could have. I know it sounds shocking, but look—everybody knows they use the most resources, they're the most racist, they create the most inequality. And they're still a plurality in this country. So it's really a no-brainer. It's 2047!"

Maybe that sounds over-the-top. But that's the point. There are cutting-edge activist types who would find all of this commendable or at least very plausible. And just think: the cutting-edge ideas of 1987, which would have sounded totally bizarre and radical back then, are totally up-to-date today, in 2017. I'm similarly extrapolating from the "cutting-edge" ideas of today on the same topics to how those ideas might evolve in another 30 years.

Also, of course, it could get much worse. Illiberal societies have been much worse at different times and places in history.

Am I predicting that the monologue is what awaits us? No, my crystal ball isn't that accurate and history never unfolds smoothly or predictably. What I'm saying is that it's a natural extrapolation from ideas about education and culture today. Is that what we want? If not, then what kind of thought world are we trying to build?


Independent study, a replacement for college

There are many things wrong with higher education today, as I've argued on this blog. It's way too expensive.  The amount of bureaucratic overhead is simply ridiculous. The focus on education as vocational training has deeply undercut appreciation and practice of the liberal arts. It has become too business-oriented, meaning that ratings by the customer—the students—count for far too much. The gospel of publish or perish has if anything become worse, and the quality of scholarship has suffered. Far too few faculty members are actually tenured or paid what they are worth.

But beyond all this, we have a special reason for concern. For anyone committed to the liberal arts in particular, the stories we hear coming out of academe are increasingly alarming. I won't make the case here, but it's not at all unreasonable to think that students, especially in the "soft sciences" and humanities, will simply be indoctrinated by their professors and bullied by their fellow students if they are not politically correct enough. There is a point at which the amount of intellectual dogma, dishonesty, and intolerance is so overwhelming that a college education (and especially a liberal arts degree) becomes more an exercise in indoctrination than training the rational mind. No doubt it depends on the institution, the major, and the professors. It's really the luck of the draw. But I would be concerned. I am concerned for my two children.

However that might be, I think we need another sort of option.

I've already argued that getting an education via tutors and a degree via examinations is a good way to pop the education bubble. What I want to do now is record a few thoughts on how a student might actually pursue college study independently. (This is not advice; or, follow it at your own risk!)


Move to a city with a lot of professors. Most big cities would do, and while Boston is maybe the most famous college town, other excellent ones in the U.S. would include Chicago, D.C., Philadelphia, and New York City.

Find one or a few good academic advisers. If they aren't 100% committed to you, pay one who will be. This person will help you plan your course of study, give you advice on many things, receive regular reports from you on your work, and encourage you and kick your ass as needed. Obviously, you'll want to find someone fairly like-minded, especially in terms of your academic goals. It needn't be (and probably shouldn't be) someone who has the title "academic adviser." Many academics will do just fine.

With the help of your academic adviser, map out your course of study for a year. It doesn't have to be complete, but you should know a year in advance what you want to do.

I'd create a web page explaining what's going on. This way you just send people to that URL where they can learn what you're doing, what you've studied so far, read samples of your work, etc. This will make it easier for you to get professors interested in helping you.

Professors are not all created equal. Lots are brilliant, excellent teachers, and very fair-minded, even today. Some are just execrable. So here you'll have to do your research. Find professors who are inspiring, clear (or at least understandable to you), and who make time for you (as they should, if you're paying them).

Pay professors by the hour. One hour a week ought to be enough. The main thing you'll be doing is reading and discussing what you have written about a subject the professor knows about. Maybe offer to take them to lunch.

If a professor sends you to a grad student, forget 'em, unless you're doing introductory work, or just getting tutoring for some standard course. For more advanced work, look elsewhere. Trust me, I was a grad student for eight years. They will be cheaper but they won't be as good. Of course, grad students can grade and tutor certain kinds of work, and that can be well worth it.

I'd want to live centrally so I can visit professors from various campuses. I'd also want to live with some other students who are doing what I'm doing, rather than with enrolled students. I think independent students living together would encourage each other to stick to it. You might even be able to get some sponsors that way; a group of you doing this is a good cause, well worth supporting.

You don't have to think about your studies in terms of discrete courses. You can, and it might be a good idea. But reading a series of books or article collections, however long it takes you, is also a good idea. Bear in mind that grad schools will still probably want you to quantify your work if you ever want to apply to one.

The bulk of your work, unless you're in one of the hard sciences, will take the form of reading and writing. You'll read books and other things, and write essays, and your professors will read your essays and give you detailed feedback. Then you'll revise. Of course, in science and math you'll have to do problem sets and pay to get those graded.

Consider auditing college courses if you like. Offer to pay the professor to read and mark up your writing and exams, if that's possible. If it's possible for you to sit in on discussion sections, as long as it doesn't cost too much, you might consider doing that.

There are lots of free courses online. You probably know that. They are a great resource; you could use them instead of attending boring lectures in big impersonal lecture halls. Live lectures can be great, but it's the luck of the draw again. In any case, lectures aren't good enough on their own. You will get a better college education if, in addition to watching lectures on video and reading books, you speak face-to-face in real time with an expert passionate about the subject and interested in you in particular. That's really essential.

Do a "senior thesis" or "senior project," i.e., an extended piece of writing or other significant professional accomplishment on a narrowly-focused topic that requires about a year to finish. This will be impressive to grad schools and be a reasonable basis (in part) on which experts can judge your level of accomplishment.

You probably have a few different options for securing a college degree. Suppose you have put all your work on a website. This includes papers, comments by professors, exam scores, the whole nine yards. (Of course, it can be password protected.) On the basis of that, I suspect some professors would be willing to sign their name on a statement (probably for money to compensate them for their time in making the evaluation honestly) to the effect that the amount of work you have done is equivalent to, or more than, the amount of work normally needed to secure a B.A. or B.S. in their field at their institution, and that your level of scholarship is also commensurate with that of a college graduate in the field.

A GPA? Transcript? You might even finagle a GPA for yourself. Get professors to agree in advance to grade you on chunks of work. Have them edit a document that you write, stating what was accomplished, credit equivalent at their institution, when the studying was done, and the name, institutional affiliation, specializations, and contact information of the professor. They write the grade in and sign it. You make a PDF of this signed document and save the original and give them a copy. Do this for all the independent study courses you do with various professors at various institutions, and make all the PDFs available alongside the grade in your self-made "transcript." My guess is that that will work for many purposes.

Award yourself a "B.A. (or B.S.) by independent study, endorsed by..." On resumes, you can add a brief paragraph explaining how you got a bachelor's degree without having enrolled anywhere. For example, a philosophy graduate might write on his resume (I'm totally making this up), "B.A. Philosophy by independent study, endorsed by Profs. Smith (Harvard), Jones (MIT), Kim (Boston University), and Wang (Boston College)." Then in a footnote you describe your program and, especially, you link to the endorsements by the professors who did your final assessment. Make sure these endorsements are uploaded correctly on LinkedIn or some other such website where people publicly endorse other people.

Be prepared to pay professors for endorsing your work and "awarding" you a degree. Especially if it is an independent professor, someone you didn't study with (or, not much), it's going to take them time to look at your portfolio and decide that you've done the work and have shown the knowledge that you need to show.

Will employers accept your "bachelor's degree"? I can't make any guarantees (the risk is all yours!)—but why don't you ask some? Speaking for myself, if I looked at your page and your statements checked out (e.g., I saw the PDFs, got confirmation from the professor that the program was legit, and saw the LinkedIn endorsements), then I would. In fact I'd say, "Here's an entrepreneurial, independent-minded go-getter. This is the kind of person I'd like on my team!" Of course, boring conventional types might turn their noses up at this, but hiring decisions for good jobs are often not made by boring, conventional types.

This is going to be much cheaper and probably better education than you'd suffer through at most universities these days.

Finally, if you do this—or have done it—then email me with your story at yo.larrysanger@gmail.com. I'd love to hear about it.


On the Purposes of the Internet

SISCTI 34
February 28, 2009
Monterrey, Mexico

Introduction

I am going to begin by asking a philosophical question about the Internet. But I can hear some of you saying, “Philosophy? What does that have to do with the Internet? Maybe I will have a siesta.” Well, before you close your eyes, let me assure you that the question is deeply important to some recent debates about the future of the Internet.

The question is: what is the purpose of the Internet? What is the Internet good for? Perhaps you had never thought that something as vast and diverse as the Internet might have a single purpose. In fact, I am going to argue that it has at least two main purposes.

To begin with, think about what the Internet is: a giant global information network. To ask what the Internet is for is about the same as asking what makes information valuable to us, and what basic reasons there might be for networking computers and their information together.

 

The two purposes of the Internet: communication and information

I think the Internet has at least two main purposes: first, communication and socialization, and second, finding the information we need in order to learn and to live our daily lives. In short, the Internet is for both communication and information.

Let me explain this in a simple way. On the one hand, we use the Internet for e-mail, for online forum discussions, for putting our personalities out there on social networking sites, and for sharing our personal creativity. These are all ways we have of communicating and socializing with others.

On the other hand, we are constantly looking things up on the Internet. We might check a news website, look up the meaning of a word in an online dictionary, or do some background reading on a topic in Wikipedia. These are all ways of finding information.

I want to explain an important difference between communication and information. Communication is, we might say, creator-oriented. It’s all about you, your personal needs and circumstances, and your need for engagement and recognition. So communication is essentially about the people who are doing the communicating. If we have no interest in some people, we probably have no interest in their communications. This is why, for example, I have zero interest in most MySpace pages. Almost nobody I know uses MySpace. MySpace is mainly about communication and socialization, and since I’m not actually communicating or socializing with anybody on that website, I don’t care about it.

Information, on the other hand, is not about the person giving the information but about the contents of the information. In a certain way, it really does not matter who gives the information; all that matters is that the information is valid and is of interest to me. And the same information might be just as interesting to another person. So, we might say, communication is essentially personal, and information is essentially impersonal.

I say, then, that the Internet’s purposes are communication and information. In fact, the Internet has famously revolutionized both.

The Internet is addictive largely because it gives us so many more people to talk to, and we can talk to them so efficiently. It allows us to compare our opinions with others’, to get feedback about our own thinking and creative work. In some ways, the Internet does this more efficiently than face-to-face conversation. If we are interested in a specific topic, we do not need to find a friend or a colleague who is interested in the topic; we just join a group online that has huge numbers of people already interested, and ready to talk about the topic endlessly.

Online discussions of serious topics are often a simplistic review of research, with a lot of confused amateur speculation thrown in. We could, if we wanted to, simply read the research—go to the source material. But often we don't. We often prefer to debate about our own opinions, even when we have the modesty to admit that our opinions aren't worth very much. Many people simply prefer discussion; they prefer active discussion over passive absorption. Who can blame them? You can't talk back to a scientific paper, and a scientific paper can't respond intelligently to your own thoughts. The testing or evaluation of our own beliefs is ultimately what interests us, and this is what we human beings use conversation to do.

But the Internet is also wonderfully efficient at delivering impersonal information. Search engines like Google make information findable with an efficiency we have never seen before. You can now get fairly trustworthy answers to trivial factual questions in seconds. With a little more time and skilled digging, you can get at least plausible answers to many more complex questions online. The Internet has become one of the greatest tools for both research and education that has ever been devised by human beings.

So far I doubt I have told you anything you didn’t already know. But I am not here to say how great the Internet is. I wanted simply to illustrate that the Internet does have these two purposes, and that the purposes are different—they are distinguishable.

How the Internet confuses communication and information

Next, let me introduce a certain problem. It might sound at first like a purely conceptual, abstract, philosophical problem, but let me assure you that it is actually a practical problem.

The problem is that, as purposes, communication and information are inherently confusable. They are very easy to mix up. In fact, I am sure some of you were confused earlier, when I was saying that there are these two purposes, communication and information. Aren’t those just the same thing, or two aspects of the same thing? After all, when people record information, they obviously intend to communicate something to other people. And when people communicate, they must convey some information. So information and communication go hand-in-hand.

Well, that is true, they do. But that doesn’t mean that one can’t draw a useful distinction fairly clearly. Here’s a way to think about the distinction. In 1950, a researcher would walk into a library and read volumes of information. If you wanted to communicate with someone, you might walk up to a librarian and ask a question. These actions—reading and talking—were very different. Information was something formal, edited, static, and contained in books. Communication was informal, unmediated, dynamic, and occurred in face-to-face conversation.

Still, I have to agree that communication and information are indeed very easy to confuse. And the Internet in particular confuses them deeply. What gives rise to the confusion is this. On the Internet, if you have a conversation, your communication becomes information for others. It is often saved indefinitely, and made searchable, so that others can benefit from it. What was for you a personal transaction becomes, for others, an information resource. This happens on mailing lists and Web forums. I myself have searched through the public archives of some mailing lists for answers to very specialized questions. I was using other people’s discussions as an information resource. So, should we say that a mailing list archive is communication, or is it information? Well, it is both.

This illustrates how the Internet confuses communication and information, but many other examples can be given. The Blogosphere has confused journalism, which used to be strictly an information function, with sharing with friends, which is a communication function. When you write a commentary about the news, or when you report about something you saw at a conference, you’re behaving like a journalist. You invite anyone and everyone to benefit from your news and opinion. Perhaps you don’t initially care who your readers are. But when you write about other blog posts, other people write about yours, and you invite comments on your blog, you’re communicating. Personalities then begin to matter, and who is talking can become more important to us than what is said. Information, as it were, begins to take a back seat.

Moreover, when news websites allow commenting on stories, this transforms what was once a relatively impersonal information resource into a lively discussion, full of colorful personalities. And, of course, online newspapers have added blogs of their own. I have often wondered whether there is a meaningful difference between a newspaper story, a blog by a journalist, and a well-written blog written by a non-journalist. That precisely illustrates what I mean. The Internet breaks down the distinction between information and communication—in this case, the distinction between journalism and conversation.

Why is the distinction between communication and information important?

I’ll explore more examples later, but now I want to return to my main argument. I say that the communication and information purposes of the Internet have become mixed up.

But—you might wonder—why is it so important that we distinguish communication and information, and treat them differently, as I’m suggesting? Is having a conversation about free trade, for example, really all that different from reading a news article online about free trade? To anyone who writes about the topic online, they certainly feel similar. The journalist seems like just another participant in a big conversation, and you are receiving his communication, and you could reply online if you wanted to.

I think the difference between information and communication is important because they have different purposes and therefore different standards of value. When we communicate, we want to interface with other living, active minds and dynamic personalities. The aim of communication, whatever else we might say about it, is genuine, beneficial engagement with other human beings. Communication in this sense is essential to such good things as socialization, friendship, romance, and business. That, of course, is why it is so popular.

Consider this: successful communication doesn’t have to be particularly informative. I can just use a smiley face or say “I totally agree!” and I might have added something to a conversation. By contrast, finding good information does not mean a significant communication between individuals has taken place. When we seek information, we are not trying to build a relationship. Rather, we want knowledge. The aim of information-seeking is reliable, relevant knowledge. This is associated with learning, scholarship, and simply keeping up with the latest developments in the news or in your field.

Good communication is very different from good information. Online communication is free and easy. There are rarely any editors to check every word you write, before you post it. That is not necessary, because these websites are not about creating information, they are about friendly, or at least interesting, communication. No editors are needed for that.

These communities, and blogs, and much else online, produce a huge amount of searchable content. But a lot of this content isn’t very useful as information. Indeed, it is very popular to complain about the low quality of information on the Internet. The Internet is full of junk, we say. But to say that the Internet is full of junk is to say that most conversations are completely useless to most other people. That’s obviously true, but it is irrelevant. Those who complain that the Internet is full of junk are ignoring the fact that the purpose of the Internet is as much communication as it is information.

Personally, I have no objection whatsoever to the communicative function of the Internet. In fact, it is one of my favorite things about the Internet. I have had fascinating conversations with people from around the world, made online friendships, and cultivated interests I share with others, and I could not possibly have done all this without the communicative medium that is the Internet.

But, as I will argue next, in making communication so convenient, we have made the Internet much less convenient as an information resource.

Communicative signal is informational noise

You are probably familiar with how the concept of the signal-to-noise ratio has been used to talk about the quality of online information and communication. A clear radio transmission is one that has high signal and low noise. Well, I’d like to propose that the Internet’s two purposes are like two signals: the communication signal and the information signal. The problem is that the two signals are sharing the same channel. So I now come to perhaps the most important point of this paper, which I will sum up in a slogan: communicative signal is informational noise. That is at least often the case.

Let me explain. The Internet’s two purposes are not merely confusable. In fact, we might say that the communicative function of the Internet has deeply changed and interfered with the informative function of the Internet. The Internet has become so vigorously communicative that it has become more difficult to get reliable and relevant information on the Internet.

I must admit that this claim is still very vague, and it might seem implausible, so let me clarify and support the claim further.

The basic idea is that what works well as communication does not work so well as information. What might seem to be weird and frustrating as information starts to make perfect sense when we think of it as communication.

Let me take a few examples—to begin with, Digg.com. In case you’re not familiar with it, it’s a website in which people submit links for everyone else in the community to rate by a simple “thumbs up” or “thumbs down.” This description makes it look like a straightforward information resource: here are Internet pages that many people find interesting, useful, amusing, or whatever. Anyone can create an account, and all votes are worth the same. It’s the wisdom of the crowd at work. That, I assume, is the methodology behind the website.

But only the most naïve would actually say that the news item that gets the most “Diggs” is the most important, most interesting, or most worthwhile. Being at the top of Digg.com means only one thing: popularity among Digg participants. I am sure most Digg users know that the front page of Digg.com is little more than the outcome of an elaborate game. It can be interesting, to be sure. But the point is that Digg is essentially a tool for communication and socialization masquerading as an information resource.

YouTube is another example. On its face, it looks like a broadcast medium. Because anyone can have a YouTube account, video views are carefully counted, and everyone gets an equal vote, it looks like the wisdom of the crowd is harnessed. But the fact of the matter is that YouTube is mainly a communication medium. Its ratings represent little more than popularity, or the ability to play the YouTube game. When people make their own videos (as opposed to copying stuff from DVDs), the videos are frequently conversational. They are trying to provoke thought, or get a laugh, or earn praise for their latest song. They want others to respond, and others do respond, by watching videos, rating videos, and leaving comments. I suspect that YouTube contributors are not interested, first and foremost, in building a useful resource for the world in general. They are glad, I am sure, that they are doing that too. But what YouTube contributors want above all is to be highly watched and highly rated, and, in a word, a success within the YouTube community. This is evidence that they have been heard and understood—in short, that they have communicated successfully.

I could add examples, but I think you probably already believe that most of the best-known Web 2.0 websites are set up as media of communication and socialization—not primarily as impersonal information sources.

But what about Wikipedia and Google Search? These are two of the most-used websites online, and they seem to be more strictly information resources.

Well, yes and no. Even Wikipedia breaks down the difference between a communication medium and an information resource. There has been a debate, going back to the very first year of Wikipedia, about whether Wikipedia is first and foremost a content-production project or a community. You might want to say that it is both, of course. That is true, but the relevant question is whether Wikipedia’s requirements as a community are actually more or less important than its requirements as a project. For example, one might look at many Wikipedia articles and say, “These badly need the attention of a professional editor.” One might look at Wikipedia’s many libel scandals and say, “This community needs real people, not anonymous administrators, to take responsibility so that rules can be enforced.” Wikipedia’s answer to that is to say, “We are all editors. No expert or professional is going to be given any special rights. That is the nature of our community, and we are not going to change it.” The needs of Wikipedia’s community outweigh the common-sense requirements of Wikipedia as an information resource.

Please don’t misunderstand. I am not saying that Wikipedia is useless as an information resource. Of course it is extremely useful as an information resource. I am also not saying that it is merely a medium of collaborative communication. It clearly is very informational, and it is intended to be, as well.

Indeed, most users treat Wikipedia first and foremost as an information resource. But, and this is my point, for the Wikipedians themselves, it is much more than that: it is their collaborative communication, which has become extremely personal for them, and this is communication they care passionately about. The personal requirements of the Wikipedians have dampened much of the support for policy changes that would make Wikipedia much more valuable as an information resource.

Why do we settle for so much informational noise?

Let me step back and try to understand what is going on here. I say that Web 2.0 communities masquerade as information resources, but they are really little more than tools for communication and socialization. Or, in the case of Wikipedia, the community’s requirements overrule common-sense informational requirements. So, why do we allow this to happen?

Well, that’s very simple. People deeply enjoy and appreciate the fact that they can share their thoughts and productions without the intermediation of editors or anything else that might make their resources more useful as information resources. And why is it so important to so many people that there be no editors? Because editors are irrelevant and get in the way of communication.

The fact that Web 2.0 communities are set up for communication, more than as information resources, explains why they have adopted a certain set of policies. Consider some policies that Wikipedia, YouTube, MySpace, and the many smaller Web 2.0 websites have in common.

First, on these websites, anyone can participate anonymously. Not only that, but you can make as many accounts as you want. Second, when submissions are rated, anyone can vote, and votes are (at least initially, and in many systems always) counted equally. Third, any authority or special rights in the system are always internally determined. Your authority to do something or other never depends on external credentials or qualifications. University degrees, for example, are worth nothing on YouTube.

The result is that, on a website like Wikipedia, a person is associated with one or more accounts, and the performance of the accounts against all other accounts is all that the system really cares about.

To Internet community participants, this seems very rational. A person is judged based on his words and creations alone, and on his behavior within the system. This seems meritocratic. People also sometimes persuade themselves, based on a misinterpretation of James Surowiecki’s book The Wisdom of Crowds, that ratings are an excellent indicator of quality.

But these systems are not especially meritocratic. It is not quality, but instead popularity and the ability to game the system that wins success in Web 2.0 communities. High ratings and high watch counts are obviously not excellent indicators of quality, for the simple reason that so much garbage rises to the top. There is no mystery why there is so much time-wasting content on the front page of YouTube, Digg.com, and many of the rest: it’s because the content is amusing, titillating, or outrageous. Being amusing, titillating, and outrageous is not a standard of good information, but it can be a sign of successful communication.

The less naïve participants, and of course the owners of these websites, know that Internet community ratings are largely a popularity contest or measure the ability to play the game. They don’t especially care that the websites do not highlight or highly rank the most important, relevant, or reliable information. The reason for this is perfectly clear: the purpose of these websites is, first and foremost, communication, socialization, and community-building. Building an information resource is just a very attractive side-benefit, but still only a side-benefit, of the main event of playing the game.

The attraction, in fact, is very similar to that of American Idol—I understand you have something similar called “Latin American Idol,” is that correct? Well, I have been known to watch American Idol. It is a television competition in which ordinary people compete to become the next Idol, who earns a record contract, not to mention the attention of tens of millions of television viewers. The singing on American Idol, especially in the early weeks, is often quite bad. But that is part of its entertainment value. We do not watch the program to be entertained with great singing—that is, of course, nice when it happens. Instead, we watch the program mainly because the drama of the competition is fascinating. Even though the quality of the singing is supposed to be what the program is about, in fact quality is secondary. The program’s attraction stems from the human element—from the fact that real people are putting themselves in front of a mass audience, and the audience can respond by voting for their favorites. The whole game is quite addictive, in a way not unlike the way Internet communities are addictive.

But let’s get back to the Internet. I want to suggest that the information resource most used online, Google Search itself, is also a popularity contest. Google’s PageRank technology is reputed to be very complex, and its details are secret. But the baseline methodology is well-known: Google ranks a web page more highly if it is linked to by other pages, which are themselves linked to by popular pages, and so forth. The assumption behind this ranking algorithm is somewhat plausible: the more that popular websites link to a given website, the more relevant and high-quality the website probably is. The fact that Google is as useful and dominant as it is shows that there is some validity to this assumption.
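To make the idea concrete, here is a minimal sketch of link-based ranking in the spirit of PageRank, written in Python. To be clear, this is only an illustration of the publicly described baseline idea, not Google’s actual algorithm; the toy link graph, the damping factor, and the function name are assumptions of mine for the example.

    # A toy illustration of link-based ranking in the spirit of PageRank.
    # The link graph, damping factor, and iteration count are illustrative only.
    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}  # every page starts with equal rank
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / n for p in pages}
            for page, outlinks in links.items():
                if not outlinks:  # a page with no outlinks spreads its rank evenly
                    for p in pages:
                        new_rank[p] += damping * rank[page] / n
                else:
                    share = damping * rank[page] / len(outlinks)
                    for target in outlinks:
                        new_rank[target] += share  # pass a share of rank along each link
            rank = new_rank
        return rank

    # Pages linked to by already-popular pages end up ranked higher.
    toy_graph = {
        "portal": ["blog", "shop"],
        "blog": ["portal", "shop"],
        "shop": ["portal"],
        "obscure-page": ["blog"],
    }
    print(sorted(pagerank(toy_graph).items(), key=lambda kv: -kv[1]))

The point of the sketch is simply that rank flows along links from page to page, so what this kind of algorithm rewards is popularity among already-popular pages, not any editorial judgment of relevance or quality.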

All that admitted, I want to make a simple point. Google Search is essentially a popularity contest, and frequently, the best and most relevant page is not even close to being a popular page. That is a straightforward failure. But just as annoying, perhaps, is the prevalence of false positives. I mean the pages that rank not because they are relevant or high-quality, but because they are popular or (even worse) because someone knows how to game the Google system.

Does this sound familiar? It should. I do not claim that Google is a medium of communication. Clearly, it is an information resource. But I want to point out that Google follows the same policies of anonymity, egalitarianism, and merit determined internally, through links and algorithms that machines can process. As far as we know, Google does not seed its rankings with data from experts. Its data is rarely edited at all. Google dutifully spiders all content without prejudice of any sort, applies its algorithm, and delivers the results to us very efficiently.

I speculate—I can only speculate here—that Google does not edit its results much, for two reasons. First, I am sure that Google is deeply devoted to the same values, values that favor a fair playing field for the communication games that many Web 2.0 websites play. But, you might say, this is a little puzzling. Why doesn’t Google seek out ways to include the services of editors and experts, and improve its results? An even better idea, actually, would be to allow everyone to rate whatever websites they want, then publish their web ratings according to a standard syndication format, and then Google might use ratings from millions of people creatively to seed its results. In fairness to Google, it may do just this with the Google SearchWiki, which was launched last November. But as far as I know, SearchWiki does not aggregate search results; each individual can edit only the results that are displayed to that user.

So there is, I think, a second and more obvious reason that Google does not adjust its results with the help of editors or by aggregating syndicated ratings. Namely, its current, apparently impersonal search algorithm seems fair, and it is easy to sell it as fair. However much Google might be criticized because its results are not always the best, or because the results are gamable or influenced by blogs, at least it has the reputation of indeed being mostly fair, largely because PageRank is determined by features internal to the Internet itself—in other words, link data.

Google’s reputation for fairness is one of its most important assets. But why is such a reputation so important? Here I can finally return to the thread of my argument. Fairness is important to us because we want communication to be fair. In a certain way, the entire Internet is a communicative game. Eyeballs are the prize, and Google plays a sort of moderator or referee of the game. If that’s right, then we certainly want the referee to be fair, not to prefer one website over another simply because, for example, some expert happens to say the one is better. When it comes to conversations, fairness means equal consideration, equal time, an equal shot at impressing everyone in the room, so to speak. Communication per se is not the sort of thing over which editors should have any control, except sometimes to keep people polite.

The fact that Google has an impersonal search algorithm really means that it conceives of itself as a fair moderator of communication, not as a careful chooser of relevant, reliable content. And a lot of people are perfectly happy with this state of affairs.

Conclusion

In this paper I have developed an argument, and I hope I haven’t taken too long to explain it. I have argued that the Internet is devoted both to communication and information. I went on to say that communication and information are easily confused, and the Internet makes it even easier to confuse them, since what serves as mere communication for one person can be viewed later as useful information for another person. But what makes matters difficult is that we expect communication, and the websites that support online communication, to be as unconstrained and egalitarian as possible. As a result, however, the Internet serves rather well as a communication medium, as a means to socialize and build communities, but not nearly as well as an information resource.

I can imagine a reply to this, which would say: this is all a good thing. Information is about control. Communication is about freedom. Viva communication! Should our alleged betters—professors, top-ranked journalists, research foundations, and the like—enjoy more control over what we all see online, than the average person? The fact is that in the past, they have enjoyed such control. But the egalitarian policies of the Internet have largely removed their control. In the past, what those experts and editors have happened to say enjoyed a sort of status as impersonal information. But all information is personal. The Internet merely recognizes this fact when it treats allegedly impersonal information as personal communication.

This is the common analysis. But I think it is completely wrong.[1] First, the elites still exert control in many ways, and there is little reason to think the Internet will change this. Second, the radical egalitarianism of Internet policies does not disempower the elites so much as it disempowers intelligence, and empowers those with time on their hands to create and enjoy popular opinion, and also those who care enough to game the system.

If more people were to emphasize the informative purpose of the Internet, this would not empower elites; it would, rather, empower everyone who uses the Internet to learn and do research. We would have to spend less time sorting through the by-products of online communication, and could spend more time getting solid knowledge.

In fact, I think most people enjoy the Internet greatly as an information resource—at least as much as they enjoy it as a communication medium. But most of the people who create websites and Internet standards—the many people responsible for today’s Internet—have not had this distinction in mind. Still, I think it is a very fruitful and interesting way to think about the Internet and its purposes, and—who knows?—perhaps it will inspire someone to think about how to improve the informational features of the Internet.

In fact, if my fondest hope for this paper were to come true, it would be that those building the Internet would begin to think of it a little bit more as a serious information resource, and a little bit less as just a fun medium of communication.

[1] As I have argued in a recent paper: “The Future of Expertise after Wikipedia,” Episteme (2009).


Why study higher mathematics and other stuff most people don't use in everyday life?

This video was posted in a Facebook group of mine here:

I find it ironic that some of the most listened-to speakers about education explain that the cure to our educational ills is to point out that education is unnecessary. I call this educational anti-intellectualism. Here's another representative sample and another.

It is possible to make the argument, "X isn't going to be necessary for most students in life, therefore X should not be taught," for almost everything that is taught beyond the sixth grade or so. After that, we should be taught "critical thinking" and vague "analytical abilities" and "reading comprehension" and other such claptrap; that seems to be the natural consequence of this commentator's thinking, and sadly, he is not alone.

The fact that educated people like this teacher, and all the people who approve of this stuff, cannot answer the question is very disappointing. It's not surprising, perhaps, because it's philosophy and philosophy is very hard. Moreover, there are a variety of sort-of-right answers that subtly get things wrong and might end up doing more damage than good.

In the latter category I might want to place E.D. Hirsch, Jr., one of the most prominent education traditionalists alive. (He just published a book I got today called Why Knowledge Matters, and he might have updated his views on this; I'll find out soon.) Hirsch's argument is that we ought to learn classics and, essentially, get a liberal arts education, because this is the knowledge we use to interact with other educated adults in our culture. It is "cultural literacy" and "cultural capital" and this is something we desperately need to thrive as individuals and as a civilization.

That's all true, I think. If Hirsch made the argument as, essentially, a defense of Western (or just advanced) civilization—that we need to educate people in Western civilization if we are to perpetuate it—then I'd be fully on board. But Hirsch, as I understand him, appeals particularly to our individual desire to be a part of the elite, to get ahead, to be able to lord it over our less-educated fellow citizens. This is a very bad argument that won't convince many people. If Hirsch or anyone makes it, I would put it in the category of arguing for the right conclusion for the wrong reason.

The argument I'd give to this math teacher is the same I'd give to someone who says we shouldn't memorize history facts or read boring, classic literature or learn the details of science or what have you. Of course you don't need that stuff to get through life. Most people are as dumb as a box of rocks when it comes to academic stuff (yes, in all countries; some are worse than others).

The reason you get an education, and study stuff like higher math, is more along the following lines. Education trains the mind and thereby liberates us from natural prejudice and stupidity. This is the proper work for human beings because we are rational creatures. We are honing the tool that comes more naturally to us than to any other animal. One must realize, as people like this educated fool and so many others seem not to, that education, such as math education, is not merely a tool in the sense of "abilities." The content, or what is known, is a deeply important part of the tool; in fact, as Hirsch does argue correctly and convincingly, any "analytical abilities" brought to a text will be very poor without relevant subject knowledge. If you want an analogy, it is a poor argument to say that a course in logic sharpens your wit, that you want to have sharp wits, and that therefore you should study "critical thinking"; the heft or substance of your wit's ax is all the rest of the knowledge behind the cutting edge. Getting an A in a logic class (a course I taught many times) without knowledge of math, science, history, literature, etc., gives you about as much heft and effectiveness as a sharp-edged piece of paper: capable of paper-cuts.

The core of the argument for knowledge is that academic knowledge forms a sort of deeply interconnected system, and the more deeply and broadly that we understand this system, the more capable we are in every bit of life. This is true of us as individuals and also as a society or civilization. It is completely and literally true that the fantastic structure of modern civilization as we know it, all of the historically unprecedented developments we have seen, is a direct outgrowth of the deep commitment of our society's leaders—since the Enlightenment—to education in this system.

The system I refer to is deeply connected, but that doesn't mean it isn't also loosely connected in the sense that one can learn bits here and there and benefit somewhat. That's absolutely true. This is why it's possible for the math teacher to say, "Well, you don't really need to know higher math in order to live life." Some people are geniuses about literature but don't remember anything about any math they learned beyond the sixth grade.

But as everybody with higher education knows, it is in fact absolutely necessary to learn higher math if you are going to learn higher science—the hard sciences and the social sciences alike require heavy calculation—and to deal intelligently with statistics and probabilities, as is necessary in politics, in the financial side of business, in some programming, and so on.

This is because the "deep structure" of reality is mathematical. To declare that "you don't really need to know it" is to declare that you don't need to know the deep structure of reality. Sure, of course you don't. The birds of the air and the fish of the sea don't. But do you want our children to be more like them or more like fully rational, aware, human creatures?


How the government can monitor U.S. citizens

Just what tools do American governments—federal, state, and local—have to monitor U.S. citizens? There are other such lists online, but I couldn't find one that struck me as being quite complete. This list omits strictly criminal tracking, because while criminals are citizens, actual crime obviously needs to be tracked.

  1. First, there's what you yourself reveal: the government can use whatever information you yourself put into the public domain. For some of us (like me), that's a heck of a lot of information.
  2. Government also tries to force tech companies to reveal our personal information, ostensibly to catch terrorists and criminals. The FBI and NSA have both been in the news about this.
  3. The NSA famously tracks our email and phone calls. They might be looking for terrorism and crime, but we're caught in the net too.
  4. The IRS, obviously, tracks your income, business information, and much else. That certainly qualifies as government monitoring.
  5. State, local, and school district tax systems do the same.
  6. The FBI's NSAC (National Security Branch Analysis Center) has hundreds of millions of records about U.S. citizens, many perfectly law-abiding.
  7. The State Dept., Homeland Security, and others contribute to systems that include biometric information on some 126 million citizens—that means fingerprints, photos, and other biographical information.
  8. For a small number of citizens—740,000 to 10 million, depending on the system—there is a lot more information available, not because these people are actual terrorists or criminals, but merely because they are suspected of such activity. If someone in government with the authority thinks you fall into broad categories that make you possibly dangerous, they can start collecting a heck of a lot more information about you.
  9. The Census Bureau tracks our basic demographic information every ten years.
  10. U.S. school students in at least 41 states are tracked by Statewide Longitudinal Data Systems, including demographics, enrollment status, test scores, preparedness for college, etc.
  11. Many and various public cameras, including license plate readers, are used by many local authorities, mainly for crime prevention.
  12. Monitoring by police will be easier in the near future. As law professor Bill Quigley, an expert on the subject, puts it, "Soon, police everywhere will be equipped with handheld devices to collect fingerprint, face, iris and even DNA information on the spot and have it instantly sent to national databases for comparison and storage."
  13. The internet of things will be another avenue in which government will increasingly be able to view our habits.

So...explain to me again how we have a right to privacy under the Fourth Amendment.

By the way, it is a conceptual mistake to suppose that there is any one person or group of people who have access to (and care about) the information in all of these databases. How the databases are used is carefully circumscribed by law, obviously, and just because the information is in a database, it doesn't follow that there has been a privacy violation. But it does raise concerns in the aggregate: the extent to which we are monitored might be a problem even if most programs are individually constitutionally justifiable.

In short, is there any point at which we say "enough is enough"? Or do we grudgingly give government technical access into every area of our lives and hope that the law controls how the information is being used?

In the comments, please let me know what I've missed and I can do updates.

Sources: Common Dreams, ACLU, Ed.gov, Forbes, Wired, Guardian, and my own experience working at the Census Bureau long ago.


Why do smart people say such stupid things about politics?

Hey to all my friends who are smart people. (And if you wonder whether this "who" is restrictive or nonrestrictive, you may be one of my smart friends.)

When Thoreau said, "Simplify," he was not talking about your political positions. The truth is complex. You know this. You are capable of doing professional work (programming, philosophizing, writing, business, whatever) at the highest level. So why is it that we seem to turn off our brains and speak in simplistic, black-and-white, unnuanced and frequently obviously false terms when we talk about politics?


Top 10 hidden gems of central Ohio

Today my family discovered yet another hidden gem, a spot we had never been to before, in central Ohio where we live. This inspired me to catalog our favorite "hidden gems."

Central Ohio has some excellent landmarks that a visitor would enjoy. The Columbus Zoo is world class; the Whetstone Park of Roses is stunning when in bloom; the riverfront, COSI, LeVeque Tower, and State House downtown are all well worth a visit; nearby German Village is a great spot to stroll; Ohio State is nice to visit, especially around the Oval, Library, and Mirror Lake; the Columbus Metropolitan Library downtown is one of the best public libraries in the country; Franklin Park Conservatory is a beautiful spot; the various metro parks make an excellent park system; you've probably heard of Ohio Caverns, which we love; and everybody has heard about the Hocking Hills. But if you live in the area, you probably know about those spots.

Here are some spots you might or might not have encountered yet, which we have visited several times (or plan to visit again) and which we love—from least hidden to most hidden.

10. Hoover Dam. This is the least "hidden" and perhaps it doesn't belong on the list, but I didn't know about it for a long time. This isn't an earthen dam like so many others in Ohio; it is a tall and wide concrete dam with a massive gushing spillway. You can walk all the way across the dam, as well as from the top of the dam to the marshy, blue heron-filled area at the bottom. At the observation area on the eastern side, last time we were there, there were a bunch of swallow nests. On both sides of the dam and on either side of Hoover Reservoir are places to walk, play, and picnic. Hoover Dam is simply one of the nicest places in central Ohio.

9. Slate Run Living Historical Farm. Again, perhaps it's not so well hidden now. If you have little kids, and maybe even if you don't, this is a must-see. A well-maintained, apparently well-run farm following 19th-century farming ways, Slate Run features an open farmhouse, a separate kitchen, gardens, a root cellar, horse-plowed fields, a massive barn, and a big variety of farm animals, from chickens and other poultry to cows, sheep, horses, and pigs. It's just a great way to learn about the old ways of farm life. We also enjoy the pond.

8. The Wilds. Again, many people know about this, so perhaps it isn't very "hidden." But if you haven't visited, you might find it to be a surprise. The bus and other tours allow you leisure to take in the unusual, vast, hilly landscape as well as the big animals scattered over 14 square miles in giant paddocks. The animals we saw on past visits included rhinos, giraffes, unusual deer and oxen, zebras, bison, a cheetah, and many others. It's like a safari, but fairly close to home. Also worth a mention is that the drive to the Wilds is quite nice, especially if you go through the very scenic Blue Rock Forest.

7. Conkles Hollow State Nature Preserve. Some of the Hocking Hills attractions, like Old Man's Cave and Cedar Falls, are unquestionably excellent and are far from "hidden." But one of our favorite spots is the less-visited but surprisingly awesome Conkles Hollow. The trail is very green and scenic, but flat and paved for most of the way, and thus excellent for small children. What awaits you at the end is stunning, resembling some landscapes I remember from the Grand Canyon or Zion National Park out west. The gorge is reputedly one of the deepest in Ohio and the end of it is a magical place.

6. Rising Park and Shallenberger State Nature Preserve. I put these together because they're both in the Lancaster area and they both feature similarly short, but steep hikes to the top of a hill, from which you get a beautiful view of the surrounding landscape. Rising Park is well-known (hardly a hidden gem) to the people of Lancaster, but worth a visit to those from outside the area. The main attraction is the gorgeous view overlooking the town of Lancaster, but there is also a scenic reservoir, an old house on the property, and plenty of places to wander. We visited Shallenberger in winter when the leaves didn't block the view. We had passed it many times on the way to the Hocking Hills, but spotted it on a map and decided to visit one day. Very nice little preserve, short and scenic but steep hike to the top of a hill that overlooks the surrounding country in all directions, although leaves might get in the way in the summer.

5. Blackhand Gorge State Nature Preserve. Now we come to some of the slightly more hidden spots. On the east side of Newark is this lovely area, a paved bike and hiking trail—a converted rail bed—next to the Licking River going through a very scenic gorge. Apparently, it was called "Blackhand" after Indian hand paintings on the cliff walls. There are some nice little waterfalls in the tributary dales along the trail, as well as some sandstone cliffs of the sort you'll find in the Hocking Hills. Old canal towpaths and locks are nearby. Also interesting is a notch or gap cut through a hillside, which is a little like a roofless tunnel.

4. Rockbridge State Nature Preserve. This is on the other side of 33 from the Hocking Hills, between Lancaster and Logan. The parking lot might take some finding, and the trails leading to the main attraction—a large natural bridge, or arch—take a bit of puzzling out. But Rockbridge itself is a stunning location, and the rugged hike to it is one of the nicer hikes central Ohio has to offer.

3. Tar Hollow State Park and Forest. One of the nicest areas just to take a drive would be Tar Hollow State Park and Forest, which we visited in the fall—highly recommended. Sweeping vistas. There's a pretty reservoir, Pine Lake, with swimming and paddleboats. In the middle of the forest is a giant fire tower that it is possible to climb, although it seemed somewhat rickety and lacking in railings for our two young boys, so we didn't attempt it. While there we were absolutely swarmed by ladybugs.

2. Rock Mill Park. This out-of-the way area is worth a bit of extra driving. The mill itself has been lovingly restored, with a giant mill wheel. To get to it, you walk across a particularly excellent example of an Ohio covered bridge—over a beautiful gorge—and if you proceed down a path from the mill, you'll come to one of the nicest waterfalls in the central Ohio area, which will strike you as a bit of the Hocking Hills, only a lot closer than you might have expected.

1. Wahkeena Nature Preserve. We first visited this preserve yesterday. We simply saw it on a map, read some intriguing descriptions, and decided to go. We're glad we did, because it's a very unusual, surprising place. Several things make it very special: beavers, a pine forest, wildflowers, an excellent free guide map, and an especially interesting nature center. There are two big beaver lodges at one edge of the pond. There are all sorts of little surprises. There is a floating boardwalk across one end of the pond, which takes you by one of three beaver dams. There are some stunningly tall pine trees you'll walk by on the very nice 1.5 mile circuit—a fragrant bit of landscape, reminds me of California and other western forests. Wildflowers are abundant, identified handily on the excellent guide map. A family of geese with brand new goslings, hatched earlier the same day (April 24), was swimming about. Frogs galore of course. Near the top of the hill are sandstone cliffs of the typical Hocking Hills variety. The guide map has numbers and letters which match numbers and letters posted along the well-maintained trail, with naturalist notes we enjoyed reading—I wish more parks would do this. A barred owl and a red-shouldered hawk are in a quiet area not far from the nature center, both injured, non-releasable, and cared for by preserve personnel. Unlike many nature centers, this one is hands-on and reading-light, but full of small stuffed Ohio mammals and birds of every description, many dozens of them, live turtles and snakes in aquariums, a fascinating indoor beehive and knowledgeable talkative staff members on hand. Absolutely perfect learning place for children.

Honorable mentions... The Wagnalls Memorial Library in Lithopolis is one of our favorite libraries, gorgeous old building, wonderful place to read. Pigeon Roost Farm is a great spot for fun, hay wagon rides, corn maze, etc., in the fall as a place to take little kids, although it's getting a little too popular so maybe doesn't qualify as a "hidden" gem. Yoctangee Park in Chillicothe has swans and beautiful trees—like Rising Park in Lancaster, not at all hidden to the residents of Chillicothe. Charles Alley Park on the south side of Lancaster has some very nice, scenic hikes in the hills above a reservoir. Close to home is one of our favorite places, maybe a "hidden gem" for some people not in the area: Chestnut Ridge Metro Park. Excellent hiking and views.

What have I missed? Please turn us on to other spots in the area! Share in the comments!


Teaching reading — two suggestions

America’s literacy problems could be solved if parents, preschool teachers, and daycare workers did just two simple things. One is obvious. One is not.

First, we should read a lot more to our babies, toddlers, and preschoolers — say, at least an hour per day. That means picking up a good old-fashioned book, putting a kid in your lap or sitting up close in a small group, and reading the book to the kid. And do voices! Kids love voices.

To turbo-charge your little reader’s skills, simply point at the words as you read them. You’d be amazed at how much this helps them. Retirees can help by volunteering to read to kids at a local preschool or daycare.

That’s all common-sense advice, right?

My second piece of advice is less obvious: We should start teaching our little ones to read before kindergarten, at home and in our preschools and daycares.

Ten years ago, this would have just sounded crazy. Then we started hearing about “baby reading” and how little Emma or Aidan started reading at age one. You probably think their parents must have pushed their kids, and you don’t want to be one of “those parents.”

I am one of those parents, but I didn’t push my boys. They both started reading at age one. How?

I didn’t use workbooks, software, or other systems designed for five- or six-year-olds — that’s a terrible idea. Instead, in addition to all the reading I did to my oldest son, I showed him a lot of flashcards when he was a baby. He seemed to get a kick out of them. If he didn’t, we stopped immediately.

When he was about two years old, in 2008, I started making him a new kind of card, with words put in phonetic groupings. We started with simple CVC (consonant-vowel-consonant) words, like “dog,” with a picture on the back, and gradually we worked our way to harder words. Lots of other parents used my flashcards (free online) and praised them highly. At the same time, we started using some “teach your baby to read” programs.

Altogether, we didn’t spend much time on that sort of training — no more than a half-hour a day — but we did keep reading to him a lot, maybe one or two hours per day. Of course he spent most of the day playing like any other kid.

The result? At age three, he was reading at the 3rd to 4th grade level. You can find a video I made of him on YouTube:

My second son was born in 2010, shortly after I bought the first iPad. We did lots of flashcard apps, which show big words and colorful pictures. I strongly recommend using whatever flashcard apps your baby likes the most. There are a lot.

At that time, I was working on WatchKnowLearn.org, funded by an anonymous Memphis-area philanthropist. He saw the video of my son and said, “Why don’t you make a reading program of your own?” The result was ReadingBear.org — I based it on those old phonics flashcards I made, but it’s a lot more than just words and pictures. The words, all 1,200 of them, are pronounced at four speeds, they’re used in a sentence, and a picture and a video illustrate them. Thanks to that Memphis philanthropist, the website is 100% free, ad-free, and nonprofit.

My second son was just as good a reader as my first by the age of three:

https://www.youtube.com/watch?v=6wmlOkiOo08

Users tell me that regular use of Reading Bear leads to spectacular results. But you’re not limited to that. Lots of other free or cheap tools — apps and websites — are available, too.

Now, here’s the point: Reading Bear and those other tools need not take much time. They aren’t terribly challenging. Just find the tool a child likes — there’s so much to choose from, you’ll find something. It doesn’t require pushing or forcing. Just 15 minutes a day, and within months, children as young as two can be reading out loud, as my two boys did.

Why isn’t every Head Start preschool in the country making use of these freely-available tools? We know they work, and they can solve our illiteracy problems. So why aren’t we using them?

Just two things, and so many problems connected to poor education will disappear: read to very young children religiously for an hour per day, and start teaching them with these 21st century reading tools that they like.

If we do these two things, we’ll see our country’s reading problems disappear.

Larry Sanger (yo.larrysanger@gmail.com) is co-founder of Wikipedia and has helped develop many other educational websites, including ReadingBear.org. Sanger has posted a free book on his experience teaching his son, How and Why I Taught My Toddler to Read. He earned his Ph.D. in Philosophy in 2000 from Ohio State University.