How not to use the Internet, part 2: the pernicious design philosophy of the Internet

<< Part 1: It's a problem that the Internet distracts us

2. The pernicious design philosophy of the Internet.

The way that the Internet is designed—not graphic design, but overall habits and architecture—encourages the widespread distractibility that I, at least, hate.

This basic notion is not my idea; I freely admit that I learned it from Nicholas Carr. I did not quite notice some features of the Internet until reading Carr's The Shallows some time ago, and the following borrows from Carr. My analysis consists of two related parts: the first concerns the nature of the Internet, and the second its design philosophy.

First, consider what the Internet is, or the public side of it, so to speak. (Not the technical, "back end" part.) The public side of the Internet consists of (a) information of various media that is presumably of some public interest, together with (b) ways of repackaging, sending, publishing, and rating the information and, especially, of linking to it for public consumption.

Category (a) is rapidly growing to include all of the public information we know of, or at least all of it that can be digitized—and not just all extant information, but also all new information that arrives on the scene. This fact is of interest not just to "geeks," but to everyone who cares about books, news, movies, and virtually everything else that we can communicate and share digitally. Category (a) is the concern of all of humanity, not just geekdom.

This makes category (b), what we might call the net's meta-information, all the more important to us. Google makes the inherently interesting information findable. Wikipedia tries to summarize it. Email, texting, and VoIP (like Skype) allow us to communicate it more efficiently. Twitter gives acquaintances and colleagues a way to share the latest and greatest with us. Facebook gives us easy, one-page access to information about our friends and families. Other sites, like YouTube and Amazon, offer us view counts, ratings, samples, and reviews that are crucial to deciding what long-form content is worth pursuing.

Now I can explain a notion, which again owes a great deal to Carr, of the current two-part "design philosophy" of the Internet, to wit:

Interconnectivity: information that is of some inherent public interest is typically marinated in meta-information: (a) is bathed in (b). It is not enough to make the inherently interesting content instantly available and easy to find; it must also be surrounded by links, sidebars, menus, and other info, and promoted on social media and via email. This is deliberate, but it has gotten worse in the last ten years or so, with the advent of syndicated blog feeds (RSS), then various other social media feeds. This is, of course, supposed to be for the convenience and enlightenment of the user, and no doubt sometimes it is. But I think it usually doesn't help anybody, except maybe people who are trying to build web traffic.

Recency: the information to be most loudly announced online is not just recent, but the brand-spanking-newest, and what allegedly deserves our attention now is determined democratically, with special weight given to the opinions of people we know.

Something like this two-part design philosophy, I believe with Carr, is what makes the Internet so distracting. Carr found some interesting studies that indicate that text that is filled with hyperlinks and surrounded by "helpful" supporting media tends to be poorly understood, and we spend less time on each page of such text. As soon as we come across a link, video, or infographic sufficiently interesting to distract us, the surrounding mass of text becomes "tl;dr". Over time, we have largely lost the habit of reading longer texts, and this problem is apt to get worse.

Moreover, when we and our social networks place a premium on recency, we naturally feel a need to check various news streams and data feeds regularly, and coders oblige this tendency by providing us with various distracting push notifications whenever the latest arrives. What is more, the Internet industry hungrily pounces on new tools and devices that allow people to share and be connected in ever more and newer ways. The Internet increasingly goes wherever we are, first with the advent of laptops, then smartphones, then the iPad—and eventually, maybe "Google Glasses."

The result is that, soon after we surf to a page of rich media, its interconnections lead us away from whatever led us to the page in the first place, even while our various alerts and, just as important, our habits of checking stuff, conspire to pull us away as well. Ironically, what might look to the naive to be an efficient, intelligent system of alerting us and giving us instant access to the latest and greatest online has the effect of making us unable to focus on any one thing for long.

Let that sink in a little. Back in 2000, what we were so excited about, when we thought about the potential of the Internet, was the sheer amount of knowledge that would be available and presented (and developed!) in all sorts of brilliantly engaging ways. Now it is 2012. Is that what we have? Yes—and no. Some of the dream has indeed arrived. Vast amounts of content are there. Frequently it is presented engagingly (although we have a lot more to do before we reach our potential). But it is also presented in a context that is so extremely distracting that we, even despite our best intentions, often do not really appreciate it. We are not encouraged to study, absorb, savor; we are encouraged to skim and move on.

I think there is something really wrong with this design philosophy. We ought to try to change it, if we can. But how, especially considering that it mostly grew organically, not as a result of any grand design?

Part 3: How the Internet's current design philosophy fails >>

Relevant links:

Nick Carr's blog, "Rough Type"

To see how SEO analysts (and many webmasters) think about recency, see "New Rules: Fresh Content Is King" (undated, natch!).

Of course, the Google Glasses that appeared in the video are probably vaporware, for now.

"Vast amounts of content" that is "presented engagingly"? Well, Wikipedia and YouTube, for just two examples. I didn't say presented perfectly, but their popularity is evidence of their being engaging. Their vastness is obvious. Many more examples could be given.


How not to use the Internet, part 1: it's a problem that the Internet distracts us

For almost a year, I've been at work on a very long essay about some problems with the Internet and social media in particular. I've worked on it now and then and occasionally I think I'm really going to finish it—but I never do. So, as a concession to failure, or partial failure anyway, I have decided to divide it up into several self-contained brief essays. I'll release an essay a day and see how it goes. Here is the first.

Note: rather than tempt the reader to click out of the essay, I've moved links to the end and annotated them. This is an example of one way in which the Internet could change (although I'm not exactly holding my breath).

1. It's a problem that the Internet distracts us, dammit.

I too am distracted by ubiquitous digital media. This is a problem—a common, serious, and real problem—and I wish I could get to the bottom of it, but it is very deep.

In the last several years, like many of us, I've often felt out of control of my time. Following basic time management principles is more difficult than ever, especially when I'm spending time online and looking at screens generally. My situation is probably similar to that of many people reading this: I check my mail many times per day; Twitter and Facebook beckon, as do my favorite online communities (and I dread joining Google+); people push the latest news at me; people Skype me; and the time seems to slip away in spite of my better intentions to, you know, get work done.

What I think of as an unmitigated vice has been complacently described by some as "multi-tasking," as if allowing yourself to be distracted were some sort of advanced technical ability. We are told (though, I gather, not by most psychologists) that being able to multi-task effectively is one of the skills that should now be in every plugged-in person's toolkit. But the notion that multi-tasking is an advanced ability is merely an excuse, I think. When you are "multi-tasking," usually, you are not using your time efficiently; you are simply letting yourself be distracted, because you don't want to "miss out."

That's not all. As much as I hate to admit it, the Internet also seems to have made it difficult for me, as it has Nicholas Carr and Richard Foreman, to write and pay attention to long texts, and to think deep thoughts. To be sure, I still try and occasionally succeed. I seem to skim more along the surface of things, despite myself. Thoughtful insight is far from impossible, but it seems to require more deliberate effort. Creativity still flows, but less often and less spontaneously. Believe me, I wish it weren't this way. I fear that I, too, am becoming one of Carr's "shallows" and one of Foreman's "pancake people."

Many heavy Internet users have fairly admitted the same, often apparently with pride or without shame—or at least without hope of improvement. Do you feel the same?

The nature of these now-common problems—a mind ironically made poorer in spite of, indeed by, the Internet's riches—has been much discussed, for example by Maggie Jackson in Distracted, Mark Bauerlein in The Dumbest Generation (a much better book than you might expect from the title), Nicholas Carr in The Shallows, and Jaron Lanier in You Are Not a Gadget.

So why do we let ourselves get so distracted? Why are we so often incapable of sticking to a single task?

I think there is a simple answer, actually: we intensely feel the presence of all the world's information and people and the digital fun that entails, miraculously made available to us. Impersonal information was bad enough for us early adopters in the earlier days of the Internet. But now that everybody and his grandma (literally) has joined social networks, the situation has gotten a lot worse, for me at least. We are constantly available to our colleagues, friends, and acquaintances, so they may "interrupt" us at random times throughout the day, offering insights and telling us that some new website or blog post or picture or video is a "must read" or "must see," or simply reporting their own sometimes-interesting thoughts and news. We constantly feel pulled in a thousand directions. This general problem seems likely to worsen as our access to the world's information becomes more and more complete, speedy, and convenient. Before long, we will have virtually instant access to every bit of content we might want, always and everywhere, and with a minimum of effort (though not necessarily with a minimum of cost). We are nearly there, too.

This revolution—inadequately described as a revolution of "information" or the "digital" or the "Internet"—is wholly unprecedented in history. Not long ago I had to tell blasé skeptics that it is not "hype" to call it a revolution. But clearly, a lot of regular folks, not necessarily in the vanguard, have started to understand the enormousness of how the world has changed in the last fifteen years or so. It's a real revolution, not a mere fad or development, and even as we stare it in the face, it is still hard to grasp just how far-reaching it is. We have been swept up by one of the most novel and dramatic transformations that humanity has ever undergone. We read about "revolutions" throughout history: of the printing press, of religion, of ideology, of industry. This is another one; it's the real deal. It's more important than, for example, who will be elected president in 2012, whether the Euro will collapse, or what will come of Iran's nuclear ambitions.

Anyway, this revolution is so novel that it is not surprising if we act like kids screaming in a candy store, not knowing what to sample first. Maybe it's time that we started taking stock of the Internet's candy store more like mature adults and less like sugar-crazed children.

For some time, I've known that I would have to come to a personal understanding of this situation, and make some personal resolutions to deal with it. Like so much else, I've been putting this off, because the problem is massive and I haven't felt equal to it. I'm not sure I am yet. Nobody seems to be—not even the above writers, who offer bleak reports and little in the way of helpful advice. The Internet bedazzles us. But for me, things have come to a head. I do not want to go through the rest of my life in the now all-too-familiar state of Internet bedazzlement, if I can help it. For me, it begins now. It's time for me—and maybe, for you too—to get over the fact that all of the world's information and the people that drive it are (or soon will be) accessible in moments. But how?

Some people won't admit that there is even a problem in the first place. They celebrate the Internet uncritically, leaping upon every new site, app, or gadget that promises to connect us in newer and deeper ways. But it is precisely the wonders of the Internet that we celebrate that have become a major distraction. Some people don't seem to want to admit that distractibility is a serious problem; they do nothing but offer blithe predictions and analysis of how thinking, social interaction, education, etc., are moving into a wonderful new age. That is all very well as far as it goes, but I sometimes wonder if some of the recent economic downturn might be explained by the amount of time we waste online. Surely it's possible that the global economy is significantly less productive because we're distracting ourselves, and each other, so much, and with so little to show for it.

Other people seem to think that there's nothing that can be done about our distractibility and "shallowness." Whatever their disagreements, Internet commentators Clay Shirky and Nicholas Carr seem to agree on this: the brevity of information chunks, the pace of their flow, and the fact that they are mediated democratically by giant web communities are all inevitable features of the Internet; so we can't help but be "distracted." Or so Shirky, Carr, and many techie A-listers seem to think. This is where modern life is lived, for better or worse. If you want to be part of things, you've got to jump into the data stream and do your best to manage. If your distractibility is making you "shallow" or "flat," that is just a new and unfortunate feature of life today.

I will not "go gentle into that good night." I can't help but observe that this sort of techno-fatalism might be why some Internet geeks are becoming anti-intellectual. I'm far from alone in my view that the overall tendency of the Internet, as it is now and as we use it now, is to make us less intellectual. So, many Internet geeks make a virtue of necessity and begin slagging intellectual things like memory (and thus declarative knowledge), books and especially the classics, expertise, and liberal education. At least critics like Carr and Lanier have the good taste and sense to bemoan the situation rather than mindlessly celebrating it.

As to me, I disagree with techno-fatalism strongly. Isn't it obvious that the Internet is still very new, that we are still experiencing its birth pangs, and that dramatic changes to how we use it will probably continue for another generation or two? Isn't it also quite obvious that we have not really figured out how to design and use the Internet in a way that is optimal for us as fully-realized human beings? I love the new universal accessibility of so much recorded knowledge. Over the last dozen years I have been a booster of this myself, and in my work I still aim to enlarge our store of free, high-quality knowledge resources. I also deeply love the free exchange of ideas that the Internet makes possible. These things are why I "live online" myself. I do agree with the boosters that all this will, in time, probably, change us for the better. But the idea that the mindless digital helter-skelter of the early 2000s is how things will always be, from here on out, is highly doubtful.

We simply can't go on like this. I think we can change, and we should.

Part 2: the pernicious design philosophy of the Internet >>

Relevant links

A good place to start learning about what psychologists say about Internet distraction would be via this search.

Nicholas Carr's famous essay, "Is Google Making Us Stupid?" in The Atlantic, is one of those articles you kind of wish you'd written. It focused many people's thinking about the effect of the Internet on how we think. I actually prefer his book The Shallows, however.

The "pancake people" reference is to a short essay by Richard Foresman in Edge.

For some of what I've said about the "revolution" that the Internet and digital media represent, see this, this, and this, just for example.

When I think about the suggestion that it's not a bad thing that information chunks are getting smaller, I think of this Britannica Blog post by Clay Shirky, lauding short-form online communication as an "upstart literature" that will "become the new high culture." Perhaps an older, more widely-read introduction to this notion would be Small Pieces, Loosely Joined by David Weinberger--it's just that the pieces are even smaller and looser than when Weinberger published that book (2002).

"Go gentle into that good night" is, of course, a phrase from the poem "Invictus."

The surely absurd notion that there is a new geek anti-intellectualism is broached in this much-discussed essay.


Efficiency as a basic educational principle

It occurred to me that there is a simple pedagogical principle that explains the appeal of very early learning, homeschooling, and certain (not all) traditional methods of education, as well as why certain other methods of education strike me as a waste of time.

I hereby dub the following the principle of individual efficiency:

Seize every opportunity to help the individual student to learn efficiently--which occurs when the student is interested in something not yet learned but is capable of learning it, and especially when learning it makes it easier to learn more later.

In other words, when an individual student is capable of learning efficiently, seize the opportunity.  If students spend too much idle time when they could be learning, if they are learning only a little, if they are not interested in what is being taught, if they have already learned it, or if they will not understand it, then they aren't learning efficiently.  When a certain approach ceases to conduce to efficient learning, try something else.

Why insert the word "individual" here?  Because "efficiency" in education has entailed, historically, the "industrial model" of education.  It might be an efficient use of resources for the state to pay teachers to teach 35 students the same thing at once, but this is decidedly not the most efficient way for the individual student to learn.  More on this below.

So far, the principle is unremarkable.  But see how I apply the principle to a variety of educational issues.

1. Very early learning, by certain methods, is efficient learning. Under-fives, and even babies, are capable of learning much more than most people give them credit for.  Just for example, they are capable of learning to read.  Maybe more importantly, the use of books above all--but also flashcards, PowerPoint presentations, videos, and iPad apps--can efficiently teach very young children vocabulary and basic concepts and skills that historically were not introduced until some years later.  A lot of old dogmas about "developmental appropriateness" are going by the wayside as parents discover ways to teach their tiny tots much earlier, but in a fun, engaging way.  Just bear in mind, I do not think that pressuring small children to learn is efficient.  That makes them lose interest--which is inefficient--and not just in the "pressured" subject, but in all learning.  Indeed, it is usually best to avoid pressure, whenever possible, regardless of age, which leads me to the next point...

2. Homeschooling's main advantage is its higher potential for efficiency. In a homeschool (not a radical unschooling situation), parents can choose exactly the right books and other materials to match the student, both her interests and her capacities.  (I am by no means saying that most homeschoolers actually do this, though.  Just that they are free to.)  Endless sifting for exactly the right educational materials and methods is extremely important if you want to keep your student's attention and interest, and to keep challenging her.  Done right, a homeschool involves constantly challenging the student, with no unnecessary review.  (But beneficial review, yes.)  Teaching my five-year-old homeschool student, I have developed a sense for what learning "feels like": it seems challenging indeed, but not so difficult as to be boring or impossible; and it lasts for a limited length of time, about the length of my son's attention span.  We rarely spend too much time on a subject, but when we study, we tend to  learn efficiently.  In my own schooling, in good public schools, learning was rarely so efficient.  As a result, my son is far better educated than I was at age five.

3. Unschooling, or at least "radical" unschooling, is often inefficient. Unschooling in its purest form entails allowing the child to choose both the subjects and the methods of study--and even whether to study at all.  The parent does, of course, support and foster the child's pursuits.  It would be wonderful if it always worked.  Unschooling does hold some appeal to me, because I think it is extremely important that students enjoy learning--efficient learning can't happen if it isn't motivated learning.  Insofar as unschooling emphasizes listening to the student and getting heavy student input, I'm a fan.  But unschooling in its purer forms permits students to avoid learning subjects when they, and their future learning, could benefit hugely.  However much fun it might be for the student, however well it might prepare them for a particular trade, this is inefficient as a method of getting a liberal education.

4. Memorizing some facts is efficient. The reason students should memorize, for example, basic arithmetic facts is efficiency.  While I agree that they should be fully exposed to mathematical concepts and multiple methods of attack (understanding math is paramount), memorization of math facts is important because it makes it much easier to do higher math and science later.  Use of calculators in elementary math is sometimes defended on grounds that adults use calculators, too, and learning how to use them is efficient.  That may be, but it is far faster, and more efficient, to be able to do basic arithmetic without a calculator.  This is only an example.  Another example, which I'm going to choose just to annoy people, is history dates.  Consider this list of important dates in history, which looks pretty good to me.  If you're a well-educated person, you should know some such list of dates.  Such dates are the backbone needed to contextualize the historical order and length of other historical events.  If you don't have quite a few of those dates under your belt, you can't really make much sense of other dates that you might come across in reading history--which means you won't learn history properly, and you won't want to learn history because it will be a puzzle.  So it is necessary to commit a fair number of dates to memory simply to make later history more comprehensible and interesting.

5. Reading many carefully-chosen, well-written books is an efficient way to learn. Why is book-reading so efficient?  A well-written book, when chosen to match the student's interest and comprehension level, is designed to teach information in as efficient and attractive a way as possible.  That, after all, is why we say certain books are well-written.  Videos can achieve the same thing, but most videos cannot teach vocabulary and language skills as well as books.  While it is not so popular for educationists to come out against books, they talk up a lot of other methods that do not require book-reading, and--well, there's only so much time.  My approach is different.  Our homeschool is completely "book-centered."  We have six bookcases filled to overflowing (we need another one now) with children's books, both nonfiction and fiction.  I am absolutely convinced that my son is reading and learning far above grade level not because he has a high IQ but simply because I've read a zillion books to him, explaining everything in them that I thought he might not understand.  I truly believe that, of the various general methods of learning, this is the most efficient way to gain knowledge.  It even makes certain "skills development" unnecessary.  Because we have read so much, we have not needed to study vocabulary, spelling, or even basic grammar as separate subjects (see below).

6. Incorporating illustrative multimedia to supplement reading is efficient. Book-reading is great, but you can make it even better by having an iPad on hand to instantly look up pictures, videos, maps, and encyclopedia articles to help clarify what is difficult for you to explain in words.  Sometimes a picture or video is absolutely invaluable in explaining some subject.  I find this to be especially true in geography, and to a slightly lesser extent history and science.  For science reading, I frequently do "mini-experiments" with whatever is on hand, using videos viewed on the iPad as a fallback.

7. Learning the texts of Western civilization is efficient. The more of the ancient Greek and Roman classics that students learn, the better they will be able to understand why our society thinks, judges, and works the way it does.  This goes just as well for the most important works of literature, philosophy, religion, and art throughout the ages.  Studying these texts is efficient because someone with a great foundation in the liberal arts finds it much easier to read and learn from all sorts of other texts.  I suppose the same would also go for students of other great, ancient civilizations like China and India, but I don't have any experience with that.  Anyway, it is profoundly inefficient to expect students to be able to think or say anything interesting, or to learn much, about the big policy questions that are frequently the subject of "bull sessions," without prior exposure to "the best that has been thought and said" about history, philosophy, or political theory.  The same can be said for the discussion of classic literature taken out of context.  A student who is mostly ignorant of history and other classics simply can't appreciate, or say much that is not banal or simply incorrect, about a work of classic literature.  This is why reading of the classics has declined: if you do it halfway, these books are just going to seem confusing and boring.  If you go at it whole hog, you'll actually enjoy them and learn a lot from reading them.

8. Grounded in enough reading, it is much more efficient to write a lot than to do "language arts" workbooks. Elementary school students spend hours and hours doing workbook exercises about grammar, vocabulary, and spelling.  Some such work is, I agree, beneficial.  But a lot of such work is unnecessary busywork if one has read and written a lot.  The best way to get to an 800 verbal score on the GRE is not by studying vocabulary as a subject but by reading a lot of books and being introduced to vocabulary in context, and then looking up words that are puzzling.  If, through reading, one is extremely familiar and comfortable with correct English, reproducing it in written form is much easier, and some of the time spent on grammar, vocabulary, and spelling becomes unimportant.  Far more efficient is to do a lot of writing daily, to get copious feedback from a very literate person, and to revise.  All that said, I am inclined to think that students should go through a full, systematic course of grammar a few times in their academic careers, and some supplementary work on spelling and vocabulary is a good idea.  It's also important to teach children how to use and appreciate reference books as they write.  If a student enjoys browsing style guides, you've done something right.

9. Ed tech's main appeal is its efficiency.  When inefficient, it sucks. Educational technology--I think of websites like WatchKnowLearn, Reading Bear, and educational apps on the iPad--can greatly increase the efficiency of learning.  At its best, ed tech increases student interest and attention span while delivering information or skills practice in a way that fosters understanding and memory.  It doesn't always work that way, though.  Some educational software and Web tools and communities are decidedly less efficient than more traditional methods.  We avoid it in our homeschool.  Sometimes, though--as with the "Presidents vs. Aliens" app--I'll let my son have a little "inefficient" fun if it means he's going to know presidential facts backwards and forwards.  Besides, sometimes having fun in this way makes something that otherwise might seem boring, like a thick volume about presidential history, suddenly more interesting.

10. The project method is inefficient. Now let me explain why I have it in for the project method.  I have loathed this method since it was inflicted upon me back in the 1970s and early 1980s.  It never fails to amaze me that teachers and education professors apparently can't see--or worse, don't care--that making models, playing dress up, putting on lame plays, doing endless navel-gazing projects about themselves, and so forth, are an amazingly inefficient use of time.  It is true that students can learn a few things very well from such projects.  But in the same 20 hours that it takes to do some elaborate history project, a student could have read ten related or increasingly difficult books all on the same subject, written a serious report, and emerged a little expert.  True, he wouldn't be able to point proudly to a model of the pyramids or a mud hut village.  But he would actually know something about ancient Egypt or African village life, something that he would remember.  Moreover, if the books are carefully chosen to fit the student and for quality, and the student can choose the report topic and gets enough help with it, the student can actually like the reading and writing, as much as if not more than yet-another-art-project.

11. Many textbooks are inefficient. Textbooks are written to satisfy textbook adoption committees which are devoted to requirements that often make textbooks deadly boring, especially in the earlier grades.  Going through a textbook might guarantee that you cover the "scope and sequence" of educational standards, but if students are bored, if they find some parts too easy and other parts insufficiently detailed, if textbooks insert unnecessary bias or instead render them so vanilla as to lack any personality, the result won't inspire anyone.  As a result, students don't learn what they should from textbooks, which is just to say that textbooks are inefficient.  We find that by replacing one big textbook with many shorter books, chosen for maximum student interest due to excellent writing and accessibility, we learn far more than we would by studying a textbook. That said, I believe there are still some subjects, at some levels, that are best approached with a textbook--math is an example.

(Added later.) 12. Spaced repetition is efficient. The spaced repetition method, well known to psychologists but shockingly poorly known among actual educators, has the student refresh information in memory, via active (quiz) review, just before it would otherwise be forgotten. Free software (such as Supermemo, Mnemosyne, and Anki) makes such review easy. Most students can achieve a 95% recall rate for information put into such a system, as long as a daily review (which needn't be very long or arduous) is done. The same cannot be said for worksheets, cramming for exams, or passive review of information.
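For the technically curious, here is a minimal sketch, in Python, of the kind of scheduling such software does. It is loosely in the spirit of the well-known SM-2 algorithm that programs like Anki and Mnemosyne descend from; the Card structure and the specific constants are illustrative assumptions of mine, not any particular program's actual implementation.

    from dataclasses import dataclass

    @dataclass
    class Card:
        # One fact or question-answer pair in the review deck.
        easiness: float = 2.5   # how "easy" the card has proven so far
        interval: int = 0       # days until the next review
        repetitions: int = 0    # successful reviews in a row

    def schedule(card: Card, quality: int) -> Card:
        # Update a card after a review graded 0 (total blackout) to 5 (perfect).
        # Successful reviews push the next review further into the future;
        # a failed review (quality below 3) resets the card.
        if quality < 3:
            card.repetitions = 0
            card.interval = 1
        else:
            if card.repetitions == 0:
                card.interval = 1
            elif card.repetitions == 1:
                card.interval = 6
            else:
                card.interval = round(card.interval * card.easiness)
            card.repetitions += 1
        # Nudge easiness up or down depending on how hard the recall felt.
        card.easiness = max(1.3, card.easiness + 0.1
                            - (5 - quality) * (0.08 + (5 - quality) * 0.02))
        return card

    # Example: a card answered well three times in a row comes up for review
    # after roughly 1, 6, and then 16 days.
    card = Card()
    for grade in (5, 5, 5):
        card = schedule(card, grade)
        print(card.interval, round(card.easiness, 2))

The point of the ever-lengthening intervals is exactly the efficiency claim above: each fact gets only as much review as it actually needs, and no more.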

------------------------------

I'm sure I could go on, but I think I've demonstrated that the principle of individual efficiency does pretty deeply explain my stands on various educational issues.  Well, at least I find that interesting; I seem to have put my finger on a system.

For the philosophers out there, if you want a further argument for the principle itself, I think it follows from a traditionalist goal for education, together with a basic principle of rationality.  Given that the goal of education is the development of academic knowledge and skills (to include a broad and deep comprehension of Western civilization and science, a.k.a. the liberal arts), the next big question in philosophy of education is how to describe the most rational means to this end.  The principle of individual efficiency is my stab at that.

I wonder--would progressive educators in their many contemporary forms disagree with the principle, or would they instead disagree that the principle supports my conclusions?  I'm guessing it would be the latter.  But I think it is ultimately the principle itself that they are bound to reject.  Ultimately, progressive education is not about individual efficiency in education at all.  Maybe I'll say what I think it's really about later.

Some people will inevitably read the title and first few paragraphs of this post, skim the rest, and come to a major misinterpretation.  Misinterpretation #1. Some might assume that I am defending the "factory model," merely because the word "efficiency" is associated with that in the field of education.  Nothing could be further from the truth.  I reject the factory model and instead embrace homeschooling precisely because the factory model is so inefficient.   Misinterpretation #2. Some might suppose that I am defending "tiger moms" who constantly pressure students to learn and achieve.  Well, no.  Efficiency is about quality, not quantity.  It requires some discipline, but not harsh discipline.  I think education is most efficient when the student is sincerely interested and motivated.  That requires plenty of breaks and plenty of student input.


What I dislike about experts: dogmatism

Since I started Citizendium, which invites experts to be "village elders wandering the bazaar" of an otherwise egalitarian wiki, and am well-known for criticizing Wikipedia's often-hostile stance toward experts, I am sometimes held up as an example of someone who places too much trust in experts.

In fact, I have quite a bit less trust in experts than most people have.  When I learn that something that strikes me as, at least, capable of being reasonably doubted is the overwhelming majority opinion of experts, I become very suspicious.  Moreover, this has long been my attitude--not just recently, but since before Citizendium, or Wikipedia for that matter.  Let me explain why, and remove the puzzlement these claims must provoke.

First, however, let me explain why I respect and honor experts.  If they really are experts, and not just "the most knowledgeable person in the room" on a subject, it is because they have so goddamn much knowledge about their subject.  Even if I disagree with an expert's views on controversial issues, I stand in awe when it is clear that they can explain and evidently understand so much.  Knowledge per se is deeply important to me, and not just correct memorized information, which computers can ape, but deep understanding.  It is extremely satisfying to have demystified something that was previously puzzling, or to have come to a more complex understanding of something that had seemed simple when, in fact, I did not understand it and it was not simple.  A person has my respect who has grasped much of what is, to me, still mysterious and complex about a subject.

Still, my respect only goes so far, because I am aware of a certain problem with expertise and especially with the social nature of contemporary research.  People are sheep--even very smart people, trained in critical thinking.  When there is a "consensus" or "broad agreement" in many fields, it becomes politically difficult to express disagreement.  If you do, you seem to announce yourself as having some serious personal flaw: stupidity, ignorance of your own field, not being current on the literature, possessing poor judgment, or being ideologically motivated, dishonest, or unbalanced.  This is true not just in obviously controversial or politically-charged debates; it is also true of completely abstract, apolitical stuff that no one outside of a discipline gives a rat's patoot about.

Thus, due to the lemming-like conformity among many researchers, academic agreement tends to feed on itself.  An attitude becomes the only one worth expressing, even if, on the more objective merits of the evidence itself, such confidence is not warranted at all.  Such biases can swing 180 degrees in one generation (think of behaviorism in psychology).

I don't know enough about intellectual history to say for sure, but I suspect things weren't always quite as bad as they are now.  I suspect that academic conformity has been growing at least since I was in college myself, anyway.  There have been intellectual trends or "schools of thought" for millennia, of course, and when scholarship was dominated by the Church and religion--in medieval times and to a lesser extent until the 20th century--certain points of doctrine were held with easily as much dogmatism as one can find anywhere in academe today.  But in the last century, some causes of academic conformity have certainly grown more powerful: academic success is gauged based on how much one has published and in high-ranking journals, while researchers are expected to build upon the work of other researchers.  There is, therefore, an economic incentive to "play it safe" and march in lockstep with some particular view of the subject.  This situation has become even more dire both due to the extreme competition for jobs in academe and research, and due to the literal politicization of some fields (i.e., the devotion of whole disciplines to political goals).

This problem has become so pronounced that I find it is impossible really to evaluate the state of knowledge in a new field until I have come to grips with the leading biases of researchers--how professional conformity or political dogma might be giving an aura of certainty or consensus to views that ought, in fact, to be controversial and vigorously discussed.

I could cite several instances of unwarranted confidence in academic dogma from the fields of philosophy, psychology, and education, but frankly, I don't want to offend anyone.  Academics, of course, don't like to be called sheep or dogmatists.  Besides, I think my point will be more effective if I let people supply their own examples, because you might disagree with mine.  Care to discuss some in comments?

Let me conclude with a prediction.  Contrary to some, the Internet is not going to limit the prerogatives of experts; they have important roles to play in society, and we cannot function at our best without our most knowledgeable people in those roles.  But one of the more interesting and often delightful aspects of the Internet is that it provides a platform for people with nonstandard views.  It will also--it does not yet, but it will--provide a way to quickly compare current views with views from the past.  These two comparison points, nonstandard and historical opinion, were not so readily available in the past as they are or will be.  The easy availability of these dissenting views will make it increasingly obvious just how dogmatic academe has been.  Indeed, this has already started, and is one reason why experts and academics as a group have taken some hits to their credibility online.  Finally, I observe that, for all the ovine nature of researchers, youth often loves to smash idols, and new "education 2.0," degree-by-examination, badge, and other schemes might make such nonconformist idol-smashing a better career option.  I suspect we will see a crop of younger researchers making careers on the newly-viable fringes of academe by pointing out just how ridiculously overblown certain academic dogmas really are--and students eager to save on tuition and get a broader perspective will flock to tutorials with such independent scholars.


The future, according to Kathy Sierra

Kathy Sierra blogged earlier today (six years ago, as it turns out!) that "The future is not in learning"; the future lies, instead, in "unlearning."  This sounds awfully like another example of the geek anti-intellectualism that I love to hate; we'll see about that.  Since that's how the post comes across--beginning with the title--it has already gotten a lot of attention.  Geeks just love to hear that, in the future, they won't have to learn things.  They just love to talk about how they'll be able to upload their memories to the Internet, how Internet search makes memorization a waste of time, how they just can't make themselves read books anymore, how the intellectual authority of experts is passe, and how the liberal arts and college generally are a waste of time.  For all the world it seems they really hate learning.  So when Kathy Sierra says that the future is not in learning, they start salivating.

In fact, Sierra's main message is excellent, one that is not at all anti-intellectual.  She's saying that since the times they are a-changin' so much, we have to roll with them faster and faster, as change accelerates.  This is itself very old news, but it's always nice to be reminded of such perennial wisdom.

Too bad that, as a premise, it hardly supports the post's dramatic title or its opening pseudo-historical timeline.  Her timeline asserts that in the 1970s, the question was (somehow--who knows what this means?) "how well can you learn?"  In the 1990s, it was "how fast and how much can you learn?"  But today we have evolved!  Now it's about how quickly you can unlearn!

If we take the latter claim in any general sense, the argument is fallacious, of course.  It is true that in the 1970s education theorists talked a lot about how well we learn; they still talk about that.  It's also true that there was a movement to accelerate education, especially among young children, which had its height in the 1990s and in the heyday of NCLB.  But when Kathy Sierra next points out the homey, perfectly old-fashioned truth that we must change our habits to keep up with the times, she is changing the subject.  The following argument contains a fallacy:

1. We should unlearn habits that do not conform to new developments.
2. New developments are now coming fast and furious.
3. Therefore, the important new virtue is not learning, but unlearning.

The premises (1 and 2) do not support the sweeping conclusion (3).  I do not contradict myself when I maintain that we should still learn quickly and a lot (allegedly the "1990s" virtue--I thought it was an ancient virtue, but maybe that's just me), even while maintaining that we should change our habits as they become outdated.  The premises do support a much more modest conclusion, that being fast and flexible in how we change our habits is a new virtue.  But to say so entails neither that "the future is unlearning, generally" nor that "the future is not learning, generally."  So this is a plain old logical fallacy.  I leave it as an exercise to the reader to name the fallacy.

Lest I be accused of misconstruing Kathy Sierra, let me add this.  I know that she spends most of her post explaining how we should unlearn certain outdated habits.  I agree that this is excellent and timely advice.  But that does not stop her from titling her post "The future is not in learning..." and contrasting the new virtue of "how fast you can unlearn" with the old virtues of "how well you can learn" and "how fast and how much you can learn."  But the fact of the matter is that unlearning outdated habits is a very, very different kind of learning (or unlearning) from learning facts.

Besides, even if you say that what we should unlearn is certain facts, the facts tend to be about narrow, practical, and necessarily changeable fields, viz., technology and business.  Just because technology and business are changing quickly, that doesn't mean a lot of our knowledge about other topics is becoming uselessly outdated.  If that's the argument, it too is obviously fallacious.

So does Kathy Sierra deserve to be called an "anti-intellectual" for this argument?  Well, on the one hand, one can't take her argument at all seriously as an argument against "learning."  On the other hand, she does seem to have something of a disregard for logic, and if she doesn't literally believe her title, she does seem to pander to the anti-intellectual sentiments of certain geeks.  I hate to be uncharitable, and I wouldn't want to accuse her of encouraging people to stop learning so much, but look--the anti-intellectual sentiment is in the title of her post. Yes, maybe she is merely gunning for traffic by pandering to geek anti-intellectualism.  But why would she want to do that if she didn't share their biases against learning?

UPDATE: see below.  Kathy Sierra responds to point out that this is a six-year-old post.  I don't know quite why I thought it was posted today!  But I've already made a fool of myself, and I'm not one to stop doing so after I've done it publicly, especially at someone else's expense.


Why online conversation cannot be the focus of a new pedagogy

One of the most commonly touted features of a new, digitally-enhanced pedagogy, championed by many plugged-in education theorists, is that education in the digital age can and should be transformed into online conversation. This seems possible and relevant because of online tools like wikis and blogs.  There has been a whole cottage industry of papers and blogs touting such notions.  Frankly, I'm not interested in grappling with a lot of this stuff.  Actually, I wish I had time, because it's kind of fun to expose nonsense to the harsh light of reason.  But for now, let's just say that I've read and skimmed a fair bit of it, and I find it decidedly half-baked, like a lot of the older educational theories that hoped for various educational "reforms."  Some reference points would include fuzzy buzzwords like connectivism, constructivism, conversation, the social view of learning, participatory learning, and many more.

I am interested in briefly discussing a very basic question that, I imagine, underlies a lot of this discussion: can online conversation serve as the focus of a new pedagogy?  I've already written a bit about this in "Individual Knowledge in the Internet Age," but I wanted to return to the topic briefly.

A lot of educators are--not surprisingly--very much struck by the fact that we can learn a lot from each other online.  This is something I've been aware of since the mid-90s, when I ran some mailing lists and indeed did learn a lot from my fellow early adopters.  I continue to learn a lot from people online.  Quora is a great way to learn (albeit it's mostly light intellectual entertainment); so are many blogs and forums.  And of course, wikis can be a useful source of learning both for writers and readers.  These all involve an element of online community, so it of course makes sense that educators might wonder how these new tools could be used as educational tools.  I've developed a few myself and actively participate in other online communities.

But when we, adults, use these tools and participate in these forums, we are building upon our school (and sometimes college) education.  We have learned to write.  We have (hopefully) read reasonably widely, and studied many subjects, giving us the background we absolutely require to understand and build upon common cultural references in our online lives.  But these are not attainments that school children share.  (My focus here will be K-12 education, not college-level education.)  You are making a very dubious assumption if you want to conclude that children can learn the basics of various subjects by online participation modeled after the way adults use online tools.  Namely, you are assuming that children can efficiently learn the basics of science, history, geography, and other academic subjects through online tools and communities that are built by and for educated people.

Of course they can't, and the reason is plain: they usually have to be told new information in order to learn it, and taught and corrected to learn new skills.  These are not "participatory" features.  They require that a teacher or expert be set up to help, in a way that does not correspond to the more egalitarian modes of interaction online.  Moreover, except in some fields that are highly interpretive such as literature or philosophy, the relevant information cannot be arrived at via reflection on what they know--because most children are quite ignorant and much in need of education.  To be able to reflect, they need input.  They need content.  They need food for thought.  They need training and modeling.  They need correction.  We adults don't experience these needs (at least, not so much) when we are surfing away.  We're mostly done learning the concepts, vocabulary, and facts that we need to make sense of conversation in the forums that interest us.

So the reason online conversation cannot be the focus of a new pedagogy is that online conversation, as used by adults for learning, requires prior education.

I have nothing whatsoever against K-12 classes putting their essays or journals on blogs, or co-writing things using wikis, or in other ways using online tools to practice research, writing, and computer skills.  But we should not fool ourselves into thinking that when children do these things, they are doing what we adults do, or that they're learning in the ways we do when we use blogs, wikis, etc.  They aren't.  They're using these as alternative media for getting basic knowledge and practicing skills.  We adults mainly use these media to expand our knowledge of current events and our special interests.  The way we use them is radically different from proper pedagogical uses precisely because our uses require a general education.

Are you skeptical?  Well, I expect that if you're reading this sentence right now, you're pretty well educated.  So consider, please.  What would it be like to read a science blog, or Quora answer on a scientific question, without having studied high school mathematics and science?  Pretty confusing.  What would it be like to read any of the better blogs out there--the ones in your own blogrolls or feeds--if you had not read a lot of literature and in other ways learned a lot of college-level vocabulary?  Difficult and boring.  What would it be like if you had to read the news, or political blogs or Wikipedia's current affairs articles, having only minimal knowledge of geography and civics?  Puzzling at best.  Could you really hold your own in a blog discussion about politics if you had an elementary student's grasp of history and politics?  Would you find it easy to write a forum post coherently, clearly, and with good mechanics and spelling, even just to ask a question, if you had not practiced and studied academic writing and grammar as much as you did?  I could go on, but you get the idea.  You can't do these various things that make you an effective, articulate, plugged-in netizen without already having a reasonably good liberal arts education.

I imagine it's sort of possible, but conversation online among your fellow students would be an incredibly inefficient way for you to learn these things in the first place.  Why spend your time trying to glean facts from the bizarre misunderstandings of your fellow 10-year-olds when you can get an entertaining, authoritative presentation of the information in a book or video?  And I'll tell you one thing--someone in your online study community, the teacher or the class nerd, will have to have read such "authoritative" media, and reveal the secrets to everyone, or you'll be "learning" in a very empty echo chamber.

At this point, someone is bound to point out that they don't really oppose "mere facts" (which can just be looked up), declarative knowledge, or "elitist" academics, or books, or content, or all the other boo-hiss villains of this mindset.  They just want there to be less emphasis on content (memorization is so 20th century!), and more on conversation and hands-on projects.  Why is that so hard to understand?  But this is where they inevitably get vague.  If books and academic knowledge are part of the curriculum after all, then in what way is online conversation the "focus" of the curriculum?  How are academics, really, supposed to figure in education--in practice?

My guess is that when it comes down to implementation, the sadly misled teacher-in-the-trenches will sacrifice a few more of the preciously scarce books in the curriculum and use the time for still more stupid projects and silly groupwork assignments, now moved online using "cutting edge" tools because that's where all the clever people say "the future" lies.  As a result, the students will learn little more about computers and online communities than they would learn through their own use of things like Facebook, and they'll get something that barely resembles a "reasonably good liberal arts education."

EDIT: I greatly enjoyed this literature review/analysis article:

Kirschner, Paul A., John Sweller, and Richard E. Clark, "Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching," Educational Psychologist 41(2), 2006, 75-86: http://www.cogtech.usc.edu/publications/kirschner_Sweller_Clark.pdf


On educational anti-intellectualism: a reply to Steve Wheeler

Suppose a student arrived at the age of 18 not knowing anything significant about World War II or almost any other war, barely able to do arithmetic, ignorant of Sophocles, Shakespeare, Dickens, and most other great writers, and wholly unschooled in the hard sciences (apart from some experiments and projects which made a few random facts stick).  Now, we can charitably concede that such a person could know his way around a computer, the Internet, and other technology very well.  He might have any number of vocational skills and have a job.  We can also imagine that such a person even writes and speaks reasonably well (although this seems unlikely).  Finally, we can imagine such a person being happy with himself and his "education."  This is all easy to imagine, because such students are being minted with appalling frequency these days in the U.S. and (to a lesser extent) the U.K.

Let us try to put aside our differences about educational philosophy for a moment; surely we can agree that, objectively speaking, this student is ignorant. He lacks an adequate amount of--to employ some jargon used by epistemologists, and by Steve Wheeler in a recent blog post that I responded to--"declarative knowledge."

So next, let's suppose that an education professor (whether this corresponds to Wheeler remains to be discussed) were to maintain that (1) our schools should be teaching even less declarative knowledge than they have been, (2) such traditional subjects as literature, history, geography, science, and French had become unimportant, or at least much less important, particularly now that Google supplies instant answers, and (3) we should not teach individual subjects such as those just listed, but instead mix various subjects together in projects that display how holistic and interrelated the world is.  Now, whatever else he might believe or say, it is reasonable to conclude that these recommendations, if followed by schools, would contribute to ignorance of the sort described above.

Now, I do not claim to have an interesting theory of anti-intellectualism.  But I do think that we can identify a theorist as anti-intellectual if his theories, when implemented on a large scale, would obviously and directly lead to widespread ignorance.  This isn't a definition; it's merely a sufficient condition.  (Forgive me for not refining this formula further, but I think it will do well enough.)  I could say more plainly that such a theorist supports ignorance over knowledge, but of course most people will deny supporting that.  So--to use some other philosophical jargon--I only ascribe the view to him de re, not de dicto.

This is not necessarily "anti-intellectual" in some more derivative senses, which have a lot of play in the media today.  For example, an anti-intellectual according to my test might also be an academic and staunchly in support of universities and academic work; he might support a technocratic government of experts; he might support science against faith-based criticisms.  But these are, I maintain, derivative senses of "anti-intellectual," because universities, experts, and science are each bastions of knowledge. Knowledge is the main thing.  So in a more basic sense, to be intellectual is to be a devoted adherent of knowledge, and particularly of abstract or general knowledge.  I don't intend this as a theory of anti-intellectualism, but as a general, rough sketch.

Someone who recommends (or whose theories entail) that students should gain much less knowledge than they otherwise would seems to me a better example of an anti-intellectual than, say, a creationist or a climate change denier.  This is because the ignorance permitted is not limited to a particular topic, but is thoroughgoing--and deliberate.  The (perhaps fictional) education professor I described earlier is opposed to students getting more declarative knowledge, per se, than they get right now.  Whatever their problems, you can't say that of the creationist or the climate change denier; at worst, their positions make them hostile to particular examples of knowledge, not to knowledge per se. Which do you think is worse?

In his recent post, Steve Wheeler defends himself against my charge of "anti-intellectualism."  Now, I hope it's very clear that my posts are not only about Steve Wheeler.  He's just one example of a whole class of education theorists.  He has merely stated the position of educational anti-intellectualism with admirable clarity and brevity, making it especially easy for me to identify and dissect the phenomenon.  Wheeler cites another Brit, Sir Ken Robinson, as someone who shares his views.  I'm sure he will not be surprised to learn that I have, in fact, responded similarly to Robinson (though I forbore to apply the label "anti-intellectual" in that case, I came close).  I also responded to another theorist Wheeler mentioned, John Seely-Brown, in this paper.

In his defense, Wheeler archly, with great irony, claims to be "gratified that someone with such a standing in the academic community had taken the time to read my post and respond so comprehensively," adding, "My list of peer reviewed publications and the frequency of my invited speeches around the world will not compare with his."  In case you have any doubt, let's just say that I am pretty sure Prof. Wheeler took the time to look at my site and gauge my meager academic and speaking credentials.  That would be the first thing that most academics would do.  So of course Wheeler knows that, in fact, I don't have much standing in the academic community at all; I have very few peer reviewed publications, and my speeches, most of which were not for an academic audience, are not as "frequent" as his.  He has me hopelessly outclassed in these areas, and he knows it.  He's the academic and the intellectual, and I'm the outsider--or so he seems to convey.

But his deliberate irony backfires, I find.  It is very easy for a distinguished academic, like Wheeler, to be hostile to knowledge, or science, or reason, or the prerogatives of experts.  Otherwise perfectly "intellectual" people have been justly called "anti-intellectual" because of their hostility to the products, power, or institutions of the mind.  "Anti-intellectual intellectual" is no more a contradiction than "anti-Semitic Jew" or "anti-American American."  So this defense is incorrect: "It seems a contradiction that he can view me as a 'serious theorist' and then spend the majority of his post trying to convince his readers that I am 'anti-intellectual'.  Surely the two cannot be compatible?"  Surely they can--and in our twisted and ironic age, all too often are.  So, while I have respect for Wheeler's work, it doesn't defend him from charges of anti-intellectualism.  He would conscientiously, on principle, deny our students just the sort of knowledge that he benefited from in his life and career--even as he questions whether he needed it later in life, says that his schooling "didn't make that much sense to me," and questions the worth of various subjects and facts that a liberally educated person, such as he himself, might pick up.

No, pointing out that he is a distinguished academic won't shield Wheeler from accusations of anti-intellectualism.  Only a frontal reply to my argument would do that.  Does his recent post contain such a reply?

Not exactly.  I am not going to do another line-by-line reply, as tempting as that might be.  He does deny that he wants to remove "all knowledge...from curricula."  I didn't think so, and my argument doesn't attack such a straw man.

In place of the relatively clear attack on "declarative knowledge," Wheeler's more cautious restatement resorts to a vague, contentless call for reform:

In my post I suggested that a possible way forward would require a reappraisal of the current curricula, with more emphasis on competencies and literacies. I wish to make something clear: My remark that some knowledge was susceptible to obsolescence was not a call for all knowledge to be removed from curricula - that would indeed be ridiculous. I am not attacking knowledge, as Sanger asserts. Rather, I am calling for schools to re-examine the content of curricula and to find ways to situate this knowledge within more open, relevant and dynamic learning contexts. I am also calling for more of an emphasis on the development of skills that will prepare children to cope better in uncertain futures.

He doesn't give many details here or later, nor does he really retract anything in particular from his earlier post.  He does regret using "poor illustrations and analogies to underpin this call," but only because it created a rhetorical opening for me.  As I see it, he wants us to believe that he was merely calling for schools to add a little more discussion and reflection into an otherwise really hardcore "facts-only" curriculum.

But it would be frankly ridiculous to characterize the American educational system, at least, this way.  Many teachers here are already deeply committed to the project method and skills education.  Students can get through an entire 13 years without reading many classics at all.  Indeed, just re-read the first paragraph of this post.  That (at least the first part) describes a lot of students.  Such poor results are no doubt partly because students don't study enough, and their parents aren't committed to school enough to get their children committed.  But it's also partly because schools simply don't teach enough, period.  I had an "honors and AP" sort of public school education in an excellent district (Anchorage, Alaska in the 1980s) and I didn't learn nearly as much as I could or should have.  This is why I'll be homeschooling both of my sons (my first is in kindergarten at home)--because standards have declined even farther from where they were when I was a student.

Schools do, clearly, require a huge amount of work. I think we can agree there.  But let's not confuse work with sound training in the basics and the liberal arts.  There is altogether too much busywork--worksheets, low-priority but time-consuming projects, group reports, and the like--and not nearly enough reading of good books and reflective discussion and writing about it.  We could be requiring less but using more high-impact activities (like reading the classics and letting students go at their own pace through math texts, self-selected from a list proven to raise test scores), and students would learn more.

When Wheeler cites Ken Robinson in criticism of "old industrialised models" of education, calls for "conversation" and "self discovery," and approvingly quotes Richard Gerver in support of a "personal and unpredictive journey," I can stand up and cheer too.  I think Wheeler might be surprised to learn this.  On some issues, we might not be so far apart.  I'm an advocate of home schooling, in which such things are actually possible.  (As I said in my analysis of a Robinson speech, effectively opposing the "industrialized" or "factory" model of education really requires something like homeschooling en masse, which does not seem possible as long as control of education is centralized.)  But we still study subjects. Our studies still have coherence and benefit from our studying conceptually related topics at around the same time.  We still cover the traditional subjects like history and science--in far more detail than I ever did at this age.  It's just that we are able to take detours, choose the books we like, drop the ones we don't, etc.  The point is that you don't have to throw out the baby (knowledge) with the bathwater (regimented, unpersonalized school curricula).

So much for Wheeler's defense.

The question in my mind is whether his explanation has made his commitment to (1)-(3) any less clear.  Should our schools be teaching even less declarative knowledge than they have been?  So it seems, though now he regrets listing individual subjects and facts.  (Maybe fear of being called out, as I've done with Wheeler, explains why education professors often write so vaguely.)  He didn't mention--either to support or to retract--all the business about declarative knowledge being trivial to access and going out of date anyway.  There was no retraction of the line that the availability of instant facts via Google makes the study of various academic subjects pointless.  Should we avoid teaching individual subjects, in favor of (much less efficient) projects that display how holistic and interrelated the world is?  He defended that in his latest post.

Well then, my conclusion still stands: someone who believes (1)-(3) is, admit it or not, advocating for even more ignorance than we suffer from today.  It seems that Wheeler supports (1)-(3), and that looks pretty anti-intellectual to me.

Applying "anti-intellectual" to Wheeler's views is not a mere rhetorical "tactic," as he calls it.  Harsh and possibly impolite it might be, but it names an important feature of his views.  If I wanted to, I could politely agree to drop the epithet.  Then I would simply say that Wheeler's recommendations would have us, deliberately, on purpose, make students more ignorant and less knowledgeable.  Would that really be less damning than the epithet "anti-intellectual"?


An example of educational anti-intellectualism

I've got to stop blogging quite so much, but I couldn't let this pass without comment.

One would expect Steve Wheeler, Associate Professor of learning technology at Plymouth University in England, to be plugged into and more or less represent the latest academic trends in education technology.  If so, I'm a bit scared.

I came across Prof. Wheeler's blog post from yesterday, "Content as curriculum?" If I had wanted to create a parody of both kinds of anti-intellectualism I've mentioned recently--among geeks and among educationists--I couldn't have invented anything better. Wheeler hits many of the highlights, repeating the usual logical howlers as if they were somehow deeply persuasive. While I've already debunked a lot of this elsewhere, I thought it would be instructive to see that I have not, in fact, exaggerated in my characterization of the anti-intellectualism of some educationists.

Wheeler's post is so interesting and provocative that I'm going to go through it line by line.

I think it's about time we reconsidered the way curricula in schools are presented. The tired, just in case model of curriculum just doesn't make sense anymore. Content is still very much king in schools, because 'content as curriculum' is easy and cost effective to deliver, and that is what most governments require and impose.

"Curriculum" is a very slippery term.  Wheeler here appears to mean "whatever can be taught." Later, he later brings out a distinction, familiar to philosophers, between declarative knowledge (knowledge that, "I know that 2+2=4") and procedural knowledge (knowledge how, "I know how to ride a bicycle"). Wheeler's main thesis seems to be that schools should concentrate on teaching procedural knowledge much more and declarative knowledge even less. So we can unpack the title of the blog post, "Content as curriculum?": he is skeptical that "content," or stuff that students might gain declarative knowledge of, should be the focus of the "curriculum," or what is taught.  The curriculum, he seems to maintain, should be practice, skills--not content.

In other words, if you strip away the edu-speak, Wheeler is saying that students should be taught a lot less declarative knowledge. Since this is what we ordinarily mean by "knowledge," we can put it even more simply: Wheeler is opposed to teachers imparting knowledge.

Now, this might sound like a ridiculous oversimplification of Wheeler's views. But if so, that's not my fault, it's Wheeler's. If you read his blog post, you'll see that I'm not being uncharitable in my interpretation. I'm simply explaining what he means. If there were any doubt that he really means this, he makes it all too clear in the next paragraphs, as we'll see.

But most teachers will tell you it's not the best approach.

I'm sure that teachers would be surprised to learn that their peers believe it's "not the best approach" to use "content," or what can be learned as declarative knowledge, as the "curriculum." All I can say is, I hope he's wrong. To be sure, there are some teachers out there who have great contempt for books and what I would call substantial learning. But surely they are still a minority.

When I went to school I was required to attend classes in mathematics, English language and literature, science (physics, biology, chemistry), history, geography, music, art, Religious Education, craft and design, home economics, German and French - all just in case I might need them later in life. With the exception of a few subjects, my schooling didn't make that much sense to me.

...and it appears that how these traditional subjects are useful to him "later in life" still doesn't make sense to him.  I'll enlighten him below.

Occasionally I hear someone saying "I'm glad I took Latin at school", and then arguing that it helped them to discover the name of a fish they caught whilst out angling on holiday. Well, knowing that thalassoma bifasciatum is a blue-headed wrasse may be wonderful for one's self esteem. It may impress your friends during a pub quiz, but it won't get you a job.... and was it really worth all those hours learning how to conjugate amo, amas, amat simply to be able to one day identify a strange fish, when all you need to do in the digital mobile age is Google it?

Here, finally, we get the hint of an argument: the reason that Latin, and presumably all those other subjects, are not "needed" in the curriculum is that we can Google that information. Actual knowledge of those subjects is not needed--because we've got Google.

But does he really mean that all those subjects he listed, and Latin, are not needed?

The question is, how much do children now need to learn in school that is knowledge based? Do children really need to know what a phrasal verb is, or that William Shakespeare died in 1616 when what they really need to be able to do is write a coherent and convincing job application or construct a relevant CV? We call this type of learning declarative knowledge, because it is 'knowing that' - in other words, the learning of facts. Yet, in a post-modernist world where all knowledge has become increasingly mutable and open to challenge, facts go quickly out of date. I was taught in school that there are nine planets orbiting the sun. Today it appears that Pluto is no longer a planet (but for me he will always be a cartoon dog). Is it Myanmar or Burma? I was told by my geography teacher it was Burma. Then she was right, now she is wrong. Just when did Mao Tse-tung change his name to Mao Zedong? And is the atom still the smallest object known to humankind? No. Now we have something called quantum foam. Apparently it's great for holding the universe together but pretty useless in a wet shave. You see, facts are changing all the time, and very little appears to remain concrete. So why are teachers wasting their own time, and that of the kids, teaching them facts which in a few years time may be utterly out of date?

Yep. He means it.

Now, look, I don't know how many times I need to repeat the arguments against this sort of nonsense--I think I did a pretty good job in "Individual Knowledge in the Internet Age"--but it won't hurt to rehearse a few of them briefly:

1. Much of the declarative knowledge that matters, and which requires time and energy to learn, is not of the sort that can be gained by looking it up in Google.  You can read some quick analysis of the causes of the Great Depression, but you won't really know them until you've studied the subject.

2. Most accepted knowledge doesn't change, even over a lifetime.  Fine, Pluto's no longer a planet.  The others are.  99% of what we knew about the solar system 50 years ago has not been disconfirmed.  Most new knowledge adds detail; it does not render old knowledge useless.  (Besides, the professor would not be able to cite this as an example if he had not learned that Pluto was a planet; he couldn't be an articulate, plugged-in thinker without his "useless" declarative knowledge, which he could count on other educated people sharing.)

3. Understanding an answer usually requires substantial background knowledge.  Suppose I want to know when Shakespeare died, and I find out that it is 1616.  But suppose I haven't memorized any dates.  Then this information, "1616," means absolutely nothing whatsoever to me.  It is, at best, a meaningless piece of trivia to me.  Only if I have studied enough history, and yes, memorized enough dates, will 1616 begin to have some significance to me.  I wonder if Wheeler thinks the date doesn't matter, period, because Shakespeare doesn't matter, period. After all, if that date isn't important, is any date important?

4. Most vocabulary is learned in context of copious reading.  If schools start teaching "procedural knowledge" instead of "declarative knowledge," then the vocabulary and conceptual stockpile of students will be so poor that they can't understand the answers they Google.  (They certainly wouldn't be able to understand this blog post.)

5. Finally, declarative knowledge is its own reward.  Understanding the universe is a joy in itself, one of the deepest and most important available to us.   You are a cretin if this point means nothing to you.

Mainly what I think is interesting here is that this is a professor of education, and he is espousing flat-out, pure, unadulterated anti-intellectualism. An educator opposed to teaching knowledge--it's like a chemist opposed to chemicals--a philosopher opposed to thinking. Beyond the sheer madness of the thing, just look at how simple-minded the argument is, and from what appears to be a rather distinguished academic. I actually find this rather sobering and alarming, as I said. It's one thing to encounter such sentiments in academic jargon, with sensible hedging and qualifications, or made by callow, unreflective students; it's another to encounter them in a heartfelt blog post in which, clearly, a serious theorist is airing some of his deeply-held views in forceful language.

Wheeler goes on:

Should we not instead be maximising school contact time by teaching skills, competencies, literacies? After all, it is the ability to work in a team, problem solve on the fly, and apply creative solutions that will be the common currency in the world of future work. Being able to think critically and create a professional network will be the core competencies of the 21st Century knowledge worker. Knowing how - or procedural knowledge - will be a greater asset for most young people. You see, the world of work is in constant change, and that change is accelerating.

There was never a time in history in which "ability to work in a team, problem solve on the fly, and apply creative solutions" were not significant advantages in the work world.  These are not new features.

Another point that seems to be completely lost on those who make such sophomoric arguments as the above is that having a deep well of conceptual understanding is itself absolutely necessary to the ability to "work in a team, problem solve on the fly, and apply creative solutions."  It's even more important for the ability to "think critically."  This is why philosophy graduates famously excel in the business world.  They are trained to think through problems.  Difficult problem solving requires abstract thinking, and the only way to train a person to think effectively and abstractly is by tackling such difficult academic subjects as science, history, classic literature, and philosophy.  Besides, the skills and knowledge learned in these subjects frequently provide a needed edge in fields that require a mathematician's accuracy, a historian's eye to background, a litterateur's or psychologist's grasp of human nature, or a philosopher's clarity of thought.

Besides, not only has declarative knowledge mostly not changed; procedural knowledge changes much faster--which is probably part of the reason it was not taught in schools, for a long time, apart from a few classes.  The specific skills for the work world were, and largely still are, learned on the job.  So let's see, which would have been better for me to learn back in 1985, when I was 17: all the ins and outs of WordPerfect and BASIC, or U.S. History?  There should be no question at all: what I learned about history will remain more or less the same, subject to a few corrections; skills in WordPerfect and BASIC are no longer needed.

My 16 year old son has just embarked on training to become a games designer. If, when I was his age I had told my careers teacher that I wanted to be a games designer, he would have asked me whether I wanted to make cricket bats or footballs. Jobs are appearing that didn't exist even a year or two ago. Other jobs that people expected to be in for life are disappearing or gone forever. Ask the gas mantel fitters or VHS repair technicians. Ask the tin miners, the lamplighters or the typewriter repair people. Er, sorry you can't ask them. They don't exist anymore.

I don't quite understand Wheeler's point here.  His 16-year-old son is training to become a games designer, at an age when Wheeler and I were spending our time learning "mathematics, English language and literature, science (physics, biology, chemistry), history, geography, music, art," etc.  By contrast, his son's early training in games design is supposed to help him in twenty or thirty years--when games will be exactly the same as they are today?  I thought the point was that things like game design change very fast.  Well, I don't know his circumstances, but my guess is that his son would be better off learning the more abstract, relatively unchanging kind of knowledge, providing a foundation, or scaffolding, that will make it easier to learn particular, changeable things later, as well as to communicate more effectively with other well-educated people.  Here I hint at another argument for declarative knowledge, E.D. Hirsch's: it provides us with an absolutely essential common culture, which makes it possible for Wheeler and me to understand each other, at least as well as we do.  Ask entrepreneurs you know and they'll tell you: the ability to communicate quickly, precisely, and in general effectively is a deeply important ability to have in an employee.  You don't often gain that ability on the job; you develop it, or not, by studying various modes of communication.

Why do some teachers still provide children with answers when all the answers are out there on the Web? Contemporary pedagogy is only effective if there is a clear understanding of the power and efficacy of the tools that are available. Shakespeare may well have died in 1616, but surely anyone can look this up on Wikipedia if and when they need to find out for themselves? ...

Well, here's another puzzling thing to say.  Teachers don't "provide children with answers."  If they are doing their jobs properly, they are getting children to learn and understand the answers (and the questions).  A teacher is not a search engine.  Moreover, it's unconscionable that a trainer of teachers would pretend that what teachers do can be done by a search engine.

Get them using the digital tools they are familiar with to go find the knowledge they are unfamiliar with. After all, these are the tools they carry around with them all the time, and these are the tools they will be using when they enter the world of work. And these are the tools that will mould them into independent learners in preparation for challenging times ahead.

"Digital tools" will "mould them into independent learners"?  I've heard a lot of things about digital tools, but never that they would make an independent learner out of a student.

If you want to make independent learners, you have to get them, at a minimum, interested in the world around them--in all of its glorious aspects, preferably, including natural, social, political, mathematical, geographical, philosophical, and so forth.  If they aren't interested in those rich, fascinating aspects of the world--and why should they be, if their teachers dismiss such knowledge as "useless"?--they'll no doubt only be interested in the easiest-to-digest entertainment pablum.  Why think they'll get very interested in how to build software if they haven't been led to be curious about the facts about how software, or the world generally, works?  Surely Wheeler's son has learned some facts that have provided essential scaffolding for his interest in computer programming.

I don't think digital tools, and the mere ability to use them, will make curious, independent learners out of people all by themselves.  Most of all, we need exposure to the content of thought, concerning the natural world and our place in it; then we need to be given the freedom to seek out the answers on our own time.  I don't support the idea of spoon-feeding information--that's what poor teachers do, as well as search engines, come to think of it.  I think students should be challenged to read deeply and reflect on what they read.  That's the tried and true way to make a curious, independent learner.

We need to move with the times, and many schools are still lagging woefully behind the current needs of society. Why do we compartmentalise our subjects in silos? When will we begin to realise that all subjects have overlaps and commonalities, and children need to understand these overlaps to obtain a clear and full picture of their world. Without holistic forms of education, no-one is going to make the link between science and maths, or understand how art or music have influenced history. Some schools such as Albany Senior High School in Auckland are already breaking down the silos and supporting learning spaces where students can switch quickly between 'subjects' across the curriculum. Other schools are beginning to realise that ICT is not a subject and shouldn't be taught, but is best placed as embedded across the entire curriculum.

To answer Wheeler's no doubt rhetorical question, we study various special subjects independently for the reason that there is no other efficient way to learn those subjects.  To be sure, it makes sense to learn the humanities all together, in historical order.  Other subjects might, perhaps, be usefully combined.  But if you make a big melange of it, you'll find that the subjects simply can't be mastered nearly as quickly.  You have to spend a significant amount of time on a subject before the neurons really start firing.  If you're always skipping around from this to that, on the basis of rough analogies and not more substantive conceptual relationships (as becomes clear when you study anything systematically), you never get the extremely significant advantages that accrue from studying things that are conceptually closely related at about the same time.

But if, like Wheeler, you don't think that declarative knowledge of academic subjects is especially important, and you can't grasp what importance abstract conceptual dependencies might have for enhancing understanding, communication, and curiosity, then, no.  You might not think there's any point in focusing on particular subjects.

So, no, "moving with the times" does not require that our children arrive at adulthood as ignoramuses.

It's about time we all woke up and realised that the world around us is changing, and schools need to change too. After all, the school still remains the first and most important place to train and prepare young people for work. If we don't get it right in school, we are storing up huge problems for the future. Education is not life and death. It's much more important than that.

Didn't Wheeler ever learn plus ça change, plus c'est la même chose in his French class?  The pace of change has been increasing, no doubt.  But the world has been changing fairly quickly for the last 150 years, and one of the classic arguments for liberal arts education--something that I can't imagine Wheeler actually endorsing, given what he's written--is precisely that the liberal arts enable us to deal with such changes by giving us a solid foundation, a mature (not to say perfect or unchanging) comprehension of the natural and social world.  They also give us the ability to think and communicate more deeply and accurately about new and changing things.

A student well educated in the liberal arts, who has "memorized"--and really understood--a boatload of "mere facts," will be far better prepared to meet the changes of the coming century than someone who is well trained in the use of digital technology to talk at random about things of which he has been left tragically ignorant.


A short manifesto for schools: time to focus on knowledge

Ever since I was an elementary school student myself, I have been chronically disappointed with the American education establishment. Don't get me wrong--I get along fine with most of the educators I encounter, who are good people and full of all sorts of good ideas and real competence. But I also believe a sickness pervades the entire American system of education, the sickness of anti-intellectualism.

I read plenty of blogs, tweets, and articles on education and ed tech, as well as the occasional book, from all sorts of cutting-edge teachers, administrators, and education theorists. They are all abuzz about the latest online educational resources, which I love and use (and develop) too. But whenever the subject of learning facts or substantially increasing levels of subject knowledge, and--especially--tests of such things comes up, I seem to hear nothing but boos and hisses. This might be surprising, because, after all, what are those educational resources for if not to increase levels of subject knowledge? It seems an exception is made for technology.

But to focus attention on ignorance among students, poor test results, etc., apparently means caring too much about "direct instruction" and a lot of other bêtes noires of the education establishment.  If I talk much about raising standards or returning the focus to knowledge, I seem to be rejecting "student-centered" education, the project method, "useful" (read: vocational) knowledge, and authentic assessments, and replacing such allegedly good things with "drill and kill," learning "trivia," boring textbooks, and in general a return to soul-killing, dusty old methods discarded long ago and rightly so.  What I rarely encounter from the education establishment--though critical outsiders like myself talk endlessly about it--is evidence of an earnest concern about, quite simply, how much students learn.

Enter this Atlantic article about a recent study of the factors correlated with test success, by Harvard researchers Will Dobbie and Roland Fryer--not education professors, but economists who know a thing or two about research methods. Dobbie and Fryer discovered, unsurprisingly, that higher student scores are correlated with a "relentless focus on academic goals." Such a focus entails "frequent teacher feedback, the use of data to guide instruction, high-dosage tutoring, increased instructional time, and high expectations." The factors not correlated with school effectiveness, by contrast, included "class size, per pupil expenditure, the fraction of teachers with no certification, and the fraction of teachers with an advanced degree." My hat is off to these researchers for reminding us of, and giving new supporting data for, what those of us outside of the education establishment already knew: a culture of commitment to academic success is the main thing that matters to academic success. But we may confidently predict that the education establishment will dismiss this study, as it has done so many others that came to similar conclusions.

Reading about the study inspired a few thoughts. First, it is indeed the culture of a school that determines whether its students are successful. This claim makes sense. If the goal of schools were to inculcate knowledge in students--success at which might be measured by the study's "credible estimates of each school's effectiveness"--then it is simple rationality, and requires no special studies, to conclude that the school's staff should have a "relentless focus on academic goals."

If you want to make schools better, it's important that they be absolutely focused on academic goals, which is to say, on knowledge. Excess resources won't buy such a focus (and the study indicates that this might be inversely correlated with success). Class size doesn't matter. Focus on "self-esteem" doesn't help. What is needed more than anything is a pervasive, institutional commitment to knowledge as a goal.

Sadly, increasing student knowledge is not the overriding goal of American schools.

I wish I had time to write a book for a popular audience of parents, teachers, and older students, defending knowledge (not "critical thinking," which requires knowledge; not self-esteem; not social training; not high-tech training) as the most important goal of education. I'd also like to make the case that the education establishment has been really, truly, and in fact anti-intellectual and "anti-knowledge." I'm merely asserting this perhaps startling claim in this post--this isn't the place to make the case. But those who are familiar with intellectualist critiques of American public schools will understand.

If you really want to know why so many kids in the United States do poorly on exams and are basically know-nothings who turn into know-nothing adults, I'll tell you. It's because many parents, many teachers, the broader education establishment, and especially the popular culture of "cool" which guides children's development after a certain age are simply anti-intellectual. Europeans and especially Asians do not have such an albatross around their necks. They actually admire people who are knowledgeable, instead of calling them nerds, and (later in life) dismissing their knowledge as "useless" and dismissing them because of their less-finely-tuned social skills, and (after they gain some real-world status) envying them and labeling them "elitist."

It's been widely believed and often admitted for a long time that the United States has a long, nasty streak of anti-intellectualism. Democrats love to bash Republicans on this point. (And recently I bashed geek anti-intellectualism as well.) But anti-intellectualism in schools? This is apt to make many Democrats, and the more establishment sort of Republican, pretty uncomfortable. Still, it's true. This broad societal illness has kept our schools underperforming not just recently, but for generations.

The common complaints about standardized testing are misplaced. If schools were filling their charges' minds adequately with academic knowledge and skills, they would welcome standardized tests and would not have to spend any extra time on preparing students for them. The focus on the project method is misplaced, too. Projects and experiments are important as far as they go, but it is simply inefficient--if the goal is to develop student knowledge--to make projects the centerpiece of pedagogy.

Finally, the generations-long flight from books, especially well-written books, is a travesty. Books are where the knowledge is. If you want students to know a lot, have them read a lot of non-fiction books. They will inevitably become very knowledgeable. If you make sure that the books are well-written--not boring library fodder like so many geography books, for example--and hand-picked by students, they will more likely enjoy their reading. Having read probably a few thousand children's books to my son in the last five years, I can assure you that there is no shortage of excellent children's books on most subjects. We're in a sort of golden age of children's books--never before has there been such a tremendous variety of offerings so readily available.

Principals and teachers need to lead the way. They need to convey to their students and their students' parents that their schools are all about getting knowledge. "When you leave our school, you will know a lot," educators should be able to tell their students--honestly. "But we will expect you to pay attention and read a lot of books. We will expect a lot of you. Learning at this school won't be easy, but the effort will definitely be worth it. It will open up your eyes to a world that is far larger and deeper than you knew. The knowledge you will gain will make you, in a way, a bigger person, more connected to everything all around you, and better prepared to make the world a better place."

Finally, if schools don't throw off this anti-intellectualism, which has become positively stodgy and stale, and which is so contrary to their mission, they can expect to encounter more and more competition in the form of charter schools, school vouchers, homeschooling, and now virtual schools. If parents who really care about learning run toward new educational systems that have a better chance of actually educating their children, who can blame them?


On changing student beliefs

I came across a very irritating post in the Coffee Theory blog by Greg Linster, and felt inspired to respond.  This began as a comment on his blog, but after a while it became too long for that, so I figured I'd just put it on my own blog.

Greg, you actually seem like a nice guy to me, so try not to take this personally.  I am surprised that you apparently did not notice that you were assuming that it is even permissible to indoctrinate your students; I'm surprised that you thought that the only serious question was whether you have a responsibility to do so.

I think it's wrong--possibly morally wrong, but certainly ill-advised--for college professors, and most other instructors, to presume to correct their students' earnestly held beliefs, as opposed to mere careless mistakes.  I have felt this way since I was a high school student.  The force of the neutral presentation of information and argumentation is usually enough to work its magic on an adequately receptive mind.  What about the inadequately receptive minds?  Indeed there might be some religious types who are so dogmatic that they might not be benefited by a college education (although I really doubt that); there are some who become so disgusted that they quit college and become thoroughly anti-intellectual.  But such people won't be helped by the likes of Boghossian.  They'll be helped by methods that are more neutral and respectful toward the views of the student.

The role of the teacher is to guide students' minds, not to force them--to point them in the right direction, or better, to lay out a pristinely clear map of the land--and leave it up to them to come to sincerely held beliefs that are, one hopes, truer and more nuanced than they were before.

As a college student, I always found the tendency of some college professors to teach their own pet views to be extremely annoying.  What I believe is my own business.  I have no desire to be part of your project to transform the world in your image; as a student, I regard myself as a free agent and merely want the tools to shape my own beliefs.  Indeed, a lot of college students (and for that matter, high school students) are especially irritated by the tendency of some professors (and teachers) to "indoctrinate" them.  Some students make decisions about where to go to college based on how much they can expect to be indoctrinated, or not.  Of course, the fact that a student finds a practice extremely irritating does not mean that it is a poor idea in itself; but the extremeness of the irritation is certainly suggestive.

My basic argument is that the aim of a liberal education is to train the mind to think independently and critically.  But berating students and indoctrinating them has the opposite effect, and for that reason is ruled out.  It teaches them that in the so-called "life of the mind," we may lord it over others once we gain the authority to do so.  On that model, we do not persuade; we (at best) cajole, and if necessary pressure and even force, the minds in our care.  A mind in the habit of viewing intellectual disputes between professor and student as best settled not by rational disputation but by authority handing down the law--which is what you and Boghossian (i.e., Peter Boghossian, not to be confused with the more distinguished philosopher Paul Boghossian) maintain--is a mind that is not liberated to think critically.  (Boghossian's article was, frankly, awful.  I'm embarrassed that it was written by a philosopher.)

Student beliefs in creationism merely require an application of the general rule here.  By all means, state the facts as scientists have uncovered them.  Sure, present the arguments for and against creationism; have the confidence that the latter will appear more plausible on their own merits; but do not berate or belittle students for disagreeing with you, do not mark down essays because they express objectively false theories (there might be exceptions to this rule, of course), and certainly do not require them to repudiate creationism.  The best way to persuade them to give up cherished but ill-supported and irrational beliefs is to train them better in the habits of rational thought. And if you can't do that, and have to resort to "repudiating" and "challenging" student opinions, you have no business teaching at an institution of liberal education.  I certainly wouldn't want my sons taking classes from you, even though I share a scientific, nonbelieving attitude toward creationism.  I would much rather have my sons taking classes from someone who will present the liveliest versions of all arguments and then challenge them to come to the most carefully-argued, nuanced view of the situation.

Boghossian's problem isn't precisely "intellectual arrogance."  He is not wrong merely because he arrogantly assumes that he is correct in repudiating Creationism.  If this is the main argument against him, people are arguing against him for the entirely wrong reason.  He is wrong, instead, for attempting to bully his student into changing his beliefs.  This--not the professor's confidence, but his bullying--does harm to the student's mind and even character.

Greg, you cite a couple of examples of beliefs that, you say, one should certainly try to change in students.  First is the "belief" that 2+2=5; but this doesn't even make any sense as an example for you to cite.  A person who says such a thing has a weird personality quirk that makes him say things the falseness of which is obvious, even to himself.  He might be a poet, or a jokester.  Why assume that a person who merely says that 2+2=5 actually holds an obviously false belief?  As Wittgenstein would say, we can't really understand what it would mean for an adult person, with a grasp of math, to deny that 2+2=4.  (Of course, a person who had never learned enough math to grasp that 2+2=4 wouldn't be in college.)  As to the believer in weird mythologies, it really depends on the case.  Some such people might need psychologists, not college professors.  A philosopher should not attempt such therapy.  Others might have accepted a weird pagan religion (such as "Odinism," which I've come across online).  Are you seriously saying that it's the job of college professors to disabuse people of their religious beliefs, if they are especially bizarre?  Aside from the fact that this is well-nigh impossible, you'd be undermining the student's respect for you as a teacher, and you'd be inculcating the view that it's OK for him to push his weird views onto others if he ever gets into the position of teaching others.

So let's set aside those two examples of beliefs-that-students-should-be-disabused-of.  Clearly, more common, real-world examples are more apt to resemble Creationism and thus be open to the objection I've raised above.

By the way, I think one of the reasons that so many academics today act like such pompous asses, confidently expounding from "on high" about things well outside their areas and de facto requiring their students to endorse their own worldview, is that their own professors got them into the habit.  They, too, were taught that the way to deal with students is not to liberate their minds, but to indoctrinate them.  Boghossian merely reports and defends the habits of some of his peers--and to that extent, he has that honesty that I quite frankly admire when I find it among philosophers.  If Boghossian is taking his own advice, he no doubt behaves like an ass toward his students, and his arguments on this point are crap.  But he has the admirable honesty to own up to an assumption lying behind his and so many of his colleagues' practices of indoctrination.

In short, I find you and Boghossian embracing indoctrination as a pedagogical method.  Now, it might sound strange to call the inculcation of belief in the conclusions of science "indoctrination," but it is not the quality of the belief that makes for indoctrination, it is the method whereby it is taught.  If you're saying that you want to repudiate student belief, to label student beliefs as "mythology," you're not in favor of persuading recalcitrant students with argument; you want to shame them.

Of course, it should not be lost on anyone that there are several great ironies here.  The defenders of indoctrination are, of all things, philosophers. They are attempting to teach the values of science, which include, first and foremost, skepticism and the repudiation of dogmatism.  In so doing, they are seeking to replace dogmatically-held religious views--but with dogmatically-held scientific views, taught not by gentle persuasion and neutral presentation of fact and argument, but by "repudiation" and "labelling" of student views.

Sorry to be so pointed, but I really feel strongly about this, and have, as I said, since high school.