The future, according to Kathy Sierra

Kathy Sierra blogged earlier today six years ago (!) that "The future is not in learning"; the future lies, instead, in "unlearning."  This sounds awfully like another example of the geek anti-intellectualism that I love to hate; we'll see about that.  Since that's how the post comes across--beginning with the title--it has already gotten a lot of attention.  Geeks just love to hear that, in the future, they won't have to learn things.  They just love to talk about how they'll be able to upload their memories to the Internet, how Internet search makes memorization a waste of time, how they just can't make themselves read books anymore, how the intellectual authority of experts is passé, and how the liberal arts and college generally are a waste of time.  For all the world it seems they really hate learning.  So when Kathy Sierra says that the future is not in learning, they start salivating.

In fact, Sierra's main message is excellent, one that is not at all anti-intellectual.  She's saying that since the times they are a-changin' so much, we have to roll with them faster and faster, as change accelerates.  This is itself very old news, but it's always nice to be reminded of such perennial wisdom.

Too bad that, as a premise, it hardly supports the post's dramatic title or its opening pseudo-historical timeline.  Her timeline asserts that in the 1970s, the question was (somehow--who knows what this means?) "how well can you learn?"  In the 1990s, it was "how fast and how much can you learn?"  But today we have evolved!  Now it's about how quickly you can unlearn!

If we take the latter claim in any general sense, the argument is fallacious, of course.  It is true that in the 1970s education theorists talked a lot about how well we learn; they still talk about that.  It's also true that there was a movement to accelerate education, especially among young children, which had its height in the 1990s and then in the heyday of NCLB.  But when Kathy Sierra next points out the homey, perfectly old-fashioned truth that we must change our habits to keep up with the times, she is changing the subject.  The following argument contains a fallacy:

1. We should unlearn habits that do not conform to new developments.
2. New developments are now coming fast and furious.
3. Therefore, the important new virtue is not learning, but unlearning.

The premises (1 and 2) do not support the sweeping conclusion (3).  I do not contradict myself when I maintain that we should still learn quickly and a lot (allegedly the "1990s" virtue--I thought it was an ancient virtue, but maybe that's just me), even while maintaining that we should change our habits as they become outdated.  The premises do support a much more modest conclusion, that being fast and flexible in how we change our habits is a new virtue.  But to say so entails neither that "the future is unlearning, generally" nor that "the future is not learning, generally."  So this is a plain old logical fallacy.  I leave it as an exercise for the reader to name the fallacy.

Lest I be accused of misconstruing Kathy Sierra, let me add this.  I know that she spends most of her post explaining how we should unlearn certain outdated habits.  I agree that this is excellent and timely advice.  But that does not stop her from titling her post "The future is not in learning..." and contrasting the new virtue of "how fast you can unlearn" with the old virtues of "how well you can learn" and "how fast and how much you can learn."  But the fact of the matter is that unlearning outdated habits is a very, very different kind of learning (or unlearning) from learning facts.

Besides, even if you say that what we should unlearn is certain facts, the facts tend to be about narrow, practical, and necessarily changeable fields, viz., technology and business.  Just because technology and business are changing quickly, that doesn't mean a lot of our knowledge about other topics is becoming uselessly outdated.  If that's the argument, it too is obviously fallacious.

So does Kathy Sierra deserve to be called an "anti-intellectual" for this argument?  Well, on the one hand, one can't take her argument at all seriously as an argument against "learning."  On the other hand, she does seem to have something of a disregard for logic, and if she doesn't literally believe her title, she does seem to pander to the anti-intellectual sentiments of certain geeks.  I hate to be uncharitable, and I wouldn't want to accuse her of encouraging people to stop learning so much, but look--the anti-intellectual sentiment is in the title of her post. Yes, maybe she is merely gunning for traffic by pandering to geek anti-intellectualism.  But why would she want to do that if she didn't share their biases against learning?

UPDATE: see below.  Kathy Sierra responds to point out that this is a six-year-old post.  I don't know quite why I thought it was posted today!  But I've already made a fool of myself, and I'm not one to stop doing so after I've done it publicly, especially at someone else's expense.


Why online conversation cannot be the focus of a new pedagogy

One of the most commonly touted features of a new, digitally-enhanced pedagogy, championed by many plugged-in education theorists, is that education in the digital age can and should be transformed into online conversation. This seems possible and relevant because of online tools like wikis and blogs.  There has been a whole cottage industry of papers and blogs touting such notions.  Frankly, I'm not interested in grappling with a lot of this stuff.  Actually, I wish I had time, because it's kind of fun to expose nonsense to the harsh light of reason.  But for now, let's just say that I've read and skimmed a fair bit of it, and I find it decidedly half-baked, like a lot of the older educational theories that hoped for various educational "reforms."  Some reference points would include fuzzy buzzwords like connectivism, constructivism, conversation, the social view of learning, participatory learning, and many more.

I am interested in briefly discussing a very basic question that, I imagine, underlies a lot of this discussion: can online conversation serve as the focus of a new pedagogy?  I've already written a bit about this in "Individual Knowledge in the Internet Age," but I wanted to return to the topic briefly.

A lot of educators are--not surprisingly--very much struck by the fact that we can learn a lot from each other online.  This is something I've been aware of since the mid-90s, when I ran some mailing lists and indeed did learn a lot from my fellow early adopters.  I continue to learn a lot from people online.  Quora is a great way to learn (though mostly as light intellectual entertainment); so are many blogs and forums.  And of course, wikis can be a useful source of learning both for writers and readers.  These all involve an element of online community, so it of course makes sense that educators might wonder how these new tools could be used as educational tools.  I've developed a few myself and actively participate in other online communities.

But when we, adults, use these tools and participate in these forums, we are building upon our school (and sometimes college) education.  We have learned to write.  We have (hopefully) read reasonably widely, and studied many subjects, giving us the background we absolutely require to understand and build upon common cultural references in our online lives.  But these are not attainments that school children share.  (My focus here will be K-12 education, not college-level education.)  You are making a very dubious assumption if you want to conclude that children can learn the basics of various subjects by online participation modeled after the way adults use online tools.  Namely, you are assuming that children can efficiently learn the basics of science, history, geography, and other academic subjects through online tools and communities that are built by and for educated people.

Of course they can't, and the reason is plain: they usually have to be told new information in order to learn it, and taught and corrected to learn new skills.  These are not "participatory" features.  They require that a teacher or expert be set up to help, in a way that does not correspond to the more egalitarian modes of interaction online.  Moreover, except in highly interpretive fields such as literature and philosophy, the relevant information cannot be arrived at by reflecting on what they already know--because most children are quite ignorant and much in need of education.  To be able to reflect, they need input.  They need content.  They need food for thought.  They need training and modeling.  They need correction.  We adults don't experience these needs (at least, not so much) when we are surfing away.  We're mostly done learning the concepts, vocabulary, and facts that we need to make sense of conversation in the forums that interest us.

So the reason online conversation cannot be the focus of a new pedagogy is that online conversation, as used by adults for learning, requires prior education.

I have nothing whatsoever against K-12 classes putting their essays or journals on blogs, or co-writing things using wikis, or in other ways using online tools to practice research, writing, and computer skills.  But we should not fool ourselves into thinking that when children do these things, they are doing what we adults do, or that they're learning in the ways we do when we use blogs, wikis, etc.  They aren't.  They're using these as alternative media for getting basic knowledge and practicing skills.  We adults mainly use these media to expand our knowledge of current events and our special interests.  The way we use them is radically different from proper pedagogical uses precisely because our uses require a general education.

Are you skeptical?  Well, I expect that if you're reading this sentence right now, you're pretty well educated.  So consider, please.  What would it be like to read a science blog, or a Quora answer on a scientific question, without having studied high school mathematics and science?  Pretty confusing.  What would it be like to read any of the better blogs out there--the ones in your own blogrolls or feeds--if you had not read a lot of literature and in other ways learned a lot of college-level vocabulary?  Difficult and boring.  What would it be like if you had to read the news, or political blogs or Wikipedia's current affairs articles, having only minimal knowledge of geography and civics?  Puzzling at best.  Could you really hold your own in a blog discussion about politics if you had an elementary student's grasp of history and politics?  Would you find it easy to write a forum post coherently, clearly, and with good mechanics and spelling, even just to ask a question, if you had not practiced and studied academic writing and grammar as much as you did?  I could go on, but you get the idea.  You can't do these various things that make you an effective, articulate, plugged-in netizen without already having a reasonably good liberal arts education.

Learning these things in the first place through online conversation with your fellow students is, I imagine, sort of possible--but it would be incredibly inefficient.  Why spend your time trying to glean facts from the bizarre misunderstandings of your fellow 10-year-olds when you can get an entertaining, authoritative presentation of the information in a book or video?  And I'll tell you one thing--someone in your online study community, the teacher or the class nerd, will have to have read such "authoritative" media, and reveal the secrets to everyone, or you'll be "learning" in a very empty echo chamber.

At this point, someone is bound to point out that they don't really oppose "mere facts" (which can just be looked up), declarative knowledge, or "elitist" academics, or books, or content, or all the other boo-hiss villains of this mindset.  They just want there to be less emphasis on content (memorization is so 20th century!), and more on conversation and hands-on projects.  Why is that so hard to understand?  But this is where they inevitably get vague.  If books and academic knowledge are part of the curriculum after all, then in what way is online conversation the "focus" of the curriculum?  How are academics, really, supposed to figure in education--in practice?

My guess is that when it comes down to implementation, the sadly misled teacher-in-the-trenches will sacrifice a few more of the preciously scarce books in the curriculum and use the time for still more stupid projects and silly groupwork assignments, now moved online using "cutting edge" tools because that's where all the clever people say "the future" lies.  As a result, the students will learn little more about computers and online communities than they would learn through their own use of things like Facebook, and they'll get something that barely resembles a "reasonably good liberal arts education."

EDIT: I greatly enjoyed this literature review/analysis article:

Kirschner, Paul A., John Sweller, and Richard E. Clark. "Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching." Educational Psychologist 41(2): 75-86. http://www.cogtech.usc.edu/publications/kirschner_Sweller_Clark.pdf


On educational anti-intellectualism: a reply to Steve Wheeler

Suppose a student arrived at the age of 18 not knowing anything significant about World War II or almost any other war, barely able to do arithmetic, ignorant of Sophocles, Shakespeare, Dickens, and most other great writers, and wholly unschooled in the hard sciences (apart from some experiments and projects which made a few random facts stick).  Now, we can charitably concede that such a person could know his way around a computer, the Internet, and other technology very well.  He might have any number of vocational skills and have a job.  We can also imagine that such a person even writes and speaks reasonably well (although this seems unlikely).  Finally, we can imagine such a person being happy with himself and his "education."  This is all easy to imagine, because such students are being minted with appalling frequency these days in the U.S. and (to a lesser extent) the U.K.

Let us try to put aside our differences about educational philosophy for a moment; surely we can agree that, objectively speaking, this student is ignorant. He lacks an adequate amount of--to employ some jargon used by epistemologists, and by Steve Wheeler in a recent blog post that I responded to--"declarative knowledge."

So next, let's suppose that an education professor (whether this corresponds to Wheeler remains to be discussed) were to maintain that (1) our schools should be teaching even less declarative knowledge than they have been, (2) such traditional subjects as literature, history, geography, science, and French had become unimportant, or at least much less important, particularly now that Google supplies instant answers, and (3) we should not teach individual subjects such as those just listed, but instead mix various subjects together in projects that display how holistic and interrelated the world is.  Now, whatever else he might believe or say, it is reasonable to conclude that these recommendations, if followed by schools, would contribute to ignorance of the sort described above.

Now, I do not claim to have an interesting theory of anti-intellectualism.  But I do think that we can identify a theorist as anti-intellectual if his theories, when implemented on a large scale, would obviously and directly lead to widespread ignorance.  This isn't a definition; it's merely a sufficient condition.  (Forgive me for not refining this formula further, but I think it will do well enough.)  I could say more plainly that such a theorist supports ignorance over knowledge, but of course most people will deny supporting that.  So--to use some other philosophical jargon--I only ascribe the view to him de re, not de dicto.

This is not necessarily "anti-intellectual" in some more derivative senses, which have a lot of play in the media today.  For example, an anti-intellectual according to my test might also be an academic and staunchly in support of universities and academic work; he might support a technocratic government of experts; he might support science against faith-based criticisms.  But these are, I maintain, derivative senses of "anti-intellectual," because universities, experts, and science are each bastions of knowledge. Knowledge is the main thing.  So in a more basic sense, to be intellectual is to be a devoted adherent of knowledge, and particularly of abstract or general knowledge.  I intend this not as a theory of anti-intellectualism, but as a general, rough sketch.

Someone who recommends (or whose theories entail) that students should gain much less knowledge than they otherwise would seems to me a better example of an anti-intellectual than, say, a creationist or a climate change denier.  This is because the ignorance permitted is not limited to a particular topic, but is thoroughgoing--and deliberate.  The (perhaps fictional) education professor I described earlier is opposed to students getting more declarative knowledge, per se, than they get right now.  Whatever their problems, you can't say that of the creationist or the climate change denier; at worst, their positions make them hostile to particular examples of knowledge, not to knowledge per se. Which do you think is worse?

In his recent post, Steve Wheeler defends himself against my charge of "anti-intellectualism."  Now, I hope it's very clear that my posts are not only about Steve Wheeler.  He's just one example of a whole class of education theorist.  He has merely stated the position of educational anti-intellectualism with admirable clarity and brevity, making it especially easy for me to identify and dissect the phenomenon.  Wheeler cites another Brit, Sir Ken Robinson, as someone who shares his views.  I'm sure he will not be surprised to learn that I have, in fact, responded similarly to Robinson (though I forbore to apply the label "anti-intellectual" in that case--I came close).  I also responded to another theorist Wheeler mentioned, John Seely-Brown, in this paper.

In his defense, Wheeler archly claims to be "gratified that someone with such a standing in the academic community had taken the time to read my post and respond so comprehensively" and "My list of peer reviewed publications and the frequency of my invited speeches around the world will not compare with his."  In case you have any doubt, let's just say that I am pretty sure Prof. Wheeler took the time to look at my site and gauge my meager academic and speaking credentials.  That would be the first thing that most academics would do.  So of course Wheeler knows that, in fact, I don't have much standing in the academic community at all; I have very few peer reviewed publications, and my speeches, most of which were not for an academic audience, are not as "frequent" as his.  He has me hopelessly outclassed in these areas, and he knows it.  He's the academic and the intellectual, and I'm the outsider--or so he seems to convey.

But his deliberate irony backfires, I find.  It is very easy for a distinguished academic, like Wheeler, to be hostile to knowledge, or science, or reason, or the prerogatives of experts.  Otherwise perfectly "intellectual" people have been justly called "anti-intellectual" because of their hostility to the products, power, or institutions of the mind.  "Anti-intellectual intellectual" is no more a contradiction than "anti-Semitic Jew" or "anti-American American."  So this defense is incorrect: "It seems a contradiction that he can view me as a 'serious theorist' and then spend the majority of his post trying to convince his readers that I am 'anti-intellectual'.  Surely the two cannot be compatible?"  Surely they can--and in our twisted and ironic age, all too often are.  So, while I have respect for Wheeler's work, it doesn't defend him from charges of anti-intellectualism.  He would conscientiously, on principle, deny our students just the sort of knowledge that he benefited from in his own life and career--though he questions whether he needed it later in life, says that his schooling "didn't make that much sense to me," and questions the worth of various subjects and facts that a liberally educated person, such as he himself, might pick up.

No, pointing out that he is a distinguished academic won't shield Wheeler from accusations of anti-intellectualism.  Only a frontal reply to my argument would do that.  Does his recent post contain such a reply?

Not exactly.  I am not going to do another line-by-line reply, as tempting as that might be.  He does deny that he wants to remove "all knowledge...from curricula."  I didn't think so, and my argument doesn't attack such a straw man.

In place of the relatively clear attack on "declarative knowledge," Wheeler's more cautious restatement resorts to a vague, contentless call for reform:

In my post I suggested that a possible way forward would require a reappraisal of the current curricula, with more emphasis on competencies and literacies. I wish to make something clear: My remark that some knowledge was susceptible to obsolescence was not a call for all knowledge to be removed from curricula - that would indeed be ridiculous. I am not attacking knowledge, as Sanger asserts. Rather, I am calling for schools to re-examine the content of curricula and to find ways to situate this knowledge within more open, relevant and dynamic learning contexts. I am also calling for more of an emphasis on the development of skills that will prepare children to cope better in uncertain futures.

He doesn't give many details here or later, nor does he really retract anything in particular from his earlier post.  He does regret using "poor illustrations and analogies to underpin this call," but only because it created a rhetorical opening for me.  As I see it, he wants us to believe that he was merely calling for schools to add a little more discussion and reflection to an otherwise really hardcore "facts-only" curriculum.

But it would be frankly ridiculous to characterize the American educational system, at least, this way.  Many teachers here are already deeply committed to the project method and skills education.  Students can get through an entire 13 years without reading many classics at all.  Indeed, just re-read the first paragraph of this post.  That (at least the first part) describes a lot of students.  Such poor results are no doubt partly because students don't study enough, and their parents aren't committed to school enough to get their children committed.  But it's also partly because schools simply don't teach enough, period.  I had an "honors and AP" sort of public school education in an excellent district (Anchorage, Alaska in the 1980s) and I didn't learn nearly as much as I could or should have.  This is why I'll be homeschooling both of my sons (my first is in kindergarten at home)--because standards have declined even further from where they were when I was a student.

Schools do, clearly, require a huge amount of work. I think we can agree there.  But let's not confuse work with sound training in the basics and the liberal arts.  There is altogether too much busywork--worksheets, low-priority but time-consuming projects, group reports, and the like--and not nearly enough reading of good books, or reflective discussion and writing about them.  We could be requiring less but using more high-impact activities (like reading the classics and letting students go at their own pace through math texts, self-selected from a list proven to raise test scores), and students would learn more.

When Wheeler cites Ken Robinson in criticism of "old industrialised models" of education, calls for "conversation" and "self discovery," and approvingly quotes Richard Gerver in support of a "personal and unpredictive journey," I can stand up and cheer too.  I think Wheeler might be surprised to learn this.  On some issues, we might not be so far apart.  I'm an advocate of home schooling, in which such things are actually possible.  (As I said in my analysis of a Robinson speech, effectively opposing the "industrialized" or "factory" model of education really requires something like homeschooling en masse, which does not seem possible as long as control of education is centralized.)  But we still study subjects. Our studies still have coherence and benefit from our studying conceptually related topics near to the same time.  We still cover the traditional subjects like history and science--in far more detail than I ever did at this age.  It's just that we are able to take detours, choose the books we like, drop the ones we don't, etc.  The point is that you don't have to throw out the baby (knowledge) with the bathwater (regimented, unpersonalized school curricula).

So much for Wheeler's defense.

The question in my mind is whether his explanation has made his commitment to (1)-(3) any less clear.  Should our schools be teaching even less declarative knowledge than they have been?  So it seems, though now he regrets listing individual subjects and facts.  (Maybe fear of being called out, as I've called out Wheeler, explains why education professors often write so vaguely.)  He didn't mention--neither to support nor to retract--all the business about declarative knowledge being trivial to access and going out of date anyway.  No retraction of the line that the availability of instant facts via Google makes the study of various academic subjects pointless.  Should we avoid teaching individual subjects, in favor of (much less efficient) projects that display how holistic and interrelated the world is?  He defended that in his latest.

Well then, my conclusion still stands: someone who believes (1)-(3) is, admit it or not, advocating for even more ignorance than we suffer from today.  It seems that Wheeler supports (1)-(3), and that looks pretty anti-intellectual to me.

Applying "anti-intellectual" to Wheeler's views is not a mere rhetorical "tactic," as he calls it.  Harsh and possibly impolite it might be, but it names an important feature of his views.  If I wanted to, I could politely agree to drop the epithet.  Then I would simply say that Wheeler's recommendations would have us, deliberately, on purpose, make students more ignorant and less knowledgeable.  Would that really be less damning than the epithet "anti-intellectual"?


An example of educational anti-intellectualism

I've got to stop blogging quite so much, but I couldn't let this pass without comment.

One would expect Steve Wheeler, Associate Professor of learning technology at Plymouth University in England, to be plugged into and more or less represent the latest academic trends in education technology.  If so, I'm a bit scared.

I came across Prof. Wheeler's blog post from yesterday, "Content as curriculum?" If I had wanted to create a parody of both kinds of anti-intellectualism I've mentioned recently--among geeks and among educationists--I couldn't have invented anything better. Wheeler hits many of the highlights, repeating the usual logical howlers as if they were somehow deeply persuasive. While I've already debunked a lot of this elsewhere, I thought it would be instructive to see that I have not, in fact, exaggerated in my characterization of the anti-intellectualism of some educationists.

Wheeler's post is so interesting and provocative that I'm going to go through it line by line.

I think it's about time we reconsidered the way curricula in schools are presented. The tired, just in case model of curriculum just doesn't make sense anymore. Content is still very much king in schools, because 'content as curriculum' is easy and cost effective to deliver, and that is what most governments require and impose.

"Curriculum" is a very slippery term.  Wheeler here appears to mean "whatever can be taught." Later, he brings out a distinction, familiar to philosophers, between declarative knowledge (knowledge that, "I know that 2+2=4") and procedural knowledge (knowledge how, "I know how to ride a bicycle"). Wheeler's main thesis seems to be that schools should concentrate on teaching procedural knowledge much more and declarative knowledge even less. So we can unpack the title of the blog post, "Content as curriculum?": he is skeptical that "content," or stuff that students might gain declarative knowledge of, should be the focus of the "curriculum," or what is taught.  The curriculum, he seems to maintain, should be practice, skills--not content.

In other words, if you strip away the edu-speak, Wheeler is saying that students should be taught a lot less declarative knowledge. Since this is what we ordinarily mean by "knowledge," we can put it even more simply: Wheeler is opposed to teachers imparting knowledge.

Now, this might sound like a ridiculous oversimplification of Wheeler's views. But if so, that's not my fault; it's Wheeler's. If you read his blog post, you'll see that I'm not being uncharitable in my interpretation. I'm simply explaining what he means. If there were any doubt that he really means this, he makes it all too clear in the next paragraphs, as we'll see.

But most teachers will tell you it's not the best approach.

I'm sure that teachers would be surprised to learn that their peers believe it's "not the best approach" to use "content," or what can be learned as declarative knowledge, as the "curriculum." All I can say is, I hope he's wrong. To be sure, there are some teachers out there who have great contempt for books and what I would call substantial learning. But surely they are still a minority.

When I went to school I was required to attend classes in mathematics, English language and literature, science (physics, biology, chemistry), history, geography, music, art, Religious Education, craft and design, home economics, German and French - all just in case I might need them later in life. With the exception of a few subjects, my schooling didn't make that much sense to me.

...and it appears that how these traditional subjects are useful to him "later in life" still doesn't make sense to him.  I'll enlighten him below.

Occasionally I hear someone saying "I'm glad I took Latin at school", and then arguing that it helped them to discover the name of a fish they caught whilst out angling on holiday. Well, knowing that thalassoma bifasciatum is a blue-headed wrasse may be wonderful for one's self esteem. It may impress your friends during a pub quiz, but it won't get you a job.... and was it really worth all those hours learning how to conjugate amo, amas, amat simply to be able to one day identify a strange fish, when all you need to do in the digital mobile age is Google it?

Here, finally, we get the hint of an argument: the reason that Latin, and presumably all those other subjects, are not "needed" in the curriculum is that we can Google that information. Actual knowledge of those subjects is not needed--because we've got Google.

But does he really mean that all those subjects he listed, and Latin, are not needed?

The question is, how much do children now need to learn in school that is knowledge based? Do children really need to know what a phrasal verb is, or that William Shakespeare died in 1616 when what they really need to be able to do is write a coherent and convincing job application or construct a relevant CV? We call this type of learning declarative knowledge, because it is 'knowing that' - in other words, the learning of facts. Yet, in a post-modernist world where all knowledge has become increasingly mutable and open to challenge, facts go quickly out of date. I was taught in school that there are nine planets orbiting the sun. Today it appears that Pluto is no longer a planet (but for me he will always be a cartoon dog). Is it Myanmar or Burma? I was told by my geography teacher it was Burma. Then she was right, now she is wrong. Just when did Mao Tse-tung change his name to Mao Zedong? And is the atom still the smallest object known to humankind? No. Now we have something called quantum foam. Apparently it's great for holding the universe together but pretty useless in a wet shave. You see, facts are changing all the time, and very little appears to remain concrete. So why are teachers wasting their own time, and that of the kids, teaching them facts which in a few years time may be utterly out of date?

Yep. He means it.

Now, look, I don't know how many times I need to repeat the arguments against this sort of nonsense--I think I did a pretty good job in "Individual Knowledge in the Internet Age"--but it won't hurt to rehearse a few of them briefly:

1. Much of the declarative knowledge that matters, and which requires time and energy to learn, is not of the sort that can be gained by looking it up in Google.  You can read some quick analysis of the causes of the Great Depression, but you won't really know them until you've studied the subject.

2. Most accepted knowledge doesn't change, even over a lifetime.  Fine, Pluto's no longer a planet.  The others are.  99% of what we knew about the solar system 50 years ago has not been disconfirmed.  Most new knowledge adds detail; it does not render old knowledge useless.  (Besides, the professor would not be able to cite this as an example if he had not learned that Pluto was a planet; he couldn't be an articulate, plugged-in thinker without his "useless" declarative knowledge, which he could count on other educated people sharing.)

3. Understanding an answer usually requires substantial background knowledge.  Suppose I want to know when Shakespeare died, and I find out that it is 1616.  But suppose I haven't memorized any dates.  Then this information, "1616," means absolutely nothing whatsoever to me.  It is, at best, a meaningless piece of trivia to me.  Only if I have studied enough history, and yes, memorized enough dates, will 1616 begin to have some significance to me.  I wonder if Wheeler thinks the date doesn't matter, period, because Shakespeare doesn't matter, period. After all, if that date isn't important, is any date important?

4. Most vocabulary is learned in context of copious reading.  If schools start teaching "procedural knowledge" instead of "declarative knowledge," then the vocabulary and conceptual stockpile of students will be so poor that they can't understand the answers they Google.  (They certainly wouldn't be able to understand this blog post.)

5. Finally, declarative knowledge is its own reward.  Understanding the universe is a joy in itself, one of the deepest and most important available to us.   You are a cretin if this point means nothing to you.

Mainly what I think is interesting here is that this is a professor of education, and he is espousing flat-out, pure, unadulterated anti-intellectualism. An educator opposed to teaching knowledge--it's like a chemist opposed to chemicals--a philosopher opposed to thinking. Beyond the sheer madness of the thing, just look at how simple-minded the argument is, and from what appears to be a rather distinguished academic. I actually find this rather sobering and alarming, as I said. It's one thing to encounter such sentiments in academic jargon, with sensible hedging and qualifications, or made by callow, unreflective students; it's another to encounter them in a heartfelt blog post in which, clearly, a serious theorist is airing some of his deeply-held views in forceful language.

Wheeler goes on:

Should we not instead be maximising school contact time by teaching skills, competencies, literacies? After all, it is the ability to work in a team, problem solve on the fly, and apply creative solutions that will be the common currency in the world of future work. Being able to think critically and create a professional network will be the core competencies of the 21st Century knowledge worker. Knowing how - or procedural knowledge - will be a greater asset for most young people. You see, the world of work is in constant change, and that change is accelerating.

There was never a time in history in which "ability to work in a team, problem solve on the fly, and apply creative solutions" were not significant advantages in the work world.  These are not new features.

Another point that seems to be completely lost on those who make such sophomoric arguments as the above is that having a deep well of conceptual understanding is itself absolutely necessary to the ability to "work in a team, problem solve on the fly, and apply creative solutions."  It's even more important for the ability to "think critically."  This is why philosophy graduates famously excel in the business world.  They are trained to think through problems.  Difficult problem solving requires abstract thinking, and the only way to train a person to think effectively and abstractly is by tackling such difficult academic subjects as science, history, classic literature, and philosophy.  Besides, the skills and knowledge learned in these subjects frequently provide a needed edge in fields that require a mathematician's accuracy, a historian's eye to background, a litterateur's or psychologist's grasp of human nature, or a philosopher's clarity of thought.

Besides, not only has declarative knowledge mostly not changed; procedural knowledge changes much faster--which is probably part of the reason it was not taught in schools, for a long time, apart from a few classes.  The specific skills for the work world were, and largely still are, learned on the job.  So let's see, which would have been better for me to learn back in 1985, when I was 17: all the ins and outs of WordPerfect and BASIC, or U.S. History?  There should be no question at all: what I learned about history will remain more or less the same, subject to a few corrections; skills in WordPerfect and BASIC are no longer needed.

My 16 year old son has just embarked on training to become a games designer. If, when I was his age I had told my careers teacher that I wanted to be a games designer, he would have asked me whether I wanted to make cricket bats or footballs. Jobs are appearing that didn't exist even a year or two ago. Other jobs that people expected to be in for life are disappearing or gone forever. Ask the gas mantel fitters or VHS repair technicians. Ask the tin miners, the lamplighters or the typewriter repair people. Er, sorry you can't ask them. They don't exist anymore.

I don't quite understand Wheeler's point here.  His 16-year-old son is training to become a games designer, at an age when Wheeler and I were spending our time learning "mathematics, English language and literature, science (physics, biology, chemistry), history, geography, music, art," etc.  By contrast, his son's early training in games design is supposed to help him in twenty or thirty years--when games will be exactly the same as they are today?  I thought the point was that things like game design change very fast.  Well, I don't know his circumstances, but my guess is that his son would be better off learning the more abstract, relatively unchanging kind of knowledge, providing a foundation, or scaffolding, that will make it easier to learn particular, changeable things later, as well as to communicate more effectively with other well-educated people.  Here I hint at another argument for declarative knowledge, E.D. Hirsch's: it provides us an absolutely essential common culture, which makes it possible for Wheeler and me to understand each other, at least as well as we do.  Ask entrepreneurs you know and they'll tell you: the ability to communicate quickly, precisely, and in general effectively is a deeply important ability to have in an employee.  You don't often gain that ability on the job; you develop it, or not, by studying various modes of communication.

Why do some teachers still provide children with answers when all the answers are out there on the Web? Contemporary pedagogy is only effective if there is a clear understanding of the power and efficacy of the tools that are available. Shakespeare may well have died in 1616, but surely anyone can look this up on Wikipedia if and when they need to find out for themselves? ...

Well, here's another puzzling thing to say.  Teachers don't "provide children with answers."  If they are doing their jobs properly, they are getting children to learn and understand the answers (and the questions).  A teacher is not a search engine.  Moreover, it's unconscionable that a trainer of teachers would pretend that what teachers do can be done by a search engine.

Get them using the digital tools they are familiar with to go find the knowledge they are unfamiliar with. After all, these are the tools they carry around with them all the time, and these are the tools they will be using when they enter the world of work. And these are the tools that will mould them into independent learners in preparation for challenging times ahead.

"Digital tools" will "mould them into independent learners"?  I've heard a lot of things about digital tools, but never that they would make an independent learner out of a student.

If you want to make an independent learner, you have to get them, at a minimum, interested in the world around them--in all of its glorious aspects, preferably, including natural, social, political, mathematical, geographical, philosophical, and so forth.  If they aren't interested in those rich, fascinating aspects of the world--and why should they be, if their teachers dismiss such knowledge as "useless"?--they'll no doubt only be interested in the easiest-to-digest entertainment pablum.  Why think they'll get very interested in how to build software if they haven't been led to be curious about the facts about how software, or the world generally, works?  Surely Wheeler's son has learned some facts that have provided essential scaffolding for his interest in computer programming.

I don't think digital tools, and the mere ability to use them, will make curious, independent learners out of people all by themselves.  Most of all, we need exposure to the content of thought, concerning the natural world and our place in it; then we need to be given the freedom to seek out the answers on our own time.  I don't support the idea of spoon-feeding information--that's what poor teachers do, as well as search engines, come to think of it.  I think students should be challenged to read deeply and reflect on what they read.  That's the tried and true way to make a curious, independent learner.

We need to move with the times, and many schools are still lagging woefully behind the current needs of society. Why do we compartmentalise our subjects in silos? When will we begin to realise that all subjects have overlaps and commonalities, and children need to understand these overlaps to obtain a clear and full picture of their world. Without holistic forms of education, no-one is going to make the link between science and maths, or understand how art or music have influenced history. Some schools such as Albany Senior High School in Auckland are already breaking down the silos and supporting learning spaces where students can switch quickly between 'subjects' across the curriculum. Other schools are beginning to realise that ICT is not a subject and shouldn't be taught, but is best placed as embedded across the entire curriculum.

To answer Wheeler's no doubt rhetorical question, we study various special subjects independently for the reason that there is no other efficient way to learn those subjects.  To be sure, it makes sense to learn the humanities all together, in historical order.  Other subjects might, perhaps, be usefully combined.  But if you make a big melange of it, you'll find that the subjects simply can't be mastered nearly as quickly.  You have to spend a significant amount of time on a subject before the neurons really start firing.  If you're always skipping around from this to that, on the basis of rough analogies and not more substantive conceptual relationships (as becomes clear when you study anything systematically), you never get the extremely significant advantages that accrue from studying things that are conceptually closely related at about the same time.

But if, like Wheeler, you don't think that declarative knowledge of academic subjects is especially important, and you can't grasp what importance abstract conceptual dependencies might have for enhancing understanding, communication, and curiosity, then, no.  You might not think there's any point in focusing on particular subjects.

So, no, "moving with the times" does not require that our children arrive at adulthood as ignoramuses.

It's about time we all woke up and realised that the world around us is changing, and schools need to change too. After all, the school still remains the first and most important place to train and prepare young people for work. If we don't get it right in school, we are storing up huge problems for the future. Education is not life and death. It's much more important than that.

Didn't Wheeler ever learn plus ça change, plus c'est la même chose in his French class?  The pace of change has been increasing, no doubt.  But the world has been changing fairly quickly for the last 150 years, and one of the classic arguments for liberal arts education--something that I can't imagine Wheeler actually endorsing, given what he's written--is precisely that the liberal arts enable us to deal with such changes by giving us a solid foundation, a mature (not to say perfect or unchanging) comprehension of the natural and social world.  They also give us the ability to think and communicate more deeply and accurately about new and changing things.

A student well educated in the liberal arts, who has "memorized"--and really understood--a boatload of "mere facts," will be far better prepared to meet the changes of the coming century than someone who is well trained in the use of digital technology to talk at random about things of which he has been left tragically ignorant.


A short manifesto for schools: time to focus on knowledge

Ever since I was an elementary school student myself, I have been chronically disappointed with the American education establishment. Don't get me wrong--I get along fine with most of the educators I encounter, who are good people and full of all sorts of good ideas and real competence. But I also believe a sickness pervades the entire American system of education, the sickness of anti-intellectualism.

I read plenty of blogs, tweets, and articles on education and ed tech, as well as the occasional book, from all sorts of cutting-edge teachers, administrators, and education theorists. They are all abuzz about the latest online educational resources, which I love and use (and develop) too. But whenever the subject of learning facts or substantially increasing levels of subject knowledge, and--especially--tests of such things comes up, I seem to hear nothing but boos and hisses. This might be surprising, because, after all, what are those educational resources for if not to increase levels of subject knowledge? It seems an exception is made for technology.

But to focus attention on ignorance among students, poor test results, etc., apparently means caring too much about "direct instruction" and a lot of other bêtes noires of the education establishment. If I talk much about raising standards or returning the focus to knowledge, I am heard as threatening to reject "student-centered" education, the project method, "useful" (read: vocational) knowledge, and authentic assessments, and to replace such allegedly good things with "drill and kill," learning "trivia," boring textbooks, and in general a return to soul-killing, dusty old methods discarded long ago and rightly so. What I rarely encounter from the education establishment--though critical outsiders like myself talk endlessly about it--is evidence of an earnest concern about, quite simply, how much students learn.

Enter this Atlantic article about a recent study of the factors correlated with test success, by Harvard researchers Will Dobbie and Roland Fryer--not education professors, but economists who know a thing or two about research methods. Dobbie and Fryer discovered, unsurprisingly, that higher student scores are correlated with a "relentless focus on academic goals." Such a focus entails "frequent teacher feedback, the use of data to guide instruction, high-dosage tutoring, increased instructional time, and high expectations." The factors not correlated with school effectiveness, by contrast, included "class size, per pupil expenditure, the fraction of teachers with no certification, and the fraction of teachers with an advanced degree." My hat is off to these researchers for reminding us of, and providing new data to support, what those of us outside the education establishment already knew: a culture of commitment to academic success is the main thing that matters to academic success. But we may confidently predict that the education establishment will dismiss this study, as it has done so many others that came to similar conclusions.

Reading about the study inspired a few thoughts. First, it is indeed the culture of a school that determines whether its students are successful. This claim makes sense. If the goal of schools were to inculcate knowledge in students--success at which might be measured by the study's "credible estimates of each school's effectiveness"--then it is simple rationality, and requires no special studies, to conclude that the school's staff should have a "relentless focus on academic goals."

If you want to make schools better, it's important that they be absolutely focused on academic goals, which is to say, on knowledge. Excess resources won't buy such a focus (and the study indicates that this might be inversely correlated with success). Class size doesn't matter. Focus on "self-esteem" doesn't help. What is needed more than anything is a pervasive, institutional commitment to knowledge as a goal.

Sadly, increasing student knowledge is not the overriding goal of American schools.

I wish I had time to write a book for a popular audience of parents, teachers, and older students, defending knowledge (not "critical thinking," which requires knowledge; not self-esteem; not social training; not high-tech training) as the most important goal of education. I'd also like to make the case that the education establishment has been really, truly, and in fact anti-intellectual and "anti-knowledge." I'm merely asserting this perhaps startling claim in this post--this isn't the place to make the case. But those who are familiar with intellectualist critiques of American public schools will understand.

If you really want to know why so many kids in the United States do poorly on exams and are basically know-nothings who turn into know-nothing adults, I'll tell you. It's because many parents, many teachers, the broader education establishment, and especially the popular culture of "cool" which guides children's development after a certain age are simply anti-intellectual. Europeans and especially Asians do not have such an albatross around their necks. They actually admire people who are knowledgeable, instead of calling them nerds, and (later in life) dismissing their knowledge as "useless" and dismissing them because of their less-finely-tuned social skills, and (after they gain some real-world status) envying them and labeling them "elitist."

It's been widely believed and often admitted for a long time that the United States has a long, nasty streak of anti-intellectualism. Democrats love to bash Republicans on this point. (And recently I bashed geek anti-intellectualism as well.) But anti-intellectualism in schools? This is apt to make many Democrats, and the more establishment sort of Republican, pretty uncomfortable. Still, it's true. This broad societal illness has kept our schools underperforming not just recently, but for generations.

The common complaints about standardized testing are misplaced. If schools were filling their charges' minds adequately with academic knowledge and skills, they would welcome standardized tests and would not have to spend any extra time on preparing students for them. The focus on the project method is misplaced, too. Projects and experiments are important as far as they go, but it is simply inefficient--if the goal is to develop student knowledge--to make projects the centerpiece of pedagogy.

Finally, the generations-long flight from books, especially well-written books, is a travesty. Books are where the knowledge is. If you want students to know a lot, have them read a lot of non-fiction books. They will inevitably become very knowledgeable. If you make sure that the books are well-written--not boring library fodder like so many geography books, for example--and hand-picked by students, they will more likely enjoy their reading. Having read probably a few thousand children's books to my son in the last five years, I can assure you that there is no shortage of excellent children's books on most subjects. We're in a sort of golden age of children's books--never before has there been such a tremendous variety of offerings so readily available.

Principals and teachers need to lead the way. They need to convey to their students and their students' parents that their schools are all about getting knowledge. "When you leave our school, you will know a lot," educators should be able to tell their students--honestly. "But we will expect you to pay attention and read a lot of books. We will expect a lot of you. Learning at this school won't be easy, but the effort will definitely be worth it. It will open up your eyes to a world that is far larger and deeper than you knew. The knowledge you will gain will make you, in a way, a bigger person, more connected to everything all around you, and better prepared to make the world a better place."

Finally, if schools don't throw off this anti-intellectualism, which has become positively stodgy and stale, and which is so contrary to their mission, they can expect to encounter more and more competition in the form of charter schools, school vouchers, homeschooling, and now virtual schools. If parents who really care about learning run toward new educational systems that have a better chance of actually educating their children, who can blame them?


On changing student beliefs

I came across a very irritating post in the Coffee Theory blog by Greg Linster, and felt inspired to respond.  This began as a comment on his blog, but after a while it became too long for that, so I figured I'd just put it on my own blog.

Greg, you actually seem like a nice guy to me, so try not to take this personally.  I am surprised that you apparently did not notice that you were assuming that it is even permissible to indoctrinate your students; I'm surprised that you thought that the only serious question was whether you have a responsibility to do so.

I think it's wrong--possibly morally wrong, but ill-advised, certainly--for college professors, and most other instructors, to presume to correct any of their students' earnestly-held and not merely mistaken beliefs.  I have felt this way since I was a high school student.  The force of the neutral presentation of information and argumentation is usually enough to work its magic on an adequately receptive mind.  What about the inadequately receptive minds?  Indeed there might be some religious types who are so dogmatic that they might not be benefited by a college education (although I really doubt that); there are some who become so disgusted that they quit college and become thoroughly anti-intellectual.  But such people won't be helped by the likes of Boghossian.  They'll be helped by methods that are more neutral and respectful toward the views of the student.

The role of the teacher is to guide students' minds, not to force them--to point them in the right direction, or better, to lay out a pristinely clear map of the land--and leave it up to them to come to sincerely held beliefs that are, one hopes, truer and more nuanced than they were before.

As a college student, I always found the tendency of some college professors to teach their own pet views to be extremely annoying.  What I believe is my own business.  I have no desire to be part of your project to transform the world in your image; as a student, I regard myself as a free agent and merely want the tools to shape my own beliefs.  Indeed, a lot of college students (and for that matter, high school students) are especially irritated by the tendency of some professors (and teachers) to "indoctrinate" them.  Some students make decisions about where to go to college based on how much they can expect to be indoctrinated, or not.  Of course, the fact that a student finds a practice extremely irritating does not mean that it is a poor idea in itself; but the extremeness of the irritation is certainly suggestive.

My basic argument is that the aim of a liberal education is to train the mind to think independently and critically.  But berating students and indoctrinating them has the opposite effect, and for that reason is ruled out.  It teaches them that in the so-called "life of the mind," we may lord it over others once we gain the authority to do so: we do not persuade; we (at best) cajole, and if necessary pressure and even force, the minds in our care.  A mind in the habit of viewing intellectual disputes between professor and student as best settled not by rational disputation but by authority handing down the law--as you and Boghossian (i.e., Peter Boghossian, not to be confused with the more distinguished philosopher Paul Boghossian) maintain--is a mind that is not liberated to think critically.  (Boghossian's article was, frankly, awful.  I'm embarrassed that it was written by a philosopher.)

Student beliefs in creationism merely require an application of the general rule here.  By all means, state the facts as scientists have uncovered them.  Sure, present the arguments for and against creationism; have the confidence that the latter will appear more plausible on their own merits; but do not berate or belittle students for disagreeing with you, do not mark down essays because they express objectively false theories (there might be exceptions to this rule, of course), and certainly do not require them to repudiate creationism.  The best way to persuade them to give up cherished but ill-supported and irrational beliefs is to train them better in the habits of rational thought. And if you can't do that, and have to resort to "repudiating" and "challenging" student opinions, you have no business teaching at an institution of liberal education.  I certainly wouldn't want my sons taking classes from you, even if I share a scientific, nonbelieving attitude toward creationism (as I do).  I would much rather have my sons taking classes from someone who will present the liveliest versions of all arguments and then challenge them to come to the most carefully-argued, nuanced view of the situation.

Boghossian's problem isn't precisely "intellectual arrogance."  He is not wrong merely because he arrogantly assumes that he is correct in repudiating Creationism.  If this is the main argument against him, people are arguing against him for the entirely wrong reason.  He is wrong, instead, for attempting to bully his student into changing his beliefs.  This--not the professor's confidence, but his bullying--does harm to the student's mind and even character.

Greg, you cite a couple of examples of beliefs that, you say, one should certainly try to change in students.  First is the "belief" that 2+2=5; but this doesn't even make any sense as an example for you to cite.  A person who says such a thing has a weird personality quirk that makes him say things the falseness of which is obvious, even to himself.  He might be a poet, or a jokester.  Why assume that a person who merely says that 2+2=5 actually holds an obviously false belief?  As Wittgenstein would say, we can't really understand what it would mean for an adult person, with a grasp of math, to deny that 2+2=4.  (Of course, a person who had never learned enough math to grasp that 2+2=4 wouldn't be in college.)  As to the believer in weird mythologies, it really depends on the case.  Some such people might need psychologists, not college professors.  A philosopher should not attempt such therapy.  Others might have accepted a weird pagan religion (such as "Odinism," which I've come across online).  Are you seriously saying that it's the job of college professors to disabuse people of their religious beliefs, if they are especially bizarre?  Aside from the fact that this is well-nigh impossible, you'd be undermining the student's respect for you as a teacher, and you'd be inculcating the view that it's OK for him to push his weird views onto others if he ever gets into the position of teaching others.

So let's set aside those two examples of beliefs-that-students-should-be-disabused-of.  Clearly, more common, real-world examples are more apt to resemble Creationism and thus be open to the objection I've raised above.

By the way, I think one of the reasons that so many academics today act like such pompous asses, confidently expounding from "on high" about things well outside their areas and de facto requiring their students to endorse their own worldview, is that their own professors got them into the habit.  They, too, were taught that the way to deal with students is not to liberate their minds, but to indoctrinate them.  Boghossian merely reports and defends the habits of some of his peers--and to that extent, he has that honesty that I quite frankly admire when I find it among philosophers.  If Boghossian is taking his own advice, he no doubt behaves like an ass toward his students, and his arguments on this point are crap.  But he has the admirable honesty to own up to an assumption lying behind his and so many of his colleagues' practices of indoctrination.

In short, I find you and Boghossian embracing indoctrination as a pedagogical method.  Now, it might sound strange to call the inculcation of belief in the conclusions of science "indoctrination," but it is not the quality of the belief that makes for indoctrination; it is the method whereby it is taught.  If you're saying that you want to repudiate student belief, to label student beliefs as "mythology," you're not in favor of persuading recalcitrant students with argument; you want to shame them.

Of course, it should not be lost on anyone that there are several great ironies here.  The defenders of indoctrination are, of all things, philosophers. They are attempting to teach the values of science, which include, first and foremost, skepticism and the repudiation of dogmatism.  In so doing, they are seeking to replace dogmatically-held religious views--but with dogmatically-held scientific views, taught not by gentle persuasion and neutral presentation of fact and argument, but by "repudiation" and "labelling" of student views.

Sorry to be so pointed, but I really feel strongly about this, and have, as I said, since high school.


The "times-are-changing, specific-knowledge-is-unnecessary" canard

I don't know how many times I have read, "The world, and research findings, are changing so fast that it is pointless to insist on learning particular bits of knowledge.  Much of it will be outdated soon.  So the only really important thing to teach children is how to think."

This is dangerous nonsense.  It is partly responsible for educators' ambivalent attitude toward substantial knowledge, which is in turn responsible for so much ignorance in American society--I chalk it up to schools dominated by the various 20th-century "progressive" educational methods.  After all, just think of what it means to say that it's pointless to insist on learning any particular subject.  It's pointless to learn math?  Reading?  Writing?  Science?  History?  There are truly excellent and I daresay objective reasons that these, and others, are important subjects to study.  I recently gave some reasons to study geography, a relatively "fast-changing" field.  In fact, exactly to the contrary of the above fashionable canard, I think it's important to learn a lot about the whole world--far more than most elementary students learn.

Imagine someone saying to a child 75 years ago, quite accurately, that in her lifetime the world would change radically, that much new knowledge would be discovered, and that things relevant 75 years hence were then unknown or unemphasized.  Would it follow, 75 years ago, that we had no idea what children needed to know or learn?  Of course not.  There is no need to throw the curriculum out the window simply because the world is changing.  Consider the elementary school subjects (or what should be such subjects) and ask yourself how many of them have changed much in the last 75 years, despite the fact that the most revolutionary technological changes in all of human history happened in this period: reading; penmanship; arithmetic; world history since ancient times; national histories; geography; science; art; music; and various others to taste.

What new subjects have become important for elementary students to know?  Typing and computers, and that's pretty much it.  The only things that have become even possibly outdated are cursive handwriting and Latin (though Latin was already mostly out 75 years ago).  National history has had 75 more years' worth of facts, but a similar observation could have been made 75 and 150 years ago, and that wasn't a reason not to learn history.  The same is true of geography: we don't skip geography just because borders will change in our lifetimes.

What gives this meme (actually, I hate the concept of "memes," but it fits in this case) its teeth is that science and technology change and develop so fast.  But the vast bulk of these changes are relevant to students at higher levels, in high school and college.  And even then, we should not tell students, "You don't have to learn brain science because the field will look totally different in 20 years."  We should say, "You really ought to learn brain science because it's such a hot field with fascinating discoveries being made all the time."

This is not to mention a point well discussed by E.D. Hirsch Jr. in Cultural Literacy that specific knowledge greatly enhances one's ability to reason; if one does not have specific knowledge, then one can neither mount a defense of one's claims nor evaluate others' arguments for their factual content.  All that formal logic by itself can do, of course, is evaluate the formal logic of an argument, and most fallacies that people commit are informal, not formal fallacies--and informal fallacies often require some specific knowledge to identify.  More to the point, the soundness of arguments requires the ability to evaluate the truth value of premises, which is a matter of factual content, not logic.  The more knowledge one has about a field, the better one can reason and judge the reasoning about it.  If you want to teach children "how to think," it is crucial that you acquaint them with the subject of thought and not just its form.


Unobvious concepts that are important, but rarely taught

To understand history, geography, fiction, and a whole lot else besides, it is fairly important that children be taught, in considerable depth, certain practical yet universal concepts about history that, I think, are rarely explained in much depth:

Gathering (as in "hunting and gathering")
Animal skins (and tanning)
Nomadic life
Shepherding, spinning, weaving, dyeing
Plowing, sowing, reaping/harvesting
Mining, smelting, smithing
Religion: how polytheist peoples thought of gods, why they turned to gods, and why religion became important to them
Tribe
Lord or chieftain
City-states, small kingdoms and fiefdoms
What "empire" means, and what an empire was like (beyond lines on a map)
Trading and bartering
The marketplace (e.g., agora)
Ignorance: what life was like in the absence of geographical, historical, and scientific knowledge

I could go on. I could make similar lists for other subjects: science, geography, etc. I find a pattern here: these are abstract concepts. Explaining them to children might seem daunting. But of course many of them are no more difficult to explain than telling a story; it just requires a little creativity. I think the reason that schools avoid teaching such things in more depth to students is that--no offense, I hope--most educators are very concrete-bound themselves. They haven't been well trained in thinking conceptually or abstractly, much less teaching their charges to think this way. So textbooks, and teachers using them, are often content merely to offer definitions, which are often themselves ill-understood, when what is needed is a long discussion.

H. and I have read a lot of history in the last year and I get the sense that only now is he starting to get a proper handle on such very basic, crucial concepts as trade and polytheism. If you don't have a fairly firm grasp on what the above concepts really entail, by which I mean more than a poorly-understood definition, you simply won't understand history, period. That is why it will be boring to you.

One of the main reasons we get "deeply" into certain subjects (lately, South America, as we begin our systematic study of geography) is that with each new book on the subject, more of such "basics" become clear. With the first few books, I spend quite a bit of time explaining what various words mean, and even taking breaks to show illustrative videos and such. After the third or fourth book on a subject, we suddenly find ourselves not only understanding much better but enjoying ourselves more and able to learn more quickly. Another example. If you can believe it, I'm reading Howard Pyle's The Merry Adventures of Robin Hood to 5-year-old H.  Here is a sample:

Then he chose the stoutest bow among them all, next to Robin's own, and a straight gray goose shaft, well-feathered and smooth, and stepping to the mark--while all the band, sitting or lying upon the greensward, watched to see him shoot--he drew the arrow to his cheek and loosed the shaft right deftly, sending it so straight down the path that it clove the mark in the very center. "Aha!" cried he, "mend thou that if thou canst"; while even the yeomen clapped their hands at so fair a shot.

"That is a keen shot indeed," quoth Robin. "Mend it I cannot, but mar it I may, perhaps."

As I re-read this, I think I must be crazy to try to read this to a 5-year-old.  But he has repeatedly declared he likes it, a lot, and has repeatedly requested it and has never said "no" to it.  (We started reading it when I went looking for a simple version of Robin Hood to download, after he had read two even simpler versions to himself; but I found only the Howard Pyle version.  So I said, "Listen to this," and read a few paragraphs.  Then he urged me to go on.)  When we started reading it, I had to explain everything.  Now, a lot of this is a kind of cant and is not hard to interpret, when you understand the oft-repeated archaisms.  Stripped of archaisms, it wouldn't be too hard.  So while in the first couple chapters I had to explain things like "thou canst" and "yeoman," which are used all the time, by the time we got to chapter 5, I was left explaining mostly the things that are conceptually difficult, like "mend it I cannot, but mar it I may."  I simply offer a quick gloss in such a case; I might say, "OK, 'mend' means 'fix,' and 'mar' means 'hurt.'  So he means: 'I cannot fix it--I cannot get a better shot with my arrow--but I can get close enough to hurt it--to hurt the arrow.'"  Then often I'll read the original sentence again.  (By the way, it helps tremendously that we are reading this on the iPad, so we can look up words instantly, with a tap.  I'm not sure we'd have the patience for this without the iPad.)  H. might be a weird kid, but he rarely gets tired of these explanations.

By the way, if you were a zealot for "constructivist" theory, I imagine you might say, "Hey, you're explaining everything to your child.  That's direct instruction.  He won't learn as much that way."  Our experience shows this to be completely wrong.  He reads over an hour a day to himself--right now he's on Harry Potter #2--well, I'll explain more further down.

The point is that even supposedly "advanced" subjects, and texts like Robin Hood, are accessible after just a little explanation of the cant necessary to make sense of them.

This is one of the reasons I'm rather excited about putting the software behind Reading Bear to use creating explanations of the concepts necessary to understand, well, everything.  If I am right, a child who goes through my history concept primer from beginning to end will be able to pick up history books and really start to appreciate and make sense of them.  And again, I hope I'll have time to do this for many different subjects, not just history--but then, it's not even quite a sure thing that I'll start the project at all.

I'm also interested in what will happen when I start explaining abstract "philosophical" concepts to children, which I think will not be very difficult to do.  I think we may be very surprised by the increased sophistication of the thinking of children, when they are trained with the right materials.

One last topic in this rambling post.  I'm sure some people (who wouldn't make it this far in the post, probably) would react to all this by saying, "What is the point? Suppose you can teach a child to be able to understand history at age five.  Why would you want to?  Why not wait until they are older and can understand the concepts more naturally, with less difficulty?  Don't you think they'll benefit from the training more then?"

I think about this sort of objection a fair bit, but I really don't find it compelling, because it seems to be based on a confusion, or a false assumption.  The assumption is that the concepts in question are especially difficult, and that for this reason they are much easier to teach later on.  I deny this.  What's hard about explaining what weaving, for example, is?  Can't a three-year-old, when confronted with the materials and process of weaving, more or less understand what's going on?  I've seen this over and over teaching H., and I am seeing it start again with E.--where a concept traditionally taught much later is introduced entertainingly and effectively early on, in an "age-appropriate" way, and as a result the child is able to understand things that are thought to be appropriate only for older children.

It's not just because I want to get my boys learning sooner, so that they can learn more in the long run, although that's definitely part of it.  It's also because I remember, I really do, being frustrated with my teachers, thinking that they should just explain stuff more.  I thought I could have learned a lot more, and I still think so, if teachers and texts had simply taken more care to explain all the various little things that go unexplained.  The pedagogical notion, which back then I found frustrating, was that children learn best when they are left to figure things out for themselves, heroically; in this age in which constructivist pedagogy reigns, that notion is even more entrenched--which sounds rather like saying that children shouldn't be taught.  But a lot of times, kids never do figure stuff out, not ever.  They just end up being ignorant, and increasingly frustrated because they never understood something that someone could have explained to them when they were five.  This is why most people hate history: they simply don't get it.

A "constructivist" argument, as I understand it, would be that the motivation for learning and the quality of understanding are much greater if a child is left free to figure things out for himself.  Now, I do think a child should be asked open-ended questions, given experiments, and generally trained to think a problem through.  But the amount of knowledge there is to learn is massive.  (Knowing this, constructivists end up dismissing substantial learning as "mere memorization," a tendency that causes me no end of frustration.)  If you want a sophisticated understanding of the world, it is a fool's errand--literally--to try to construct it on your own, as if you could reproduce the entire history of thought by yourself, as a child.

Moreover, the more that concepts are simply and clearly explained to a student, the greater his toolset to let him figure things out on his own.  As I said, H. is now reading on his own over an hour a day, and he's reading chapter books well above his grade level.  I regularly quiz him on these.  Not only his memory but his comprehension of what he reads always surprises me.  When I ask him "deeper" questions, regarding things like motivations and general explanations, he is also well above grade level.  The answers seem to come quite naturally to him.  He can also articulate his answers very well, using advanced vocabulary and complex grammatical structures.  I credit my tendency to explain everything to him, so that he has a very explicit understanding of nearly everything he has learned, and especially concepts.  If the constructivist objection to my schemes (at this point I am only imagining such an objection) carried any water, then one would expect to see H. unmotivated and frustrated in his lack of ability to understand what he reads with any sophistication.  But I have no complaints about his motivation level, and his ability to understand is great.  I think that when he's ten or twelve, he'll be reading and understanding philosophy and thinking more deeply than a lot of people.

Anyway, my great curiosity, which I hope to explore with a Reading Bear "concept encyclopedia for children," is whether it is possible to impart "precocious" levels of understanding to very young children about all sorts of things, simply by systematically exploring concepts.  I think so, and I think it's worth a try to see if such a resource is well-used and -liked by people.


Reply to Nathan Jurgenson on anti-intellectualism

Thanks to Nathan Jurgenson for a thoughtful critique of "Is there a new geek anti-intellectualism?"  I wish I had more time to respond, especially since it is so earnestly intellectual itself.  The following will have to do.

Jurgenson provides a definition (which he says is based on Hofstadter's discussion in Anti-Intellectualism in American Life), which combines anti-expertism, anti-reason, and "unreflective instrumentalism" or the view that ideas must be put to work if they are to have any value.  I think that "isms" are, since they are theoretical inventions, either trivially easy to define (just quote the inventor), or else very difficult.  I don't feel qualified to evaluate this definition, but neither do I want to accept it uncritically.  Instead I'll just skip ahead to the part where Jurgenson starts using it to formulate some interesting questions and claims on behalf of geeks:

...are [geeks] evangelical/dogmatic in their knowledge justification (beyond populism)? Do they appreciate knowledge and thinking for its own sake, or does it always have to be put to work towards instrumental purposes? Neither of these points are substantiated by Sanger (yet).

I would argue that geeks are not dogmatic, but instead typically rely on reason (e.g., they employ reason in their defense of populism and the “wisdom of the crowds”; even if I and many others are unconvinced). Further, geeks indeed do seem to engage in knowledge projects for fun. Part of the success of Wikipedia is that it allows for purposelessly clicking through random entries for no other reason than because learning is fun. However, my task in this essay is to better conceptualize Sanger’s points and not really make the case for a geek intellectualism. I’m only half-convinced myself of these last two points. I’ll leave it to Sanger to describe how geeks are anti-intellectual on these other two dimensions of anti-intellectualism. Until then, the story of geek anti-intellectualism remains mixed.

I haven't encountered geek anti-intellectuals of a fideist stripe--those who regard faith as a cardinal virtue and who criticize over-reliance on reason and science.  Computer geeks are mostly scientistic rationalists, or at least, they try to be.  (Sometimes they seem not to be simply because so many of them aren't actually trained in rational argumentation or the scientific method.  They learned how to argue on Internet forums.)  If there is something weird about calling geeks intellectuals, it would surely be this.  Indeed, geek rationalism actually explains why there was such an interesting response to my essay.  It didn't surprise me that geeks replied in various highly rational ways to my essay.  That's not the sort of response that a lot of religious anti-intellectuals, of the sort Hofstadter studied, would have, if I had made them my target; they probably wouldn't have responded at all.

As to the second point (on "unreflective instrumentalism"), however, I think Jurgenson lets geekdom off far too easily.  Of course geeks "engage in knowledge projects for fun" (so do many religious fundamentalists).  But geeks frequently talk about how the humanities are useless (this ties in to my point (2)) and for that reason, a waste of time.  One of the recent geek arguments for the pointlessness of college is precisely that college features so much abstract theorizing which doesn't have any practical application.  A lot of geeks love books, to be sure, but some of them reject books not merely because they prefer ebook editions over paper books, but because they have become true believers that social media is what Clay Shirky described as an "upstart literature" which promises to become the "new high culture," just give it some time.  Besides, we often hear, books are becoming outmoded because they are not collaborative, and they're boring and irrelevant because they were not made recently.  And if you try to argue that college might have a non-vocational purpose, their eyes seem to glaze over.  They just don't get that.

Here are a couple of points, elaborated, also probably related to "unreflective instrumentalism," or as I would put it, to the devaluing of theoretical knowledge.  First, if you diss the classics, if you reject the intellectual basis for Western civilization wholesale, as some silly-clever geeks (to say nothing of culture-warrior college professors) do, then by golly, you're anti-intellectual.  This isn't because you are an instrumentalist; it is because you reject the conceptual-historical basis that allows you to think what you're thinking, including even the math and computer science that forms the basis of the computers you're working on.  If you ignore the giant shoulders you're standing on, and pretend to be thinking through issues a priori or innocent of all scholarship, then you'll certainly fall prey to all sorts of significant errors and confusions.  A person who pretends to be able to speak intelligently on the political issues of capitalist democracy but who has not read theorists like Locke, Rousseau, or Marx is almost certain to make various sophomoric mistakes (regardless of his political leanings).  And that's just one example from one field.  If you don't care about making such mistakes rooted in historical ignorance, and the whole idea of understanding the content of the ideas you're so passionate about leaves you cold, then you are to that extent not intellectual, and perhaps not really as much of a rationalist as you'd like to think of yourself.  If you go farther and say that persons who inform themselves of the intellectual underpinnings of Western civilization are wasting their time, then plainly, your contempt for the knowledge that people get from such study is so great that you do deserve to be called anti-intellectual.

Second, there's my point (4).  If you reject the necessity of learning things for yourself--if you actually endorse ignorance merely because certain things can be looked up so easily now--then you're anti-intellectual in the rather basic sense that you're anti-knowing stuff.  The three-part definition Jurgenson gives is ultimately grounded, I would argue, in this basic concept: an anti-intellectual undervalues knowledge for its own sake.  That's what explains the stance of anti-expertism, anti-reason, and unreflective instrumentalism.   And if you had any doubt about whether there were a lot of geeks who undervalue knowledge for its own sake, just look at the comments on my essay.  There, on Slashdot, and in other places you'll find plenty of people dismissing college not just because it's a poor economic decision but because the sort of theoretical knowledge you get in college is allegedly a waste of time.  The very claim is anti-intellectual.

It would be different if I saw many geeks hastening to add, after dissing lit crit and philosophy and political theory, that they really mainly have it in for an over-politicized academe, while they still do have respect for the likes of Aristotle and Locke, Michelangelo and Picasso, Thucydides and Gibbon, and for those intellectuals who, along with most scientists, continue to work in the old tradition of advancing knowledge instead of deconstructing it.  But I don't come across geeks saying things like this too often.

The people I'm describing use their minds (often professionally, and very competently), and therefore their minds have a life, so to speak.  But many do not go in for, in Jurgenson's phrase, "the life of the mind."  That involves some level of commitment to understanding the world, including the humanistic elements of the world, at an abstract level, bringing the tools of reason and science to bear.  Just because you write computer software and are fascinated by a few geeky topics, it doesn't follow that you have this commitment.

But then, a lot of academics don't, either.  As I said, it's no contradiction to speak of academic anti-intellectuals.  Their influence is no doubt one of the reasons certain geeks are anti-intellectual.


Geek anti-intellectualism: replies

My essay on "geek anti-intellectualism" hit a nerve.  I get the sense that a lot of geeks are reacting--quite unusually for them--defensively, because I've presented them with a sobering truth about themselves that they hadn't realized.  Consequently, they've been unusually thoughtful and polite--there's something about this discussion that I can't remember ever seeing before.  Anyway, the essay must have seemed relevant, because it was posted live on Slashdot within minutes of my submitting it--something I'd never seen before--and proceeded to rack up 916 comments, as of this writing, which is quite a few for Slashdot.  It was also well discussed on Metafilter, on Twitter, and here on this blog (where I've had over 160 comments so far).  What struck me about these discussions was the unusually earnest attempt, in most cases, to come to grips with some of the issues I raised.  Of course, there has been some of the usual slagging from the haters, and a fair number of not-very-bright responses, but an unusually high proportion of signal, some of it quite insightful.  It reminds me of some old college seminars, maybe.

First, let me concede that I left a lot unsaid.  Of course, what I left unsaid ended up being said, sometimes ad nauseam, in the comments, and I found a few points quite enlightening.  On the other hand, I find a lot of geeks thinking that they understand aspects of higher education that they really don't.  I'm not sure I can set them right, but I'll try to make a few points anyway.

I am going to do what I've always done, since the 1990s, when something I've written elicited a much greater response than I could possibly deal with: make a numbered laundry list of replies.

1. How dare you accuse all geeks of being anti-intellectual? I didn't; RTFA.  I know there are lots of very intellectual geeks and that geekdom is diverse in various ways.  I'm talking about social trends, which are always a little messy; but that doesn't mean there's nothing to discuss.

2. There's a difference between being anti-intellectual and being anti-academic. Maybe the most common response was that geeks don't dislike knowledge or the intellect, they dislike intellectuals with their academic institutions and practices.  First, let me state my geek credentials.  I've spent a lot of time online since the mid-90s.  I started many websites, actually learned some programming, and managed a few software projects.  You'll notice that I'm not in academe now.  I have repeatedly (four times) left academe and later returned.

I agree that academia has become way too politicized.  Too many academics think it's OK to preach their ideology to their students, and their tendency to organize conferences and journals around tendentious ideological themes is not just annoying but downright unscholarly.  Moreover, speaking as a skeptically-inclined philosopher, I think that some academics have an annoying tendency to promote their views with unwarranted confidence, and to pretend to speak authoritatively on subjects outside their training.  In many fields, the economics of academic advancement and publishing has created a tendency to focus on relatively unimportant minutiae, to the detriment of broader insight and scholarly wisdom.  And I completely agree that college work has been watered down (but more on that in the next point).

Having admitted all that, I'm still not backing down; I knew all that when I was writing my essay.  Please review the five points I made.  None of them is at odds with this critique of academe.  Just because some experts can be annoyingly overconfident, it doesn't follow that they do not deserve various roles in society articulating what is known about their areas of expertise.  If you deny that, then you are devaluing the knowledge they actually have; that's an anti-intellectual attitude.  If you want to know what the state of the research is in a field, you ask a researcher.  So even if your dislike of academics is justified in part, it does not follow that their word on their expertise is worth the same as everyone else's.  Besides, most of my points had little to do with academics per se: I also had points about books in general, classics in particular, and memorization and learning.

3. Just because you think college is now a bad deal, economically speaking, it doesn't follow that you're anti-intellectual. Well, duh.  I didn't really take up the question whether the present cost of college justifies not going, and I'm not going to get into that, because I don't really think it's relevant.  Let's suppose you're right, and that for some people, the long-term cost of college loans, combined with the fact that they won't get much benefit from their college education, means that they're justified not going.  My complaint is not about people who don't go to college, my complaint is about people who say that college is "a waste of time" if you do go and are committed.  Maybe, for people who don't study much and who don't let themselves benefit, it is a waste of time.  But that's their fault, not the fault of college.  I taught at Ohio State, which is not nearly as demanding as the college I attended myself (Reed), and I saw many students drifting through, not doing the reading, not coming to class, rarely practicing their writing skills.  I also saw people who always did the reading, always came to class, participated regularly, and were obviously benefiting from their encounter with great writing and great ideas.  Moreover, how college affects you isn't "the luck of the draw."  It depends on your commitment and curiosity.  This is why some partiers drop out and come back to college after five or ten years, and then they do great and finally enjoy themselves in class.

Finally, may I say again (I said it first in the 1990s, and also a few days ago) that it is possible to get a degree by examination from programs like Excelsior College?  This way, you bypass the expense of college and pick all your instructors for a fraction of the cost.  This means that you can get intellectually trained, as well as earn a real college degree, without going into debt.  This would be my advice to the clever ex-homeschoolers who claim that it is college that is, somehow, anti-intellectual.  Put up or shut up, home scholars: if you really are committed to the life of the mind, as you say, and you've already got experience directing your own studies, why not get a degree through independent study with academic tutors, and then take tests (and portfolio evaluations) to prove your knowledge and get the credential?

4. The people you're describing are not true geeks; they are the digerati, or "hipsters," or leftist academics who were already anti-intellectual and then started doing geek stuff. Uh, no.  I mean, you're probably right that some anti-intellectual thinkers who weren't geeks have started talking about the Internet a lot, and they have a big web presence, so now they might appear to be part of geekdom.  But they aren't really, by any reasonably stringent definition of "geek."  Besides, if you look at my article, you'll see that that's what I said (such people fall into the category of "digerati").  My point is that claims (1)-(5) started circulating online among geeks, and they are, each of them, commonly spouted by lots of geeks.  Take them in turn.  (1) Anti-expert animus is a well-known feature of the geek thought-world.  Wikipedia became somewhat anti-expert because of the dominance of geeks in the project.  (2) Of course, the geeks at Project Gutenberg love books, but all too often I see comments online that books went out in the 20th century, and good riddance.  One of the leading idols of the geeks, Clay Shirky, essentially declared books to be a dying medium, to be replaced with something more collaborative.  (3) It is obvious just from the comments here on this blog, and elsewhere, that some geeks find the classics (that means philosophy, history, novels, epics, poetry, drama, religious texts, etc.) to be a waste of time.  They don't have the first clue what they're talking about.  (4) The first time I saw much discussion of the idea that Internet resources mean we no longer have to memorize (and hence learn) as many facts was among Wikipedians in 2002 or so (when the project was totally dominated by geeks, even more than it is now).  (5) The whole college-is-a-waste-of-time thing is a not uncommon geek conceit.  It's not surprising in the least that a founder of Paypal.com would spout it; it's easy for computer geeks to say, because they can get well-paying jobs without degrees.  In many other fields, that's (still) not true.

5. But I'm an intellectual, and I know that learning facts is indeed passe.  The things to be learned are "relationships" or "analysis" or "critical thinking." Oh?  Then I claim that you are espousing an anti-intellectual sentiment, whether you know it or not.  I'm not saying you're opposed to all things intellectual, I'm saying that that opinion is, to be perfectly accurate, a key feature of anti-intellectualism.  Look, this is very simple.  If you have learned something, then you can, at the very least, recall it.  In other words, you must have memorized it, somehow.  This doesn't necessarily mean you must have used flashcards to jam it into your recalcitrant brain by force, so to speak.  Memorization doesn't have to be by rote.  But even if you do a project, if you haven't come to remember some fact as a result, then you don't know it.  Thus I say that to be opposed to the memorization of facts is to be opposed to the learning, and knowing, of those facts.  To advocate against all memorization is to advocate for ignorance.  For more on this, please see my EDUCAUSE Review essay "Individual Knowledge in the Internet Age."

I know that this is an old and common sentiment among education theorists--which is a shame.  Indeed, the educationists who say that it is not necessary to memorize the multiplication table are implying that it is OK for kids to be ignorant of those math facts.  (No, it's not OK.  They should know them.)  Anyway, it might have started with misguided educators, but it is becoming far too common among geeks too.

6. The Internet is changing, that's all.  Most people are anti-intellectual, and they're getting online. No doubt about it, the Internet has changed greatly in the last five to ten years.  And it might well be the case that the average netizen is more anti-intellectual than in the past, in the very weak sense that more stupid people and uneducated people are getting online.  This might have been clever to say, if my point had been, "Folks online seem to be getting anti-intellectual."  But that isn't at all what I said or meant.  If you will review the evidence I marshalled, you'll see that the people I'm talking about are not the great unwashed masses.  I'm talking about geeks and the digerati who presume to speak about geeky things.  And their influence, as I said, has been growing.

7. Americans are anti-intellectual.  Geek anti-intellectualism is just a reflection of that. Think about what you're saying here; it doesn't make much sense.  I claim that geeks are increasingly anti-intellectual, or increasingly giving voice to anti-intellectual sentiments.  This is a trend, which many people are discussing now because they recognize it as well.  American anti-intellectualism, a well-known phenomenon, goes back to colonial days, and was rooted in our distance from the erstwhile European sources of intellectual life as well as the physical difficulty of frontier life.  The pattern of anti-intellectualism I discern is a relatively recent phenomenon, which has grown up especially with the rise of the Internet.

8. Conservatives never were the anti-intellectuals; it was always the liberal lefties! Glenn Reynolds linked my post, and so some conservatives grumbled about my line, "Once upon a time, anti-intellectualism was said to be the mark of knuckle-dragging conservatives, and especially American Protestants.  Remarkably, that seems to be changing."  Well, I hate to wade into politics here.  I used the passive voice deliberately, because I did not want to endorse the claim that anti-intellectualism is the mark of "knuckle-dragging conservatives" (I don't endorse this phrase, either).  All I meant to say is that this is one of liberals' favorite things to say about American fundamentalists.  I was about to, but did not, go on to say that actually, among the home schooling crowd, liberals and libertarians tend to go in for "unschooling," which is relatively (and not necessarily) hostile to traditional academics, and it is conservatives who go in for uber-academic Latin-and-logic "classical education."  I didn't say that, because I knew it would be distracting to my point.  So I'm kind of sorry I made the remark about conservatives, because it, too, proved distracting.  Suffice it to say that there are plenty of knuckle-draggers, so to speak, everywhere.

9. Are you crazy?  Geeks are smart, and you're calling geeks stupid by calling them anti-intellectual. You didn't know that "anti-intellectual" does not mean "stupid," apparently.  There are plenty of anti-intellectual geeks who are crazy smart.  They aren't stupid in the least.  You also must distinguish between having anti-intellectual attitudes or views, which is what I was talking about, and having anti-intellectual practices. There are plenty of intellectuals in academia who are anti-intellectual.  (There are Jewish anti-Semites, too.)  Just think of any progressive education professor who inveighs against most academic work in K-12 schools, describes academic work that involves a little memorization and practice as "drill and kill," wants the world to institute unschooling and the project method en masse, has nothing but the purest P.C. contempt for the Western canon, advocates for vocational education for all but those who are truly, personally enthusiastic about academics, wants academic education to be as collaborative as possible rather than requiring students to read books, which are "irrelevant" to the fast-changing daily lives of students, and, channeling Foucault, rails against the hegemony of scientists and other experts.  Well, such a person I would describe as an anti-intellectual intellectual.  The person might well write perfectly crafted articles with scholarly apparatus, read classics in her field, and so forth.  It's just that her opinions are unfortunately hostile to students getting knowledge (in my opinion).

10. But the liberal arts are a waste of time.  Studying Chaucer?  Philosophy?  History?  The vague opinionizing is pointless and facts can be looked up. If you believe this, then I have to point out that virtually any really educated person will disagree with you.  Once you have received a liberal education, your mind expands.  You might not understand how, or why it's important, but it does.  That's why people devote their lives to this stuff, even when it doesn't pay much, as it usually doesn't.  If you haven't studied philosophy, you can't begin to understand the universe and our place in it--I don't care how much theoretical physics you've studied.  There are aspects of reality that can be grasped only by critically examining the content of our concepts.  Similarly, if you haven't read much literature, and especially if you are young, then you are very probably a complete babe in the woods when it comes to the understanding of human nature and the human condition; that's why people read literature, not so that they can sniff disdainfully at others over their lattes.

11. What you call "anti-intellectual" is really "anti-authority."  You're merely defending the prerogatives of snooty intellectuals whose authority is on the wane. This is one of the most common and often snarkiest replies I've run across.  But it's also a very interesting point.  Still, on analysis, I'm going to call it flimsy at best.  I'm going to spend quite a bit of space on this one.  Feel free to skip down to the end ("In Sum," before "Conclusion").

Let's distinguish between being opposed to knowledge in its various forms, on the one hand, and being opposed to the prerogatives of intellectuals, on the other.  I claim that the path many geeks are headed down really has them opposed to theoretical and factual knowledge per se. I think the evidence I offered supported this reasonably well, but let me try to make it a little more explicit.

Consider point (1), about experts.  ("Experts do not deserve any special role in declaring what is known.")  That certainly looks like it is about the prerogatives of experts.  If, for example, on Wikipedia I encountered people saying, "Experts need to prove this to us, not just assert their authoritah," that would be fair enough.  That's not anti-intellectual at all.  But going farther to say, "You merely have access to resources, you don't understand this any better than I do" and "You're not welcome here" is to fail to admit that through their study and experience, the experts have something more to contribute than the average Joe.  If you can't bring yourself to admit that--and I submit that the stripe of geek I'm describing can't--then your attitude is anti-intellectual.  (Some people are refreshingly honest about just this.)  Then what you're saying is that specialized study and experience do not lead to anything valuable, and are a waste of time.  But they lead to knowledge, which is valuable, and not a waste of time.

Point (2) (that books per se are outmoded) also, admittedly, has a little to do with intellectual authority--but only a little.  One of the reasons that some geeks, and others, are welcoming the demise of books is that they resent a single person set up as an authority by a publisher.  They say that publishing can and should be more like a conversation, and in a conversation, there shouldn't be one "authority," but rather a meeting of equal minds.  So perhaps those who are pleased to attack the medium of books couch their views as an attack on authority.  Perhaps.  But when I defend books, I really don't care about authority so much.  Of course, when thinking adults read books, they don't read them in order to receive the truth from on high.  They are interested (in argumentative books, to take just one kind) in a viewpoint being fully and intelligently canvassed.  As some of the geeks commenting do not realize, and as some people don't realize until they get to graduate school, it frequently requires a book--or several books--to fully articulate a case for some relatively narrow question.  Scholars should be praised, not faulted, for being so committed to the truth that they are willing to write, and read, discussions that are that long.  The fact that publishers have to pick authors who are capable of mounting excellent arguments at such length doesn't mean that their readers are supposed simply to accept whatever they are told.  At bottom, then, to oppose books as such is to be opposed to the only way extended verbal arguments (and narratives and exposition) can be propagated.  An indeterminately large collaboration can't develop a huge, integrated, complex theory, write a great novel, or develop a unified, compelling narrative about some element of our experience.  If you want to call yourself intellectual, you've got to support the creation of such works by individual people.

Point (3), about the classics, has almost nothing to do with the prerogatives of authority.  The shape of the Western Canon, if you will, does not rest on anybody's authority, but instead on the habits of educators (school and university) as an entire class.  You're not rebelling against anybody's authority when you rebel against classics; you are, if anything, rebelling against the ideas the classics contain, or against the labor of reading something that is demanding to read.  In any case, anybody who comes down squarely against reading the classics is, to that extent, decidedly anti-intellectual.  Face it.

Point (4), which has us memorizing as little as possible and using the Internet as a memory prosthesis as much as possible, has absolutely nothing to do with authority.  If you're opposed to memorizing something, you're opposed to learning and knowing it.  That's quite anti-intellectual.

Point (5) concerns college, and on this many people said, in effect, "I oppose the stupidity of an overpriced, mediocre, unnecessary product that rests on the alleged authority of college professors."  Then it looks like you're criticizing the authority of professors, and so you think I'm defending that.  Well, to be sure, if college professors had no significant knowledge, which (as I think) gives their views some intellectual authority, then there would be no point in paying money to study with them.  But I can defend the advisability of systematic college-level study (I choose these words carefully) without making any controversial claims about the authority of college professors.  I do not, for example, have to assume that college professors must always be believed, that they are infallible, that we should not be skeptical of most of what they say (especially in the humanities and social sciences).  After all, most professors expect their students to be skeptical and not to take what they say uncritically; and only a very dull student will do that, anyway.  If you didn't know that, it's probably because you haven't been to college.  So, no.  I am not merely defending the authority of college professors.  I am personally quite critical of most scholarship I encounter.

In sum, I know that libertarian geeks (I'd count myself as one, actually) love to rail against the prerogatives of authority.  You'd like to justify your anti-intellectual attitudes (and sometimes, behavior) as fighting against The Man.  Maybe that is why you have your attitudes, maybe not.  In any case, that doesn't stop said attitudes from being anti-intellectual, and your issues don't mean that I am especially concerned to defend the prerogatives of authority.  I am not.

Conclusion

I think I've hit most of the high points.

One thing I didn't discuss in my original essay was why geeks have become so anti-intellectual, especially with the rise of the Internet.  Here is my take on that.  Most geeks are very smart, predominantly male, and capable of making an excellent livelihood from the sweat of their minds.  Consequently, as a class, they're more arrogant than most, and they naturally have a strong independent streak.  Moreover, geeks pride themselves on finding the most efficient ("laziest") way to solve any problem, even if it is a little sloppy.  When it comes to getting qualified for work, many will naturally dismiss the necessity of college if they feel they can, because they hate feeling put-upon by educators who can't even write two lines of code.  And the whole idea of memorizing stuff, well, it seems more and more unnecessarily effortful when web searching often uncovers answers just as well (they very misguidedly think).  What about books, and classics in particular?  Well, geek anti-intellectual attitudes here are easily explained as a combination of laziness and arrogance.  The Iliad takes a lot of effort, and the payoff is quite abstract; instead, they could read a manual or write code or engineer some project, and do a lot more of what they recognize as "learning."  The advent of new social media and the decline of the popularity of books are developments that only confirm their attitude.  It doesn't hurt that geek is suddenly chic, which surely only inflates geek arrogance.  If they admitted to themselves that there is something to philosophy, history, or anything else that takes time, hard study, and reflection to learn, but which does not produce code or gadgetry, then they would feel a little deflated.  This doesn't sit well with their pride, of course.  They're smart, they think, and so how could they be opposed to any worthwhile knowledge?

So it shouldn't be surprising that some (only some) geeks turn out to be anti-intellectual.  This is no doubt why many people said, in response to my essay, "This is just what I've been thinking."