An assortment of things that should exist

Occasionally I wish I had time to write a book to explain these ideas in detail. (Some of these are actually book ideas. Some of them are project ideas.)

1. A tutorial system, independent of any university, managed via a neutral online database; and an expanded system of degrees by examination.

2. Textop! I love this idea whenever I think about it!

3. A medium-sized secular (but not anti-religious) chapter book explaining to elementary-aged children, in non-condescending but easy language, why various virtues are virtues and their corresponding vices are vices. It should also explain why moral relativism is silly, which of course it is. I've looked for such a book, hard. I've started to write such a book, but never found enough time to finish. I truly believe such a book would be an enormous best-seller.

4. A system of non-fiction e-books, roughly similar to what you can find here, but which have more intelligently-written scripts, like some of these videos and these powerpoints. I hope to start such a system using the ReadingBear.org software as a platform.

5. This is going to be very hard to explain briefly, and it will sound half-baked, but since when did that ever stop me? Actually, the rough idea (not my version, but something vaguely like it) comes from a Heinlein novel (I forget what Heinlein calls them and where--maybe someone will tell me) combined with my original idea for neutrality on Wikipedia (and before that, Nupedia). I think that civilization could use a society of people who are meticulously and publicly committed to neutrality. Somewhat like judges, but operating in the public sphere, they would make no public judgments on controversial issues of any sort. Their role in society would be, rather, to summarize "what is known"--or what various people take themselves to know--about this and that, according to some clear and deeply studied rules of scholarship and neutrality. If someone, or a group, required a neutral, expert analysis of a question, a field, or a situation, they would provide it. These people would have to be experts in ideology, logic, and the arts of communication, able to tell when a statement is the slightest bit tendentious and to formulate a more neutral one quickly. They would be perfect candidates to write neutral Congressional reports as well as to serve as expert witnesses in trials. There would have to be a fairly elaborate system of professional ethics for this group, and members would no doubt have to be regularly evaluated by their peers. Among other things, they would not be able to serve in politics, as attorneys or judges, or as corporate executives. They could serve as journalists and scholars, but under stringent rules that do not apply to most journalists and scholars. -- Why such a profession? Because the world has gone insane, and it desperately needs people who are professionally committed to explaining obvious things to crazy people. Do you really think that people well-qualified and publicly committed in the way I've described would lack for work? They'd be extremely well employed as consultants, internal and external.

6. A website+app with spaced repetition questions that teach basic facts to school students (preK and up).

I've had quite a few more. I'll make another post later, perhaps, with more of the same.

Feel free to swipe any of these ideas and do a world of good by bringing them to fruition. You might or might not get rich, but if well-executed, you certainly could help a lot of people.


Why is spaced repetition not better known?

Suppose a method let you remember things with a 95% success rate--in other words, whatever information you've put into a system, you'd have a 95% chance of recalling it--and this effect is permanent, as long as you continue to use the method. That would be quite remarkable, wouldn't it?

Well, there is such a method, called spaced repetition. This is the method used by such software as Supermemo, Anki, Mnemosyne, and Memrise.

The figure, 95%, is very impressive to me. I've been thinking about it lately, as I delve into the world (it is a whole world) of spaced repetition. Ordinarily, we expect much less of our study methods; 95% is practically a guarantee. With just 15 or 30 minutes a day, adding maybe 20 questions per day, you can virtually guarantee that you will remember the answers.

In particular, I am wondering why spaced repetition is not used more widely in education. Of course, I'm not the first to wonder why. The answer is fairly simple, I think.

The more I read from and interact with educationists and even homeschoolers, the more I am struck by the fact that many of them hold knowledge in contempt (q.v.). Of course, they will cry foul if you call them on this (q.v.), but that doesn't change the fact (q.v.). So naturally I expect them to sneer at me when I express amazement at the 95% recall figure. I can hear the "arguments" already: this is "rote memorization" (not if you understand what you're memorizing); education is not about amassing mere facts (not just that, no); it suffices that you can just look answers up (wrong); we should be teaching critical thinking, not mere memorization (why not both?).

I am not going to defend the value of declarative knowledge (again) here. I simply wanted to observe what teachers (including homeschooling parents) could do with spaced repetition, if they wanted to. They could spend a half hour (or less) every day adding questions to their students' "stack" of questions; then assign them to review questions (both new and old) for a half hour.

Imagine that you did that, adding 20 questions per day, five days a week, 36 weeks per year (the usual U.S. school year), for six years. This is not impossible to manage, I gather, and would not take that long, per day. Yet by sixth grade, your students would have 21,600 facts in recall with about 95% accuracy. These would merely be the sorts of facts contained in regular textbooks.

Next, consider an exam that drills on a random selection of 100 of those facts. The students who used spaced repetition faithfully would probably get an A on the exam. That, I suspect, is much better than could be expected even from top students who used ordinary methods of study.
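
In case that claim seems too good, here is a quick back-of-the-envelope check, as a minimal Python sketch. It assumes, as above, an independent 95% recall chance per fact--the post's hypothetical figure, not a measured result:

    # Facts accumulated: 20 per day, 5 days a week, 36 weeks a year, 6 years.
    print(20 * 5 * 36 * 6)  # 21600

    # Chance of scoring at least 90/100 on the exam, treating each of the
    # 100 drilled facts as an independent trial with 95% recall probability.
    from math import comb

    def p_at_least(n, k, p):
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    print(round(p_at_least(100, 90, 0.95), 3))  # ~0.99

On those assumptions, a faithful spaced-repetition student would fall short of an A (90%) only about 1% of the time.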

Would students who spent 30 minutes out of every class day on this sort of review benefit from it?

I think the answer is pretty obvious.


How not to use the Internet, part 4: how "social" is social media?

<< Part 3: How the Internet's current design philosophy fails

4. How "social" is social media?

A person who is "social," we think, gets along with others and does not always stay at home. They mix well. This is, we hope, because they like other people, not because they're trying to take advantage of them. They have an interest in getting to know others and doing fun things with them.

So I wonder if "social media" is misnamed.

Social media features the trappings of social behavior: conversation (with head shots and indications of mood), sharing interests, and doing things together. But as they happen in so-called social media, these activities are mostly a weak shadow of what happens face-to-face. The conversation is typically brief. It is rarely one-to-one, but instead one-to-many, rather like broadcasting a message over an intercom to a group of people who are only half-listening and busy broadcasting themselves. We often do not know who, precisely, is receiving our message, and we act as if we do not care. We do not expect a reply, and if we do not receive a reply, we are at worst disappointed; face-to-face, if we received no reply at all, we would think the person we spoke to was rude and cold. In many venues, the conversation happens among literal strangers, often from around the world, which at first glance seems charming—and it sometimes is. But after the novelty wears off, we discover that the rewards are rare. Such interactions rarely involve the personal understanding and regard that friends share.

Conversation online is rarely as meaningful, from a social point of view, as conversation face-to-face among friends and known colleagues. (In terms of logic and rhetoric, I have found that it can be more rigorous and rewarding than much face-to-face conversation. But I'm talking about sociality now, not logic.)

When we get online and engage in "social" media, I wonder how much we—most of us—do so because we like people. I wonder if we do it because we want to use people and promote ourselves. This is not social, properly speaking, any more than PR work is "social." "Now just a minute, Sanger," I hear you saying, "you've gone too far. I like people. I am not a user. How dare you accuse me, and all users of social media, of being selfish 'users'?" I apologize if I offend. I did not accuse all users of social media of being "users" of people. That really isn't my intention. But I have an important point to make and it isn't pretty. When you do an update, are you acting like a friend, or like a PR agent? I'll be honest. Personally, I do a lot more PR updates than friendly updates. I find it a little surprising and charming when my friends and acquaintances respond to such updates, but that doesn't stop them from being, mainly, PR updates. Sure, I understand that some people do mainly engage with their close friends. I think that's nice (as I said before), as far as it goes. But a lot of what we say is personal advertising, so to speak. Some have even taken to speaking of their online identities, to my mind rather pathetically, as their "personal brand," and they invest much time on social networks buffing their "personal brands." This behavior is "social" in a very weak sense, in that it involves people, but not in the strong sense that it involves building friendships.

Social media is a poor replacement for a real social life. To the extent that social media is replacing it, friendship as an institution weakens.

Relevant links:

I was tempted to try to coin a phrase, "anti-social media," but of course someone beat me to it.

On "personal branding," see this Mashable post.


How not to use the Internet, part 3: how the Internet's current design philosophy fails

<< Part 2: The pernicious design philosophy of the Internet

3. How the Internet's current design philosophy fails.

Websites compete for the one genuinely scarce commodity online, namely, attention. That much is understandable, and not likely to change. How they compete is the problem.

Putting lots of menus, internal links, feeds, and self-promotional media on pages drives traffic around a website internally, while putting external links and various media on a given page is thought to increase its value and interest to end users. Competition for limited attention also motivates others to link to us (through reciprocal links, which are often automatic in blogging systems). More information seems better, so more pointers to information and more ways to organize it seem better. Similarly, systems for regularly alerting us to mail, news, blogs, and so forth are straightforward attempts to grab our limited attention. Software-driven media tries to prove its relevance to us this way, and sometimes succeeds.

But if we really are trying to capture and hold each others' attention, isn't this busy, distracting design philosophy puzzling?

Why saturate a blog post (or other media) with a panoply of enticing choices to other things on our website, when we surely know that most users will, by habit, bounce right off of the page that brought them to our website, the very page that has the best chance of keeping them there? Such internal links might in a few cases get your user to go elsewhere on your site, but they also reduce the chance that the visitor will at least read the thing that brought them to your site in the first place. Why not seize the bird in hand? For that matter, why have so many external links right in our own text? Why don't we design our pages so that, when we are graced with a visitor, the visitor will focus on, and actually want to stay to the end of, what brought them there?

Similarly, if we really want to get others' attention, why do we flood their Twitter and Facebook feeds with so much noise? Why do we bore them with too much news, repetition, and chitchat? We are instructed to increase the signal if we want more followers, yet most of us don't. Why not?

Yet if the choices of web designers and marketers seem paradoxical, how much more paradoxical is it that we, as end users, continue to consume—ravenously—what so often contains more noise than signal? Consider that many of us follow hundreds of people on Twitter (far more than we can really keep up with), that we have "friended" people from high school whose names we barely remember, that many of us welcome in more mail than we can reasonably manage, and so on.

Both paradoxes, of Internet producers and consumers, disappear when we reflect on the fact that we are very anxious about "missing out," and Internet producers are merely exploiting this anxiety. It's not too much of an exaggeration to say that we are in a collective panic, a veritable mania, over the fantastic content now online. Information purveyors, who work in this frenzied atmosphere and are end users themselves, naturally go to great lengths to seize their portion of the online public's attention. Faced with a zillion things to pay attention to, calm, slow decision-making seems ridiculously inefficient. In this atmosphere, there is no time to exercise wisdom.

It's bad enough that this design philosophy looks, at least to some extent, self-defeating for information purveyors. Even worse is that it doesn't really benefit the end user. Consider:

Many of us spend a lot of time on Twitter. Why? The people you're following come up with some quite insightful observations? Actually, not so often. Few can say much that is really worthwhile in 140 characters. The best that most of our Tweeps can do is be occasionally interesting, clever, or funny—and otherwise a waste of time. But maybe you get a lot of links to fascinating news articles, blogs, and so forth? Maybe, yet most of the links go unclicked. You are usually quickly in-and-out of those that you do click. Even if you don't bounce out after a glance, even if you actually read something, you'll probably just skim it quickly and forget it, which means you don't really benefit from even the things you spend the most time on. But, you fret, if you don't follow your feed, wouldn't you be out of it and disconnected? Not necessarily. If you focus on a few high-quality news sites and blogs that cover your industry and interests, if you actually read them, you'll almost certainly be more up-to-date about those topics than someone who uses Twitter as a replacement for such sources.

But you knew that. No, surely in your heart of hearts you know that the reason Twitter exists is not information exchange, but a kind of socialization. Yet it's rarely bona fide socialization or friendship-building. It's mostly networking. Most of us, I suspect, just have a somewhat pathetic desire to see our username replied-to and retweeted. This makes us feel relevant, popular, and connected. Our ego swells with each new follower, reply, and retweet. Yet this is clearly illusory. It is increasingly fashionable to apply the self-deprecating epithet "narcissistic" to these, our common social networking habits. We know that, just because our vanity is flattered by public attention, it does not follow that we are relevant, or popular, or connected in any way that matters.

Face it: the only reason we (some of us) waste so much time on Twitter and Facebook is that "everybody else" is there, wasting time too, and we would feel out-of-it and incomplete, somehow, to drop out. The whole advent of truly mass participation in social media, beginning in the mid-2000s with Myspace, seems to reflect not "the wisdom of crowds" but "the madness of crowds," like tulip mania. I think Twitter exemplifies this observation perfectly.

Facebook looks open to the same observations. Why do you spend time on Facebook? Because your Mom and old friends are on it, for one thing. Are you closer to them now than you were before Facebook? Probably not, in most cases, except for the few comments you've exchanged with people you haven't otherwise spoken to in years. On Facebook, we frequently exchange sentiments (and media) with people close and not-so-close to us, and that is being sociable. I won't be so churlish or anti-social as to deny that it's nice. Of course it's nice. But this style of interaction makes socialization less personal than it once was. If you spend a lot of time socializing on Facebook (I'm guessing; no doubt someone's done a study), you probably talk less on the phone. You probably feel less of a need to spend face-time, or even ear-time, with loved ones. Be honest, now: is Facebook really enhancing the quality of your social life and family relations? For society as a whole, is it bringing us closer together and improving our social relations in general? I strongly doubt it. It seems only to make our social lives more "efficient"—and impersonal, too. Doesn't this social media par excellence actually make us less social, in the ways that matter? Why shouldn't I draw that conclusion? Some might have a knee-jerk tendency to call me a Luddite for saying such things. But I live online and have devoted much of my adult life to building bits of the Internet, so that would be silly; can you explain why I'm wrong?

Wikipedia is an amazing and frequently useful resource. (For all my criticisms, I've never denied this.) But when you look something up there, how often do you increase your store of knowledge, rather than gaining a temporary grasp of some not-fully-reliable "fact" and a fleeting sense of understanding? Is your mind significantly improved? Probably not. Even if you spent the evening lost in Wikipedia's hyperlinkage, you are apt to forget most of what you come across. It's intellectual fast food; the taste is strangely compelling, but it is not exactly mentally nutritious. Building your personal store of knowledge requires deep reading and critical study: focus on a topic for a lengthy period of time. The design philosophy of Wikipedia—the copious irrelevant hyperlinks, and the way text tends to be written in smallish, loosely-related chunks instead of woven into a coherent narrative—militates against deep reading and critical study. I'm not saying you can't use Wikipedia as part of a program to do serious research and gain solid knowledge. Of course you can. Some people even have, I'm sure. But I doubt that's how most people use it. Its design encourages surface grazing, not immersion.

Even Google Search itself falls prey to this sort of analysis. What could be better than Google, which delivers highly relevant results and often answers your questions instantly? Well, yes, it is impressive. But we should demand more. There is more to search than faux-relevance and speed. When you do a search to find the best possible information on a subject, is that what you are shown? Not necessarily, because what Google shows you is the most popular and the most recent (and now, if you're logged in with your Google account, what they think you'll be most likely to click on). The highest quality results are too often far down the list. Google's daily influence on us may well have trained us to overvalue popularity and recency, frequently at the cost of more significant qualities like reliability, clarity, historical importance, and depth.

I could give many similar examples, but let me skip to a general conclusion.

The Internet is ostensibly set up to let us help each other navigate the wealth of information online and, by speeding communication and enabling new ways of collaborating, to bring us closer together. But that isn't quite what it does. When I spend much time on social networks, I find the experience to consist more of noise and alienation than signal and connection. What many, including myself, have touted as a potential tool of enlightenment and increased social connection right now seems to be making us less enlightened, less sociable, and less disciplined to boot. The Internet caters particularly to those who want to promote their work. Because so many people are doing this at once, its most striking effect is to distract us endlessly with what are, at the end of the day, mostly trivialities.

Part 4: How "social" is social media? >>

Relevant links:

I know that SEO people have answers to my rhetorical questions about menus and links. Here is a sample (chosen only because it's highly ranked in a Google search and thus, no doubt, played the SEO game well). But the SEO strategy is about building traffic. It is not about encouraging visitors to finish reading what they came for.

It's common advice to Twitterers that they increase their focus and signal in order to get more followers; example.

This Google search is a good place to start reading about how social media is narcissistic.

The famous phrase "the wisdom of crowds" seems to have gotten its start in the book by James Surowiecki of that name. "The madness of crowds," by contrast, comes from Extraordinary Popular Delusions and the Madness of Crowds. I've read the former but not the latter, even though it is free (courtesy of a part of the Internet that really doesn't suck).

I read on the Internet that 71% of all U.S. citizens are on Facebook. So, probably, your Mom is.

While I don't recall ever being accused of being a Luddite, I probably was at some point. Nicholas Carr, though, makes much of the purported "Luddite" aspect of Internet criticism.

Wikipedia doesn't seem to place much stock in narrative coherence, contrary to Citizendium.

On the idea that the Internet generally (Wikipedia is not mentioned) encourages surface grazing and does not increase our knowledge significantly, see this speech of mine.


How not to use the Internet, part 2: the pernicious design philosophy of the Internet

<< Part 1: It's a problem that the Internet distracts us

2. The pernicious design philosophy of the Internet.

The way that the Internet is designed—not graphic design, but overall habits and architecture—encourages the widespread distractibility that I, at least, hate.

This basic notion is not my idea; I freely admit that I learned it from Nicholas Carr. I did not quite notice some features of the Internet until reading Carr's The Shallows some time ago, and the following borrows from Carr. My analysis consists of two related parts, the first about the nature of the Internet, the second about its design philosophy.

First, consider what the Internet is, or the public side of it, so to speak. (Not the technical, "back end" part.) The public side of the Internet consists of (a) information of various media that is presumably of some public interest, together with (b) ways of repackaging, sending, publishing, and rating the information and, especially, of linking to it for public consumption.

Category (a) is rapidly growing to include all of the public information we know of, or at least all of it that can be digitized—and not just all extant information, but also all new information that arrives on the scene. This fact is of interest not just to "geeks," but to everyone who cares about books, news, movies, and virtually everything else that we can communicate and share digitally. Category (a) is the concern of all of humanity, not just geekdom.

This makes category (b), what we might call the net's meta-information, all the more important to us. Google makes the inherently interesting information findable. Wikipedia tries to summarize it. Email, texting, and VoIP (like Skype) allow us to communicate it more efficiently. Twitter gives acquaintances and colleagues a way to share the latest and greatest with us. Facebook gives us easy, one-page access to information about our friends and families. Other sites, like YouTube and Amazon, offer us view counts, ratings, samples, and reviews that are crucial to deciding what long-form content is worth pursuing.

Now I can explain a notion, which again owes a great deal to Carr, of the current two-part "design philosophy" of the Internet, to wit:

Interconnectivity: information that is of some inherent public interest is typically marinated in meta-information: (a) is bathed in (b). It is not enough to make the inherently interesting content instantly available and easy to find; it must also be surrounded by links, sidebars, menus, and other info, and promoted via social media and email. This is deliberate, but it has gotten worse in the last ten years or so, with the advent of syndicated blog feeds (RSS), then various other social media feeds. This is, of course, supposed to be for the convenience and enlightenment of the user, and no doubt sometimes it is. But I think it usually doesn't help anybody, except maybe people who are trying to build web traffic.

Recency: the information to be most loudly announced online is not just recent, but the brand-spanking-newest, and what allegedly deserves our attention now is determined democratically, with special weight given to the opinions of people we know.

Something like this two-part design philosophy, I believe with Carr, is what makes the Internet so distracting. Carr found some interesting studies that indicate that text that is filled with hyperlinks and surrounded by "helpful" supporting media tends to be poorly understood, and that we spend less time on each page of such text. As soon as we come across a link, video, or infographic sufficiently interesting to distract us, the surrounding mass of text becomes "tl;dr". Over time, we have largely lost the habit of reading longer texts, and this problem is apt to get worse.

Moreover, when we and our social networks place a premium on recency, we naturally feel a need to check various news streams and data feeds regularly, and coders oblige this tendency by providing us various distracting push notifications when the latest arrives. Even more, the Internet industry hungrily pounces on new tools and devices that allow people to share and be connected in ever more and newer ways. The Internet increasingly goes wherever we are, first with the advent of laptops, then smart phones, then the iPad—and eventually, maybe "Google Glasses."

The result is that, soon after we surf to a page of rich media, its interconnections lead us away from whatever led us to the page in the first place, even while our various alerts and, just as important, our habits of checking stuff, conspire to pull us away as well. Ironically, what might look to the naive to be an efficient, intelligent system of alerting us and giving us instant access to the latest and greatest online has the effect of making us unable to focus on any one thing for long.

Let that sink in a little. Back in 2000, what we were so excited about, when we thought about the potential of the Internet, was the sheer amount of knowledge that would be available and presented (and developed!) in all sorts of brilliantly engaging ways. Now it is 2012. Is that what we have? Yes—and no. Some of the dream has indeed arrived. Vast amounts of content are there. Frequently it is presented engagingly (although we have a lot more to do before we reach our potential). But it is also presented in a context that is so extremely distracting that we, even despite our best intentions, often do not really appreciate it. We are not encouraged to study, absorb, savor; we are encouraged to skim and move on.

I think there is something really wrong with this design philosophy. We ought to try to change it, if we can. But how, especially considering that it mostly grew organically, not as a result of any grand design?

Part 3: How the Internet's current design philosophy fails >>

Relevant links:

Nick Carr's blog, "Rough Type"

To see how SEO analysts (and many webmasters) think about recency, see "New Rules: Fresh Content Is King" (undated, natch!).

Of course, the Google Glasses that appeared in the video are probably vaporware, for now.

"Vast amounts of content" that is "presented engagingly"? Well, Wikipedia and YouTube, for just two examples. I didn't say presented perfectly, but their popularity is evidence of their being engaging. Their vastness is obvious. Many more examples could be given.


How not to use the Internet, part 1: it's a problem that the Internet distracts us

For almost a year, I've been at work on a very long essay about some problems with the Internet and social media in particular. I've worked on it now and then and occasionally I think I'm really going to finish it—but I never do. So, as a concession to failure, or partial failure anyway, I have decided to divide it up into several self-contained brief essays. I'll release an essay a day and see how it goes. Here is the first.

Note: rather than tempt the reader to click out of the essay, I've moved links to the end, and annotated them. This is an example of one way in which the Internet could change (although I'm not exactly holding my breath).

1. It's a problem that the Internet distracts us, dammit.

I too am distracted by ubiquitous digital media. This is a problem—a common, serious, and real problem—and I wish I could get to the bottom of it, but it is very deep.

In the last several years, like many of us, I've often felt out of control of my time. Following basic time management principles is more difficult than ever, especially when I'm spending time online and looking at screens generally. My situation is probably similar to that of many people reading this: I check my mail many times per day; Twitter and Facebook beckon, as do my favorite online communities (and I dread joining Google+); people push the latest news at me; people Skype me; and the time seems to slip away in spite of my better intentions to, you know, get work done.

What I think of as an unmitigated vice has been complacently described by some as "multi-tasking," as if allowing yourself to be distracted were some sort of advanced technical ability. We are told (though, I gather, not by most psychologists) that being able to multi-task effectively is one of the skills that should now be in every plugged-in person's toolkit. But the notion that multi-tasking is an advanced ability is merely an excuse, I think. When you are "multi-tasking," usually, you are not using your time efficiently; you are simply letting yourself be distracted, because you don't want to "miss out."

That's not all. As much as I hate to admit it, the Internet also seems to have made it difficult for me, as it has Nicholas Carr and Richard Foreman, to write and pay attention to long texts, and to think deep thoughts. To be sure, I still try and occasionally succeed. I seem to skim more along the surface of things, despite myself. Thoughtful insight is far from impossible, but it seems to require more deliberate effort. Creativity still flows, but less often and less spontaneously. Believe me, I wish it weren't this way. I fear that I, too, am becoming one of Carr's "shallows" and one of Foreman's "pancake people."

Many heavy Internet users have frankly admitted the same, often apparently with pride or without shame—or at least without hope of improvement. Do you feel the same?

The nature of these now-common problems—a mind ironically made poorer in spite of, indeed by, the Internet's riches—has been much discussed, for example by Maggie Jackson in Distracted, Mark Bauerlein in The Dumbest Generation (a much better book than you might expect from the title), Nicholas Carr in The Shallows, and Jaron Lanier in You Are Not a Gadget.

So why do we let ourselves get so distracted? Why are we so often incapable of sticking to a single task?

I think there is a simple answer, actually: we intensely feel the presence of all the world's information and people and the digital fun that entails, miraculously made available to us. Impersonal information made things bad enough for us early adopters in the Internet's earlier days. But now that everybody and his grandma (literally) has joined social networks, the situation has gotten a lot worse, for me at least. We are constantly available to our colleagues, friends, and acquaintances, so they may "interrupt" us at random times throughout the day, offering insights and telling us that some new website or blog post or picture or video is a "must read" or "must see," or simply reporting their own sometimes-interesting thoughts and news. We constantly feel pulled in a thousand directions. This general problem seems likely to worsen as our access to the world's information becomes more and more complete, speedy, and convenient. Before long, we will have virtually instant access to every bit of content we might want, always and everywhere, and with a minimum of effort (though not necessarily with a minimum of cost). We are nearly there, too.

This revolution—inadequately described as a revolution of "information" or the "digital" or the "Internet"—is wholly unprecedented in history. Not long ago I had to tell blasé skeptics that it is not "hype" to call it a revolution. But clearly, a lot of regular folks, not necessarily in the vanguard, have started to understand the enormousness of how the world has changed in the last fifteen years or so. It's a real revolution, not a mere fad or development, and even as we stare it in the face, it is still hard to grasp just how far-reaching it is. We have been swept up by one of the most novel and dramatic transformations that humanity has ever undergone. We read about "revolutions" throughout history—of the printing press, of religion, of ideology, of industry. This is another one; it's the real deal. It's more important than, for example, who will be elected president in 2012, whether the Euro will collapse, or Iran's nuclear ambitions.

Anyway, this revolution is so novel that it is not surprising if we act like kids screaming in a candy store, not knowing what to sample first. Maybe it's time that we started taking stock of the Internet's candy store more like mature adults and less like sugar-crazed children.

For some time, I've known that I would have to come to a personal understanding of this situation, and make some personal resolutions to deal with it. Like so much else, I've been putting this off, because the problem is massive and I haven't felt equal to it. I'm not sure I am yet. Nobody seems to be—not even the above writers, who offer bleak reports and little in the way of helpful advice. The Internet bedazzles us. But for me, things have come to a head. I do not want to go through the rest of my life in the now all-too-familiar state of Internet bedazzlement, if I can help it. For me, it begins now. It's time for me—and maybe, for you too—to get over the fact that all of the world's information and the people that drive it are (or soon will be) accessible in moments. But how?

Some people won't admit that there is even a problem in the first place. They celebrate the Internet uncritically, leaping upon every new site, app, or gadget that promises to connect us in newer and deeper ways. But it is precisely the wonders of the Internet that we celebrate that have become a major distraction. Some people don't seem to want to admit that distractibility is a serious problem; they do nothing but offer blithe predictions and analysis of how thinking, social interaction, education, etc., are moving into a wonderful new age. That is all very well as far as it goes, but I sometimes wonder if some of the recent economic downturn might be explained by the amount of time we waste online. Surely it's possible that the global economy is significantly less productive because we're distracting ourselves, and each other, so much, and with so little to show for it.

Other people seem to think that there's nothing that can be done about our distractibility and "shallowness." Whatever their disagreements, Internet commentators Clay Shirky and Nicholas Carr seem to agree on this: the brevity of information chunks, the pace of their flow, and the fact that they are mediated democratically by giant web communities are all inevitable features of the Internet; so we can't help but be "distracted." Or so Shirky, Carr, and many techie A-listers seem to think. This is where modern life is lived, for better or worse. If you want to be part of things, you've got to jump into the data stream and do your best to manage. If your distractibility is making you "shallow" or "flat," that is just a new and unfortunate feature of life today.

I will not "go gentle into that good night." I can't help but observe that this sort of techno-fatalism might be why some Internet geeks are becoming anti-intellectual. I'm far from alone in my view that the overall tendency of the Internet, as it is now and as we use it now, is to make us less intellectual. So, many Internet geeks make a virtue of necessity and begin slagging intellectual things like memory (and thus declarative knowledge), books and especially the classics, expertise, and liberal education. At least critics like Carr and Lanier have the good taste and sense to bemoan the situation rather than mindlessly celebrating it.

As for me, I strongly disagree with techno-fatalism. Isn't it obvious that the Internet is still very new, that we are still experiencing its birth pangs, and that dramatic changes to how we use it will probably continue for another generation or two? Isn't it also quite obvious that we have not really figured out how to design and use the Internet in a way that is optimal for us as fully-realized human beings? I love the new universal accessibility of so much recorded knowledge. Over the last dozen years I have been a booster of this myself, and in my work I still aim to enlarge our store of free, high-quality knowledge resources. I also deeply love the free exchange of ideas that the Internet makes possible. These things are why I "live online" myself. I do agree with the boosters that all this will, in time, probably, change us for the better. But the idea that the mindless digital helter-skelter of the early 2000s is how things will always be, from here on out, is highly doubtful.

We simply can't go on like this. I think we can change, and we should.

Part 2: the pernicious design philosophy of the Internet >>

Relevant links:

A good place to start learning about what psychologists say about Internet distraction would be via this search.

Nicholas Carr's famous essay, "Is Google Making Us Stupid?" in The Atlantic, is one of those articles you kind of wish you'd written. It focused many people's thinking about the effect of the Internet on how we think. I actually prefer his book The Shallows, however.

The "pancake people" reference is to a short essay by Richard Foresman in Edge.

For some of what I've said about the "revolution" that the Internet and digital media represent, see this, this, and this, just for example.

When I think about the suggestion that it's not a bad thing that information chunks are getting smaller, I think of this Britannica Blog post by Clay Shirky, lauding short-form online communication as an "upstart literature" that will "become the new high culture." Perhaps an older, more widely-read introduction to this notion would be Small Pieces Loosely Joined by David Weinberger--it's just that the pieces are even smaller and looser than when Weinberger published that book (2002).

"Go gentle into that good night" is, of course, a phrase from the poem "Invictus."

The surely absurd notion that there is a new geek anti-intellectualism is broached in this much-discussed essay.


Efficiency as a basic educational principle

It occurred to me that there is a simple pedagogical principle that explains the appeal of very early learning, homeschooling, and certain (not all) traditional methods of education, as well as why certain other methods of education strike me as a waste of time.

I hereby dub the following the principle of individual efficiency:

Seize every opportunity to help the individual student to learn efficiently--which occurs when the student is interested in something not yet learned but is capable of learning it, and especially when learning it makes it easier to learn more later.

In other words, when an individual student is capable of learning efficiently, seize the opportunity.  If students spend too much idle time when they could be learning, if they are learning only a little, if they are not interested in what is being taught, if they have already learned it, or if they will not understand it, then they aren't learning efficiently.  When a certain approach ceases to conduce to efficient learning, try something else.

Why insert the word "individual" here?  Because "efficiency" in education has entailed, historically, the "industrial model" of education.  It might be an efficient use of resources for the state to pay teachers to teach 35 students the same thing at once, but this is decidedly not the most efficient way for the individual student to learn.  More on this below.

So far, the principle is unremarkable.  But see how I apply the principle to a variety of educational issues.

1. Very early learning, by certain methods, is efficient learning. Under-fives, and even babies, are capable of learning much more than most people give them credit for.  Just for example, they are capable of learning to read.  Maybe more importantly, the use of books above all--but also flashcards, powerpoint presentations, videos, and iPad apps--can efficiently teach very young children vocabulary and basic concepts and skills that historically were not introduced until some years later.  A lot of old dogmas about "developmental appropriateness" are going by the wayside as parents discover ways to teach their tiny tots much earlier, but in a fun, engaging way.  Just bear in mind, I do not think that pressuring small children to learn is efficient.  That makes them lose interest--which is inefficient--and not just in the "pressured" subject, but in all learning.  Indeed, it is usually best to avoid pressure, whenever possible, regardless of age, which leads me to the next point...

2. Homeschooling's main advantage is its higher potential for efficiency. In a homeschool (not a radical unschooling situation), parents can choose exactly the right books and other materials to match the student, both her interests and her capacities.  (I am by no means saying that most homeschoolers actually do this, though.  Just that they are free to.)  Endless sifting for exactly the right educational materials and methods is extremely important if you want to keep your student's attention and interest, and to keep challenging her.  Done right, a homeschool involves constantly challenging the student, with no unnecessary review.  (But beneficial review, yes.)  Teaching my five-year-old homeschool student, I have developed a sense for what learning "feels like": it seems challenging indeed, but not so difficult as to be boring or impossible; and it lasts for a limited length of time, about the length of my son's attention span.  We rarely spend too much time on a subject, but when we study, we tend to learn efficiently.  In my own schooling, in good public schools, learning was rarely so efficient.  As a result, my son is far better educated than I was at age five.

3. Unschooling, or at least "radical" unschooling, is often inefficient. Unschooling in its purest form entails allowing the child to choose both the subjects and the methods of study--and even whether to study at all.  The parent does, of course, support and foster the child's pursuits.  It would be wonderful if it always worked.  Unschooling does hold some appeal to me, because I think it is extremely important that students enjoy learning--efficient learning can't happen if it isn't motivated learning.  Insofar as unschooling emphasizes listening to the student and getting heavy student input, I'm a fan.  But unschooling in its purer forms permits students to avoid learning subjects when they, and their future learning, could benefit hugely.  However much fun it might be for the student, however well it might prepare them for a particular trade, this is inefficient as a method of getting a liberal education.

4. Memorizing some facts is efficient. The reason students should memorize, for example, basic arithmetic facts is efficiency.  While I agree that they should be fully exposed to mathematical concepts and multiple methods of attack (understanding math is paramount), memorization of math facts is important because it makes it much easier to do higher math and science later.  Use of calculators in elementary math is sometimes defended on grounds that adults use calculators, too, and learning how to use them is efficient.  That may be, but it is far faster, and more efficient, to be able to do basic arithmetic without a calculator.  This is only an example.  Another example, which I'm going to choose just to annoy people, is history dates.  Consider this list of important dates in history, which looks pretty good to me.  If you're a well-educated person, you should know some such list of dates.  Such dates are the backbone needed to contextualize the historical order and length of other historical events.  If you don't have quite a few of those dates under your belt, you can't really make much sense of other dates that you might come across in reading history--which means you won't learn history properly, and you won't want to learn history because it will be a puzzle.  So it is necessary to commit a fair number of dates to memory simply to make later history more comprehensible and interesting.

5. Reading many carefully-chosen, well-written books is an efficient way to learn. Why is book-reading so efficient?  A well-written book, when chosen to match the student's interest and comprehension level, is designed to teach information in as efficient and attractive a way as possible.  That, after all, is why we say certain books are well-written.  Videos can achieve the same thing, but most videos cannot teach vocabulary and language skills as well as books.  While it is not so popular for educationists to come out against books, they talk up a lot of other methods that do not require book-reading, and--well, there's only so much time.  My approach is different.  Our homeschool is completely "book-centered."  We have six bookcases filled to overflowing (we need another one now) with children's books, both nonfiction and fiction.  I am absolutely convinced that my son is reading and learning far above grade level not because he has a high IQ but simply because I've read a zillion books to him, explaining everything in them that I thought he might not understand.  I truly believe that, of the various general methods of learning, this is the most efficient way to gain knowledge.  It even makes certain "skills development" unnecessary.  Because we have read so much, we have not needed to study vocabulary, spelling, or even basic grammar as separate subjects (see below).

6. Incorporating illustrative multimedia to supplement reading is efficient. Book-reading is great, but you can make it even better by having an iPad on hand to instantly look up pictures, videos, maps, and encyclopedia articles to help clarify what is difficult for you to explain in words.  Sometimes a picture or video is absolutely invaluable in explaining some subject.  I find this to be especially true in geography, and to a slightly lesser extent history and science.  For science reading, I frequently do "mini-experiments" with whatever is on hand, using videos viewed on the iPad as a fallback.

7. Learning the texts of Western civilization is efficient. The more of the ancient Greek and Roman classics that students learn, the better they will be able to understand why our society thinks, judges, and works the way it does.  This goes just as well for the most important works of literature, philosophy, religion, and art throughout the ages.  Studying these texts is efficient because someone with a great foundation in the liberal arts finds it much easier to read and learn from all sorts of other texts.  I suppose the same would also go for students of other great, ancient civilizations like China and India, but I don't have any experience with that.  Anyway, it is profoundly inefficient to expect students to be able to think or say anything interesting, or to learn much, about the big policy questions that are frequently the subject of "bull sessions," without prior exposure to "the best that has been thought and said" about history, philosophy, or political theory.  The same can be said for the discussion of classic literature taken out of context.  A student who is mostly ignorant of history and other classics simply can't appreciate, or say much that is not banal or simply incorrect, about a work of classic literature.  This is why reading of the classics has declined: if you do it halfway, these books are just going to seem confusing and boring.  If you go at it whole hog, you'll actually enjoy them and learn a lot from reading them.

8. Grounded in enough reading, it is much more efficient to write a lot than to do "language arts" workbooks. Elementary school students spend hours and hours doing workbook exercises about grammar, vocabulary, and spelling.  Some such work is, I agree, beneficial.  But a lot of such work is unnecessary busywork if one has read and written a lot.  The best way to get to an 800 verbal score on the GRE is not by studying vocabulary as a subject but by reading a lot of books and being introduced to vocabulary in context, and then looking up words that are puzzling.  If, through reading, one is extremely familiar and comfortable with correct English, reproducing it in written form is much easier, and some of the time spent on grammar, vocabulary, and spelling becomes unimportant.  Far more efficient is to do a lot of writing daily, to get copious feedback from a very literate person, and to revise.  All that said, I am inclined to think that students should go through a full, systematic course of grammar a few times in their academic careers, and some supplementary work on spelling and vocabulary is a good idea.  It's also important to teach children how to use and appreciate reference books as they write.  If a student enjoys browsing style guides, you've done something right.

9. Ed tech's main appeal is its efficiency.  When inefficient, it sucks. Educational technology--I think of websites like WatchKnowLearn, Reading Bear, and educational apps on the iPad--can greatly increase the efficiency of learning.  At its best, ed tech increases student interest and attention span while delivering information or skills practice in a way that fosters understanding and memory.  It doesn't always work that way, though.  Some educational software and Web tools and communities are decidedly less efficient than more traditional methods.  We avoid it in our homeschool.  Sometimes, though--as with the "Presidents vs. Aliens" app--I'll let my son have a little "inefficient" fun if it means he's going to know presidential facts backwards and forwards.  Besides, sometimes having fun in this way makes something that otherwise might seem boring, like a thick volume about presidential history, suddenly more interesting.

10. The project method is inefficient. Now let me explain why I have it in for the project method.  I have loathed this method since it was inflicted upon me back in the 1970s and early 1980s.  It never fails to amaze me that teachers and education professors apparently can't see--or worse, don't care--that making models, playing dress up, putting on lame plays, doing endless navel-gazing projects about themselves, and so forth, are an amazingly inefficient use of time.  It is true that students can learn a few things very well from such projects.  But in the same 20 hours that it takes to do some elaborate history project, a student could have read ten related or increasingly difficult books all on the same subject, written a serious report, and emerged a little expert.  True, he wouldn't be able to point proudly to a model of the pyramids or a mud hut village.  But he would actually know something about ancient Egypt or African village life, something that he would remember.  Moreover, if the books are carefully chosen to fit the student and for quality, and the student can choose the report topic and gets enough help with it, the student can actually like the reading and writing, as much as if not more than yet-another-art-project.

11. Many textbooks are inefficient. Textbooks are written to satisfy textbook adoption committees which are devoted to requirements that often make textbooks deadly boring, especially in the earlier grades.  Going through a textbook might guarantee that you cover the "scope and sequence" of educational standards, but if students are bored, if they find some parts too easy and other parts insufficiently detailed, if textbooks insert unnecessary bias or are rendered so vanilla as to lack any personality, the result won't inspire anyone.  As a result, students don't learn what they should from textbooks, which is just to say that textbooks are inefficient.  We find that by replacing one big textbook with many shorter books, chosen for maximum student interest due to excellent writing and accessibility, we learn far more than we would by studying a textbook.  That said, I believe there are still some subjects, at some levels, that are best approached with a textbook--math is an example.

(Added later.) 12. Spaced repetition is efficient. The spaced repetition method, well known to psychologists but shockingly little known among actual educators, has the student refresh information in memory, via active (quiz) review, just before it is forgotten. Free software (such as Supermemo, Mnemosyne, and Anki) makes such review easy. Most students can achieve a 95% recall rate for information put into such a system, as long as a daily review (which needn't be very long or arduous) is done. The same cannot be said for worksheets, cramming for exams, or passive review of information.
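
To make the mechanism concrete, here is a toy Python sketch of the scheduling idea.  It is only an illustration of the principle, not what the real programs do: Anki and Mnemosyne use more elaborate descendants of SuperMemo's SM-2 algorithm, which also tracks a per-card "easiness" factor.  This version simply doubles the gap after each successful recall and resets it after a lapse:

    from datetime import date, timedelta

    def next_interval(days, recalled):
        # Grow the gap after each success; start over after a lapse.
        return days * 2 if recalled else 1

    interval = 1
    due = date.today() + timedelta(days=interval)
    for recalled in (True, True, True, False, True):
        interval = next_interval(interval, recalled)
        due += timedelta(days=interval)
        print(f"recalled={recalled}: next review in {interval} days (on {due})")

The growing intervals are the whole trick: well-learned cards come up only rarely, so the daily review stays short even as the stack of questions grows into the thousands.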

------------------------------

I'm sure I could go on, but I think I've demonstrated that the principle of individual efficiency does pretty deeply explain my stands on various educational issues.  Well, at least I find that interesting; I seem to have put my finger on a system.

For the philosophers out there, if you want a further argument for the principle itself, I think it follows from a traditionalist goal for education, together with a basic principle of rationality.  Given that the goal of education is the development of academic knowledge and skills (including a broad and deep comprehension of Western civilization and science, a.k.a. the liberal arts), the next big question in philosophy of education is how to describe the most rational means to this end.  The principle of individual efficiency is my stab at that.

I wonder--would progressive educators in their many contemporary forms disagree with the principle, or would they instead disagree that the principle supports my conclusions?  I'm guessing it would be the latter.  But I think it is ultimately the principle itself that they are bound to reject.  Ultimately, progressive education is not about individual efficiency in education at all.  Maybe I'll say what I think it's really about later.

Some people will inevitably read the title and first few paragraphs of this post, skim the rest, and come to a major misinterpretation.  Misinterpretation #1. Some might assume that I am defending the "factory model," merely because the word "efficiency" is associated with that in the field of education.  Nothing could be further from the truth.  I reject the factory model and instead embrace homeschooling precisely because the factory model is so inefficient.   Misinterpretation #2. Some might suppose that I am defending "tiger moms" who constantly pressure students to learn and achieve.  Well, no.  Efficiency is about quality, not quantity.  It requires some discipline, but not harsh discipline.  I think education is most efficient when the student is sincerely interested and motivated.  That requires plenty of breaks and plenty of student input.


What I dislike about experts: dogmatism

Since I started Citizendium, which invites experts to be "village elders wandering the bazaar" of an otherwise egalitarian wiki, and am well-known for criticizing Wikipedia's often-hostile stance toward experts, I am sometimes held up as an example of someone who places too much trust in experts.

In fact, I have quite a bit less trust in experts than most people have.  When something that strikes me as at least capable of being reasonably doubted turns out to be the overwhelming majority opinion of experts, I become very suspicious.  Moreover, this has long been my attitude--not just recently, but since before Citizendium, or Wikipedia for that matter.  Let me explain why, and remove the puzzlement these claims must provoke.

First, however, let me explain why I respect and honor experts.  If they really are experts, and not just "the most knowledgeable person in the room" on a subject, it is because they have so goddamn much knowledge about their subject.  Even if I disagree with an expert's views on controversial issues, I stand in awe when it is clear that they can explain and evidently understand so much.  Knowledge per se is deeply important to me--and not just correct memorized information, which computers can ape, but deep understanding.  It is extremely satisfying to have demystified something that was previously puzzling, or to have come to a more complex understanding of something that had seemed simple but was not.  A person has my respect who has grasped much of what is, to me, still mysterious and complex about a subject.

Still, my respect only goes so far, because I am aware of a certain problem with expertise and especially with the social nature of contemporary research.  People are sheep--even very smart people, trained in critical thinking.  When there is a "consensus" or "broad agreement" in many fields, it becomes politically difficult to express disagreement.  If you do, you seem to announce yourself as having some serious personal flaw: stupidity, ignorance of your own field, not being current on the literature, possessing poor judgment, or being ideologically motivated, dishonest, or unbalanced.  This is true not just of obviously controversial or politically charged debates; it is also true of completely abstract, apolitical stuff that no one outside of a discipline gives a rat's patoot about.

Thus, due to the lemming-like conformity among many researchers, academic agreement tends to feed on itself.  An attitude becomes the only one worth expressing, even if, on the more objective merits of the evidence itself, such confidence is not warranted at all.  Such biases can swing 180 degrees in one generation (think of behaviorism in psychology).

I don't know enough about intellectual history to say for sure, but I suspect things weren't always quite as bad as they are now.  I suspect that academic conformity has been growing at least since I was in college myself, anyway.  There have been intellectual trends or "schools of thought" for millennia, of course, and when scholarship was dominated by the Church and religion--in medieval times and to a lesser extent until the 20th century--certain points of doctrine were held with easily as much dogmatism as one can find anywhere in academe today.  But in the last century, some causes of academic conformity have certainly grown more powerful: academic success is gauged by how much one publishes and in how highly ranked a set of journals, while researchers are expected to build upon the work of other researchers.  There is, therefore, an economic incentive to "play it safe" and march in lockstep with some particular view of the subject.  This situation has become even more dire both due to the extreme competition for jobs in academe and research, and due to the literal politicization of some fields (i.e., the devotion of whole disciplines to political goals).

This problem has become so pronounced that I find it is impossible really to evaluate the state of knowledge in a new field until I have come to grips with the leading biases of researchers--how professional conformity or political dogma might be giving an aura of certainty or consensus to views that ought, in fact, to be controversial and vigorously discussed.

I could cite several instances of unwarranted confidence in academic dogma from the fields of philosophy, psychology, and education, but frankly, I don't want to offend anyone.  Academics, of course, don't like to be called sheep or dogmatists.  Besides, I think my point will be more effective if I let people supply their own examples, because you might disagree with mine.  Care to discuss some in comments?

Let me conclude with a prediction.  Contrary to some, the Internet is not going to limit the prerogatives of experts; they have important roles to play in society, and we cannot function at our best without our most knowledgeable people in those roles.  But one of the more interesting and often delightful aspects of the Internet is that it provides a platform for people with nonstandard views.  It will also--it does not yet, but it will--provide a way to quickly compare current views with views from the past.  These two comparison points, nonstandard and historical opinion, were not so readily available in the past as they are or will be.  The easy availability of these dissenting views will make it increasingly obvious just how dogmatic academe has been.  Indeed, this has already started, and is one reason why experts and academics as a group have taken some hits to their credibility online.  Finally, I observe that, for all the ovine nature of researchers, youth often loves to smash idols, and new "education 2.0," degree-by-examination, badge, and other schemes might make such nonconformist idol-smashing a better career option.  I suspect we will see a crop of younger researchers making careers on the newly-viable fringes of academe by pointing out just how ridiculously overblown certain academic dogmas really are--and students eager to save on tuition and get a broader perspective will flock to tutorials with such independent scholars.


The future, according to Kathy Sierra

Kathy Sierra blogged earlier today six years ago (!) that "The future is not in learning"; the future lies, instead, in "unlearning."  This sounds awfully like another example of the geek anti-intellectualism that I love to hate; we'll see about that.  Since that's how the post comes across--beginning with the title--it has already gotten a lot of attention.  Geeks just love to hear that, in the future, they won't have to learn things.  They just love to talk about how they'll be able to upload their memories to the Internet, how Internet search makes memorization a waste of time, how they just can't make themselves read books anymore, how the intellectual authority of experts is passe, and how the liberal arts and college generally are a waste of time.  For all the world it seems they really hate learning.  So when Kathy Sierra says that the future is not in learning, they start salivating.

In fact, Sierra's main message is excellent, one that is not at all anti-intellectual.  She's saying that since the times they are a-changin' so much, we have to roll with them faster and faster, as change accelerates.  This is itself very old news, but it's always nice to be reminded of such perennial wisdom.

Too bad that, as a premise, it hardly supports the post's dramatic title or its opening pseudo-historical timeline.  Her timeline asserts that in the 1970s, the question was (somehow--who knows what this means?) "how well can you learn?"  In the 1990s, it was "how fast and how much can you learn?"  But today we have evolved!  Now it's about how quickly you can unlearn!

If we take the latter claim in any general sense, the argument is fallacious, of course.  It is true that in the 1970s education theorists talked a lot about how well we learn; they still talk about that.  It's also true that there was a movement to accelerate education, especially among young children, which had its height in the 1990s and in the heyday of NCLB.  But when Kathy Sierra next points out the homey, perfectly old-fashioned truth that we must change our habits to keep up with the times, she is changing the subject.  The following argument contains a fallacy:

1. We should unlearn habits that do not conform to new developments.
2. New developments are now coming fast and furious.
3. Therefore, the important new virtue is not learning, but unlearning.

The premises (1 and 2) do not support the sweeping conclusion (3).  I do not contradict myself when I maintain that we should still learn quickly and a lot (allegedly the "1990s" virtue--I thought it was an ancient virtue, but maybe that's just me), even while maintaining that we should change our habits as they become outdated.  The premises do support a much more modest conclusion: that being fast and flexible in how we change our habits is a new virtue.  But to say so entails neither that "the future is unlearning, generally" nor that "the future is not learning, generally."  So this is a plain old logical fallacy.  I leave it as an exercise for the reader to name the fallacy.

Lest I be accused of misconstruing Kathy Sierra, let me add this.  I know that she spends most of her post explaining how we should unlearn certain outdated habits.  I agree that this is excellent and timely advice.  But that does not stop her from titling her post "The future is not in learning..." and contrasting the new virtue of "how fast you can unlearn" with the old virtues of "how well you can learn" and "how fast and how much you can learn."  But the fact of the matter is that unlearning outdated habits is a very, very different kind of learning (or unlearning) from learning facts.

Besides, even if you say that what we should unlearn is certain facts, the facts tend to be about narrow, practical, and necessarily changeable fields, viz., technology and business.  Just because technology and business are changing quickly, that doesn't mean a lot of our knowledge about other topics is becoming uselessly outdated.  If that's the argument, it too is obviously fallacious.

So does Kathy Sierra deserve to be called an "anti-intellectual" for this argument?  Well, on the one hand, one can't take her argument at all seriously as an argument against "learning."  On the other hand, she does seem to have something of a disregard for logic, and if she doesn't literally believe her title, she does seem to pander to the anti-intellectual sentiments of certain geeks.  I hate to be uncharitable, and I wouldn't want to accuse her of encouraging people to stop learning so much, but look--the anti-intellectual sentiment is in the title of her post. Yes, maybe she is merely gunning for traffic by pandering to geek anti-intellectualism.  But why would she want to do that if she didn't share their biases against learning?

UPDATE: see below.  Kathy Sierra responds to point out that this is a six-year-old post.  I don't know quite why I thought it was posted today!  But I've already made a fool of myself, and I'm not one to stop doing so after I've done it publicly, especially at someone else's expense.


Why online conversation cannot be the focus of a new pedagogy

One of the most commonly touted features of a new, digitally-enhanced pedagogy, championed by many plugged-in education theorists, is that education in the digital age can and should be transformed into online conversation. This seems possible and relevant because of online tools like wikis and blogs.  There has been a whole cottage industry of papers and blogs touting such notions.  Frankly, I'm not interested in grappling with a lot of this stuff.  Actually, I wish I had time, because it's kind of fun to expose nonsense to the harsh light of reason.  But for now, let's just say that I've read and skimmed a fair bit of it, and I find it decidedly half-baked, like a lot of the older educational theories that hoped for various educational "reforms."  Some reference points would include fuzzy buzzwords like connectivism, constructivism, conversation, the social view of learning, participatory learning, and many more.

I am interested in briefly discussing a very basic question that, I imagine, underlies a lot of this discussion: can online conversation serve as the focus of a new pedagogy?  I've already written a bit about this in "Individual Knowledge in the Internet Age," but I wanted to return to the topic briefly.

A lot of educators are--not surprisingly--very much struck by the fact that we can learn a lot from each other online.  This is something I've been aware of since the mid-90s, when I ran some mailing lists and indeed did learn a lot from my fellow early adopters.  I continue to learn a lot from people online.  Quora is a great way to learn (though it's mostly light intellectual entertainment); so are many blogs and forums.  And of course, wikis can be a useful source of learning both for writers and readers.  These all involve an element of online community, so it of course makes sense that educators might wonder how these new tools could be used as educational tools.  I've developed a few myself and actively participate in other online communities.

But when we adults use these tools and participate in these forums, we are building upon our school (and sometimes college) education.  We have learned to write.  We have (hopefully) read reasonably widely, and studied many subjects, giving us the background we absolutely require to understand and build upon common cultural references in our online lives.  But these are not attainments that school children share.  (My focus here will be K-12 education, not college-level education.)  You are making a very dubious assumption if you want to conclude that children can learn the basics of various subjects by online participation modeled after the way adults use online tools.  Namely, you are assuming that children can efficiently learn the basics of science, history, geography, and other academic subjects through online tools and communities that are built by and for educated people.

Of course they can't, and the reason is plain: they usually have to be told new information in order to learn it, and taught and corrected to learn new skills.  These are not "participatory" features.  They require that a teacher or expert be set up to help, in a way that does not correspond to the more egalitarian modes of interaction online.  Moreover, except in some fields that are highly interpretive such as literature or philosophy, the relevant information cannot be arrived at via reflection on what they know--because most children are quite ignorant and much in need of education.  To be able to reflect, they need input.  They need content.  They need food for thought.  They need training and modeling.  They need correction.  We adults don't experience these needs (at least, not so much) when we are surfing away.  We're mostly done learning the concepts, vocabulary, and facts that we need to make sense of conversation in the forums that interest us.

So the reason online conversation cannot be the focus of a new pedagogy is that online conversation, as used by adults for learning, requires prior education.

I have nothing whatsoever against K-12 classes putting their essays or journals on blogs, or co-writing things using wikis, or in other ways using online tools to practice research, writing, and computer skills.  But we should not fool ourselves into thinking that when children do these things, they are doing what we adults do, or that they're learning in the ways we do when we use blogs, wikis, etc.  They aren't.  They're using these as alternative media for getting basic knowledge and practicing skills.  We adults mainly use these media to expand our knowledge of current events and our special interests.  The way we use them is radically different from proper pedagogical uses precisely because our uses require a general education.

Are you skeptical?  Well, I expect that if you're reading this sentence right now, you're pretty well educated.  So consider, please.  What would it be like to read a science blog, or a Quora answer on a scientific question, without having studied high school mathematics and science?  Pretty confusing.  What would it be like to read any of the better blogs out there--the ones in your own blogrolls or feeds--if you had not read a lot of literature and in other ways learned a lot of college-level vocabulary?  Difficult and boring.  What would it be like if you had to read the news, or political blogs, or Wikipedia's current affairs articles, having only minimal knowledge of geography and civics?  Puzzling at best.  Could you really hold your own in a blog discussion about politics if you had an elementary student's grasp of history and politics?  Would you find it easy to write a forum post coherently, clearly, and with good mechanics and spelling, even just to ask a question, if you had not practiced and studied academic writing and grammar as much as you did?  I could go on, but you get the idea.  You can't do these various things that make you an effective, articulate, plugged-in netizen without already having a reasonably good liberal arts education.

I imagine it's sort of possible, but conversation online among your fellow students would be an incredibly inefficient way for you to learn these things in the first place.  Why spend your time trying to glean facts from the bizarre misunderstandings of your fellow 10-year-olds when you can get an entertaining, authoritative presentation of the information in a book or video?  And I'll tell you one thing--someone in your online study community, the teacher or the class nerd, will have to have read such "authoritative" media, and reveal the secrets to everyone, or you'll be "learning" in a very empty echo chamber.

At this point, someone is bound to point out that they don't really oppose "mere facts" (which can just be looked up), declarative knowledge, or "elitist" academics, or books, or content, or all the other boo-hiss villains of this mindset.  They just want there to be less emphasis on content (memorization is so 20th century!), and more on conversation and hands-on projects.  Why is that so hard to understand?  But this is where they inevitably get vague.  If books and academic knowledge are part of the curriculum after all, then in what way is online conversation the "focus" of the curriculum?  How are academics, really, supposed to figure in education--in practice?

My guess is that when it comes down to implementation, the sadly misled teacher-in-the-trenches will sacrifice a few more of the preciously scarce books in the curriculum and use the time for still more stupid projects and silly groupwork assignments, now moved online using "cutting edge" tools, because that's where all the clever people say "the future" lies.  As a result, the students will learn little more about computers and online communities than they would learn through their own use of things like Facebook, and they'll get something that barely resembles a "reasonably good liberal arts education."

EDIT: I greatly enjoyed this literature review/analysis article:

Kirschner, Paul A., John Sweller, and Richard E. Clark (2006), "Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching," Educational Psychologist 41(2), 75-86: http://www.cogtech.usc.edu/publications/kirschner_Sweller_Clark.pdf