This essay can be read in my 2020 book, Essays on Free Knowledge. Perhaps ironically, it is no longer free.
UPDATE: I’ve posted a very long set of replies.
UPDATE 2: I’ve decided to reply below as well–very belatedly…
He makes some good points and occasionally shoots himself in the foot. Here’s what’s really happening:
The author is realizing that the uneducated or “anti-intellectual” movement isn’t actually coming from the people he previously judged to be anti-intellectual. He revealed a lot in his statement that it used to be the province of “knuckle-dragging conservatives”, whom he specified as “Protestants”. He mentions surprise that it’s not them “anymore”.
What he fails to see is that it never WAS those “knuckle-dragging Protestants”. The anti-intellectual movement BEGAN on the left! It always was and always will be the “anti-establishment” left that disparages the classics, tears down traditional education and labels any previous methods of education as “outdated”. He’s just beginning to see that his real allies in such education are the people he previously put down.
Ironic, too, that he uses the politically correct term “humankind” instead of “mankind” while lamenting the lack of tradition in education. Very revealing.
I agree with some aspects of your piece.
There is a definite geek culture set against the intellectual establishment, and some voices, backed by many actual examples, are telling people that they can drop out and build a billion-dollar website, enjoying success in a “scientific” industry, without having to go through the educational, peer-reviewed wringer.
However, I think this is nothing new from “youth culture”, and I think you’re confusing anti-intellectualism with anti-establishment feelings.
I don’t think people are decrying education, just the rote style of education currently practiced. Plenty of facts, particularly in subjects like history, really are not worth memorising. Even in physics and math there seems to be an emphasis on learning formulas, rather than on understanding what those formulas mean – the classic example is learning the area of the right-angled triangle. By rote learning, it is simply 1/2bh. Learning that is enough to allow students to become productive triangle-area-calculators, but a simple geometric description shows clearly WHY it is so, and offers far more insight into the beauty of mathematics.
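To spell out that geometric picture: place the right triangle inside a rectangle with base b and height h. The rectangle’s area is b × h, and the diagonal (the triangle’s hypotenuse) cuts it into two congruent triangles, so each must have area ½ × b × h. That one picture explains the formula in a way the memorized string “1/2bh” never will.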
People are not against books as such; they have just recognised that books are a form factor, just like the four-minute-or-so music single that had to fit on a 45 rpm vinyl record. Scientists publish their ideas in books because that’s the form factor that publishers deal in. It would save everyone a lot of time if they didn’t have to pad their ideas to fill a 200-page book, and it would perhaps be even better if books were more interactive, animated, and so on.
Part of the problem is the speed of change. Old, wise experts are most respected in fields that change slowly, as their expertise is still relevant. When things are changing quickly, new skills are needed, and of course the old establishment will face accusations of irrelevance, or even of holding back progress.
On the other hand, plus ça change, plus c’est la même chose (the more things change, the more they stay the same). I think the perception of the rate of change probably far exceeds the reality, especially amongst the young.
Superficially, hemlines go up and down, computers get smaller, user interfaces brighter, but something at the core really hasn’t changed much at all. Us, perhaps.
I’m confused as to how individuals who call themselves geeks also classify themselves as anti-intellectuals. Don’t geeks embrace knowledge, since it enables them to achieve? Has geek culture been debased by its push into mainstream culture, such that the majority of those who call themselves “geeks” really aren’t, and are simply seeking approval by adopting a label that society has deemed its new hot item now that we live in a more technology-enabled (computing-centred) culture? If this is the case, then wouldn’t these same individuals simply be the same masses that we more old-school geeks were traditionally shunned by? Perhaps the comment that it is really the hipsters (posers, if you will) has some merit here.
Oh, and to the folks who say this article is too long: if you don’t have the attention span to get through it, then the failing is your own capacity to digest it and no fault of the author – your ignorance really shines through loud and clear.
A very interesting thread.
Anti-intellectualism is certainly not new. I don’t believe geeks are becoming anti-intellectual, I think anti-intellectuals are simply calling themselves geeks because geeks have become “cool”. In essence, THEY haven’t got true geek credentials, which must include not only cool toys but a deep understanding of them. And when did “geek” ever equate to “intellectual”? Geeks were collectors of arcane knowledge. They were knowledgeable, not necessarily intelligent (an important distinction for this conversation).
Anti-intellectualism is mostly a sour-grapes reaction by people who are not particularly intelligent (whether through lack of capacity, lack of opportunity, or lack of self-discipline). They convince themselves that they don’t really want what they can’t have anyway.
So, to be intelligent (the true issue for which “intellectual” is a proxy here), you must combine factual knowledge with the ability to analyze and synthesize information. All three aspects must be cultivated somehow. How you get there is not restricted to one true path; however, there are some harsh realities: most people lack self-discipline; even with self-discipline, solo learning is HARD because there is no one to test your ideas with or to ask questions of; and solitary intelligence is necessarily limited by available time and effort.
Colleges provide the most efficient environment where you can become intelligent. The fact that most students don’t realize that goal, even if they complete the curriculum, is a comment on humanity, not colleges. That’s why in intelligent circles, a degree is the beginning of the evaluation of someone’s capacity, not the end. What college, what major, and how well you did are the shorthand for a basic evaluation of intellectual competence. You can be just as intelligent without setting foot in a university but the odds are against you and you will need to establish your credibility through some other means that will take time and effort.
And that’s the true problem with the anti-intellectual movement, geeky or not: anti-intellectuals demand that equality of their value as human beings must extend to equal regard for their “ideas”. When an opinion must be regarded as equal to a demonstrable fact, we have foolishness. When a demonstrable falsity (sometimes called a “lie”) must be given equal weight to a demonstrable fact, we have insanity. That’s where we are today.
To treat looking something up on the web as the same as an intelligent grasp of a topic is to confuse facts with knowledge. Poincaré nailed it long ago: “Science is built of facts the way a house is built of bricks; but an accumulation of facts is no more science than a pile of bricks is a house.”
The person who alluded to Watson’s performance on Jeopardy proved the point (but not the one he intended). Watson displayed what passes for intelligence now: the ability to fetch facts on demand. I don’t think anyone would claim, however, that Watson understands its answers. Without understanding, there is very little intelligence or intellect.
Have you checked out the Wikipedia page for Paul Revere, specifically the talk page? What’s interesting to me is that the Palinistas can be as serious and intellectual as the anti-Palinists; they simply have different standards for what constitutes a reliable source, what constitutes a reasonable inference, and consequently what truth and history are.
E.g., obviously Sarah Palin, as a prominent politician, widely quoted, a former governor and vice-presidential candidate, constitutes a particularly reliable source.
You may (and no doubt will ;) ) argue that that is a misunderstanding of what the words mean, but that is an argument you cannot successfully put forward in a modern university. Theirs is simply a different epistemology, and there is no reason to privilege yours. You might check out the book Higher Superstition.
I’m not suggesting that the geeks you lambaste are right to reject Academia out of hand (they’d need the ability to reason to be capable of “right”), but rather that there is not really any way for them to be exposed to the habit of intelligent thought any more. Do you think your essay is really all that well constructed? This is the age of shoddiness. Innumerable hurried utterances.
I should respond directly to your five points and your manifesto:
1) Authorities can give you pointers as to where to look, but you can’t rely on them for knowledge. They lie, just like anyone else. They have hidden motivations, ground axen, all the usual baggage. Revisit the autism/vaccine scandal. Was Dr. Wakefield not an expert? Look at the terrible price children paid because their parents subscribed to your world view. Please, no special pleading. Reread Robert Kennedy’s highly influential essay.
2) Physical books that collate information on a topic are pretty damn obsolete; even fresh ones are out of date by the time they’re published. I will always go to the internet, usually starting with Wikipedia, if I want to get started on a topic. I understand that my government pays thousands of people to slant Wikipedia articles in favor of its fascist interpretation of history; but my cost-benefit analysis says that the payoff in speed of access is worth it. Books were never immune to the influence of power, and both books and Wikipedia are mere authorities. They do not propagate “knowledge”, only gossip, and we must learn to find the knowledge that we need despite them.
3) Narrative fiction can be loosely divided into internal and external; external narration is probably nigh-obsolete, replaced by multi-sensory narrative (e.g. movies); internal narrative is still best done in text. You couldn’t pay me to read War & Peace; Tolstoy is a crap writer and the translations don’t help, and it’s long and boring except for the dancing bears. Strike that: Tolstoy can make even dancing bears boring. But I’ve read The Idiot and The Possessed (The Devils) and quite a few books from that era. It’s all about other times and places and no doubt as irrelevant as Austen; there’s no accounting for tastes. I certainly wouldn’t require it of someone and expect them to be happy. The physical version of text narrative is rapidly obsoleting itself; physicality raises my cost of acquisition and of maintaining inventory (shelf space).
4) Well, duh? I mean, what were dictionaries for? Thesauri? Annotated Shakespeare? I’m completely unclear whence your outrage. My wife can recite almost any classic Star Trek script verbatim; me, I need to look them up. In the 15th century, her feats of memory would have been almost ordinary…for the educated classes. There’s always going to be something you remember and something you don’t remember. What you miss saying here is that without a basic grounding in both fact and reasoning the person has no way to sanity-check what they find when they consult Wikipedia. For my part, I’m pretty sure Paul Revere did not wear a bicycle helmet. He was, after all, a patriot and no liberal.
5) Certainly a type of success is a lengthy NFL career, so if you’re good at football in high-school you don’t need to study. Quite similarly, yes a few thousand people out of millions have dropped out of college (apparently you need to GO to college, but not stay) and become wealthy. If there is an error in this reasoning it is probably not in the “it’s possible to be successful without college” part. Where would Google be if Page & Brin had waited to finish their graduate degrees?
Your manifesto is pretty much a run-on block of text which is weirdly in the second person instead of the more usual first. It’s TL;DNR to all sensible people and makes me suspect you’re a fan of Ayn Rand. Still:
I’m not sure what it means to be a fan of knowledge for knowledge’s sake, as opposed to say a fan of insight or understanding or a fan of accomplishment. It’s pretty much inarguable that those chumps who master facts without mastering the application of facts to real world problems or mastering the social dance will end up working for those who have mastered the basic building block skills of management. Why on earth would someone who derides people skills expect to be put in charge of people?
Just because I feel derision for people who point to their credentials when questioned about their conclusions doesn’t mean I necessarily feel derision for all experts. I merely feel derision for anyone who tells me that they are an expert. Why would I care about that? Do you have insight into the problem at hand? Can you justify your conclusions? It’s perfectly all right if you can’t; that’s just a data point. Lacking firmer arguments it may have to do.
Similarly I have absolutely no interest in knowing that someone is brilliant. I’m interested in whether what they say can be made to make sense, whether it can be used to shed light on a problem.
You seem to have an obsession with recognition and respect. I propose to you that these are the concerns of an older person who fears his or her faculties are failing. (Alternatively: someone suffering from impostor syndrome)
Let me tell you a little anecdote from my workplace:
The CTO is in my cubicle and we’re discussing something and he gets frustrated and says that I’m not making my proposed solution particularly attractive. I tell him that I have not the slightest interest in persuading him to take my advice. He says, “Then why am I talking to you?” and stalks off.
After a short while he comes back. I tell him that he’s talking to me because that’s part of getting his problem fixed.
“Listen to me, I’m the expert” sounds sort of pathetic. Respect is not something you demand, Larry, it’s something you can’t escape.
Just because people dislike experts and books and intellectuals does not mean they are against knowledge. Wiki, free for all students to read, is the future of knowledge. Books cost too much for poor children, and they are always out of date before they are printed – even e-books. Someone who has a car and thinks horses and buggies are silly is not “anti-transportation”. In fact, the person who clings to the book is the one who is against knowledge. What good is a book that only rich people can read? Worthless. If you want to give knowledge to the world, the only way is wiki.
@Marko June I’m sure you understand that not all books can be represented through wikis. Rather, wikis are an evolved form of encyclopaedia, stemming from Vannevar Bush’s idea of the memex (The Atlantic, 1945). Wikis help people find ideas faster than traditional index pages or tabs, which have been replaced by search algorithms.
In computer science, we learn to analyze algorithms to determine the efficiency of an implementation. In this contest of wikis versus encyclopaedias, wikis win because of their efficient search, and online dictionaries are favourable for the same reason. Efficient searching, however, should not be confused with efficient learning. A wiki most often shows the tip of the iceberg but will not yield its depth.
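As a toy sketch of what I mean by search efficiency (the article titles and the functions linear_scan and indexed_lookup below are made up for illustration, not how any real wiki works): scanning a printed index is essentially a linear walk through its entries, while a wiki’s search box sits on top of an indexed lookup.

    # Toy comparison: a linear scan (printed index) vs. an indexed lookup (wiki search).
    articles = {
        "Paul Revere": "Boston silversmith and patriot of the American Revolution.",
        "Memex": "Vannevar Bush's hypothetical device for linking knowledge.",
        "Wiki": "A website whose pages are collaboratively edited.",
    }

    def linear_scan(title):
        # Like flipping through a printed index: examine every entry in order.
        for name, summary in articles.items():
            if name == title:
                return summary
        return None

    def indexed_lookup(title):
        # Like a wiki's search: jump straight to the entry via a hash table.
        return articles.get(title)

    print(linear_scan("Memex"))
    print(indexed_lookup("Memex"))

Finding the entry quickly, of course, still says nothing about whether you have understood it; that is the whole point about efficient searching versus efficient learning.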
Books aren’t the sole answer to this article, because Larry does not propose a full solution. Instead, he introduces the problem and its growing importance. Although printed work is dying, there should be alternative media that support learning.
Personally, though, I’m currently 17 and I love both my books and my e-books. Books are worth more to me because they physically encapsulate my knowledge and I can use my natural senses to perceive that: I can see a book’s size, I can feel its weight, I can smell its age. Printed books allow me to make a physical estimate of how much knowledge a book may provide me. Although I’m a programmer with many math achievements to my name, I cannot look at an e-book and say that 4MB of data is worth something. It’s the combination of skills and effort put into the book, ranging from typography and the Van de Graaf canon to mechanical printing, that allows me to appreciate its existence.
The depth and breadth of the counter arguments to the article within these comments serve to highlight the failings of that article.
IMHO, your essay attacks the weakest form of “anti-intellectualism.” The issue is not that learning and knowledge are outdated. The issue is that the 20th-century credentialing system is outdated and far too expensive in the digital age. It was surprising to me that neither you nor the commenters mention http://www.khanacademy.org. Bill Gates said last year that the best education in the world will be online within 5 years. Which isn’t surprising, because all you need to learn (and signal that you’ve mastered) a discipline are:
1.) An expert-crafted syllabus
2.) Internet access, so you can access all the books, videos, lectures, etc. which can be created at zero marginal cost.
3.) A healthy diet so your brain can function
4.) Time for your brain to organize itself
5.) Discipline/motivation, so you actually learn
6.) Hands-on experience to the extent that it is necessary (though this is often obtained after credentialing)
7.) As close to an objective assessment mechanism as you can find, for feedback and ultimately credentialing
Those 7 things can be provided far more cheaply than academia currently cares to admit. The sad reality is that the credential goes to the person who can afford to pay the tens to hundreds of thousands of dollars for it, when the expertise can be acquired more cheaply (both in time and money) outside of academic institutions.
So is it surprising that many people, including techno-geeks, are opposed to extortion by monopolistic, overpriced credentialing institutions? Knowledge should be treated as a public good, not as a good whose value derives from its artificial scarcity.
http://www.ted.com/conversations/1650/in_2011_is_it_possible_to_mak.html
Do you really believe that you can educate an engineer, a medical doctor or a scientist without laboratories?
I direct you to point #6. “6.) Hands-on experience to the extent that it is necessary (though this is often obtained after credentialing)”
Also, the link I posted answers the same question. Do I believe we can educate doctors, engineers, or scientists without labs? Obviously not. But if my doctor had completed an accredited online program, had done well on her medical board certification exams, and had successfully completed her residency, then I would have no problem whatsoever with taking my child to that doctor.
Apprenticeships, internships, residencies, labs, etc. are still very necessary, but to the extent that the dissemination of knowledge/expertise can be made significantly cheaper through the use of digital technology, then credentialing should also be made significantly cheaper.
For what it’s worth, by way of background, I am a recent convert to the value of formal learning.
I think with Point 6 (6. Hands-on experience to the extent that it is necessary (though this is often obtained after credentialing)) we come close to the nub of the issue.
My previous ‘anti-academicism’ was based in large part on my criticism of the lack of practical experience inherent in learning as I experienced it. A valid criticism, but I, like the anti-intellectual geeks of this article, made the mistake of therefore rejecting the entire thing on the basis of a single weakness.
In fact, hands-on experience is absolutely vital for contextualising the information learnt formally. I am both a (not very successful) student and (very successful) teacher, and I am convinced by my own experience that contextualisation is not an airy-fairy trendy catchphrase but provides the relevance that is essential for motivation and learning.
One thing I miss, though, as brilliant as I am in my chosen field, is not only the credentials, but also the completeness of having studied the subject formally, as boring as that can sometimes be. Formal study is the cheatbook that lets you solve the puzzle when you get stuck; it alerts you to areas you would have missed on your own; it speeds up the process of putting two and two together; and it saves you from having to reinvent the wheel.
Replace “classics” with “Pokemon” and you’ll see how pointless this argument is.
Many years ago I went through a phase where I considered methods and processes to be the essence of an educated mind, and mere facts to be best looked up when needed. Why bother having a clue what number Pi is when you know how to derive and calculate it? I took this to the extreme of even scorning technical terminology as mere jargon. Why waste precious memory on all these esoteric terms that are really just conglomerations of description in simpler, purer, more orthogonal and universal language?
A mentor once pointed out to me, “You have to know the jargon in order to *communicate*”. Humans have a pathetic amount of working memory. Memorized jargon lets us build concepts too complex to hold in our minds if thought of only in terms of first principles. Likewise, memorized constants and equations are crucial for quick mental calculations that will let you accomplish things in a reasonable amount of time. Nobody is going to wait for you to derive pi when all you need to do is figure out how much a manhole weighs!
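To make that concrete (with made-up but plausible numbers): a cast-iron manhole cover roughly 60 cm across and 3 cm thick has a volume of about π × 0.3² × 0.03 ≈ 0.0085 m³; at a density of roughly 7,200 kg/m³ for cast iron, that’s in the neighborhood of 60 kg. Knowing π ≈ 3.14 and a ballpark density from memory gets you an answer in seconds; deriving π from first principles would not.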
Computers are fantastic collectors and collators of information, but the human mind still excels at synthesizing new knowledge by combining and extending seemingly unrelated pieces of it. When a person devotes enough time to moving “mere facts” from records into working memory, we call him or her an expert. Faced with a problem, the expert can use the facts in memory to solve it in a way that no computer has proven capable of, any more than a human who knows nothing at all has.
This is the heart of the problem as I see it. The best ideas, when explained by a genius, force one to smack his own forehead and say, “Of course!”. Great ideas are almost always obvious when viewed in the right light. A+B=C. Simple! So why is it that experts still come up with most of them? Most possible A’s and B’s don’t add up to C. Only somebody who knows a lot of seemingly unrelated A’s and B’s is going to be able to conceive of C.
The attitude that education is pointless and facts are best looked up only when needed is not an attitude any self-respecting geek should hold.
@sopor: very GOOD