Social media stupidifies and radicalizes us

Back when the buzzword switched from "Web 2.0" to "social media," I started to get quite suspicious. When I was participating in online communities, I wasn't propagating "media." That is something that boring corporate media types did.

What would those boring corporate media types, or rather their Silicon Valley equivalents, do with once-unconstrained, lively, frequently long-form debate communities? Make the conversations shorter, more vapid, more appealing to the masses, and more addictive. In short, more of a really dumb waste of time.

The Zucks and Dorseys of the world did this in order to hook people more and more. What they probably didn't realize at first is that they had built tools for stupidification and radicalization. I don't think "dumb down" is quite the right phrase: dumbing down means making something complex simpler, easier to understand, but also less accurate. To "stupidify" focuses on the effects on us; in social media mobs, we are truly stupid herd animals, and when enraged, rather frighteningly stupid mobs. What we are fed and say is dumbed down; consequently, we are stupidified.

That degraded quality of social relationship--that is these fools' legacy. I have no respect for what Mark Zuckerberg and Jack Dorsey achieved. (This isn't a personal slam; I don't have that much respect for Wikipedia, either, which is something I built.)

If you had set out to reduce human Internet interactions to a subhuman, irrational, emotional level, an excellent strategy would be to replace long mailing list and Usenet newsgroup posts and rambling blog posts like this one with tweets (whether 140 or 280 characters--at that tiny length, it doesn't matter), propaganda memes, and emotion-driven comments that are cut short and sent by default if you try to write more than one paragraph.

To make the medium of social interaction briefer and more visual is to convey that intelligence, which is almost always long-form, is not valued. We live in a tl;dr world, the world that Zuck and Jack built. They must be very proud. If Marshall McLuhan was right that the medium is the message, social media's message is that your intelligence and individuality are worth little; your emotions and loyalty to your tribe are everything.

I will go farther than that. I lay the ongoing destruction of democratic institutions squarely at their feet. That's a dramatic and indeed emotional-sounding claim, but just look at what has happened and what is going on right now. It's a disaster. We increasingly distrust our institutions insofar as they are co-governed by our ideological opponents. That didn't use to be the case; what changed? We are now constantly presented with idiotic and easily refuted versions of our opponents' social and political views. Consequently, we have lost all respect for each other. Staggering percentages of the American people want to split up the country and predict civil war. Long-term friendships and even family relationships have been broken up by relentlessly stupid arguments on social media.

It isn't just that increased familiarity with, or constant exposure to, our opponents' points of view has led to mutual contempt. Sure, familiarity might breed contempt; but through social media we do not project our most genuine, nuanced, intelligent, sensitive, and human selves. Social media makes us, rather, into partisan, tribal drones. We are not really more familiar with each other. We are familiar with stupidified versions of each other. And that is making society insane.

It certainly looks as if the combination of short, visual messages and simplified reactions to them--"hearting," upvoting and downvoting, or choosing from an extremely limited menu of emotional reactions--is enough to dumb down, to stupidify, the versions of ourselves we portray to each other. And that is, again, wreaking havoc on our society. With social media absolutely dominant as the locus of modern socialization, how could this fail to have a profound impact on our broader societal and political mood?

It is Zuck's and Dorsey's fault. They built the medium. The medium stupidifies us. Stupid people are particularly bad at democracy, as our Founding Fathers knew. The leadership of republican institutions must be wisely chosen by a sober citizenry using good sense improved by education. What we have now, thanks to social media, is a citizenry made punch-drunk by meaningless but addictive endorphins awarded to them for reinforcing their tribal alliances, stupidly incapable of trusting "the Other" and, therefore, of reaching anything like a reasonable, democratic consensus.

This is one of the main reasons why I quit social media cold turkey over a month ago. I don't miss or regret it. I will continue to use it only for work purposes, i.e., essentially for advertising, which I hope is a reasonable use for it.

I sincerely, fervently hope that in five or ten years' time this is the conventional wisdom about social media. What comes next, I don't know. But we can't survive as a democratic society under these conditions.


I'm quitting social media cold turkey

"Yet another public resolution to leave Facebook or Twitter," you say with a laugh. "Only soon to be given up like so many others, no doubt." That's a reasonable reaction. But go ahead, check up on me: here are my Twitter account and my Facebook account. My last posts were Sept. 11 and Sept. 12. I promise to leave this blog post up forever--that'll shame me if I get back to it.

I've critiqued social media philosophically and even threatened to abandon it before, and I've advised people not to use it during work time (I admit I've later completely ignored this advice myself). But I've never really quit social media for any length of time.

Until now. As of earlier today, I've quit cold turkey. I've made my last posts on Twitter and Facebook, period. I'm not even going to say goodbye or explain or link to this blog post on social media, which I'll let others link to (or not). Friends and family will have to either call or email me or make their way here to get an explanation. I'll be happy to explain further and maybe engage in some debate in the comment section below.

I thought I'd explain what has led to this decision. You'll probably think it's my sniffy political stance against social media's threats to free speech and privacy, but you'd be wrong--although I'm glad I'll no longer be supporting these arrogant, vicious companies.

This resolution didn't really start as a reaction to social media at all. It began as a realization about my failings and about some important principles of ethics and psychology.

1. Socrates was right: we're not weak, we just undervalue rationality.

We are a remarkably irrational species.

Recently I began giving thought to the fact that we so rarely think long-term. If we were driven by the balance of long-term consequences, there are so many things we would do differently. If you think about this long enough, you can get quite depressed about your life and society. Perhaps I should only speak for myself--this is true of me, for sure--but I think it is a common human failing. Not exercising, overeating, wasting time in various ways, indulging in harmful addictions, allowing ourselves to believe all sorts of absurd things without thinking, following an obviously irrational crowd--man might be the rational animal, as Aristotle thought, but that doesn't stop him from also being a profoundly irrational animal.

I'm not going to share my admittedly half-baked thoughts on rationality in too much detail. You might expect me to, since I'm a Ph.D. philosopher who was once a specialist in epistemology, who has spent a great deal of time thinking about the ethical requirements of practical rationality, and who has done some training and reading in psychology. I'm not going to pretend that my thoughts on these things are more sophisticated than yours; I know they're probably not. I'm not an expert.

I will say this, just to explain where my head is at these days. I have always taken Socrates' theory of weakness of will (akrasia) very seriously. He thought that if we do something that we believe we shouldn't--have an extra cookie or a third glass of wine, say--then the problem is not precisely that our will is weak. No, he said, the problem is that we are actually ignorant of what is good, at least in this situation.

This sounds ridiculously wrong to most philosophers and students who encounter this view for the first time (and, for most of us, on repeated encounters). Of course there is such a thing as weakness of will. Of course we sometimes do things that we know are wrong. That's the human condition, after all.

But I can think of a sense in which Socrates was right. Let's suppose you have a rule that says, "No more than one cookie after dinner," and you end up eating two. Even as you bite into the second, you think, "I really shouldn't be eating this. I'm so weak!" How, we ask Socrates, do you lack knowledge that you shouldn't eat the second cookie? But there is a straightforward answer: you don't believe you shouldn't, and belief is necessary for knowledge. We can concede that you have some information or insight--but it is quite questionable whether, on a certain level, you actually believe that you shouldn't eat the cookie. I maintain that you don't believe it. You might say you believe it; but you're not being honest with yourself. You're not being sincere. The fact is that your rule just isn't important to you, not as important as that tasty second cookie. You don't really believe you shouldn't have it. In a certain sense, you actually think you should have it. You value the taste more than your principle.

From long experience--see if you agree with me here--I have believed that our desires carry with them certain assumptions, certain premises. New information can make our desires turn on a dime. I think there are a number of false premises that generally underpin weakness of will. I'm not saying that, if we persuade ourselves that these premises are false, we will thereafter be wonderfully self-disciplined. I am saying, however, that certain false beliefs do make it much easier for us to discount sober, rational principles, naturally tuned to our long-term advantage, in favor of irrational indulgence that will hurt us in the long run.

Here, then, are two very general premises that underpin weakness of will.

(a) Sometimes, it's too strict and unreasonable to be guided by what are only apparently rational, long-term considerations.

There are many variations on this: being too persnickety about your principles means you're being a hard-ass, or uncool, or abnormal, or unsociable, or positively neurotic (surely the opposite of rational!). And that might be true--depending on your principles. But it is not true when it comes to eating healthily and exercising daily, for example: in the moment, sticking to a reasonable diet might seem too strict, and therefore unreasonable. But it really isn't unreasonable. It is merely difficult. It is absolutely reasonable because you'll benefit and be happier in the long run if you stick to your guns. It will get easier to do so with time, besides.

(b) Avoiding pain and seeking pleasure are, sometimes, simply better than being guided by rational, long-term considerations.

This is reflected, at least somewhat, in the enduring popularity of hedonism, ethical and otherwise. The aesthete who takes the third glass of wine doesn't want narrow principles to stand in the way of pleasure (it's such good wine! I don't want to be a buzzkill to my awesome friends!); indeed, he will congratulate himself on his nuance and openness to experience. The same sort of thinking is used to justify infidelity.

Such considerations are why I think it is plausible to say that, no, indeed, in our moments of weakness, we have actually abandoned our decent principles for cynical ones. You might object, "But surely not. I'm merely rationalizing. I don't really take such stuff seriously; I take my principles seriously. I know I'm doing wrong. I'm just being weak."

Well, maybe that's right. But it's also quite reasonable to think that, at least in that moment, you actually are quite deliberately and sincerely choosing the path of the cool, of the sociable friend, of the aesthete; you are shrugging with a self-deprecating smile as you admit to yourself that, yes, your more decent principles are not all that. You might even congratulate yourself on being a complex, subtle mensch, and not an unyielding, unemotional robot. This is why, frankly, it strikes me as more plausible that you're not merely rationalizing: you are, at least temporarily, embracing different (less rational, more cynical) principles.

But as it turns out, there are good reasons to reject (a) and (b). Recently, I was talking myself out of them, or trying to, anyway. I told myself this:

Consider (a) again, that sometimes, rationality is too strict. When we avoid strict rationality, the things we allow ourselves are frequently insipid and spoiled by the fact that they are, after all, the wrong things to do. Take staying up late: it's so greatly overrated. Overindulgence in general is a great example. Playing a game and watching another episode of a television program are simply not very rewarding; just think of the more gainful ways you could be spending your time instead. Having one cookie too many is hardly an orgasmic experience, and it is absolutely foolish, considering that the consequences of breaking a necessary diet can be so unpleasant.

Indeed, most Americans need to be on a diet (or to exercise a lot more), and that is an excellent example of our inability to think long term. It is hard to imagine the advantages of being healthy and thin. But those advantages are very real. They can mean years of additional life, and considerably greater activity and, indeed, comfort in life. That is only one example of the advantages of rationality. The simple but profoundly beneficial activity of going to bed early enough and getting up early enough can make you much more alert, active, happy, and healthy. Why do so many people not do that every night? I think the reason is, at least in part, that we literally cannot imagine—not without help or creative effort—what that better life would be like. We are stuck in our own moment, and it seems all right to us.

In short, the requirements of a rational human life seem unreasonably "strict" only because we lack the imagination to consider a better sort of life.

Consider (b) now. Pain, and especially discomfort, are not all that awful. They are an important part of life, and if you attempt to avoid all pain, you ultimately invite even more. There is nothing particularly degrading about discomfort. Especially if it is unavoidable, and if working or fighting or playing through it results in some great achievement, then doing so can even be heroic. I’m not meaning to suggest that pain for its own sake is somehow desirable. It isn’t, of course. But being able to put up with discomfort in order to achieve something worthwhile is part of the virtue of courage.

2. It is irrational to use social media.

I want to be fair. So if I'm going to examine whether indulgence in social media is rational or not, I'll begin with some purported advantages and see how solid they are.

Social media seems to benefit the careers of a few people. This seems true of people with a lot of followers; but my guess is that most people with a lot of followers already have successful careers, which is why they have a lot of followers. (Models on Instagram and popular video makers on YouTube might be an exception, in that they can make their career via the platform itself.) People with fewer than, say, 10,000 Twitter followers don't really reach enough people to have a very interesting platform. I have about 3,000 Twitter followers, and I've deliberately kept my Facebook numbers smaller just because I use Facebook in a more personal way. Frankly, my career doesn't seem to be helped all that much by my presence on social media. Besides, that's not why I do it.

My Everipedia colleagues might be a little upset with me that I won't be sharing Everipedia stuff on Twitter and Facebook anymore (which I won't--because I know that even that little bit would pull me back in). But I can assure them that I'll get more substantive and impactful work done as a result of all the time freed up from social media. I will continue to use communication platforms like Telegram and Messenger, by the way, and Reddit, in the Everipedia group, will also be OK. I'll also keep using LinkedIn to connect to people for work purposes. But Quora and Medium are out. Those are too much like blogging anyway. My time is better spent writing here on this blog, or for publication, if I'm going to do long-form writing.

Social media also seems to be a way for us to make a political impact. We can talk back against our political opponents. We can share propaganda for our side. Now this, I was surprised to learn, does seem to have some effect in my case. I've heard from one person that she actually became a libertarian mostly because of my posts on Facebook. (I could hardly believe it.) Others say they love my posts, and I think I do probably move the needle some minuscule distance in the direction of Truth and Goodness. But I'm only writing to a few hundred people on Facebook, at most. My reach on Twitter is larger, but I almost certainly do not persuade anyone 280 characters at a time.

This isn't to say that, in the aggregate, social media doesn't have a great deal of impact on society. It clearly does. But I think its total impact is negative, not positive. Perhaps the way I use it is positive, although I doubt it. I am more given to long-form comments than most people on Facebook and Twitter. I like to think that my comments model good reasoning and other intellectual virtues. But are they my best? Hell no. Does my influence matter, on the whole? Of course not. I am participating in a system that does, on my account and on most people's, lower the level of discourse.

On balance, I'm not proud of the political impact of my social media participation. I don't think many of us, if any, have the right to be proud of ours.

Social media is kind of fun. Sure, it's fun to butt heads with clueless adversaries and get an endorphin boost from likes and other evidence of public visibility. But political debate is more frustrating than interesting, and the endorphin boosts are meaningless artifacts of how the system is designed. Nobody really thinks otherwise, and yet we do it anyway. It's pathetically, absurdly irrational.

Facebook keeps me in touch with my friends and family. Admittedly, there is very little downside to this one. I frankly love hearing from old high school friends that otherwise I might not hear from for years. Facebook keeps me a little closer to my extended family. That's a great thing. A common response to this is that the quality of our interactions is much worse than it would have been otherwise. But if I'm going to be honest with myself, I just don't see this. I mean, Facebook lets me see remarks from my funny and nice old friends from high school, and I probably wouldn't talk to them at all if it weren't for Facebook (sorry, friends, but I think you understand! There isn't enough time in the day to keep up with all the friends I've ever made in my life!). There's no downside there. And no, I don't think it makes my relationship with my family any worse. I think it makes it a little better.

So what about the disadvantages of social media?

We are driven by algorithms. Facebook, Twitter, and the rest carefully design algorithms that highlight the posts our friends make in ways that fit their purposes, which are not ours. The whole system has been designed by psychologists to hook us into participating as much as possible, and it frequently succeeds.

Social media companies spy on us. And they make it easier for other companies, organizations, and (most concerning to me) potentially repressive governments to do so. And by participating, we endorse that behavior. That seems extremely irrational.

Social media companies have started to openly censor their political opponents. And again, if you participate, you're endorsing that behavior. Continuing to participate under those circumstances is irrational for conservatives and libertarians.

I sometimes get kind of addicted. I go through phases where I use social media a lot, and that can be a pretty awful waste of time, at least when I have many other things I should be doing. This is the main reason I think the right strategies are "cold turkey" and "you won't see me again"--like it or not. In short, I want to minimize temptation.

We indulge in petty debates that are beneath us. This bothers me. I don't like dignifying disgusting propaganda with a response, but I seem not to be able to restrain myself when I come across it in my feeds. Often, a proper response would require an essay; but I'd be writing an essay in response to an idiotic meme (say), which is kind of pathetic. I'd much rather have long-form debates on my blog (or between blogs that reply to each other, as we used to do).

It takes time away from more serious writing. I can write for publication. So why should I waste my time writing long Facebook posts that only a few people see? For things not quite worthy of publication, at least if I focus on my blog, I can write at greater length and develop an argument more completely. Did you use to have a blog on which you had longer, better things to say?

So it's a waste of time, on balance. The opportunity cost is too high. I can and should be spending my time in better ways--work, programming study, helping to homeschool my boys, and doing more serious writing. That's the bottom line. Apart from keeping me in touch with family and friends on Facebook, the advantages of social media are pretty minimal, while the disadvantages are huge and growing.

Why don't I just limit my social media use to personal interactions with family and friends on Facebook, you ask? Because I don't want to take the risk of falling back into bad old habits. My friends can visit my blog and interact with me here, if they want. My family I'll call and visit every so often.

So I'm turning the page. I don't expect this to be big news for anybody. But it's going to change the way I interact online. If you want to keep seeing me online, start following my blog.

3. Can I really do this?

I suppose I've given a reasonably good analysis of why using social media is irrational. I've said similar things before, and many others have as well. And yet we keep using social media. Obviously, human beings are often not guided by rationality; much would be different in our crazy old world if we always were.

It is remarkable, though, just how much we acknowledge all the irrationalities about social media, and yet we indulge in it anyway. There's something deeply cynical about this. It can't be good for the soul.

The big question in my own mind is whether I will really be able to stay away from social media as I say I will. My use of social media is irrational, sure. But I don't pretend that the mere fact of its irrationality is, all by itself, enough to motivate me; indeed, I'm not sure who it is rational for, apart from the very few people who make a career out of it.

But I want to try. And as I said at the start of this post, it's not just about social media. It's about making my life more rational. So at the same time, I want to start eating more healthily and exercising more regularly, going to bed earlier, etc. Doing all that at once seems very ambitious. It might even seem silly and naive for me to say all this. But the insights I've reported on in part 1 above have really stuck in my mind, and they don't seem to be going away. So we'll see.


So I tried out Gab.ai

After the recent purges of Alex Jones and assorted conservatives and libertarians by Facebook, YouTube, Twitter, and others, I decided it really is time for me to learn more about other social networks that are more committed to free speech. I decided to try Gab.ai, hoping against hope that it wouldn't prove to be quite as racist as it is reputed to be.

See, while I love freedom of speech and will strongly defend the right of free speech—sure, even of racists and Nazis, even of Antifa and Communists—I don't want to hang out in a community dominated by actual open racists and Nazis. How boring.

So I went to the website, and, well, Gab.ai certainly does have a lot of people who are at least pretending to be Nazis. I never would have guessed there were that many Nazis online.

To support my impression, I posted a poll:

Are you OK with all the open racism and anti-Semitism on Gab.ai?
  Yes: 57%
  I tolerate it: 37%
  Makes me want to leave: 6%

Wow! 1,368 votes! I sure hit a nerve with Gab.ai. But the results, well, they were disappointing: 57% of the self-selected respondents to the web poll said they were OK with open racism on Gab.ai, 37% tolerated it, and it made 6% of them want to leave. But I was told by several people that I should have added another option: "That's what the Mute button is for."

There's another reason I've spent this much time exploring the site. It's that I really doubt there are that many actual Nazis on the site. Consider for a moment:

  1. The Establishment is increasingly desperate to silence dissenting voices.
  2. Gab.ai and some other alternative media sites have been getting more popular.
  3. Silicon Valley executives know the fate of MySpace and Yahoo: it's possible for giants to be replaced. Users are fickle.
  4. Like progressives, most conservatives aren't actually racist, and they will be put off by communities dominated by open, in-your-face racists.
  5. There's a midterm election coming up, and people are spending untold millions to influence social media, since that, we are now told, is where it's at.

Considering all that, it stands to reason that lots of left-wing trolls are being paid (or happily volunteer; but no doubt many are paid) to flood Gab.ai and make appallingly racist, fascist, anti-Semitic accounts. Of course they are; it's an obvious strategy. The only question is how many—i.e., what percentage of the Gab.ai users—consist of such faux racists.

Such trolls aside, there are at least two broad categories of people on Gab.ai. In one category there are the bona fide racists, Nazis, anti-Semites, and other such miscreants, and in the other category there is everyone else—mostly conservatives, libertarians, and Trump voters who do things like share videos of (black conservative) Candace Owens and shill for Trump (I voted for Gary Johnson, and I've always been bored by political hackery). The latter category of user mutes those of the former category, apparently.

So, feeling desperate for an alternative to Twitter, I spent a few hours today on the site, mostly muting racists, and getting introduced to a few people who assured me that most of the people on the site were decent and non-racist, and that what you had to do was—especially in the beginning—spend a lot of time doing just what I was doing, muting racists.

Boy, are there a lot of racists (or maybe faux racists) there to mute. I still haven't gotten to the end of them.

But I'm not giving up on Gab.ai, not yet. Maybe it'll change, or my experience will get better. A lot of people there assured me that it would. I love that it's as committed to free speech as it is, and I wouldn't want to censor all those racists and Nazis just as I wouldn't want to censor Antifa and Communists. Keep America weird, I say!

If it's not Gab.ai, I do think some other network will rise. Two others I need to spend more time on are Steemit.com, a blockchain blogging website, similar to Medium and closely associated with EOS and Block.one, and Mastodon.social, which is sort of a cross between Twitter and Facebook. Steemit has become pretty popular (more so than Gab.ai), while Mastodon has unfortunately been struggling. I also want to spend more time on BitChute, a growing and reasonably popular YouTube competitor.


I am informed; you are misinformed; and the government should do something about this problem

Poynter, the famous journalism thinktank, has published "A guide to anti-misinformation actions around the world." This sort of thing is interesting not just for the particular facts it gathers but even more for the assumptions and categories it takes for granted. The word "misinformation" is thrown around, as are "hate speech" and "fake news." The European Commission, it seems, published a report on "misinformation" (the report itself says "disinformation") in order to "help the European Union figure out what to do about fake news." Not only does this trade on a ridiculously broad definition of "disinformation," it assumes that disinformation is somehow a newly pronounced or important problem, that it is the role of a supranational body (the E.U.) to figure out what to do about this problem, and that it is also the role of that body to "do something." Mind you, there might be some government "actions" that strike me as possibly defensible; but the majority that I reviewed looked awful.

For example, look at what Italy has done:

A little more than a month before the general election, the Italian government announced Jan. 18 that it had set up an online portal where citizens could report fake news to the police.

The service, which prompts users for their email addresses, a link to the fake news and any social media networks they saw it on, ferries reports to the Polizia Postale, a unit of the state police that investigates cyber crime. The department will fact-check them and — if laws were broken — pursue legal action. At the very least, the service will draw upon official sources to deny false or misleading information.

That plan came amid a national frenzy over fake news leading into the March 4 election and suffered from the same vagueness as the ones in Brazil, Croatia and France: a lacking definition of what constitutes "fake news."

Poynter, which I think it's safe to say is an Establishment thinktank, mostly just dutifully reports on these developments. In their introduction, they do eventually (in the fourth paragraph) get around to pointing out some minor problems with these government efforts: the difficulty of defining "fake news" and, of course, that pesky free speech thing.

That different countries are suddenly engaging in press censorship is only part of the news. The other part is that Poynter, representing the journalistic Establishment, apparently does not find it greatly alarming that "governments" are "taking action." Well, I do. Just consider the EU report's definition of "disinformation":

Disinformation as defined in this Report includes all forms of false, inaccurate, or misleading information designed, presented and promoted to intentionally cause public harm or for profit.

This implies that if, in the opinion of some government authority, some claim is merely false and, like most professional publishing operations, it is published for profit, then it counts as disinformation. This means that (with an exception made for non-profit publishers, apparently) the E.U. considers anything false to be an item of disinformation, and thus presumably ripe for some sort of regulation or sanction.

Well, of course this sounds ridiculous, but I am just reading. It's not my fault if that's what the report says. I mean, I'm sorry, but it certainly does look as if the E.U. wants to determine what's false and then to ban it (or something). Of course, the definition does first say that disinformation is designed to intentionally cause public harm, but anybody who reads legalistic texts needs to bear in mind that, as far as the law is concerned, the parts that come after "or" and "and" are just as important as the parts that come before. The text does say "or for profit." Is that because in the E.U., seeking profit is as suspect as intentionally causing public harm?

The difficulty about texts like this, aside from the fact that they are insufferably dull, is that they are so completely chock-full of bad writing, bad reasoning, false assumptions, and so forth, that it would take several volumes to say everything that needs saying about the E.U. report and Poynter's run-down of government actions. What about all the important issues associated with what looks like a worldwide crackdown on free speech? They have been solved, apparently.

Poynter at least has the good sense to acknowledge difficulties, as they do at the end of the discussion of Italy's regulatory scheme. The government positions are appalling, as if they were saying: "We know what fake news and disinformation and misinformation are, more or less. Sure, there's a small intellectual matter of defining them, but no big deal there. It's just a matter of deciding what needs to be done. Free speech, well, that's just another factor to be weighed."

Just imagine reading this page twenty years ago. It would have been regarded as an implausible horror show. I can imagine how someone might have responded to a glimpse 20 years into the future:

What are you saying--in 2018, countries all around the world will decide that it's time to start seriously cracking down on "misinformation" because it's too easy to publish false stuff online, free speech and freedom of the press notwithstanding? That's ridiculous. It's one thing to get upset about "political incorrectness," but it's another thing altogether for the freedom-loving West, and especially for journalists (for crying out loud!), to so bemoan "hate speech" and "fake news" (really?) that they'll give up free speech and start calling on their governments to exert control. That's just...ridiculous. Do you think we'll forget everything we know about free speech and press freedom in 20 years?

Well, it would have been ridiculous in 1998. Twenty years later, it still should be, but apparently it isn't for so many sophisticated, morally enlightened leaders who can identify what is true and what is misinformation.

It's time to push back.


Is it time to move from social media to blogs?

This began as a Twitter thread.

I've finally put my finger on a thing that annoys me—probably, all of us—about social media. When we check in on our friends and colleagues and what they're sharing, we are constantly bombarded with simplistic attacks on our core beliefs, especially political beliefs. "This cannot stand," we say. So we respond. But it's impossible to respond in the brief and fast-paced media of Twitter and Facebook without being simplistic or glib. So the cycle of simplistic glibness never stops.

There are propagandists (and social media people...but I repeat myself) who love and thrive on this simplicity. Their messages are more plausible and easier to get upset about when stated simply and briefly. They love that. That's a feature, not a bug (they think)!

I feel like telling Tweeps and FB friends "Be more reasonable!" and "Use your brain!" and "Chill!" But again—everything seems sooooo important, because our core beliefs are under attack. How can most people be expected to be calm and reasonable? People who take high standards of politeness and methodology seriously naturally feel like quitting. But social media has become important for socializing, PR, career advancement, and (let's face it) the joy of partisanship. "I can't quit you!" we moan. But, to quote a different movie, this aggression will not stand, man. Our betters at Twitter and Facebook agree, and so they have decided to force the worst actors to play nice. But they can't be trusted to identify "the worst actors" fairly. They're choosing the winners.

What's the solution for those of us who care about truth, nuance, and decency—and free speech? I don't know, but I have an idea. Rather than letting Facebook and Twitter (and their creeping censorship) control things, I'm going to try putting content updates on my blog. I'll still use Twitter and Facebook for Everipedia announcements and talk, and I'll link to blog updates from both places. But you'll have to visit my blog to actually read my more personal content. Anyway, I'm going to give that a try.


The Well-Ordered Life

The well-ordered life may be defined as that set of sound beliefs and good practices which are most conducive to productivity and therefore happiness, at least insofar as happiness depends on productivity.

The well-ordered life has several types of components: goals; projects, which naturally flow from goals and which are essentially long-term plans; habits, or regular actions that serve the goals; plans for the day or week; assessments, or evaluations of the whole, a kind of stock-taking; and, different from all of these, a set of beliefs and states of attention that support the whole.

Let me explain the general theory behind the claim that the beliefs and practices I have in mind do, in fact, conduce to productivity and thus happiness. There is a way to live, which many of us have practiced at least from time to time and which some people practice quite a lot, which has been variously described as “peak performance,” “getting things done,” “self-discipline,” or as I will put it, a “well-ordered life.”

This generally involves really accepting, really believing in, certain of what might be called life goals. If you do not believe in these goals, the whole thing breaks down. Next, flowing from these goals, you must embrace certain projects; the projects must be broad and long-term, meaning they incorporate many different activities but have a definite end point. These must be tractable and perfectly realistic, and again, you must be fully “on board” with the wisdom of these projects. Projects can include things like writing a paper, working through a tutorial, writing a large computer program, and much more, depending on your career.

This background—your global goals and your significant, big projects—is the backdrop for your daily life. If this backdrop is not well-ordered, then your daily life will fall apart. If you lose faith in your goals, little everyday activities will be hard to do. Similarly, if you decide that a certain project does not serve your goals, you will not be able to motivate yourself to take action. So you must guard your commitment to your goals and projects jealously, and if it starts to get shaky, you need to reassess as soon as possible.

Your daily life is structured by three main things: habits, plans, and assessments. Your habits are like the structure of your day. They can and probably should include a schedule and are regular activities that move you toward the completion of a project. Plans are like the content of your day. The habits and schedule might give you an outline, but you still need to think through how to flesh out the outline. Finally, there are assessments, which can be done at the same time as plans are done, but which involve evaluating your past performance, introspecting about how you feel about everything, and frankly squashing irrational thoughts that are getting in the way.

Such a life is well-ordered because projects flow from goals, while habits, plans, and assessments are all in service of the projects and, through them, the goals. It is a system with different parts; but the parts all take place in your life, meaning they at bottom take the form of beliefs and actions that you strongly identify with and that actually make up who you are.

This then leads to the last element of the well-ordered life: beliefs that support the whole. As we move through life, we are not in direct control of our beliefs or even most of our actions. We find ourselves holding beliefs or attitudes that we do not wish to have, or that even surprise or dismay us. Such beliefs can greatly support a well-ordered life, but they can also undermine it entirely. If you believe a goal is entirely unattainable, or a project undoable, you will probably lack the motivation needed to pursue it.

This is why assessment, or stock-taking, is so important if you are to maintain a well-ordered life, especially if you tend to be depressed or nervous or your self-confidence is low. You need to explore and, as it were, tidy up your mind.

You should expel any notion that self-discipline is a matter of luck, as if some people have it and others don’t. It is also an error to think self-discipline is a matter of remembering some brilliant insight you or someone else had, or staying in the right frame of mind. Indeed, self-discipline is not any one thing at all. It is, as I said, a system, with various working parts.

It is true that some people just rather naturally fall into the good habits and beliefs that constitute the well-ordered life. But the vast majority of us do not. The better you understand these parts, bear them in mind, and work on them until the whole thing is a finely-tuned machine, the more control you’ll have over your life. This is not easy and, like any complex system, a lot can go wrong. That’s why it’s so necessary to take stock and plan.


How to crowdsource videos via a shared video channel

I got to talking to one of my colleagues here at Everipedia, the encyclopedia of everything, where I am now CIO, about future plans. I had the following idea.

We could create an Everipedia channel--basically, just a YouTube account, but owned by Everipedia and devoted to regularly posting new videos.

We could invite people to submit videos to us; if they're approved, we put branding elements on them and post them. We share some significant amount of the monetization (most of it) with the creator.

We also feature the videos at the top of the Everipedia article about the topic.

Who knows what could happen, but what I hope would happen is that we'd get a bunch of subscribers, because of all the connections of the video makers (and Everipedia--we collectively have a lot of followers and a lot of traffic). And the more people we got involved, the greater the competition and the better the videos would be.

There are still huge opportunities in the educational video space--so many topics out there simply have no good free videos available.

Others must have organized group channels like this before, but I can't think of who.

What do you think?


Could God have evolved?

1. How a common argument for the existence of God failed—or did it?

As a philosophy instructor, I often taught the topic of arguments for the existence of God. One of the most common arguments, called the argument from design or teleological argument, in one formulation compares God to a watchmaker.

If you were walking along a beach and found some complex machine that certainly appeared to be designed by someone, which did something amazing, then you'd conclude that it had a maker. But here we are in a universe that exhibits far more complexity and design than any machine we've ever devised. Therefore, the universe has a maker as well; we call it God.

This is sometimes called the Watchmaker Argument—since the mechanism our beachcomber finds is usually a watch—and is attributed to William Paley. Variations on this theme could be the single most commonly-advanced argument for God.

The reason the Watchmaker Argument doesn't persuade a lot of philosophers—and quite a few scientists and atheists generally—is that all the purported signs of design can be found in the biological world, and if biological complexity and appearance of design can be explained by natural selection, then God is no longer needed as an explanatory tool.

Some skeptics go a bit further and say that all the minds we have experience of are woefully inadequate for purposes of designing the complexity of life. Therefore, not only are natural mechanisms another explanation, they are a much better explanation, as far as our own experience of minds and designing is concerned.

But here I find myself skeptical of these particular skeptics.

2. Modern technology looks like magic

Recently, probably because I've been studying programming and am understanding the innards of technology better than ever, it has occurred to me very vividly that we may not be able to properly plumb the depths of what minds are capable of achieving. After all, imagine what a medieval peasant would make of modern technology. As lovers of technology often say, it would look like magic, and we would look like gods.

We've been working at this scientific innovation thing for only a few centuries, and we've been aggressively and intelligently innovating technology for maybe one century. Things we do now in 2017 are well into the realm of science fiction of 1917. We literally cannot imagine what scientific discovery and technological innovation will make available to us after 500 or 1000 years. Now let's suppose there are advanced civilizations in the galaxy that have been around for a million years.

Isn't it now hackneyed to observe that life on Earth could be a failed project of some super-advanced alien schoolchild? After all, we already are experimenting with genetic engineering, a field that is ridiculously young. As we unlock the secrets of life, who's to say we will not be able to engineer entirely different types of life, every bit as complex as the life we find on Earth, and to merge with our inventions?

Now, what havoc should these reflections wreak on our religious philosophy?

3. Could an evolved superbeing satisfy the requirements of our religions?

The scientific atheist holds the physical universe in great reverence, as something that exists in its full complexity far beyond the comprehension of human beings. The notion of a primitive "jealous God" of primitive religions is thought laughable, in the face of the immense complexity of the universe that this God is supposed to have created. Our brains are just so much meat, limited and fallible. The notion that anything like us might have created the universe is ridiculous.

Yet it is in observing the development of science and technology, thinking about how we ourselves might be enhanced by that science and technology, that we might come to an opposite conclusion. Perhaps the God of nomadic tent-dwellers couldn't design the universe. But what if there is some alien race that has evolved for millions of years past where we are now? Imagine that there is a billion-year-old superbeing. Is such a being possible? Consider the inventions, the computing, the genetic engineering, and the technological marvels we're witnessing today. Many sober heads think the advent of AI may usher in the Singularity within a few decades. What happens a million years after that? Could the being or beings that evolve create moons? Planets? Suns? Galaxies? Universes?

And why couldn't such a superbeing turn out to be the God of the nomadic tent-dwellers?

Atheists are wrong to dismiss the divine if they do so on grounds that no gods are sufficiently complex to create everything we see around us. They believe in evolution and they see technology evolving all around us. Couldn't god-like beings have evolved elsewhere and gotten here? Could we, after sufficient time, evolve into god-like beings ourselves?

What if it turns out that the advent of the Singularity has the effect of joining us all to the Godhead that is as much technological as it is physical and spiritual? And suppose that's what, in reality, satisfies the ancient Hebrew notions of armageddon and heaven, and the Buddhist notion of nirvana. And suppose that, when that time comes, it is the humble, faithful, just, generous, self-denying, courageous, righteous, respectful, and kind people that are accepted into this union, while the others are not.

4. But I'm still an agnostic

These wild speculations aren't enough to make me any less of an agnostic. I still don't see evidence that God exists, or that the traditional (e.g., Thomistic) conception of God is even coherent or comprehensible. For all we know, the universe is self-existing and life on Earth evolved, and that's all the explanation we should ever expect for anything.

But these considerations do make me much more impressed by the fact that we do not understand how various minds in the universe might evolve, or might have evolved, and how they might have already interacted with the universe we know. There are facts about these matters about which we are ignorant, and the scientific approach is to withhold judgment about them until the data are in.


On intellectual honesty and accepting the humiliation of error

I. The virtue of intellectual honesty.
Honesty is a greatly underrated epistemic virtue.

There is a sound reason for thinking so. It turns out that probably the single greatest source of error is not ignorance but arrogance, not lack of facts but dogmatism. We leap to conclusions that fit with our preconceptions without testing them. Even when we are more circumspect, we frequently rule out views that turn out to be correct because of our biases. Often we take the easy way out and simply accept whatever our friends, religion, or party says is true.

These are natural habits, but there is a solution: intellectual honesty. At root, this means deep commitment to truth over our own current opinion, whatever it might be. That means accepting clear and incontrovertible evidence as a serious constraint on our reasoning. It means refusing to accept inconsistencies in one's thinking. It means rejecting complexity for its own sake, whereby we congratulate ourselves for our cleverness but rarely do justice to the full body of evidence. It means following the evidence where it leads.

The irony is that some other epistemic virtues actually militate against wisdom, or the difficult search for truth.

Intelligence or cleverness, while in itself an obvious benefit, becomes a positive hindrance when we become unduly impressed with ourselves and the cleverness of our theories. This is perhaps the single biggest reason I became disappointed with philosophy and left academe; philosophers are far too impressed with complex and clever reasoning, paying no attention to fundamentals. As a result, anyone who works from fundamentals finds it to be child's play (as I thought I did, as a grad student) to poke holes in fashionable theories. This is not because I was more clever than those theoreticians but because they simply did not care about certain constraints that I thought were obvious. And it's easy for them in turn to glibly defend their views; so it's a game, and to me it became a very tiresome one.

Another overrated virtue is, for lack of a better name, conventionality. In every society, every group, there is a shared set of beliefs, some of which are true and some of which are false. I find that in both political and academic discussions, following these conventions is held to be a sign of good sense and probity, while flouting them ranges from suspect to silly to evil. But there has never yet been any group of people with a monopoly on truth, and the inherent difficulty of everything we think about means that we are unlikely to find any such group anytime soon. I think most of my liberal friends are—perhaps ironically—quite conventional in how they think about political issues. Obviously conservatives and others can be as well.

Another virtue, vastly overrated today, is being "scientific." Of course, science is one of the greatest inventions of the modern mind, and it continues to produce amazing results. I am also myself deeply committed to the scientific method and empiricism in a broad sense. But it is an enormous mistake to think that the mere existence of a scientific consensus, especially in the soft sciences, means that one may simply accept whatever science says is true. The strength of a scientific theory is not determined by a poll but by the quality of evidence. Yet the history of science is the history of dogmatic groups of scientists having their confidently-held views corrected or entirely replaced. The problem is a social one; scientists want the respect of their peers and as a result are subject to groupthink. In an age of scientism, this problem bleeds into the general nonscientific population, with dogmatists attempting to support their views with epistemically unquestionable (but often badly-constructed and inadequate) "studies"; rejecting anyone's argument, regardless of how strong, if it is not presented with "scientific support"; and dismissing any non-scientist opining on a subject about which a scientist happens to have some opinion. As wonderful as science is, the fact is that we are far more ignorant than we are knowledgeable, even today, in 2017, and we would do well to remember that.

Here's another overrated virtue: incisiveness. Someone is incisive if he produces trenchant replies that allow his friends to laugh at the victims of his wit. Sometimes balloons do need to be punctured, and sometimes there is nothing left once they are deflated—of course. But problems arise when glib wits attack more complex theories and narratives. It is easy to tear down and hard to build. Fundamentally, my issue is that we need to probe theories and narratives that are deeply rooted in facts and evidence, and simply throwing them on the scrap heap in ridicule means we do not fully learn what we can from the author's perspective. In philosophy, I'm often inclined to a kind of syncretistic approach which tips its hat to various competing theories that each seem to have their hands on different parts of the elephant. Even in politics, even if we have some very specific policy recommendation, much has been lost if we simply reject everything the other side says in the rough and tumble of debate.

I could go on, but I want to draw a conclusion here. When we debate and publish with a view to arriving at some well-established conclusions, we are as much performing for others as we are following anything remotely resembling an honest method for seeking the truth. We, with the enthusiastic support of our peers, are sometimes encouraged to think that we have the truth when we are still very far indeed from having demonstrated it. By contrast, sometimes we are shamed for considering certain things that we should feel entirely free to explore, because they do contain part of the truth. These social effects get in the way of the most efficient and genuine truth-seeking. The approach that can be contrasted with all of these problems is intellectual honesty. This entails, or requires, courageous individualism, humility, integrity, and faith or commitment to the cause of truth above ideology.

It's sad that it is so rare.

 

II. The dangers of avoiding humiliation.

The problem with most people laboring under error (I almost said "stupid people," but many of the people I have in mind are in fact very bright) is that, when they finally realize that they were in error, they can't handle the shame of knowing that they were in error, especially if they held their beliefs with any degree of conviction. Many people find error to be deeply humiliating. Remember the last time you insisted that a word meant one thing when it meant something else, when you cited some misremembered statistic, or when you thought you knew someone who turned out to be a stranger. It's no fun!

Hence we are strongly motivated to deny that we are, in fact, in error, which creates the necessity of various defenses. We overvalue supporting evidence ("Well, these studies say...") and undervalue disconfirming evidence ("Those studies must be flawed"). Sometimes we just make up evidence, convincing ourselves that we somehow know things ("I have a hunch..."). We seek to discredit people who present us with disconfirming evidence, to avoid having to consider or respond to it ("Racist!").

In short, emotional and automatic processes lead us to avoid concluding that we are in error. Since we take conscious interest in defending our views, complex explanatory methods are deployed in the same effort. ("Faith is a virtue.") But these processes and methods, by which we defend our belief systems, militate in favor of further error and against accepting truth. ("Sure, maybe it sounds weird, but so does a lot of stuff in this field.") This is because propositions, whether true or false, tend to come in large clusters or systems that are mutually supporting. Like lies, if you support one, you find yourself committed to many more.

In this way, our desire to avoid the humiliation of error leads us into complex systems of confusion—and, occasionally, into patterns of thinking that can be called simply evil. ("The ends justify the means.") They're evil because the pride involved in supporting systematically wrong systems of thought drives people into patterns of defense that go beyond the merely psychological and into the abusive, psychologically damaging, and physical. ("We can't tolerate the intolerant!" "Enemy of the people." "Let him be anathema.")

What makes things worse is that, when we are coming to grips with our error, we are not unique atoms each confronting a nonhuman universe. We are members of like-minded communities. We take comfort that others share our beliefs. This spreads out the responsibility for the error. ("So-and-so is so smart, and he believes this.") It is much easier to believe provably false things if many others do as well, and if they are engaged in the same processes and methods in defending themselves and, by extension, their school of thought.

This is how we systematically fail to understand each other. ("Bigot!" "Idiot!") This is why some people want to censor other people. ("Hate speech." "Bad influence.") This is how wars start.

Maybe, just maybe, bad epistemology is an essential cause of bad politics.

(I might be wrong about that.)

It's better to just allow yourself to be humiliated, and go where the truth leads. This is the nature of skepticism.

This, by the way, is why I became a philosopher and why I commend philosophy to you. The mission of philosophy is—for me, and I perhaps too dogmatically assert that it ought to be the mission for others—to systematically dismantle our systems of belief so that we may begin from a firmer foundation and accept only true beliefs.

This was what Socrates and Descartes knew and taught so brilliantly. Begin with what you know on a very firm foundation, things that you can see for yourself ("I know that here is a hand"), things that nobody denies ("Humans live on the surface of the earth"). And as you make inferences, as you inevitably will and must, learn the canons of logic and method so that you can correctly apportion your strength of belief to the strength of the evidence.

There is no way to do all this without frequently practicing philosophy and frequently saying, "This might or might not support my views; I don't know." If you avoid the deeper questions, you are ipso facto being dogmatic and, therefore, subject to the patterns of error described above.


Modern education and culture, or, what did you think would happen?

I. Modern education and culture

Look at where we are in education and culture today. Let's catalog the main issues, shall we?

Schoolchildren are often not taught to read properly, and too many fall behind and grow up functionally illiterate. Yet students are kept in school practically all day and made to do endless amounts of busywork, and then they must do still more busywork at home. The work they do is appallingly inefficient, as their textbooks and assignments are all too often ill-conceived: repetitious, deadly dull, and designed without any consideration for what individual children already know (or don't know). Generally, they aren't taught the classics (but more on that below). So despite all that work, and despite graduation rates as high as ever, the average child emerges into adulthood shockingly ignorant. The educational process is regimented; little humans have essentially become cogs in a giant, humorless, bureaucratic machine. The whole process is soul-killing.

Growing up in these bureaucratized intellectual ghettos, it's no wonder that rebellion has become de rigueur, that everyone calls himself an individualist although few really are. Popular culture grows more dumbed-down with each passing generation, delivering entertainment that can be effortlessly consumed by maleducated conformist rebels and increasingly avoiding any scintilla of intellectualism, any uncool and boring reference to the roots of Western culture. On TV, in popular music, and on the Internet—the ubiquitous refuges of the young from the horrors of the educational machine that dominates their lives—one can navigate content of all sorts without any exposure to the classics of literature and the arts, or to the root ideas of Western religion and philosophy. If a few lucky students are exposed to these things at their more academic high schools, most are not, and the taste for "the best which has been thought and said" is ruined by its presentation in a system that "critiques" and renders dull as much as it celebrates and usefully explains. It's a wonder if any students emerge with any taste for the classics of Western literature, art, and thought at all.

A problem with Western culture, for the modern world, is that it is intensely critical and challenging. The classics are beautiful, but hard—both difficult to appreciate and presenting lessons that require us to take a hard, critical look at ourselves. Although the classics can be profoundly inspiring and sublime in beauty, they require time, attention, intelligence, seriousness, and sincerity to appreciate. In the context of today's soul-killing schools, students are too exhausted and overworked to meet these challenges. Many students are also too narcissistic—having been told by their parents and teachers that they are already brilliant, and having been idolized by popular culture for their cool, attractiveness, and cutting-edge thinking about everything—to meet the classics' demand for a kind of self-criticism that is wholly foreign to many of them. It is no wonder the classics simply do not "speak to" the youth of today.

Moreover, almost all of the classics were created by white Western men. Spending much time on them is politically regressive, or so schoolteachers are trained to believe. Instead, the left at universities has been building a new kind of more critical culture, at once holding up the grievances of historically marginalized groups as a new gospel while actually revering popular culture. Teachers and administrators marinate in this left-wing culture of criticism at universities for six or more years before they come to decide which pieces of culture are worth exposing children to. So, again, it's a wonder if any students emerge with any taste for the classics.

At the college level, matters have become dire in other ways. Everyone is expected to go to college, and at the same time universities have become corporatized, so that students are now treated as "customers" whose evaluations determine how professors should teach. So, naturally, grades have inflated—as was necessary to coddle the "self-esteem," or narcissism, of youth—and the courses themselves have been dumbed down, at least in the humanities. But who needs the humanities? Degrees in the liberal arts are generally held to be a waste of money, especially since college has become so expensive, and fewer people are pursuing such degrees. Even if one believed the knowledge gained through a liberal arts degree valuable enough to warrant spending $60,000 a year, one would spend much of the time, in most of the humanities, marinating in that same left-wing critical culture that produces our schoolteachers—so one wouldn't be exposed to the classics in the way that would incline a student to sign up for such a degree in the first place. So it's no wonder if students and their parents find it increasingly plausible to skip college altogether. This is a sad mistake, considering that young adults today, navigating a rapidly changing world, are more in need of the wisdom and intellectual skills inculcated by a liberal arts education than ever before. And most recently, the consequences of our failure to pass on two of the ideals essential to Western thought—free speech and freedom of inquiry—have led to thoroughly illiberal efforts to "shut it down," i.e., to prevent politically unpopular ideas from getting a hearing on campus at all. This is all in the name of intersectionality, empowering the disempowered, tearing down bad old ideas, and protecting the sensitive feelings of coddled students.

II. The once-radical ideas that got us here

Our education is degraded, and we are falling away from Western civilization. So how did it come to this? I put it down to a perfect storm of terrible ideas.

(1) To be effective in a fast-changing society, we need up-to-date know-how, not theory. American society developed out of a frontier mentality that placed a premium on a "can-do" attitude, an ability to get things done, with theorizing and book-reading regarded as a waste of time. That might be understandable for the pioneers and peasants of a frontier or pre-industrial society, but it is a terrible idea for the complexities of industrial and post-industrial societies, in which wisdom, trained intelligence, and sensitivity to nuance are essential. Nevertheless, American parents and teachers alike generally seem to agree that practical knowledge and know-how are more important than book-larnin'. You would think that this might have changed with more people than ever going to college. But it has not.

(2) Books are old-fashioned in the Internet age. When, in the 2000s, the Internet came into its own as the locus of modern life, we began to ask, "Is Google making us stupid?" and to "complain" that we lacked the ability to read extended texts (long articles were "tl;dr" and books boring, old, and irrelevant). I think many of us took this to heart. Educated people still do want their children to read, but the reading habits of adults are slowly dying; you can't expect the children to do better.

(3) Western civilization is evil. "Hey, hey, ho, ho, Western civ has got to go," chanted those Stanford students in 1988, in what became a watershed moment in the development of Western culture. At the time, it might have seemed a bit of left-wing excess, just one side of the complex Culture War. But, in fact, it proved to be a taste of things to come. Many Western civilization requirements are long gone. What was once the province of the newly created Women's Studies and Black Studies departments, and a few left-wing professors, gradually became the dominant viewpoint in all of the humanities. Why study the classics when the classics simply represent the point of view of the oppressor?

(4) Social justice is the new religion. Hand in hand with criticism of Western civilization came an increasing respect (which is good), then celebration (which is fine), and finally veneration (which is undeserved) of everything that has traditionally been set in opposition to Western civilization, especially the usual identity groups: women, races other than white, ethnicities other than Western, religions other than Christianity, sexual orientations other than straight, etc. At universities, making these identity groups equal to straight, white, male, Christian Europeans has become nearly the only thing—apart from environmentalism and a few other such causes—that is taken really seriously. For many academics, intersectionality has replaced both religion and any apolitical ethics to become an all-encompassing worldview.

(5) Psychology is more scientific, accurate, and credible than philosophy and religion, and self-esteem must be cultivated at all costs. The gospel of self-esteem came into being in the 1970s, right around the time the self-help publishing industry became fashionable. With the collapse of traditional (especially Christian) belief systems, people cast about for general advice on how to live their lives, and psychology delivered. Since self-esteem was a key element of much self-help psychology, it was only natural that the parents of Generations X and Y would pull out all the stops to protect the feelings and sense of self-worth of their precious darlings.

We have changed. Despite their education, too many of our children cannot read well, and fewer and fewer of us read books. Whatever we do teach or read, it is rarely classical literature. The classics have become an unexplored country, dull and reviled, to many of us. Recent generations are the first in centuries in which the upper echelons of society are quite shockingly ignorant of their own Western heritage. And here I don't mean just books; I mean also basic Western principles, ideas, and values. For many young people, social justice, psychology, and especially popular culture have replaced religion and wisdom literature. Popular culture may be a crass wasteland, yet it guides our youth more than ever, since it is the only kind of culture that most of them have any preparation and taste for.

We have declined. In past generations, this analysis would have sounded like scaremongering. Today, the analysis has come true; it is a postmortem.

But—and here I speak to the older generation, especially educated old liberals—what did you think would happen? This is precisely what some people did predict in decades past, because society's leaders were teaching a certain set of ideas to the leaders of the next generation:

European civilization colonized and exploited the world; it is irredeemably racist and the main source of the suffering in the world today.

Inequalities are deeply unfair, and white men have the best of everything; so we should celebrate everyone else and take white men down a peg or two.

We must avoid saying anything that might even be thought to be offensive to disadvantaged identity groups.

Christianity is completely irrational and doesn't deserve a role in public life.

Science, and psychology in particular, studies all we need to know to live and be happy; philosophy and religion are based on muddle-headed superstition.

The self-esteem and sensitivities of young people are precious and must be protected from the buffets that life threatens to give them.

Even today, some of these ideas might sound ridiculous to some of us. But if you've been paying attention, you can't deny that these once-radical ideas have become increasingly mainstream.

III. The radical ideas that might guide our future

The desperate state of education today is predictable, given former trends and earnestly expressed convictions. It was called scaremongering to say that these ideas were hacking away at the roots of Western civilization—and yet they did. So one wonders: what can we predict about the future, based on ideas now growing in popularity, ideas that it is quite reasonable to believe will guide the education and enculturation of the next generation?

Here are some controversial ideas that are in vogue at universities today:

Free speech is a dangerous idea, and it certainly doesn't include hate speech and harmful speech.

What determines whether speech is harmful is whether it causes its listeners to react with emotional pain.

But we can disregard the pain of "privileged" people—"male tears," "white tears," and all that.

Those who are really plugged in know that books aren't what's important; know-how is. You can just look up whatever you need to know online.

Popular culture is worth careful academic study, at least as much as "the classics" or "high culture."

Higher education isn't important except in a few fields, and as a credential for becoming a corporate drone.

Grave inequalities persist, and our very civilization is racist. We ought to tear down and malign all the productions of white men.

White society, and white people (whether they know it or not), are all racist, and all men (whether they know it or not) perpetuate a sexist patriarchy.

Religion isn't just irrational and wrong, it's evil, and we should take steps to stamp it out and perhaps prohibit it.

Reproducing does great harm to the world. Life is an evil. Babies are not to be celebrated. We should stop having them.

All of these ideas have plenty of adherents on campus today. They might well shape the next generation. If so, what might our brave new world look like? Let's listen in on the monologue of a typical, center-left future student, shall we?

"It's 2047. The way some people talk, you'd think it was, I don't know, 2017 or something. Check this out. I heard someone, and I don't care if she was a black woman, actually citing the Bible in class? That triggered a lot of people, and she was kicked out, of course. I doubt they'll let her back in. It just goes to show you how many people still believe that superstitious bullshit, even though it's revolting hate speech. But you know what, I was kind of impressed about what she was reading, before I realized what she was reading. It sounded like Old English. Who reads crap like that these days? Well, I guess she can. But it's still bullshit. You don't have to be able to read it to know that.

"It's not just superstitious bullshit, it's totally irrelevant. Books are so lame! My favorite professors don't teach books, they teach modern media. When I started this major, I swear, I had no idea pop music and movies were so deep. Seriously! So why do we require students to read so many books at all? Last year I was required to read three books for required Communications courses. Everyone knows that books aren't really what's important; knowledge is free for the taking online. Everything's there, instantly! Besides, the most influential thoughts of the last forty years are all in the form of briefer texts online. I'm thinking I might want to drop out. Half of my friends didn't even go to college and are just being trained by their employers. But you know, I think those tend to be the more conservative people, you know? So...

"Anyway, at the very least, it's time to stop requiring that we read any books written before 1970, or maybe 2000, especially if they were written by white men. I mean, of course white people and men are still welcome at our universities, it is perfectly fair that they wait their turn in classroom discussions. I hate it when some white man just starts talking first. You can hear some people hissing when they do. After all, everyone knows that less privileged people have more valid and relevant perspectives, and hearing white people and men—and on some issues, let's face it, hearing ignorant, insensitive white men at all—causes the marginalized great pain. We can't forget that white Western civilization persists even today, despite our best efforts. We renamed the state of Washington, but not the capital of our country—it continues to be named after the very embodiment of a white, slave-owning, breeding patriarch! That pisses me off so much!

"And speaking of breeders...don't get me started on the breeders. We had to fight tooth and nail against the misogynist, patriarchal society just to make it possible to license parents. But now we're allowing almost everyone to be licensed. What's the point? Surely we've got to prevent so many people from breeding. We don't let just anyone drive, right? We need to start imposing some restrictions. I know it's a little simplistic, but sometimes, simple is the best way: we could just, for a while, restrict the number of children white people could have. I know it sounds shocking, but look—everybody knows they use the most resources, they're the most racist, they create the most inequality. And they're still a plurality in this country. So it's really a no-brainer. It's 2047!"

Maybe that sounds over-the-top. But that's the point. There are cutting-edge activist types who would find all of this commendable, or at least very plausible. And just think: the cutting-edge ideas of 1987, which would have sounded totally bizarre and radical back then, are totally up-to-date today, in 2017. I'm similarly extrapolating from the "cutting-edge" ideas of today to how those ideas might evolve over another 30 years.

Also, of course, it could get much worse than that: illiberal societies have been far worse, at various times and places in history.

Am I predicting that the monologue is what awaits us? No, my crystal ball isn't that accurate and history never unfolds smoothly or predictably. What I'm saying is that it's a natural extrapolation from ideas about education and culture today. Is that what we want? If not, then what kind of thought world are we trying to build?