So why, precisely, is consciousness mysterious? What is it, anyway? My view, in short, is that the mysteriousness of consciousness lies primarily in the fact that it is an event, an activity, which is a kind of property of the brain. Much of the ontological weirdness of consciousness is the same weirdness we puzzle over when we think about the problem of universals, because consciousness is a matter of properties and events. Just as properties aren’t things with spatial dimensions, so various mental events and properties aren’t things with dimensions.
When people open up heads and examine brains, they should no more expect to see thoughts bouncing around than, holding a ball in hand, they should expect to see the abstract properties of roundness or redness. You can only see the object that is round and red. Like the abstract roundness and redness that the ball exemplifies, thoughts aren’t things. They are properties of a certain kind, i.e., they’re events; they are happenings or goings-on.
Ah, you say, that’s too quick. We can see instances of some properties and instances of some events, to be sure; we can see this ball’s redness and that it’s rolling. But, you cleverly add, no one will ever perceive an instance of someone else’s consciousness in the way the conscious person is aware of it. So consciousness is an unusual sort of property (or event), to be sure. I readily admit that. An outside observer can’t observe consciousness going on in the way that the person who is conscious can. But that’s because mental or conscious properties, unlike every other kind of property, are ones we are familiar with through introspection. Introspection is part of our equipment. A ball can’t (as far as we know) introspect and reflect on anything about itself. I can introspect and infer that you have thoughts, perceptions, and pleasures and pains similar to mine; but never will I, through introspecting, become aware of your thoughts, perceptions, and pleasures and pains. (That is, unless such a thing as “mind-reading” exists, which I doubt.)
Do we need to posit the existence of another ontological category (the irreducibly mental) in order to account for the “raw feels” or “qualia” of introspected consciousness? Well, no, we don’t actually. We know through research into the brain that certain thoughts, perceptions, and pleasures and pains—and here it’s hard to know what words to use—“are mapped onto” or “are caused by” or “have the underlying substrate of” certain sorts of brain events. If no perceptible brain events (of one sort), then no thoughts (of a corresponding kind); and if no thoughts (of that kind), then no perceptible brain events (of that sort). So when an MRI shows a certain area of the brain lighting up, you aren’t seeing a memory, because a memory is known and understood, irreducibly, by introspection. You can see evidence that a memory is taking place, though. Sufficiently advanced brain science might even indicate what the memory is of. But our perception or apprehension of the brain event will still be different from the introspective experience of the memory; it will not be the same as its raw feel or quale.
If you insist that this means I’m a dualist, because I’m saying something is irreducibly introspective (or mental) after all, then I’ll say that the irreducibility is similar to the irreducibility, again, of properties or events. It makes no more sense to say that a thought is some physical thing than to say that a property is a physical thing. It isn’t a thing at all. It belongs to a different ontological category, yes—but not because it’s mental, rather because it’s an event (or a property).
Some part of the difficulty that some philosophers have with the mind-body problem, I’m convinced, stems from a rather simple materialistic model of the universe: everything that exists is some physical object. But when you point out that there are, after all, physical properties, relations, events, sets or groups, etc., then they say: oh, well, that’s a different problem; at least they’re all physical. Sure, but what makes them physical? That they are reducible to fundamental particles? Well, no. The color or weight or density of a rock is not reducible to fundamental particles, because properties can never be reducible to things. Properties are ontologically basic.
Once you start taking seriously the notion that there are a fair few (not an enormous number of) irreducibly basic concepts, concepts that cannot be semantically reduced, analyzed, or defined in terms of other things, then it becomes quite easy to say, “Well, mental properties are properties of bodies, because it’s bodies that have such properties, but we (the havers of those bodies) are acquainted with such properties only via introspection.”
If you have your wits about you, you will see another opening now. You will then press me to distinguish between the properties known by introspection and those that aren’t, or to define “introspection” without reference to some irreducibly mental feature. Maybe we could, armed with such a definition, invent a self-aware AI, or decide whether some AI really is self-aware.
To that I answer: that’s a scientific, not a philosophical, question. It’s a question about the brain, or about systems that share whatever feature brains have that makes them (sometimes) exhibit consciousness. I suppose brain science is getting closer and closer to an answer all the time. All a person can tell you is when he is conscious and of what he is conscious (and notice, if he’s telling you that, then not only is he conscious of something, he is also introspecting that he is conscious of it). Then a scientist, wielding these reports, can gather the MRI (or whatever) evidence needed to see what distinguishes the brain events that are accompanied by consciousness (and introspection) from those that aren’t.
So when someone like Daniel Dennett (a philosopher I read before he was famous and cool) declares that consciousness doesn’t exist, my response is that this is an overreaction to a hard problem that is poorly understood.