Thursday 26 October 2017

On Definitions of 'Consciousness': Dennett and Others (2)




The first thing that many people may say about Daniel Dennett's definition(s) of the word 'consciousness' (or of what he takes consciousness to be) is that he doesn't define the word at all. He either tells us what it's not or talks about something else entirely: functions, brain processes, behaviour, “verbal reports”, evolution, engineering, and the like.

However, Dennett (as a “scientistic” Rylean) has much support for his position in neuroscience and science generally.

For example, here's Alexander Luria (in Consciousness and Self-Regulation), arguably and paradoxically, also ignoring what it is he's talking about. Thus:

“Modern views... regard human conscious activity as consisting of a number of components. These include the reception and processing (recoding) of information, with the selection of its most important elements and retention of the experience thus gained in the memory...”

Of course, if someone argues that Luria is ignoring consciousness in the above, isn't that simply to beg the question? The assumption here is that consciousness is something over and above the “reception and processing (recoding) of information”, etc. Who says so? There are philosophers and even psychologists who say so. Here's the American psychologist Ulric Neisser (in his Cognition and Reality):

“The treatment of consciousness as a processing stage is unsatisfactory in a still more fundamental way. It does justice neither to the usages of the word 'consciousness' in ordinary discourse nor to the subtleties of experience.”

Despite that, the assumption (if that's what it is) that functions, processing, behaviour, etc. aren't examples of consciousness (or even consciousness itself) needs to be defended. Indeed we can indulge in some sceptical psychologising here and quote Dennett himself (as found in John Horgan's The End of Science). According to Dennett, “mysterians” really “don't want consciousness to fall to science”. What's more,

“They like the idea that this is off-limits to science. Nothing else could explain why they welcome such slipshod arguments.”

And here's another definition of 'consciousness' – by the neuroscientists Robert Thatcher and Erwin Roy John (in their Foundations of Cognitive Processes) – which Dennett would approve of. (Or, at the least, he'd approve of the spirit of the definition, if not every letter.) Thus:

“Consciousness is a process in which information about multiple individual modalities of sensation and perception is combined into a unified multidimensional representation of the state of the system and its environment, and integrated with information about memories and the needs of the organism, generating emotional reactions and programs of behavior to adjust the organism to its environment. The content of consciousness is the momentary constellation of these different types of information.”

Indeed the thesis that “multiple individual modalities of sensation and perception” are “integrated” into a “momentary constellation” is very much in tandem with Dennett. Consciousness, to Dennett, is what happens when it “all comes together”. Yes, there's no “place where it all comes together”.

Bernard Baars (in his A Cognitive Theory of Consciousness) also characterises

“consciousness as a set of messages posted on a large blackboard for all cognitive subsystems to read”.

It's true that there are problems with these kinds of definition. For example, Alvin Goldman (in his paper 'Consciousness, Folk Psychology, and Cognitive Science') writes:

“We can easily conceive of a (nonhuman) system in which informative representations are distributed to all subsystems yet those representations are totally devoid of phenomenal awareness.”

We can also say that this passage doesn't tell us about consciousness itself. It does tell us that we could have a “system” which is functionally very like a human subject, yet which doesn't have consciousness. It is, in fact, a take on the question: Why is there consciousness at all? Thus, in our context, it's beside the point. Indeed we can say that Baars, Dennett, etc. are simply changing the subject. What's more, they may be evading the issue entirely. Goldman seems to agree with this position. Or, at the least, he puts this position when he says that his own

“discussion strongly suggests that only the phenomenal notion of consciousness is the one intended in common usage”.

To get back to Dennett's ostensibly negative definitions of the word 'consciousness': we can say that defenders of consciousness (or qualia) also often tell us what consciousness isn't, not what it is. For example, Thomas Nagel (in his well-known 'What Is It Like to Be a Bat?') writes:

“It is not analyzable in terms of any explanatory system of functional states, or intentional states, since they could be ascribed to robots or automata that behaved like people though they experienced nothing.”

A Dennettian could of course reply:

Who says that consciousness “is not analyzable in terms of any explanatory system of functional states, or intentional states”? Indeed I can say that even if I also believe that it's the case that “they could be ascribed to robots or automata that behaved like people though they experienced nothing”.

So is Nagel also begging the question here? Simply because robots or zombies can be described functionally (also intentionally?), that doesn't automatically mean that human consciousness can't be so described. For one, perhaps biological (or human) functionality (as John Searle argues) is simply different to non-biological functionality. And even if it is the same, the fact that robots or zombies can also be described functionally doesn't automatically mean that consciousness isn't a matter of functions, processes, behaviour/action, systems and whatnot.

George Rey, like Nagel, also tells us (in his paper 'A Question about Consciousness') what consciousness can't be. Thus:

“... whatever consciousness turns out to be, it will need to be distinguished from the thought processes we ascribe on the basis of rational regularities.”

This seems to be an even more extreme position than the functionalist one just discussed, in that thought itself (i.e., not only “the content of thought”) is rejected as a possible component of consciousness.

Here again we have a negative take on consciousness when David M. Rosenthal (in his paper 'A Theory of Consciousness') tells us that “[w]e cannot explain consciousness in terms of what is not mental”. Then he trumps this by concluding that “explaining consciousness in terms of conscious states will be trivial and uninformative”! This sounds like another hint at Kantian noumena; or, perhaps, a hint at “intrinsic [phenomenal] properties” – the inexplicable, unanalysable and primitive!

Alvin Goldman recognises this problem of defining consciousness in terms of what it's not. He writes (in the paper already quoted) about three such positions:

“Each [position] tries to explain the consciousness of a state in terms of some relation it bears to other events or states of the system: (1) its expressibility in verbal behaviour, (2) the transmission of its content to other states or locations in the system, or (3) a higher-order state which reflects the target state.”

(1) to (3) are opposed to the claim that

“[o]ur ordinary understanding of awareness or consciousness seems to reside in features that conscious states have in themselves, not in relations they bear to other states”.

Indeed Goldman goes one step further (though he's stating Ned Block's position here) and says that

“at least one sense of 'consciousness' refers to an intrinsic (rather than a relational) property, called phenomenal consciousness”.

Again, wouldn't Dennett say that consciousness is (at least partly) a matter of “expressibility in verbal behaviour”, of “the transmission of its content to other states or locations in the system”, and so on?

In addition, what does it mean to talk about the “features” which “conscious states have in themselves”? This makes such features sound (again) like noumena. And if they are noumena, then a Dennettian may say: No wonder we can't say much – or anything – about them! What is it for a feature of a conscious state to be “intrinsic”, rather than “relational”? Can we so much as grasp that distinction or those terms?

To get back to Dennett himself.

Take his well-known and often-repeated claim that “there is no single central place (a Cartesian Theater) where conscious experience occurs”. Now, obviously, that's not a definition of consciousness: it doesn't tell us what consciousness is. Instead, it tells us where conscious experience doesn't occur. Thus isn't consciousness simply assumed in this part of Dennett's account?

An addition to the "multiple drafts" explanation is more helpful. Dennett tells us that there are "various events of content-fixation occurring in various places at various times in the brain". This may mean that consciousness amounts to these "various events" of "content-fixation occurring in various places at various times in the brain". However, the question still remains:

What are these various events? Are they constitutive of consciousness?

If so, does that mean that these brain events are consciousness? Or do they simply underpin consciousness? If this is all about brain events, processes, functions, behaviour, etc., then many would argue that Dennett simply isn't talking about consciousness. He's talking about, well, brain events, functions... Nonetheless, a Dennettian may reply by saying that such people are – again! – simply begging the question against his position. That is:

Why are you simply assuming that brain events, functions, processes, behaviour/action, etc. can't be – or aren't – constitutive (or examples) of consciousness?

