This is a comment on J. K. O’Regan and Alva Noë, "A Sensorimotor Account of
Vision and Visual Consciousness." Both their paper and my comment are
forthcoming in The Behavioral and Brain Sciences.
Behaviorism Revisited
Ned Block, Professor
Departments of Philosophy and Psychology
Main Bldg, Room 502A
100 Washington Square East
New York NY 10003
Office: 212-998-8322
Phil Dept: 212-998-8320
FAX: (212) 475-2338 or 995-4179
E-mail:"ned.block@nyu.edu"
http://www.nyu.edu/gsas/dept/philo/faculty/block/
Abstract: Behaviorism is a dead doctrine that was abandoned for good reason. A
major strand of O’Regan and Noë’s view turns out to be a type of behaviorism,
though of a non-standard sort. However, the view succumbs to some of the
usual criticisms of behaviorism.
O’Regan and Noë declare that the qualitative character of experience is
constituted by the nature of the sensorimotor contingencies at play when we
perceive. Sensorimotor contingencies are a highly restricted set of
input-output relations. The restriction excludes contingencies that don’t
essentially involve perceptual systems. Of course if the
‘sensory’ in ‘sensorimotor’ were to be understood mentalistically, the
thesis would not be of much interest, so I assume that these contingencies are
to be understood non-mentalistically. Contrary to their view, experience is a
matter of what mediates between input and output, not input-output relations
all by themselves. However, instead of mounting a head-on attack on
their view, I think it will be more useful to consider a consequence of their
view that admits of obvious counterexamples. The consequence consists of two
claims: (1) any two systems that share that highly restricted set of
input-output relations are therefore experientially the same, and (2)
conversely, any two systems that share experience must share
these sensorimotor contingencies. Once stated, the view is so clearly
wrong that my ascription of it to them might be challenged. At least it
is a consequence of a major strand in their view. Perhaps this will be an
opportunity for them to dissociate themselves from it. I will
limit myself to (1).
There are some unfortunate people whose visual apparatus has been severely
damaged to the point where they can distinguish only a few shades of light and
dark. You can simulate this “legally blind” state at home, though imperfectly,
by cutting a ping-pong ball in half and placing one half over each eye.
In addition, many people are paralyzed to the point where they can control only
a very limited set of behaviors, e.g. eye-blinks. For someone who has
both problems, visual sensorimotor contingencies are drastically reduced.
In fact, it would seem that they are so reduced that they could be written down
and programmed into an ordinary laptop computer of the sort we find in many
briefcases. If this were done, O’Regan and Noë’s thesis would commit
them to the claim that the laptop has experiences like those of the legally
blind paralytic I mentioned. (This form of argument derives from my reply
to Dennett in Block, 1995a (p. 273) and has subsequently been used by Siewert,
1998.)
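To make the suggestion that these contingencies could be "written down and
programmed" concrete, here is a minimal, purely illustrative sketch of the kind
of program I have in mind. The input and output labels are invented for the
example; nothing in it comes from O & N's paper.

# A toy table of drastically reduced sensorimotor contingencies: a few coarse
# light/dark inputs mapped to the only available outputs, eye-blink patterns.
# All labels here are invented for illustration.
CONTINGENCY_TABLE = {
    # (current input, previous input) -> motor output
    ("light", "dark"): "blink",
    ("dark", "light"): "blink twice",
    ("light", "light"): "no blink",
    ("dark", "dark"): "no blink",
}

def respond(current: str, previous: str) -> str:
    """Return the motor output the table pairs with this bit of input history."""
    return CONTINGENCY_TABLE[(current, previous)]

print(respond("light", "dark"))  # prints "blink"

A laptop running this program shares the (drastically reduced) input-output
relations, which is all the thesis, as stated, can appeal to.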
What would those experiences of the laptop have to be like? Well, if you
put the ping-pong balls on your eyes you can get some insight into the matter
yourself. My judgement is that such experiences are no less vivid
than ordinary experience although of course greatly reduced in informational
content. Perhaps O & N will say that the paralysis dims the experience to
the point where it is not so implausible that the laptop has it. However,
people who are temporarily completely paralyzed often report normal experience
during the paralyzed period. In any case, can there be any doubt
that such people do have some experience, even visual experience, and
the laptop has no experience at all? I would say the same for
people who are born with severe limits
to their visual apparatus but report visual experience with low informational
content. There is every reason to think that these people have some
visual experience and that the corresponding laptop has none.
Behaviorism in one form is the view that two systems are mentally the same just
in case they are the same in input-output capacities and dispositions.
There are standard refutations of behaviorism. (See, for example, Block
1995b, pp. 377-384, or Braddon-Mitchell and Jackson 1996, pp. 29-40 and 111-121.)
But what really killed behaviorism was the rise of the computer model of
cognition. If cognitive states are computational states of certain sorts,
behaviorism runs into the problem that quite different computational
states of the relevant sort can be input-output equivalent. For
example, consider two input-output equivalent computers that solve arithmetic
problems framed in decimal notation. One does the computation in
decimal whereas the other translates into binary, does the computation in binary
and then translates back into decimal. Delays are added to get the two
computations to have the same temporal properties. Behaviorism died
because it didn’t fit with the computational picture of cognition. Putting the point dramatically and over-simply,
behaviorism died because it isn’t true even of computers!
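For concreteness, here is a toy sketch of the decimal/binary example; it is my
illustration, not anything drawn from an actual implementation. The two
routines are input-output equivalent over decimal numerals, but one computes in
base 10 while the other detours through binary (adding delays to match their
timing would change nothing about the point).

# Two input-output equivalent adders for decimal numerals. The first works
# digit by digit in base 10; the second translates to binary numerals, adds,
# and translates back. A toy illustration only.

def add_in_decimal(a: str, b: str) -> str:
    """Add two decimal numerals digit by digit, staying in base 10."""
    da = [int(d) for d in reversed(a)]
    db = [int(d) for d in reversed(b)]
    out, carry = [], 0
    for i in range(max(len(da), len(db))):
        x = da[i] if i < len(da) else 0
        y = db[i] if i < len(db) else 0
        carry, digit = divmod(x + y + carry, 10)
        out.append(str(digit))
    if carry:
        out.append(str(carry))
    return "".join(reversed(out))

def add_via_binary(a: str, b: str) -> str:
    """Translate to binary numerals, add those, translate back to decimal."""
    bits_a = bin(int(a))[2:]   # e.g. "487" -> "111100111"
    bits_b = bin(int(b))[2:]
    return str(int(bits_a, 2) + int(bits_b, 2))

# Behaviorally indistinguishable, computationally different:
assert add_in_decimal("487", "615") == add_via_binary("487", "615") == "1102"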
O & N’s view, as I am interpreting it, is a form of behaviorism. It
isn’t the general behaviorism that I just described because it is about sensory
experience, not cognition and not mentality in general. And so it might
be thought to escape the problem just mentioned. Since it is not about
cognition, O & N don’t have to worry about two different cognitive states
being input-output equivalent or two identical cognitive states implemented in
systems with different input-output relations. But their view is doomed
by a similar problem nonetheless: the same input-output relations can be
mediated either by genuine experience or by simple computations that involve no
experience. Genuine experience need not have a complex computational
role, and experience with such a simple role surely can be simulated in
input-output terms by a system that has no experience at all.
References
Block, Ned, 1995a. "How Many Concepts of Consciousness?" The Behavioral and Brain Sciences 18, 2, pp. 272-284.
Block, Ned, 1995b. “The Mind as the Software of the Brain”, An Invitation to Cognitive Science, edited by D. Osherson, L. Gleitman, S. Kosslyn, E. Smith and S. Sternberg, MIT Press.
Braddon-Mitchell, David and Jackson, Frank, 1996. Philosophy of Mind and Cognition. Oxford: Blackwell.
Siewert, Charles, 1998. The Significance of Consciousness. Princeton, NJ: Princeton University Press.