Friday, January 30, 2015

Against the mind/brain by Fred Cummins

Guest post from Fred Cummins in response to a tweet exchange.

This is a letter to Greg Hickok in response to some recent tweet exchanges that immediately seemed to raise issues that are not resolvable in 140 character snippets. I'm including Andrew Wilson and Sabrina Golonka from Leeds, and Marek McGann from Limerick in the distribution, as I believe we might all have interesting perspectives on the issues at stake.
Let's start with this heartfelt tweet from Greg, which I hope represents common ground among all of us:

I wonder how much the pace of science would quicken if we could just understand what the fuck each other are trying to say. (Origin: @GregoryHickok)
I might have some reservations about the use of "pace", which suggests a simple linear progressive course from ignorance into certainty, to be delivered by science, but the sentiment that we are talking past each other, and that this is not in any of our interests, motivates this reply.
Then, to start the discussion, here is a sequence of three further tweets (concatenated) by Greg which triggered this response:

Embody: 'To give a concrete form to what is abstract' (OED). In the brain, concrete form = neural codes (firing patterns, etc). An embodied mental concept then just means a concept defined by neural codes. Embodied cognition therefore targets the same question as standard cognitive neuroscience: how does the brain code information?
And one further attempt by Greg to nail down two opposing (or complementary?) positions:

Fair to lump embodiment in two broad camps? 'Psychological': cog grounded in sens-motor. 'Physical': cog+sens-mot grounded in body/envirnmnt
I bristled. I objected. I complained that Greg was misappropriating the word "embodied":

OED a terrible source from which to work here. Your nonce definition demands faith in a computational orthodoxy many reject. (Source: @fcummins)

Where should we start then? (@GregoryHickok)

By not misappropriating the term "embodied"? CogSci has not been helped by Camp A's dismissal of Camp B (for many As and Bs) (@fcummins)
To which Greg replies:

Ok, I'm listening. What is your version of embodied? (@GregoryHickok)
This is where the tweet-format breaks down, because that is rather a big question. It is also a question I have no intention of answering, because it presupposes further common ground that is not, in fact, present. This yawning abyss needs to be acknowledged before we could usefully approach an answer. Otherwise, we are indeed incapable of understanding what the fuck we are hearing from each other.
Important qualification: There are not two camps here, with Greg in one and Andrew-Sabrina-Marek-Fred in the other. I do not speak for Andrew-Sabrina-Marek, and my views here are my own. I am not confident they would garner assent from anyone else. That (and the good-natured tone of twitter exchanges among us) will allow me to perhaps overstate my case in the interests of drawing attention to the missing common ground that must be acknowledged if things are to improve. And I will undoubtedly make unwarranted assumptions about Greg's position and commitments too, for which I will happily accept vilification and rebuke. And if my tone degenerates into incoherent ranting towards the end, I ask for good-natured indulgence. I love the challenge set by the first tweet, and I think we all need to learn how to argue, with due acknowledgement of the chasms that can separate us.

At issue here is not a disagreement over how we understand the mechanisms of the cognitive system that give rise to human behaviour. It is not a matter of negotiating the degree to which such explanations need to appeal to properties of the non-brain body in addition to the computational properties of the brain. The disagreement is much more serious and fundamental than that.
There is not, nor is there likely to be, agreement that the brain is best understood as a computational machine. Likewise there is no agreement that there *is* a cognitive system. There is no agreement that the physiological activity of the brain is properly understood in representational terms. There is no agreement that the brain is the locus, or origin, of phenomenal experience, or consciousness. This is a lot of disagreement. So much, that a premature jump to accounts of what "embodiment" might mean will get nowhere.

I don't want to convince Greg, or the whole of cognitive neuroscience that their enterprise is fundamentally unsound. I do believe that the orthodox interpretation of the brain, and the causal accounts of human behaviour that result, are tragically wrong, and that alternatives are necessary. But here is perhaps the biggest bone of contention of all: I do not believe that there exists, could exist, or should exist, a final fixed account of what a person is, how experience unfolds for them, or what it is like to be a person. These, I firmly believe, should be negotiated. Putative answers are not of the nature of "facts" in inorganic sciences, and will never admit of subsumption under laws comparable in rigour and predictive power to the laws of mechanics.

And so the representational, computational, information processing, cognitivist approach to brains and to people has its place. It is hopelessly wrong, but that is not the problem. The problem is the authority granted to models, findings, pronouncements arising from such views. Science does not give us certainty here. It gives us the means to negotiate and approach local consensus, for some issues, some of the time.

What I do find lamentable is the failure of folk in the representational camp (role played by Greg here) to even acknowledge that their position is open to question. To assume that it is OK to appeal to representations and information processing and perceptual inputs, and the whole grab bag of psychological concepts that, from where I and many others are standing, appear to be worse than untrustworthy.

That sentience might be a property of living forms, rather than brains, is a well-articulated position of some venerability. Philosophical arguments extend back at least to Kant, and recently Thompson's Mind in Life, building as it does on people like Jonas, Husserl, Varela, Maturana, and many, many others, provides a well fleshed-out account. This is not a single voice. Relating mind to life is going on all over the place. The most significant developments in biological philosophy that address the question of how "life" is to be understood all feed into the discussion here. Terrence Deacon's "Incomplete Nature" and Stuart Kauffman's "Investigations" are some relevant works. I could list dozens of others, but all I want to establish is that there is more than one game in town, and the wilful neglect of the grounded, reasoned positions of others that the rationalist, cognitivist camp regularly engages in speaks only of their lack of knowledge of the territory. We in the mind-and-life camp are well aware of the arguments, successes, failures, weaknesses and strengths of the cognitivist position. Why do we repeatedly run into a failure of that camp to acknowledge other positions? Some of the founders are well known for their appalling arrogance here. Fodor, in 2001, says:

[The Computational Theory of Mind] is, in my view, by far the best theory of cognition that we’ve got; indeed, the only one we’ve got that’s worth the bother of a serious discussion.... its central idea---that intentional processes are syntactic operations defined on mental representations---is strikingly elegant.
That phrase "the only one we've got that's worth the bother of a serious discussion" rankles me. It is dismissive and ignorant. Chomsky argues regularly in similar vein, and has recently been called publicly to task for it by Christina Behme. This is rank, ignorant, fundamentalism.
The 1991 book The Embodied Mind (Varela, Thompson, Rosch) is a landmark work that introduced some of these concerns into cognitive science. It also represents the first principled introduction of the term Embodiment into such fundamental discussions. The term has been dragged through many ditches since, and used and misused in so many ways that any suggestion that there is an agreed interpretation of the term is ridiculous. To illustrate my point, I need only point out how Greg felt that embodiment could be subsumed within his view of how brains relate to experience and behaviour (I hate the word "mind").
Is it necessary to argue here the shortcomings of the contemporary cognitivist view? Is it necessary to decry it as a crypto-religious solipsistic Cartesian approach, bound inseparably to a cultural-historical view of the human that valorises individual agency and autonomy to the point of shutting the person off from their world? To my jaundiced eye, all computational accounts we currently have fall woefully short of providing plausible accounts of either experience or behaviour. They lean on a magical notion of representation that is a theological construct, yet they are not aware of their theological commitments.
The issues at stake are not small. They extend to more than a single word. But I hope this response can serve to lay down a marker, so that Greg, and anyone else belonging to the Church of Cognitivism, does not feel free to misappropriate terms from reasoned approaches to cognition they do not even acknowledge. I am happy to elaborate on any of the points above, but I feel this tweet has now reached its character limit.

Cognitive Science Programme
UCD School of Computer Science and Informatics


Unknown said...

Can anyone help me and summarize Cummins' arguments? I have learned that the cognitivist account is fundamentally unsound, tragically/hopelessly wrong, orthodox, suffers from a lack of knowledge, is dismissive, ignorant, fundamentalistic, crypto-religious, and solipsistic, but I didn't quite get why. Where did I miss the facts?

A second issue I did not get is what exactly is the problem with "representation" (or computation, or information processing). I thought that in general there is nothing really mysterious about it. There are neurons in the brain that "correlate" with something happening out there in the world. A bee's neural system contains neurons that selectively fire when a particular kind of flower appears in the bee's visual field. Likewise, bar and grating cells "correlate with" or "represent" bars and gratings. But maybe I am overly naive in these things. So please help me, why is it problematic to use the term "representation" for correlative firing patterns?

Greg Hickok said...

No, there is no argument here. It's a sermon. The point I've been trying to make all along is that if you look under the hood of embodied accounts they end up being computational.

Fred said...

To be fair (1) this was written to Greg, not as a blog post, and (2) I didn't try to argue against representationalism. I tried to point out that representationalists are typically unaware that there are other approaches, and simply insist that all approaches must condone a computational interpretation of the brain. Greg's point that the other approaches reduce to representationalism illustrates this nicely.

Arguing against representationalism requires even more space than we have here. One set of readings designed to introduce such arguments is here:

Computationalists and representationalists who start off with "What could possibly be wrong with computationalism?" have simply not begun to do their homework.

Greg Hickok said...

I've looked at some of the arguments from robotics, read a book or two, read Andrew and Sabrina's stuff, and I still maintain that it's computational under the hood. Maybe the embodied folks who dismiss computational approaches haven't done their homework on what we count as computational. As I suggested in my book and in a previous blog post, let's pick an easy problem, say sound localization in the barn owl, and see if you can do it without computation.

William Matchin said...

@ Moritz,

The term "representation" as used in the cognitive sciences is different from the usage in philosophy. Representations aren't defined by how they represent information "out there" in the world (this is the traditional usage in philosophy, as far as I am aware); rather, the term is typically used to mean "mental object". In this sense, a representation may have no direct external correlate in the outside world. The word "dog" would involve various mental representations, including a sequence of abstract phonemes that end up getting converted via a series of intermediary representations and computations into sound waves; these sound waves may have no direct correlation to the underlying mental representations, as Alvin Liberman's work in the 1950s showed (the same phoneme has dramatically different acoustic consequences under different articulatory configurations).

As far as I can tell, Fred argues that there is some coherent position that denies the existence of mental representations and computations, defined as mental objects and processes. I think such positions are hopelessly irrational, which is probably why cognitivists dismiss them so readily.

Norbert said...

I think that Greg has the right argumentative strategy. When issues are complex, let's start with (or at least have as guides) a few concrete cases whose details we can point to in order to anchor our discussions. This is what Greg does so well in his book. Cummins shies away from this, and I think I know why. Once we get down to details, we will see that the issue is not whether representations and computations matter, but which ones. This will, however, demystify the "debate," not something that all parties to the debate desire.

At least in those areas that I know anything about (language) there really is no alternative to the computational theory. So, if Cummins has something in mind, then it would be useful to present a purported explanation or two to fix ideas. Short of this, there can be no useful discussion.

Fred said...

The Barn Owl is a lovely example, and one where a "computational" account is obviously a great way of approaching our best description of the role of the nervous system in generating this behaviour. But the term "computation" here is used in the sense of "doing sums", not in the sense of something that gives rise to intentionality, as it is invoked in mainstream representational cognitive science. Doing sums is obviously going to be necessary here, and there is no point in arguing over whether computation is required here, any more than if we were discussing the modulation of blood pressure in the circulatory system. There, too, we should use sums, and we could even talk of representation in a philosophically benign manner as we relate numbers in one part of a network to those elsewhere.
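To make the "doing sums" sense concrete, here is a toy sketch of the standard interaural-time-difference story: the owl can localize a sound by finding the delay at which the two ear signals best coincide. Everything in this sketch (signal shape, sample counts, the names of the functions) is my own illustrative assumption, a caricature of Jeffress-style coincidence detection, not anyone's actual model of the owl's nervous system.

```python
# Toy "doing sums" account of sound localization: estimate the
# interaural time difference (ITD) by testing, for each candidate
# delay, how well the two ear signals coincide.

import math

def make_signal(n, freq=0.05):
    """A simple sinusoidal 'sound' sampled at n points."""
    return [math.sin(2 * math.pi * freq * t) for t in range(n)]

def delayed(signal, lag):
    """The same signal arriving `lag` samples later at the far ear."""
    return [0.0] * lag + signal[: len(signal) - lag]

def estimate_itd(left, right, max_lag=10):
    """Pick the delay at which left and right ears 'coincide' best,
    i.e. the lag maximizing their cross-correlation."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(
            left[t] * right[t + lag]
            for t in range(max_lag, len(left) - max_lag)
        )
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

source = make_signal(200)
left_ear = source                  # the sound reaches the left ear first
right_ear = delayed(source, 3)     # and the right ear 3 samples later
print(estimate_itd(left_ear, right_ear))  # → 3, the true delay
```

Whether this kind of arithmetic licenses talk of intentionality is exactly the point at issue above; the sums themselves are uncontroversial.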

But at the heart of representational cognitive science is the claim that computation and representation support our mental claims -- they ground the manner in which we come to know the world, by virtue of generating intentionality of a rather different sort.
One common case trotted out here to demonstrate unequivocal representations (neural patterning that is about something in an exterior world) is the retinotopic, tonotopic and somatosensory maps found in primary sensory cortices. These seem to be clearly "about" some part of the body. But the sense in which they are about, say, the play of light at a location on the retina (and, by dodgy inference, in the visual field) is a straightforward one- or two-step topographical mapping. In this way, they are about the world/CNS border in the same way that a footprint in the sand is about a stepping event. That kind of "aboutness" is entirely causal and unproblematic. An imprint is about a prior event.

But like the innocuous sense of computational (we did some sums), this innocuous use of the term representation cannot serve to do the many tasks it is required to within cognitive neuropsychology. That kind of representation requires that some element within the system (symbol? firing pattern? population code? gamma-theta entrainment?) relates to some feature of the external (non-neural) world by virtue of playing a role within a symbolic system. The direct link that gave rise to the topographical mapping is "black boxed". We can no longer follow the path from referent to referring representation through a series of causal transformations. This seems to be an article of faith at the heart of classic cognitive science. (Perhaps Greg and others will say "but we don't subscribe to that usage!". Fine, but the magic intentionality of symbols in systems in a Fodorian, or Newell & Simon, sense is always lurking, and if you don't distance yourself from it, you will be called to task on it.) (End of Part 1 of 2)

Fred said...

So let me counter with another example, this time from the world of language (or languaging, as we types are wont to call it). My own preferred object of study is joint speech, which is when many people say the same thing at the same time (think prayer & protest & football). Elsewhere I argue that this is a phenomenon of acute interest to human scientists, and it is a phenomenon in which the hallmark of the intentional cannot be stuffed into individual skulls, because it is collective. The fundamental disagreement I started this post with has at its heart the intrinsic shortcomings of a psychological approach that limits the mental to one mind at a time. This, I will insist (but we would need much more space to do this properly) *is* a theological intrusion into our science, and one that we would be better to be aware of. The psychological, on an embodied and enactive view, is not limited to single, atomic, minds of individuals. The world experienced, engaged with, and enjoyed, is intersubjective at its core.
There are over 9000 articles on the entirely marginal phenomenon of glossolalia. There is an order of magnitude less on joint speech, yet this is a ubiquitous phenomenon at the heart of human practices world wide, with direct neural correlates (if you think that is important) that we would do well to have some science of. A Cartesian science of the mind will blind us, if we insist it is the only account.

But I don't need that kind of consideration when talking about sound localization in the barn owl, and I am not moving into your turf there. I am asking that we recognize the plurality of approaches, without the kind of blanket dismissal that this discussion so often engenders. (Part 2 of 2)

David Poeppel said...

Geeesh ... I thought I was reasonably familiar with cognitive science, but the issues raised here are well beyond the scope of my understanding. Is the bottom-line message really that we are supposed to be open-minded and methodologically pluralistic? Come on ...

I would welcome an analysis of how joint speech, say, informs what we can/might/should know about the mind and brain (since the sound localization issue didn't take). I am, for example, interested in lexical access, and I'd like to know more about how the processes underpinning that well-defined phenomenon (I naively assume it is uncontroversial that we access some stored entries in our head) can be illuminated more thoughtfully in the approach that is being advocated here.

I am, for the record, open to embodied findings/interpretations, insofar as they can be shown to be coherent and have causal force in the question of interest. (See, say, Lewis & Poeppel, 2014, Brain & Language).

Also, a recommendation: a bit of Mahon & Caramazza and anything by Gallistel is salutary in these contexts. I recommend, say, issues surrounding navigation by the Tunisian desert ant Cataglyphis. Very fascinating and informative.

Andrew said...

For what it's worth, I'm basically on board with what Fred is saying. I think that most cognitive scientists simply don't know that representations and computations are *optional* ways of explaining behaviours and that there are viable alternatives being developed. I note that a lot of the comments here are good evidence for this view :)

The trouble with getting out from under computation is two-fold. First, it's hard to change if you don't have something good to change to, and there aren't good non-computational analyses of all the things people want to study. Second, you can always add a representational/computational gloss to any mechanism. That's the power of computation; it really is that useful. But being able to describe a computational route doesn't mean that is what the system at hand is doing. (A simple example to illustrate: when I drop a ball, its motion is governed by the local dynamics but can also be described computationally. The latter is handy, but it's not what the ball is doing.)
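Andrew's falling-ball point can be sketched in a few lines (my own illustrative code with assumed values, not taken from any of the works mentioned): one function unfolds the local dynamics step by step, the other is the closed-form computational description of the very same trajectory. Both produce the same numbers; only one of them is what the ball is "doing".

```python
# The ball's motion under local dynamics vs. its computational gloss.

G = 9.81  # gravitational acceleration, m/s^2 (illustrative constant)

def fall_by_dynamics(duration, dt=0.0001):
    """Integrate the local dynamics: at each instant, velocity and
    position change only as a function of the current state."""
    y, v, t = 0.0, 0.0, 0.0
    while t < duration:
        v += G * dt
        y += v * dt
        t += dt
    return y

def fall_by_computation(duration):
    """The closed-form description of the same trajectory:
    y = (1/2) * g * t^2."""
    return 0.5 * G * duration ** 2

d_dyn = fall_by_dynamics(2.0)
d_cmp = fall_by_computation(2.0)
print(abs(d_dyn - d_cmp) < 0.05)  # → True: the two descriptions agree
```

The agreement is the point: computational describability comes cheap, so it cannot by itself settle what mechanism a system is actually running.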

Sabrina and I wrote a book chapter with Eric Charles where we just point out that the most useful thing we embodied types can do is provide an alternative framework for people to do science in. I think this is important work to move forward on this issue.

Unknown said...

Fred, I completely agree that the issues of intentionality and phenomenal experience are notoriously difficult (if not outright impossible) to account for in a computational theory of the mind. The thing is, cognitivists (by and large) know this. The question is whether there is another game in town that gives a reasonable account for these sorts of things.

Reading what you wrote, it seems to me that you are advocating two mutually incompatible views. On the one hand, you claim that embodied cognition theorists are able to give us a coherent account of all the stuff that cognitivists care about AND also solve the issues of intentionality, qualia, phenomenal experience and so on. On the other hand, you seem to hold that the theoretical and ontological commitments of cognitivists and embodied cognition theorists are so different that these two groups actually care about different objects of study altogether.

Did I get the first part right? If so, would you mind sharing a reference or an account of what you consider a successful story that would substantiate this "embodied-cognition-gets-right-what-cognitivism-doesn't" attitude? This is not a facetious request. If you don't reject representationalism outright (which you don't seem to), and you take seriously the success stories of cognitivism, then if you show a way in which we can have our cake and eat it too, I think everyone would rejoice.

However, did I get the second part right? If so, then it seems to me that you are not so much arguing for "methodological pluralism" as for two ultimately different fields of study: cognitivists would study "minds" as a sort of information-processing symbolic structure that individuals have, while embodied cognition theorists, who eschew the very idea of individual "minds", would try to give an account of the intentional properties (etc.?) of the intersubjective experience of sentient beings (or whatever you consider the better framing of your ultimate object of study; you explicitly reject the word "mind" at some point, so I think you probably have something else in mind, no pun intended).

Fred said...

> Is the bottom-line message really that we are supposed to be open-minded and methodologically pluralistic? Come on ...

I don't see why this is a bad idea. If my understanding of cognitive science is so radically different from yours, why would it be a bad thing to be less than dogmatic? This exchange started with my contention that Greg was seemingly not aware of the magnitude of the disagreement, of the space of difference, between us. You say my concerns about collective intentionality are beyond the scope of your understanding. I doubt that is true, but if it is, they still represent serious challenges for any cognitive science that includes languaging in all its forms as part of its object of study.

The study of joint speech has caused me to challenge my own understanding of mind and brain. If you have never encountered it before, I can see being bemused, but I cannot see why it should be shoehorned into a (from my perspective) narrow view of what cognitive science is about.

Cognitive science is a much larger undertaking than is represented here. I think that is the root of our disagreement. And in a series of comments on a blog, we're not going to resolve them. But I stand resolutely by my initial claim that Greg's appropriation of the term "embodied" was not well informed about embodied approaches to minds, brains, behaviour, experience or cognition.

But I have appreciated this exchange. Our paths will undoubtedly cross again (and again . . .).

Greg Hickok said...

Can someone give me a single example of a non-information processing (not using the loaded term, computation) account of a behavior? I've previously tried looking at the examples that Andrew & Sabrina have pointed to (fly ball catching, crickets, walking robots) and they all involve information processing. If radical embodiment can't explain something without appealing to information processing, then it's dead in the water. Unless this is just a semantic debate about what counts as information processing, in which case, we should all just get back to work.

William Matchin said...

@ Andrew: "we just point out that the most useful thing we embodied types can do is provide an alternative framework for people to do science in. I think this is important work to move forward on this issue."

In theory there are always potentially other viable frameworks. Developing an alternative is certainly useful, but only compelling if there are well worked out examples that persuade us it is viable. Chomsky critiqued the framework of Skinner (thoroughly, I might add, unlike the embodied or "postcognitive" complaints against cognitivism), but he simultaneously developed a framework for the study of language that did some real work.

Greg Hickok said...

Perhaps it's telling that some of the (prima facie) compelling examples of radical embodiment (walking or "tidying" robots, ear position, etc) are quite peripheral processes, close to body configurations that shape the input/output. But notice that to maintain this kind of account, you'd have to just wire up the eyes to the leg--no brain needed--in which case you get a creature that can't do much. Sure it might tidy some blocks, but ask it to put the red block on the blue block and it will just continue tidying. Perhaps, then, the biggest challenge to radical embodiment is the existence of 100 billion neurons in between the environment and the body.

Andrew said...

In theory there are always potentially other viable frameworks. Developing an alternative is certainly useful, but only compelling if there are well worked out examples that persuade us it is viable.
I'm pretty sure we know this. There are many examples of successful 'radical' (non-representational) studies in the literature; the extensive A-not-B error research programme by Linda Smith and Esther Thelen stands as an excellent example. The empirical work and the theoretical framework development are coming along hand in hand. Plenty of work to do, of course, but we've been going about 40 fewer years than standard cognitive psychology!

Chomsky critiqued the framework of Skinner (thoroughly, I might add, unlike the embodied or "postcognitive" complaints against cognitivism), but he simultaneously developed a framework for the study of language that did some real work.
Of course, Skinner famously never replied to Chomsky's so-called 'devastating critique' because he read it and realised Chomsky didn't understand Skinner's behaviourism. Let's just say Chomsky isn't my model for a revolutionary in science - I prefer James Gibson :)

William Matchin said...

@ Andrew: There are many examples of successful 'radical' (non-representational) studies in the literature
The A-not-B program you cite is not an alternative framework; it is the exact same framework: decisively cognitive, computational, representational, with some minor quibbles over the details of what representations and computations there are. Let me quote from an article by Linda Smith (2006) on the matter:

"Their approach starts with an analysis of performance, with the looking, reaching, and memory events essential to the infant’s real-time behavior in the task. The key behaviors are these: The infant watches a series of events, the toy being put into a hiding location and then covered with a lid. From this, the infant must formulate a motor plan to reach and must maintain this plan over the delay. The motor plan, necessary in any account of infants’ performance in this task in and of itself, is a 'belief' on the part of the system that objects persist in space and time. In this way, the object concept could be considered to be embedded in—not mediating between—processes of perceiving and acting." [emphasis by Smith]

This framework denies an "object concept", but look at all the terms used: watching, motor plans, memory, maintenance, events, formulate. I don't see how this is any different from the standard cognitivist framework, except that it denies object concepts (but it doesn't deny that the world is parsed into things called "events", that the infant has "motor plans", "watches" events, etc.). If this is the best a "postcognitivist" can come up with as an "alternative" approach, this is very disappointing. Greg is right that "under the hood" these approaches are identical to the computational approach.

Smith, L. B. (2006). Movement matters: the contributions of Esther Thelen. Biological Theory, 1(1), 87-89.

Andrew said...

But Linda (and especially Esther) never implemented those processes computationally in the brain. Their entire approach is dynamics. Any processes are implemented as the operation of calibrated dynamical systems, which is not a computational process (although it is one you can describe computationally, or at least mathematically).

You are essentially proving Fred's point again: reading everything as if it were computational, as if that were the only way to implement these processes. It's not.

Linda's not the most 'radical' psychologist I know. But Esther sure as hell was, and between them they were way more embodied and dynamical than computational and representational. The A-not-B analysis is about showing how a behaviour previously explained with a putative representation is actually best explained by the embodied action of a reaching infant. This 'replacement' strategy is the essence of the radical approach.
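For readers who haven't met the dynamical A-not-B work, the style of explanation can be caricatured in a few lines. This is emphatically not Thelen and Smith's actual dynamic field model; the equations, parameters, and names below are invented for illustration only. The point is that the perseverative reach falls out of transient cue activation plus accumulated motor history, with no stored "object concept" doing the explanatory work.

```python
# A drastic caricature of a dynamical account of the A-not-B error:
# activation at each reach location relaxes toward the sum of a brief
# cue and an accumulated motor habit; the reach goes wherever
# activation is highest after the delay.

def simulate_reach(cue, habit, steps=200, dt=0.05,
                   decay=1.0, cue_strength=2.0, habit_strength=1.5):
    """One trial: the cue is on briefly, then a delay; activation
    decays toward the habit-driven baseline throughout."""
    act = {"A": 0.0, "B": 0.0}
    for step in range(steps):
        cue_on = step < 40  # the hiding event is brief, then a delay
        for loc in act:
            drive = habit_strength * habit[loc]
            if cue_on and loc == cue:
                drive += cue_strength
            act[loc] += dt * (-decay * act[loc] + drive)
    return max(act, key=act.get)

habit = {"A": 0.0, "B": 0.0}
for trial in range(6):                  # repeated hiding at A builds a
    reach = simulate_reach("A", habit)  # motor habit of reaching to A
    habit[reach] += 1.0

result = simulate_reach("B", habit)
print(result)  # → A: cued at B, but after the delay the A-habit wins
```

Whether such a model is "computational under the hood", as Greg and William argue, or a genuinely different kind of explanation, as Fred and Andrew argue, is of course the very disagreement this thread leaves unresolved.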