Wednesday, November 12, 2014

Computation at the neuron level -- where noncomputational embodied theories need to start

It seems that some embodied theorists see no need for computation or perhaps even information processing.  Rather than talking about, say, how interaural time difference (ITD) information can be used to compute spatial location, some embodied theorists want to say that spatial location is "perceived directly" given the physical signal as it passes through body-determined channels.  The brain is thought to bring little to the task: the physical signal is not transformed but registers directly in neural systems.

These theorists have spent a fair amount of time talking about the body--the movement is called "embodiment" after all--but little time talking about what's going on at the neuronal level.  I say, point well taken with respect to the contribution of the body: you don't get ITDs in the first place without two ears and a head in between.  But I also say, it is time for embodied theorists to look at the next step in the "registration" of those physical signals: the function of individual neurons. (Actually this is the second step, the first being transducer organs such as the cochlea and photoreceptor cells).  Physical signals must be passed through neurons, which exhibit a complex relation between input and output.  Some would even go so far as to say neurons are transforming the signal, i.e., computing. Here's a quote that gives a sense of what's going on at the single neuron level:
Neurons take input signals at their synapses and give as output sequences of spikes. To characterize a neuron completely is to identify the mapping between neuronal input and the spike train the neuron produces in response. In the absence of any simplifying assumptions, this requires probing the system with every possible input. Most often, these inputs are spikes from other neurons; each neuron typically has of order N ~ 10^3 presynaptic connections. If the system operates at 1 msec resolution and the time window of relevant inputs is 40 msec, then we can think of a single neuron as having an input described by a ~ 4 x 10^4 bit word—the presence or absence of a spike in each 1 msec bin for each presynaptic cell—which is then mapped to a one (spike) or zero (no spike). More realistically, if average spike rates are ~10 s^-1 the input words can be compressed by a factor of 10. In this picture, a neuron computes a Boolean function over roughly 4000 variables.  (Aguera y Arcas et al., Neural Computation 15, 1715–1749, 2003)
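The arithmetic in the quote is easy to reproduce; the sketch below just recomputes it, with the constants taken directly from the passage:

```python
# Back-of-envelope numbers from the quoted passage (Aguera y Arcas et al. 2003).
N_PRESYNAPTIC = 10**3   # presynaptic connections per neuron
BIN_MS = 1              # temporal resolution: 1 msec bins
WINDOW_MS = 40          # time window of relevant inputs

# One bit per (cell, bin) pair: presence or absence of a spike.
input_bits = N_PRESYNAPTIC * (WINDOW_MS // BIN_MS)
print(input_bits)  # 40000, i.e. ~4 x 10^4

# At average rates of ~10 spikes/s, spikes are sparse, and the authors
# estimate the input word compresses by a factor of 10:
effective_variables = input_bits // 10
print(effective_variables)  # 4000: "a Boolean function over roughly 4000 variables"
```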
If you want neuroscientists and good old fashioned cognitive scientists (GOFCS) to take you seriously, build some "embodied" models of whatever process you are interested in and let's see how far you get without transforming the information (and therefore morphing into a GOFCS).  For now, we don't see how you can get past even a single neuron without information processing, which renders your more fundamental claims pretty much vacuous.


Andrew Wilson said...

Direct perception is not an argument that the brain resonates directly with an information variable and does not do anything to it. Direct perception is the argument that detecting a variable that lawfully specifies a dynamical property of the world is all that's required to say you perceive that dynamical property. Clearly there is transduction, etc.; visual information is in light and there isn't light bouncing around in my visual cortex. The issue remains, however: what is the best way to conceptualise that neural activity and what it's doing?

tim faber said...

Hi all! I was wondering in what way direct perception can relate to the idea of predictive coding. I guess this theory relies strongly on internal representations and discrepancies between visual input and existing models of the same input. With predictive coding there is not so much a problem with perception being seen as suboptimal, but rather that top-down processes fill in or strongly interact with bottom-up processes.

Greg Hickok said...

I think the radical embodieds will have to deny predictive coding (too much of a contribution by the brain for their liking), which flies in the face of much empirical evidence. Andrew?

Andrew Wilson said...

I actually don't know enough about predictive coding to answer that question, although it's on my list to learn about at some point. My one general thought is that just because it's called 'predictive coding' by people who work within an information processing, representational framework doesn't mean that's what it is :)

Max Baru said...

@Andrew: I don't understand the statement "Direct perception is the argument that detecting a variable that lawfully specifies a dynamical property of the world is all that's required to say you perceive that dynamical property."

If, as you say, transformation is required, then I don't see how perception is entailed by detection and detection alone.