Science Forums

Mental processing is continuous, not like a computer


Tormod


The theory that the mind works like a computer, in a series of distinct stages, was an important steppingstone in cognitive science, but it has outlived its usefulness, concludes a new Cornell University study.

 

Instead, the mind should be thought of more as working the way biological organisms do: as a dynamic continuum, cascading through shades of grey.

 

In a new study published online this week in Proceedings of the National Academy of Sciences (June 27-July 1), Michael Spivey, a psycholinguist and associate professor of psychology at Cornell, tracked the mouse movements of undergraduate students as they worked at a computer. The findings provide compelling evidence that language comprehension is a continuous process.

 

"For decades, the cognitive and neural sciences have treated mental processes as though they involved passing discrete packets of information in a strictly feed-forward fashion from one cognitive module to the next or in a string of individuated binary symbols -- like a digital computer," said Spivey. "More recently, however, a growing number of studies, such as ours, support dynamical-systems approaches to the mind. In this model, perception and cognition are mathematically described as a continuous trajectory through a high-dimensional mental space; the neural activation patterns flow back and forth to produce nonlinear, self-organized, emergent properties -- like a biological organism."

 

In his study, 42 students listened to instructions to click on pictures of different objects on a computer screen. When the students heard a word, such as "candle," and were presented with two pictures whose names did not sound alike, such as a candle and a jacket, the trajectories of their mouse movements were quite straight and went directly to the candle. But when the students heard "candle" and were presented with two pictures with similar-sounding names, such as candle and candy, they were slower to click on the correct object, and their mouse trajectories were much more curved. Spivey said that the listeners started processing what they heard even before the entire word was spoken.
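
As a rough illustration of how such trajectory curvature might be quantified (this is not Spivey's actual analysis; the measure and the sample trajectories below are made up), one could compute the maximum deviation of the mouse path from the straight line joining its start and end points:

[code]
import numpy as np

def max_deviation(trajectory):
    """Maximum perpendicular distance of a mouse trajectory from the
    straight line joining its start and end points. A direct reach to
    the target gives a value near zero; a path that bends toward a
    competing picture gives a larger value."""
    traj = np.asarray(trajectory, dtype=float)   # (N, 2) array of x, y samples
    start, end = traj[0], traj[-1]
    line = end - start
    line_len = np.linalg.norm(line)
    if line_len == 0:
        return 0.0
    rel = traj - start
    # The 2-D cross product gives the (signed) area spanned by each sample
    # and the start->end line; dividing by the line length gives distance.
    cross = rel[:, 0] * line[1] - rel[:, 1] * line[0]
    return np.max(np.abs(cross)) / line_len

# Hypothetical trajectories: one straight, one curved toward a competitor.
print(max_deviation([(0, 0), (1, 1), (2, 2), (3, 3)]))      # ~0.0
print(max_deviation([(0, 0), (1, 0.2), (2, 0.8), (3, 3)]))  # noticeably larger
[/code]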

 

"When there was ambiguity, the participants briefly didn't know which picture was correct and so for several dozen milliseconds, they were in multiple states at once. They didn't move all the way to one picture and then correct their movement if they realized they were wrong, but instead they traveled through an intermediate gray area," explained Spivey. "The degree of curvature of the trajectory shows how much the other object is competing for their interpretation; the curve shows continuous competition. They sort of partially heard the word both ways, and their resolution of the ambiguity was gradual rather than discrete; it's a dynamical system."

 

The computer metaphor describes cognition as being in a particular discrete state, for example, "on or off" or in values of either zero or one, and in a static state until moving on. If there is ambiguity, the model assumes that the mind jumps the gun to one state or the other and, if it realizes it is wrong, then makes a correction.

 

"In thinking of cognition as working as a biological organism does, on the other hand, you do not have to be in one state or another like a computer, but can have values in between -- you can be partially in one state and another, and then eventually gravitate to a unique interpretation, as in finally recognizing a spoken word," Spivey said.

 

Whereas the older models of language processing theorized that neural systems process words in a series of discrete stages, the alternative model suggests that sensory input is processed continuously so that even partial linguistic input can start "the dynamic competition between simultaneously active representations."
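
As a toy illustration of that "dynamic competition between simultaneously active representations" (this is not Spivey's actual model; the candidate words, pronunciation strings, and support values below are invented for the example), two lexical candidates can stay partially active while the input is still ambiguous and settle only once a disambiguating sound arrives:

[code]
# Two candidate words compete as phonemes arrive one at a time.
candidates = {"candle": "kændl", "candy": "kændi"}

def normalize(act):
    total = sum(act.values())
    return {word: a / total for word, a in act.items()}

def process(phonemes):
    # Both words start out equally (and only partially) active.
    act = {word: 1.0 for word in candidates}
    for i, ph in enumerate(phonemes):
        for word, pron in candidates.items():
            # A candidate gains support while the input still matches it.
            match = i < len(pron) and pron[i] == ph
            act[word] *= 1.5 if match else 0.5
        act = normalize(act)
        print(f"after '{ph}': {act}")
    return act

# "k æ n d" is consistent with both words; the final "l" resolves it.
process(list("kændl"))
[/code]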

 

Source: Cornell University


Our brains really are analog systems, not digital. This outcome would be expected. The more interesting conundrum is the mechanism for learning and memory. We can store new information very quickly, and retain it for extended periods of time. Further, new information is stored more effectively if it relates well to existing information. The neurobiology to support this learn/store/integrate function is extraordinarily complicated.


Not sure that this study actually proves what they say it does.

 

Imagine a computer with two processors. One processor spends all its time throwing out possible outcomes regarding what is going to be happening, from a few fractions of a second to a few minutes from now.

 

This stream is directed to the other processor, which is rooted securely in the "now". It doesn't try to guess what is happening in the future; it simply(!) takes the very latest info from the predictor, and the senses, and sorts them into what is actually happening.

 

With a set-up like this, the predictor is going to throw the right answer most of the time when the answer is non-confusable, and it is going to throw the wrong answer about 50% of the time when the answer is confusable. The latter, secondary "now" processor, however, corrects the incorrect prediction by direct comparison with the latest info. It can do this fast, because it doesn't have to worry about what might be.
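
Something like this two-processor arrangement is easy to caricature in code (the structure, names, and probabilities below are my own assumptions, not a claim about real neural machinery):

[code]
import random

def predictor(candidates, confusable):
    """Guesses ahead of time: right most of the time when the candidates
    are non-confusable, right only about 50% of the time when they are."""
    if confusable:
        return random.choice(candidates)
    return candidates[0]  # assume the first candidate is the obvious one

def now_processor(prediction, actual_input):
    """Lives in the 'now': compares the prediction against the latest
    sensory input and overrides it whenever the two disagree."""
    return prediction if prediction == actual_input else actual_input

candidates = ["candle", "candy"]
actual = "candle"
guess = predictor(candidates, confusable=True)
final = now_processor(guess, actual)
print(f"predicted {guess!r}, settled on {final!r}")
[/code]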

 

There would be a good amount of evolutionary pressure towards a system like this. We need to understand and predict the near future in most of life. Will that dog bite me? Will I get hit by that car? Will cracking this BBS from this IP get me caught?

 

Now, it is easy to see that some of those choices will be "obvious" to the predictor system, which lives in the next few seconds and has been tuned by years of cause and effect, and that some are not. The cracking one is not, since the delay is too great and the number of repetitions too small. The dog biting depends on several things, especially exposure to dogs. The car thing, to most people, is everyday. The ability to "see" what is about to happen is a great edge in a fight, too.

 

However, models are frequently wrong. The hypothesis has to be tested somewhere, and quickly enough that you avoid stepping out in front of the car just a bit too late.

 

Hence, we get stuck in patterns that get us killed, the minute we take our eyes (or mind) off the task at hand.

 

However, none of this is anything that a well-programmed, very, very fast (set of) computer(s) could not do.


Our brains really are analog systems, not digital.
In a sense, this is very true – brain activity involves a lot of big, specialized molecules diffusing across aqueous synaptic humors, a distinctly analog process. In another sense, the brain is surprisingly digital – a fundamental brain event, the depolarization of an axon, is distinctly all-or-nothing, binary.

 

The typical digital computer is surprisingly analog – transistor switching events involve variable flows of many electrons to provoke other variable flows of electrons, digital only because such transistors are designed to avoid persisting in transitory states. The analogy of neurotransmitters to electrons, axons to transistors, is closer than one might think.

 

I believe the language processing model described in Spivey’s study is better considered a difference between typical computer software and architecture and the brain’s, rather than an intrinsic difference between the two kinds of hardware. Like the “older models of language processing” the article describes, digital computers have been predominantly engineered and programmed to use a “series of discrete stages”, because it’s usually easier for engineers and programmers to meet goals using a hybrid state-value machine model than using more exotic “neural net” approaches. Exceptions exist, mostly in areas like complex optimization problem solving, image recognition, and voice recognition.

 

In short, I believe the statement “Mental processing is continuous, not like a computer” reflects not just the stated change in models of brain language processing, but also a superficial appreciation of what current digital computers are actually like, and what they are likely to be like in the future.


In a sense, this is very true – brain activity involves a lot of big, specialized molecules diffusing across aqueous synaptic humors, a distinctly analog process. In another sense, the brain is surprisingly digital – a fundamental brain event, the depolarization of an axon, is distinctly all-or-nothing, binary.

 

The typical digital computer is surprisingly analog – transistor switching events involve variable flows of many electrons to provoke other variable flows of electrons, digital only because such transistors are designed to avoid persisting in transitory states. The analogy of neurotransmitters to electrons, axons to transistors, is closer than one might think....

Your point about axons firing (or not) is true. But the logic in a biological system is analog, not digital. Each neuron has thousands (or orders of magnitude more) of physical connections to it. Input depolarizations to a dendrite will act in additive fashion until the total depolarization exceeds a threshold, and the axon fires. This analog summation feature has no peer in digital systems. Further, some input stimuli are actually inhibitory. Ergo the analog summation includes both stimulatory and inhibitory influences that drive the axon firing. The axon depolarization then contributes to the next analog summation in the next postsynaptic neuron.
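
A minimal integrate-and-fire sketch of that summation (threshold, leak, and input values below are arbitrary) shows the two regimes side by side: the excitatory and inhibitory inputs add continuously, but the "axon" fires all-or-nothing only when the running sum crosses the threshold.

[code]
def simulate_neuron(inputs, threshold=1.0, leak=0.9):
    """Analog summation of excitatory (+) and inhibitory (-) inputs on a
    dendrite, with an all-or-nothing axon firing when a threshold is crossed."""
    potential = 0.0
    spike_times = []
    for t, drive in enumerate(inputs):
        potential = potential * leak + drive   # continuous, leaky summation
        if potential >= threshold:
            spike_times.append(t)              # discrete, binary firing event
            potential = 0.0                    # reset after the spike
    return spike_times

# Positive values are excitatory inputs, negative values inhibitory.
print(simulate_neuron([0.3, 0.4, -0.2, 0.5, 0.6, -0.8, 0.2, 0.9]))  # [4]
[/code]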

 

The only way to see this kind of analog processing in a digital system is to look at high-level constructs at the programming level. The processing at the base level is purely Boolean, and natively digital.


Does continuous imply chaotic?

 

Rightly or not, an inference one might draw from the headline “Mental processing is continuous, not like a computer” is that this difference poses an insurmountable hurdle to the simulation of mental processing on a digital computer. My previous post was meant to refute that idea, without directly confronting it, as I am here.

 

Some background history: There are commonplace analog phenomena that cannot be successfully simulated on a computer. Most famous among these is the weather. For much of the first half of the 20th century, most technologists, notably the legendary John von Neumann, believed that successively more accurate simulations of the atmosphere would allow it to be precisely forecast many months into the future. In the 1960s, this belief was overturned when work on atmospheric modeling by such folk as Edward Lorenz revealed a category of problems that would come to be formalized under the moniker “Chaos Theory.”

 

The reason chaotic systems cannot be usefully simulated is that small inaccuracies in measuring the initial state of the system to be modeled yield large differences between the future states of the simulated and actual systems. No improvement to the model can eliminate the difference: unless the initial state measurement is virtually perfect, even a perfect model of a chaotic system becomes decreasingly accurate in its predictions with the passage of time.
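
A standard textbook demonstration of this point (using the logistic map rather than an atmospheric model) is easy to run: two trajectories whose initial states differ by one part in a million diverge completely within a few dozen steps, no matter how faithful the model itself is.

[code]
def logistic(x, r=4.0):
    """One step of the logistic map, a simple chaotic system at r = 4."""
    return r * x * (1.0 - x)

actual, simulated = 0.200000, 0.200001   # initial states differ by 1e-6
for step in range(1, 41):
    actual, simulated = logistic(actual), logistic(simulated)
    if step % 10 == 0:
        print(f"step {step:2d}: actual={actual:.6f}  simulated={simulated:.6f}")
[/code]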

 

The key question then becomes: does “mental processing is continuous” imply “mental processing is chaotic”? If so, it joins the weather in the category of things we can’t hope, even in principle, to simulate.

 

At first analysis, the answer seems to be yes. Like the atmosphere, the brain has a lot of moving molecules, electrons, etc. It’s hard to imagine, and likely provably impossible to invent, a scheme to measure its state to virtual perfection.

 

I’ve reached the opinion, however, that the correct answer is no. In the same way that a digital computer reduces a potentially chaotic system of moving electrons and changing magnetic fields into discrete binary bits to produce a system with a state that can be perfectly measured as a single large integer, I believe the brain’s scheme of inhibitory and excitatory chemical inputs producing discrete nerve firings makes it, in principle, measurable and simulatable.

 

I attempt further exploration of this question in the Science Forums>Computers and Technology>Upload your mind into a computer by 2050? thread.
