Our Brains Don't Work Like Computers
Roland Piquepaille writes "We've been using computers for so long now that I guess many of you think our brains work like clusters of computers. Like them, we can do several things 'simultaneously' with our 'processors.' But each of these processors, in our brain or in a cluster of computers, is supposed to act sequentially. Not so fast! According to a new study from Cornell University, this is not true, and our mental processing is continuous. By tracking the mouse movements of students working at their computers, the researchers found that our learning process is similar to that of other biological organisms: we're not learning through a series of 0's and 1's. Instead, our brain is cascading through shades of grey."
comparisons (Score:5, Insightful)
Fuzzy Networks (Score:3, Insightful)
-1, Roland Piquepaille (Score:3, Insightful)
Re:Hmm... (Score:3, Insightful)
Computers can process "shades of gray" (Score:3, Insightful)
Re:Fascinating (Score:3, Insightful)
Yep (Score:2, Insightful)
I guess some brains just have more contrast than others...
Re:really?!? (Score:3, Insightful)
Misleading (Score:5, Insightful)
The Slashdot headline says our minds don't work like computers, end of sentence.
Had TFSH (The Fine Slashdot Headline) been accurate, this would've been a mind-blowing result and in need of some extraordinarily strong evidence to support such an extraordinary claim. The question of whether the human mind--sentience, consciousness, and all that goes with it--is a computable process is one of the most wide-open questions in AI research right now. It's so wide-open that nobody wants to approach it directly; it's seen as too difficult a problem.
But no, that's not what these guys discovered at all. They just discovered the brain doesn't discretize data. Significant result. Impressive. I'd like to see significant evidence. But it's very, very wrong to summarize it as "our brains don't work like computers". That's not what they proved at all.
Just once, I'd like to see a Slashdot editor read an article critically, along with the submitter's blurb, before posting it.
Re:-1, Roland Piquepaille (Score:3, Insightful)
wonder if he's giving kickbacks to samzenpus for posting his stuff.
Re:comparisons (Score:5, Insightful)
Re:-1, Roland Piquepaille (Score:4, Insightful)
Both are computationally complete so WHO CARES? (Score:2, Insightful)
Re:really?!? (Score:5, Insightful)
Yes, that was sarcasm!
Re:really?!? (Score:2, Insightful)
Evolution (Score:5, Insightful)
That makes perfect sense, seeing as our brains evolved [talkorigins.org] from other biological organisms.
Check out evolutionary psychology [wikipedia.org] for some information. You'll view the world differently afterwards.
Evolutionary psychology (or EP) proposes that human and primate cognition and behavior could be better understood by examining them in light of human and primate evolutionary history... The idea that organisms are machines that are designed to function in particular environments was argued by William Paley (who, in turn, drew upon the work of many others).
Universality of computation (Score:5, Insightful)
Some people ascribe the seeming magic of consciousness to some ineffable property of the brain, e.g., a quantum mechanical effect, while others insist that it's just what happens when you connect enough simple elements in a self-adaptive network.
The question is, are there neural input-output functions that are fundamentally not computable? If not, then a digital computer will, someday, reach human brain power (assuming Moore's law continues).
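One way to make that computability question concrete: any continuous, bounded input-output curve can be approximated digitally to arbitrary precision. A minimal sketch in Python, where the sigmoid is just an invented stand-in for a neural response curve, not anyone's actual model:

```python
import math

def neuron_response(x):
    """Hypothetical continuous neural input-output curve (a sigmoid stand-in)."""
    return 1.0 / (1.0 + math.exp(-x))

def quantized(x, bits):
    """Digitally approximate the response using 2**bits discrete output levels."""
    levels = 2 ** bits
    return round(neuron_response(x) * (levels - 1)) / (levels - 1)

# Worst-case error over a sample grid shrinks as digital precision grows.
for bits in (4, 8, 16):
    err = max(abs(neuron_response(x / 10) - quantized(x / 10, bits))
              for x in range(-100, 101))
    print(bits, err)
```

This is the unexciting half of the argument; whether the brain's functions are *all* of this continuous, computable kind is exactly the open question.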
Re:huh? (Score:3, Insightful)
Your brain is composed of billions of individual processing units. Each of those processing units may be sort of like a stream processor (like in Cell), in that they take inputs, perform a computation, and then fire out an output (although I don't know if anyone's even determined that conclusively). However, your brain is composed of billions of those linked together in very complex ways.
Suggesting that your brain only works on one item at a time is rather naive. It is most certainly doing many things at a time.
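As a toy illustration of that picture, here is a sketch of such units in Python; the weights and firing threshold are invented for the example, not measured from anything biological:

```python
def unit(inputs, weights, threshold=1.0):
    """One toy processing unit: weighted sum in, binary spike out."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

# A "layer" of units all seeing the same inputs -- conceptually in parallel,
# even though this loop evaluates them one at a time.
inputs = [0.9, 0.2, 0.7]
layer = [[1.0, 0.0, 0.5], [0.1, 0.1, 0.1], [0.0, 2.0, 0.0]]
outputs = [unit(inputs, w) for w in layer]
print(outputs)  # -> [1, 0, 0]
```

The brain's version of this involves billions of units and feedback loops; the sketch only shows the input-compute-output shape of a single layer.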
Fuzzy Networks-Side by side. (Score:1, Insightful)
Re:Misleading (Score:4, Insightful)
I don't see how that's at all possible given the underlying physical process. As the voltage, or frequency, or whatever carries the "signal" traverses a synapse, at some level nature itself quantizes it. There has to be a point where one level of the signal is distinguished as discrete from another. One electron more or less, one Hz more or less. . . The question is, how consistent is the hardware at distinguishing the signal differences as discrete? I'm guessing that neurons probably aren't as sensitive as a purpose-designed piece of silicon could be. But maybe that inconsistency is a crucial part of the characteristics of data processing in biological nervous systems - those characteristics being what distinguishes them from technological systems. . . ?
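The quantization point can be sketched concretely. Below is a toy, idealized "ADC" in Python; the step size is invented for illustration and stands in for nature's "one electron more or less":

```python
def quantize(signal, step=0.1):
    """Idealized ADC: snap a continuous level to the nearest discrete step."""
    return round(signal / step) * step

# Two levels closer together than one step become indistinguishable.
a, b = 0.42, 0.44
print(quantize(a) == quantize(b))  # both snap to the same step -> True
```

The open question in the comment is what the effective `step` of a neuron is, and how consistent it stays from moment to moment.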
Brain vs. Mind (Score:5, Insightful)
Natural numbers (1,2,3...), true/false, up/down...
It's not unnatural to divide everything in half; heck, our bodies are mostly symmetrical. The distinction comes in where the dividing line is.
We can weight our decisions in endless ways: if someone makes a statement, our belief in it depends on how many times we have heard it, our trust in the speaker, and whether it meshes with known facts in the current context.
What I wonder is how far a human mind can be pushed in terms of the concepts it can grasp and the control it has. Can a human visualise a 5-dimensional virtual object? Control emotional responses without suppressing them? Hold multiple contradictory world models? Accelerate long-term memory access?
Even if you think of an electronic computer, it's just hordes of electrons rushing down pathways, only reliable because the voltage levels are continually refreshed at each step, a few electrons might wander off the path, but they are replaced at the next junction. Quantum Mob Rule.
How does the mind emerge from the brain? (Score:4, Insightful)
We look around our world and notice that computers are superficially similar to brains (e.g. they can both do math), so we hypothesize that they work similarly.
However, there's very little hard evidence supporting this hypothesis in the first place, so there's no "news" in this story.
Bottom line: The brain is not just a super-powerful computer.
Pretty Please (Score:4, Insightful)
Could we pretty, pretty please have a Roland Piquepaille section, so we can opt-out? I've been good all year, and it's almost my birthday, and I won't ask for anything for Christmas.
-Peter
Re:The Network is the Computer (Score:3, Insightful)
Re:huh? (Score:3, Insightful)
"Thus you cannot say human brain does parallelistic operations at the same time"
Unless of course you want to be factually accurate.
So essentially... (Score:3, Insightful)
Hm, duh?
In all seriousness though, I wonder how the curvature of the mouse path shows gravitation to one side versus the other; maybe they're just a quake2 player and enjoy circle-strafing.
Re:The brain is not a computer (Score:3, Insightful)
Consider that with all our signal processing techniques, a computer can't easily (despite what "CSI" says)...
Some people look on the destructive power of the elements or the vastness of space as humbling, but the intricate complexity of the brain is just as impressive, IMHO.
Simon.
Re:May I Be the First ... (Score:2, Insightful)
Ya, people have known this for quite some time. (Score:3, Insightful)
I hope no one was using this research to acquire a PhD or MS. The "brains are not computers" epiphany has been realized about a billion times already. And this research could stand to be much deeper.
I'm a little bummed about the shallow linguistics analysis. It's interesting and all, but I wish they had really jumped into something such as pattern recognition.
I'm an interactive designer, and I tend to believe that language and interaction are based upon pattern recognition. Our brains receive data and compare them to flexible patterns in order to make decisions. This study certainly supports that theory.
In this case, if you show a candle and a dog to a user and tell the user to click on the candle, the user will jump directly to the candle, since a dog does not fit the pattern of a candle at all...both visually and verbally. However, if you present someone with a picture of a candy stick and a candle, they will hesitate before selecting the candle, since the two bear verbal and visual similarities. More processing time is needed to compare intricacies.
People probably slow down and curve their mouse movement since they are still comparing patterns while they are selecting. By curving the track path, users increase tracking distance and cognitive processing time. It also affords them a circular motion which can easily translate into a last-minute decision change. When people are unsure of things, they usually prepare themselves for backing out.
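One rough way to quantify that curving (a sketch only; the trajectories below are made-up stand-ins for real mouse tracks) is the maximum deviation of the path from the straight line between start point and target:

```python
def max_deviation(path):
    """Max perpendicular distance of a 2-D path from its start-to-end chord."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    dx, dy = x1 - x0, y1 - y0
    length = (dx * dx + dy * dy) ** 0.5
    return max(abs(dx * (y0 - y) - dy * (x0 - x)) / length for x, y in path)

straight = [(0, 0), (1, 0.05), (2, 0.0), (3, 0.05), (4, 0)]  # "dog vs. candle"
curved   = [(0, 0), (1, 0.8), (2, 1.1), (3, 0.7), (4, 0)]    # "candy vs. candle"
print(max_deviation(curved) > max_deviation(straight))  # True
```

A bigger deviation means the cursor spent more of its trip gravitating toward the competing choice, which is roughly the signal the Cornell study looked for.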
damn I'm a geek
understanding the brain (Score:3, Insightful)
Re:comparisons (Score:3, Insightful)
On a different note, I think from the article it's unclear whether they mean to say that the brain is not like a modern digital computer with RAM and hard disks etc. (which is most definitely correct) or whether they're trying to say something as silly as that a brain couldn't be modeled by an ideal Turing machine (I think it's a fact that any given physical system could be modeled by a Turing machine, though I could be wrong).
Re:The brain is not a computer (Score:5, Insightful)
The point under discussion in this article is summed in this quote:
"More recently, however, a growing number of studies, such as ours, support dynamical-systems approaches to the mind. In this model, perception and cognition are mathematically described as a continuous trajectory through a high-dimensional mental space; the neural activation patterns flow back and forth to produce nonlinear, self-organized, emergent properties -- like a biological organism."
The goal is to forcefully point out (using an experiment) that one way we think about mental processing, the digital computational model, is not very useful even at the trivial level of mental signal processing.
It's interesting how all the sarcastic comments about the "biological organism" reference completely miss the point. The point is that the signal is being processed in a way that could be modeled by the way a biological organism moves through space. It sniffs here, then there, then jumps to the solution. The signal processing itself exhibits emergent properties.
The reference to the dynamical system (http://en.wikipedia.org/wiki/Dynamical_system [wikipedia.org]) is key. (I think people frequently overlook the additional "al" and think this refers to some sort of generic "dynamic system".) Dynamical systems, although deterministic, are a foundational tool for developing chaos theory.
For me the interesting idea is that the default state of thought is in-betweeness. We stay jittering back and forth in an unresolved state until, suddenly, we aren't.
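That "jittering until suddenly we aren't" can be sketched as a toy one-dimensional dynamical system: two attractors at +1 and -1, with noise holding the state unresolved near 0 until it tips into one basin. All constants here are invented for illustration:

```python
import random

random.seed(0)
x = 0.0  # start exactly in between the two interpretations
for step in range(2000):
    # dx/dt = x - x**3 has stable fixed points at -1 and +1, unstable at 0.
    x += 0.05 * (x - x ** 3) + random.gauss(0, 0.02)

print(abs(x) > 0.5)  # True: escaped the unresolved middle toward one attractor
```

Near x = 0 the deterministic pull is weak and the noise dominates (the jittering); once the state drifts far enough, the cubic term takes over and it resolves.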
Nothing new... (Score:2, Insightful)
Re:The brain is not a computer (Score:3, Insightful)
A brain, for the same input will have different outputs. Try asking your wife or gf if they are in the "mood". Will you get the same answer all the time? The connections in the brain constantly rewire themselves hence it CANNOT be a function.
Erm... what about rand()? fread()? time()?
When you consider that the question you posed to your SO is fairly high-level, what about has_new_mail()? "SELECT count(*)"?
Computers only return the same value from a function if they're in the same state. The only difference there is that we can set the state in a computer. We can't load and save timestamped personalities/feelings/memories/etc. with people.
If we could, you'd likely find (IMHO) that the "function" of your SO is fixed also.
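The state point is easy to demonstrate in code: a "function" over hidden state answers the identical question differently each call, and becomes perfectly repeatable the moment you can save and restore that state. The Mood class and its method are invented for the example:

```python
import random

class Mood:
    def __init__(self, seed):
        self.rng = random.Random(seed)  # the hidden internal state

    def in_the_mood(self):
        # Same question every time, but the RNG state advances on each call.
        return self.rng.random() > 0.5

a = Mood(seed=42)
answers = [a.in_the_mood() for _ in range(5)]
print(len(set(answers)) > 1)  # True: same input, varying output

b = Mood(seed=42)  # "restore the saved state"
print([b.in_the_mood() for _ in range(5)] == answers)  # True: identical replay
```

The un-replayable part of people is exactly that we can't do the `Mood(seed=42)` step on a brain.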
Re:comparisons (Score:5, Insightful)
That's wrong. Godel's Theorem shows that there exist true propositions that are unprovable -- by humans or computers. It doesn't say humans can "demonstrate" them better than a machine. At best, it shows you can "guess" a proposition (and wave your hands to make it seem plausible) and no one is able to DISprove it, but not that a human could "demonstrate" its truth when a machine couldn't. A mathematical proof is purely logical, and computers can verify and generate these proofs, if not yet as elegantly as humans.
Re:comparisons (Score:5, Insightful)
Unlikely. First, what they are saying here is that there is no clock. The brain is fundamentally analog in both state and TIME. To "simulate" it using computer algorithms would likely require finely stepped integrators for every connection of every neuron and every chemical pathway. Even the modeling of the blood flow and its nutrients is likely critical to a successful simulation of the thought process in some way. It's not at all like a normal computing problem. It's more like computing physics. We'd need processors like the new PhysX chip, though vastly more sophisticated. I'm thinking that a high-fidelity simulation of all of the connections of a single neuron in real time would likely take a full chip.
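For a feel of what "finely stepped integrators" means, here is a minimal Euler-stepped leaky integrate-and-fire neuron, one of the cheapest neuron models there is; real neurochemistry is far richer, and all the constants below are illustrative:

```python
def lif_neuron(input_current, dt=0.0001, tau=0.02, threshold=1.0, t_end=0.1):
    """Leaky integrate-and-fire: dv/dt = (-v + I) / tau, spike and reset at threshold."""
    v, spikes, t = 0.0, 0, 0.0
    while t < t_end:
        v += dt * (-v + input_current) / tau  # one tiny Euler integration step
        if v >= threshold:
            spikes += 1
            v = 0.0
        t += dt
    return spikes

# Stronger drive -> higher firing rate, at the cost of thousands of steps
# per neuron per simulated second -- times billions of neurons.
print(lif_neuron(1.5) > lif_neuron(1.1))  # True
```

Even this cartoon needs 10,000 steps per simulated second per neuron, which is the commenter's point about the cost scaling.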
Furthermore, there is no evidence that we'll even be close to understanding how to teach the simulation if we created it. I'd put better odds on the creation of some sensing technology that could fully map the physical connections and the electrochemical state of every neuron and other component involved in thought (does anyone really think we know all of the components?). And I'd still place those odds very low.
And what if we could simulate it... should we? It is likely that we'd create many insane intelligences in the process, either because we didn't duplicate the processes closely enough, didn't put in all of the instinct portions of the brain that actually have much more to do with true intelligence than the thinking portions, didn't provide the inputs that they were designed to have, or tried to improve on an analog machine with a complexity level far beyond modern math's ability to balance. And, whether or not it's true, many would call them life. Turning them off would likely be considered the same as killing them. The ethical dilemmas that would come about are tremendous.
Re:comparisons (Score:3, Insightful)
Godel's theorems CANNOT be used to prove that the brain is smarter than the computer - in fact, human brains are ALSO governed by the theorem.
Please do a search on "Emperor's New Mind" and "Shadow of the Mind", and challenge yourself to find the known flaws in them.
Re:Misleading (Score:3, Insightful)
However, their experiment did not look closely enough to pick out the jaggies.
Someone can write a computer program that behaves the same way as the experiment subjects. Now what can they conclude?
Looks like another example of Cargo Cult science.
Heh. You're funny (Score:3, Insightful)
No, sorry. The world doesn't revolve around you or your hobbies. There _are_ plenty of jobs for which the computer isn't the important part. It's not what makes them money.
E.g., for a lawyer it's a better investment of their time to study the laws and precedents than to learn networking protocols. E.g., when you need surgery, better hope that surgeon spent their time becoming a better surgeon instead of becoming a networking expert. Etc.
For most jobs the computer isn't even as necessary as you'd think. It's at best "nice to have", but not justifying investing months into learning IT and networking protocols.
E.g., it's nice for a lawyer or doctor to have the client files on a computer instead of looking through a filing cabinet. But it's not as essential as you'd think. If you expect him/her to spend months becoming a computer expert, for something that saves him/her _maybe_ an hour per week, you need to put down the crack pipe. Then the computer would actually waste their time instead of saving them anything.
Here's another idea for you: you are there and are getting those calls not from "idiots" but from what are basically victims of a scam. All the "computers are easy", "wireless networking is easy" or "connecting through our ISP is so easy that grandma could do it" ads are actually marketing scams.
Computers are nowhere near that easy yet, not without investing some significant time. But if your employer actually told those people "sorry, folks, it's only for IT gurus. Spend some time becoming an IT pro and growing a goatee, and then it'll be for you", then they'd lose business. Then, see above, you'd be surprised for how many people the computer isn't _that_ important.
So your employer, and a bunch of others, lied to those people to get their money. There's a name for that. It's called "fraud".
And now those people merely expect your employer to live up to those fake claims. They were explicitly told that they'll just plug it in and be online, so it's _not_ unreasonable for them to actually expect it to work like that.
Because that's how any other industry works. If a car manufacturer told you "this model reaches 60mph in 8.9 seconds", you'd damn well expect it to live up to those expectations. You'd expect that after 8.9 seconds, that car damn better be at 60mph.
Same here. If your employer told them "just pop in this CD and you'll be online in less than 1 minute", they expect that after 1 minute they damn better be online and surfing.
That's why you get those calls. Because those people expect your employer to live up to some very explicit claims.
And now for something nasty (Score:4, Insightful)
But what cracks me up is that the most arrogant assholes are the ones with the least skill or achievement. When you see someone harping the most about how he's uber-L33T because he knows what an IP address is, and how everyone else is an idiot... chances are it's someone who actually knows the _least_ about those. Chances are it's not a programmer who actually writes socket code, it's not a hardware engineer who's designed a network card, etc. No siree, it's a script-reader from the hell-desk that does the "I'm so l33t and everyone else is an idiot" fuss.
So you want to call people idiots if they don't know some computer trivia you know (off a list of canned answers)? Well, then being an EE and having some 20+ years of programming experience, I'll call _you_ an idiot, because you're below _my_ skill level.
Sure, you know what an IP or port number is or how to find it out in Windows. (Or can find it out on your list of canned answers.) But can you actually _use_ a socket on that port? Can you for example write a game server that listens on that port? If I gave you an old network card, can you find the right Linux kernel driver and change it to make it work with that card? Or what?
Or, ok, you do know what an IP address is. Congrats. Do you also know what a B-Tree is, how it works, and how to implement one in your code? Do you also know the difference between, say, MergeSort and QuickSort, and the influence of external (e.g., DB file on a disk) vs internal (in RAM) sorting on their performance? Can you implement either purely as, say, a state-machine driven by exceptions to signal state changes, just to prove that you actually understand the algorithm, as opposed to copying someone else's code off the net? Do you know the difference between bitmap indexes and b-tree indexes in Oracle, and can discuss when you might need one instead of the other?
Hey, it's computer stuff too. Very basic stuff too, nothing esoteric. We established already that computer stuff matters, and you're an idiot if there's something you don't know about them.
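For what it's worth, the MergeSort named above fits in about a dozen lines; a standard textbook sketch:

```python
def merge_sort(xs):
    """Classic top-down merge sort: O(n log n), stable, and with the sequential
    access pattern that makes it suit external (on-disk) sorting better than
    quicksort's random access."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```

The exception-driven state-machine variant the commenter dares people to write is left as the exercise it was meant to be.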
Re:comparisons (Score:5, Insightful)
1. You can't derive the arithmetic of the natural numbers from it.
2. There is at least one true proposition that isn't a theorem in the system (i.e. it's incomplete, hence the name of Goedel's theorem).
3. The system isn't consistent.
(3) renders a deductive system worthless, and (1) renders it pretty weak, so one can hope at best for (2).
Note that nothing is said about humans versus machines, and there's no reason that humans aren't as subject to it as programs.
Example, which I think I read about in GEB (but customized for the current discussion): "lawpoop cannot consistently assert this proposition." Clearly that is a true statement. (Yes, it's silly, but Goedel's theorem goes through a lot of work to generate an arithmetic encoding of "This statement is not provable in deductive system S," which is much the same sort of statement.) Sorry, but there's nothing magic about humans.
Re:OH MY GOD (Score:2, Insightful)
No, it's not. I can easily write a program that solves the halting problem for certain special cases, for instance for Turing machines without "loops".
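A sketch of that special case, using a toy instruction format invented for the example (here a "loop" can only arise from a backward jump):

```python
def halts(program):
    """Decide halting for a toy machine whose only control flow is ("JMP", target).
    With no backward jumps the program is loop-free, so it must halt; otherwise
    we refuse to decide rather than risk being wrong."""
    for pc, instr in enumerate(program):
        if instr[0] == "JMP" and instr[1] <= pc:
            return None  # a backward jump could form a loop: undecided
    return True  # straight-line or forward-jump-only code always halts

print(halts([("ADD",), ("JMP", 3), ("SUB",), ("HALT",)]))  # True: forward jumps only
print(halts([("ADD",), ("JMP", 0)]))                       # None: can't decide
```

This is exactly the shape of the commenter's point: the halting problem is undecidable in general, but plenty of restricted cases are trivially decidable.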
same old story (Score:3, Insightful)
Re:And now for something nasty (Score:3, Insightful)
It's like any complex problem where it seems easy until you look into it. The more you understand about it, the more you realise how little you understand.
Me? I know that I know nothing at all - so I must be the wisest guy alive *grin*.