
Traffic Jams In Your Brain

An anonymous reader writes "Carl Zimmer's latest foray into neuroscience examines why the brain can get jammed up by a simple math problem: 'Its trillions of connections let it carry out all sorts of sophisticated computations in very little time. You can scan a crowded lobby and pick out a familiar face in a fraction of a second, a task that pushes even today's best computers to their limit. Yet multiplying 357 by 289, a task that demands a puny amount of processing, leaves most of us struggling.' Some scientists think mental tasks can get stuck in bottlenecks because everything has to go through a certain neural network they call 'the router.'"
This discussion has been archived. No new comments can be posted.

  • That was easy! (Score:3, Insightful)

    by anss123 ( 985305 ) on Saturday November 20, 2010 @07:09AM (#34290350)
    103173

I don't see what's so hard about opening a calculator and typing some numbers.

    Kids these days!
  • by Anonymous Coward on Saturday November 20, 2010 @07:12AM (#34290360)

Doing complex multiplication isn't inherently necessary to staying alive. Being able to discern who is who is.

  • by h4rm0ny ( 722443 ) on Saturday November 20, 2010 @07:16AM (#34290372) Journal
I don't think it's processing power or inability at all. I think it's a lack of working memory. We can all work out 357 multiplied by 289 easily with pencil and paper. Very easily. And we could do it in our heads just as well if we could casually remember all the intermediary stages: e.g. 9 times 7 is 63, 9 times 50 is 450, 9 times 300 is 2,700, sum all three numbers and remember the result, now begin with 80 times... etc. But it's not easy for most people to do that. The computation is easy. But we need more registers.
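
    To make the register pressure concrete, here's a minimal Python sketch (purely illustrative, not from the comment) that multiplies the schoolbook way and counts every intermediate value a person would have to hold:

        def schoolbook_multiply(a: int, b: int) -> int:
            """Multiply as on paper, recording each value that would
            occupy working memory along the way."""
            partials = []
            total = 0
            for i, db in enumerate(reversed(str(b))):      # digits of b, low to high
                for j, da in enumerate(reversed(str(a))):  # digits of a, low to high
                    p = int(da) * int(db) * 10 ** (i + j)  # one single-digit product
                    partials.append(p)
                    total += p
            print(f"{a} x {b} = {total} via {len(partials)} partial products")
            return total

        schoolbook_multiply(357, 289)   # 357 x 289 = 103173 via 9 partial products

    Nine partial products plus a running total, before you even count keeping your place in the procedure - well past the usual seven-item limit.
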
  • by ultranova ( 717540 ) on Saturday November 20, 2010 @07:23AM (#34290402)

    Couldn't it just be that we do not really have direct access to the raw computational capacity of the brain?

    Probably. You can scan a crowd because you have a hardware-level implementation for that; you can't multiply efficiently because that has to go through multiple levels of emulation, at least one of which has a severe lack of reliable memory.

We shouldn't forget that abstract thought is actually a very new evolutionary hack; we've only had a real culture for 10,000 years or so. Before that, it was cave paintings for 100,000 years. You can't expect a very experimental feature to be thoroughly optimized yet.

  • by satuon ( 1822492 ) on Saturday November 20, 2010 @07:28AM (#34290424)

I think it is because the brain is at heart an analog machine rather than a digital one. Multiplying integers, however, isn't a task well suited to analog machines.

  • by Rich0 ( 548339 ) on Saturday November 20, 2010 @08:18AM (#34290522) Homepage

    Uh, when I was taught math in elementary school, concepts similar to the Mayan depiction were often used. The only difference I see is that this was all done in base 10 and not in a hybrid of base-5 embedded in base-20.

    I'm not really sure what you're getting at. Sure, you can represent numbers as shapes and sizes, but I don't see how this really helps mental math except when it comes to order-of-magnitude calculations.

    If I want to multiply 357x289, I can already tell you that the answer is somewhere around 90000. The challenge comes if I want to know the answer to more than 1-2 significant figures. I don't see how using something like the Mayan system or any other system is going to accomplish this.

In any case, I'm not even sure what problem you're trying to solve. The average person can do math well enough to get by in the real world. Sure, it would be nice to be able to walk down the aisle at the grocery store and figure out the per-unit prices in my head to 3 sig figs, but I don't see anything you're offering as accomplishing this. If I'm going to do a model simulation run, I'm going to use a computer, and that requires almost zero mental effort around performing calculations - just a TON of creativity and analysis creating the model/etc.
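
    The estimation step is easy to make concrete. A small Python sketch (illustrative; round_sig is my helper, not a library function):

        from math import floor, log10

        def round_sig(x: float, figs: int) -> float:
            """Round x to the given number of significant figures."""
            return round(x, figs - 1 - floor(log10(abs(x))))

        exact = 357 * 289                                  # 103173
        for figs in (1, 2):
            est = round_sig(357, figs) * round_sig(289, figs)
            err = 100 * (est - exact) / exact
            print(f"{figs} sig fig(s): {est:.0f}, off by {err:+.1f}%")
        # 1 sig fig(s): 120000, off by +16.3%
        # 2 sig fig(s): 104400, off by +1.2%

    One significant figure is mental-math cheap; each additional figure is where the working-memory cost piles up.
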

  • Re:That was easy! (Score:3, Insightful)

    by roman_mir ( 125474 ) on Saturday November 20, 2010 @08:46AM (#34290600) Homepage Journal

I don't know about that; it does make some sense. It allows us to use visual memory to do calculations (like when we play chess without the board). I actually can imagine an abacus; it's nearly the same as imagining hands and fingers, and it's easy to use that to do binary, by the way.

But in the case of my grandmother, she remembered a LOT of numbers just like that because, you know, decades of experience all around numbers.

  • Re:Pseudoscience? (Score:3, Insightful)

    by goodmanj ( 234846 ) on Saturday November 20, 2010 @09:14AM (#34290712)

Under the assumption that those recognition tasks are inherently memory-intensive, the brain has to have similar amounts of memory at its disposal.

    I question the assumption you're making. The nervous system is not a computer in any useful sense: its elemental storage is not in bits, and its elemental operations are not bit logic. To compare its "specs" with a digital computer is to compare apples and oranges.

    Example: pitch recognition. How does a computer recognize the pitch of a sound? An incoming audio signal is converted by an analog-to-digital converter and stored as a long string of numbers in memory. A Fourier transformation algorithm is performed to transform this into pitch-vs-amplitude data. The human ear can do the same thing: can we draw conclusions about the ear's memory storage, CPU speed, and analog-to-digital converter specs by the comparison? No, because the human ear doesn't work that way. It does frequency detection "in analog hardware", as a consequence of resonant structures in the cochlea: the signals coming out of the cochlea encode pitch information, yet the cochlea has no memory or CPU at all.

    And that's just one tiny simple structure in the human nervous system. Multiply that category error by a million or so to see how false comparing brain processes to computing processes is.

    Back to my original point: while at a neurons-and-ganglia level you can't compare the brain to a computer, the *conscious mind* *can* emulate a computer, among other things. But the mind can only emulate a computer with a short-term memory of 7 items, regardless of what you think the "memory" of the underlying substructure is.

    And the fact that our conscious short-term memory holds 7 "items", not bits -- the items can be digits, words, names, faces, or objects -- continues to show just how un-like a computer the brain really is.
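
    For contrast, here is the digital pipeline described above as a minimal Python sketch (assumes numpy; the parameters are arbitrary): buffer a second of samples, Fourier-transform them, take the loudest bin.

        import numpy as np

        rate = 8000                                # samples per second
        t = np.arange(rate) / rate                 # one second of "audio"
        signal = np.sin(2 * np.pi * 440 * t)       # a pure 440 Hz tone

        spectrum = np.abs(np.fft.rfft(signal))     # amplitude vs. frequency
        freqs = np.fft.rfftfreq(len(signal), 1 / rate)
        print(f"detected pitch: {freqs[np.argmax(spectrum)]:.0f} Hz")   # 440 Hz

    Note everything that has to exist for this to work - a sample buffer, an FFT, an argmax - versus the cochlea, which gets frequency separation for free from resonance.
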

  • by icebraining ( 1313345 ) on Saturday November 20, 2010 @09:24AM (#34290762) Homepage

    and 'writing' the numbers in the air helps send them to longer term memory somehow

    Sure, it turns them into visual memories.

  • An analogy (Score:5, Insightful)

    by goodmanj ( 234846 ) on Saturday November 20, 2010 @09:29AM (#34290784)

    Here's an analogy to illustrate the category error people make when comparing the human brain to a computer:

    "A Sony Walkman can record and play music in realtime, fast-forward and rewind, and store an hour's worth of music. These tasks require a 75 Mhz processor and 100 megabytes of memory on an iPod Shuffle. Therefore, a Sony Walkman has a 75 Mhz processor and 100 megabytes of memory."

  • by goodmanj ( 234846 ) on Saturday November 20, 2010 @09:46AM (#34290848)

    2 corrections:

    1. "I think it is because the brain is at heart an analog instead of digital machine. Multiplying integer numbers however isn't a task well suited for analog machines."

    The humble slide rule is a beautiful analog computer whose primary job is doing multiplication. A skilled user can do multiplication with one faster than he can use a digital calculator.

2. The brain isn't a digital computer, but it isn't really "analog" either. Individual synapses are either off (not firing) or on (firing), never something in between. But the *rate* at which they fire encodes information in a way that resembles neither analog calculating machines nor digital computers.

    Comparing the human brain to *any* human technology, be it a digital computer or an analog calculator, is a massive category error.
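
    That said, the slide rule's trick fits in two lines; a Python sketch of the principle (not a slide-rule simulator):

        from math import log10

        def slide_rule_multiply(a: float, b: float) -> float:
            # adding lengths proportional to log(x) multiplies the numbers
            return 10 ** (log10(a) + log10(b))

        print(slide_rule_multiply(357, 289))    # ~103173.0

    A physical slide rule reads off only 2-3 significant figures - exactly the precision trade-off analog machines make.
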

  • by MDillenbeck ( 1739920 ) on Saturday November 20, 2010 @10:14AM (#34290954)

Neurobiology is a fascinating topic. Of course a brain is not digital. Neurons often have multiple connections (dendrites), emit more than one type of neurochemical signal, and often have more than one type of receptor. However, I can see the point that these neurochemicals are sent out in specific quanta and that a threshold needs to be exceeded to initiate a response. Thus, if we take the receptor type rather than the neuron as the basic unit, we can see neurology in a digital aspect. I would take it a step further: the brain would then be a series of parallel digital computers (based on receptors) that are networked to produce a series of responses, both when considering a network of neurons and within the neuron itself.

    Essentially, what we are looking at is emergent behavior. On the receptor level we see digital activity. However, once we get to the neuron or brain level, the emergent behavior of the system appears analog.
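
    A toy Python model of that emergent picture (the numbers are entirely made up): each firing is all-or-nothing, but the firing rate over a window varies smoothly with the stimulus.

        import random

        def spikes(stimulus: float, steps: int = 1000, threshold: float = 1.0) -> int:
            """Count all-or-nothing firings over a time window."""
            fired = 0
            for _ in range(steps):
                if stimulus * random.random() * 2 > threshold:  # noisy drive vs. threshold
                    fired += 1          # a spike is on or off, never half on
            return fired

        for s in (0.6, 0.8, 1.0):
            print(f"stimulus {s}: ~{spikes(s)} spikes per 1000 steps")
        # rates of roughly 170, 380, 500 - digital events, analog-looking rate
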

  • by mr_mischief ( 456295 ) on Saturday November 20, 2010 @10:26AM (#34291018) Journal

We also haven't been worried much about exact numbers of things for most of that time, and matching faces against memories isn't that exact a task anyway.

    You're likely to recognize someone who grew a mustache or cut their hair, or to ask someone familiar to you where they got a fresh scar rather than walking right past them.

    You are also not likely to care exactly how many bushels of barley you raised until you start selling the grain for currency or protecting it from known thieves. So long as your granary doesn't run out before the next harvest, you have enough grain. Even when bartering or selling for currency, unless you do a lot of it you can estimate your reserves of unsold stock. Once you move to a mercantile economy rather than being your own producer of sustenance, though, knowing how much of something you have and what you can get in exchange becomes more important.

    Building things takes a similar route to economics. If you're building small houses with a central hearth, the construction skills are much more important than anything numeric. Once you're building grand temples and fortifications, engineering kicks in.

    Now for the car analogy. I'll hit both engineering and economics. Once you have the materials and power sources to make automobiles and airplanes, engineering and trial-and-error still play a role. If you build custom buggies or roadsters on the weekends, you can utilize hard engineering but you probably don't need to. If you're meeting specific crash safety, fuel economy, and profit margin goals for the design of a car model and its highly automated production process for a big mass-market car manufacturer, your numbers had better be right.

  • by mr_mischief ( 456295 ) on Saturday November 20, 2010 @10:37AM (#34291050) Journal

    I don't recall a proper citation, but I seem to remember that even identifying quantities at a glance goes something like "none, one, two, three, four, five or six, some, a dozen, a score, a few score, oh my that's a lot". The specific levels at which those change over can vary, of course. Some people probably would say "about ten" before they'd say "about a dozen", too.

One thing I've always liked about the Imperial measurement system, in fact, is that although the math is a little harder, the units and their ratios seem more relevant. An inch, a hand, a foot, and a yard seem more reasonably compared to one another than a millimeter, a centimeter, and a meter. There's the decimeter, which seems like it would be a very reasonable length for measuring everyday things, but the meter is too long for many things and the centimeter is too short. I'm not sure why the decimeter is almost never used. The cubed decimeter is even the definition of the (surprisingly non-SI) liter. The official SI unit of volume is the cubic meter. Who the hell drinks a cubic meter of anything at one go? I'd drink a liter or a quart, or maybe a cup or a pint. Maybe even a half gallon or two liters. Maybe several pints if you'd kindly agree to drive me home. ;-)
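
    The volume arithmetic, for the record, as a Python sanity check (the conversion factors are standard; the names are mine):

        litres_per_us_pint = 0.473176
        litres_per_us_quart = 2 * litres_per_us_pint

        print(10 ** 3, "litres in a cubic metre")                # 1000: nobody's drink order
        print(f"a quart is {litres_per_us_quart:.3f} litres")    # ~0.946 L

    So a quart and a litre are within about 5% of each other, which is why the two systems feel interchangeable at the glass, if nowhere else.
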

  • by khallow ( 566160 ) on Saturday November 20, 2010 @10:45AM (#34291088)
While math is a genuinely complex subject, I still think you should briefly elaborate on what you're talking about rather than just dumping links to books. I recently argued with someone about an economics subject. After making many unsubstantiated claims and accusing me of being "conditioned", his side eventually boiled down to "watch my ridiculously long video for my argument" (I scrolled through the video; it had some psychedelic stuff, movie outtakes, etc., but nothing I'd consider related to what he was talking about, except in some sort of weird brainwashing way). No offense, but I don't think it's fair to expect the reader (especially when you consider that your post may be read by hundreds of people) to acquire and read three books when they don't even know yet what you are talking about.

After all, the internet is the living embodiment of the letdown. If I breathlessly tell everyone "there's this paper that will change your life", odds are much better that it's something crazy, like Hollow Earth or the Electric Universe, than something that would actually be beneficial to you.

    I'm not looking for a copy/paste of the book or anything like that. But it would be nice to describe briefly what goes on and how the method you describe addresses the grandparent's concern.
  • Re:Pseudoscience? (Score:3, Insightful)

    by hedwards ( 940851 ) on Saturday November 20, 2010 @11:10AM (#34291206)
Actually, you're not entirely correct; the human brain is much more like old console hardware than a modern computer, because a lot of that stuff was done on consoles via registers. The programmer didn't have to do anything in particular other than write to or read from the appropriate register to get something done.

On the GBA, for example, if you wanted to write to the screen you would select the correct register and give it the correct value; the hardware would do the rest.
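
    For readers who never did register-banging: a toy Python model of memory-mapped I/O (0x04000000 is the GBA's display-control register; everything else here is invented for illustration):

        class Bus:
            """Toy address space where one address belongs to 'hardware'."""
            def __init__(self):
                self.ram = {}
                self.handlers = {0x04000000: self._display_control}

            def write(self, addr: int, value: int):
                if addr in self.handlers:
                    self.handlers[addr](value)   # the hardware reacts immediately
                else:
                    self.ram[addr] = value       # ordinary memory just stores it

            def _display_control(self, value: int):
                print(f"display reconfigured: video mode {value & 0x7}")

        bus = Bus()
        bus.write(0x04000000, 3)   # one write, and the hardware does the rest
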
  • by drosboro ( 1046516 ) on Saturday November 20, 2010 @11:43AM (#34291348)

I'm pretty sure you've hit the nail on the head. The only problem: this isn't new and publishable like a "router in your brain". Miller's "The Magical Number Seven, Plus or Minus Two" [wikipedia.org] was published way back in 1956. It's easy to see how it applies to a multiple-step calculation like this.

  • Re:Router eh? (Score:4, Insightful)

    by 0100010001010011 ( 652467 ) on Saturday November 20, 2010 @12:27PM (#34291582)

Like sleep? I can't count the number of times I've been stuck on programming logic, math word problems, etc. I'll stare at it until I can't make any more sense of it and go to bed. I wake up, and within 30 seconds I have the solution.

    Sounds pretty close to a reboot to me.

  • by drfireman ( 101623 ) <dan@kiMOSCOWmberg.com minus city> on Saturday November 20, 2010 @01:20PM (#34291882) Homepage

Asking why we can't do three-digit multiplication quickly even though our brains are complex is sort of like asking why a toaster can't tell you ratios of voltages even though it has resistors in it. It's the difference between what a machine does and how it works. Brains are fabulously complex, but one thing they weren't built for is three-digit multiplication. Does the brain "know" how to do multiplication really, really fast? Yes, of course; there are all kinds of things going on in the brain that involve multiplication. Does it know how to do it with numbers that come in through the ears, and spew the answer out through your mouth? No, brains weren't built to do that. They were, however, built (so to speak) to do much more complicated (but different) things, like recognizing threats and understanding spoken language.

I don't know how good the router analogy will turn out to be, but it's not exactly breaking news that some things need attended, more-or-less serial processing, and that mental arithmetic is one of them. The things that don't need as much attention are evolutionarily old and more or less built in. Extremely overlearned tasks can fake it sometimes. Guys like Hal Pashler and Stan Dehaene are always making progress in understanding how and why these things work, but the idea of processing bottlenecks in cognitive function is very old. The router analogy is probably a bad one, because it's unlikely that the brain's router lives in any very specific place. It's more likely a property of how the brain adapts to tasks it wasn't designed for.

  • Re:Router eh? (Score:3, Insightful)

    by fyngyrz ( 762201 ) on Saturday November 20, 2010 @01:47PM (#34292008) Homepage Journal

And I wonder if the Supreme Court judges' routers are missing the DNS information that is supposed to point to the Constitution... because there's an awful lot of "lookup failed" in their decisions.

  • by Tablizer ( 95088 ) on Saturday November 20, 2010 @02:23PM (#34292226) Journal

Imperial units are usually more divisible by 3 and 4, something metric sucks at. 12 is a better base than 10 for most uses. God fscked up when he made our hands.
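
    The divisibility claim checks out (a quick Python check):

        def divisors(n):
            return [d for d in range(1, n + 1) if n % d == 0]

        print(10, divisors(10))   # [1, 2, 5, 10]
        print(12, divisors(12))   # [1, 2, 3, 4, 6, 12]

    Halves, thirds, quarters, and sixths of 12 are all whole; 10 only gives you halves and fifths.
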

  • by Anonymous Coward on Saturday November 20, 2010 @05:10PM (#34293212)

The reason a human can quickly and easily scan a crowd to recognize a familiar face is not at all due to the processing or computing power of the brain. Rather, it's a byproduct of the way the brain works combined with how humans develop after they are born. From birth, the brain is constantly being exposed to stimuli and trying to build associations between those stimuli. That's why a baby is not capable of scanning a crowd and recognizing a familiar face the day it's born; but fast forward several months, and it can. It didn't get better at computation; it simply had more neurons available that were hooked up in the right way for that task. It learned how to do that through a combination of stimuli, feedback, and association. That's essentially what the brain is ... a machine that takes in stimuli and evaluates feedback.

It's entirely conceivable that you could take a baby and create a specialized series of experiences that would effectively train its brain to solve very complex mathematical problems. Imagine a 3-year-old who could do only the most basic human things (eat, sleep, excrete) but could also take complex math problems as input and produce the results nearly instantaneously. Once the brain knows "how" to solve a problem, it can usually do so very quickly... for example, scanning a crowd. The problem is teaching the brain how to solve the problem naturally.
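
    A tiny Python sketch of "stimuli, feedback, and association" (a classic perceptron learning AND from examples; toy scale, nothing brain-accurate about it):

        weights = [0.0, 0.0]
        bias = 0.0
        examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # target: AND

        def respond(x1, x2):
            return 1 if weights[0] * x1 + weights[1] * x2 + bias > 0 else 0

        for _ in range(20):                        # repeated exposure to stimuli
            for (x1, x2), target in examples:
                err = target - respond(x1, x2)     # feedback
                weights[0] += 0.1 * err * x1       # associations strengthen or weaken
                weights[1] += 0.1 * err * x2
                bias += 0.1 * err

        for (x1, x2), _ in examples:
            print((x1, x2), "->", respond(x1, x2))   # now reproduces AND

    It never "computes" AND symbolically; it just gets shaped by feedback until the right response falls out, which is the point about faces.
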
