Science

Ask Slashdot: Storage Capacity of the Human Brain? (518 comments)

STUR submitted this interesting question: "Since humans are supposed to have such great minds, I would like to know how much storage capacity a human brain has compared to, let's say, a computer's hard drive. Hrm... If the capacity is high enough, do you think that computers of the future could possibly use the brain as a sort of hard drive or RAM chip? Just a little something to think about." I've heard that the brain uses a form of holographic storage to archive its information, and I don't know if there is a direct mapping between that and, say, terabytes of information (warning: I am not an expert!). What do you all think?
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward
    I would be sceptical of any calculation of the
    capacity of the brain in terms of bits or bytes.

    And the capacity is somewhat less a matter of
    interest than the methodology used to process,
    retrieve, and synthesize input and associations.

    Any purely mechanistic model of consciousness is
    bound to fail in an early stage of development.
    I think Richard Lewontin has dealt with the
    subject at great length if anyone is really
    interested.

    Even (to use a hackneyed example) trying to
    infer the internal workings of a mechanical watch
    from its external appearance, without any previous
    knowledge of its workings, is a formidable task.

    How much more difficult to do the same with the
    brain.

    Phil
  • by Anonymous Coward
    When people remember things, they do not remember anything like a "Bitmap". To teach a computer what you would remember of the chair you are sitting on could take megabytes of memory: whether it has wheels, whether it goes up and down, where it is ripped, and the patterns and feel of the various parts. A computer would have to store bitmaps, feel, sound and operational info.

    You remember, however, just what it is "Like". You have an idea of what a "Chair" is, and then you remember the exceptions. If it is ripped, you probably don't know everything about the rip, just that it is "Like" this other kind of tear you remember. Overall, very little, uh, data storage is used, but the programming is being updated constantly.

    There is also massive duplication and interaction of data in an immensely parallel system.

    I suppose if you were going to equate your brain to your computer, you would have to design one where each bit had its own microprocessor capable of interacting with many other (bits/microprocessors). The processor would have to be capable of reprogramming itself, and it would have to have a little random number processor built in (there is one in every neuron; it has been described as how the "soul" might be, hmm, "Attached" to the brain).

    I may be off on some of this, but the point is, you couldn't truly memorize a single megapixel photograph the way your camera does; it would probably take up half your memory. And a computer, no matter how long you give it and how well you program it, will never be as good at interpreting it as you are.

  • by Anonymous Coward

    I agree with you. I haven't seen anyone point out how easily and how often we simply generate "data". For example:

    Picture a warm, tropical beach

    To your computer, that statement is 30 bytes and it says nothing. In order to produce the image
    your computer would need to know the concept of heat, and what a beach is. Now a lot of people will say "sure, there could be /usr/brain/places/beach/tropical/warm1.jpg", and
    that image is 200k or so. But it doesn't work that way. Your brain can and does generate an image of a place you might never have seen before. How much data
    would you have to feed into a 3D modeling program
    to be able to produce the same results?

    The biggest difference between our minds and computers (in my totally unqualified opinion) is that we have imagination. Computers don't.

    Computers can simulate this to an extent, because they can "learn" (take input and enter it into a database), and then use that input to help with generating any further output.. but the computer just won't do it as efficiently, and the computer just doesn't have the "life experience" that we do. Regardless of how the data storage is done, our brains really handle the data well. I don't remember the exact image that I saw a minute and a half ago. I don't remember the exact sounds I was hearing, or the feelings of touch, nor do I remember the smell. That is probably because my brain automatically discarded that whole second of useless information, which to a computer would represent a HUGE amount of storage. The reason is that I don't NEED to remember it. I can tell you that I was sitting exactly where I am now, typing out the top part of this very same message. At this point
    in time I can describe every last detail of sight, sound, smell, and touch. It's easy because the situation is almost the same as it is right now. Common sense (a bunch of logic) tells me this, not the ability to store 30TB of data.

  • by Anonymous Coward
    42
  • by Anonymous Coward
    It's a lovely story, but not a very good model. Take for example your assertion that "most people can't remember details very well." Then consider that any functional 5 year old retains a vocabulary of thousands of words, the rules for manipulating them in communication, and (in the case of most 5 year olds I've known) knows the TV schedule for any day of the week for multiple channels. I can still remember the phone number and address of the house I grew up in - while event memory may be fuzzy and biased, things like numbers (how many physical constants do you remember the value of for example), words, spellings, C language reserved words are all stored very crisply and accurately, with excellent recall speed. It doesn't seem to me any of these things are necessarily linked to some overarching gestalt. I suppose maybe I remember "printf" because it's "print" with an 'f' on the end, and "sprintf" because it's "printf" with a leading 's' and "snprintf" because ... But none of these do I associate with some event (I don't remember what I was wearing the first time I saw "printf" for example), and none of these recollections is in any way fuzzy or approximate.

    Another thing no one here seems to be considering is that there are quite possibly multiple memory systems operating in the brain (beyond just the long term/short term pair). Your PC for example has DRAM (core), cache, CMOS setup RAM, video ram,
    local ram on adapters, FIFOs, hard drive, floppy drive, CD/DVD, tape - all specialized types of memory to meet particular needs, and each with a multitude of storage formats, and each produced via an evolutionary process just like the development of the human brain.

    We have externally stimulated sensory memories, internally generated emotional memories, memories of gross detail (landscapes, clothing, people), detailed, crisp memories of all kinds of specific data, and probably a lot of other categories too. We perhaps don't remember the details of specific events well all of the time, but compare that to the ability to recognize facial characteristics - the resolution in that subsystem is phenomenally good.

    Contrast all of this to the claim that most people can't remember details well - obviously that's over-generalized to the point of absurdity on the basis of quality or quantity of details the average person can remember. What kind of details specifically?

    I was gonna put a sig on here, but I forgot what it was.
  • a kind of distributed.net gone mad, heh.

    And they'd still not cracked rc5-128
    :)
  • This is very interesting. I'd not thought of it in this context before.

    When I was taking my grade 11 biology course, we studied genetics. Something that came up was the fact that your genes could, and in fact did, change. A real question that came up, which my teacher refused to answer (he said to go to the religion teachers), was one involving different chromosomes. Say I, a male, was born with two 'X' chromosomes. I would be a different physical person - obviously, a female - but would I be the same person? After all, if the soul or consciousness or what-have-you is created or meted out or whatever when you're conceived, would the presence of a couple of extra genes make that large of a difference?

    A very, very good book to read on what you wrote about is "The Terminal Experiment" by Robert J. Sawyer. It involves a scientist who has created technology sufficient to make an exact copy of the human brain (and consciousness), and can also modify it to suit his requirements. I recommend checking it out if you're at all interested in the theology of biology and consciousness - it is a near-future science-fiction book, if you're interested.

    E-mail me if you wish to talk further.

  • Posted by FascDot Killed My Previous Use:

    I doubt your grandfather's brain is full. More likely is that he sees fewer and fewer "memorable things" because he's seen so much and it all tends to blur together. If you did something way out of the ordinary like throwing a pie in his face or something I bet he'd remember it.
    --
    "Please remember that how you say something is often more important than what you say." - Rob Malda
  • Posted by FascDot Killed My Previous Use:

    Remember the part about people with photographic memory? They can look at your book page and store the entire thing in 3 seconds. Call it roughly a kilobit of text per page; divided by 3 seconds, that's on the order of 300 bits per second. Already 3 times faster than your estimate.

    I don't know if photographic memory has ever been tested for storage speed, but I doubt that 1 page of text in 3 seconds is even close to the maximum. (The back-of-envelope arithmetic is sketched below.)
    --
    "Please remember that how you say something is often more important than what you say." - Rob Malda
  • Posted by polar_bear:

    You can't really compare the brain's capacity in terms of storage capacity to a harddrive - The amount of storage that we have in the brain is amazing, but the access is somewhat faulty. (I liked the comment someone had about needing a defragger for the brain!)

    Think of it this way - the brain processes sound, sight, smell, touch and taste in real time - sometimes it saves the experience to easily accessed memory in one or all of the senses and sometimes it doesn't. If you digitized all the sight and sound information the brain processes in one day alone it would be more than enough to fill any commercially available harddrive. Then, what about free-flowing thought? How would you digitize that?

    The brain is also a real-time processor as well as storage facility - so imagine the MHz needed to process the things that the brain does. And the bandwidth - to process all five senses at once we'd have to imagine a processor that handles way more than 64 - 128 bits per cycle.

    What I'd really like is for someone to come up with a backup device for the brain...

  • Posted by Mary CW:

    In Kurzweil's (sp?) recent book, Age of Spiritual Machines, the author tries to set up layman's comparisons between organic brains and computer complexity. He focuses on connectivity/dendrites not storage capacity per se, using examples such as: a computer as complex as a bug's brain, a computer as complex as a rat's brain.

    Basic theme of the book is that as tech complexity increases, the machines can act more and more like organic systems, down to the point that we will have issues around the machines being conscious.

    Anyone else who's read the book care to comment?
  • Posted by FascDot Killed My Previous Use:

    1) You don't think so. Any proof?

    2) So what? The original poster was claiming to be able to calculate the total brain capacity based on the input bitrate. We've proven that bitrate invalid. It doesn't matter that it may be short term only.
    --
    "Please remember that how you say something is often more important than what you say." - Rob Malda
  • Posted by FascDot Killed My Previous Use:

    The question is, if the brain can store 14 TB, why does the average person store data at 100 bps?

    And my answer is: the average person stores data much faster than that. My example of the photographic memory person is just an easy target. My general claim is that even a blind person (no visual input) stores data faster than 100bps.

    ...it's logical to suggest that a person who's storing at 1000bps has the same storage ability as another guy but has to get rid of that extra data eventually.

    Logical, yes. Factual, no. Are there ANY recorded cases of persons with photographic memories suddenly having amnesia? Has anyone present researched photographic memory to find retention times?

    In short, why are all the "ideas" coming from you while all the "facts" are coming from me?
    --
    "Please remember that how you say something is often more important than what you say." - Rob Malda
  • Posted by spookysys:

    ..which of course is exactly as simple as it is wrong. It is not by changing the state of the neurons that the data is stored. The information lies in how these neurons are connected. Information can be stored in many different ways, and similar memories can share large parts of the neural network. That is, if I've got this stuff right.. Most of this is collected from various AI docs and "Creatures 2" mag reviews =)
    I think ... =)
  • by gavinhall ( 33 )
    Posted by Assmodeus:

    we process 13 terabytes a second... not store it... I don't know how much we can store... it has to be more than 13 TB, though.

    assmodeus
  • Posted by FascDot Killed My Previous Use:

    Surely that's a typo. I know *I* have more bandwidth than that. Heck, I've got more than that for audio alone.

    Think about it: recorded human speech doesn't sound natural on playback unless it's sampled at around 44kHz. That's 44 THOUSAND samples per second. Of course, we're just talking processing so far, but people with "photographic" memory can store this sort of information perfectly, which implies fast storage too. (Rough numbers below.)

    As for the 20 questions proof: This assumes that the item is randomly chosen. It also assumes that the person choosing picks a specific object that they have stored rather than a general class. ("I'm thinking of the third flowerpot from the left in the workshed behind my mother's house").
    --
    "Please remember that how you say something is often more important than what you say." - Rob Malda
  • This is, in fact, extremely accurate. However, our friends are only measuring conscious memory.

    Since the scope of my message would overflow my system's 32MB of RAM (At least using Netscape; This reply would take three pages!), I have placed the response on my homepage at http://www.geocities.com/SiliconValley/Network/5389/chaos.html [geocities.com]


    --
  • --Ever read "One flew over the cuckoo's nest"? Randall McMurphy had a lobotomy performed on him, and it fscked him up. People who have had brain tumors or strokes cannot recall some things (some of them major, like relatives / spouses!)
  • I don't know, I'll have to watch Johnny Mnemonic again... The story was good, if you understand about the Military Psychic Smack Dolphin...
  • by pb ( 1020 )
    I have read it, I know it wasn't Gibson after they got done with it, and I'm glad that "The Matrix" didn't try to steal the plot from Neuromancer (although I wouldn't mind a hot chick with some embedded razor blades... :).

    However, I think the short story didn't make reference to the actual storage capacity of the (admittedly digital) device embedded in Johnny's brain. But I don't mind the movie, for two reasons:

    1) Johnny Mnemonic wasn't a really great Gibson short story, in my opinion. I liked fragments of a hologram rose, or whatever it was called, a lot better, but you could never make a movie out of that.

    2) Keanu makes a good Johnny. He's good in action movies, and he'll never miss that chunk of brain anyhow. ;)
    That's a good article, but I think memory is more involved than that. First, 'learning to ride a bicycle' isn't even stored where the rest of memory is; procedural memory is in the cerebellum, I think, and it stores a lot of precise stuff, like my touch-typing. It's also the reason that we find it hard to remember a song from the middle, because we have to "sing" it from the beginning--it's a procedure.

    Second, if I pick up a book I read a long time ago, it will be familiar (or if I just see a passage). That recognition is memory, and I've read a lot of books. If I pick up one I haven't finished, I can seek through it and find my place. I don't think 122MB of data could account for even that much information, much less the music I remember (although that might be in my cerebellum; apparently they count it as memory).

    Other than that, their methodology seems pretty sound, I'd just like to know how they got their estimates into bits.
  • Here's some cool stuff that explains that basically there is a "cap" on the end of our chromosomes that is reduced every time a cell divides. When the "cap" reaches a certain size the cell stops dividing (senescence). This is supposedly because you can only make so many copies before the quality is affected. Anyway, it's pretty interesting stuff. Search on Telomeres and Senescence and you'll find all kinds of info.
    Here's a Yahoo search on them [yahoo.com]
  • I've had plenty of LSD in my time and I can definitely say that my brain is an analog tomato.

  • wow, someone else that liked that movie. I still find it amusing that they made that movie from a 5 page short story.
  • nah a lot of the basic plot elements were there
    jones the smack addict
    molly millions (tho with a different name)
    the ninja and the fight at the end, etc

    they just embellished a lot

    now if someone could do snow crash justice.....
    This reminds me of that Sherlock Holmes story, where Watson is telling Sherlock (Sherlock? what a dorky name, anyone here remember his brother's name? Mycroft. What the fsck was WRONG with their parents?! - I digress).
    Anyway, Watson was telling Sherlock about how the Earth had been proven to orbit around the sun, and Sherlock scolded him for telling him "useless information", and taking up space in the attic of his brain that would otherwise be used for storing information about topics relative to crime-solving.
    Sherlock Holmes was definitely a geek of the first order. I didn't read Sir Doyle's books for the mysteries, but for the fascination of this truly weird character.

    "The number of suckers born each minute doubles every 18 months."
    -jafac's law
  • I guess the storage capacity would depend on what sort of compression our brain uses... Maybe it compresses audio to an MP3 format. Then we wouldn't even need to get a Rio. "Hear once, play anywhere".
  • by jd ( 1658 )
    Depends on what you mean by capacity. Each collection of neurons in the brain will have local storage and be self-adapting, which is intrinsic storage the structure itself possesses.

    Let's assume that 13 terabytes (well within the reach of human technology) is about the capacity the area dedicated to memory possesses internally. It probably has half that again in intrinsic storage, and I'm going to guesstimate the rest of the brain has around 10 terabytes local storage.

    This'd give a total of nearly 30 terabytes.

  • Read "Funes the Memorious" a short story by Jorge Luis Borges. In it he descirbes a person, Funes, who not only rembers everything he has seen, his perception is so acute that he sees each moment as separate and disjoint from all others. In short he had infinite perception and retention but no association.
    It is contained in Ficciones [amazon.com]. I highly recommend the entire book and all of his work. If you have ever pondered infinity or parallel existences, you need to read Borges.
    --
  • Unfortunately that article doesn't deal with the digital memory hogs of sound, smell and vision. Most people have the ability to identify (even if they cannot remember the names) thousands of people, thousands of smells, thousands of voices, and play "name that tune" for thousands of tunes. All the while, they have a working vocabulary of, say, 100,000 words, memories, thoughts and feelings. I realize that our memory is "lossy", but it isn't binary, it isn't digital. If we wanted to store all one person knew on one computer, a gigabyte would not be enough.
  • But if Clarke had got it right, the movie would have been called "Waiting to Exhale"... ;)

  • That must be the reason why where I work we're going to the pain of producing 16bpp images...

    If you can't tell the difference between a 24 bit image and a 4096 (was that Amiga HAM?) color image, you really ought to have your eyes checked.

  • Parse it again. It makes sense with the two extra "that", not without.

  • Remember that astronaut in 2001 that HAL kills with the pod? Frank Poole. Well they find him a thousand years later. He is just about to pass beyond Pluto and the solar system when they grab him. With 31st century technology... they revive him.

    Great Book
    3001 the Final Odyssey

    Arthur C Clarke as always sticks a lot of new science into it. He cites a paper published in the Quarterly Journal of the Royal Astronomical Society in June 1994 called "Machine Intelligence, the Cost of Interstellar Travel and Fermi's Paradox". They estimate that the total mental state of a one-hundred-year-old human with a perfect memory could be represented by ten to the fifteenth bits (one petabit).
  • I Haven't lost my mind, I have a backup on tape somewhere...
  • I remember reading that the human brain has a storage capacity of approximately 122 MB. It's explained here [merkle.com].
    The 2nd Law of Thermodynamics is a lot like Boyle's gas laws. They work wonderfully at predicting the random motions of uncountable trillions of trillions of molecules free to bounce about as they like. It's all about statistics. The odds are, if the air pressure on one side of a room is higher than the other, air will flow predominantly towards the area of low pressure. However, if you make the mistake of trying to understand these laws as absolute rather than statistical, you will find that they are absolutely false. Although unlikely, it's possible for the air in the room to congregate on one side. Remember, the motions of the molecules are *random*! They do not individually move in accord with Boyle's laws, and the laws are useless at predicting the motion of only a few trillion molecules rather than the trillions of trillions they are supposed to be used for.

    What does this have to do with evolution? Everything. The 2nd Law of Thermodynamics does NOT say that entropy ALWAYS and MUST increase EVERYWHERE in a closed system, no matter how hard creationists want to make it say that. For entropy to decrease in part of a system is NOT in ANY WAY a violation of the 2nd Law of Thermodynamics. Overall, statistically, over time, entropy will increase. That's all the 2nd Law gives you, not a blanket forbidding of any decrease in entropy.

    Also, note, Boyle's laws are not violated by high and low pressure areas in the atmosphere. Why? Because Boyle's laws describe what happens IF NOTHING ELSE acts upon the gas. Likewise, the 2nd Law of Thermodynamics states what happens to collections of molecules NOT SUBJECT TO ANY OTHER FORCES. If some other process is occurring that increases complexity and decreases entropy, this in NO WAY VIOLATES the 2nd Law of Thermodynamics. Just as the fact that a balloon rises in no way violates the law of gravity. This is a good analogy: saying that evolution violates the 2nd Law of Thermodynamics is exactly like saying hot air balloons violate the law of gravity. But they don't. Because other things are going on than just gravity, hot air balloons act in a manner that, if you didn't know there were other things happening, might be construed as violating gravity. But in fact they don't; they act in perfect accord with the laws of nature, gravity included. The point is, they don't act EXCLUSIVELY in accord with the law of gravity; other things happen. Likewise, evolution in no way violates the 2nd Law of Thermodynamics. It acts in perfect accord with it, as well as with other forces that decrease entropy at the same time the 2nd Law increases it (exactly like how air pressure buoys a balloon while gravity pulls it down -- both forces act, and you see the net result). In fact, evolution works because of, not in spite of, the 2nd Law of Thermodynamics. You *need* entropy in the system or mutation would not occur.

    Anyways, enough said. The main point is simply this: anyone who says evolution violates the 2nd Law of Thermodynamics obviously doesn't understand the 2nd Law of Thermodynamics to begin with. Evolution doesn't violate, invalidate, or contradict it any more than airplanes violate, invalidate, or contradict the law of gravity. Get a grip -- the world is too complex for any one simple generalization like that to forbid anything...

    --

  • I've not done any research, but doesn't it seem as if there are many more defects, allergies, and weaknesses in the population today than there was in the past?

    Umm, the percentage of people born with them shouldn't increase. With natural selection active, it should decrease. Without it, it should just remain about where they are.

    On the other hand, since these people aren't dying off in childhood but remaining part of the population, even if the percentage of people born with these problems doesn't increase, the percentage of the living population that has them should increase.

    Also, since the population as a whole is increasing, although on a percentage basis people born with these problems should not be increasing, the number of people with them should be.

    This doesn't mean evolution will come to a dead halt. In fact, as we become more adept at and more comfortable with genetic manipulation, human evolution is likely to resume. Artificial selection of specific genes will replace natural selection of entire organisms.

    --

    It can't be a question of how much is used at once. Simply processing the data from your eyes alone, turning the signals from the rods and cones into a coherent visual image, occupies 30% of your brain (according to my college psych textbook). Of course, this is not surprising, considering it's the single most complex task brains do. Contemplating philosophy is a relatively simple task by comparison...

    --

    ACK! Molly Millions (actually, I prefer the name she used in Mona Lisa Overdrive, Sally Shears) is NOT in the movie. The woman in the movie is not Molly with a different name. That wouldn't be surprising; she's used different names in the books. However, the woman in the movie has a lot less in common with Molly than simply a different name. Point in fact, I'm hard pressed to think of much they had in common other than being female and in a story with the same title.

    It would have been much cooler if Molly *had* been in the movie. But she'd be the most interesting character in the movie, so they doubtless felt they needed to replace her with a far less competent, streetwise, and interesting character to keep her from overshadowing Keanu...

    --

    Actually, that's pretty much true of any book. Most books, if not edited down, would require about an entire TV season's worth of time to do. A 2-hour movie can't come from anything more than a short story without heavy editing...

    --

    You'll find in linguistics as well that most people do not have a concurrently available vocabulary of thousands of words; in fact 5000 words is a lot even for an adult, and you don't need nearly that to get along in society. But the availability of a word at a given instant is largely related to its association with other concepts and word streams. People with larger vocabularies are often more capable of utilizing a larger wordset because they typically are making an effort to use less common words (even if they won't fess up to it).

    I think you're confusing a couple of half-remembered facts and combining them into one incorrect one. The average adult typically only uses about a thousand words -- even those of us who are extremely intelligent, with vocabularies from specialized fields; we just have a few "exotic" words in our standard lexicon of things we talk about.

    This does not alter the fact that there are dozens of words we know for every one we actually use. We only use a small fraction of the words we know -- most of the words we know will never pass our lips, it'll simply never come up in conversation. Nor will we type or pen them, we'll never have cause to.

    Thus, our "working vocabulary" is a rather tiny fraction of the words we know. 60,000 is not unusual for an educated adult. Add to that our knowledge of how to make words from other words, and there may be two or three hundred thousand words you would have no trouble comprehending if you came across while reading.

    So, your "available vocabulary" is a few orders of magnitude greater than your typical "working vocabulary".

    --

  • of course, if we were using DNA as our storage medium, would that mean that we would be reading off data in a four bit as opposed to a two bit system?

    Err, presumably you mean a four-value (2 bits) as opposed to a two-value (1 bit) system. Yes, that would be the case. But since we generally read memory in at least 8-bit chunks anyway, this doesn't make any difference to the computer. It doesn't know that the data was reconstructed from the state of 4 molecules rather than 8 transistors. You've done something horribly wrong if your computer cares how the data is stored -- this is the kind of information that interests electrical engineers, not computer scientists... :) (A quick packing sketch is below.)

    --
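
    A minimal sketch in C of the two-bits-per-base idea: pack four bases into one byte, and everything above that layer just sees bytes (the A=00, C=01, G=10, T=11 mapping is an arbitrary choice for illustration):

    #include <stdio.h>

    /* Map one base character to its 2-bit code: A=00, C=01, G=10, T=11. */
    static unsigned base_to_bits(char b)
    {
        switch (b) {
        case 'A': return 0;
        case 'C': return 1;
        case 'G': return 2;
        case 'T': return 3;
        default:  return 0; /* ignore anything else for this sketch */
        }
    }

    int main(void)
    {
        const char *bases = "GATT";   /* four bases -> one byte */
        unsigned char byte = 0;

        for (int i = 0; i < 4; i++)
            byte = (unsigned char)((byte << 2) | base_to_bits(bases[i]));

        /* The rest of the machine just sees an 8-bit value. */
        printf("packed byte: 0x%02X\n", byte);
        return 0;
    }
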

    Yeah, but the brain is a neural network. A mass of redundant parallel processing cells. We can survive brain surgery with minimal to zero damage. We can survive having ONE little connection severed. Try doing that to a computer. One little wire gets snicked, and the thing is a paperweight. Till we come up with a neural net simulator, we can't even come close to the brain.
    Right... What I meant to say (and didn't make clear) was that the impulse is not electrical at any point; it's a chemical signal, which is why the time for the impulse to reach the brain is noticeably longer than the time for the "knee jerk" reaction to pull your hand away.

    If it were electrical, then even at a minuscule percentage of the speed of light the nerve impulses would arrive far faster than they do.

    An interesting side point is that there are a lot of factors that affect the speed of nerve transmission. Researchers have found that high-contrast visual ranges tend to lead to much faster nerve impulse rates and response times, and low contrast visual scenes tend to slow down the impulses. (Which from a practical standpoint means people are not as capable of judging visually the speeds they're travelling at in low-contrast situations like driving in fog...)
  • I admitted in my content that I was generally talking around my ass... but what I was saying was from a generalized understanding I put together over several years of participating in, and developing research on learning and memory, including a period of time academically studying the evolution of intelligence.

    Whereas your response is based on assumptions you make based on false interpretations of your experiences. You claim there is no experience that causes you to remember what printf is. That's blatantly incorrect. You are able to associate the meaning of printf in that context because of prior experience you've had either with that concept or with concepts related to it. That's why it's easy to pick up a third and fourth language once a brain has developed the proper pathways that allow it to associate with multiple languages. That's why it's easy to pick up new programming languages. But that's also why you may know fifteen programming languages but be unable to learn a foreign language at all -- because you learn based on prior associations you've made, and you don't have those between unrelated areas of knowledge.

    In that vein however, some researchers believe that an unusual ability to create those linkages between non-related contexts is one of the causes of extremely high intelligence, partly caused by genetics, but usually among those researchers it's attributed to wide-ranging stimulation during the first nine to twenty-four months of life (the first nine being particularly important because, unlike in every other mammal species, the human brain continues to grow for 9 months after birth).

    There can't be differing ways of storing information in the brain because there is only a single construct within the brain -- the only differentiation between areas coming from the points at which there are larger interconnects within the brain, points where there are larger concentrations of neurons that are not necessarily in physical contact with each other (which is why some scientists think the folds in the brain are related to overall species intelligence), and the insertion points of external sensory nerves.

    You however, most likely, do not remember nearly the detail you think you do. Very few people naturally develop the ability to do that, although it can be learned. Take for example someone asks what your significant other looks like. The odds are you will pick out and describe certain elemental details, color of the eyes, color of the hair, shape of the nose, but if someone asked if there was a mark below their ear last time you saw them you might not be able to answer that -- because you are reconstructing an image of that person in your head from individual elements you remember -- elements that may or may not be correct.

    The more you pay attention to and use those snippets of information, the more other nerve pathways will utilize those elements and other memories will get locked to them. That's why you can remember the phone numbers of the houses you grew up in -- because of all the other memories associated with those specific memories. That's why you can completely forget a long-past romantic rendezvous, yet a fragrance or some sound can suddenly bring it back -- because you triggered the "matrix" of nerve firings that held that experience in the context of another memory -- and externally stimulated memories are FAR more capable of doing that than internally stimulated memories, because of the areas of the brain they tend to reside in and the relatively stronger impulses you tend to get from external sources.

    That's why relaxation and meditation help focus -- because they tend to quiet and control those externally triggered cognitive events and allow more attention to fall on internally triggered ones. (And is also why under hypnosis you are both capable of digging up memories more easily AND creating memories easily).

    You'll find in linguistics as well that most people do not have a concurrently available vocabulary of thousands of words; in fact 5000 words is a lot even for an adult, and you don't need nearly that to get along in society. But the availability of a word at a given instant is largely related to its association with other concepts and word streams. People with larger vocabularies are often more capable of utilizing a larger wordset because they typically are making an effort to use less common words (even if they won't fess up to it).

    If you were going to pick out a point of my original posting that was over-generalized to a point of absurdity, there are points that are far more absurd than the one you picked. The one you picked is in fact one of the most easily documented points I made, and one of the most widely understood scientifically. The mechanisms that cause it to be true are not as well understood, but its validity is not widely doubted.
  • by tgd ( 2822 ) on Friday June 18, 1999 @09:08AM (#1843855)
    This is sort of a silly question for Slashdot, since most people are going to be talking out their asses.

    That said, (and talking more around my ass, than out it), there isn't any sort of storage figure. Researchers do not have much understanding about how we remember things, but it IS fairly certain that there is no relationship to the way computers store information (ie, the concept of terabytes, etc).

    Generally the brain remembers certain aspects of an experience -- whether an external experience or an internal one. It's believed that the act of experiencing something, or recalling it later, starts changing the relative levels at which nerves will fire and accept the chemical impulses from neighboring neurons. (Before anyone starts talking about electrical impulses, those are only conducted within the nerve cell, not between nerve cells, and it's not an electrical impulse as much as a chemical shift within the nerve that changes the electrical potential of the local region while the signal travels down the length of the nerve -- that's why you can have your hand off a hot stove before you actually feel it's hot.)

    So a memory is generally a tangled mess of restimulations of fragments of what happened. That's why, with few exceptions, most people can't really remember details very well, and everyone is prone to manipulating memories. (ie, you read an interesting tale when you're young, and later in life you're sure it happened to you or that someone TOLD you it happened to them, and not that you read it.) Things like that happen a disturbingly large amount of the time, with everyone. Luckily such errors don't often affect anything serious.. I mean, who cares where you heard a story?

    That's why things like memory and attention span and personality can be manipulated chemically -- because you can control the way those experiences link up with each other and how the brain reacts to those experiences.

    One of the most interesting things I think people find when they really start studying how the brain learns, and stores its experiences is how little actually comes from the senses or memory. (For example, how the brain can only distinguish general colors and shapes beyond a half-dozen degrees off center in your field of view, but you're constantly fooled into thinking you can see more than you really can)

    The question with the brain then is how discrete these fragments of memories and experiences are, how many times they can cross-connect with others to produce memories without those cross-connects getting so blurred that you get confused about the truth of what you're remembering, and the number of different fragments that make up a given memory.

    Most likely no one will have any idea about the answers to those questions until there is a better understanding how a "neural network" arrangement can store and rerecognize patterns of nerve impulses when the "matrix" used is numbering in the millions of cells at a time...
  • "Anyway, Watson was telling Sherlock about how the Earth had been proven to orbit around the sun, and Sherlock scolded him for telling him "useless information", and taking up space in the attic of his brain that would otherwise be used for storing information about topics relative to crime-solving."

    As much as I love Sherlock Holmes, that passage has always disturbed me for two reasons:

    1) It seems to me that the brain is more like a muscle than a storage device: the more you use it, the higher its capacity becomes. Although there may be an upper capacity limit, I doubt more than 10 people alive at any given time ever get anywhere near it. Whereas I have observed many, many people who stopped using their brain capacity and, essentially, lost it.

    2) It also seems to me that super-capable people (and I am purposely avoiding the words 'smart' and 'intelligent') are often those who have the ability to draw together seemingly unrelated fragments of information into a new and critical insight. If they have never been exposed to the disparate information, they would not be able to make the leap (again I am avoiding the word 'intuitive' although it probably applies).

    My 0.02.

    sPh
  • I can't remember the number...
    Was it 18 or 24 Gigs????


  • Hey, I do that all the time. As a jobbing
    musician, I often have to learn new songs from
    tapes. At this point, if I listen to a song twice,
    I remember it in very detailed way, with chords,
    harmonies and backup vocal lines. The scary part -
    how would you like to wake up in the middle of the
    night from the sound of some stupid headbanging song -
    and then realize that it's all in your head?

    Tell my voices to go away!
    My god!!! Stephen King is able to write _anything_ in less than 1300 pages???!!!
  • Evolution has been disproven on many levels. One very good example of this is a human eye. The infinite complexity of this organ is beyond a doubt a stumbling stone in an evolutionary biologist's work.

    Now please explain more. Are you telling me because the basic eye structure is present in so many varied beings that this presents a problem with evolution as a theory? I'm not attacking, just really want to know...

    it is obviously apparent that humans are very structured beings, and to hold the second law of thermodynamics sacred, we would then have to say the human race did not begin with some non-living matter which gradually, over milennia, turned into what it is today. It just does not make sense.

    What here are you arguing? That uncontrollable chaos in this system (presently) would preclude or include a more `structured' beginning? That would tend toward creationism, yet against it you argue?
    --

  • Well, I think most audio/visual memories are not stored perfectly (that is, high loss); those with photographic memories probably store them with less loss, but I'm sure there is some loss. I would think the brain would mostly "recreate" the images and sounds, in the same way when most people try to draw a person, their left side of the brain tries to draw the "generic" person consisting of rough ideas of generalized body parts which end up looking very unrealistic. (The right side of the brain seems to be better at recreating actual images.)
    --
    Aaron Gaudio
    "The fool finds ignorance all around him.
    Forgetting is bliss..

    There are also reports by people who claim to remember being born (under hypnosis or something) of extreme pain and suffering.

    So forgetting about your early experiences would seem to be an important sanity preserving feature.

    But also, wouldn't remembering everything pretty soon take away most of the hope you have and sort of turn your life into a self fulfilling prophecy..

    --
    Pirkka

    Actually, when you think about it, that's a common occurrence... Most people may not be encyclopedic (although my friends tease that I'm that way), but how many times have you been asked a question, had a major blank in the brain, and if you just waited a few seconds and didn't tense up, *poof*, the answer appeared? I don't know that much about how it works, but a bit more speculation can't hurt. What if the brain not only does an associative path, but does a few layers of abstraction beyond that? So that the recall I was talking about above is generated by a mind basically doing a reference check... E.g.: "what's Sarah's phone #?" sets off images of phones and Sarah, then does a cross search for all memories that include both of those (*grin* regardless of capitalization), then tries to arrange by date and importance... So while you are waiting for the phone number to occur to you, you remember that you have a birthday gift you've never sent her, and that you have to pay the phone bill... (first relations) and those thoughts lead to a few other associations, while the original search is still being processed, of other bills, other people (all of which are then added to the calculation of where that memory is), and suddenly, halfway through a thought on whether you did your laundry, you remember that there is a 64 in her number, and *poof*, the rest of the memory is downloaded... because you've been, I don't know if this is the right phrase, but homing in on the co-ordinates of the memory... It's just a thought!
  • 80 Gigabytes, which can be doubled to 160 GB temporarily.

    -"Johnny Mnemonic", the movie. The short story (from Burning Chrome) doesn't actually say how much info is stored.
    I thought about this with two of my friends the other day.. My explanation was that we eventually run out of memory/swap space and one of our internal kernel processes tries to malloc() but does not check the return value, so we wind up trying to write to NULL -- we have an internal segfault causing a KERNEL PANIC!!! What a shameless way to go; a little error-checking could have made us stable... Oh well, I guess "they" didn't use an open-source model. (hehe Microsoft Human 1.0 -- BEWARE!!!) The bug looks something like the snippet below.

    --
    Marques Johansson
    displague@linuxfan.com
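
    For the curious, the bug in miniature, as a C sketch (deliberately broken -- an unchecked malloc() followed by a write through the NULL it returned; don't run this anywhere you care about):

    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        /* Ask for far more memory than we'll ever get... */
        char *mem = malloc((size_t)-1);

        /* ...and never check the return value.  When malloc() fails it
           returns NULL, so this memset() writes through a null pointer
           and the process segfaults -- our "kernel panic". */
        memset(mem, 0, 1024);
        return 0;
    }
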
  • That was written in 1988 - COME ON!!!!

    In 1988, 120MB was an UNGODLY wealth of data.. Back in 1988 I don't think the universe had even acquired that much data...
    It wasn't until the late nineties that the universe evolved an internet, from which data expanded infinitely in all directions... Read your history!! Oh sorry, we didn't have history back in the 80's, not enough memory pointers available... wasn't until that evolved memory upgrade in the 90's that we were finally able to get a successful malloc()...
    Duh!

    --
    Marques Johansson
    displague@linuxfan.com
  • Bet you really love my .signature, then. :)
    ---
    "'Is not a quine' is not a quine" is a quine.
    This doesn't follow. As soon as you introduce measurement accuracy on an analog signal, you can talk about bits of information (quick example below).

    ...and the brain seems to be mixed analog/digital. Long-distance pathways (e.g. to your feet) look roughly like asynchronous digital (for the obvious reason that it's hard to reliably move an analog signal that far).

    In any case, the storage question is not currently answerable. We do a lot of compression, and we can adaptively learn to compress data that we see a lot of.
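
    A quick illustration in C of that measurement-accuracy point: once you fix how finely a reading can be resolved, the information per reading is just log2 of the number of distinguishable levels (the one-part-in-1000 resolution is an arbitrary example):

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        /* An analog reading resolvable to one part in 1000 can take
           about 1000 distinguishable values... */
        double levels = 1000.0;

        /* ...so each reading carries log2(levels) bits of information. */
        double bits = log2(levels);
        printf("about %.1f bits per reading\n", bits);   /* ~10 bits */
        return 0;
    }
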
  • >What, so we just sprang into being because of
    > some divine influence? And that makes *more*
    >sense? Give me a break.

    hmm, well... yes. It does make more sense. Much
    more sense than saying that we are a freak of nature.

    Then again, looking at some people around here...
  • >and the fact of evolution is being observed
    > right now -- it's happening in front of our
    > eyes, for those who are willing to look.

    Hahahah yeah right. Show me where one species has had its DNA changed or altered and remained viable, and even reproduced. We're not talking natural selection... that's not really evolution.

    > There are species living with the eye in what we
    >could term mid-step of evolution.

    two problems there. One is the *could* in "could term mid-step", meaning there is still some doubt. Two, why isn't there a progression of steps, instead of concrete, finite distinctions? For that matter, how can we classify species under evolution? There should be sooo many variations and grades between species that one couldn't say for sure. According to evolution, our classification should be more like:
    "This species is 40% this and 30% that and 1% this..."

    If you look at *all* the facts, evolution takes just as much faith as creation. Maybe more. There is also significant scientific proof for creation. As you say:

    "it's happening in front of our eyes, for those who are willing to look."

  • I find the article you cite hard to swallow, especially the idea that we process data at the rate of two bits per second. I memorized the sentence "Visual, verbal, musical, or whatever--two bits per second" in about a second, and that's WAY more than two bits in any reasonable encoding scheme...
  • Heh, I wonder what the RIAA would think of that?
    "I didn't steal the music! It's just such a catchy song that I can't forget it ..."
    Ted Kaehler has some interesting comments [webpage.com] on the excellent book "The Cerebral Code" (which is available online [washington.edu]). Ted calculated the capacity of the typical human cortex at 500 terabytes. While I don't agree with the way he did it or with his result, it is worth looking into.
    Well, I personally believe that the brain may be some sort of ultimate storage device with an almost limitless range of space. You take various case studies and you can see that pretty much every little fact a person encounters in his life is stored; it's just a matter of retrieving it. Using various techniques, people have reported recollections dating to the womb, their early childhood years, and things that happened yesterday, down to the most precise details. Our mind is basically a vast collection of electromagnetic junctions, each emanating from and working through the mass known as our brain, which has the right kind of shape and size, and a lot of liquid too, to hold vast information, probably beyond our comprehension. It seems only logical to me that everything we see, hear, touch, etc. is stored via electromagnetic impulses to various places in our brain, forming the most powerful storage medium ever seen, and maybe that ever will be. After all, it sure seems capable of storing the information of a person's entire life; though that person may have a hard time recollecting those facts, they are there.
  • The brain might not be the best model to use as a storage system. If you think about it, the brain must be using some sort of lossy storage system.

    I mean, I can't remember everything I've experienced in 32-bit color with sound, although some things I can remember clearly.

    In addition, a lot of times you can't remember something you want to, when you want to (much later, it seems, the answer comes floating back).

    I feel that the brain is clearly optimized for something other than bit-perfect storage and quick recall of arbitrary data, and so I don't think it would work real well as a storage medium.

    However, with some sort of lossy storage, it can probably store a truckload of data, much as jpeg can compress images down a lot.


    -Richard.
  • > 3) Does it have a RAID driver yet?

    Shouldn't that be: RAIB ? Redundant Array of
    Inexpensive Brains?

    -- REW
  • What, so we just sprang into being because of some divine influence? And that makes *more* sense? Give me a break.
    There's a book all about the 'hologramic' nature of the human brain at: http://www.instinct.org/texts/shufflebrain/shufflebrain-book00.html [instinct.org].
    The Second Law no longer requires a "closed system" to be valid. The November 1987 issue of Science (if I remember right) discussed the proof of the validity of the Second Law for a gas confined in two dimensions, and the authors indicated they would soon be ready to establish that the Second Law is valid for an unconfined gas in three dimensions, also. I never followed the proof, so I don't know if they published the proof for three dimensions, but they gave 1991 as an approximate deadline.
  • "Heat to a rolling boil" ????
    Is that the boiling point of carbon or the boiling point of hydrogen? It wouldn't matter anyway, because in an aqueous environment using CO2 and H2O, the heat of formation of CH4 is +20.34 kg-cal per gram molecular weight, which means it's not spontaneous.
    What you're really referring to is that "experiment" in which methane, ammonia, water and electrical sparks (simulating lightning) reflux for several days and a tarry substance containing amino acids appears. From this we are supposed to jump to self-replicating molecules and spontaneous life. Quite a leap of faith....
  • So you think Mother Nature makes sure that the flow of heat among atoms and molecules only occurs when humans are doing "thermodynamics" experiments? What you probably mean to say is that Thermodynamics doesn't depend upon any specific laws of chemistry or physics but stands alone as a body of knowledge and experience. So, if your pet theory in chemistry disagrees with the one of the laws of thermodynamics then so much for your pet theory. Thermodynamics rules on the validity of any other area of science.
    1st Law: dE = q + w Translation: you can't get out of a process more than what you put in. English translation: you can't win.
    2nd Law: dS = dQ/T Translation: At a given temperature, the change in disorder of a system is proportional to the change in heat content. English translation: you can't break even.
    3rd Law: To avoid discussions about adiabatics, simply say that 0K is impossible to reach. English translation: you can't get out of the game.
    These laws apply to any circumstance which involves matter interacting in any way at any temperature. Period. To say otherwise is to not understand how Carnot derived the concept and how it has been enlarged upon since.
    Actually, that makes perfect sense. Take a look at space/matter/energy and assume that the universe was once made up of pure energy (pre-big-bang) with no space and no matter. If there was only a 0.00003 percent chance of a bang occurring at any given time (I know -- time didn't exist either, right? You know what I mean here...) and time is assumed to be infinite, we can then postulate that a bang must/will occur at some point. (The probability works out as in the sketch below.) The universe's decision over life/non-life is very similar, IMHO, and we are one small part of the result.

    This might seem deterministic to all of you who ardently believe in free will, but then again, the more we learn about DNA, the more we realize we were programmed by the biggest computer of all right from the start - the universe/God/Nature(call it what you want).

    --
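
    The "given enough time it must happen" step is just this bit of probability, sketched in C with the 0.00003 percent per-trial chance from above (the trial counts are arbitrary):

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        /* Chance of the event in any single "trial": 0.00003 percent. */
        double p = 0.00003 / 100.0;

        /* P(at least one occurrence in n trials) = 1 - (1 - p)^n,
           which climbs toward 1 as n grows without bound. */
        for (double n = 1e6; n <= 1e9; n *= 10.0)
            printf("after %.0e trials: %.6f\n", n, 1.0 - pow(1.0 - p, n));
        return 0;
    }
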

    The average human brain has about 10^10 neurons, each of which is connected to an average of 10^4 other neurons. This means that at a minimum the brain has 10^14 bits. Dividing this last number by 8 and then by 1024^4 translates it into roughly 11.4 TBytes. (The arithmetic is sketched below.)

    This is, of course, a very crude lower bound.

    Synaptic connections are far more likely to come in many shades of gray. Let's say that each synapse has about one thousand possible strength values; that's log2(1000), or about 10 bits per synapse, so you can multiply this lower bound by a factor of 10.

    Next, if you include the specifics of the wiring, you can factor in the combinatorial nature of how the wiring could take place. This buys you roughly another factor of 30 (log2(10^10), about 33 bits, to encode each connection's destination neuron).

    But with spiking neurons, who knows? So all bets are off.
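
    Here is the arithmetic above spelled out as a C sketch (the neuron and synapse counts are the same round numbers as above, not new data):

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double neurons             = 1e10;  /* ~10^10 neurons         */
        double synapses_per_neuron = 1e4;   /* ~10^4 connections each */
        double synapses = neurons * synapses_per_neuron;   /* 10^14 */

        /* Lower bound: one bit per synapse. */
        double bits = synapses * 1.0;
        printf("one bit per synapse:  %.1f terabytes\n",
               bits / 8.0 / pow(1024.0, 4));                 /* ~11.4 TB */

        /* ~1000 distinguishable strengths per synapse: log2(1000) ~ 10 bits. */
        printf("with graded synapses: %.0f terabytes\n",
               bits * log2(1000.0) / 8.0 / pow(1024.0, 4));

        /* Naming which of 10^10 neurons each connection targets:
           log2(10^10) ~ 33 more bits per synapse. */
        printf("with wiring included: %.0f terabytes\n",
               bits * (log2(1000.0) + log2(1e10)) / 8.0 / pow(1024.0, 4));
        return 0;
    }
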
  • Hmm. We consume lots of resources and leak all over the place, so just maybe!
  • The three necessary comments have yet to be made, so I just thought I'd have to say them.

    1) Can it run Linux?
    2) How many MP3s can we get on it?
    And 3) Does it have a RAID driver yet?
  • Forgive the flame-fest, but I can't resist this troll:

    Take a look at the second law of thermodynamics. "Entropy in a system increases over time." it is obviously apparent that humans are very structured beings, and to hold the second law of thermodynamics sacred

    Weak, weak, weak.

    Along your line of logic:

    Ice is a higher state of order than water.

    Water has more entropy than ice.

    You can not decrease entropy, as it is always increasing

    Therefore, water can not become ice because entropy is always increasing.

    This is, of course, stupid.

    The second law of thermodynamics argument is so wrongly used it makes me sick to my stomach each time I hear a creationist use it. It only proves how they don't try to understand the physics.

    Water, of course, CAN turn into ice; the extra heat energy is lost -- radiated -- the energy is elsewhere. It is when you look at the TOTAL system, i.e. all of the energy, every erg and calorie, that the entropy is increased.

    The earth is not a closed system. It's getting energy from the sun every day. The second law constrains the total system (earth, sun, and the space they radiate into), not the earth in isolation, so local decreases in entropy are perfectly allowed.
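
    To put a number on the ice example, here is a minimal sketch using standard textbook figures (latent heat of fusion of water ~334 kJ/kg, water freezing at 273 K while the released heat flows into surroundings at 263 K). The only point is that the water's entropy drop is more than paid for by the surroundings' entropy gain:

```python
# 1 kg of water freezing at 0 C (273 K) while the released latent heat
# flows into surroundings held at -10 C (263 K). Rough textbook values.
L_fusion = 334e3    # J/kg, latent heat of fusion of water
T_water  = 273.0    # K
T_surr   = 263.0    # K

dS_water = -L_fusion / T_water   # water gets more ordered: its entropy drops
dS_surr  = +L_fusion / T_surr    # surroundings absorb the heat: entropy rises

print(f"dS(water)        = {dS_water:+8.1f} J/K")
print(f"dS(surroundings) = {dS_surr:+8.1f} J/K")
print(f"dS(total)        = {dS_water + dS_surr:+8.1f} J/K  (> 0, as the 2nd law demands)")
```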

    Another very basic scientific law states that living tissue can not spawn from non living matter.

    Sounds positively victorian-era to me.

    There's a REASON biologists across the globe don't debate creationism versus evolutionism anymore. The debate is settled. Everyone has gone home. The scientists may debate the rate and the methods of evolution, but not the concepts brought forth by Darwin.

    All right creationist. Answer me this: (p.s. you lose if you say some stupid thing like "Cause God made it that way") Without evolution, how can you explain:

    Why do snakes have hip bones?

    Why do whales have hip bones?

    Why do chimpanzees have 99% shared genetic material with humans?

    Why does a mosquito have significantly less in common with humans?

    Why do dolphins and whales have more genetic material in common with humans than do fish?

    Explain how fossils of primitive creatures seem to always be in stratum layers below more advanced creatures.

    Explain all those dinosaurs, and why they're gone now.

    Explain biotechnology, and tell me how it could have come this far without a deep understanding about biology -- when you say they are all wrong because the Bible says they are.

    Explain myxomatosis outbreaks in Australia, and why there are still rabbits, despite the deadliness of the disease.

    Explain Archaeopteryx

    Explain Homo habilis

    Explain Homo erectus

    Explain Homo neanderthalensis

    Explain why penicillin isn't so great for everything anymore.

    Explain how some scientists predicted the demise of penicillin efficacy before the symptoms of bacterial resistance were observed.

    Overall, I'm unmoved by your arguments. The theory of evolution has completely revolutionized the entire field of biology. Saying that it's not true is analogous to telling a geographer that the world is still flat, or a physicist that relativity is a crock.

    Biotechnology will move on without you. You can continue to sit with your fingers in your ears shouting 'NOT TRUE! NOT TRUE!' while you choose your path of ignorance and unenlightenment.

  • AFAIK, each neuron in the brain makes connections with up to 10,000 other neurons. You go figure the maths out.
  • go read the stuff on this website
  • The only reason humans "fill up" their brains (if this is even possible) is because we absorb way too much useless information. For example, in the course of reading through the comments here, I've absorbed the information that somebody thinks the human brain holds hundreds of exabits, 13TB, or 122MB. This has no relevance to anything, I just happen to have remembered it. Most humans have selectively photographic memories, but we can never remember an entire page of text because we remember only the concept of the picture or object. Very few people can memorize an entire page of text on one reading (I am so jealous of those people!)
  • You flippantly mention "remembering the number of cracks in the wall". I learned that each of our eyes is very, very inaccurate and that we have a 2-degree section of focus, in which we see with the accuracy that we're accustomed to. I learned that we constantly skip about with our eyes and that our view of the world around us is mostly an internal model, with small parts of it updated frequently with high precision. If what I learned is true, you'd not remember the cracks in the wall, because they wouldn't have your attention.
  • I read somewhere that the brain's capacity was on the order of 13 TB. No links to back it up yet...

    If you want to back up 13TB, you might start with EMC [emc.com]. We deal with datasets measured in terabytes every day, and are quite adept at backing them up without even taking them offline.

    :)

    [Yes, I work for EMC--it's a wonderful job.]
  • I still find it amusing that they made that movie from a 5 page short story.

    Hey, "2001: A Space Odyssey" was made from a short story ("The Sentinel") that was only about five pages, maybe ten. They threw in a scene from another Clarke story, "Take A Deep Breath". (Guess which scene :-)

    (Mind, Clarke is (or was) a SCUBA diver, so he should know better. I.e. exhale, don't hold your breath, or you'll risk an embolism.)

    Come to think of it, a lot of SF movies are based on short stories or novelettes, rather than full length novels. Too hard to do justice to the latter in the screen time available. "Dune" should have been a mini-series. (And not done by De Laurentiis.)
  • by Fish Man ( 20098 ) on Friday June 18, 1999 @10:43AM (#1843951) Homepage
    Actually, human vision is the aspect of the brain that I find most fascinating, partially because my professional specialty is artificial vision. I find it fascinating how much more advanced our biological vision processing is than anything we can achieve artificially.

    Several fascinating aspects of human vision processing:

    The "raw" "pixel" resolution of the human eye is actually flaberghastingly low, on the order of 200 x 200 pixels (the effective resolving ability of the rods and cones).

    However, the human eye "snaps" about 10 - 12 "frames" per second (maximum, in good light) and the brain integrates subsequent frames, each with very subtle positional differences, and compares adjacent "pixels" from frame to frame to assemble an image of dramatically higher resolution. Thousands by thousands of "pixels" when required by the task being performed (e. g. threading a needle or intricate soldering). This is why staring at a small object for some length of time is necessary before we perceive all the most subtle details.

    An additional "weakness" of the eye that the brain performs some amazing processing to compensate for is this: The rods and cones of the eye are "recharged" by flushing a fluid containing rhodopsin across the retina. Rhodopsin is a protein that breaks down and emits a tiny electro-chemical current when struck by photons of light. The speed of the rhodopsin decay is proportional to the intensity of the light hitting it. The retina "recharges" when the previous charge of rhodopsin is nearly depleted. This works out to 10 - 12 times per second in bright light, much less (down to a minimum of perhaps once per second) in very low light.

    Anyway, the flaw in the above scheme is that the "dose" of rhodopsin that each rod or cone receives in any given "recharge" is very poorly controlled. It varies all over the place. This means that the electrical current emitted by any given rod or cone for any given intensity of light from frame to frame is not consistent! So, the brain has to analyze the average current emitted by each rod and cone, over the surface of the retina and over time, and integrate this information to produce an accurate and detailed internal picture inside the brain!

    The analogy is this (for all you artificial vision programmers):

    Imagine that your boss gave you this task:

    We are going to give you a CCD camera with an array of 200 x 200 pixels. We will rapidly vibrate the camera so that by integrating the subtle changes between adjacent pixels you will, after storing 30 frames, interpolate a picture with a resolution of 5000 x 5000 pixels. Furthermore, the brightness value digitized by each camera pixel is going to randomly vary by 200% for any given actual light intensity. Your system has to output a real time image flow at the above resolution and a brightness accuracy of +- 0.01%.

    Yeah right!

    But this is analogous to what the brain does!

    I've always been more impressed by the brain's processing power than by its storage capacity.
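
    For the programmers: the "vibrating camera" trick described above is essentially shift-and-add super-resolution. The toy numpy sketch below is not a model of the retina; it is just an illustration, with invented numbers, of how averaging many jittered, noisy, low-resolution samples recovers a finer and far less noisy estimate than any single frame:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "scene": a 1-D intensity profile too fine for the sensor to resolve.
HI = 1000
x = np.linspace(0.0, 1.0, HI)
scene = np.sin(40 * np.pi * x) + 0.5 * np.sin(3 * np.pi * x)

LO = 50                   # sensor resolution: 20x coarser than the scene
FACTOR = HI // LO
N_FRAMES = 200            # many jittered, noisy exposures
GAIN_NOISE = 1.0          # per-sample gain varies wildly, as in the comment

acc = np.zeros(HI)        # accumulated samples on the fine grid
hits = np.zeros(HI)       # how many frames contributed to each fine position

for _ in range(N_FRAMES):
    shift = rng.integers(0, FACTOR)            # sub-"pixel" jitter of this frame
    idx = shift + FACTOR * np.arange(LO)       # which fine positions were sampled
    # Each coarse sample is the true value corrupted by large gain noise
    # (the poorly controlled "rhodopsin dose").
    frame = scene[idx] * (1.0 + GAIN_NOISE * rng.standard_normal(LO))
    acc[idx] += frame
    hits[idx] += 1

recon = np.where(hits > 0, acc / np.maximum(hits, 1), 0.0)
multi_frame_err = np.abs(recon - scene)[hits > 0].mean()
single_frame_err = np.abs(frame - scene[idx]).mean()   # error of the last raw frame

print(f"mean error of one raw frame       : {single_frame_err:.3f}")
print(f"mean error after {N_FRAMES} merged frames: {multi_frame_err:.3f}")
```

    Even with per-sample noise comparable to the signal itself, the handful of overlapping frames behind each fine-grid position pulls the error down by roughly the square root of the number of contributions.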
  • by RebornData ( 25811 ) on Friday June 18, 1999 @09:18AM (#1843969)
    Disclaimer: I'm not a neuroscientist, but I've been reading a lot of books on this recently, and there are huge differences between computer and brain storage that make this kind of measure meaningless.

    First, the "write" operation is highly dependent on how you experience an event. You can't be fully, simultaneously aware of every input- the brain is an excellent signal filter, and only processes those aspects of the environment you are focused on. But there's also an "interest" component- even if you are really paying attention an input, the aspects of it that you find important will be what you remember. Example- there was a study where a researcher asked workers in a museum about a particular painting they all saw on a regular basis. No two people described it the same way- some described the colors, others the emotions they felt as a result of the content, still others the execution of the painting and the specific stylistic elements. And what they remembered correlated closely to what it was about the painting they were "interested" in as part of their job- the curator's recollection (style, context, etc..) was very different from the guy who cleaned it (complicated, hard-to-clean frame).

    Secondly, a very important aspect of remembering is uniqueness- something distinctive about a memory that allows you to get a "handle" on it later. It's also thought that multiple, similar experiences tend to blur each other and reinforce the common elements between experiences. For example, I can tell you exactly how I get to work and what lanes I prefer to use, but I can't tell you the exact sequence of lane changes I made on any specific trip.

    Third, the brain has a very powerful reconstruction mechanism. It's kinda like dinosaur skeleton reconstruction. Just as a paleontologist can fairly accurately reconstruct an entire skeleton from a relatively small number of bones (or bone fragments), your brain pulls together and reconstructs the few bits of a specific experience that were stored and synthesizes a more detailed remembrance from the fuzzier "generalized" remembrances to give you the impression of remembering much more detail than you actually stored.

    This all contributes to explaining why it is so difficult for humans to remember "digital" data. For most of us, there's very little that's interesting, unique, or distinctive about the numbers in a sequence. Mnemonists with apparently infinite abilities to recall details generally have a learned or innate mechanism by which they create unique, distinctive symbols for number sequences which make it possible for them to remember them. In the most highly-developed cases, these symbols encompass every sense- sight, sound, taste, texture, smell.

    But such people are often cognitively lost in details... they can't deal with concepts easily, and can't abstract over information they have taken in, since they are so overwhelmed by the distinctiveness and richness of the details. As are computers, which know nothing except detail. So the "lossiness" of the human memory actually serves a useful purpose, and is a large part of what makes us "intelligent" relative to a piece of silicon.

    Of course, I'd like to have it both ways... :-)
  • There is a consciousness register set, where you remember things like the phone number you're about to dial. Capacity of 7 items plus or minus 2.

    There is a short-term memory of your entire current awareness. The registers often are used to point at these things, whether they be the red car you're avoiding, the smell of that pie you just noticed, or the part of the network topology which you're designing. These are augmented by various short-term memories for particular senses (the guy on the walkie-talkie just said your name, and you can recall several seconds of sound before that even though you weren't paying attention due to the auditory system memory).

    There is a long-term memory which is updated after going through various filters. Emotions tend to increase the chances of a memory being stored permanently. Severe trauma blocks storage of memories (severe accident victims can recall details on the scene, but not after rest).

    Some memory processing seems to be done during sleep, but the major reason for sleep is to recharge the energy-storing glial cells because vertebrate brains use more energy than the bloodstream can supply (otherwise there would be mammals which never sleep due to the evolutionary advantage that would provide).

  • I thought this sounded familiar. One source is the /. article on May 15: Task Processor Found In Human Brain [slashdot.org]
    Slashdot needs better filters on submitted articles, to point editors toward related past articles.

    As a follow-up to my AC post above, I quote from Russell and Norvig's book, Artificial Intelligence: A Modern Approach.

    A crude comparison of the raw computational resources available to computers (circa 1994) and brains:

    • COMPUTATIONAL UNITS: Computer: 1 CPU, 10^5 gates ; Brain: 10^11
    • STORAGE UNITS: Computer: 10^9 bits RAM, 10^10 bits disk; Brain: 10^11 neurons, 10^14 synapses
    • CYCLE TIMES: Computer: 10^-8 sec; Brain: 10 ms
    • BANDWIDTH: Computer: 10^9 bits/sec ; Brain: 10^14 bps
    • NEURON UPDATES/SEC: Computer: 10^5 ; Brain: 10^14

    The number of hard disk storage bits may be approaching the number of neurons and connections in the human brain, but one bit on a disk carries less information than one neuron or synapse. A disk-based representation would need at least one link per item, and in many cases several, and each link itself takes many bits to store.

  • by jbf ( 30261 )
    I don't buy the 2bps argument. People with perfect pitch, for example, can tell you in less than 1s what note you're playing on an 88-key keyboard (6.45 bps). Even an average amateur musician can tell you the interval between two notes in less than 1 second (3.58 bps). Perhaps the true/false questions that were asked were insufficient to provide the full range of data that is processed by the brain, partially because of vocabulary limits (how do you describe the "amount" of light, or level of sound, or pitch modulation, without having a tool to measure it? Yet we still can distinguish between really small levels).
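
    (The bits-per-answer figures above are just log2 of the number of alternatives; a two-line sketch, assuming the one-second response times quoted:)

```python
import math

# Bits conveyed per one-second identification, assuming the response
# times quoted in the comment above.
bits_note = math.log2(88)       # naming one key of an 88-key piano
bits_interval = math.log2(12)   # naming one of the 12 intervals in an octave

print(f"one of 88 notes     : {bits_note:.2f} bits -> ~{bits_note:.2f} bps")
print(f"one of 12 intervals : {bits_interval:.2f} bits -> ~{bits_interval:.2f} bps")
```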

    Another problem with the 2bps analogy is that you can't capture the entropy of a concept in bits, and that's a major factor in human memory. (Ever crammed for a test in a class you don't understand?)

    I also think that some of these other numbers are high. MPEG, JPEG, wavelet, and MP3 compression show that not all the information we store to reproduce something electronically is actually significant to the human mind.

    Just my 2c.
  • First off, although I don't do this stuff day to day anymore, I did do my degree in cognitive science, so this is the kind of stuff we were expected to think about.

    The question as asked is ambiguous, since a 'hard disk' is an object whose contents have no semantics, just syntax, that is they have no meaning except when interpreted by another entity (you, me, sendmail, etc). OTOH the information that is stored in the brain is a mixture of semantics and syntax that we do not yet understand.

    To clarify this I would like to give a very /. analogy. Consider a minimal working Linux installation (basic hardware, kernel, shell). Let us say that the system has a 10GB hard disk. It is obvious that most of the HD is almost completely devoid of information, since it has no context, but is just blank. However the portion of the HD that contains the base software is very information rich, but only in the context of the surrounding hardware (BIOS etc), and perhaps more importantly, in the context of the wider environment of /.ers and others who know what Linux is and what it can do, and can interact with it.

    Now here we have drawn three levels fairly clearly - the HD, the bios and other hard/firmware, and the rest of the world. But in the case of a human brain it is not at all clear where (or if) one can draw these distinctions, so only a complete description can suffice (i.e. we are not able to summarise the state by means of external references)

    In this event we can rephrase the question as follows:

    Given the required processing capacity, what amount of storage would be necessary to provide the same information processing capacity as a human brain?

    Now here we hit astronomical numbers. The question is equivalent (check Turing, Church, etc) to asking how many bits it would require to store a complete description of a human brain at a given instant. This is certainly a smaller number than a precise description of the state of every subatomic particle in a brain (i.e. less than the memory required for a Star Trek transporter), but is still pretty big.

    Back of an envelope? A very conservative envelope? If a brain state could be described by the states of each neuron and each connection in the brain, and each of those took 16 bits (which is almost certainly a gross underestimate), and there are ~10^12 neurons and ~10^3 connections each on average (Churchland and Sejnowski, 1992), then that is 16*10^15.

    or 16 petabits

    HTH

    Matt
  • 10-12 frames per second seems _unbelievably_ slow to me. I (and most anyone else) can easily see a difference between an image moving at 12FPS and one at 24FPS. Even the difference between 30 and 60FPS is pretty obvious.

    I fooled around for six months writing a juggling animator. At the end, I had it so that it could animate at the refresh rate of my monitor (75fps). If I animated the pattern "in real time" (i.e. a cycle takes as long as it would take if it was juggled by a six foot human on earth), the balls moved quickly enough on the screen that without motion blur there was a stroboscopic effect. So, I added motion blur, and was able to retain the 75fps. With other refinements such as subpixel positioning, the animation looked perfectly smooth. Out of curiosity, I reduced the frame rate to 37.5 fps, leaving all of the refinements in place. The difference was subtle, but I and several of my friends that I showed it to could all see the difference.

    I don't think this proves that the eye is sampling at a rate higher than 37.5 fps, since the eye/brain could be doing a lot of amazing image processing to achieve an effective frame rate higher than the raw frame rate.
  • Disclaimer: I know only a modest amount about neuropsychology/theory of computation...

    OK, I've thought about this for a while and here goes:

    As far as I can tell, there is very little we actually *know* about the brain. It cannot be likened in any definite way to a (theoretically) lossless accurate finite state machine/automaton, so there is no hard and fast way to answer this question. Get rid of that 122 meg estimate - he's obviously talking about something else ;).

    Basically, from what I can tell, the brain works _as a whole_ to store any one given 'item of information'. This is unlike a conventional computer that has a specific location to store a specific bit of data.

    Any storage device on a conventional computer (as far as I know) uses some sort of definite addressing mechanism to access a particular piece of data.

    On the other hand the brain "stores" pretty much everything you experienced, whether you remember it or not. The problem is how it is /addressed/. In the brain's case, there may be loads of 'data' stored, it's just that you can't remember the 'links' needed to actually get to it - ever been caught out by that word you know, but you just can't quite remember? (I know I have ;).

    The way the brain 'records' experiences is by changing (m)any neurones' receptivity to neighbouring cells as well as those neurones' internal chemical (im)balance. This would give a "random" (hey - never use the 'r' word without the quotes) yet analogue data storage that clearly cannot be enumerated in any definitive way in bytes.

    I believe it's entirely possible that the brain could even 'record' data in more outlandish methods such as small inductive/capacitive fields and chaotic electrical interference. What is definitely known is that the way the brain stores data is probably more than the "just count the number of cells and multiply by ten" way shown in one of the replies. Such a way would be immature, inaccurate and an insult to the more thoughtful amongst us.

    Now, since I've said all that about just how impossible it is to measure the maximum capacity of the brain, what is probably a more practical answer is how much the brain will store on average. This I don't know, though I guess that would be a question for the psychologists out there - and maybe that's where the 122 meg guess comes into it.
  • by NullGrey ( 46215 ) on Friday June 18, 1999 @09:19AM (#1844052)
    I need a defragger. ;)


    +--
    Given infinite time, 100 monkeys could type out the complete works of Shakespeare.
  • You'll find in linguistics as well that most people do not have a concurrently available vocabulary of thousands of words, in fact 5000 words is a lot even for an adult, you don't need nearly that to get along in society.
    You and I have obviously been reading different linguistics texts. A non-native speaker visiting a foreign country could get by reasonably well with a vocabulary of 5000 words, but that is not a lot of words for a native adult speaker. In fact, it is not that much for a child.

    The question of how many words a person knows cannot be answered very precisely -- in part because the question is ill-defined. Do you include derived words, or only root words? How well does the person need to know the meaning of the words? What about words that have multiple meanings? Do they count as one word or several?

    However, once you settle on a definition of "knowing" a word, you can estimate the number of words a person knows by randomly selecting words from a dictionary of known size (preferably a very large dictionary) and conducting a little vocabulary test. The "known size" requirement of the dictionary isn't trivial, since you are presumably only interested in root words, whereas the publisher's word count will include compound words, whose meaning could be inferred from the root words.

    Using the above approach, Nagy and Anderson at U. Illinois, estimated that the average high school grad knows 45,000 words. Throw in all the words that aren't listed in an English language dictionary, such as proper nouns, acronyms, recent slang, etc., and the count will be closer to 60,000. Averaged over the student's lifetime, this works out to learning an average of 10 or more words per day! I've read other, higher, estimates of the size of an adult vocabulary. I found this particular analysis in George A. Miller's The Science of Words.
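
    As a toy illustration of the sampling method described above (all numbers invented; the 22% hit rate is chosen only so the output lands near the ~45,000 figure quoted):

```python
import random

# Toy version of the dictionary-sampling estimate. Every number here is
# made up for illustration, not taken from the study cited above.
DICTIONARY_SIZE = 200_000   # hypothetical count of root headwords
SAMPLE_SIZE = 400           # words actually put to the test subject

random.seed(1)
# Stand-in for "does the subject know this word?"
known = sum(random.random() < 0.22 for _ in range(SAMPLE_SIZE))

fraction_known = known / SAMPLE_SIZE
estimate = fraction_known * DICTIONARY_SIZE
print(f"knew {known}/{SAMPLE_SIZE} sampled words "
      f"-> estimated vocabulary of roughly {estimate:,.0f} words")
```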

    IANAL (I am not a linguist), and I would be happy to be corrected by someone who is. But as someone who has struggled to learn foreign languages as an adult, I am well aware of how far 5000 words is from a passable adult vocabulary. If anyone else, like me, is interested in learning languages, you might find this web hack [wordbot.com] I wrote of interest.

  • We die because it is advantageous to our species to die.
    This is a commonly held belief, but natural selection doesn't work that way. Selection acts on individuals, not species. A trait that is beneficial to the individual (i.e., helps that individual reproduce) will always win out, even if it is harmful to the species as a whole. If longer-lived people passed on more of their genes than shorter-lived people, then soon the population would be made up only of the longer-lived people. Even if humanity as a whole were worse off because of it, it would still happen.

    So if it's not for the benefit of humanity, why do we all self-destruct within a century or so? Well, evolution is all about compromises. Few of our ancestors lived long enough to die of heart attacks or cancer, since they were more likely to become some predator's lunch. So rather than optimizing for conditions that rarely happen, such as old age, nature, like any good hacker, optimized for the common case: youth. Any mutation that increases our survival when we're young at the expense of killing us when we're ancient is likely to have been selected for.

  • In Psychology we were taught that short-term memory usually lasts less than 30 seconds and is limited. Long-term memory is unlimited and is permanent.

    A little food for thought. We discussed one patient who had a literal photographic memory. It was explained to us that he committed every memory to long term memory and thus was unable to forget anything. He eventually went insane.

    Sorry to bring an unprovable story in, but can you imagine the ramifications of that? It would seem that having a selectively photographic memory would be nice. Just remember EVERYTHING that you want--Good times, your wedding or divorce even... I dunno, or an entire textbook before a test. But imagine if you couldn't control what you remembered. Think of all the useless information you receive every second. What people are wearing, the color of the wall, how many cracks in the wall...

    I most certainly would go insane.
    -Clump
  • For anyone really interested in this subject, there is a GREAT book called The Evolution of Consciousness. One of the main premises of this book is that the processing in our brain is specifically designed to limit raw information storage. What we remember is the impression of the overall environment, and then we reconstruct the memory each time we need it, inventing details which are consistent with the impression. This explains why everyone at one time or another distinctly remembers you saying something which you didn't actually say. This means that only a fraction of the information is actually stored, while the rest is reconstructed. Kind of like the way a 3D game is reconstructed out of various primitives.
  • This is very interesting. Perhaps the variable timing of rod recharge can be interpreted by the brain as improved frame rate. Studies have been done that show the eye darts around what it sees and takes in detail of a small portion of the image. So that initial 200x200 enhanced to 5000x5000 image is then enhanced section by section as the eye focusses on parts in a non-ordered fashion. So if you stand in front of a painting and look at it, your eye is resolving the image more and more accurately the more you look at it. The same thing is true of staring at the head of a pin. First it's blurry, then you focus. Some of that is adjusting focus, some is the gradual resolving of the image.
    Back on the topic of human memory, think about all the little details you remember about last weekend. What you did, who you saw. That is the 2-bit per second memory. Now recall everything you know about your favorite song. You know the melody, the lyrics, the beat, effects, maybe the music video; you recall images, times you listened to the song, who you were with. What you recall about a song or a movie or a TV show, or a news article or a web page can amount to a lot of data if it is forced into binary. Take the tune of the song. First we must describe each note in binary, so assign a number of bits that can describe how well you know each note. 8? 16? Assume the song has a refrain, so part of the song is repeated and hence not new memory. Fur Elise has 17 core notes, making for 204 bits if we differentiate to 12-bit music. Now we know the instruments, how they sound for the song. Describing that in binary can eat up data pretty fast, but we only remember a few samples from the song. My point is that just a well-remembered song can eat up a meg or more. Know a hundred songs? Well there's a hundred meg. Know your profession? How about movies? A well recalled clip, image, quote, all times the number of movies or shows remembered. We can recall the tone of voice used, the plot summary, and tons more. There is no way this data could be stored in binary form and not take up gigs. Over a lifetime, the 13TB figure seems reasonable.
