Science

Task Processor Found in Human Brain

BillG writes "So people can do more than one job at the same time. And if this report is to be believed, we can do it much more efficiently. After all, the human brain is THE most powerful computer." So that explains how I can do e-mail and post at the same time.
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward
    Is there something known as the opposite of ADD? I've always been a very single-tasked person.

    One of my friends says that I suffer from
    'reflexive concentration'.

    I can do things that involve concentration: hypnosis, programming, etc. very well, but I suffer from information overload if I have to do more than one thing at a time well--like driving.

  • by Anonymous Coward
    The poster above suggested that attention is the limiting factor in how many tasks you can perform. Not a bad start, but not quite correct.

    Not all tasks require attention. For example, I bet you have no problem listening to NPR in the morning while driving to work. The route is so overlearned that attention is not necessary. You can listen intently and still get to work safely.

    Of course, it depends on what you mean by "task". For example, my visual system alone is extracting, in parallel, millions of features from the visual field--and all without the need for attention (some researchers call this 'pre-attentive', but IMHO that is lame). So, I can perform millions of tasks at once.

    Having said that, there are definite instances in which we can seemingly perform only one task at a time. So whatever you mean by "task" has to be well defined. Why are some tasks easy and some hard (and require attention)? Probably signal-to-noise ratio. The more overlearned or easier the task, the higher the signal. The lower the SNR, the more likely attention will be needed.

    -Anonymous Cognitive Scientist (but rarely a coward)
  • by Roland ( 61 )
    http://www.webcom.com/bmainc/welcome.html
    Check this out: they do some cool, non-drug treatment that teaches you how to control brainwave patterns. I went there twice a week when I was in high school.
  • I wonder if I can find the /usr/src/linux/include/asm/param.h file in my brain. That way I could change the scheduler tick to 1024 Hz and run single processes faster when I'm only thinking about one thing. That would kick ass. (A sketch of that header follows below.)

    I think the fact that we don't crash every day and see a big blue screen tells us that our brains sure as hell aren't running Windows.
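
    For reference, here's roughly what that header looks like in a 2.2-era kernel tree (a sketch from memory, so treat the guard names as approximate); HZ is the timer-interrupt rate the scheduler ticks at:

        /* include/asm-i386/param.h, 2.2-era kernel (approximate sketch) */
        #ifndef _ASMi386_PARAM_H
        #define _ASMi386_PARAM_H

        #ifndef HZ
        #define HZ 100  /* stock tick rate; the tweak above would make this 1024 */
        #endif

        #endif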
  • by Jordy ( 440 )
    You should really see a shrink and get checked out, but what you described is a classic example of ADD.

    ADD allows a person to hyperfocus on things they are interested in. The problem is that if that person is put in a situation where there is more than one thing that requires concentration, or they are not interested in a particular task, they tend to get "lost".

    I personally have an appointment with a shrink to get diagnosed properly for ADD. My mother has ADHD, which is similar to ADD but with hyperactivity, as do various other people in my family.

    No one quite understands how ADD works. Last I read, it seemed to have more to do with how you grow up than with being a genetic trait.

    --
  • Not true... you must remember that the human brain is massively parallel and doesn't work on a binary system but rather an analog one. An electrical pulse may have variable intensity, and the response may involve varying amounts of chemical activity...

    This speeds things along considerably. Now... if we just had a more efficient way of exporting and importing data into our brains... we'd be set. I mean, if you could just plug yourself in and learn everything there is to know about a particular subject in a fraction of a second... you'd be set :)

    --
  • However, a thought- I'm pretty good on a _bicycle_ because it requires more attention to keep it balanced and in the right place. Does anyone have opinions or experience regarding autistic/Asperger's people on _motorcycles_? It only just occurred to me- it's possible that would be safer for all concerned, because a motorcycle would absolutely demand all one's attention, stick you out there in the wind with nothing between you and the road, and consequently travelling on a bike might easily result in total attention to the bike and the road and _no_ priority for interesting computer problems or whatever (my nemesis driving a comfortable automobile).
    From my personal experience, I would say you're safer on a bike, but not as safe as you'd like to think. I often find my mind latching on to other things while riding (possibly helped along by the route being so familiar (not many options)).

    However, my mind does wander less on my bike than in a car (I also let my licence expire, with no real desire to renew it). Anyway, biking got my blood pressure down from 130/85 (not bad) to 110/75 (pretty darn good:). Probably helped me get over my recent bronchitis as quickly as I did (still a little phlegmy, but no more aches, shakes and fevers). 'Course, the antibiotics helped too:)

  • Ummm... religion?

    More seriously: what about all those cases of insanity (I mean the real (?eh?) ones, not the political dissident ones) throughout history? I'd think those could be classed as brain explosions.

    Pertaining to my first comment, wouldn't many of you agree that (at least fundamentalist) religion is a form of insanity and thus a case of runaway brain processes?

  • After reading several posts and a novel (The Star Fraction, Ken MacLeod) about thoughts being viruses and, in Ken's book, computers using humans to bring them to life (not well summarized by me), and also remembering that astronomer's (?) comment about life expanding to fill all available niches (NOTE: unlike said astronomer, I don't believe that Earth is the sole source of life), I got hit by an interesting thought (helped along by other posts about reverse engineering the brain): could whatever life-form makes up the contents of our brains (our soul, I guess) be trying to expand its niche from the organic world to the inorganic (e.g. silicon) world?

    What I'm thinking is that our [I'll stick with] souls are using our bodies to both understand themselves and produce more, different (hopefully longer-lasting and more efficient) `bodies' (e.g. a computer or a robot).

    I don't know, just a crazy idea that hit me.

  • By "brain explosion" i mean drooling, twitching, spastic lack of cohesion, followed by overheating and some leakage.

    Which can be mistaken for religion, except it's involuntary ;)

    Umm, since when is most religion voluntary? Usually, you're born into it, your parents taking you to church etc. as they were. Not a lot of choice there, and it takes effort to break the chain. The only voluntary religion I know of right now is free software, and judging by some slashdotters (and others), even that's debatable.

    Anyway, I like your description. Hmm, out the ears? Or nose? (leakage) Is the lack of cohesion physical or mental (or both:). Messy.

    Not too many naturally occurring instances of that version of brain explosion, though there have been plenty of induced cases. I believe a heavy club applied to the head induces the above symptoms.

  • by Skyshadow ( 508 )
    My mom tried to get me diagnosed with ADD when I was 17 and this whole travesty was just beginning. She made a neurologist appointment even though I told her I didn't want to go, and pretty much indicated that if I ever wanted to use the car again I should play along.

    I read pretty darn fast, test extremely well and have a tremendous eye for detail, but they say there's something wrong with me because I'd rather be outside or doing something fun than working on repetitive math homework.

    Now, I ask you, are people with ADD the ones with the problem here? To me, it seems like this is almost genocide on a way of looking at things -- medicating you until you think like all of the rest of the good little worker ants that our society seems so intent on turning out.

    Anyhow, I cheated on the computer-based test (you sit in front of the screen and click a button every time a certain shape flashes up) and so I was only diagnosed with "mild" ADD.

    To all the parents out there who are thinking of medicating their ADD child: Don't. If they ever realize what you've taken from them in exchange for a few good grades in school, they'll hate you for it (and rightfully so).

    ----

  • It looks good, but how efficient is the task switcher? Is it truly parceling out the jobs efficiently, or being held back due to the equipment's (i.e. the body's) own slowness?

    If it's the latter, maybe adding extra capacities (say, extra arms, eyes, or even fingers) could optimize the system. Multitasking at its greatest. It's been proven to work, according to Anomalies and Curiosities of Medicine, dating back to the 1800s. You'll need to use Project Gutenberg to find the text.



    ---
    Spammed? Click here [sputum.com] for free slack on how to fight it!

  • Book is 'The Mind of a Mnemonist'.
  • I'm trying to put a little energy into this article's comments, because people are mentioning autism, but not everyone understands what it is or what they're talking about.
    Asperger's is _very_ much like being single-tasked- but that's not to write it off as a deficit! It's another 'mode' of being, just like ADD (lotsa ADD advocates sounding off, I think that's pretty cool).
    Someone earlier was talking about the mind's 'kernel' versus 'userland' and I thought that was a truly wonderful way of explaining what goes on with autism. In most people, the 'userland' is all over the place and can cover a lot of bases. In autistic people, the 'userland' might cover many or all of those bases, but it will do them one at a time and get stuck on them easily. It's like cooperative multitasking... _with_ the attendant advantages as well (yes, there are some if you constrain your requirements enough).
    Temple Grandin, a well known and successful autistic adult, has explained it as having many deficits and liabilities but also having a 'Sun workstation' in one's head, and given the opportunity this can be a serious advantage. It's not simply being incapacitated for complex interactions like hanging out with 5 people all of whom have subtly different reactions to all the others (gah! How do you normals do this?), it's also having this weird 'black box' processor that normals don't have, and learning what it can do or helping it grow.
    My own 'weird black box' has a lot to do with geometry and design (I was tested on a vocational test called the 'GATB' test and in one key part, a bit where you had to visualize what shape a flat figure could be folded into, I nailed a ten year high score on the test _and_ enjoyed the problems so much I wanted to keep doing them after the 5 minutes were up). I also have some of it devoted to English language and grammar, probably due to extensive reading :) I want it to also cover programming, but it's been very weird, as I simply cannot _comprehend_ the normal programming educational materials (it seems to jump all over the place with no center) and so I have to immerse myself in programming-related stuff and be _around_ people talking (or posting) about programming, and soak it up until a point comes when suddenly I start doing stuff that non-programmers would find completely incomprehensible (I dunno yet if it's stuff programmers would find brilliant. I'll have to wait and see...)
    Driving? I haven't renewed my license, and it's been maybe five years now? I didn't get it _taken_ away from me, it just expired. Looking back, I got in too many accidents anyway so I'm not sure I _want_ to drive. Part of it certainly was drugs (I'm clean and sober at all times now and prefer that), but that's not the whole story- even when not high, I'd suffer from information overload or my brain would want to think about something else, and I hit other cars in fenderbenders. I'm happy not to drive now- it was yet another thing that I 'should' have been able to do, that _normal_ people are able to do, but reality kept tapping me on the shoulder and saying "yo! *crunch* that was a pickup truck's rear bumper, your beloved old Rabbit GTI is now toast while the truck wasn't even scratched! ...aren't you glad it wasn't a little kid in the road?" (shudder)
    However, a thought- I'm pretty good on a _bicycle_ because it requires more attention to keep it balanced and in the right place. Does anyone have opinions or experience regarding autistic/Asperger's people on _motorcycles_? It only just occurred to me- it's possible that would be safer for all concerned, because a motorcycle would absolutely demand all one's attention, stick you out there in the wind with nothing between you and the road, and consequently travelling on a bike might easily result in total attention to the bike and the road and _no_ priority for interesting computer problems or whatever (my nemesis driving a comfortable automobile).
  • You're thinking the more extreme forms of Kanner autism.
    I'm a fellow traveler- I have Asperger's syndrome (I'm one of the many _adult_ Asperger people out there who fell through the cracks in childhood and just had to wing it)
    When you read accounts from someone who's experienced this 'can't-cope' or if you've had a taste of it yourself, then it's easy to get a little more specific about what is going on. It's not really so much that the center of personality is so _weak_- what's happening is basically that the sensory input is being set nice -256 or so! o_O
    (For those who haven't gotten that far, 'nice' sets priority levels: large negative numbers set a process to very high priority, nearly locking out everything else. Another telling similarity is that userland doesn't typically get to nice processes to such levels- only root gets to set negative numbers, and ordinary users can normally only raise a process's nice value, i.e. 'forget' the process and have the computer pay less attention to it! There's a small code sketch of this at the end of this comment.)
    With that understood, it should be easier to imagine what it's like having a sensory input set to unbearably high priority. I'm not Kanner-type autistic but I _have_ had experience with this- when I was a kid we had cats (I still identify with cats powerfully) and at times they had fleas. I developed a horror of having fleas in my bed biting me when I had to sleep- and I'd end up unable to sleep whether or not there were any fleas, because _touch_ would become so intolerably acute for me that I would be feeling individual thread-ends or grains of sand, all over, with the intensity of flea bites or crawling insects (normally a much more intrusive input!)
    I've seen many reports that sight and sound etc. can be this intrusive as well, so I wanted to try and clarify it a little :) 'open content', it's time for me to share some of my reality for the betterment of the community. After all, a _lot_ of Asperger people (or other stuff like ADD, see below) are hackers...
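    To make the 'nice' analogy concrete, here's a minimal C sketch (my own illustration, using the standard setpriority() call; real nice values only run from -20, highest priority, to 19, lowest -- nothing like -256):

        #include <stdio.h>
        #include <sys/resource.h>

        int main(void) {
            /* deprioritize ourselves: any user may raise the nice value */
            if (setpriority(PRIO_PROCESS, 0, 19) != 0)
                perror("setpriority(+19)");

            /* demand top priority: negative values are root-only,
               so this fails for ordinary users */
            if (setpriority(PRIO_PROCESS, 0, -20) != 0)
                perror("setpriority(-20)");
            return 0;
        }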
  • Posted by LuneKat:

    I can respond to several substantially different IM conversations and write fiction at the same time... does that count?

    ps I'm working on a SF story right now that involves that multitasking ability. In the story, it is an increased memory storage capacity that allows multithreaded activities.
  • Posted by Open Matrix:

    I have to agree with you on this one. I don't have ADD, but my sister and father both have it. I've seen my sister sit for hours doing her homework, and when you go over to see what kind of progress she is making, you see she hasn't done one thing because her mind has been wandering to God knows where, and she doesn't realize hardly any time has gone by. I've also seen my mom yelling at the top of her voice at my sister, and my sister basically paying no attention because she is so caught up in what she is doing; when you finally get action out of her by disciplining her, it's just enough to keep mom from disciplining her any more, but even through all this yelling she is still focused on whatever she's fantasizing about. I do see how this could have its advantages, but it sure makes the person seem VERY selfish and uncaring. I think this is why it's considered a disorder. I also think it could be very dangerous in situations (i.e. driving) where you really need to focus on something that is boring.
  • Posted by Lord Kano-The Gangster Of Love:

    The US government has known how to program people since at least the 1960s. The perfect example is in the military. During WWII, some reports say, as many as 80-90% of our servicemen did not fire their weapons in a combat situation. In order to overcome this, the military did studies to find out why. They came to the conclusion that there was a resistance to killing other people hard-wired into the human mind. What they did was train men to act automatically instead of thinking about it. The act of firing a rifle at an enemy soldier's head can't be delayed by thought. To this day the military forces of countries around the world are trained using the information collected by that research.

    What can be used to overcome the resistance to kill is a trained response to a stimulus. You see the outline of an enemy soldier, you fire at it. Men are trained for weeks that when you see a foreign shape, you shoot it. No thought, no contemplation of right or wrong, just instant action. This has been done for over 30 years.

    Granted this isn't low level programming, but if the point of programming is to determine behavior this is certainly a valid concern.

    Do you know what MK Ultra was/is?

    World governments have been interested in programming the human brain for a long time.

    LK
  • by Paul Crowley ( 837 ) on Saturday May 15, 1999 @09:38AM (#1890783) Homepage Journal
    The philosopher Daniel C. Dennett argues that many of the peculiar features of human consciousness arise because we are parallel machines trying to emulate serial ones when processing language. I guess the reason other animals don't need this part of the brain is because they only use the directly parallel solutions?
    --
  • Can you provide some references for this?
    --
  • And how much cache?

    As someone else already said: about 7 objects. (Remember, the brain is *not* a simple computer!)

    And I suppose there were always times I wished my own brain had more RAM. I think I've got about 2 meg myself.

    I think you've a lot more than that. Just imagine going the way from your home to your school. To *any* school you ever went to. Sure, there will be parts of the way that you can't remember, but most of it is there, isn't it?

    Remember the first car crash that you ever saw? Right, that is a couple of seconds of high res video!

    Just add all those things up and you'll find there is quite a lot of memory, but in some parts, no easy way to access them.

    On the other hand, do you remember what were the colors of the cars in the crash? You probably don't. So the event wasn't saved as straight video, but more as tokens and events. When you then think of the "super-event" (the crash) your brain renders the corresponding pictures in real time. So, probably not so much memory, but also a very fast processing unit...

    Disclaimer: I'm just some hacker that thought a little about his brain... ;-)

  • "BTW, I have ADD"

    Same here.

    "(Why does "I've ADD" seem wrong?"

    Because "I've" is short for "I have(verb)", not "I possess".

    "Does anyone really understand why we talk the way we do?),"

    To one extent or another. Grammar is reasonably well explored; we have formal models of it, none of which are perfect but most of which are useful.

    "and have what is supposed to be an exception...I read really well and fast and a lot, which is the opposite of the normal of ADD. Does anyone else who 'has ADD' do this?"

    1000 WPM. Yes. I LOVE reading. ADD is (to say the least) not well understood.

    "I'm actually pretty glad, overall, I have this 'disease'. Deep hack mode is easy"

    My problem isn't deep hack mode -- it's homework. I can _easily_ spend all weekend not doing a stitch of homework, without ever leaving my book. Time passes randomly, and before I know it -- the weekend's gone. I wasn't doing anything fun, I wasn't doing anything useful; I was just staring into space. I just spent today doing that.

    I like deep hack mode, but I so rarely get a chance to drop into it. I hope getting out of school removes most of this problem... I enjoy my job SO much more than my school, even though it's still a problem.

    -Billy
  • When a computer can beat me at chess, go, pente, poker, and battleship simultaneously while discussing politics... I might get worried. The human brain is not as fast, but remains the king of multipurpose processors.
  • 20/20 memory? That might not be as good as you think.

    I'd ask you to think of something you'd forgotten and were glad to have forgotten, but that'd be kind of pointless. However, we all have stuff like that. Things which we're better off not really remembering.

    While we're at it, I'd add that 20/20 memory would mean memorizing much more sensory input than our brains are equipped to handle normally. The result: probably something on the order of madness, paralysis, or even death the instant you tried to remember any event.

    The way I'd like to see something like this run would be a "memory unit." Kind of like a Zip drive for the brain. Mmmm... downloading class notes just before the final...

    This does scare me a bit, however. After all, if the brain is a computer, that means it can be programmed. And if it can be programmed, you know a virus can be written. Or worse, Billy will decide to port Windows to the brain.
  • I don't know if this is the same as ADD but it seems pretty close.

    From someone who was also diagnosed as "hyperactive", the answer is that you didn't have what they call ADD (why in all of heaven and earth they seem to feel the need to label it a "disorder" is beyond me...); you had ADHD. And I really hate the "drug 'em" mentality that the medical community seems to have with this- it's more of an emergent personality trait than it is a "disorder".
  • The standard retort is "there's no such thing as Attention Deficit 'Disorder' -- the real problem is that most people have BTS: Boredom Tolerance Syndrome".

    ;-)


  • I was also diagnosed with Attention Deficit Disorder, when I was screwing up in high school. I think the diagnosis was correct, since my problems mostly came from being distracted by the other students' games and the general craziness going on around me.

    Since then I've followed it a bit, and I learned at least one positive and interesting thing: ADD kids often grow up to be excellent, successful business owners. That doesn't sound like a curse to me. Maybe it's not a disease but natural selection, a type of specialization.

    It's not what you're dealt but what you do with it.

    I would definitely be curious to see if that part of the brain reacts differently in the bulk of ADD people.


    Matthew Newhall

    Yes! I'm in heaven!
    This is nice.

  • >Sorry, but you've completely mis-remembered George
    >Miller's research. He was researching memory span,
    >not simultaneous tasking -- the two are quite
    >different. Miller found that after a single
    >presentation people could remember, on average,
    >seven separate items, be they numbers, words, etc.

    That wouldn't be something new; I learned that ten years ago in school :-)

    The brain uses tokens for its short- and mid-term memories. Most humans have between four and eight tokens; most have six.
    Therefore human consciousness basically comes down to the number of tokens and their individual level of abstraction - sometimes people need two tokens for one thing where others only need one.
  • Yes, you can remember everything from your long term memory. It seems to use some sort of dynamic compression, the information being compressed when you sleep (that's why sleeping 'clears' our mind). When using some kinds of hypnosis, you can 'recall' all that info.

    You can reconstruct it, but the details are unlikely to be correct. It's lossy compression.

  • Whoa.. Scanners moment...
  • I don't think that's a very good example of human multitasking. My feet know how to walk, my mouth knows how to chew gum; er, whatever portion of the brain (medulla?) handles motor functions knows how. Still, there is little conscious thought involved in either action. I can even do a reasonable job of avoiding obstacles while walking and chewing gum, meanwhile being focused completely on some other task.

    Multitasking, btw, doesn't necessarily mean doing two things simultaneously, at least in the strict sense. A single processor computer doesn't run tasks simultaneously, it just swaps them in and out to make sure each gets some cpu time.

    Just think of talking on the phone and coding at the same time. It's not hard to do. You are liable to miss a sentence or two of the conversation, but that was the time during which coding was occupying your attention (missed interrupt?). Switch back to the conversation, and you can pick it up almost immediately (but during that time, no coding is getting done). To actually write code and carry on a conversation simultaneously without either suffering is not something I can do.
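
    To illustrate (a toy sketch of my own, not anyone's actual scheduler): a single "CPU" handing out alternating quanta, so neither task ever runs at the same time as the other:

        #include <stdio.h>

        /* time-slicing in miniature: one processor, two tasks,
           swapped in and out rather than run simultaneously */
        static void code_task(void)  { printf("coding...\n"); }   /* the conversation misses a sentence here */
        static void phone_task(void) { printf("talking...\n"); }  /* no coding gets done here */

        int main(void) {
            for (int slice = 0; slice < 4; slice++) {
                code_task();
                phone_task();
            }
            return 0;
        }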
  • I know Kung Fu!
    That sounds reasonable. I drew a connection between first-level memory storage and simultaneous tasking (just as one could draw a connection between load averages and a processor with more cache). Muscle memory would tend to be a "hard-wired" register.

    This having been said, is there any study showing how many tasks one can simultaneously perform? I posit that you can perform all you like, but what will happen if you forget about them? (Let's not get into long-term storage like writing stuff down.) I'd say that as mental efficiency approaches its peak, the limiting factor will still be 7 plus or minus 2.
  • I'm not saying the *brain* is a NAND gate. The neurons are electrochemical NAND gates inasmuch as they are all alike -- and yet they perform different purposes (at different times in different combinations) -- all neurons are the same composition (different shapes, sometimes, and glial cells don't count), so how can the same gate be a different gate (fundamentally, in structure)? It's a classical result in Boolean logic that any gate can be emulated by a combination of NAND gates (NAND is "functionally complete"; see the sketch at the end of this comment), and hitherto most neural-net programming has been using NAND gates to improve consistency and reliability. If one NAND gate goes down (and let's face it, age, beer, and head trauma take down much more than one) -- another NAND gate can fulfill its role, whereas if we were using OR, XOR, AND, NOT gates then we'd have spares only from that pool of specific gates.


    This having been said, this is how the *neurons* work, and because different *combinations* of NAND gates can yield other gates, then it's quite possible that you might have a lot of OR gates (or whatever), whereas I might have a lot of AND gates, and somehow that might make you think visually and myself aurally...

    As for using 100% of our resources; that's really funny. In parallel tasking, the name of the game is redundancy, and the human brain is the perfect example of asymmetric processing. Maxing out a Beowulf cluster (bearing in mind this is *symmetric*, mind you) is great for throughput but terrible for reliability -- what happens when one processor is gone? You start thrashing to disk like mad. This problem is slightly eased in an asymmetric processing cluster like the brain -- some nodes are only there for backup purposes, or to create the ambient internal interference necessary for NAND gates to become another type of gate.
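
    For the curious, a minimal sketch of that functional-completeness claim in C (my own illustration; the constructions are the textbook ones):

        #include <assert.h>
        #include <stdio.h>

        /* every other Boolean gate built from NAND alone */
        static int nand(int a, int b) { return !(a && b); }

        static int not_(int a)        { return nand(a, a); }
        static int and_(int a, int b) { return nand(nand(a, b), nand(a, b)); }
        static int or_(int a, int b)  { return nand(nand(a, a), nand(b, b)); }
        static int xor_(int a, int b) { int n = nand(a, b);
                                        return nand(nand(a, n), nand(b, n)); }

        int main(void) {
            for (int a = 0; a <= 1; a++)
                for (int b = 0; b <= 1; b++) {
                    assert(not_(a)    == !a);
                    assert(and_(a, b) == (a && b));
                    assert(or_(a, b)  == (a || b));
                    assert(xor_(a, b) == (a ^ b));
                }
            puts("all gates check out");
            return 0;
        }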
  • Rudy Rucker once calculated that even IF the neuron were a purely digital phenomenon (on/off), then we'd have 2^(3 billion) states of mind, which comes out to somewhere around a gigaplex. Of course, some of these are physiological and undesirable, e.g., racing heart and no breathing, and there are bound to be more because the brain works on interference patterns as well.

    Memory uses a combination of compression and hashing. We remember generalities (I saw a tree there), and hash it into our description of the scene with a pointer to a description of what a tree is. When we retrieve the memory (I saw a tree there), we also uncompress the description of the details (trees are green and brown) and uncompress any subdescription (pine trees are pine green, and triangular), leading (after many cycles) to the conclusion memory (I saw a pine green and brown, triangular tree there).
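
    A toy data-structure sketch of that "tokens plus pointers" picture (purely my own illustration): the scene stores a reference to a shared concept rather than raw pixels, and recall walks the chain and re-renders the detail:

        #include <stdio.h>

        struct concept {
            const char *name;
            const char *detail;            /* "green and brown", etc. */
            const struct concept *parent;  /* more general concept */
        };

        static const struct concept tree = { "tree", "green and brown", NULL };
        static const struct concept pine = { "pine tree", "pine green, triangular", &tree };

        static void recall(const struct concept *c) {
            for (; c != NULL; c = c->parent)   /* uncompress each sub-description */
                printf("%s: %s\n", c->name, c->detail);
        }

        int main(void) {
            recall(&pine);  /* reconstructs "I saw a pine green and brown, triangular tree there" */
            return 0;
        }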

  • Basically the gist of your post was "the human brain has more concurrency than control, but there has to be control somewhere."

    the point being: we *know* there's a lot of concurrency in this massively parallel computer called the brain, but where's the control? A computer of *any* kind without control is runaway, and that makes no sense considering we have survived without too many brain explosions over the years.

  • At the risk of touching an inflammatory topic, the answer is NO, because the key here is a voluntary belief mechanism. You choose to believe something because the argument given turns the "credible" flag on. Certain "software" -- i.e., different styles of logical/illogical methodology -- will change the sensitivity of that flag.

    By "brain explosion" i mean drooling, twitching, spastic lack of cohesion, followed by overheating and some leakage.

    Which can be mistaken for religion, except it's involuntary ;)
  • This was already estimated by biological psychologists as between 5 and 9, with the average falling at (you guessed it) seven. Hence the famous psychological paper entitled "The Magical Number Seven, Plus or Minus Two."

    Of course this relates to conscious tasks. Presumably the brain is conducting hundreds of tasks relating to hormonal and electrochemical feedback from the endocrine and nervous systems respectively. Think of it like this:

    Kernel-land human brain can take a load average of hundreds. User-land human brain can handle a load average of 5-9 without page faulting.
  • AFAIK, and according to a fairly recent Discover Magazine article, the human brain DOES have math coprocessors -- multiple ones, in fact. (Again, according to the article...) the human brain has a set of cells which function roughly as a number line, with a span of integer values from -5 to 5. This is inborn. An area of the brain handles basic concepts such as "more" and "less." Also inborn. An area of the brain handles sophisticated mathematical tasks which are worked through (by which I mean subtraction, addition, etc. -- this is sophisticated for the NAND gates which are neurons). Another area handles mathematical concepts which are memorized (e.g., multiplication tables). An example operation of multiplying 144 by 144 would call upon the memorized-portion coprocessor to realize that 144 is 12*12, and then the work-it-out portion would calculate 12^4 (worked out in the sketch below).

    Then there's the whole part of the brain that thinks in geometric terms. Does it do a graphical representation of algebraic equations? I'm not qualified to answer. For more, read Rudy Rucker, "Mind Tools" and judge for yourself. The fact that I passed applied linear algebra by representing concepts such as orthogonal projections in a geometric view convinces at least me that geometric thinking is math coprocessing at its most fine-grained.

    Damage to one portion of the brain, but not others, can lead to quirks in solving mathematical problems... very odd ones.
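
    For what it's worth, the arithmetic in that example checks out (144 * 144 = 12^4 = 20736); a trivial sketch of the decomposition:

        #include <stdio.h>

        int main(void) {
            int twelve_sq = 12 * 12;               /* the "memorized" fact: 144 = 12 * 12 */
            printf("%d\n", twelve_sq * twelve_sq); /* the "worked-out" step: prints 20736 */
            return 0;
        }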

  • this is what hack mode is :-)

    I find it quite useful sometimes -- tradeoff being that I have a very difficult time concentrating on boring things, like this history final I have to take.

    Slashdot is far too fascinating!

    Lea
  • kill -9 is a command on Unix systems that kills a process regardless of its current state: it sends signal 9 (SIGKILL), which cannot be caught or ignored. (There's a tiny C equivalent at the end of this comment.)

    -- Give him Head? Be a Beacon?
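
    For completeness, the kill -9 above expressed from C (a minimal sketch; the pid here is hypothetical):

        #include <signal.h>
        #include <sys/types.h>

        int main(void) {
            pid_t pid = 12345;  /* hypothetical target process */
            /* signal 9 (SIGKILL): cannot be caught or ignored */
            return kill(pid, SIGKILL) == 0 ? 0 : 1;
        }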

  • heh heheh heh heh, Hey Beavis, man pages RULE! heh heheh

    :P (Good Job, bro. Use the man page for your reply. Slick!)

    -- Give him Head? Be a Beacon?

  • > switching speeds are measured in milliseconds and no caches;

    It's not THAT bad. Switching speed is a good deal faster than that in some nerves like motor nerves, but latency is of course a factor -- muscles can only pull so fast. Your short term memory can be considered a cache.

    Explain the difference between data and concept? I can change the concept of polymorphism and inheritance with a sufficiently advanced OOP language that implements the meta-object protocol.
  • I'll second this post. I can pull random memories out of thin air, or remember a person I met once years ago, and tell you what they were wearing, what I was thinking and how the weather was. Granted, it's not perfect, or even as perfect as PsychoSpunk above claims. I do tend to float from one thought to another fairly randomly, too, so it is easy to get distracted. And now and then, my memory is absolutely horrible. I can walk between rooms, and end up doing something else, then forget what I was doing in the first place.
  • If we were to have a memory like so many of you are describing, it would almost be impossible to function normally. You would remember many many useless things.... Your brain would be filled with unimportant information like this, and it would be very cluttered. I believe there have been some cases where people did have this type of memory, and they were unable to function normally, because stupid little details like these would interfere with their normal train of thought.

    To recall these memories - especially old ones, you need some sort of cue, which brings it out, and puts it into conscious thinking.

    Don't the ideas expressed in these statements conflict? The first statement that `if you remembered everything, then everything would be in conscious memory, all of the time' implies strongly that every memory is in the foreground, while the second statement is the opposite: memories lie dormant until something either moves or duplicates them into the foreground.

  • ADD people also have 'hyper-focus' where they only pay attention to one thing.

    So, actually, yes, it's probably something 'wrong' in this part of the brain. Or at least different.

    BTW, I have ADD (Why does "I've ADD" seem wrong? Does anyone really understand why we talk the way we do?), and have what is supposed to be an exception...I read really well and fast and a lot, which is the opposite of what's normal for ADD. Does anyone else who 'has ADD' do this?

    I'm actually pretty glad, overall, I have this 'disease'. Deep hack mode is easy, and, luckily, I don't have ADHD, or at least not anymore, so I'm fairly calm physically. :)

  • That agrees with what I remember.

    I recall some Canadian guy at a seminar explaining that a UI should have no more than 5 options at one time, as this was the lowest common denominator (7 +/- 2) for recall. He had some really interesting comments (read: abuse) about the Win interface.
  • That explains all those bad sectors I have to remap every Monday morning after a long weekend of alcohol consumption!

    :-)
  • Well, I'm apparently an exception. When I took my intro to psych class a semester back, I was bothered by the many anomalies between what my prof said and what I experience.

    But nonetheless, we must remember that psych is a young "science." It is based on impressions and guesses which are hard to prove. (As you may see, I'm a firm believer that what the psychologists learned a century ago is distorted by their world views from a century ago, and that their findings should be regarded accordingly.) I don't recall every small detail about every day. But my friends will vouch that it is disturbing when they ask me to recall facts about a certain date (I can do this for up to 5-6 years back) and I provide them with details. The problem is that someone usually corroborates my version, and it's unsettling that it's never the same person who does. In other words, I'm holding a capacity of 5 years of near-perfect detail in my head whereas they hold maybe a 5th of that.

    Like I said, that "science" has yet to discover all the intricacies of the mind. My gf is bothered that I do this date recall thing, but I never forget anniversaries. There is the argument that I hold too much useless info, but then again, I argue that I have a larger storage capacity. We can hold these things and function normally, it just requires to expand the amount of brain that you use.

  • Thank you for providing a second post. I'm certain that there are many people out there that don't follow the psychologists' model for brain behavior. It's probably on the order of unique fingerprints, considering the complexity of human development and evolution. For some, it is just simply necessary for this style of recall, although being afflicted with the "photographic memory" as it is commonly called, is not entirely a pleasant life. I'm not quite sure why those of us who do possess this mental capability have it as part of our genetic makeup. It is not a skill I find particularly useful or unuseful, it is just the way things are.

    Plus, it is probably something that many people possess yet some have failed to learn to utilize it. Like I said, I was bothered by the "hard facts" of this "science" of psychology when I took my intro course. I do not claim to be an expert in this field, just one who has been disaffected by it. I do my work in CS and find my abilities useful.

    Anyhow, there is one other link to all of this. I'm curious about those who possess this ability and the breakdown of genetics. I have inferred that it is a recessive ability, and I am curious whether those who possess it find these attributes in their parents. My own parents are impressive in their own mental ability, but my mother stated last Christmas, while cleaning the mess of opening presents, that she liked it when she was smarter than I was. This leads to my major confusion: if this is a recessive trait, then carriers would not necessarily be able to use it; however, if it is dominant, it would figure that there would be a lot fewer stupid people in the world. And everyone who had the half/half combination would be able to utilize this trait.

    Our genetic culling process has tended to make useful things dominant. Yet certain things which are not useful or unuseful seem to be marked with randomness. (Why are brown eyes "better" (dominant) over blue eyes? And furthermore, does the iris color affect the way I see things vs the way a person with a different eye color sees? These seem to be keys to psychology rather than dream interpretations (simply just random images with a subconscious doing its best to make sense with a storyline).)

    Oh well, I've gone on and on about this and yet seem not to really have said anything overly important. Just wanted to state that this "science" still has a far way to go before it accurately catalogs humanity.


  • Consider trying to write an 8-part jazz piece with all parts going in all directions.

    Notwithstanding the fact that I'm a lousy f*cking singer :), I think that's relevant. You can do two things at once, and sort of focus more on one than the other, but still be keeping an eye on both. Likewise singing along with the radio while driving in traffic. The driving part itself involves watching other cars, mirrors, shifting, steering, turning up the radio, etc . . . None of this is entirely on "automatic", and you're not really "focusing" on any one to the exclusion of others. Actually, I guess practice is crucial to all of this, and "practice" means, essentially, not having to think.

    Hmmm . . .

    Never mind.


    "Once a solution is found, a compatibility problem becomes indescribably boring because it has only... practical importance"
  • Well, my brain must not go too far w/o a stack overflow since sometimes I go to the grocery store and in the process of driving there I forget what I came for. I sometimes forget what food I want to order at a restaurant because I concentrate too much on talking with people at my table before the waiter comes to take my order.
  • Or (let me play devil's advocate here) Linux advocacy? I tried Linux and now I can't stop using it, and I'm getting all my friends to try?

    It *is* a virus!
  • by joshv ( 13017 ) on Saturday May 15, 1999 @10:07PM (#1890818)
    Beaverton, OR
    May 15, 1999

    Hot on the heels of a recent announcement that the human brain has rudimentary task switching capabilities, a local Linux advocacy group has created an initiative to port Linux to the human brain.

    Says the founder of the initiative, Niels Bohrman: "We feel this discovery is a real breakthrough for Linux. The human brain has an installed base of over 5 billion. This could blow Windows out of the water."

    The group feels that there are some technical challenges ahead but is positive and upbeat, steadfastly attempting to recruit from the most talented kernel hackers the net has to offer. Says Bohrman, "We feel that if we can get the right people working on this it will not be a problem. The basic science shows that the task switching abilities are there in the human brain; it should be an easy port."

    When questioned about actual applications for the port, the group was vague and evasive. "Don't ask a geek why," said one member defensively. Another member, who asked not to have his name revealed, expressed an unfulfilled wish to download porn directly to his brain.

    The group has no timeframe for the Linux port to the human brain but hopes to have a beta up on the net by the end of the year.

    Attempts to reach Linus, the legendary creator of Linux, for comment on the initiative were unsuccessful.

    Alan Cox, a prominent Linux kernel hacker, responded to news of the initiative with a short email saying: "There might be some problems with virtual memory and a port to the human brain; I'll have a patch out by the end of the week."

    Further information can be obtained at the Human brain Linux port web site www.humanlinux.com
  • A billion years ago, when I was in high school, :) most of the people in my social circle did exactly this. I assume that's part of why we were attracted to one another (apart from a love of computers long before such things were prevalent.)

    I also remember that it was a source of entertainment for others, who did not believe we could do such things. I can't help but notice the parallels with the term "geek" as used in "circus geek," someone whose uncommon attributes the shills are amazed by. We occasionally would "show off" those skills for their amusement.

  • "Branching is a particularly human activity. Other species cannot keep
    a goal in mind and at the same time switch to another task and then
    return to what they were originally doing without skipping a beat."
    I believe the above to be utter BS. I have never heard that claim before, nor seen anything to that effect in animals. And humans are not particularly good at multitasking either; the only things we can do at the same time are things that we have automated (and that means hard-wiring the neuron connections in the brain). Try concentrating on two or more things at the same time... it's almost impossible. The very few individuals who can do that end up as air traffic controllers or in the stock markets (those guys with all those telephones :-)
    TA
  • What you're describing is almost word-by-word the "self-test" you should try before applying to air traffic control school, according to an interview with some air traffic official in a newspaper the other day.
    If you can't do it then just forget about applying. Very few people can do it :-)
    TA
  • I also seem to have a kind of ADD. Read the Jargon File; most hackers seem to have it. I think it's good, too. Sometimes I get in a kind of deep hack mode, though I guess I need more training ;-) . I also read very fast, in that mode. It's interesting: you start reading slowly, then you get 'into' what you are reading and finish it pretty fast.

    And I also have that problem of 'getting lost' in nowhere (I lose the ability to concentrate, it's sometimes a problem but truly useful when you're in a waiting room for 1 hour and you have already read all the magazines there.) I'm 'training' how to change to those states (normal, 'lost', hyperfocus).

    BTW, a good slashdot poll seems to have surfaced here.
  • by Raven667 ( 14867 )
    I wonder if this is related to ADD (Attention Deficit Disorder). People who can't keep focused on one task and multitask habitually might be having seizures in this area. Just a shot in the dark.

  • Not to go all "Neal Stephenson" on ya but couldn't ideas be called a "brain virus"? Any powerful idea, say religion for example 8^) can be called a self replicating organism.
  • I was diagnosed as "Hyperactive" when I was younger and took Ritalin. I don't know if this is the same as ADD but it seems pretty close. The Ritalin helped but from what my mother tells me it left me in a wierd state. She says that one time I was sitting on the bed, staring into space, and just fell over onto my face--I apparently didn't notice.

    I am a habitually multitasking person--more efficient that way, I think--and get into deep hack mode when I am interested in something. If I am not interested I tend not to deal with it--I barely did any homework in high school.

    I did do a lot of reading though, even during class. The most that I have done was when I was deployed with the military in Bosnia. I was at a base camp where there was nothing to do and was able to finish a paperback novel every day for about 2 weeks.

    Rambling right along, when I focus I tend not to pay attention to anything else. I forget to eat, sleep and don't hear when people are shouting my name. I have had to have people shake me to get my attention before. Time passes randomly.

    Did I mention that I have trouble finishing proje
  • I was thinking of something a little more exact and fine grained.

    What is the answer to 143*4535/192+44=x
    .
    .
    .
    Too late.
  • Linking the psychology to the functional area is a big part of brain research. The most important advance that they could make, IMO, is to augment recall. If everyone had 20/20 memory, how fast would the civilization of humanity evolve? That is the "killer app" that I hope for. That, and maybe a math coprocessor.
  • Hehe, it makes you wonder just how much like a computer a brain really is. Are all our new ideas about parallel processing, memory management, data abstraction, multithreading, and multitasking rooted in the processes of our own brain?

    The way you worded that post, it almost sounds like "processes" in the brain "talk" to each other using some kind of standard "protocol"! Coolness. A reverse engineering project like that would almost make the efforts to decode DNA seem boring!
  • While we're at it, I'd add that 20/20 memory would mean memorizing much more sensory input than our brains are equipped to handle normally. The result: probably something on the order of madness, paralysis, or even death the instant you tried to remember any event.

    You can see this in autistic (sp?) people. The sensory data isn't properly processed and it's just like a giant data-flow, and the brain can't cope with that.

  • The long term memory is open to debate but is pretty considerable. Consider how many bytes storing something like a picture takes and multiply that by all the memorable images we have in our minds.

    I don't think that's a good way to look at it when you don't even know how the 'pictures' are stored.

    Wagenaar did a study on long-term memory: his conclusion was that there are a lot of things you forget: even the events that are (were?) important to you.

    I think it's safe to say that (long-term) memory is pretty unreliable. Sure, it works OK, but for example in court eye-witnesses are considered to be good witnesses, but in most cases they are not.

    Memories get distorted, amended, forgotten, adjusted. Even the ones you're so sure about.
  • Sorta like the difference between a DSP and a general-purpose processor: at some point a processor is complex enough and powerful enough that loading and unloading programs into cache and memory is efficient enough, compared to dedicated processors for every function, that an OS scheduler is devised and implemented. As a process sees more use and becomes more of a bottleneck, dedicated hardware (like 3D cards, sound cards, SCSI cards, etc.) develops to offload the tasks from the general-purpose processor, though in a pinch the general-purpose processor can do 3D geometry processing, or sound processing, handling device interrupts and stuff.

    I'd imagine that people getting side tracked may also be a function of discrimination; the brain fails to value the original task as of higher importance than the secondary task, and thus 'forgets' to switch back.


    -AS
  • by Anonymous Shepherd ( 17338 ) on Saturday May 15, 1999 @06:16AM (#1890832) Homepage
    Regardless of whether we decide that the human brain is a computer a la Turing's machine, the human brain and being can be and already is programmable, and is constantly being programmed...

    Look at commercials, advertisements, and marketing as functions of programming a person. Look at movies as ways in which to program a person to act, behave, and believe. Saturday morning cartoons program kids in what is 'right', 'wrong', and 'okay', and our politicians, our news agencies, our TV, they all program us towards what is socially acceptable and unacceptable.

    Ills such as racism, sexism, ageism, etc, can all be traced to unwanted side effects of social programming, unless of course you believe those are fundamentally built into the human psyche.


    -AS
  • Not quite as good as you had in mind in some respects, but it allows you to ``remember'' things you never knew. Check out the Remembrance Agent [mit.edu].

  • We already can and do; it's called brainwashing, torture, etc.... The only way to program a computer is through its inputs, and that's exactly how brainwashing is accomplished on humans: by fiddling around with our input senses.

  • We do. Different people use it for different things, that's all.

  • More accurately, the brain is a mess of differentiated chips. You can walk, chew, and talk at the same time. It's only when you want to focus your conscious attention on more than one thing at a time that it becomes difficult. It's possible to do this, we do it all the time, but it can be very tiring when taken to extremes, and the amount our conscious mind is able to handle is small. Most people have only one OS; the rest of the brain is add-on cards (what about multiple personalities? Can they consciously completely focus on more than one thing at a time, or are they merely swapping OSes on the fly?).

  • I understand what you're saying, but these images, and the other images and ideas they cross reference to, would also be a language that's been programmed into you.

    My dreams tend to be a lot in images, though words also play a part. Maybe it's just my obsessive-compulsive behavior that causes me to repeat everything I write down or say in my head. I definitely think in non-word abstracts though, too, because sometimes I cannot figure out how to express what I mean.

    We have a basic hardware neural network; this can be augmented while we're growing up, and afterwards, by what we sense. Movies, commercials and the like are such small parts of our everyday input that it's not surprising they don't affect us a tremendous amount (usually; some movies do, when those movies call on different programming (flashbacks)).

    Look at how well the mind can be programmed. We don't start out knowing our native language, but by the time we've learned it, we find it almost impossible to think without using it (unless we've learned another language very thoroughly). Look at concepts such as honor. It's not programmed into us, or if it is, not what kind of honor to follow.

  • Okay, at work at 8 am on a Sunday, I don't think I clearly wrote what I was trying to say.

    Most of your memories need cues so that you can remember them; these are older memories, that you think about less often, so they are further down in your subconscious (which is an abstract thing, not a concrete part of your brain which I could point to and say, hey, there's your subconscious).
    Memories that are "floating closer to the surface" are more easily remembered, with fewer cues.

    I don't know about others, but sometimes when I am thinking, I drift off and think about other things, which in turn makes me remember something I read, or saw, or something someone did--anything. But if I remembered every minute detail of everything, my thoughts would become jumbled, and I would never really return to my original train of thought, because just about anything would become a cue, since I remembered everything.

    hmm. That's probably still no clearer... I'm a comp sci major, not psych! heh.
  • by Anonymous Female ( 17974 ) on Saturday May 15, 1999 @12:35PM (#1890840)
    Some interesting things I have learned in Psychology 101:

    If we were to have a memory like so many of you are describing, it would almost be impossible to function normally. You would remember many many useless things. For example: what time you got up yesterday to go to the bathroom, what color socks you wore 3 weeks ago, what you ate for dinner months ago, etc. Your brain would be filled with unimportant information like this, and it would be very cluttered. I believe there have been some cases where people did have this type of memory, and they were unable to function normally, because stupid little details like these would interfere with their normal train of thought. This is why you only remember the pretty important things.
    All of your sensory input is taken in and then sent to your hippocampus, where it will stay for a short period of time. Your brain then decides whether or not to store it in memory. Anything which involves emotion has a much greater chance of being stored permanently, or at least as long as it's important. To recall these memories - especially old ones, you need some sort of cue, which brings it out, and puts it into conscious thinking. Olfaction (the sense of smell) is one of the most powerful cues. (Like smelling cookies baking, and then remembering grandma's house.)
    And you don't usually forget things you'd like to, as someone pointed out - horrible war images. But other things, like car accidents, you may never remember: if you were in the car accident and had any sort of blow to the head, then at that point in time information wasn't being sent to the hippocampus, or if it got there, it couldn't be encoded into memory, because a hard hit to the head disrupts the chemical activity and neural pathways which allow this to happen.

    And the other thing I'd like to point out: the fact that we only use 2% (or whatever %) of our brain is really a myth. We do use all of our brains. There is implicit and explicit memory. Say you got amnesia and didn't remember anything - yet you still remembered how to walk and talk, and perform other simple tasks. This is part of your memory - which I believe is stored in your cortex (I don't quite remember). Most animals have very thin cortices (plural of cortex?). The larger your cortex, the more capacity you have. (The size of your cortex is determined by what kind of environment you were raised in. An experiment done with rats showed that a rat left in a cage by itself while it was young did not develop as thick a cortex as rats put in a cage with other rats, and rat toys for it to play with.)

    Whoa, long post - sorry about that. I just found Psychology 101 to be an amazing class (after being forced to take it as a requirement for an AI course, human-computer interaction), and I would suggest anyone interested in the brain take a few psychology classes.
  • I agree here. Memories are NOT stored as a computer stores them. (except maybe in autistics, but I'm not going to get into that...) Rather, it seems to me that everything is stored in terms of concepts.

    In other words, when I look at a forest scene, I don't remember "green green green brown green green". Rather, I remember "tree tree tree rock". Then each of these concepts has subconcepts like branches, etc., and things relating to the specific type of tree or rock.

    Because of this, all memories are idealized to some extent to match the person's concept. It's rather akin to Platonic Ideas, for anyone who's interested. (I expect that sort of thought is what led Plato to his ideas, not to mention a bunch of other later philosophers.)
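
    A toy way to picture that difference, if you like code (everything below is invented purely to illustrate, in Python):

        # Raw detail vs. concepts-with-subconcepts, as described above.
        raw_scene = ["green", "green", "green", "brown", "green", "green"]

        concept_scene = [
            {"concept": "tree", "parts": ["trunk", "branches"]},
            {"concept": "tree", "parts": ["trunk", "branches"]},
            {"concept": "rock", "parts": []},
        ]
        # Recall would rebuild an idealized scene from concept_scene, which
        # is why remembered detail drifts toward the "ideal" tree or rock.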
  • This is a funny thing, when you try to represent the human brain with NAND gates, because no two people remember things in the same way. For example, when I want to remember a math equation, I picture the page in the book that the equation is on. That doesn't sound computer-like. It's some weird stuff. I still can't imagine what would happen if we actually used that 80% of the brain that we supposedly don't.
    -davek
  • What the heck is the kill-9 department? Is it some kind of inside joke? Sounds very scary to me. Is it like "Plan 9 From Outer Space"?

    -----BEGIN ANNOYING SIG BLOCK-----
    Evan

  • Ahhh, now I see. Thank you very much. I have much to learn.

    -----BEGIN ANNOYING SIG BLOCK-----
    Evan

  • Only if those bunny-suit guys start making appearances in your dreams...

    -----BEGIN ANNOYING SIG BLOCK-----
    Evan

  • I don't think that's a good way to look at it when you don't even know how the 'pictures' are stored.
    Wagenaar did a study on long-term memory: his conclusion was that there are a lot of things you forget, even the events that are (were?) important to you.


    I know that models of how long-term memory works are currently lacking. I believe the current models (the Aplysia and mammalian hippocampus models) predict long-term memories lasting only a few hours, based on the phosphorylation of receptor-gated channels.

    I agree that the mind probably manufactures a lot of the detail we seem to remember (I'm not sure whether this applies to those who have "photographic memory", since I'm not sure what having such a memory entails). I remember reading about psychological studies where people recalled an event from their childhood in great detail despite the fact that it never happened. As I pointed out, using a computer model to explain the brain's function is not necessarily a good thing. I would definitely hesitate to say that the brain has X +/- Y bytes of storage. I was just trying to show the poster that the brain has quite a lot of storage (certainly more than the equivalent of 2 MB).

  • Look at commercials, advertisements, and marketing as functions of programming a person. Look at movies as ways in which to program a person to act, behave, and believe. Saturday morning cartoons program kids in what is 'right', 'wrong', and 'okay', and our politicians, our news agencies, our TV, they all program us towards what is socially acceptable and unacceptable.

    I'm not too sure how much these things really affect us. Sure, they have some influence, but not the influence that you seem to attribute to them. These techniques seem more like nudging the mind in certain directions; they don't work all the time and they can't change the mind significantly.

    However, you have to agree that we don't have the ability to compel people to carry out complex tasks in a way we tell them to. I believe this is the sense the original poster used it in.

    Actually, a lot of this seems to be a semantic problem. We use the word "program" in at least two different senses (the CS sense and the everyday one) without distinguishing between the two. Ills such as racism, sexism, ageism, etc. can all be traced to unwanted side effects of social programming, unless of course you believe those are fundamentally built into the human psyche.

    I'm not sure that these problems are the result of social programming. People seem to have an inborn need to classify the objects in the world around them in order to predict and understand it. We put all objects that resemble each other into a category so that we can use our experience with one object in the category to predict how the others will behave. Although the categories people sort each other into seem to be social (race, class, etc.), the sorting itself seems to happen universally. So I think this is probably a result of our need to classify.

  • Interesting. If the brain does indeed have its own CPU, what speed does it run at? And how much cache?

    The short-term memory of most people holds about 7 items (numbers, names, etc.). The capacity of long-term memory is open to debate but is pretty considerable. Consider how many bytes storing something like a picture takes, and multiply that by all the memorable images we have in our minds.
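
    Putting rough numbers on that (every figure below is an assumption for illustration, not a measurement):

        # Crude long-term capacity estimate along the lines suggested above.
        bytes_per_image = 640 * 480 * 3      # one uncompressed 640x480 image
        memorable_images = 10000             # assumed count of vivid images
        total_bytes = bytes_per_image * memorable_images
        print("%.1f GB" % (total_bytes / 1e9))  # ~9.2 GB, far more than 2 MB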

    Speed-wise, signals travel along neurons at about 30 m/s. People have estimated that the depth of a typical path has to be 100 neurons or less to achieve the reaction time of about half a second to a second that is necessary to carry on a conversation. However, a typical neuron is connected to tens or hundreds of thousands of other neurons, so the brain is massively parallel.
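
    Just to make that arithmetic concrete, a quick Python sketch (only the half-second reaction time comes from above; the ~5 ms per-neuron delay is an assumption I'm adding):

        # Back-of-the-envelope version of the "100 neurons deep" estimate.
        reaction_time_s = 0.5        # time needed to respond in conversation
        per_neuron_delay_s = 0.005   # assumed ~5 ms to traverse one neuron
        max_serial_depth = reaction_time_s / per_neuron_delay_s
        print("max serial depth: %d neurons" % max_serial_depth)  # -> 100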

    All this assumes that the brain functions in a similar fashion to computers which is not necessarily a good assumption to make.

  • Admit it, the human brain is not all that good when thinking purely mathematically. Computers are. (For now, anyway.)

    I disagree entirely. Numerically, computers may be better, but mathematically, no. Computers aren't very good at proving theorems or coming up with new ones (the theorem provers out there really aren't that great, despite a few interesting successes). Mathematics consists mostly of proving theorems and coming up with new ones; that's what math people have been doing for the last 200+ years. Frankly, from my experience with my math profs, I wouldn't trust them to multiply two reasonably large numbers without making a mistake.
  • by scheme ( 19778 ) on Saturday May 15, 1999 @06:08AM (#1890852)
    This does scare me a bit, however. After all, if the brain is a computer, that means it can be programmed. And if it can be programmed, you know a virus can be written. Or worse, Billy will decide to port Windows to the brain.

    Whether the brain is a computer is still a subject of considerable debate. The researchers seem to have located the region of the brain that allows people to jump back and forth between tasks. However, this does not mean that the brain handles this in the same way that computers do.

    Even if the brain were to operate like neural networks (the CS/math versions, not the bio ones) do, the possibility of programming it would be in doubt. Currently no one really understands how a trained neural network operates, or how to program it directly, aside from randomly assigning the connections weights and then training it. And that is for relatively simple neural networks with maybe 5 layers; the brain is considerably more complex. It'll be a while before we fully understand it, let alone consider programming it. We don't even know how long-term memory is created or kept, and the only models around explain memories that last just a few hours.
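
    For a concrete sense of "randomly assign the connections a weight and then train", here is a minimal sketch in Python; the AND dataset, learning rate, and epoch count are all illustrative assumptions, nothing from the article:

        import random

        # One neuron learning AND via the perceptron rule: you never write
        # the logic down, you nudge random weights until behavior emerges.
        data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
        w1 = random.uniform(-1, 1)   # random initial weights
        w2 = random.uniform(-1, 1)
        b = random.uniform(-1, 1)
        lr = 0.1                     # learning rate

        for _ in range(100):         # "training", not programming
            for (x1, x2), target in data:
                out = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
                err = target - out   # nudge the weights toward the target
                w1 += lr * err * x1
                w2 += lr * err * x2
                b += lr * err

        print(w1, w2, b)  # the learned "program": three opaque numbers

    Even in this one-neuron case, the result is a handful of floating-point weights rather than anything you can read a rule off of, which is exactly the obstacle to "programming" such a system directly.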

  • kill -9 in Linux means "shut down this program, regardless of whatever state it is in. Do not wait for it to go down gracefully or normally. Crash it like NT running on a Pentium 90."
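
    (The same thing from Python, if the shell isn't handy; the pid below is made up, and os.kill would raise an error for a process that doesn't exist:)

        import os
        import signal

        pid = 12345                   # hypothetical process id
        os.kill(pid, signal.SIGKILL)  # signal 9: non-catchable, non-ignorable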

    Joelinux
  • see the subject line. the whole "drug him till he's normal" mentality just pisses me off...

    different people are wired differently. it's A Good Thing. why do people feel compelled to change it?

  • It would be interesting if they could find out the average number of simultaneous tasks a human can effectively do. It would be cool to have a more efficient multi-tasking brain :)
  • I have, and I have no idea what you're talking about.

    What I believe he's talking about, in reference to LSD in connection with multithreading, is the fact that the human brain, on a non-trivial dose of LSD (or, to a lesser extent, the tryptamines, i.e. psilocybin and DMT), is an example of multithreading gone out of control.

    Each external stimulus starts a new thread, but for the LSD victim, there is no kill command available. The result is much the same as it would be in a computer that continued to spawn processes with no ability to terminate them. As each process demands its clock cycles, each process runs slower and slower in real time. This is experienced by the LSD victim as 'time slowing down, dude'.

    As the number of processes continues to increase, the victim is able to give consideration to more and more external stimuli, leading to the discovery of concepts never before entertained, such as 'wow, compact discs are so small they fit in your hand', and 'there is air on the moon, dude'.

    Eventually, the brain is dragged to a halt, each of one trillion processes demanding its share of the processing unit's time. No meaningful computation can be performed at this point. The LSD victim's mind interprets this phenomenon as 'white light, dude. I'm seein' God'.

    In the final stage, the ability to kill processes begins to reappear. One by one the threads are shut down, so that by 11 hours after the drug was administered, only two processes are left running.

    1) I've definitely gotta trip again next week
    and 2) I've been thinkin' about movin' to Austin

    "You know I've been here before, and I don't like it anymore........." -Spiritualized

  • Interesting. If the brain does indeed have its own CPU, what speed does it run at?
    And how much cache?
    I've always wondered things like this.
    And I suppose there were always times I wished my own brain had more RAM. I think I've got about 2 meg myself.

    And that's on good days.
  • I find it interesting that no one has mentioned people with eidetic memory (often called photographic). J.R.R. Tolkien is the first person to come to mind. He was (in addition to a fantasy writer) an Old English scholar. What was remarkable is that you could ask a question such as "What was the text in paragraph 4 on page 51?" about any book he'd ever read, and he could answer it. And these were some obscure texts in a language that isn't exactly English. I'd love to have recall like that. Now if they could figure out a way to turn THAT on, I'd be a happy camper.

    Skippy
  • Knew I should have run the spellchecker on that one... :-)
  • There's a case of a famous Russian mnemonist who had seemingly perfect recall of arbitrary data for unlimited periods of time. The psychologist who "discovered" him as a young man and followed him throughout his life documented incredible feats of memory, including remembering long strings of random numbers from tests that he'd given the subject 20 years earlier. Interestingly, having such an incredible memory impaired this man's ability to process things in the abstract: his mind was so overwhelmed by the detail of unforgotten memories that he couldn't effectively summarize or process information in the way that most of us can. If you want to read more about his case (it's fascinating), there's an English translation of the book written by the psychologist, published by MIT Press, called "The Mind of a Mnemonist".

    Some psychologists believe that the process of selectively forgetting is a very important part of being able to see past the minute details of an experience and capture the larger meaning. The process of forgetting is not as simple as losing bits in RAM: how much you remember is based on how the memory was encoded in the first place, what cues caused you to retrieve it, what about a particular event was important to you, and what related things have happened since. But even "unimportant" details that aren't stored very well in the first place deteriorate into vague impressions that still allow you to have some sense of what was there.

    I can recommend another book I'm reading right now, "Searching for Memory" by Daniel Schacter, which gives a very readable description of the latest theories on memory function.
  • KILL(1) UNIX Reference Manual KILL(1)

    NAME
    kill - terminate or signal a process

    SYNOPSIS
    kill [-signal_name] pid ...
    kill [-signal_number] pid ...
    kill [-l]

    DESCRIPTION
    The kill utility sends the TERM signal to the processes specified by the
    pid operand(s).

    Only the super-user may send signals to other users' processes.

    The options are as follows:

    -l List the signal names.

    -signal_name
    A symbolic signal name specifying the signal to be sent instead
    of the default TERM. The -l option displays the signal names.

    -signal_number
    A non-negative decimal integer, specifying the signal to be sent
    instead of the default TERM.

    Some of the more commonly used signals:

    -1 -1 (super-user broadcast to all processes, or user
    broadcast to user's processes)
    0 0 (sh(1) only, signals all members of process group)
    2 INT (interrupt)
    3 QUIT (quit)
    6 ABRT (abort)
    9 KILL (non-catchable, non-ignorable kill)
    14 ALRM (alarm clock)
    15 TERM (software termination signal)

    --

  • Sorry, but you've completely mis-remembered George Miller's research. He was researching memory span, not simultaneous tasking -- the two are quite different. Miller found that after a single presentation people could remember, on average, seven separate items, be they numbers, words, etc.

    People can perform several different cognitive tasks "simultaneously." Until the skill is learned to the point where it is primarily performed via motor memory, however, it's really a serial process of attending to each task -- just like the kernel gives a few slices to each task requesting CPU time.
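
    The kernel analogy is easy to sketch in Python (the task names and slice counts are invented for illustration):

        from collections import deque

        # Attention as a round-robin scheduler: each task gets one serial
        # slice at a time, so the "simultaneity" is an illusion of speed.
        tasks = deque([("talking", 3), ("driving", 5), ("chewing gum", 2)])
        t = 0
        while tasks:
            name, slices_left = tasks.popleft()
            t += 1
            print("slice %d: attending to %s" % (t, name))
            if slices_left > 1:
                tasks.append((name, slices_left - 1))  # re-queue the task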

    (Damn, I *knew* that master's in cognitive psych would pay off some day! :-)
  • First, I didn't say get a life. I DID say get a girlfriend. I said it after reading the responses people gave.

    Anyways...

    All I see is talk of speeds and chewing gum. Deep hack mode is about indirectly distancing yourself from your ego void. So is asking a woman out on a date. Cracking does not count as deep hack mode and multitasking will not get you there.

    LSD is interesting to this story because, as a friend of mine once told me, he was better able to multitask on it. All his college work was positioned in windows of information in front of him as he sat on a beach. So yes, a drug will help you multitask, but being able to manage all that information is about skill, not speed. BSD was written on marijuana, not crack. It's the difference between seeing a bunch of windows in front of your face and seeing the ceiling melt into the floor. One is a useful enhancement. The other, depending on how you look at it, is entertainment.

    Consider trying to write an 8-part jazz piece with all parts going in all directions. It's funny. The whole concept of paying attention excludes the ability to multitask. Paying attention by definition means singling out parts, which makes multitasking impossible, because you're constantly trying to listen to each part and block the others out at the same time.

    The same goes for deep hack mode. You've finally managed to convince the rest of your body not to sneeze, fart, or in any way disturb you while you work.

    P.S.

    Ok, maybe I snapped. All week I have been around the shallowest people you've ever met. They constantly wanted to discuss solutions to Littleton. People who would revere Ben Stein because he knows what tree extract turpentine comes from. Sort of how Bill Gates keeps up with the times by reading every newspaper. Wonder if he's ever heard of Kafka, or if he's ever seen Terry Gilliam's "Brazil". So when I saw silly stuff about speeds, I lost it. Sorry.
  • by James Lanfear ( 34124 ) on Saturday May 15, 1999 @09:11AM (#1890876)
    From the sound of it, this has next to nothing to do with 'walking and chewing gum', which are parallel tasks, using different regions of the cortex at the same time--no task switching involved.

    What this sounds like, OTOH, is the ability to use the *same* part of the brain for multiple tasks. This would be an extraordinary system, having to store a *lot* of information about the current state of the brain, and then be able to retrieve it for switching. (There seems to be some overlap with memory here, too...) In an extreme case, it may actually be holding copies, albeit probably simplified, of whatever networks were being switched. In computing terms, this thing could be not only a task switcher but a swapper as well, actually changing tasks by swapping activation patterns in and out of networks.
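
    In code terms, the switch-plus-swap idea might be sketched like this (Python; the state dicts are stand-ins for "activation patterns", nothing more):

        class Switcher:
            # Save the active task's simplified state, swap in another's.
            def __init__(self):
                self.saved = {}      # copies of suspended tasks' states
                self.active = None
                self.state = {}

            def switch_to(self, task):
                if self.active is not None:
                    self.saved[self.active] = self.state  # swap out
                self.active = task
                self.state = self.saved.pop(task, {})     # swap in, or fresh

        s = Switcher()
        s.switch_to("email")
        s.state["draft"] = "Dear all..."
        s.switch_to("posting")   # the email state is saved, not lost
        s.switch_to("email")
        print(s.state)           # the draft comes back intact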

    Speculating a bit, this system would also be a convenient way around nature's standard approach to multiple tasks, which is to evolve parallel systems. When behavior became sufficiently complex, it didn't make sense to keep parallelizing everything, so the switcher evolved as a 'quick fix' that let organisms multitask new behavior before better-performing dedicated systems could be developed.

    Still speculating, it no doubt takes shortcuts, which could explain why people are so poor at switching. (For one thing, it may not attempt to 'force' the switch. The current task, or a monitor, would have to request that the older one be swapped in--this would not only be necessary for the current task to be completed, but would explain why people get sidetracked: they 'forget' to swap the primary task back in.)

    As for a possible relationship to attention, I doubt it. Attention is a fairly old function, much older than the cortex this region is part of, and isn't really switched, though the switcher may be *used* by the attention function. (IIRC, the thalamus is believed to be the primary 'seat' of attention, which makes it *much* older and considerably more universal.)
  • Of course the human brain is a very powerful computer: just consider its awesome ability to process vision and speech data, amongst other things, which is due to the power of its 100 billion neurons and 100 trillion synaptic connections (cf. Paul Churchland's "The Engine of Reason, the Seat of the Soul", MIT Press). Some even argue that the human brain can compute functions not computable by Turing-complete machines (such as our PCs), although that claim is far from settled.

    But to claim that the human brain is "THE most powerful computer", based on the news that the coordinating center of brain activities has finally been found, is too daring, in my opinion. Please bear in mind that the human brain isn't as powerful a computer as Von Neumann machines at everything: just consider chess!
  • You're right. The human brain's serial computing ability is very restricted, but its parallel computing capabilities are tremendously powerful. A good example of this is the contrast with artificial computing devices, which are limited in processing speech and vision data, as well as in learning!

    I agree with you. But my point is the following: from the localization of the brain's coordinating area, it cannot be inferred that (as was affirmed in the original post) the human brain is the most powerful computing device in the Universe. For some functions it is, and for others it isn't; and it is precisely to compute those functions that the brain can't solve effectively in a useful period of time that computers were developed. That is the problem that Pascal, Leibniz, Babbage, Church, Turing, Von Neumann, Zuse, Atanasoff and many others attacked with their machines (logical or physical), or their blueprints.

    Now, this leads us to another question: will the development of Artificial Neural Networks lead to machines that exceed the human brain in parallel processing, learning, vision, and speech recognition/synthesis? I don't know. But 50 years ago, who would have known that a Von Neumann machine would play chess ABOVE grandmaster status?

"When the going gets tough, the tough get empirical." -- Jon Carroll

Working...