Science

How Much Give Can the Brain Take?

Your Mama writes "Just how malleable is the brain? How easily can a person overcome the forces -- genetic and environmental -- that shape a creature from birth? Over the last few weeks, evidence has emerged that throws these questions into a new light. The NY Times has the article" (The usual "free NYT registration required" notice would be here if we weren't so bored with it.)
  • Ok, I've reread the article. They do talk about the new research indicating that new neurons can form in all parts of the brain. However, this really doesn't seem to be the point of the article. The new research actually accounts for only ~20% of the article. The first half of the first section is a long introduction, and the second section is a review of the current thinking on the importance of the "brain-formative" years of early life. The article reminds me of many a high school paper that was stretched to meet a word count requirement.
  • "Why Michael Couldn't Hit (And Other Tales of the Neurology of Sports)" by Harold L. Klawans, M.D.

    The title is a reference to Michael Jordan, who despite being one of the greatest basketball athletes of all time, utterly failed to be a world-class baseball player. The book provides fascinating evidence that to be a world class athlete, there are windows of opportunity for learning particular skills during early childhood. If the window is missed, that person will never be capable of performing at a world class level.

    Even if you're not interested in sports, I think the material in the book is applicable to almost any human activity. For example, he talks about violinists, and differences in CAT scans of the brains of people who started before the age of 13, and after 13. Fascinating stuff.

    Learn why Babe Ruth was more superhuman than you might think... I love this quote (a footnote): "*None of this applies to Babe Ruth. He used the heaviest bat of any major league slugger, one that was clearly so heavy that its moment of inertia was so great that no one could possibly have used it successfully to hit major league fastballs. The Babe was blissfully ignorant of physics."

  • I guess that makes sense... Loss of empathy or the ability to predict the reactions of others could lead to what they probably mean when they say that.
  • by nido ( 102070 ) <nido56@noSPAm.yahoo.com> on Sunday October 24, 1999 @07:58AM (#1592135) Homepage
    An interesting field I've run across is that of accelerated learning. I've found Dr. Win Wenger's book, The Einstein Factor, an interesting read, covering many of the threads discussed on this news item. For example, in the book Wenger mentions a study of a group of nuns (not sure where they are, I've lent my copy out to someone) who live to age 90+. This group of people has had virtually no incidents of debilitating mental deterioration such as Alzheimer's disease. The premise of the book is that intelligence is not what you have, but what you make of it. The nuns don't get Alzheimer's because they use their brains constantly, keeping diaries and staying active well into their later years. Many interesting comments on how geniuses (Einstein & Tesla, among others) looked at/thought about the world. Check out Win Wenger's Project Renaissance home page at http://www.winwenger.com/ [winwenger.com], or read about what Project Renaissance is [winwenger.com].
  • It's important to remember with information like this that while it has an effect on how we understand the brain, it doesn't change a hundred years of empirical science.

    The truth is that as we age, there are marked decreases in our ability to solve problems requiring fluid intelligence, quick responses to new situations, and incorporation of new information into a problem space.

    That doesn't mean there isn't plenty of anecdotal evidence of those 90-year-old men Heinlein was so fond of, who could knock any four 20-year-olds on their cognitive asses; it just says that by and large, the trend is towards some slippage, especially in the areas of thought requiring flexibility and adaptability.

    If these neurons have been produced all along, then they obviously aren't sufficient to prevent the mind's downfall - maybe they slow it down, but the mind still falters.

    Now let me reverse myself a little... :) I think research like this shows promise, if it can be applied to the idea of increasing the neuronal reproduction, to reverse degenerative mental conditions, or even just age related wear and tear. Let's just not get carried away and say that because of this research, suddenly our old ideas about age and mentality are wrong.
  • The truth is that as we age, there are marked decreases in our ability to solve problems requiring fluid intelligence, quick responses to new situations, and incorporation of new information into a problem space.

    But why? There are two possible explanations. Either it's because (a) our brains are simply incapable of keeping up as they age, or (b) the user of that brain gives up as he/she gets older.

    The evidence seems to weigh in strongly on (b). That is, the older someone gets, the less they expect of themselves, and so the less they do. And because they do less, their brains decay and, like a flabby muscle, become completely unusable.

    Like muscles in your body, I believe your brain can either be as sharp as a tack well into your 90's, or can become dull and flabby before your 40th birthday. It's just a matter of exercising your brain. Most people just choose not to exercise their brain, just as they choose not to exercise their body--and the whole thing becomes a flabby mess.

    I knew a man who was totally out of shape who enrolled in a flow Yoga class in his mid 60's--and 10 years later does better Yoga than people a third his age. You *can* reverse the effects of aging, either physically or mentally, by getting off your ass and doing something about it.

    Too bad most people just allow themselves to decay.
  • If one looks at the article - it mentions THOUSANDS of new cells migrating in the monkey brains. If I recall basic pathology correctly, the human brain has on the order of TRILLIONS of cells. No mention is made of whether any functional improvement or difference was noted, and the article even mentions that no one knows what the new neurons were doing.
    Chimps differ from us humans by approx a 1000 genes out of 100,000 or so, and their brains are maybe a 1/3 of our size in the higher areas. It is reasonable to infer that if this magnitude of repair or growth occurred in humans then not much of an appreciable difference would occur. Much of the (re)learning that occurs in traumatic brain injured patients (people and monkeys) involves new neural CONNECTIONS between cells - not the new growth of cells.
    Other physiologic processes are also occurring in the body which may play a role in the hindrance of developing new neurons. A child can fluently learn multiple languages easily, yet an adult will with difficulty learn one or two, with an accent. An infant's or child's brain is still growing and has not had its developmental processes shut down by the effects of maturation hormones.
    Future research will probably concentrate on whether the adult brain cells are at the end of their lifetime or if they can be induced to be young again and divide and produce new neurons.
  • right and wrong are not subjective, rather whatever society agrees upon. these are known as laws. by living in this country you have implicitly agreed to obey these laws, or to accept punishment for violating them. if you do not like these laws then leave. otherwise do what you will within them
  • Actually what the nun study showed was that they suffered less of the typical effects of senility. They would still be susceptible to Alzheimer's, which is a disease process by which an abnormal protease (a protein that digests or cuts another protein) leaves waste snippets of protein outside the cell, producing the characteristic Alzheimer plaques. This disease is unaffected by how much you "exercise your brain". Amgen and Harvard researchers published the discovery of the Alzheimer protease this week in Science - follow this link for synopsis
  • >> Humans are born with instincts, but anything can be learned whether one has his underlying machinery in order or not.

    Oh, really? Try ingesting a fifth of whiskey daily for a few months and let me know how your hacking goes...
    - freehand
  • How is this Nazi-like? I would love it if my great-grandkids could escape my bad eyes, allergies, and Asperger's Syndrome. If they were smarter that would be terrific. Read Heinlein's "Beyond This Horizon".
    Would you accuse your dentist of being a Nazi? And no, I don't give a damn what color or gender my descendants are.
  • It is very easy to misinterpret empirical evidence. What the evidence doesn't show is why the decrease occurs. Our society is far less respectful of our elders than many others in the world. The job market (especially the computer industry) pushes them out. TV doesn't seem to have shows with anyone over 25 that isn't the parent of a leading character. This kind of attitude doesn't help someone older feel smart or flexible.
    What is really strange to think is that our country is run by people much older than this hip teens-to-twenties age group. If people get older and lose mental faculties, then how can you explain some of the senior citizens in the Senate that still have all their marbles?
    What about the Fed? Alan Greenspan is a genius. He is able to understand a large interconnected world macro economy better than anyone. He has helped create the longest period of prosperity that this nation has seen in quite awhile. He is losing fluid intelligence? Lord, I hope not! I know a couple of people that work in the New York office and they say that Greenspan is as sharp as ever.
    Then comes industry. Forget the new "billionaire before thirty" thing that Bill Gates started. Most CEOs of big successful companies are older people. How can you explain this?
    If you are right then we have a lot to worry about when it comes to voting. As you go up in age group, the percentage of registered voters that actually go vote increases. Older people are a powerful voting bloc that always gets out. Meanwhile the youth of this country whine about needing internet voting to make it easier. If grandpa can walk there on his walker, then we should be out there.
    It comes down to what individuals do. My mother is a 50-something accountant. She has done accounting for years and spent most of the last 15 with the same company using the same software. She has gotten a bit set in her ways. It is no surprise that when her boss left and she assumed some of his duties, including handling investments and some system administration, she was a bit worried. She panicked a few times but now she is getting the hang of it. She learned it faster than a lot of the kids in my CS and Finance classes in college.
    Speaking of college, how come adults who go back to school routinely do better on average than their younger "smarter" counterparts?
    It is not that people lose the ability to learn, it's just that they get out of the habit and get into the habit of doing the same thing every day. All changes create stress, even good things like learning something new. As people get older they learn to avoid stress. This normally healthy reaction can have very unhealthy results as older people start to fear learning new things. People that have not been burned by stress as much do not have as much fear of new things. This is why some older people will not touch a computer, while others run their own irc chat group.
  • How is it that some *younger* people can't use a computer, while others can easily network their home?
  • Check out an old book by Glenn Doman "Teach Your Baby to Read". It's about teaching babies to read. The ideal age for learning to read is 24 months according to this guy...

    Now the really neat thing is that I'm doing it with my 15-month-old son. I print out in big bold caps each new word he learns to say (e.g. CRACKER, MAMA, DOGGY, BOOT etc.) and show it to him, say it to him and put it in context a few times a day. In two weeks of doing this, he can easily identify 8 words presented in random order mixed with words he doesn't know. I only spend 10 or 15 minutes a day doing this. And the ease with which he learns new words is increasing.

    So where am I going with this (certainly my purpose is not just to brag about my son :)? We are constantly robbing our children of their true potential by underestimating their abilities. The same is probably true for adults if this article is correct. For all you who are parents out there, take 15 or 20 minutes a day to teach your children, and you will be astounded!

    Most of us geeks know that humans can learn much more than school presents us with. It's a matter of challenge, interest and love, not ability.

  • by Zoyd ( 13778 ) on Sunday October 24, 1999 @06:48PM (#1592150)
    Like most scientific folk wisdom ("You use only 10 percent of your brain power," "Right-brained people are more creative") the Mozart effect, as it is sometimes called, is extrapolated from research whose meaning is open to debate.

    There is an interesting Skeptical Inquirer article [csicop.org] debunking the 10-percent myth.
  • 1 S4W 4 M0V13 L4$T W33K WH3R3 $C13NT1$T$ F0UND 0UT 4 W4Y 2 R3GE3N3R4T3 BR41N
    C3LLZ. TH31R 3XP3R1M3NT W0RK3D, 4ND TH3Y H4D 4 CUR3 F0R 0LDT1M3RZ D1S3AS3!!1
    JU$T TRY 2 1M3G1NE: S0M3 D4Y, R0N4LD R34G4N M1GHT B3 LUC1D!!!!!!!11

    TH1$ T3CHN0L0GY M1GHT B3 4V41L4BL3 2 3V3RY0N3 PR3TTY S00N, T00. TH3 0NLY C4TCH
    1Z TH4T M0$T 0F TH3 SC13NT1STS G0T 34T3N BY G14NT $H4RK$, S0 1T M1GHT T4K3 4
    WH1L3 2 R3C0N$TRUCT TH31R W0RK. BUMM3R. 3NYW4Y, TH3 N4M3 0F TH3 D0CUM3NT4RY
    1Z ``D33P BLU3 S34'' S0 D0N"T M1$$ 1T 1F U R 1NT3R3$T3D 1N TH1$ $0RT 0F TH1NG.

    l8er d00dz!!!!!!!$%*$%*$!!!!!!!!!!11
    :WQ
    :wq
    ------ ------ ------
    ALL HA1L B1FF, TH3 M05T 31337 D00D!!!!!1
    ------ ------ ------
    ALL HA1L B1FF, TH3 M05T 31337 D00D!!!!!1
  • There's no sense in highlighting the fact that there are a lot of things that we don't know. This makes for a very useless description of the world, and science is supposed to be useful. ("How fast do things fall?" "I don't know." "But I heard about G*m_1*m_2/r^2..." "I don't know that for *sure*, that's just our best guess right now.")

    There is a good (and simple) reason why the dogma became that no neurons were formed in the adult brain. People looked and didn't find any. Of course, all they could say at that point was that it seemed unlikely that more than N neurons were being formed per time T. The simplest hypothesis is that zero are formed. In the absence of other evidence, that wins.

    Were some scientists a bit too confident in the hypothesis? Yes, probably. Were they all shocked? Nope. Not everyone dropped the ball...it just reads that way in the popular press. ("Humans Grow No New Neurons!!!!" "Humans Do Grow New Neurons!!!!!")

    Also, for what it's worth, 1k neurons/day isn't very many. You have between 10^10 and 10^11 neurons in your brain. (They're hard to count; an order of magnitude is about as honest as we can realistically get.) 10^10/10^3 = 10^7 days or about 30,000 years to replace all your brain's neurons. Unfortunately, you tend to die after about 80. Okay, but this at least shows that neural regeneration is *possible*, so we could maybe find a way to increase the rate...we couldn't have done that if adults didn't make any neurons, right? Actually we could have: humans make neurons when they are younger; we'd just have had to do a bit more tweaking of developmental regulation pathways to go from "zero" to "enough" instead of "a few" to "enough".

    The bottom line is that it's cool for neurobiologists, but you still have to take care of the neurons you've got (and hopefully make them connect in computationally useful ways); yeah, you'll get a few more, but not enough to save you from stupidity.

    • --Ichoran
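The back-of-envelope replacement estimate above is easy to check directly. This sketch just runs the comment's own numbers (10^10 neurons, ~10^3 new per day - both assumptions from the comment, not measured values) through the arithmetic:

```python
# Rough replacement-time arithmetic from the comment above.
# Both constants are the comment's assumptions, not measured values.
NEURONS = 10**10      # low end of the 10^10-10^11 estimate
NEW_PER_DAY = 10**3   # ~1k new neurons per day

days = NEURONS / NEW_PER_DAY  # 10^7 days
years = days / 365

print(f"{days:.0e} days is about {years:,.0f} years to replace every neuron")
```

Even taking the high-end 10^11 count only changes the answer by a factor of ten, so the comment's point stands: regeneration at this rate can't come close to refreshing the brain within a lifetime.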
  • right and wrong are not subjective, rather whatever society agrees upon. these are known as laws.

    Oh, it's that simple. Silly me. But, I happen to dislike some of the laws in my country, I my point of view they wrong, yet I obey them.

    Apparently, what you call right and wrong, seems to be what I call legal and illegal.
  • Sorry, a typo. Here is what I meant to say:
    right and wrong are not subjective, rather whatever society agrees upon. these are known as laws.

    Oh, it's that simple. Silly me. But, I happen to dislike some of the laws in my country, In my point of view they are wrong, yet I obey them.

    Apparently, what you call right and wrong, seems to be what I call legal and illegal.
  • yep.

    thanks for the correction... "o" and "p" are a little closer together than the limit of my optical resolution at the moment.

    Somebody did a recent follow-up. It was a comparative study of the behavior of two sets of twins raised in 'similar' (whatever that means) social environments; one member of each pair had suffered damage very similar to that which our friend Phineas experienced. (They preserved his skull; later, a team of Harvard surgeons reconstructed the probable damage - seen in a SciAm article - in order to identify the brain structures responsible for constructing and preserving our moral attitudes. Said skull still sits on display at Harvard, above the entrance to their neurobio museum. Creepy.)

    The conclusion reached by the study was that the role of the brain areas in question in regulating moral behavior was unquestionable. The mechanism is still completely unknown and knowledge of the structures involved remains imprecise (the specific structures damaged in each of the recent cases were understandably different, with some structures destroyed in both cases, and others in one or the other, but not both).

    Of course, both the twins and Gage likely lost a bit more of themselves than the ability to determine and/or care about right from wrong, so we aren't likely to be able to isolate only those structures responsible for the creation of moral behavior for quite some time, if ever, but it seems interesting to me regardless.

    Maybe someone else more capable of coherent thought and writing (read: anyone who sees fewer than three keyboards in front of them) who has read the articles could respond?

    please?

  • How is it that some older people can't comprehend an ATM, while others can easily use a computer? I don't think it is related to how smart you are, but maybe it is a function of how many brain cells are being created or how their bodies manage to incorporate them into their existing brain.

    The geezers I've met that are bright have always been smart people. Not in the way that you're born smart, but in the way that they've never stopped learning. They read decent books, not absorb gamma rays all day. They have always had an interest in technology, not surprised by these computer gizmos CNN keeps talking about. They developed critical thinking skills and use them, not your typical confused the-way-things-were geezer.

    I've read a little research on Alzheimer's about the same thing: if you aren't using it, you're losing it.

  • ...then why do older people seem to learn much slower and often forget things? Is it even possible to really ever forget something?

    1) Rate of neuronal death: increases with time

    2) Rate of neurogenesis: decreases with time, initially very small when compared to (1)

    As soon as you lose the neural structure that encodes the memories you have recorded - however THAT works (*probably* strength of synaptic connections) - you lose the memories.
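The two rates described above can be sketched as a toy simulation. Every constant here is invented purely for illustration; only the qualitative shape matters - neurogenesis starts small relative to death and tapers off, while the death rate climbs, so the net count declines throughout life even though new neurons keep appearing:

```python
# Toy model of the two rates above. All constants are invented,
# purely illustrative - only the qualitative trend is the point.
def neurons_remaining(years: int,
                      start: float = 1e10,   # starting neuron count
                      deaths0: float = 1e5,  # deaths/year at year 0
                      births0: float = 5e4   # births/year at year 0 (small vs deaths)
                      ) -> float:
    n = start
    for t in range(years):
        deaths = deaths0 * (1 + 0.02 * t)          # death rate rises with age
        births = births0 * max(0.0, 1 - 0.02 * t)  # neurogenesis tapers to zero
        n += births - deaths
    return n

# Net count falls with age even though new neurons appear every year.
print(neurons_remaining(80) < neurons_remaining(20))
```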

  • by Wooly-Mammoth ( 105587 ) on Saturday October 23, 1999 @11:17PM (#1592158)
    NPR Science Friday had a "Brain Update" show which examines various issues in detail, with Charles Gross on the program (the one who conducted the monkey expt.) Worth listening to, since so many are complaining about lack of info. Here's the link [npr.org]

    http://search.npr.org/cf/cmn/cmnpd01fm.cfm?PrgDate=10/22/1999&PrgID=5

    Needs Real Audio, btw.

    Wooly Mammoth.
  • Given that people thought for a long time that the adult brain was pretty much unmalleable, going from there and saying that people now aren't so sure seems to be pretty informative and interesting.

  • The article mentions some people who have "lost the ability to tell right from wrong" through neurological damage. As right and wrong are subjective classifications, I wonder what exactly they mean by that.


    Perhaps a more objective way of stating this would be that these people are unable to grasp societal norms, or even if they did, do not see the need to comply with them.
  • Yes, it seems that as we get older, we lose our sense of play and curiosity. While I think simple aging does play a very important role, I think a large part of this is due to societal pressure: as we become adults, we're more and more made to conform, to act 'responsibly', to behave in a certain manner. Adults are not supposed to stop and pick up a strange-looking object on the street just because it looks fascinating. There's a label for people who continue to do so: geek or nerd... Totally 'uncool'. Hormones kick in: finding and impressing a mate is more important than learning something new. Learning takes the back seat to the pressures of everyday life.

  • Thanks for the clarification... I wish the moderators weren't so trigger-happy. :/

    --

  • Arrgh! Yes, to some extent right and wrong are subjective - is it wrong to wear yellow? Is it right to like the spice girls? Is intellectual property theft? These small, human-invented things that differ from culture to culture are indeed subjective, and to a small extent you could call them right-vs-wrong.

    But... there are a lot of things that are universal - across all cultures. No society would think it's a good thing to kill your folks. That society would disappear after just a few generations. Many cross-cultural moral values are reflected in language, for example; every language has a word for "murder" - defined as wrongful killing. Murder could be considered universally bad. Now, a death could be taken as murder by one culture, and justifiable killing by another, so the standards for murder don't hold, but the idea of murder does.

    Subjectivism irks me. It makes me want to shake the subjectivist vigorously and yell "Grow up!" Thank God my compatriots and I define this as acceptable... ;)

  • The article mentions some people who have "lost the ability to tell right from wrong" through neurological damage. As right and wrong are subjective classifications, I wonder what exactly they mean by that. Could they mean losing the ability to empathize with others?

    There's a longer and very interesting article on Dr Damasio's research in this week's edition of the Economist [economist.com].

    But unfortunately it's not one of the ones they're giving away free on the website, so here are a few paragraphs cut down from the full article. If any of the Economist science writers are out there (I know you sometimes read Slashdot), thanks for a fascinating piece!

    "The two patients, now in their early 20s, had survived similar injuries to their pre-frontal cortexes during infancy... Both had lived lonely maladjusted lives with no plans for the future and unfortunate personal habits such as compulsive lying, petty theft, poor hygiene, irregular sex lives, and indifference to their resulting children

    "In the set of tests psychologists have dreamt up to quantify moral health, their performance clearly bespoke an ignorance of the conceptual foundations of morality... The patients' responses to these tests were about the same as those that would have been expected from a ten-year-old child: that is, they appeared to be motivated exclusively by a desire to avoid personal punishment [ignoring any wider moral implications]. This degree of pathology is significantly more serious than those found in those who suffer brain damage as adults.

    Also, the two "seemed never to have learnt any of the basic moral rules that cover social interaction, apparently because their early traumas prevented them from ever acquiring this sort of information... to all appearances, they suffer from little or no sense of remorse at their behaviour.

    "Neurobiologists have long known that the brain can compensate for some sorts of injury that are sustained during the course of its development. It does this by recruiting new sets of nerve cells to substitute for those that have been damaged or destroyed. The pre-frontal cortex, however, seems to be unable to repair itself adequately in this way, leaving infant victims with no means of learning right from wrong during the relevant period of their growth.

    "The solution, Dr Damasio postulates, may lie in helping that part of the brain to annex more nerve cells, by carefully adjusting levels of the relevant neurotransmitters..."

    Thus, even if Dr Gould and Dr Gross are right that there are always new neurons migrating to the pre-frontal cortex, it seems they can't always be integrated.

  • by mouseman ( 54425 ) on Sunday October 24, 1999 @01:25AM (#1592166) Homepage
    I don't think it's quite that simple. It's an unfortunate but inescapable fact that our bodies deteriorate with age (as I'm starting to discover... :-). It would be nice if our brains were immune, but they're not. The fact that we make new neurons is encouraging, but we make new cells throughout our bodies, and that doesn't stop the aging process.

    Although it's possible that a simple tweak will completely prevent the slow slide into senility, I doubt it. The brain is a complex beast, and there are just too damn many things that can go wrong with it. Granted, I've met people lucky enough to remain very sharp late in their lives, and the wisdom that comes from experience often more than makes up for reduced mental agility, but the fact remains that the brain does suffer from old age.

    The body is a lot like a Microsoft OS: when you first start it up, it's in a fairly simple, clean state, but the longer it runs, the more random cruft and unanticipated mutations it accumulates, until it becomes incredibly unstable and finally crashes. After all, both are a collection of hacks generated by the same design process: tweak the code and ship if, say, the mean time between failures is twice the expected period of operation. For Windows, the expected period of operation is 8 hours (by then, the average user has shut it down for the night), so it should run, on average, for 16 hours before crashing. For the human body, it's about 25-35 years (by then, like any good hunter-gatherer, you've had your kids, raised them to maturity and been eaten by a saber-toothed tiger), so MTBF should be 50-70 years. With medical technology, that's been pushed back a bit, but we're operating way out of spec, and our bodies come with no warranty, express or implied, regarding fitness for use for any purpose, including, but not limited to, survival, reproduction, or the ability to operate a computer in the wee hours of the morning while retaining sufficient mental capacity to post a reply to /. that doesn't degenerate into a rambling jumble of twisted metaphors, all different.

    any resemblance between the above post and the writings of a sane person are purely coincidental.
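The shipping rule buried in that joke - release once mean time between failures is at least twice the expected period of operation - can be stated literally. The function name and examples below are illustrative only, not a real engineering standard:

```python
# The tongue-in-cheek shipping heuristic from the post above:
# ship when MTBF >= 2x the expected period of operation.
# Name and examples are illustrative, not a real API or standard.
def meets_mtbf_heuristic(mtbf: float, expected_operation: float) -> bool:
    """True when mean time between failures covers twice the expected use."""
    return mtbf >= 2 * expected_operation

print(meets_mtbf_heuristic(16, 8))   # the 8-hour OS example -> True
print(meets_mtbf_heuristic(60, 30))  # the 30-year body example, in years -> True
```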

  • The nuns don't get Alzheimer's because they use their brains constantly, keeping diaries and staying active well into their later years.

    My grandmother regularly read newspapers, and led an active social life. She even practised memorising long sequences of digits to keep her mind crisp. She now has Alzheimer's...

    There could be a lot of other reasons for the phenomenon you describe. For instance, nuns have a lot of habits that differ from those most other people have. Maybe some special kind of diet, and plenty of physical exercise did the trick?!

    On the other hand, maybe mental exercise does prevent Alzheimer's, but only in the statistical sense...
  • But... there are a lot of things that are universal- across all cultures. No society would think it's a good thing to kill your folks. That society would disappear after just a few generations.

    Ever heard of mercy killing?

    IMHO universal right and wrong is just an illusion we get from the abstract concepts the use of language imposes on us. Of course "murder" is wrong, in the abstract sense, but as soon as you deal with an actual experienced situation, the choice of whether it is right or wrong will depend on the actual situation, and previous experiences of the subject deciding between right and wrong.
  • Thanks!

    IMHO this one is much better written than the NYT one. I'll check out the mentioned article in Science (Oct 15 issue) tomorrow...
  • Age has nothing to do with new tricks. Old dogs _can_ be taught new tricks, and old people _can_ learn new things.
    It's a stupid cliche really. The real deal is "You can't teach an old dog STUPID tricks."


    I agree completely. But as you grow older you'll tend to find more things stupid, because you know better--or at least think you do ;)
  • The way I see the human mind, and the way I've often observed my own mind to be, is like that of a very large, very complex computer program.

    You are in good company; some AI folks have believed so for decades. IMHO the computer analogy is a much too simplified model for our minds. It leads to all sorts of weird conclusions, such as the idea that there might be a sequence of words (a program, in your analogy) that would cause the mind to hang. Also, the mind performs many things in parallel; for instance, all sensory input (vision, audio, etc.) is preprocessed in massive parallelism before it reaches conscious experience.
  • cypherpunks/cypherpunks used to work, and the NYT has decided their ability to control hacker readers is more important than their demographics of people actually reading their e-rag. So they appear to be reading Slashdot, and _killing_ accounts mentioned here.
    I read the article, and I didn't sign up for anything. All you have to do is figure out password variations quicker than they can make them invalid. I hit one that has not been made invalid. I'm not going to repeat it here. It can just stay valid, thank you- representing the shared, passed-around newspaper of some clever hacker who could hit on a combination that another hacker could guess, but which would not be tracked and invalidated by whatever lunatic at the Times is responsible for at all costs making sure people _don't_ read the Times without logging in. :P
    Here's a question. If RMS reads the times, with his traditional login/PW of rms/rms, do they refuse to allow him the password he uses for himself, just because he lets other people know it? For a basically trivial information site which one would think they'd _want_ readership for?
  • I thought one of the major problems with kids learning to read was making them memorize words. Phonics, though not perfect, appears to be better in the long run. It makes unfamiliar words more readable, as it is less guessing and more thinking. Spelling is supposed to be easier too.

    I learned to read at 3 yrs old using a phonics based approach. It really helps to read early, since you can learn on your own from then on.
  • Back in the 50s everyone thought nurture was the case, then in the late 70s things switched to nature, and now we are back to the correct answer: nurture.

    The world is not binary, at least not the one outside my window ;)
    The debate is not about if it is nature OR nurture. It's about to which degree they both matter.
  • It is definitely true that phonics is better for young children. However, for babies, memorizing words is the only way possible because usually their ability to differentiate phonemes is very poor and teaching them the abstractions of letter symbols is very difficult.

    There is a sweet spot somewhere in the range of 2 years to 5 years old when phonics is really good. (See the book "Teach Your Child to Read in 60 Days" - I can't remember the author's name - which discusses a purely phonics based method of early reading instruction.)

    However, in the long run, the use of phonics tends to limit the speed at which one can read because of its connection to a linear, vocalized "meaning" of the words and symbols. Speed reading is only possible by returning to a system of visual recognition and processing of text. I haven't done too much research on this, but I expect that after doing a couple of years of phonics, it would be beneficial to start deliberately moving back to a purely visual system.

    Come to think of it, the ability of adults to learn to speed read (to transition from a vocalized form of reading to a visual form) is probably a supporting example of the flexibility of the mind in "later" years. I've tried to learn to speed read myself, with some very limited success (when I'm in practice, I can read at about 500 wpm - my normal speed is about 300 wpm).

    One last comment is to reiterate that I believe the important issue is not so much the specific technique of instruction but that the first five years of a child's life should include much more deliberate, challenging, interesting and loving education - and unfortunately, for most kids that means getting some at all! Maria Montessori wrote way back in the late 19th century that those early years were extremely important and I don't think that these new discoveries in any way invalidate this, rather they just point to the possibility that remedial efforts might have more success than we would otherwise think possible.

  • I just love the MS analogy ;)

    However I think you could take it a bit further. If the mind already has a model that explains a phenomenon sufficiently well, there is a resistance to learn a new model that might be better, instead you just make small modifications on your old one.
    Just like the MS Word app BTW ;)
  • Nice comment, should have higher score IMHO...

    So reluctance to learning might be a cultural trait? Probably, but I think this is a secondary effect that merely reinforces the trend. The main cause I believe is that (as you mentioned) the need to learn decreases, and the need to learn decreases because we know a lot of useful stuff already...
  • What's even more interesting is asking someone to say the alphabet backwards as fast as they can forwards :)
    ----------
  • Because one chose the path of learning how to do such things? I may have led a particularly technology minded life -- but that doesn't mean that someone who focussed their energies on something completely different doesn't have the aptitude or ability to do such things.

    I struggle daily with the 100,000 things I could or should be learning. You could say I like to learn. However, I then realize that if my situation were different then I wouldn't worry so much about such things because I wouldn't be driven by the need to succeed and I probably wouldn't care to learn anything at all besides what i consider "fun".

    You can psycho-analyze a young and older person and you will find that both are particularly opinionated. I know I am. The fact still remains, though, that younger people are still exploring their possibilities so they may seem more open to learning. However, as I'm sure some people would agree, once you get locked into a particular profession, you tend to not really care about anything else except that which will improve your job. There are just too many things to focus on in your own area of expertise to give a s*** about "new-fangled" technology. I know if I was 60, I would rather be manipulating my retirement plan to accommodate my need to travel while brushing up on my golf stroke.

    Yes, knowledge is sometimes motivating and even fun -- but most people don't care about such things that don't directly affect them (you can argue that techno geeks go out of their way to learn such things, but I'd just say that it's in their area of expertise). Do most people care -- as given in an example further up -- how Asynchronous Transfer Mode protocol works and how it's used on every non-legacy telephone network now? Nope.
    ----------
  • Yeah really. I usually just go to one story and find somewhat interesting comments to add points to -- as well as moderating down flame comments that have no point at all.

    Nevermind reading the second or third pages of comments at all :). The story would have to be incredibly interesting to spend 3 hours reading comments :).
    ----------
  • It's funny you refer to me as, ahem, a "yankie prick", given
    that I am a fairly recent immigrant from Europe.
    I wouldn't call Haider's election marginal or freak, he has
    held high posts before, and came back after being forced out,
    so this shows clear support for his views. Indeed my understanding
    is that some other prominent Austrian politicians have had a
    direct relationship with Hitler's SS and Hitleryugend. So specifically in
    the case of Austria, Haider seems to be the rule not the exception.

    Except for strange criticism of freedom of speech (for KKK or whoever),
    I concur with your opinion of America. You seem to be just like our
    politicians here - slinging mud at others hoping it'll make yourself
    look better.
    But remember, my point was that Buchanan is similar to Haider, whom
    you just called a "marginal ultra conservative" - a very fitting description
    of Buchanan as well. And republicans vary widely, from very conservative
    to very liberal. Most even desire to save social security - the ultimate
    big government program around these places.
  • Oh, and that quote "man of character" was how Haider referred to
    SS veterans.
  • It seems to me that this Times article is saying "No one knows for sure, but some people think this, and others think the opposite." Write an article when someone can actually influence the argument one way or the other.
  • Quite interesting reading.
    So now we know that the brain improves itself.
    So with the proper technology applied, in some distant future, it should be possible to increase intelligence (a function of neural connections) without limit.
    HEADLINE: The entire human race turns geek.

    Evolution anybody?
  • So if new cells are continually being added to the brain, then there is no reason why we ever need to stop learning.

    And if new cells are perpetually being created and the incorporation of these new cells in our brain allows us to change our thinking, then there is no reason why we can't accept new ideas.

    So does this mean that it is possible to teach an old dog new tricks? While people are obviously going to be different, I think it is a fair statement to say that older people tend to be more fixed in their opinions and less willing to change them. If new cells are being created all the time and this enables our brains to be malleable, then is it simply stubbornness or our attitudes that prevent us from being open to other people's ideas or new technologies?

    How is it that some older people can't comprehend an ATM, while others can easily use a computer? I don't think it is related to how smart you are, but maybe it is a function of how many brain cells are being created or how their bodies manage to incorporate them into their existing brain.

    I think the research they are carrying out has some very interesting ramifications and opens up a lot of questions about ourselves and our learning ability over time.

  • I think this is admirable in a newspaper. I'm sick of papers with slanted arguments - this one seems to actually be presenting _BOTH_ sides of the picture!

    Go NY Times!
  • by Improv ( 2467 )
    The article mentions some people who have
    "lost the ability to tell right from wrong"
    through neurological damage. As right and
    wrong are subjective classifications, I wonder
    what exactly they mean by that. Could they mean
    losing the ability to empathize with others?
  • It seems like every week I read about debate on enhancing this, fixing that. Someone needs to get the balls, lawyers, and volunteers to actually test on a human.
  • For all the non-Americans: Buchanan
    is one of the smartest analysts around,
    yet he is also one of the most bigoted
    people I am aware of (almost like that
    Austrian Führer, Haider or something).
  • The result is a world of individuals, a
    place where the very idea of a brain transplant is absurd.

    Absurd? Maybe in the way you can get a new brain, but not a new body. Once we make tools small and smart enough to re-connect even the tiniest nerves you'll be trading that old meat for the newest model. "Honey, Look! This one comes with a 10-year snore free guarantee!"

  • Not really. This article didn't have a lot of conviction behind it, and I'd be hard pressed to say that it convinced me that that sort of thing would be possible. And without limit? There's a limit to everything, the trick is just finding it.

    -----------

    "You can't shake the Devil's hand and say you're only kidding."

  • Thus is the story in just about EVERY article on the brain. And that's how the scientists are, too. I work in a large neuroscience department (as a sysadmin, certainly not a researcher... although I wouldn't be surprised if I've been a subject unknowingly... :> ) and there isn't a single pair of doctors or grad students in the place that have the same "theory" about any aspect of how the brain works, is structured, or anything else.

    -Chris
  • Don't misunderstand me. I agree that unbiased reporting is a Good Thing. What I was trying to express is that I feel that an article isn't worthwhile unless there is new research that supports one of the two scientific camps' positions.
  • We (our brains) are shaped in two ways:

    No, your mind is shaped by Nature and Nurture. Your brain is physically created by physical processes (is that a duh?). These processes have nothing to do with the time Billy beat you up and took your lunch money. They have everything to do with how cells are instructed in the creation of other cells. These instructions are in the DNA.

    Daniel
  • I completely agree. The article starts out by saying that new evidence has come forth supporting the idea that neurons can be regenerated, but it never really gives much evidence.

    In short, the article just didn't present anything new. That would be okay, normally, as review articles are fine. However, the article is rather poorly written imho in that it doesn't clearly state what it's going to be about at the beginning, leaving the reader guessing.

    And after I finished reading it, I still felt like the article had no real point.

    It's certainly not worthy of being posted to /.
  • by Anonymous Coward
    now a utility like purge -f /dev/brain for all that useless info we learned in school.. would be useful .. yah?
  • Don't give me this crap about how right is
    objective.. Isn't that a clever argument? Feh.
  • Maybe someone else more capable of coherent thought and writing (read: anyone who sees fewer than three keyboards in front of them) who has read the articles could respond?

    I haven't read those articles, but I've read

    • An Anthropologist on Mars
    and
    • The Man Who Mistook His Wife For a Hat
    , both by Oliver Sacks. Good reading.

    Anywho, though I can't remember which book it's in, Sacks relates a couple case histories of people who took damage to their frontal lobes, similar to your classical lobotomy. One is the great old story about the man who was tamping dynamite and had the rod shoved through his head. He then examines the details and extrapolates some interesting possibilities. It's a bit old, but still relevant, I think.

    What I seem to remember is that this man who had the tamping rod blown through his frontal lobe became immoral, like these other subjects you talk about. But the immorality is relative, right? The trick is that he would have been considered immoral in any culture, because what he really lost was his ability to foresee the consequences of his actions. As another side effect, he also became a much worse chess player...

    I also seem to remember that the other case history involved a juvenile delinquent who became "moral" after suffering damage to his frontal lobe. Fun, no?

    That's why lobotomies have been practiced. They actually get some results. [Shudder]

    Anyway, it'd be neat if some neurologist, like Sacks himself, read this thread and added to it.

    If you know any neurologists, poke them and prod them into participating, ok? In the mean time, I'll go hunt one down...

  • On a side note, if it is true, that we do continually form new neurons, then why do older people seem to learn much slower and often forget things? Is it even possible to really ever forget something?

    New information can replace old by means of synaptic changes--happens all the time in ANNs (artificial neural networks), where no new nodes are added once training has started. I doubt that every time you learn something, you get a new neuron.
    As for the case that old people seem to learn much slower: it takes less time for a fresh mind to learn a completely new concept than it does to incorporate a new concept into an existing body of knowledge.
  • Something I've noticed;

    When people have a lot of self-esteem they really don't have much trouble learning, because their learning process is just that: learning. They know what to ask, they know where to look.

    If they don't have that self-esteem, they have to artificially build their confidence by over practicing their field of study past the point of ridiculousness.
  • Okay.
    Yes, evidence may show this, as a statistic.

    I think the reason, though, lies in what we choose to do with our brains.

    If you don't think about abstract problems very much, your ability to do so declines.
    If you never think about math, your ability to deal with it declines.

    If you start studying math again, and getting all these patterns wrapped up in your head, you start getting better at it again. The brain is good at what it does the most, as those are generally the most important things to you.

    If you get older, and stop thinking so much, then of course, when some weird problem comes by, you'll think 'I could have solved that in my youth'. But really, you could have done it now, too, if you hadn't given up.

  • Dude, I agree with you 100% and it's refreshing to see you critique your post so honestly. It's a case of a moderator wanting to use up all his points. Sometimes it's better to just eat the points fellas.
  • It would be a lot easier for you to just make your own account, just like you did for slashdot. They don't ask for your credit card number or anything like that.
  • I don't really follow the assumption in the article (and many of the comments) that the brain can't learn new things unless it can create new neurons. Even if this research doesn't apply to humans, and actually we don't generate a substantial amount of new neurons as humans, that doesn't rule out the potential to learn 'new tricks' at all.

    You see, the brain has so many neurons that the number of potential configurations through adjusting the 'weights' between neurons is almost limitless. Through adjusting these weights, the brain changes its behaviour in some way--it 'learns'. It is not necessary to create additional neurons to learn more things.

    Of course, if a module of a brain is not properly connected due to some affliction (as occurred with the people mentioned in the article who can't tell right from wrong) additional neurons would be required to fix the problem.
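
    That weight-adjustment idea is easy to sketch with a toy perceptron (my own illustrative example, nothing from the article): a fixed set of "neurons" learns one function, then relearns a different one, purely by changing the weights between them -- no new nodes are ever added.

```python
# Toy perceptron: a fixed-size "brain" that learns only by adjusting
# the weights between its neurons. No new nodes are ever added.
# (Illustrative sketch only; not anything from the article.)

def train(samples, epochs=50, lr=0.1):
    """Train a single threshold unit with the classic perceptron rule."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            out = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - out          # 0 when correct, +/-1 when wrong
            w[0] += lr * err * x[0]     # nudge weights toward the target
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
OR  = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

w, b = train(AND)
print([predict(w, b, x) for x, _ in AND])  # [0, 0, 0, 1] -- learned AND

w, b = train(OR)  # same architecture, retrained: an "old dog", new trick
print([predict(w, b, x) for x, _ in OR])   # [0, 1, 1, 1] -- learned OR
```

    The same two input "neurons" are used throughout; only the connection strengths change, which is exactly the point about learning without neurogenesis.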

  • These things you claim are universal just happen
    to be things that are enforced in all cultures
    you are familiar with. And even then, there are
    several cultures that are (or were) very different
    (e.g. Aztecs and several other primitive cultures)
    Murder as you state it isn't specific enough to
    be considered much proof for the existence of
    a universal morality -- you're unable to make it
    anything more than a loose category, one tied to
    a biological imperative natural to the species.
  • No, the FULL headline would read:

    ENTIRE HUMAN RACE TURNS GEEK
    Slashdot traffic increases by 15,000%.
  • Well, you may like this "man of character"
    (you may even adore Hitler's employment
    policies) just do not compare him to
    moderate republicans. Buchanan is a far
    closer comparison.
    BTW, the last Austrian politician that got the
    world to pay attention actually was Hitler.
    Makes you wonder, doesn't it?
  • His main objection to the 10% myth seems to be that it's used by people whose ideas he disagrees with. The evidence to support his own claim that we use 100% of our brains boils down to two points:

    > 1. PET scans show that...over the course of a whole day,
    > just about all of the brain is used at one time or another.

    If I have 200Meg of RAM in my machine and I use pretty much all of it over the course of a day, but never more than 20% at a time, does that prove it is all necessary?

    > 2. The myth presupposes an extreme localization of functions
    > in the brain. If the "used" or "necessary" parts of the brain were scattered all around the organ, that would imply that much of the brain is in fact necessary.

    No, it doesn't. You might as well claim that since apples are scattered around the apple tree, the whole tree is edible.

    I am aware that neither of my objections demonstrates that we don't use/need all our brain. I am just irritated by sloppy thinking and flawed arguments from such an intellectually self-righteous writer.

    Personally I suspect that most people use most of their brains. Nature is efficient: brains require a lot of energy to keep going so we would be unlikely to evolve larger brains than we need.
  • The recent experiment on rats regenerating "supposedly permanently dormant" neurons thru intracranial Nerve Growth Factor, led me to research alternative NGF stimulants. Acetyl-l-carnitine does just this, with significant effects in human patients, and (shhh, quiet) it can be bought on health food store shelves. Time to reawaken brain cells!
  • Grow some of those brain cells back and learn to type like a somewhat intelligent human.
  • Yes, exactly that. Check out Thomas Hobbes's work, Leviathan, and I believe (although I don't have it here) the Crito by Plato; I may be wrong on which one it is, so look for the "speech of the laws of Athens" -- but reading Plato is good for the soul anyway. Basically, what the laws of Athens say is that since you choose to live in Athens (it's speaking to Socrates), you agree to obey the laws of Athens; if you don't like the laws, then leave.



  • Before we ask the question:

    "How much can the brain take?"

    We must first have the answer to this question:

    "How much is _much_?"

    That is, in other words - how do we QUANTIFY our question?

    The information (facts/data/stories/experience/perception/etc.) that we have in real life cannot be quantified as neatly as the data we have in the digital world - in bits and bytes.

    Therefore, it is almost impossible to quantify how much our brains can hold, because how do we quantify an "image" that is in our brain - especially those "images" that come with not only the "image" per se, but also "feelings", including the "feel" of a "smell" and the "background noise" that may be associated with that "image"?

    Until we can find a way to quantify our question, the question of "How Much Our Brains Can Hold", IMHO, is a no-brainer.



  • Maybe, like all other cells in our bodies, the brain cells are manufactured at a much slower rate when you are 60 than when you are 16...

    --

  • Lawnmower Man, anyone?

    --

  • I remember reading the details...they lost a part of their prefrontal cortex and were 'mean', emotionally unstable, and unable to accept long-term responsibilities. To the extent that empathy is the basis of morality, they might be said to have 'lost the ability to empathize.'

    The same damage was originally found in a man named Phineas Gage...check it out. Quite interesting if you're into the whole debate on the biological nature of morality.

  • I wasn't able to read the article, and I'm very tired right now, but I seem to remember something from my college neurobio classes about people who lose the ability to consider the consequences of their actions.
    Then there are the ones who lose some or all of their emotions, sometimes making them not care what's right or wrong (for instance).
    I think there are some that lose what moral code they had, but I don't remember much about them... my book's around here somewhere, but I'm too damned tired to look it up.

    Hope some of this helps

    KdL

  • I've always believed that older people probably have just as much capability to learn new stuff as the younger folks, but they just have less motivation to do so. I mean, when you're young and just starting out, you -need- to learn as much as possible to stay alive, secure a good paying job, catch a good mate, start a family, etc.. But once you've married, have a mess of kids who can more or less take care of themselves, and have enough beans saved up to live comfortably without working anymore, why bother putting the effort into learning new stuff? Learning hard stuff takes a lot of effort, effort which a young person might be willing to expend, but not necessarily someone who already has all the essentials in life achieved.

    There are, of course, some people who love the challenge and continue learning all their lives, even after fulfilling life essentials.

  • You're right of course, but there is also empirical evidence that the usage pattern of the brain has a strong influence on how much the mental facilities weaken over time. For example, one experiment showed that by using video games, the weakening of eye-hand coordination and reaction speed of old people could be slowed down. (So, Quake can actually be useful ;-)

    The mentioned research could help finding explanations for these phenomena and improve our knowledge of how to counter the weakening of mental facilities. Mental exercise is probably always good.

    Chilli

  • Phineas Gage.

    He was a construction foreman who got a bar stuck through his head. Neat story. He became, um, unreliable afterwards. He's one of the cases I was trying to think of in my post below.

    KdL

  • ...and eventually they lose the capacity to divide at all. Hence, no more brain cells, no more regrowth of dead storage space, a continually decreasing amount of synaptic connectivity, less neural regulation (hormone production, motor control, perception filtering, etc), dependence on diapers, etc.

    The golden years look more like pyrite to me...

    (do a search for telomeres and telomerase for an explanation...am very tired)

  • Age has nothing to do with new tricks. Old dogs
    _can_ be taught new tricks, and old people _can_
    learn new things.

    It's a stupid cliche really. The real deal is
    "You can't teach an old dog STUPID tricks."

    Dogs and people get old in part by not being
    stupid or doing stupid tricks.

    ciao mtngrown

    A semiold guy who will not learn stupid tricks.
  • Given the discussion that people who keep a habit of learning all through their lives are still mentally agile when old, an interesting connection is the constantly increasing demand for life long learning. Especially in high tech jobs, nobody can expect that the knowledge that is sufficient today will still be sufficient in 20 or 30 years; this is probably most extreme in the IT industry.

    Does this imply that more and more old people will be mentally fit? Given the problem of the dramatically increasing average age in most industry nations, we should probably hope for this effect.

    Chilli

    PS: Maybe most smart old people will in the end be computer geeks, because they were forced to keep learning ;-)

  • Wired also had a report on this a few days ago,

    http://www.wired.com/news/news/technology/story/22223.html [wired.com]

  • "Just how malleable is the brain? How easily can a person overcome the forces -- genetic and environmental -- that shape a creature from birth?"
    That makes it sound as if all the forces are external, as if the person and his brain can't apply any growth forces. So everyone is a victim of their environment, as they can't think for themselves and do anything different than they are told. Just go flip your burgers, pay half your wages in taxes, and vote for us because we'll tell you what is best for you. The new discovery is that the brain installs new neurons in the approximate location of old neurons, so the brain is always rebuilding itself. So the nature is to allow continued learning. The new neurons are similar to the old, but do not have exactly the same training as the old neurons; they learn what to do based on what you are experiencing and trying to do right now.
  • It's not really nature vs. nurture; rather, both of them play a part in everything. I.e., our ABILITY to learn, or to grow new neurons as might be the case here, is part of our heritage and thus affects the ability to form new connections.
  • Actually, your brain IS shaped by both Nature and Nurture. There simply isn't enough information in the DNA to form all of the complexity in the brain -- environmental factors have important influences on the normal development of the brain.
  • The article mentions the 'folk-wisdom' that one "cannot teach an old dog new tricks". Until about 1995 (wild guess) I'd always read that neurologists/neuroscientists believed this to be true in the sense of no new neurons being formed after X years. Now we know that new neurons can form in adult brains.

    This is not as simple a change in knowledge as, for example, recognizing an Rh factor in the blood. To say one year there are 4 blood types and the next year say there are eight is not a contradiction (unless you say _only_ 4). But it is a contradiction to say one year that new neurons don't form, and the next year new neurons do form.

    So what does this matter? People make mistakes, right? Well sure, but any neurologist who mouthed off saying 'no new neu' really should have said 'I have no clue', 'I don't know', or any myriad of truthful statements; if you're going to be a scientist, you mustn't just make $#!+ up!

    And another rant! Even if we couldn't grow new neurons, that wouldn't preclude learning in later years. Barring X-Files pedagogy, there isn't anyone who knows for certain how the brain and the mind come together. Last I'd heard, neurons were no longer the brain's wires (probably a bad metaphor anyway), but the microtubules within the neurons were. And the (programmer's) neural net idea seems to be out too, since there are non-neuronal broadcasts also; chemical for sure, and electrical as well, IIRC.

    So I'll throw out three hypotheses for fun and chunk in an aphorism.
    Fun Hypothesis #1: The heart is the seat of the soul/mind/whatchamacallit. Why? Electrical isolation. It sets its own rhythm.
    Fun Hypothesis #2: The stomach is the seat of the soul/mind/whatchamacallit. Why? Just a gut feeling.
    Fun Hypothesis #3: The entire body is the seat of the soul/mind/whosamadosit. Huh? Ever seen them separated? Phantom limbs/pain. All cells have microtubules. Easy check: start correlating physiological and psychological dysfunction; such as (say) systemic failure (a la lupus) with dissociative personality disorder. If there is a correlation then see which came first. If there is such an order, see if the connection is causal. (Of course, I'll bet the disease is just so bad that dissociation is a coping action. But even that idea supports a much stronger tie between the whole body and the mind.)

    Aphorism: The second most important statement in science is: "I don't know." The first should follow the second.
  • Ok, I understand exactly what you mean...but I think you're missing that they're presenting a totally NEW idea in this article. The research they're talking about is suggesting things never thought of before and attacking some of the most fundamental beliefs of neuroscience.

    Don't understand it as just presenting some new inconclusive research regarding one "side" or the other...the author of this article was informing the reader of very important new developments in how we see the brain. This is a very important function of a NEWSpaper.......

    If you still thought that there was no way for the brain to produce more neurons, then you'd be seriously ignorant of modern neuroscience...so good job, NY Times.

    Mind you, I'm not arguing that it should be on slashdot :)

    [its 3:15 AM here in New York....forgive me for any incoherence]

  • How is it that some older people can't comprehend an ATM, while others can easily use a computer? I don't think it is related to how smart you are, but maybe it is a function of how many brain cells are being created or how their bodies manage to incorporate them into their existing brain.

    I remember reading a study awhile back which showed that people who are in occupations where they exercise their mind heavily tend not to lose brain mass. In contrast, people who have relatively mindless jobs (such as bus drivers) tend to actually lose brain mass.

    I think this phenomenon is related to your question about learning: If you rarely exercise your ability to learn, then your body doesn't bother to maintain that part of the brain, and it literally atrophies. It doesn't matter how much you already know -- if you're not equipped to learn, you can't learn (at least, not easily).

    Neurogenesis implies that you might be able to restore some lost functions, depending on where those new neurons are placed. I wouldn't bank a lot of money on it.

    A different issue that affects people's ability to learn certain topics is fear and/or strong misconceptions: some people are just too afraid (either consciously or subconsciously) of a particular topic to learn anything on it, or too certain that a topic is too difficult to be learned. Basically, both states lock out the person's ability to accept that they can and should learn about the topic at hand. I see both happen to computer newbies all the time.

    --Joe
    --
  • useless comment, but i, for one, really hope neuron regeneration does exist. otherwise i might end up paying for all those years of excessive drug use in my youth ... oh my ... :)

    - j

    ---
    "The only guys who might buy [the Apple iBook] are the kind who wear those ludicrous baggy pants with the built-in rope that's used for a belt." - John C. Dvorak, PC Magazine
  • "Your Mama" was the Slashdot nick of the person who submitted the article. (See the Slashdot article text at the top. "Your Mama writes...")

    --Joe
    --
  • What is this guy talking about?

    No matter how malleable the brain turns out to be, it is still governed by its genetic heritage. For new connections to form with ease, the underlying machinery must be in order. And the design of these biochemical cogs is encoded in the genes.

    We (our brains) are shaped in two ways: by nature (heritage) and by nurture (our environment). Forming "new connections ... with ease" has nothing to do with "underlying machinery" having to be in order (whatever that means). Humans are born with instincts, but anything can be learned whether one's underlying machinery is in order or not.

    On a side note, if it is true that we do continually form new neurons, then why do older people seem to learn much more slowly and often forget things? Is it even possible to ever really forget something?
    My grandfather had a hospital-induced stroke which left him mentally crippled for months. He was so down because he was forgetting everything he knew, and I hope to have this man's range of knowledge when I reach his age. Anyhow, during this I was reading a book called 'MegaBrain', one of those books you pick up to see how the ticker is working and to shed some light into our own maps.

    Its whole philosophy, which makes sense to me, is this: if you debilitate a person by putting them in a non-demanding environment, one that does not offer challenge or creativity, the brain shrinks (basically). However, if you put a person in an environment that is rich in material and challenges the person, then no matter what age, they will begin to grow new neurons and build cells to keep pace.

    To me these are duhs (you don't use it, you lose it). However, I gave this book to my grandfather and told him to cheer up, that he was going to be getting better soon; he just needed to keep challenging himself and he'd be fine. By all means don't shut off. And now, while some effects of the stroke are irreversible (damned idiot doctors), he is as lively as he was. If we can only get the docs to let him have some vino! (twice damned)

    -Malachi
