Science

Scientists map schematic of brain's fibers

jake_the_blue_spruce writes "A simplified press release here and an abstract of the actual paper here detail a Washington University study where they used MRI to track nerve fiber bundles from different identified areas of the brain. They made a 3D map of the resulting schematic. It's a lot like the bus-level view of a computer, with the various known brain areas as black boxes connected by fiber bundles. Cool." Downside is that you have to request an image of it from the article. But I still think my brain looks like my Trash-80.
  • The problem with this idea is that people's brains differ. To get a vision mod, you would need to map all the myriad connections for your own brain and tailor a device specially to it. It's so much easier to mass-produce "prosthetics" like night-vision goggles that there would be no point in messing around with your brain (even if we had the faintest clue about how to do it, which we don't).

    I don't like to rain on people's parades, but as a student of cognitive science I consider it my job in this case. Don't believe the hype.


    Well, I hate to rain on your parade, but as a medical student, I believe that your opinion has almost no basis in fact.

    There are large areas of the brain that are identical in shape, position, and function in the human. In fact, all pathways in the brain are the same from person to person, disregarding pathological changes.[1] For example, the optic nerve, chiasm, and tract have a very predictable pathway, and lesions can be localized based upon the visual deficits. This would not be possible if we were all wired differently. But that's not the case. If you have bilateral temporal hemianopsia, I know right where the lesion is. In fact, give me 100 patients with the same presenting visual disturbance, and the lesion will be in the same spot.

    We know where the visual cortex is. We know how it receives data. We even know quite a bit on how the brain processes images, and what stages and levels of neurons do what processing. And it's the same in you, me, and Linus.

    And, in fact, there are working retinal prototypes for people who have certain types of blindness. They're extremely crude at this stage, but they exist, and they work.

    We know quite a bit about pathways - and these would be the easiest areas to implement data supplementation. [2]

    What we know little about is why you hate tuna fish, and I love it. Or how people develop thought processes.

    Basically, once information gets to the level of the cortex, we don't know what happens. But, we do know how the data gets there, and we have a pretty good idea what data is stored where. I don't know what all is involved with the term "cognitive science", but it appears from your lack of anatomical knowledge that you don't study neuroanatomy.

    I will agree with the concept that interfacing directly with the cortex is probably technologically impossible for the reasons you state. However, getting data to the visual cortex, or any other part of the brain, for that matter, isn't that difficult. How the brain processes that information, on the other hand, is still a major mystery.

    [1] Which, BTW, is why the head transplant stuff isn't good for prolonging life in and of itself.

    [2] There is a simple experiment that you can do to place data into your visual cortex. Push on your eyeball (not hard...) and you'll see a circular light in your field of vision on the opposite side of your point of pressure. Congratulations, you've just mechanically put data into your visual cortex.
  • from a bookstore.com page

    "At the heart of this book is the revolutionary idea that human consciousness did not begin far back in animal evolution but is a learned process brought into being out of an earlier hallucinatory mentality by cataclysm and catastophe only 3,000 years ago and still developing. "


    So the pyramids were built by aliens after all!

    Thanks for the pointer to the book.
    I'll get my son to read it. He can read circles
    around me.
  • Well, I hate to rain on your parade, but as a medical student, I believe that your opinion has almost no basis in fact.

    My condolences. There's still time to drop out and join an internet startup ;-^)

    If you have bilateral temporal hemianopsia, I know right where the lesion is.

    Almost always a lesion involving the chiasm, such as a pituitary adenoma or craniopharyngioma.

    In fact, give me 100 patients with the same presenting visual disturbance, and the lesion will be in the same spot.

    Unfortunately, this isn't quite accurate. Loss of vision in one eye might be caused by a central retinal artery embolus, a fracture involving the optic canal, an optic nerve glioma (intrinsic to the nerve), clinoidal segment aneurysm or meningioma (extrinsic compression), or perhaps a degenerative disease of the retina.

    We know where the visual cortex is. We know how it receives data. We even know quite a bit on how the brain processes images, and what stages and levels of neurons do what processing. And it's the same in you, me, and Linus.

    To a first approximation. Primary cortical areas are the most conserved, but higher order associative areas are poorly understood and difficult to map. I'm sure you are familiar with Penfield's experiments involving cortical stimulation during awake craniotomies for epilepsy. We still do not really understand these experiments. Moreover, there can be considerable variability between people of different sexes (more bilaterality of language representation in females), age (relative weakening of uncrossed pathways after childhood), and even among individuals. This is why we must do amytal tests and intraoperative cortical mapping in some cases. This is why it is probably a good idea to stimulate and record before making thalamotomy lesions, rather than simply depending upon a generic atlas.

    I will agree with the concept that interfacing directly with the cortex is probably technologically impossible for the reasons you state. However, getting data to the visual cortex, or any other part of the brain, for that matter, isn't that difficult.

    A direct cortical interface is not that unrealistic. Although it would probably be impossible to implement an electrode grid with the same resolution of native visual cortex, it is reasonable to expect that we can achieve light, shapes, and shadows.

    I'm sure you are also familiar with cochlear implants - electrodes essentially stimulating the cochlear nerve. At first, these patients hear a lot of distortion, but over time, their brain seems to tune itself to the input and they have serviceable hearing.

    Also read Merzenich's work on cortical plasticity. Even in primary sensory or auditory cortex, the topographic maps can be altered somewhat by changes in sensory inputs. Hence, representations of fingers change when they are sutured together, representations of tones change with auditory conditioning, and the relative sizes of barrel fields representing whiskers change with differential manipulation of the rodent's whiskers. I'm sure that this type of plasticity will be exploited in neurorestorative strategies.

  • Could they wire me to play like Hendrix?

    Or did that come from somewhere else, not
    his brain?
  • Mods such as that would be simple implants of biograde lenses and appropriate emitters for such things as IR, UV, etc.

    Ah, the Star Trek Visor!

  • I don't think the two are connected, but the tweaked mice raised my eyebrow too. I'm surprised it didn't get posted on /.

    The CNN.COM version of the article ends with an ominous 'ethically questionable' blurb. B.S. says I! I've got my sleeve rolled up, and I'm waiting for the shot to come.

    Mice are only good (in I.Q. research at least) for running through mazes and pushing on colored buttons. With this therapy/re-engineering, they got significantly better at their forte.

    Imagine what an army of penguins could accomplish with the same genetic hack... And with gilded nerves... well...
  • Roger Penrose: The Emperor's New Mind and Shadows of the Mind; excellent books :)
  • I don't know anything about this, but I have always assumed that it was pretty much a given that we would not be working with individual nerve connections any time soon, and that we didn't really need to. The real question is how you get the brain to learn to use the equipment you plug into approximately the right place, so the real advances are not necessarily in knowing what connects to what, but in knowing the connectivity-establishing rules for the brain (which might involve neural growth hormones).

    My question is: how much is known about how to do this? I know about the leech computer that was mentioned on here a while back, so we have at least some idea of how to electrically train neurons to do certain things, but the leech-to-computer link may be really wasteful in that project (in terms of using the same nerves to do different things).

    What would be really cool, and maybe realistic, is a monkey version of the leech computer, i.e. a living monkey with a calculator attached to its head who shows no side effects from using the calculator. It would show that we are maybe connecting to the nerves without fucking things up by talking to too many of them at once or something.

    Jeff
  • The fact that neuroscientists are still using black boxes and power lines to represent the supposedly functional areas of the brain demonstrates a crude model of the indubitably sublime nature of mind. Both the methodology and theory seem primitive today; digital logic models are not really comparable to the subtle tides of thought and perception that really occur, and can hardly begin to address creativity or spirituality. I'm not considering implants while the scientists can't see the forest for the trees. Waiting for paradigm upgrade.

  • Yeah, didn't you know that Earth was created to compute the ultimate question to the ultimate answer of Life, the Universe, and Everything?

    Beer recipe: free! #Source
    Cold pints: $2 #Product

  • It was the one that came with the original post. I just followed a link on the page for the HTML-formatted version of the journal article. My university has an electronic subscription to that journal - any computer on campus or coming through the uni's dial-up can access it. I guess they recognize us by the sub-net or something.
  • I don't know about not enjoying it if you're not sympathetic to the viewpoint; heck, I don't hold that viewpoint, but I still enjoyed his books. Basically, anything that lets you know more about something, or gives you a different perspective can't be bad; at the worst, all you do is read and discard. At best, you take away more info than you started with and your perspective is a little broader; you know the arguments you'll have to beat. Or, alternatively, it just might change your mind.

    Knowledge is a good thing; suspend judgement until the end and then make your own decision. It might even change your mind; you never know.

    Simon
  • Presumably, the ultimate goal of this research is to figure out where to build the NSA backdoor into people's brains when they're born.
  • >Not too long from now, you won't have to practice that perfect golf swing for months. You'll just go
    >to a walk-in clinic and have the responsible nerves anodized during lunch time, and be out on the
    >green, kicking butt and taking names by early afternoon.

    Umn, no.

    It might be possible to acquire abstract information this way, but most physical activities require MUCH more than the neural knowledge. You could get 'programmed' to play piano in an afternoon, sure, but you wouldn't suddenly get the muscular strength and flexibility in your fingers that would be necessary to implement that knowledge.

    The same applies to any physical activity that currently requires practice -- there's much more than the "book learning" going on there, so your hypothetical golf swing in a single download is simply not going to happen.


    --
  • Regarding fMRI's: I've read a bit about this in relation to studying the 'limbic system' (Here [vt.edu] somewhere, I believe; interesting papers, even if it isn't).

    All functional mapping tends to paint a biased picture of the brain. In particular, the cortex tends to be over-represented, compared to the limbic system. Unfortunately, I don't think this can be avoided at present.

    Carrying that further, I'm not terribly confident about how useful non-invasive techniques can be: in particular, it is currently rather difficult to study neurochemistry without taking apart brains, which tends to result in death, and even then you cannot extract much detailed information. I'm not sure I'd consider 'functional mapping' to be accurately mapping any functions, especially sub-cortical functions, while it's based entirely on neural firing patterns.
  • [2] Push on your eyeball (not hard...) and you'll see a circular light in your field of vision on the opposite side of your point of pressure.

    OOPS, Man, how do I put my eye back in???
  • You know that sweatband Hendrix seemed to be perpetually wearing? Soaked in LSD....listen to the recording of his 'Star Spangled Banner' from Woodstock. You can tell when it starts to hit his system. ;)


    -Andy Martin
  • Wow, Dungeons of Daggorath, now that name brings back memories. I LOVED that game!!!

    Oh well, back to work and reality for a while...
    ---
  • This modality - MRI - gives excellent spatial resolution. Unfortunately, it is not so good with temporal information, which can be at best on the order of a second, which is much slower than the brain processes information. In my lab we are using magnetoencephalography and related methods, in concert with MRI, to look at the temporal dynamics of the brain.

    In other words, MRI gives a schematic of the brain, and fMRI tells you which parts get warm when doing certain tasks. We're trying to use MEG like a logic probe, to look at timing.

    Our code base, both for signal processing and for visualization, is all developed on Debian GNU/Linux machines (both i386 and Alpha) and will all be released under the GPL. It is also all being ported to SGIs and to large Linux clusters.

    If you're interested in figuring out how the brain works and want to get a PhD or MS in CS at a really funky department while hacking Linux and playing with gonzo brain imaging data, don't be shy - get in touch.
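    The temporal-resolution point above can be sketched with a toy calculation (the sampling periods and event times below are invented for illustration, not taken from the paper or from any lab's actual methods): events tens of milliseconds apart stay distinct at MEG-like millisecond sampling, but collapse into a single bin at fMRI-like one-second sampling.

```python
# Toy sketch: count how many sample bins (of a given width) contain at
# least one event. All numbers are illustrative.

def distinct_bins(event_times_ms, period_ms):
    """Number of sampling bins of width period_ms that see >= 1 event."""
    return len({t // period_ms for t in event_times_ms})

events = [100, 150, 200]   # three neural events, 50 ms apart (made up)

meg_like = distinct_bins(events, period_ms=1)      # ~1 kHz sampling
fmri_like = distinct_bins(events, period_ms=1000)  # ~1 s sampling

print(meg_like, fmri_like)  # only the fast sampling separates the events
```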
  • The PDF version of the paper, with lots of detailed images, is here [pnas.org].

    I'd like to commend the PNAS for putting the PDF file online for free and with no registration hassle. In addition, the server seems to be holding up quite well. It looks like a successful defense against the /. effect does exist!

  • As is mentioned in the abstract, Francis Crick has called in the past for more work on human neuroanatomy. If we are ever to fully understand the brain, we need to know how things are connected. Much work has been done in the macaque monkey, where we now know a large amount about the neurocircuitry of the visual system. Such work has been done in the macaque because many of the techniques used could not be used in human subjects. Research such as this paves the way for the future of neuroscience, allowing us to understand the way the brain is wired. Without knowledge of the wiring, we cannot begin to understand brain function.

    With that said, however, there is one caveat: neuroscientists are not at a consensus as to the usefulness of fMRI. The main question is, does increased blood flow correlate with increased firing rate of neurons? Such research (to my knowledge) has not yet been carried out. Whatever the outcome of the fMRI debate, however, current studies such as this neuromapping research are helping us to further understand the wiring of the human brain.

    Nick Knouf
    nknouf@cedric.caltech.edu
  • Anyone have a username/password for this URL?
    cypherpunk/cypherpunk
    cypherpunks/cypherpunks
    didn't work.

    Are they perhaps actually charging for access?
  • God no, whatever you do *DON'T START WITH DENNETT*. He is probably the single most destructive influence on philosophy students of the last decade, and I shudder to think what he's done to the 'interested amateurs'. Hofstadter isn't so bad, but he's coming out of AI, so you shouldn't expect a completely unbiased view. (No offense to AI workers, but the field tends to be quite a ways behind neuroscience and philosophy.)

    Back to Dennett, his primary interest seems to be the philosophy of mind, and his primary tactic is to ignore the difficult parts, or just as often, rewrite the problem into a form he can solve in under one paragraph. He's done an excellent job at making it *look* like he's getting things done, but in reality, he isn't. (William Seager, in his latest book (can't remember the name) devotes two chapters to analyzing Dennett. John Searle inevitably opposes him, and I believe David Chalmers and Owen Flanagan have been critical of Dennett in recent work, as well as many others I can't remember.)

    Unfortunately, there aren't many good introductions to the philosophy of mind available today, and fewer that don't follow the Dennett-Churchland line of reasoning. Nagel had one, but it's a bit dated now, and basically finishes up with his neutral-monism/panpsychism view.[0] Flanagan "Consciousness Revisited"[1] was fairly good, from what little I remember. Chalmers would be OK, except that he tends to use rather questionable grounds for his arguments.[2] *sigh* Oh well, there goes philosophy...

    [0]: Which I'm fond of, but which isn't very popular in most circles. I think a more balanced presentation would be preferable.

    [1]: That was the title, I hope. There have been several books on philosophy published in the last few years titled 'Consciousness something or other'.

    [2]: I agree with Dennett on several fronts, including his belief that zombies are absurd philosophical tools. Unfortunately, his argument against them is equally absurd (along the lines of 'we're all zombies (or zimboes)', IIRC).
  • Probably the best place to start reading about this sort of thing is Douglas R Hofstadter and Daniel C Dennett, "The Mind's I", but everything I've read by either author has been excellent.

    I think the answer is this: *you* know you're thinking about your brain, at least if you stop to ask yourself, so clearly the information is available to brain processes should it be relevant. But it seems damn unlikely that it would look greatly different than other kinds of deep thought to any probe that only measured low-level activity like electrical patterns or chemical changes.

    Put it this way: do you think your computer knows when you're recompiling a kernel?
    --
  • It's nice to see a civilised disagreement on Slashdot... ESPECIALLY between two well-educated people... I mean, where else in the world can you see two neurologists disagree about the integration of computers into the mind?

    Kind of a change from the Bill Gates Sux / Linus Sux stuff we usually get on here :)
  • by xQx ( 5744 )
    The ultimate in the integration of computer and mind.
  • I must point out that this is not the first work showing brain connectivity. In fact, people have been doing this for a decade with MRI, and before that with more invasive means. For example, Douek et al. (Journal of Computer Assisted Tomography, 16(6),923-929:1991) colour mapped myelin fiber orientation in the brain using diffusion weighted MRI.

    Brain connectivity is important because specific regions of the brain must communicate to achieve higher functions (see Broca's area [wustl.edu], for example). Coupling this information with functional information (regional metabolic activity in the brain also measurable with MRI or PET or SPECT) can provide valuable insight into brain function and dysfunction.
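    The diffusion-weighted mapping mentioned above rests on a simple piece of linear algebra: water diffuses fastest along a fiber bundle, so the principal eigenvector of the locally measured diffusion tensor estimates the fiber orientation (often mapped to colour, as in the Douek paper). A minimal sketch with a made-up tensor - the numbers and the crude anisotropy index below are illustrative only, not the cited method:

```python
# Sketch: recover fiber orientation from a (made-up) diffusion tensor.
import numpy as np

# Diffusion tensor with strong diffusion along x, weak across it
# (values are illustrative, loosely in units of 10^-3 mm^2/s).
D = np.array([[1.7, 0.0, 0.0],
              [0.0, 0.3, 0.0],
              [0.0, 0.0, 0.3]])

evals, evecs = np.linalg.eigh(D)         # eigenvalues in ascending order
principal = evecs[:, np.argmax(evals)]   # fiber direction estimate
anisotropy = (evals.max() - evals.min()) / evals.sum()  # crude index

print(np.abs(principal))  # points along the x axis for this tensor
```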

  • OOPS, Man, how do I put my eye back in???

    I actually had a dream about that when I was younger... I popped my eye out of its socket at the dinner table and I was having fun hiding under the table with my eye being held up above the table, looking around.

    My mom just got pissed and told me to act my age and put my eye back. The dream turned a little horrorful when I realized I had no way to retract all that optic nerve back inside my head.
  • Our mapped-out brains look like the inside of a computer?... Maybe we're someone's PC and don't even realize it...
    I wonder if the mapped-out Internet looks like our mapped-out brain.
  • As soon as someone says any word/name, five zillion images jump into your awareness, all at once. Your sub-visual and sub-vocal spectra are flushed with all images and voices, perhaps even pushing out external light/sound. This is what never forgetting anything means.

    Zowie....this means people would actually have to stop and think before they spoke or acted. Most of the time, at least, this would be a good thing.

  • Any chance we can plop down some high tech gizmos into these pathways and get sensory enhancement? I'd like built-in night vision, please!

  • Please don't make analogies between computers and the human brain. I don't want my children to grow up running the beta of Mindows 2012.

    "There is no surer way to ruin a good discussion than to contaminate it with the facts."

  • yeah, real-time speech translation would be really cool too... or even something simpler, like an address book (for those of us that are bad with names!!!)
  • This is good work, great for medical applications and interesting and all, but just because you can map the neural pathways onto a schematic, doesn't mean that neural connectivity is the same thing as simple circuitry.

    Brains use things like neurotransmitters that have different actions, pulses that are somehow responsible for binding different activations, local effects in which "neighborhood" activations affect nearby thresholds, and other phenomena that are quite different from simple circuitry.

    Still, mapping out neural connectivity is the cog-sci version of the human genome project, and ultimately could be relevant to neural net engineering.
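    The "more than simple circuitry" point can be made concrete with even the crudest spiking model. Below is a minimal leaky integrate-and-fire sketch (all parameters are illustrative, not physiological): the same inputs produce a spike or nothing depending purely on their timing, something a stateless logic gate cannot do.

```python
# Minimal leaky integrate-and-fire neuron: internal state decays each
# step, integrates input, and fires (then resets) at a threshold.
# All parameters are illustrative, not physiological.

def lif_spikes(inputs, threshold=1.0, leak=0.9, reset=0.0):
    """Return the time steps at which the neuron fires."""
    v = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        v = leak * v + current      # decay, then integrate this step's input
        if v >= threshold:
            spikes.append(t)
            v = reset               # fire and reset
    return spikes

# The same three inputs, delivered with different timing:
burst = [0.4, 0.4, 0.4, 0.0, 0.0, 0.0]   # close together: sums past threshold
spread = [0.4, 0.0, 0.4, 0.0, 0.4, 0.0]  # spread out: leaks away, never fires

print(lif_spikes(burst), lif_spikes(spread))
```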
  • I don't understand why they're not simply putting the image on the Web. They want to test the /. effect on mailboxes or something?

    Could anyone kindly mirror it and post the URL here?

    "There is no surer way to ruin a good discussion than to contaminate it with the facts."

  • Interesting, but unfortunately the paper fails to answer the most important questions for Slashdot readers:

    Can this "human brain" thing run linux?
    Can it be networked into a Beowulf?
  • The problem with this idea is that people's brains differ. To get a vision mod, you would need to map all the myriad connections for your own brain and tailor a device specially to it. It's so much easier to mass-produce "prosthetics" like night-vision goggles that there would be no point in messing around with your brain (even if we had the faintest clue about how to do it, which we don't).

    I don't like to rain on people's parades, but as a student of cognitive science I consider it my job in this case. Don't believe the hype.

    Beer recipe: free! #Source
    Cold pints: $2 #Product

  • Can you post it somewhere when it is received? I'm sure that the slashdotters will want to get a good look at it. I remember watching the techs page through the MRI images of my brain at Yale-New Haven hospital, and seeing my medulla emerge on the screen... Wild stuff.
  • by Otto ( 17870 )
    > The problem with this idea is that people's brains differ. To get a vision mod, you would need to map all the myriad connections for your own brain and tailor a device specially to it.

    No. You create a mod that's fairly generic, then let the brain figure out how to make the connection. The brain is adaptive, right? It may take a while (years, maybe) for the brain to learn how to use this new data stream, and you might have some serious issues in the meantime, but it's possible.

    ---

  • It would be really cool if someone requested the photo from the media affairs office and posted it somewhere . . . I would hate to think of what the /. effect is doing to the poor people in that office :)

    bror
  • i want night vision, mind over matter, the ability to increase adrenaline in my system, and i want to put a cell fone in my head so i can just "think" calls. heh.
    awesome.
  • "To get a vision mod, you would need to map all the myriad connections for your own brain and tailor a device specially to it."

    Also, as you age, you gain new connections and change old ones, right? So the mod might not even last very long...

    Heh. Too bad, tho.
  • What's wrong with making analogies between computers and the human brain? Neural networks, protocols, interface design, etc. are all designed based on comparisons to the human brain. I agree that I don't want my kids' brains running the latest version of windoze (maybe they'll be better behaved, and when they crash I can get some rest)...

    If you study AI, you'll realize that all of that isn't based on hacking code, but on studying the brain and trying to mimic the way it processes info. So if we're to make advances in this field, get ready for much scarier stuff than just allusions to brains and computers.
  • Cool as it would be to see all the figures on the web, PNAS is primarily a scientific journal - and generally scientific figures aren't shown in full in the web versions of articles. (I doubt PNAS has seen this much activity in a long time - not too many of these articles get mentioned on /.!)

    Heretical old-fashioned suggestion: you could always go to your local university library and take a peek at the actual journal ... ;-)

    YS
  • I think the project wants to find out how the rewiring works. If you can control that, you don't need to wait...
  • Let's start a research foundation dedicated to this. We'll try to get Bill Gates to fund us, just because he has billions that he likes to throw around. Then, when we have brains networked together to work more efficiently, we'll convince Gates that all M$ employees should be networked together, then while they're all running WinBRAIN 2015 w/ Service Pack 4, we'll install Back Orifice in their heads, start sucking info out of it, and send images of flying windows into their minds 24-7. After we're done with that, we'll introduce a new virus we'll name after some dancer or some such, which will mostly randomly corrupt data, then on a random date in 2018, it will mutate into the "Brain Eating Mutants From Outer Space" virus, which they will think is some video game, which will eventually erase their brain. Then we hire them for $1.50/hr to clean our garages.

    Oh, and we'll also be able to make Vision Accelerators which will not only speed up the way we see things, but enhance it.
  • Good point! I guess the lesson is that, as in programming, you should usually use the most amenable interfaces--the ones that are exposed to you--instead of trying to hack the internals.

    Beer recipe: free! #Source
    Cold pints: $2 #Product

  • Oops. Forgot the smiley, did I? I love brain/computer analogies. I just hate having to reboot every night! :)

    I see nothing wrong with trying to mimic the brain's functions. It's even more interesting that some of the brain's functionalities were not mimicked in AI attempts, but independently developed. Cybernetics rule.

    "There is no surer way to ruin a good discussion than to contaminate it with the facts."

  • Um, yes, they are charging for access :-) This is a professional scientific journal, where yearly charges can run into the thousands of dollars. Many times, one can access sites like this from university computers where they have a site-wide license, but for other users, yes, you have to pay for the access. This is the only way to defray the astronomical costs involved in publishing a research journal.

    Nick Knouf
    nknouf@cedric.caltech.edu
  • That URL worked for me without any trouble at all. Maybe the site is licensed to certain IP's?
  • Or - horror of horrors - your university, in order to save money, has an electronic subscription to the journal, and you can read the entire article on the web, as I just did.

    Cute pictures.
  • This is pretty interesting stuff, but let's not get too excited about what it can and can't tell us. Sometimes the media tends to get a bit overexcited about basic science steps, claiming they are leaps.
    -- Moondog
  • I don't know about you, but I don't find skeletal structures to be particularly sexy...
    ---
    "'Is not a quine' is not a quine" is a quine.
  • URL? Share the wealth, man.
  • Hmm, that's entirely possible. I accessed it from my machine at UC Berkeley. I know that we have a number of "digital library" agreements.

    I definitely wasn't trying to be sarcastic. If indeed they are restricting access, then I withdraw my commendation.
  • Thanks for the clarification...I was sort of stepping into an area where I don't have much expertise :-)

    Nick Knouf
    nknouf@cedric.caltech.edu
  • This is pretty cool, but an image isn't so great to me. Schematic capture isn't as popular as it once was. How about modeling the brain in something more common in the electronics engineering field?

    When can I get a VHDL or SPICE model of the brain?

    If I can get that, maybe I'll program it into my FPGA? =^)
  • Well, I know enough about neuroscience to recognize and understand everything you wrote in your post, and I remember enough about the details of vision to know where in the occipital cortex different activities like edge detection, movement detection, and color perception go on, and where different "modes" of information like color and shape are integrated. I know that vision is the best-understood cognitive activity. In fact, I should have known better than to use vision as an example. ;)

    From your post, I guess that you are most familiar with the gross physiology of the brain. Back when I studied neuroscience actively, I was interested in computational accounts of cognition, which means that I was looking on a lower level, the level of individual connections among neurons and even of individual neurotransmitters.

    I suspect that our disagreement arises from differing views of what level of detail is important. While it is well-established what areas of the brain are responsible for what gross functions--language, appetite, emotion, attention, short-term memory, vision, audition--it is not understood how individual actions of those types are carried out. That's the reason the old "information processing" theorists' diagrams are so full of modular boxes with lines connecting them. To me, all the important detail is either inside of the boxes or in the lines themselves, both of which areas most people glibly gloss over. The common attitude that those things don't matter is one thing that eventually soured me on cognitive science as a career. However, this wasn't the real problem. The real problem was that I wanted a rigorous computational account of the thing that really mattered to me--language--and I eventually became convinced that the problem was intractable. The categories at that level are too abstract, and the distance from stimuli too great, to admit of a rigorous study. You clearly concede that point in your post, writing that "interfacing directly with the cortex is probably technologically impossible for the reasons you state".

    Anyway, to get back to the subject of vision mods, I believe that you do need to work at the level of individual axons to obtain useful effects. I realize that "useful effects" is in this case a hazy category, but I will leave it that way so it can be hammered out in further discussion. In looking back over my post, I realize I may have given you the impression that I thought the whole brain was an undifferentiated tangle, and that by "map all the myriad connections for your own brain" I really meant the whole brain. In fact, I meant the myriad connections of the occipital cortex--let's say the V4. I maintain that belief.

    This is the real issue: can you manipulate vision at a gross level, or do you have to descend to the cell level? For the so-called higher functions, I can see that we agree in considering the interface problem intractable. In a relatively well-understood area like vision, there is more room for contention. Nonetheless, I believe you will find that no useful effects--again, I leave that category open--can be obtained at the gross level. Remember, arbitrarily manipulating functions is much harder than ablating them by lesion.

    Beer recipe: free! #Source
    Cold pints: $2 #Product

  • This modality - MRI - gives excellent spatial resolution. Unfortunately, it is not so good with temporal information: its resolution is at best on the order of a second, which is much slower than the rate at which the brain processes information.

    Hi, Barak. Being basically a cognitive electrophysiologist, I, of course, have to agree with this, but I do want to point out that the Raichle paper is phenomenally important in that this is the beginning of doing real dynamic processing studies.

    The logic for how to use this data is straightforward enough; you mention MEG, but another technique that should work really well with this is the event-related optical signal (EROS), which was pioneered by Dr. Gabriele Gratton [missouri.edu] at the University of Missouri and is now spreading rapidly elsewhere.

    Our code base, both for signal processing and for visualization, is all developed on Debian GNU/Linux machines (both i386 and Alpha) and will all be released under the GPL. It is also all being ported to SGIs and to large Linux clusters.

    That's great! I think you should issue that as a challenge to many other labs out there.

    If you're interested in figuring out how the brain works and want to get a PhD or MS in CS at a really funky department while hacking Linux and playing with gonzo brain imaging data, don't be shy - get in touch.

    What he said, except ours is a Psychology department (so the department is psycho rather than funky).

    All kidding aside, the geek masses yearning to go to grad school despite the difficulties of being a grad student would be well-advised to take a look at the advances being made in human brain imaging and cognitive neuroscience when mapping out their careers.

    Jonathan King,
    Department of Psychology, U. of Missouri

  • I must point out that this is not the first work showing brain connectivity. In fact, people have been doing this for a decade with MRI, and before that with more invasive means. For example, Douek et al. (Journal of Computer Assisted Tomography, 16(6), 923-929, 1991) colour mapped myelin fiber orientation in the brain using diffusion weighted MRI.

    Well, people in the field know that there has been lots of work going on in this topic (computing the diffusion tensor). It is, after all, a fairly obvious thing to do. What seems to be novel to me about this is that they used the DT to trace out some very long fiber pathways, and they did it with just a wimpy 1.5T Siemens Vision scanner. And the big surprise to me was that the WashU folks beat the MGH group (among others) into print on this topic.

    Coupling this information with functional information (regional metabolic activity in the brain also measurable with MRI or PET or SPECT) can provide valuable insight into brain function and dysfunction.

    As Barak Pearlmutter mentioned earlier in this thread, this information is an even better fit for techniques that provide much better temporal resolution than fMRI, PET, or SPECT; he mentioned MEG (magnetoencephalography), although EEG-based methods can also contribute, and I can't help mentioning the amazing new "shoot lasers through the skull and get optical imaging data" technique known as EROS [missouri.edu], currently being developed at the University of Missouri and elsewhere.

    Jonathan King,
    Dept. of Psychology, University of Missouri
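
    The diffusion-tensor tracing the posters are discussing can be sketched in a few lines. This is a toy illustration, not the paper's actual method: the tensor field, step size, and nearest-voxel lookup are all my own assumptions. The idea is that at each voxel the principal eigenvector of the 3x3 diffusion tensor approximates the local fiber direction, and a streamline simply follows it step by step.

    ```python
    import numpy as np

    def principal_direction(D):
        # eigh handles symmetric matrices; eigenvalues come back in ascending
        # order, so the last column is the principal diffusion direction
        vals, vecs = np.linalg.eigh(D)
        return vecs[:, -1]

    def trace_streamline(tensor_field, start, step=0.5, n_steps=20):
        # Follow the principal diffusion direction through a grid of 3x3 tensors.
        pos = np.asarray(start, dtype=float)
        path = [pos.copy()]
        prev = None
        for _ in range(n_steps):
            # nearest-voxel lookup, clamped to the grid
            idx = tuple(np.clip(pos.astype(int), 0,
                                np.array(tensor_field.shape[:3]) - 1))
            d = principal_direction(tensor_field[idx])
            # eigenvectors have an arbitrary sign; keep the orientation consistent
            if prev is not None and np.dot(d, prev) < 0:
                d = -d
            prev = d
            pos = pos + step * d
            path.append(pos.copy())
        return np.array(path)

    # toy field: every voxel diffuses mostly along the x axis
    field = np.zeros((8, 8, 8, 3, 3))
    field[...] = np.diag([3.0, 1.0, 1.0])
    path = trace_streamline(field, start=(1.0, 4.0, 4.0))
    ```

    Real tractography has to worry about noise, crossing fibers, and stopping criteria (anisotropy thresholds, curvature limits), but the streamline-following core is this simple.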

  • Hi, Jimhotep,

    Great nick! You should check out Douglas Hofstadter's Goedel, Escher, Bach: An Eternal Golden Braid. It deals with this question extensively, relating it to the Goedel theorem and bringing in lots of other interesting subject matter relating to recursion. It's a fun read, because the non-fictional chapters are interspersed with weird Lewis Carroll-esque allegories starring a tortoise . . .

    Five stars. Quinn Bob says, check it out.


  • Searle's charge that Dennett ignores the difficult parts depends on Searle ignoring the bits where Dennett tackles the difficult parts. It's just that he tackles them in a tractable form: instead of addressing the wishy-washy and question-begging "why do people experience consciousness?" question, he addresses the concrete question "why do people *report* consciousness?". This key move makes it possible to get started on the problem.

    I'll also note here that Dennett has a footnote hanging off the sentence "We're all zombies" stating roughly "Of course, it would be an act of utter intellectual dishonesty to quote this out of context."
    --
  • Be warned, Penrose believes that strong AI is not valid. In other words, he doesn't think that machines can become sentient. The Emperor's New Mind is mostly devoted to disputing and deconstructing strong AI. If you're not sympathetic to this viewpoint, you probably won't enjoy the book at all. It won't change your mind, either, if you've thought at all about the subject before. The debate is essentially a religious one, and the arguments on both sides are few and fixed.


  • It happens in neural networks, too. You can only get so much recursion out of a connectionist system; things become noisier and noisier and then break down.

    I, for one, am damn glad my brain is not a Scheme interpreter. ;)

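
    The degradation described above can be seen in a toy simulation (my own construction, not any real neural model): a recurrent net whose weights should simply copy its state forward, but where each "level of recursion" adds a little noise, so the original pattern is progressively lost.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Identity weights: in a noiseless world each step would copy the state.
    W = np.eye(16)
    noise_level = 0.05

    def step(state):
        # one more level of "recursion": squash through tanh, add a bit of noise
        return np.tanh(W @ state) + noise_level * rng.standard_normal(state.shape)

    original = np.full(16, 0.5)
    state = original.copy()
    errors = []
    for depth in range(30):
        state = step(state)
        errors.append(np.linalg.norm(state - original))
    ```

    The exact numbers depend on the noise level, but the trend is the point: each pass compounds the noise, which is why deeply recursive structures are hard to sustain in a connectionist system.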

  • You know, he actually became fascinated with the idea that his music was being transmitted into his brain by an alien civilization. I'm not kidding. I read an article about Hendrix's science-fiction proclivities in this trendy British men's magazine a couple of years ago. I wish I could remember the name of the mag, but I can't. :(


  • Personally, I'm just waiting for someone to hack together a jack that will let me use all my Atari 2600, 7800, Coleco, Sega Master System and TRS-80 CoCo cartridges without having to have a game system OR a TV.

    Dungeons of Daggorath would be COOL if I just had to THINK the commands.
  • The coolest things happen in my neighborhood (walking distance from my house to One Brookings Drive over there...)

    My dad died of Alzheimer's -- or, I guess the symptoms thereof, since the disease itself doesn't really kill. (You know, when you stop eating 'cause you don't wanna...well...) I suppose something like this could help track the degrading of the "wiring" as it were for people with certain diseases -- they mention specifically schizophrenia, but I'm thinking of progressive diseases like Alzheimer's. Maybe we could have "re-wired" dad! (Don't call me sick and morbid, humor is the best way to deal with it.)

    Go Wash-U.
  • Heretic old fashioned suggestion: You could always go to your local University library and take a peek in the actual journal ... ;-)

    What?? you're suggesting I get out of the house to keep abreast of scientific discoveries? :)

    "There is no surer way to ruin a good discussion than to contaminate it with the facts."


  • Ever think your brain knows when
    it is thinking about itself?

    I wonder about crap like this all the time.
  • I sent email to the address in the article and here was the response:

    A high-resolution, color image of a fiber tract positioned on a 3-D model of the human head is available from the Office of Medical Public Affairs upon request. Their phone number is (314) 286-0100.

    Also, the paper is available on-line at . It is under the section "Current Issue".

    PNAS News Office

    Maybe the Office of Medical Public Affairs wants their phones /.ed =)

  • Interesting.. But why would thinking about itself have a different effect on the brain than thinking about anything else? For that matter, is thinking about thinking different than thinking about anything else? Is it 'meta-thinking'?

    I think that I think, therefore I think that I am.. Uhhh... Maybe.
  • That's because you're drunk, QP! Remember me? yoshi!!!! hee hee ;)

  • I know just the book for you, Jimhotep: The Origin of Consciousness in the Breakdown of the Bicameral Mind by Julian Jaynes. Not exactly a summer-beach read in spots, but engrossing and thought-provoking.

  • damn "matrixian" loops...

    you need to stop watching your pirated mpg of the Matrix, I think it's affecting your neural i/o
  • > Could they wire me to play like Hendrix?

    I don't know about playing like Hendrix, but soon they might be able to "wire" [cnn.com] you to perform better in mazes.

    KV

  • check out Mankind in Amnesia by Immanuel Velikovsky. He posited that cataclysm not only produced changes that spurred on evolution, but that these events also helped evolve our consciousness. This includes some other strange effects, like how we can coexist on the planet with super-destructive weapons that could wipe out our civilization with little or no cognitive dissonance.
  • Things aren't quite as bad as you imply with fMRI, Nick; there's been a lot of work on this in the last ten years!

    Optical imaging [sciencemag.org] experiments show how you can use direct observation of the brain to interrelate the changes seen with fMRI and neural activity. There are many more similar studies, all of which suggest a close correlation between neural activity and fMRI measurements. More recently, people have begun comparing similar paradigms [jneurosci.org] directly in macaque and human using fMRI and electrophysiology.

    Geraint
    geraint@klab.caltech.edu
  • Prolly have to prep the brain first, like this [wired.com], with genetic therapies.
    Are the brain people talking to the gene people yet?
  • Actually, I do wonder about the value of this applied to degenerative and neurological diseases like Multiple Sclerosis. If you find a circuit, fuse, or random wire to be on the fritz, could you just pop in and rewire it?

    Could there be a use, beyond tracking the problem, for artificial means of fixing it?
  • Anyone ever read that?

    It's a Walter Jon Williams cyberpunk book, akin to Count Zero, where people have their neural pathways 'treated' to improve their response/reflexes.. We might be seeing the first steps in that direction here.

    Not too long from now, you won't have to practice that perfect golf swing for months. You'll just go to a walk-in clinic and have the responsible nerves anodized during lunch time, and be out on the green, kicking butt and taking names by early afternoon.

    Talk about golden memories, too. Just have the neurons where the experience you want to remember are stored - gilded. It's like having gold-plated A/V contacts. You'd never forget anything again.

    Too bad that doesn't work just from drinking Goldschlagger.
  • Argh! Nooooo!

    I've read it. Quite bizarre, even if interesting.
  • I don't believe this. That's what I'm just finishing up here at Children's in Boston! It's called diffusion tensor MR. I should be done in a few days. Blast them!
  • > Can this "human brain" thing run linux?
    > Can it be networked into a Beowulf?

    I dunno about it running linux, but I believe the "networking a bunch of human brains into a Beowulf" has already been done. I think they called it "open source" when the resulting cluster is used to develop software :)
