Biotech Science

Nerve Cells Successfully Grown on Silicon 284

crabpeople writes "Researchers at the University of Calgary have found that nerve cells grown on a microchip can learn and memorize information which can be communicated to the brain. 'We discovered that when we used the chip to stimulate the neurons, their synaptic strength was enhanced,' said Naweed Syed, a neurobiologist at the University of Calgary's faculty of medicine."
This discussion has been archived. No new comments can be posted.

  • by neurosis101 ( 692250 ) on Friday February 20, 2004 @05:31AM (#8337763)
    This is the future of computing right here.

    Not making faster Pentiums or Athlons. Sorry. Most of that magic has already been woven. Who out there is qualified to make systems-level designs and decisions about bio-computer systems? Think about the breadth of knowledge it must take: physics, electrical and computer engineering, as well as biology.

    What type of magnetic and power restrictions will there be? Reliability? What type of optimizations will exist? Interfaces? Flexibility?

    We're still quite far away from having things like this be applicable to modern-day computing, but think about when you too can say, "I know Kung Fu"!

  • by penguinland ( 632330 ) on Friday February 20, 2004 @05:35AM (#8337785)
    Researchers at the University of Calgary have found that nerve cells grown on a microchip can learn and memorize information which can be communicated to the brain.

    While the article mentions this in the introduction, it doesn't describe it happening at all in the actual research. It talks about neurons communicating with each other. This is a long way from connecting this chip into a living brain in an animal that can still function.

    While I agree that this is a fascinating article, we should make sure not to sensationalize it too much. Making chips that interface with actual brains in actual animals, even if they are snails, is still a long way off.
  • by imnuts2 ( 754767 ) on Friday February 20, 2004 @05:51AM (#8337839)
    IQ instead of GHz?
  • Re:I'm no Bill Joy (Score:5, Insightful)

    by kinnell ( 607819 ) on Friday February 20, 2004 @06:03AM (#8337884)
    The idea that we could grow neurons on silicon is one of those big steps that looks to lead us into the Johnny Mnemonic world

    No it's not. This involves interfacing with the neurons that are already there.

    As these true neural webs become more complicated, it would be interesting to see if any kind of emergent behavior was evident

    Given that large collections of neurons are well known to exhibit emergent behaviour, I think it would be more interesting if they didn't.

    this could be the first step to replicating a nervous system without having to rely on fetuses for stem cells. It requires no human cloning and holds immense promise

    Nerve cells harvested from an animal brain can be grown in the lab. There is no need for embryonic stem cells or cloning at all. Growing them on silicon does not make this easier - in fact, they will probably grow better in a petri dish.

    It would definitely be cool to have a couple of these chips implanted to enhance the base memory that we are kitted with at birth

    Memory in the brain is not simple storage of information. It is unlikely that plugging a DRAM into your brain would be able to enhance your memory.

  • by q.kontinuum ( 676242 ) on Friday February 20, 2004 @06:10AM (#8337908)
    Evolution != bruteforcing. With bruteforcing (e.g. trying to guess a password with a dictionary) there is no "being on the right path" or anything like it. It's just wrong or right. Evolution is survival of the fittest: make minor changes in different directions to an existing system and see which ones lead closer to success (just like sex ;-)). Take many of the fittest and do the same again. From time to time, take some of the not-so-fit and try the same with them as well.

    On the other hand you are right: this trial and error seems to lead to better results in the long run than deterministic design. But this scheme has already been adopted by science (see the little sketch below). IIRC there was a distributed computing project simulating a robot with a defined task and changing the parameters of the robot, with the different clients exchanging information about the results. I no longer remember the name or the homepage of the project; I think it was already 4 or 5 years ago...
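
    A minimal sketch of the mutate-and-select loop described above (my own toy illustration, not the actual project; the bit-matching "task" and fitness function are made up):

        import random

        TARGET = [1, 0, 1, 1, 0, 0, 1, 0]   # toy "task": evolve a bit pattern matching this

        def fitness(candidate):
            # How close is this candidate to success? Higher is fitter.
            return sum(c == t for c, t in zip(candidate, TARGET))

        def mutate(candidate, rate=0.1):
            # Minor changes in random directions on an existing system.
            return [bit ^ 1 if random.random() < rate else bit for bit in candidate]

        population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
        for generation in range(100):
            population.sort(key=fitness, reverse=True)
            # Take many of the fittest, plus a couple of the not-so-fit.
            survivors = population[:8] + random.sample(population[8:], 2)
            population = [mutate(random.choice(survivors)) for _ in range(20)]

        print(max(population, key=fitness), "after", generation + 1, "generations")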

  • Nerve cells (Score:1, Insightful)

    by Avada Kedavra ( 712991 ) on Friday February 20, 2004 @06:13AM (#8337924)
    Maybe something can be done about spinal cord
    injuries in the future with this technology?
  • Half-bit bandwidth (Score:5, Insightful)

    by korpiq ( 8532 ) <-.@korpiq...iki...fi> on Friday February 20, 2004 @06:20AM (#8337938) Homepage
    Quoth the article:

    scientists stimulated one nerve cell to communicate with a second cell which transmitted that signal to multiple cells within the network.

    Signal up (probably down too, though that is not said). That's a start. Now let me jump.

    Imagine how this would feel in your own brain. Even strengthened to a noticeable level by a lump of neurons, the signal would still read "beep". Now imagine being fed information through that channel. "Beep, bip beep bip bip beep". Better start practicing that Morse.

    Now let's enhance the input by adding more bits into it and running data through a digital-to-analog converter. This is where you would slowly be able to "see colors", one at a time. Low signal, cold feeling; high signal, hot feeling. That is brainable information. You can associate different patterns of these "colors" to different ideas.
    But still it's not like you could see any shapes, is it?

    Now add more bytes, feed them in side-by-side. That's a feed. At this point, feel nausea. Something is feeding noise into your thoughts, something you cannot possibly comprehend.

    It would take a processing system not unlike vision inside the brain to translate that feed into experiences like colors, tastes, and touches, then further associate these to make shapes out of the noise.

    A long way.

    Worth taking, of course, as research goes, but I wouldn't toss away those external displays just yet. I have a hunch computers won't be the same, either, when we get there.

    Future research will focus on interfacing silicon chips with the human brain to control artificial limbs and develop "thinking" computers.

    Mostly fun!
  • by nhaze ( 684461 ) on Friday February 20, 2004 @06:25AM (#8337945)
    Potter has done a lot of work on the project since then, and electrodes were definitely incorporated. He has linked the cultured network up to a variety of output devices, including a stylus device to 'draw' with, a robot to maneuver, and a DOOM-like virtual environment. http://www.gatech.edu/news-room/release.php?id=160 http://www.wireheading.com/roborats/hybrots.html
  • by El Torico ( 732160 ) on Friday February 20, 2004 @06:32AM (#8337970)
    How would that be any different than what we've had for the last 20 years?
  • by erwin ( 8773 ) on Friday February 20, 2004 @06:37AM (#8337980)
    Alan Cooper, author of "The Inmates are Running the Asylum" and other texts, put it this way:

    Q: What do you get when you cross a camera and a computer?
    A: A computer.

    His point is that, from an interface and place-in-the-world point of view, most products that have been digitally enhanced tend to remain closer to their technology roots than their analog counterparts (with all of the usability, and I would say ethical, challenges inherent in a technologist-driven system).

    That said, this is pretty frickin' cool, but the double-edged sword presented by this innovation seems both particularly sharp and far-reaching. I really hope we get this one right.

    "Why can't you use your powers for Good?"
  • by dave420-2 ( 748377 ) on Friday February 20, 2004 @06:53AM (#8338016)
    That's the beauty of the brain, though. It can make sense of the strangest of inputs. The very nature of neurons and connections in the brain means that if you were to introduce an "input" into the brain using a technique like this, given time, there's a very good chance that the brain will eventually make sense of it. After all, it's a very good learning computer, and this is really no different to the information sent via the optic nerve.

    Imagine trying to describe vision to someone who's been blind from birth. It's nigh-on impossible to explain, as it's unlike anything else they can experience. This is what we're seeing here - a new sense we just can't comprehend, yet could offer us such incredible benefits we can't hope to fully understand at such an early stage as this.

  • by scambaiter ( 703904 ) on Friday February 20, 2004 @07:07AM (#8338046)
    Actually, I think it's a big mistake to think of the brain in terms of CPU computing power. The brain does not simply use brute-force computing power to solve problems and handle special tasks and situations. We have a lot of built-in or learned features to do so efficiently. For example, we use a lot of shortcuts to solve a task (not always correctly). Just think of optical illusions: the brain uses some cues to judge a situation instead of doing a correct calculation. Or think of reflexes: a lot happens before the brain is even involved, e.g. when you put your hand on something hot (yep, that Athlon that has been running for 2 weeks straight ;)) the signal to pull the hand away is sent straight from the spine before it reaches the brain.

    Unless we use equivalent mechanisms for CPU-based computing, comparing the speed of the brain to silicon-based units IMHO doesn't make much sense.

  • by Vellmont ( 569020 ) on Friday February 20, 2004 @07:27AM (#8338092) Homepage

    Q: What do you get when you cross a camera and a computer?
    A: A computer.


    Perhaps I'm missing the point (I've never read the aforementioned book), but when I cross a camera and a computer, I usually get a camera. Digital cameras are exactly this, no? The question seems a silly one. When we started making bridges out of steel, did they somehow become something other than bridges?

    A camera is a thing that can capture pictures and later reproduce them. You can use film, or silicon to do that, but it's a camera because of what it does, not how it does it.
  • Software version (Score:5, Insightful)

    by Gendhil ( 686251 ) on Friday February 20, 2004 @07:47AM (#8338172)
    Or, for a more software interpretation, it's a function that takes a bunch of boolean parameters and returns a boolean (something like the sketch below). Anyone who's ever done any programming or computer architecture should see why you can easily process anything with this.
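
    A minimal sketch of such a boolean function, assuming the usual threshold-unit view of a neuron (my own illustration; the weights and names are made up, not from the article):

        # A neuron viewed as a boolean function: weighted boolean inputs compared
        # against a firing threshold.
        def neuron(inputs, weights, threshold):
            # Fire (return True) if the weighted sum of active inputs reaches the threshold.
            total = sum(w for active, w in zip(inputs, weights) if active)
            return total >= threshold

        # AND and OR fall out of the same unit with different thresholds.
        AND = lambda a, b: neuron([a, b], [1, 1], threshold=2)
        OR = lambda a, b: neuron([a, b], [1, 1], threshold=1)

        print(AND(True, False), OR(True, False))   # False True

    Chain enough of these units together and you can build any boolean function, which is the point.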
  • by Welsh Dwarf ( 743630 ) <d.mills-slashdot@NOSpAM.guesny.net> on Friday February 20, 2004 @07:59AM (#8338207) Homepage

    Neural computing will remain the domain of highly specialized research into AI and neural computing forever. We may develop neural analogs using nanotech or some other gee-whiz tech, but they will not be true neurons.

    I disagree. I think neural computing will have practical applications, but more along the lines of neural interfaces than actual computers. Imagine a prosthetic arm that works just like the old one did...

  • big deal (Score:4, Insightful)

    by lukesl ( 555535 ) on Friday February 20, 2004 @09:58AM (#8338794)
    IAAN, and this is not a big breakthrough in any sense. Basically, this is something that was first done using manually-positioned electrodes probably twenty years ago, and now they can grow neurons on a dish that has electrodes built into it and do it that way. WoO-hAH!

    The computational power of neurons comes from the way they work in groups, not the way they work alone. Therefore, it's strongly dependent upon the detailed organization of their connectivity. Grinding up a piece of brain and regrowing it on a dish will obviously not retain native connectivity. Additionally, the time it would take to manually rewire an interesting circuit by giving little localized electrical pulses (or do anything else interesting) is longer than neurons are viable in culture, and that's not a problem that's been solved yet.

    I'm not saying this technology won't have important uses as a research tool, just that it won't be useful for what people here seem to think it will be useful for (high-density pornography storage). BTW, one of the more interesting characters in this field is Steve Potter [gatech.edu], a somewhat strange guy who does some technically impressive work [uwa.edu.au].
  • Re:Kinda cool (Score:3, Insightful)

    by KReilly ( 660988 ) on Friday February 20, 2004 @12:15PM (#8340011)
    What you guys are failing to take into consideration is the difference in heat given off between a resistor and a neuron. Even if neurons are slower and larger, the fact that they can be packed together without the need for cooling makes them much more powerful/useful.

    Well, I find mine useful anyway; I am sure some people have mixed results.

  • by 330Pilot ( 688005 ) on Friday February 20, 2004 @12:34PM (#8340199)
    "Future research will focus on interfacing silicon chips with the human brain to control artificial limbs and develop "thinking" computers." Thats one heck of a leap forward from connecting x number of snail nerves together.
  • by Kaki Nix Sain ( 124686 ) on Friday February 20, 2004 @03:12PM (#8341814)
    XOR has the truth table:
    a  b  a XOR b
    1  1  0
    1  0  1
    0  1  1
    0  0  0

    What you have described is:
    e  i  e ? i
    1  1  0
    1  0  1
    0  1  0
    0  0  0

    Where ? is either &~, i.e. "e and not i", or "not if e then i". A "partially working" logical function is really just a fully working different logical function.
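
    A quick check in code (my own illustration, not from the parent post) that the "partially working XOR" is just e AND NOT i:

        # Print both truth tables side by side: real XOR vs. "e and not i".
        def xor(a, b):
            return (a or b) and not (a and b)

        def e_and_not_i(e, i):
            return e and not i

        print("e i  e XOR i  e &~ i")
        for e in (1, 0):
            for i in (1, 0):
                print(f"{e} {i}     {int(xor(e, i))}        {int(e_and_not_i(e, i))}")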
  • Re:Kinda cool (Score:3, Insightful)

    by Anonymous Coward on Friday February 20, 2004 @03:51PM (#8342396)
    I don't think people would be equal, because the good chips would probably still be owned by bad (read: greedy) people. I mean, in theory it sounds like a nice utopian Marxist wet-dream, but there is too much inertia keeping the system the way it stands. And personally, any sort of wild-und-crazy hive-mind is not something I'd ever want to participate in. The distractions would be omnipresent (you think video games are addictive now?) and any sort of rational, thoughtful, political or philosophical discourse would be run over by the lovechild of classical liberalism and western capitalism / consumption.

    und besides: The ads. Think of the ads!!!!
