Nerve Cells Successfully Grown on Silicon

crabpeople writes "Researchers at the University of Calgary have found that nerve cells grown on a microchip can learn and memorize information which can be communicated to the brain. 'We discovered that when we used the chip to stimulate the neurons, their synaptic strength was enhanced,' said Naweed Syed, a neurobiologist at the University of Calgary's faculty of medicine."
This discussion has been archived. No new comments can be posted.

  • Kinda cool (Score:5, Interesting)

    by hyc ( 241590 ) on Friday February 20, 2004 @05:26AM (#8337741) Homepage Journal
    But what's the size of a neuron vs the size of a transistor in a 65nm process CPU?
  • Just like sci-fi. (Score:3, Interesting)

    by murat ( 262137 ) on Friday February 20, 2004 @05:27AM (#8337746)
    "We discovered that when we used the chip to stimulate the neurons, their synaptic strength was enhanced, ... " There was something like this in one of Asimov's books. The guys synapses are enhanced by a machine, then the guy starts to "feel" and "manipulate" things.
  • I'm no Bill Joy (Score:5, Interesting)

    by ObviousGuy ( 578567 ) <> on Friday February 20, 2004 @05:28AM (#8337750) Homepage Journal
    But this is very exciting. The idea that we could grow neurons on silicon is one of those big steps that looks to lead us into the Johnny Mnemonic world that Gibson was talking about just a couple stories prior to this one.

    There is a song that says, "It only takes a spark to get a fire going". So too is it true that it only takes a couple neurons to start synapsing. As these true neural webs become more complicated, it would be interesting to see if any kind of emergent behavior was evident.

    Also, with the current political and scientific climate as it is, this could be the first step to replicating a nervous system without having to rely on fetuses for stem cells. It requires no human cloning and holds immense promise.

    It would definitely be cool to have a couple of these chips implanted to enhance the base memory that we are kitted with at birth, that's for sure!
  • by Gopal.V ( 532678 ) on Friday February 20, 2004 @05:29AM (#8337761) Homepage Journal
    Will this make computers more human, or the other way around?

    Maybe it's time to admit that nature does a better job bruteforcing (OK, what else do you call SEX and EVOLUTION?) the secrets of this world than all our mathematical precision... (E=MC2... Forty Two... naah... doesn't work)... Of course, nature did a better job making us humans than we would have achieved... :)
  • Other uses? (Score:5, Interesting)

    by tanveer1979 ( 530624 ) on Friday February 20, 2004 @05:30AM (#8337762) Homepage Journal
    We discovered that when we used the chip to stimulate the neurons, their synaptic strength was enhanced

    If only they could find out how the strength increased, and whether we can do the same in the human body, we could find a cure for most nervous system degradation diseases. Anybody have a link to a more verbose article?

  • by Anonymous Coward on Friday February 20, 2004 @05:44AM (#8337819)
    It's OK, it's just the end of the beginning.
  • Re:Kinda cool (Score:5, Interesting)

    by Sivar ( 316343 ) <`moc.liamg]' `ta' `[snrubnselrahc'> on Friday February 20, 2004 @05:47AM (#8337827)
    Perhaps a key use is not to use neurons to improve silicon chips, but to do the opposite.

    Who knows, in a few decades we might have people deleting their childhood to store and smuggle hundreds of GB of information about the cure for a major epidemic that an evil pharmaceutical company is exploiting for profit.
  • by Elanor ( 130622 ) on Friday February 20, 2004 @06:02AM (#8337876)
    a.k.a Neal Stephenson and his uncle.

    Chip embedded in politician's brain after a stroke - he goes on to be president.... v. spooky.

    I would love to see alzheimer's patients helped with this. If it's a genetic disease, I'm up the creek and dropped me paddle a while back.

    - Lnr
  • by kinnell ( 607819 ) on Friday February 20, 2004 @06:13AM (#8337920)
    Making chips that interface with actual brains in actual animals, even if they are snails, is still a long way off.

    No it's not []

  • by ktanmay ( 710168 ) on Friday February 20, 2004 @06:23AM (#8337943)
    You know, I read somewhere that our brains (individual processes) run at around 200MHz (as it is all done electro-chemically). Now, if we have hundreds of billions of neurons, we also have billions of transistors on chips.
    The difference here is that our brains use the 3rd dimension effectively (and also work in parallel, I think). Now I'm not sure if the latest breakthrough uses electro-chemical processes to communicate, but if it's faster than 200MHz, it definitely has huge potential.
  • Re:Other uses? (Score:1, Interesting)

    by Cred ( 754775 ) on Friday February 20, 2004 @06:30AM (#8337964) Journal
    Now what about McDonald's using nanobots to inject "information" into our brains while we enjoy that big fat El Maco? Great marketing, isn't it? McDonald's could teach us to hate Burger King and vice versa. I wonder what would happen in that situation: McDonald's saying "hate Burger King" and BK doing the same for McD.
  • Reminds me of... (Score:4, Interesting)

    by jonney02 ( 591116 ) on Friday February 20, 2004 @06:45AM (#8337998)
    A quote I read somewhere:
    "The danger from computers is not that they will eventually get as smart as men, but we will meanwhile agree to meet them halfway." -Bernard Avishai
  • by dave420-2 ( 748377 ) on Friday February 20, 2004 @06:45AM (#8337999)
    Imagine? Watched the news recently? :-P
  • by hkmwbz ( 531650 ) on Friday February 20, 2004 @06:51AM (#8338010) Journal
    With research like this going on, will we eventually see a medical solution to tinnitus?

    Tinnitus is a serious problem for a lot of people today, and it can have many causes, from various diseases and illnesses to noise damage. It apparently has to do with the nerves in one's ear, so with this kind of research, might we finally see a way to actually treat tinnitus?

    Until you get T, you don't realize how lucky people who can actually be in a quiet room without going mad are...

  • implants (Score:2, Interesting)

    by VanillaCoke420 ( 662576 ) <vanillacoke420@h ... Eom minus distro> on Friday February 20, 2004 @06:56AM (#8338026)
    Perhaps it will be possible to make brain implants that enable you to connect to the internet, and let others connect their brains to the net as well. Imagine sharing brainpower, or even sharing thoughts, ideas and memories over a filesharing network.
  • by bthomson0 ( 747986 ) on Friday February 20, 2004 @07:42AM (#8338146)
    Yes, neural networks by their very nature are really nothing more than adaptive filters, aka classifiers. The eye does a great deal of preprocessing such as edge detection, motion detection, etc. (aka classifying) to reduce the workload on the brain. Neural networks could perform similar preprocessing to reduce the workload for CPU-based image recognition systems.

    Actually the idea of "reflexes" is the same as electro-robots which can sense objects by electrical load. That hot plate is nothing more than an over-threshold input that causes an electro-motor response. An electrical circuit could also easily be designed to include a little bit of fuzzy logic via a simple analog circuit to achieve the same thing.

    So, equivalent mechanisms are not readily available for CPU-based computing, but they are for ANN-based computing. If we ever hope to match the basic capabilities of animals we cannot just rely on CPU-based computing; we also need ANN-based computing for sensor preprocessing and feedback-controlled motor function.
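The retina-style preprocessing described above (edge detection before the signal ever reaches the brain) can be sketched as a toy one-dimensional filter; this is an illustration of the idea only, not anything from the article:

```python
def edge_detect(row):
    """1-D edge detector: difference adjacent pixels (a [-1, 1] filter)
    so the output is large only where brightness changes sharply --
    the kind of preprocessing the retina is described as doing."""
    return [row[i + 1] - row[i] for i in range(len(row) - 1)]

# A step in brightness shows up as a single spike in the output:
print(edge_detect([0, 0, 0, 9, 9, 9]))  # -> [0, 0, 9, 0, 0]
```

Downstream processing then only has to look at the spike, not the whole image row.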
  • by hyc ( 241590 ) on Friday February 20, 2004 @07:42AM (#8338149) Homepage Journal
    Thanks for the reply, very enlightening.

    But it clearly would be folly to try to emulate a neuron using purely digital computing techniques. You're dealing with an analog mechanism that is pretty much a wire-or of many inputs feeding into a capacitor. This is very much an analog computing circuit; now the question is how efficiently you can do A/D-D/A conversion on this scale.

    (And as I recall, the sciatic nerve running down your leg is a single cell with an axon over 1 foot long. Definitely some impressive stuff Mother Nature has concocted...)
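The capacitor-like analog mechanism described above is essentially what neural modelers call a leaky integrate-and-fire neuron; a minimal sketch, with illustrative (not physiological) constants:

```python
# Minimal leaky integrate-and-fire neuron: many inputs sum onto a
# "capacitor" (membrane potential) that leaks toward rest and fires
# once it crosses a threshold. All constants are illustrative.

def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Return the time steps at which the neuron fires."""
    v = 0.0          # membrane potential (the "capacitor" charge)
    spikes = []
    for t, current in enumerate(inputs):
        v = leak * v + current   # leaky integration of summed inputs
        if v >= threshold:
            spikes.append(t)     # action potential
            v = 0.0              # reset after firing
    return spikes

# Three weak inputs in a row accumulate and cross threshold:
print(simulate_lif([0.4, 0.4, 0.4, 0.0, 0.2]))  # -> [2]
```

Simulating this digitally is cheap; the A/D-D/A question raised above is about doing the conversion at the density of real tissue.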
  • by AllUsernamesAreGone ( 688381 ) on Friday February 20, 2004 @07:49AM (#8338179)
    Well, it could be the beginning of the Singularity..
  • by Gendhil ( 686251 ) on Friday February 20, 2004 @07:51AM (#8338189)
    Since you are using a ridiculously small part of all the storage space you have available at any time in your life, the cool thing would be to have an electronic device that could strengthen a given synaptic path, allowing you to "refresh" your memory at will and not forget important things (like reading the whole C++ w/ libraries reference once and then refreshing it every night).
  • Predicted 1945(!) (Score:3, Interesting)

    by infolib ( 618234 ) on Friday February 20, 2004 @08:19AM (#8338287)
    At the end of WWII research director Vannevar Bush predicted the IT revolution. [] He was eerily right in many ways, but some things are still to come. For some time I had the following quote hanging on my wall:

    In the outside world, all forms of intelligence whether of sound or sight, have been reduced to the form of varying currents in an electric circuit in order that they may be transmitted. Inside the human frame exactly the same sort of process occurs. Must we always transform to mechanical movements in order to proceed from one electrical phenomenon to another?
  • by techiemac ( 118313 ) <techiemac AT yahoo DOT com> on Friday February 20, 2004 @08:39AM (#8338364)
    It's kinda funny: a few years ago (back in the 80s) my dad actually did this. Believe it or not, he was the first one to grow a neuron on silicon (a Motorola chip, for those interested). The poster with the electron micrograph of it was absolutely everywhere (we had 1000s of the posters in the basement). I even remember going to high school science and, sure enough, there was my dad's poster.
    The hype surrounding this was insane, mostly due to the fact that everyone thought this was the true start to cybernetics. In the end, the hype died down, my dad's lab got a ton of grants and he got back to doing more research. Ironically enough, the most publicized research that he did (the neuron on a chip) probably had the least impact.
    Such is the world of science at times :)
    So, yes, it's nothing new. Just repackaged.
  • by q.kontinuum ( 676242 ) on Friday February 20, 2004 @09:42AM (#8338690)
    Interesting. I'm no professional about biology, so I might be wrong.

    But from a logic point of view: if a generation with several individuals, each of them with minor changes compared to its ancestors, is born, for some individuals their changes will be an advantage, and for some the changes are a disadvantage.

    The weaker individuals will not spontaneously die, but they might have fewer children, or maybe only a few of their children will survive. The stronger individuals will have more children, or if they have the same number of children they are in a better position to feed their children and let them survive.

    Thus, the next generation will still have some (few) individuals with the inherited weakness plus some new minor changes, and some (more) with the inherited strength plus some minor changes.

    Now it is still possible that one of the weaker individuals is affected so positively by its changes that it is now the strongest of its generation.

    Where is the flaw in my logic? Or does evolution theory really exclude this scenario?

    In a book (of course I don't have the title again... grrrr...) describing the technical version of evolution, I saw that the weaker results are still considered for further development; you just have to put more weight on the stronger results.
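For what it's worth, the scenario described here (weaker individuals reproducing less often but never being excluded outright) is exactly how fitness-proportional, or roulette-wheel, selection works in genetic algorithms. A rough sketch with made-up fitness values:

```python
import random

def select_parent(population, fitnesses):
    """Fitness-proportional (roulette-wheel) selection: weaker
    individuals are less likely to reproduce, but never excluded."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]

# A weak individual (fitness 1) is still chosen occasionally:
random.seed(0)
pop, fits = ["strong", "average", "weak"], [10, 5, 1]
picks = [select_parent(pop, fits) for _ in range(1000)]
print(picks.count("weak"))  # small but nonzero
```

So no, standard evolutionary theory (and its algorithmic version) does not exclude the scenario: a "weak" lineage can still win later if a new change makes it strongest.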

  • by visualight ( 468005 ) on Friday February 20, 2004 @09:47AM (#8338718) Homepage
    OK, now that's a good explanation of why humans can so easily (mentally) manipulate objects in 3D space without doing any math.

    I've always figured that the best design for a computer would be one that's able to "imagine". Since it would take too many transistors to emulate a neuron, maybe there's some other way to do it? Is binary the only way to compute?
  • by Anonymous Coward on Friday February 20, 2004 @09:51AM (#8338744)
    I disagree. Nature had a lot of time to "bruteforce" things. Give us the same amount of time and we will see what we'll be able to do in terms of "reengineering the world".
    Modern science is a 400-500 year old thing. Nature had billions of years to reach the levels we see.
    I think that the progress we have achieved in the last 50 years is *really* impressive, and probably what we'll see in the next 50 years will be even more impressive. Sometimes humans deserve more credit, IMHO.
  • by sbma44 ( 694130 ) on Friday February 20, 2004 @10:01AM (#8338817)
    It's something called long-term potentiation, and neuroscientists have known about it for a long time. If you get a neuron to fire enough, its synapses will strengthen. It's been a while, but I believe the mediating mechanism involves calcium-triggered protein synthesis.

    FYI, LTP is one of the most promising mechanisms proposed for explaining how long-term memory works.
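The fire-enough-and-strengthen behavior of LTP is often caricatured as a Hebbian update rule; the learning rate and cap below are illustrative numbers, not biological measurements:

```python
def potentiate(weight, fired_pre, fired_post, rate=0.05, cap=1.0):
    """Toy long-term potentiation rule: a synapse strengthens when
    the pre- and post-synaptic neurons fire together (Hebbian
    learning). Rate and cap are illustrative, not physiological."""
    if fired_pre and fired_post:
        weight = min(cap, weight + rate)
    return weight

w = 0.2
for _ in range(10):          # repeated paired firing...
    w = potentiate(w, True, True)
print(round(w, 2))           # -> 0.7  ...strengthens the synapse
```

Artificial neural networks use essentially this trick with numeric weights; the chip in the article did it to real synapses.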

  • Input interface? (Score:3, Interesting)

    by delcielo ( 217760 ) on Friday February 20, 2004 @10:15AM (#8338931) Journal
    I don't see any practical value in being able to add memory, but it would be cool to have an interface that would let me learn things faster.

    Kind of like how people in "that movie" can learn how to fly a UH-1 in 3 seconds.

    Now THAT ability would be cool.

  • by MacJedi ( 173 ) on Friday February 20, 2004 @11:43AM (#8339684) Homepage
    Excellent point. You are right about the computational flexibility of neurons. They can represent a wide range of logical functions, although I believe that a single neuron is incapable of doing an XOR.

    Actually, I think it can be done (or at least a partially working XOR). Imagine a neuron with two inputs and an output, but these inputs are not both excitatory: one is excitatory and the other is inhibitory. So, input only from the excitatory branch produces an action potential, and input from both branches yields no output. Unfortunately, input from just the inhibitory branch would produce no output either.
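That excitatory/inhibitory arrangement can be written down as a single threshold unit; as the comment says, it matches XOR on three of the four input patterns and fails only when just the inhibitory input is active:

```python
def neuron(excitatory, inhibitory):
    """Single threshold unit: +1 weight on the excitatory branch,
    -1 on the inhibitory branch; fires when the sum exceeds 0."""
    return int(excitatory * 1.0 + inhibitory * -1.0 > 0)

# Compare the unit's output against true XOR on all four patterns:
for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, neuron(a, b), a ^ b)
```

Only the (0, 1) row disagrees with XOR, which is exactly the "partially working" case described above.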
  • Re:Kinda cool (Score:4, Interesting)

    by Talinom ( 243100 ) on Friday February 20, 2004 @11:58AM (#8339842) Homepage Journal
    Or, in an evil universe not too far from our own...

    People get divorced and lose their families and free time due to the high demands of the current marketplace.

    People needing to do more work each day take pills to reduce the need for sleep.

    Employers needing to cut training costs develop the "Plug N Work" chip. When you get hired you are assigned a read-only chip that has all of the company's policies, procedures, employee names, and specific work duties for each task.

    Employers add wireless to the PNW chip to rapidly update corporate policies as they are implemented.

    The tasks and skills for your job (doctor, lawyer, tech support, etc) are duplicated by a firm that sells the chips to your company. Your wage just became minimum because now ANYONE can walk off the street and perform the function.

    Wireless communication reaches the brain level and we go from being worker drones to Borg drones. This eliminates the internal need for teleconferencing, e-mail, telephones, or bulletin boards. Your pr0n and Slashdot time at work become obsolete in the new order as everyone would know what you were doing.

    Underground hackers develop technology to override The Companies' chip and deliver slashdot, goatse.cs, and pr0n unbidden to all receivers in the area.

    George Orwell's dream of the thought police and ultimate revisionism becomes a reality.

    But perhaps I'm just being paranoid.
  • by Anonymous Coward on Friday February 20, 2004 @12:34PM (#8340206)
    Sounds analogous to a wire, but that's so obvious that I must be missing something.
  • questions, questions (Score:3, Interesting)

    by whittrash ( 693570 ) on Friday February 20, 2004 @01:56PM (#8340966) Journal
    I am not so much interested in the Hollywood vision of this, although Ice-T deserved an Oscar for his performance. What I think is interesting is to think about the limits of our brains and how this could be used to expand consciousness.

    I think it would be interesting to understand how a neural interface would 'feel'. What would a process based in ones and zeros feel like? How would the brain adapt to take advantage of the new processing capability? Would we be able to project our consciousness outside of our body into some kind of digital plenum (which may not be a visual experience at all; it could be an entirely abstract experience, like a blind person contemplating numbers or language)? Would we be able to 'see' new phenomena if we integrated a chip into the visual cortex (we could hook our brain up to a radio telescope and see the entire electromagnetic spectrum)? What color would ultra-violet be if it became part of the visual spectrum, or is the brain incapable of seeing a color that is as yet unimagined? What would it be like to 'smell' or 'touch' light or gravity or computer processes? Would it be possible to add entirely new senses or reasoning structures to the mind? Could we augment our perception to make us cognizant of additional dimensional properties beyond the 3 dimensions we can see now? Would our bodies ultimately be relevant to our consciousness, or could this technology allow us to be unhinged from our physical being, and what would that mean for religion and philosophy? Could a person be in more than one place at once? Would it be possible to integrate two people into one, or transfer one person into another, and what would that do to 'individuality' and 'memory'?

    Just a few questions.
  • by jerald_hams ( 725369 ) on Friday February 20, 2004 @02:15PM (#8341181) Journal
    I think the parent (along with some other posts) is confusing the biological neuron with the perceptron, which is a simplified mathematical model. While the perceptron can't cope with linearly inseparable problems (like XOR), there is no consensus on the computational limits of the neuron. In fact, very little is known for certain about the learning algorithm used by the nervous system. The neuron may learn not only through the weights of its inputs, but also through chemical interactions with glial cells. Really, the neuron is still too much of a mystery for us to know its limitations.
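The linear-inseparability point applies to the perceptron model specifically, and it is easy to demonstrate: the classic perceptron learning rule converges on AND (which is linearly separable), while no single set of weights can satisfy all four XOR constraints. A small sketch:

```python
def train_perceptron(samples, epochs=100, lr=0.1):
    """Classic perceptron rule on binary inputs; returns the final
    (w1, w2, bias) and whether they classify every sample correctly."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = int(w1 * x1 + w2 * x2 + b > 0)
            err = target - out
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err
    ok = all(int(w1 * x1 + w2 * x2 + b > 0) == t
             for (x1, x2), t in samples)
    return (w1, w2, b), ok

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
print(train_perceptron(AND)[1])  # True  -- linearly separable
print(train_perceptron(XOR)[1])  # False -- linearly inseparable
```

Whether real neurons share this limitation is, as the comment says, an open question; the demonstration constrains only the model.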
