Biotech Science

Nerve Cells Successfully Grown on Silicon 284

crabpeople writes "Researchers at the University of Calgary have found that nerve cells grown on a microchip can learn and memorize information which can be communicated to the brain. 'We discovered that when we used the chip to stimulate the neurons, their synaptic strength was enhanced,' said Naweed Syed, a neurobiologist at the University of Calgary's faculty of medicine."
  • Plant human cells in an elevator-controlling unit and you'll have the dumbest movie ever.... [imdb.com]
  • Kinda cool (Score:5, Interesting)

    by hyc ( 241590 ) on Friday February 20, 2004 @05:26AM (#8337741) Homepage Journal
    But what's the size of a neuron vs the size of a transistor in a 65nm process CPU?
    • Re:Kinda cool (Score:5, Interesting)

      by Sivar ( 316343 ) <charlesnburns[@]gmail...com> on Friday February 20, 2004 @05:47AM (#8337827)
      Perhaps a key use is not to use neurons to improve silicon chips, but to do the opposite.

      Who knows, in a few decades we might have people deleting their childhood to store and smuggle hundreds of GB of information about the cure for a major epidemic that an evil pharmaceutical company is exploiting for profit.
      • Re: (Score:3, Funny)

        Comment removed based on user account deletion
      • Interesting? Have these people never *SEEN* the movie?

        I'm not even gonna give the title, that would just be blowing it out of the water.

        Kinda funny though, I saw Henry Rollins in concert last night, and spoke to him before the show about his role in that movie. "You *liked* that role?" .. "Um, yeah Henry, I did"

        Cool...

        Although, and I hate to say it, is he starting to look more like Richard Dean Anderson as time goes by? Pic from last night [shadowsrealm.com] and yes, I look like a goof, you don't need to remind

      • Re:Kinda cool (Score:4, Interesting)

        by Talinom ( 243100 ) on Friday February 20, 2004 @11:58AM (#8339842) Homepage Journal
        Or, in an evil universe not too far from our own...

        People get divorced and lose their families and free time due to the high demands of the current marketplace.

        People needing to do more work each day take pills to reduce the need for sleep.

        Employers needing to cut training costs develop the "Plug N Work" chip. When you get hired, you are assigned a read-only chip that has all of the company's policies, procedures, employee names, and specific work duties for each task.

        Employers add wireless to the PNW chip to rapidly update corporate policies as they are implemented.

        The tasks and skills for your job (doctor, lawyer, tech support, etc.) are duplicated by a firm that sells the chips to your company. Your wage just became minimum wage, because now ANYONE can walk off the street and perform the function.

        Wireless communication reaches the brain level and we go from being worker drones to Borg drones. This eliminates the internal need for teleconferencing, e-mail, telephones, or bulletin boards. Your pr0n and Slashdot time at work become obsolete in the new order as everyone would know what you were doing.

        Underground hackers develop technology to override The Companies' chip and deliver slashdot, goatse.cs, and pr0n unbidden to all receivers in the area.

        George Orwell's dream of the thought police and ultimate revisionism becomes a reality.


        But perhaps I'm just being paranoid.
        • Paranoid? That's a beautiful vision. We no longer have to waste time on training or put up with incompetence: everyone will be equally competent, everyone will be able to perform their task perfectly.

          Minimum wage? No, you're not thinking this through. This is a true commodification of labour. The entire economy will have to change to accommodate this idea... and it will be fantastic! This is something that Yevgeny Zamyatin would've loved to include in his utopian novel We.

          P.S. I'm serious: We is so
          • Re:Kinda cool (Score:3, Insightful)

            by Anonymous Coward
            I don't think people would be equal, because the good chips would probably still be owned by bad (read: greedy) people. I mean, in theory it sounds like a nice utopian Marxist wet-dream, but there is too much inertia keeping the system the way it stands. And personally, any sort of wild-und-crazy hive-mind is not something I'd ever want to participate in. The distractions would be omnipresent (you think video games are addictive now?) and any sort of rational, thoughtful, political or philosophical discour
      • Re:Kinda cool (Score:3, Insightful)

        by KReilly ( 660988 )
        What you guys are failing to take into consideration is the difference in heat given off between a transistor and a neuron. Even if neurons are slower and larger, the fact that they can be packed together without need for cooling makes them much more powerful/useful.

        Well, I find mine useful anyway; I am sure some people have mixed results.

    • by G4from128k ( 686170 ) on Friday February 20, 2004 @07:32AM (#8338105)
      Neurons are much larger than transistors, but the two aren't really comparable. The main body of a neuron is usually around 25 microns (25,000 nm) in diameter and runs at a "clock speed" of at most a few kilohertz.

      A neuron is much more than a transistor-like switch. On the one side of the neuron's central body is a set of dendrites that connect to and gather input from other neurons. The average neuron might have a thousand of these dendrites.

      The synapse at the end of each dendrite acts like part of a multiply-accumulate term -- taking the signal from another neuron, multiplying it by a numerical coefficient, and summing it into the total excitation level of the neuron's body. I suspect that the precision of this multiply-accumulate process is fairly low -- perhaps 8 to 16 bits.

      Next, the body of the neuron has a long axon extending from it that sends the output of the neuron to other neurons (connecting to the dendrites of other neurons). This axon can be quite long -- millimeters, even inches, in length. Thus, the axon is like an off-chip line driver with the potential for a very high fanout (of 1,000 or more). (On a modern microchip, these off-chip connections are driven by much larger transistors than the small 65 nm ones used in computation.)

      Third, a neuron is not a static multiply-accumulate system. The coefficients on each synapse change in response to long-term adaptive processes. This process is computationally complex and includes cross-correlation of inputs between synapses and processing of other chemical signals in the brain. Cross-correlation alone could require the equivalent of several kilobytes to several megabytes of RAM. (We won't even get into the adaptive processes that include physical growth and removal of dendrites, as this has no easy analog in hardware.)

      In summary, a neuron is more than a transistor-like switch. It's a free-running 1000-register multiply-accumulator with an off-chip line driver and a statistical processing engine that updates the coefficients on each of the multiply-accumulate terms. Thus, emulating a single neuron would require hundreds of thousands to millions of transistors.
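
      To make that concrete, here is a minimal rate-coded sketch in Python; the dendrite count, random weights, and sigmoid squashing are illustrative assumptions, not measured values:

      ```python
      import numpy as np

      rng = np.random.default_rng(0)

      N_DENDRITES = 1000                           # ~1000 inputs, per the estimate above
      weights = rng.normal(0.0, 0.1, N_DENDRITES)  # synaptic coefficients (assumed)

      def neuron_output(input_rates):
          """Multiply each input rate by its synaptic coefficient, accumulate
          into a total excitation, and squash into an output firing rate
          (the signal the axon would drive to its fanout)."""
          excitation = np.dot(weights, input_rates)  # the multiply-accumulate
          return 1.0 / (1.0 + np.exp(-excitation))   # normalized firing rate

      # One update: a vector of firing rates from 1000 upstream neurons.
      print(neuron_output(rng.uniform(0.0, 1.0, N_DENDRITES)))
      ```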
      • by hyc ( 241590 ) on Friday February 20, 2004 @07:42AM (#8338149) Homepage Journal
        Thanks for the reply, very enlightening.

        But it clearly would be folly to try to emulate a neuron using purely digital computing techniques. You're dealing with an analog mechanism that is pretty much a wire-or of many inputs feeding into a capacitor. This is very much an analog computing circuit; now the question is how efficiently you can do A/D-D/A conversion on this scale.

        (And as I recall, the sciatic nerve running down your leg is a single cell with an axon over 1 foot long. Definitely some impressive stuff Mother Nature has concocted...)
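
        A crude leaky-integrate-and-fire loop captures that "wire-or of many inputs feeding into a capacitor" picture; all constants here are arbitrary assumptions chosen just to make it fire:

        ```python
        import numpy as np

        DT, TAU, THRESHOLD = 0.001, 0.02, 1.0      # step (s), leak constant (s), firing level
        rng = np.random.default_rng(1)
        v, spikes = 0.0, 0

        for _ in range(1000):                      # one simulated second in 1 ms steps
            pulses = rng.poisson(0.5)              # input pulses arriving this step
            v += DT * (-v / TAU) + 0.15 * pulses   # the capacitor leaks, inputs add charge
            if v >= THRESHOLD:                     # the A/D step: fire and reset
                spikes += 1
                v = 0.0

        print(f"{spikes} output spikes in one simulated second")
        ```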
      • Software version (Score:5, Insightful)

        by Gendhil ( 686251 ) on Friday February 20, 2004 @07:47AM (#8338172)
        Or, for a more software interpretation, it's a function that takes a bunch of boolean parameters and returns a boolean. Anyone who's ever done any programming or computer architecture should see why you can easily process anything with this.
        • by G4from128k ( 686170 ) on Friday February 20, 2004 @08:14AM (#8338266)
          Or, for a more software interpretation, it's a function that takes a bunch of boolean parameters and returns a boolean. Anyone who's ever done any programming or computer architecture should see why you can easily process anything with this.

          Excellent point. You are right about the computational flexibility of neurons. They can represent a wide range of logical functions, although I believe that the single neuron is incapable of doing an XOR.

          But a neuron is more than a Boolean circuit. Although a neuron seems like a two-state device (it's either quiescent or it's firing), it is more of an N-state analog device in which the pulse rate encodes a numerical quantity (probably the equivalent of an 8 to 16 bit floating point number). That is why the dendrite field is like a giant numerical multiply-accumulate.
          • Excellent point. You are right about the computational flexibility of neurons. They can represent a wide range of logical functions, although I believe that the single neuron is incapable of doing an XOR.

            Actually, I think it can be done (or at least a partially working XOR.) Imagine a neuron with two inputs and an output. But these inputs are not both excitatory: one is excitatory and the other is inhibitory. So, input only from the excitatory branch produces an action potential, and input from both bra

          • Temporal Synchrony (Score:3, Informative)

            by percepto ( 652270 )
            But a neuron is more than a Boolean circuit. Although a neuron seems like a two-state device (it's either quiescent or it's firing), it is more of an N-state analog device in which the pulse rate encodes a numerical quantity (probably the equivalent of an 8 to 16 bit floating point number). That is why the dendrite field is like a giant numerical multiply-accumulate.

            You're right on -- the change in firing rate relative to the baseline firing rate is very important. Also, there is some reason to think (logica [ucla.edu]

          • by jerald_hams ( 725369 ) on Friday February 20, 2004 @02:15PM (#8341181) Journal
            I think the parent (along with some other posts) is confusing the biological neuron with the perceptron, which is a simplified mathematical model. While the perceptron can't cope with linearly inseparable problems (like XOR), there is no consensus on the computational limits of the neuron. In fact, very little is known for certain about the learning algorithm used by the nervous system. The neuron may learn not only through the weights of its inputs, but also through chemical interactions with glial cells. Really, the neuron is still too much of a mystery for us to know its limitations.
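
            For the perceptron half of that distinction, a brute-force search over a coarse weight/threshold grid (a sketch, not a proof over the reals) shows that a single linear-threshold unit can realize AND but not XOR:

            ```python
            import itertools

            def perceptron(w1, w2, t, x1, x2):
                """Single linear-threshold unit: fire iff the weighted sum crosses t."""
                return int(w1 * x1 + w2 * x2 >= t)

            def representable(target):
                """Can any weights/threshold from a coarse grid realize `target`?"""
                grid = [x / 2.0 for x in range(-4, 5)]  # -2.0 .. 2.0 in 0.5 steps
                for w1, w2, t in itertools.product(grid, repeat=3):
                    if all(perceptron(w1, w2, t, a, b) == target(a, b)
                           for a, b in itertools.product([0, 1], repeat=2)):
                        return True
                return False

            print("AND:", representable(lambda a, b: a & b))  # True
            print("XOR:", representable(lambda a, b: a ^ b))  # False
            ```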
      • by Welsh Dwarf ( 743630 ) <d,mills-slashdot&guesny,net> on Friday February 20, 2004 @07:49AM (#8338181) Homepage

        This axon can be quite long, millimeters, even inches, in length.

        Actually, it can be over a meter in length (spinal cord to calf is one axon). Try that with a transistor!

      • Ok, now that's a good explanation of why humans can so easily (mentally) manipulate objects in 3D space without doing any math.

        I've always figured that the best design for a computer would be one that's able to "imagine". Since it would take too many transistors to emulate a neuron, maybe there's some other way to do it? Is binary the only way to compute?
  • by Anonymous Coward
    Haven't you learned anything from The Matrix?
    You'll be the reason for our extinction!!!
  • Just like sci-fi. (Score:3, Interesting)

    by murat ( 262137 ) on Friday February 20, 2004 @05:27AM (#8337746)
    "We discovered that when we used the chip to stimulate the neurons, their synaptic strength was enhanced, ... " There was something like this in one of Asimov's books. The guys synapses are enhanced by a machine, then the guy starts to "feel" and "manipulate" things.
  • I'm no Bill Joy (Score:5, Interesting)

    by ObviousGuy ( 578567 ) <ObviousGuy@hotmail.com> on Friday February 20, 2004 @05:28AM (#8337750) Homepage Journal
    But this is very exciting. The idea that we could grow neurons on silicon is one of those big steps that looks to lead us into the Johnny Mnemonic world that Gibson was talking about just a couple stories prior to this one.

    There is a song that says, "It only takes a spark to get a fire going". So too is it true that it only takes a couple neurons to start synapsing. As these true neural webs become more complicated, it would be interesting to see if any kind of emergent behavior was evident.

    Also, with the current political and scientific climate as it is, this could be the first step to replicating a nervous system without having to rely on fetuses for stem cells. It requires no human cloning and holds immense promise.

    It would definitely be cool to have a couple of these chips implanted to enhance the base memory that we are kitted with at birth, that's for sure!
    • Re:I'm no Bill Joy (Score:5, Insightful)

      by kinnell ( 607819 ) on Friday February 20, 2004 @06:03AM (#8337884)
      The idea that we could grow neurons on silicon is one of those big steps that looks to lead us into the Johnny Mnemonic world

      No it's not. This involves interfacing with the neurons that are already there.

      As these true neural webs become more complicated, it would be interesting to see if any kind of emergent behavior was evident

      Given that large collections of neurons are well known to exhibit emergent behaviour, I think it would be more interesting if they didn't.

      this could be the first step to replicating a nervous system without having to rely on fetuses for stem cells. It requires no human cloning and holds immense promise

      Nerve cells harvested from an animal brain can be grown in the lab. There is no need for embryonic stem cells or cloning at all. Growing them on silicon does not make this easier -- in fact they will probably grow better in a petri dish.

      It would definitely be cool to have a couple of these chips implanted to enhance the base memory that we are kitted with at birth

      Memory in the brain is not simple storage of information. It is unlikely that plugging a DRAM into your brain would enhance your memory.

      • You are only using a ridiculously small part of all the storage space you have available at any time in your life. The cool thing would be to have an electronic device that could strengthen a given synaptic path, allowing you to "refresh" your memory at will and not forget important things (like reading the whole C++ w/ libraries reference once and then refreshing it every night).
        • Input interface? (Score:3, Interesting)

          by delcielo ( 217760 )
          I don't see any practical value in being able to add memory; but it would be cool to have an interface that would let me learn things faster.

          Kind of like how people in "that movie" can learn how to fly a UH-1 in 3 seconds.

          Now THAT ability would be cool.

    • The idea that we could grow neurons on silicon is one of those big steps that looks to lead us into the Johnny Mnemonic world that Gibson was talking about just a couple stories prior to this one.

      I am waiting for the Alastair Reynolds [powells.com]-style Conjoiner conversion myself, but Johnny Mnemonic will do in the meantime.

      As long as it's the short story and not the film, that is.
    • Re:I'm no Bill Joy (Score:3, Informative)

      by eric76 ( 679787 )
      Check out the book:

      Bothe, Samii, Eckmiller, Neurobionics - An Interdisciplinary Approach to Substitute Impaired Functions of the Human Nervous System, Amsterdam : Elsevier, 1993.
    • questions, questions (Score:3, Interesting)

      by whittrash ( 693570 )
      I am not so much interested in the Hollywood vision of this, although Ice-T deserved an Oscar for his performance. What I think is interesting is to think about the limits of our brains and how this could be used to expand consciousness.

      I think it would be interesting to understand how a neural interface would 'feel'. What would a process based in ones and zeros feel like? How would the brain adapt to take advantage of the new processing capability? Would we be able to project our consciousness outsi
  • by nhaze ( 684461 ) on Friday February 20, 2004 @05:28AM (#8337751)
    I thought the Pine Lab at Caltech had done this several years ago. Neurochip Project [gatech.edu]
    • by kinnell ( 607819 ) on Friday February 20, 2004 @05:40AM (#8337801)
      No -- you're right that growing neurons on silicon is nothing new, but the breakthrough here is that they have been able to stimulate the neurons into forming new connections, rather than just measuring the response of existing networks.
      • by techiemac ( 118313 ) <techiemac@NOspAm.yahoo.com> on Friday February 20, 2004 @08:39AM (#8338364)
        It's kinda funny: a few years ago (back in the 80s) my dad actually did this. Believe it or not, he was the first one to grow a neuron on silicon (a Motorola chip, for those interested). The poster with the electron micrograph of it was absolutely everywhere (we had thousands of the posters in the basement). I even remember going to high school science class and, sure enough, there was my dad's poster.
        The hype surrounding this was insane, mostly due to the fact that everyone thought this was the true start of cybernetics. In the end the hype died down, my dad's lab got a ton of grants, and he got back to doing more research. Ironically enough, the most publicized research that he did (the neuron on a chip) probably had the least impact.
        Such is the world of science at times :)
        So, yes, it's nothing new. Just repackaged.
    • by NeuroKoan ( 12458 ) on Friday February 20, 2004 @05:55AM (#8337854) Homepage Journal
      Quote from the above link
      This particular chip has no electrodes. The grillwork design allows the neurons to grow, and contains them indefinitely. We are currently building full chips with this design, and with electrodes.

      Keep an eye out for this page. Once we get fully functional chips, it shouldn't be long before I can show some real experiments and data.


      I think the big news is that electrodes were on the silicon chip, and were actually able to "learn and memorize information which can be communicated to the brain" (as per the original article).

      Also, the page looks like it hasn't been updated since 1995. I wonder what happened to this project. From the page, Maher and Thorne seemed so close to what has just been achieved in Canada.
      • by nhaze ( 684461 ) on Friday February 20, 2004 @06:25AM (#8337945)
        Potter has done a lot of work on the project since then, and electrodes were definitely incorporated. He has linked the cultured network up to a variety of output devices, including a stylus device to 'draw', a robot to maneuver, and a DOOM-like virtual environment. http://www.gatech.edu/news-room/release.php?id=160 http://www.wireheading.com/roborats/hybrots.html
    • Weird -- I remember reading an announcement on this subject on Usenet back when I was in university. What's more, I was able to google for the original article [google.ca] from January, 1991:

      Hello. I just wanted to inform the netland that a direct nerve to transistor
      interface is finally operational. The invention was privately announced 1
      month ago, but is now out in the public. It is possible now to grow a nerve
      over a silicon substrate in a way that the nerve has a capacitive connection
      to a FE-Transistor built into

  • by Gopal.V ( 532678 ) on Friday February 20, 2004 @05:29AM (#8337761) Homepage Journal
    Will this make computers more human, or otherwise?

    Maybe it's time to admit that nature does a better job brute-forcing (OK, what else do you call SEX and EVOLUTION?) the secrets of this world than all our mathematical precision... (E=mc2... Forty-Two... naah, doesn't work)... Of course, nature did a better job making us humans than we would have achieved... :)
    • Well, nature has had a tiny bit more time to do her stuff than we have...
    • by q.kontinuum ( 676242 ) on Friday February 20, 2004 @06:10AM (#8337908)
      Evolution != bruteforcing. With bruteforcing (e.g. trying to guess a password with a dictionary) there is no "being on the right path" or whatever; it's just wrong or right. Evolution is survival of the fittest: make minor changes in different directions to an existing system and see which ones lead closer to success (just like sex ;-)). Take many of the fittest and do the same again. Then sometimes take some of the not-so-fit and try the same as well.

      On the other hand you are right: this trial and error seems to lead to better results in the long run compared to deterministic creation. But this scheme has already been adopted by science. IIRC there was a distributed computing project simulating a robot with a defined task and changing the parameters of the robot; the different clients exchanged information about the results. I don't remember the name or the homepage of the project anymore -- I think it was already 4 or 5 years ago...
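
      The contrast is easy to demonstrate: brute force gets no credit for a near-miss, while an evolutionary loop keeps the fittest variant and mutates it, so partial matches accumulate. A toy sketch in Python (the target string and all parameters are arbitrary):

      ```python
      import random

      TARGET = "METHINKS IT IS LIKE A WEASEL"
      ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

      def fitness(candidate):
          """Score 'being on the right path': count matching characters."""
          return sum(a == b for a, b in zip(candidate, TARGET))

      def mutate(parent, rate=0.05):
          """Minor changes in different directions on an existing system."""
          return "".join(random.choice(ALPHABET) if random.random() < rate else c
                         for c in parent)

      parent = "".join(random.choice(ALPHABET) for _ in TARGET)
      generations = 0
      while fitness(parent) < len(TARGET):
          generations += 1
          # Keep the fittest of the parent and 100 mutated children, then repeat.
          parent = max([parent] + [mutate(parent) for _ in range(100)], key=fitness)

      print(f"Reached the target in {generations} generations")
      ```

      A dictionary or purely random guesser, by contrast, only ever learns "wrong or right", so its expected search time grows with the size of the whole space rather than the length of the target.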

      • Evolution is survival of the fittest: make minor changes in different directions to an existing system and see which ones lead closer to success.

        Umm, no.

        In evolutionary terms, 'fittest' are those who survive. There is no objective definition of 'fittest' independent of survival.
        • Interesting. I'm no expert in biology, so I might be wrong.

          But from a logical point of view: if a generation of several individuals is born, each of them with minor changes compared to its ancestors, then for some individuals those changes will be an advantage, and for some they will be a disadvantage.

          The weaker individuals will not spontaneously die, but they might have fewer children, or maybe only a few of their children will survive. The stronger individuals will have more children, or if they have th
        • Evolution is survival of those who survive.
    • by Anonymous Coward
      I disagree. Nature had a lot of time to "bruteforce" things. Give us the same amount of time and we will see what we'll be able to do in terms of "reengineering the world".
      Modern science is a 400-500 year old thing. Nature had billions of years to reach the levels we see.
      I think the progress we have achieved in the last 50 years is *really* impressive, and probably what we'll see in the next 50 years will be even more impressive. Sometimes humans deserve more credit, IMHO.
  • Other uses? (Score:5, Interesting)

    by tanveer1979 ( 530624 ) on Friday February 20, 2004 @05:30AM (#8337762) Homepage Journal
    We discovered that when we used the chip to stimulate the neurons, their synaptic strength was enhanced

    If only they could find out how the strength increased, and whether we can do the same in the human body, we could find a cure for most nervous-system degradation diseases. Anybody have a link to a more verbose article?

    • Re:Other uses? (Score:2, Informative)

      by Anonymous Coward
      Anybody have link to a more verbose article?


      Link to the article published in Physical Review Letters [aip.org]
    • Pubmed link [nih.gov] to the abstract for their research. Publisher's site sometimes holds a free copy of the full paper (depends on the journal).
    • It's something called long-term potentiation (LTP), and neuroscientists have known about it for a long time: if you get a neuron to fire enough, its synapses will strengthen. It's been a while, but I believe the mediating mechanism involves calcium-triggered protein synthesis.

      FYI, LTP is one of the most promising mechanisms proposed for explaining how long-term memory works.
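
      A back-of-the-envelope sketch of that "fire it enough and the synapses strengthen" rule; the saturating update and all numbers are illustrative assumptions, not a model of the actual calcium-mediated mechanism:

      ```python
      weight = 0.2         # synaptic strength before the tetanus
      LEARNING_RATE = 0.1

      def response(w, test_pulse=1.0):
          """Postsynaptic response evoked by a fixed test stimulus."""
          return w * test_pulse

      print(f"before: response = {response(weight):.3f}")

      # High-frequency stimulation: pre- and postsynaptic cells fire together,
      # and each coincidence nudges the weight up toward a ceiling of 1.0.
      for _ in range(20):
          weight += LEARNING_RATE * (1.0 - weight)

      print(f"after:  response = {response(weight):.3f}")  # same input, bigger response
      ```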

  • by neurosis101 ( 692250 ) on Friday February 20, 2004 @05:31AM (#8337763)
    This is the future of computing right here.

    Not making faster Pentiums or Athlons -- sorry, most of that magic has already been woven. Who out there is qualified to make systems-level designs and decisions about bio-computer systems? Think about the knowledge it must take in physics, electrical and computer engineering, and biology.

    What type of magnetic and power restrictions will there be? Reliability? What type of optimizations will exist? Interfaces? Flexibility?

    We're still quite far away from having things like this be applicable to modern day but think about when you too can say, "I know Kung Fu"!

    • by ktanmay ( 710168 ) on Friday February 20, 2004 @06:23AM (#8337943)
      You know, I read somewhere that our brains (individual processes) run at around 200MHz (as it is all done electro-chemically); now, if you say that we have hundreds of billions of neurons, well, we also have billions of transistors on chips.
      The difference here is that our brains use the 3rd dimension effectively (and also work in parallel, I think). Now I'm not sure if the latest breakthrough uses electro-chemical processes to communicate, but if it's faster than 200MHz, it definitely has huge potential.
      • 200Hz, not 200MHz.
    • by Illserve ( 56215 ) on Friday February 20, 2004 @06:45AM (#8338000)
      This is certainly not the future of computing!

      The precise properties of individual neurons are unpredictable and highly variable. Worse, they require constant life support just to stay alive. A 5-minute power interruption to your neural CPU and it's time to go shopping for a new one. You would certainly not want to build a practical computing tool out of them.

      Neural computing will remain the domain of highly specialized research into AI and neural computing forever. We may develop neural analogs using nanotech or some other gee-whiz tech, but they will not be true neurons.
      • by Welsh Dwarf ( 743630 ) <d,mills-slashdot&guesny,net> on Friday February 20, 2004 @07:59AM (#8338207) Homepage

        Neural computing will remain the domain of highly specialized research into AI and neural computing forever. We may develop neural analogs using nanotech or some other gee-whiz tech, but they will not be true neurons.

        I disagree. I think neural computing will have practical applications, but more along the lines of neural interfaces than actual computers. Imagine a prosthetic arm that works just like the old one did...

        • Yes, I meant to add that as well (it came out of my fingers as "neural computing" instead of "neural interfaces").

          But that's a far cry from being the future of computing, which implies that we'll use biological neural tissue instead of fabricated CPUs.
      • It seems like both of these difficulties (unpredictability and constant power) could be overcome. The manufacturing process would have to involve a training stage, where the neurons would be put through a series of routines until the connections between the neurons were at the correct strength. As for the power interruption, enough backup power supplies and advance warnings would make this an unlikely event. As long as the chance of power loss was less than the chance of hard-drive failure, it will be a sella
  • by Homology ( 639438 ) on Friday February 20, 2004 @05:31AM (#8337764)
    nerve cells grown on a microchip can learn and memorize information which can be communicated to the brain.

    The researchers have read some Slashdot posts, and believe that there must be a huge market for this chip. There is clearly a need for it ;-)

  • "A transistor located on the chip then recorded that conversation between cells."

    I'd like to see this transistor...

    fud
  • by Kalroth ( 696782 ) on Friday February 20, 2004 @05:33AM (#8337774)
    .. memory upgrade implant, especially in the mornings.
    It would also be cool with an encyclopedia or even a few O'Reilly books implanted.

    Too bad it seems to be one-way communication only, otherwise a spellchecker implant would be cool too :-)
  • by penguinland ( 632330 ) on Friday February 20, 2004 @05:35AM (#8337785)
    Researchers at the University of Calgary have found that nerve cells grown on a microchip can learn and memorize information which can be communicated to the brain.

    While the article mentions this in the introduction, it doesn't mention it happening at all in the research, which talks about neurons communicating with each other. This is a long way from connecting this chip to a living brain in an animal that can still function.

    While I agree that this is a fascinating article, we should make sure not to sensationalize it too much. Making chips that interface with actual brains in actual animals, even if they are snails, is still a long way off.
  • by ClaudiusMinimus ( 743231 ) on Friday February 20, 2004 @05:43AM (#8337813)
    would have a whole new meaning...
  • by zeruch ( 547271 ) <zeruch.deviantart@com> on Friday February 20, 2004 @05:52AM (#8337844) Homepage
    ...but I still think Natural Stupidity will outpace Artificial (or artificially enhanced) Intelligence.
  • Skynet (Score:2, Funny)

    by Karem Lore ( 649920 )
    da da dum de dum.
    da da dum de dum.
    da da dum de dum.

    Termihuman III, coming to a cinema near you.

    In the year 2250, a small pocket of human resistance finds the means to develop an organic gooker. Using the power of jelly to disable our circuit boards, they start a highly accurate military campaign to overrun the machines...

    Tron and Tran are a simple couple thrown together in this all-action, pistol-pumping, explosion-full chase between man and machine. Will their love be enough to conquer the invading h

  • 'We discovered that when we used the chip to stimulate the neurons, their synaptic strength was enhanced'

    Makes sense, doesn't it?

    The (possibly) frightening aspect of this is that it may pave the way for artificial lifeforms/cyborgs/Skynet...

  • by NorwBlue ( 711956 ) on Friday February 20, 2004 @06:01AM (#8337875)
    Eli Lilly releases the new Prozac add-on for nervous CPUs.
  • by Elanor ( 130622 ) on Friday February 20, 2004 @06:02AM (#8337876)
    a.k.a. Neal Stephenson and his uncle.

    Chip embedded in a politician's brain after a stroke -- he goes on to be president.... v. spooky.

    I would love to see Alzheimer's patients helped with this. If it's a genetic disease, I'm up the creek and dropped me paddle a while back.

    - Lnr
  • by hazman ( 642790 ) on Friday February 20, 2004 @06:13AM (#8337923)
    Imagine a U.S. President that is simply a marionette made of organic plasma being controlled and manipulated by puppeteers and handlers behind the curtain - stringlessly AND wirelessly.
  • by Xenobane ( 746489 ) on Friday February 20, 2004 @06:17AM (#8337929)
    towards a virtual girlfriend.
  • Half-bit bandwidth (Score:5, Insightful)

    by korpiq ( 8532 ) <-,&korpiq,iki,fi> on Friday February 20, 2004 @06:20AM (#8337938) Homepage
    Quoth the article:

    scientists stimulated one nerve cell to communicate with a second cell which transmitted that signal to multiple cells within the network.

    Signal up (probably down too, though that is not said). That's a start. Now let me jump.

    Imagine how this would feel in your own brain. Even strengthened to a noticeable level by a lump of neurons, the signal would still read "beep". Now imagine being fed information through that channel. "Beep, bip beep bip bip beep". Better start training that Morse.

    Now let's enhance the input by adding more bits to it and running data through a digital-to-analog converter. This is where you would slowly be able to "see colors", one at a time. Low signal, cold feeling; high signal, hot feeling. That is brainable information. You can associate different patterns of these "colors" with different ideas.
    But still, it's not like you could see any shapes, is it?

    Now add more bytes, feed them in side by side. That's a feed. At this point, feel nausea. Something is feeding noise into your thoughts, something you cannot possibly comprehend.

    It would take a processing system not unlike vision inside the brain to translate that feed into experiences like colors, tastes, and touches, then further associate these to make shapes out of the noise.

    A long way.

    Worth taking, of course, as research goes, but I wouldn't toss away those external displays just yet. I have a hunch computers won't be the same, either, when we get there.

    Future research will focus on interfacing silicon chips with the human brain to control artificial limbs and develop "thinking" computers.

    Mostly fun!
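
    The progression described above -- one bit, then a pulse-rate channel (the crude D/A step), then several channels side by side -- in toy form; every name and number here is invented for illustration:

    ```python
    def one_bit(value):
        """Stage 1: the bare 'beep' channel."""
        return "beep" if value else "(silence)"

    def pulse_rate(byte):
        """Stage 2: encode 0-255 as pulses per second -- one 'color'
        running from cold (low signal) to hot (high signal)."""
        return byte / 255 * 100.0  # arbitrary 0-100 pulses/s range

    def feed(data):
        """Stage 3: more bytes, side by side -- one pulse-rate channel each."""
        return [pulse_rate(b) for b in data]

    print(one_bit(True))    # beep
    print(pulse_rate(200))  # ~78.4 pulses/s
    print(feed(b"hi"))      # parallel channels the brain must learn to read
    ```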
    • by dave420-2 ( 748377 ) on Friday February 20, 2004 @06:53AM (#8338016)
      That's the beauty of the brain, though. It can make sense of the strangest of inputs. The very nature of neurons and connections in the brain means that if you were to introduce an "input" into the brain using a technique like this, given time, there's a very good chance that the brain will eventually make sense of it. After all, it's a very good learning computer, and this is really no different to the information sent via the optic nerve.

      Imagine trying to describe vision to someone who's been blind from birth. It's nigh-on impossible to explain, as it's unlike anything else they can experience. This is what we're seeing here - a new sense we just can't comprehend, yet could offer us such incredible benefits we can't hope to fully understand at such an early stage as this.

    • Brain implants to give vision to the blind already exist [wired.com].

      This new tech is basically a way of doing it more efficiently.
    • Given that, in layman's terms, the brain adapts to changes (people who get a lump of brain chopped out can sometimes adapt slowly over time to accommodate this), it may be possible to implant the very young with mini interfaces which supply a feed.

      Now feed simple messages such as 'food' or 'your mum' or 'barney' into this interface to train it to associate the feed with what's going on around it.

      You never know - your brain may well start treating this as a new sense - and you would potentially have some mor
  • In a press release read before assembled journalists, Intel Corp. announced that growing neurons on Pentium class chips would contravene the DMCA, by allowing competing engineers to directly download chip information into their brains.

    When pressed further, the spokesman stated that he couldn't be sure, but believed that growing neurons on AMD chips would however not contravene any laws.

    RIAA executives were unavailable for comment, but an anonymous source indicated that at least one executive has been ad

  • by erwin ( 8773 ) on Friday February 20, 2004 @06:37AM (#8337980)
    Alan Cooper, author of "The Inmates are Running the Asylum" and other texts put it this way:

    Q: What do you get when you cross a camera and a computer?
    A: A computer.

    His point is that, from an interface and place-in-the-world point of view, most products that have been digitally enhanced tend to remain closer to their technology roots than their analog counterparts (with all of the usability, and I would say ethical, challenges inherent in a technologist-driven system).

    That said, this is pretty frickin' cool, but the double-edged sword presented by this innovation seems both particularly sharp and far reaching. I really hope we get this one right.

    "Why can't you use your powers for Good?"

    • Q: What do you get when you cross a camera and a computer?
      A: A computer.


      Perhaps I'm missing the point (I've never read the aforementioned book), but when I cross a camera and a computer, I usually get a camera. Digital cameras are exactly this, no? The question seems a silly one. When we started making bridges out of steel, did they somehow become something other than bridges?

      A camera is a thing that can capture pictures and later reproduce them. You can use film, or silicon to do that, but it's a c
  • Reminds me of... (Score:4, Interesting)

    by jonney02 ( 591116 ) on Friday February 20, 2004 @06:45AM (#8337998)
    A quote I read somewhere:
    "The danger from computers is not that they will eventually get as smart as men, but we will meanwhile agree to meet them halfway." -Bernard Avishai
  • by hkmwbz ( 531650 ) on Friday February 20, 2004 @06:51AM (#8338010) Journal
    With research like this going on, will we eventually see a medical solution to tinnitus?

    Tinnitus is a serious problem for a lot of people today, and it can have many causes, from various diseases and illnesses to noise damage. It apparently has to do with the nerves in one's ear, so with this kind of research, might we finally see a way to actually treat tinnitus?

    Until you get T, you don't realize how lucky people who can actually be in a quiet room without going mad are...

  • implants (Score:2, Interesting)

    Perhaps it will be possible to make brain implants that enable you to connect to the internet and let others connect their brains to the net as well. Imagine sharing brainpower, or even sharing thoughts, ideas and memories over a filesharing network.
    • I, for one, am waiting to get it in my head. I am sick and tired of using clumsy keyboards and mice with eye-hurting displays. I don't want to read from a display -- I want to feel it. I don't want to type on a keyboard -- I want to think it.

      Of course it opens a new field for hacker attacks. But that won't stop us from having it anyway, eventually. Certainly they will work in the area of brain firewalling.

      At first, as a simple solution, I could use my personal laptop as a gateway connecting me to the rest of the wor

  • by Serious Simon ( 701084 ) on Friday February 20, 2004 @07:10AM (#8338053)
    The last thing I would want is to make it nervous.
  • I wonder (Score:5, Funny)

    by Gendhil ( 686251 ) on Friday February 20, 2004 @07:55AM (#8338199)
    Who'll be the first to upload a Linux distro into the brain of an actual penguin?
  • Predicted 1945(!) (Score:3, Interesting)

    by infolib ( 618234 ) on Friday February 20, 2004 @08:19AM (#8338287)
    At the end of WWII, research director Vannevar Bush predicted the IT revolution. [theatlantic.com] He was eerily right in many ways, but some things are still to come. For some time I had the following quote hanging on my wall:

    In the outside world, all forms of intelligence whether of sound or sight, have been reduced to the form of varying currents in an electric circuit in order that they may be transmitted. Inside the human frame exactly the same sort of process occurs. Must we always transform to mechanical movements in order to proceed from one electrical phenomenon to another?
  • but I welcome my new silicon-based masters.
  • by ThePretender ( 180143 ) on Friday February 20, 2004 @09:01AM (#8338454) Homepage
    'We discovered that when we used the chip to stimulate the neurons, their synaptic strength was enhanced,'

    ...and when we added the RFID, the test subjects had great futures working for Wal-Mart as they could communicate directly with the pallets of merchandise. In 2.0, the store employees will automagically know when the Gillette razors need to be restocked. :-)

  • Years ago here on /. there was an article about Japanese researchers controlling cockroach movement with implants stimulating the cockroach's brain. Perhaps now they could control bigger life forms...

  • big deal (Score:4, Insightful)

    by lukesl ( 555535 ) on Friday February 20, 2004 @09:58AM (#8338794)
    IAAN, and this is not a big breakthrough in any sense. Basically, this is something that was first done using manually positioned electrodes probably twenty years ago; now they can grow neurons on a dish that has electrodes built into it and do it that way. WoO-hAH!

    The computational power of neurons comes from the way they work in groups, not the way they work alone. Therefore, it's strongly dependent upon the detailed organization of their connectivity. Grinding up a piece of brain and regrowing it on a dish will obviously not retain native connectivity. Additionally, the time it would take to manually rewire an interesting circuit by giving little localized electrical pulses (or do anything else interesting) is longer than neurons are viable in culture, and that's not a problem that's been solved yet.

    I'm not saying this technology won't have important uses as a research tool, just that it won't be useful for what people here seem to think it will be useful for (high-density pornography storage). BTW, one of the more interesting characters in this field is Steve Potter [gatech.edu], a somewhat strange guy who does some technically impressive work [uwa.edu.au]
  • "Future research will focus on interfacing silicon chips with the human brain to control artificial limbs and develop "thinking" computers." Thats one heck of a leap forward from connecting x number of snail nerves together.
