Nerve Cells Successfully Grown on Silicon
crabpeople writes "Researchers at the University of Calgary have found that nerve cells grown on a microchip can learn and memorize information which can be communicated to the brain. 'We discovered that when we used the chip to stimulate the neurons, their synaptic strength was enhanced,' said Naweed Syed, a neurobiologist at the University of Calgary's faculty of medicine."
The Future of Computing (Score:5, Insightful)
Not making faster Pentiums or Athlons. Sorry. Most of that magic has already been woven. Who out there is qualified to make systems level designs and decisions about bio computer systems? Think about the type of knowledge it must take about physics, electrical and computer engineering, as well as biological knowledge.
What type of magnetic and power restrictions will there be? Reliability? What type of optimizations will exist? Interfaces? Flexibility?
We're still quite far away from having things like this be applicable today, but think about when you too can say, "I know Kung Fu"!
"Communicated to the brain?" (Score:5, Insightful)
While the article mentions this in the introduction, it doesn't mention this happening at all in the research. It talks about neurons communicating with each other. This is a long way from connecting this chip into a living brain in an animal that can still function.
While I agree that this is a fascinating article, we should make sure not to sensationalize it too much. Making chips that interface with actual brains in actual animals, even if they are snails, is still a long way off.
would this introduce a new measure of speed? (Score:1, Insightful)
Re:I'm no Bill Joy (Score:5, Insightful)
No it's not. This involves interfacing with the neurons that are already there.
As these true neural webs become more complicated, it would be interesting to see if any kind of emergent behavior was evident
Given that large collections of neurons are well known to exhibit emergent behaviour, I think it would be more interesting if they didn't.
this could be the first step to replicating a nervous system without having to rely on fetuses for stem cells. It requires no human cloning and holds immense promise
Nerve cells harvested from an animal brain can be grown in the lab. There is no need for embryonic stem cells or cloning at all. Growing them on silicon does not make this easier - in fact they will probably grow better in a petri dish.
It would definitely be cool to have a couple of these chips implanted to enhance the base memory that we are kitted with at birth
Memory in the brain is not simple storage of information. It is unlikely that plugging a DRAM into your brain would enhance your memory.
OT: evolution vs. bruteforcing vs. creation (Score:4, Insightful)
On the other hand you are right: this trial and error seems to lead to better results in the long run compared to deterministic creation. But this scheme has already been adopted by science. IIRC there was a distributed computing project simulating a robot with a defined task and varying the parameters of the robot; the different clients exchanged information about the results. I don't remember the name or the homepage of the project anymore - I think it was already 4 or 5 years ago...
Nerve cells (Score:1, Insightful)
injuries in the future with this technology?
Half-bit bandwidth (Score:5, Insightful)
scientists stimulated one nerve cell to communicate with a second cell which transmitted that signal to multiple cells within the network.
Signal up (probably down too, though that is not said). That's a start. Now let me jump.
Imagine how this would feel in your own brain. Even strengthened to a noticeable level by a lump of neurons, the signal would still read "beep". Now imagine being fed information through that channel. "Beep, bip beep bip bip beep". Better start training that Morse.
Now let's enhance the input by adding more bits into it and running data through a digital-to-analog converter. This is where you would slowly be able to "see colors", one at a time. Low signal, cold feeling; high signal, hot feeling. That is brainable information. You can associate different patterns of these "colors" to different ideas.
But still it's not like you could see any shapes, is it?
Now add more bytes, feed them in side-by-side. That's a feed. At this point, feel nausea. Something is feeding noise into your thoughts, something you cannot possibly comprehend.
It would take a processing system not unlike vision inside the brain to translate that feed into experiences like colors, tastes, and touches, then further associate these to make shapes out of the noise.
A long way.
Worth taking, of course, as research goes, but I wouldn't toss away those external displays just yet. I have a hunch computers won't be the same, either, when we get there.
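The scaling-up described above - from a one-bit "beep" channel to a multi-level feed through a digital-to-analog converter - can be sketched numerically. This is a hypothetical toy model, not anything from the research: the `dac` function and its level mapping are assumptions for illustration only.

```python
# Hypothetical sketch: resolution of an n-bit "neural channel".
# A 1-bit channel can only signal on/off ("beep"); adding bits lets a
# digital-to-analog converter map integer codes onto a continuous range,
# e.g. "low signal, cold feeling; high signal, hot feeling".

def dac(code: int, bits: int, lo: float = 0.0, hi: float = 1.0) -> float:
    """Map an integer code (0 .. 2**bits - 1) to an analog level in [lo, hi]."""
    levels = 2 ** bits
    return lo + (hi - lo) * code / (levels - 1)

# 1 bit: just two states, "cold" or "hot"
print([dac(c, 1) for c in range(2)])        # [0.0, 1.0]

# 3 bits: eight distinguishable "colors" in between
print([round(dac(c, 3), 3) for c in range(8)])
```

The point of the sketch is only that each added bit doubles the number of distinguishable levels, which is why the comment's progression runs from "beep" to "colors" to a full feed.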
Future research will focus on interfacing silicon chips with the human brain to control artificial limbs and develop "thinking" computers.
Mostly fun!
Re:Hasn't this been done before? (Score:4, Insightful)
Re:This could really upset international politics (Score:5, Insightful)
to paraphrase Alan Cooper (Score:5, Insightful)
Q: What do you get when you cross a camera and a computer?
A: A computer.
His point is that from an interface and place-in-the-world point of view, most products that have been digitally enhanced tend to remain closer to their technology roots than their analog counterparts (with all of the usability, and I would say ethical, challenges inherent in a technologist-driven system).
That said, this is pretty frickin' cool, but the double-edged sword presented by this innovation seems both particularly sharp and far reaching. I really hope we get this one right.
"Why can't you use your powers for Good?"
Re:Half-bit bandwidth (Score:4, Insightful)
Imagine trying to describe vision to someone who's been blind from birth. It's nigh-on impossible to explain, as it's unlike anything else they can experience. This is what we're seeing here - a new sense we just can't comprehend, yet could offer us such incredible benefits we can't hope to fully understand at such an early stage as this.
Re:The Future of Computing (Score:4, Insightful)
Unless we use equivalent mechanisms for CPU-based computing, comparing the speed of the brain to silicon-based units IMHO doesn't make much sense.
Re:to paraphrase Alan Cooper (Score:3, Insightful)
Q: What do you get when you cross a camera and a computer?
A: A computer.
Perhaps I'm missing the point (I've never read the aforementioned book), but when I cross a camera and a computer, I usually get a camera. Digital cameras are exactly this, no? The question seems a silly one. When we started making bridges out of steel, did they somehow become something other than bridges?
A camera is a thing that can capture pictures and later reproduce them. You can use film, or silicon to do that, but it's a camera because of what it does, not how it does it.
Software version (Score:5, Insightful)
Re:The Future of Computing (Score:5, Insightful)
Neural computing will remain the domain of highly specialized research into AI and neural computing forever. We may develop neural analogs using nanotech or some other gee-whiz tech, but they will not be true neurons.
I disagree. I think neural computing will have practical applications, but more along the lines of neural interfaces than actual computers. Imagine a prosthetic arm that works just like the old one did...
big deal (Score:4, Insightful)
The computational power of neurons comes from the way they work in groups, not the way they work alone. Therefore, it's strongly dependent upon the detailed organization of their connectivity. Grinding up a piece of brain and regrowing it on a dish will obviously not retain native connectivity. Additionally, the time it would take to manually rewire an interesting circuit by giving little localized electrical pulses (or do anything else interesting) is longer than neurons are viable in culture, and that's not a problem that's been solved yet.
I'm not saying this technology won't have important uses as a research tool, just that it won't be useful for what people here seem to think it will be useful for (high-density pornography storage). BTW, one of the more interesting characters in this field is Steve Potter [gatech.edu], a somewhat strange guy who does some technically impressive work [uwa.edu.au]
Re:Kinda cool (Score:3, Insightful)
Well, I find mine useful anyways; I am sure some people have mixed results.
Wow that a bold future (Score:2, Insightful)
Re:Software version (more than Boolean) (Score:2, Insightful)
a b | a XOR b
1 1 |    0
1 0 |    1
0 1 |    1
0 0 |    0
What you have described is:
e i | e ? i
1 1 |   0
1 0 |   1
0 1 |   0
0 0 |   0
Where ? is &~, i.e. "e and not i" (equivalently, "not (if e then i)"). A "partially working" logical function is really just a different, fully working logical function.
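The two truth tables above can be checked directly. A minimal sketch (function names are mine, chosen for illustration) comparing XOR with the AND-NOT function the parent actually described:

```python
# Truth-table check: the "partially working XOR" described above is
# actually AND-NOT ("e and not i"), a different but fully defined function.

def xor(a: int, b: int) -> int:
    """True XOR: output 1 exactly when the inputs differ."""
    return a ^ b

def and_not(e: int, i: int) -> int:
    """e AND (NOT i), equivalently NOT (e implies i)."""
    return e & (1 - i)

# Print both tables side by side; the (1, 1) and (0, 1) rows differ.
for a in (1, 0):
    for b in (1, 0):
        print(a, b, "| XOR:", xor(a, b), "| AND-NOT:", and_not(a, b))
```

The two functions agree on the (1, 0) and (0, 0) rows, which is why a broken XOR can look "partially working" while really computing something else entirely.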
Re:Kinda cool (Score:3, Insightful)
And besides: the ads. Think of the ads!!!!