New Imaging Technique Helps Explain Unconsciousness

smitty777 writes "A new imaging technique called fEITER (for functional Electrical Impedance Tomography by Evoked Response) attempts to explain the process of slipping into unconsciousness. The fEITER is a portable device that creates 3D imagery based on evoked potentials measured hundreds of times a second. The interesting finding from these studies is that unconsciousness appears to result from a buildup of activity in inhibitory neurons. From the article: 'Our findings suggest that unconsciousness may be the increase of inhibitory assemblies across the brain's cortex. These findings lend support to Greenfield's hypothesis of neural assemblies forming consciousness.'"
  • FTFA:

    The machine itself is a portable, light-weight monitor, which can fit on a small trolley. It has 32 electrodes that are fitted around the patient’s head. A small, high-frequency electric current (too small to be felt or have any effect) is passed between two of the electrodes, and the voltages between other pairs of electrodes are measured in a process that takes less than one-thousandth of a second.

    While we're still a long way from a practical direct neural interface, this certainly looks like a step in the right direction. They've demonstrated that the measurements are possible, and at a sample rate that is useful (a toy sketch of one such measurement frame is at the end of this comment). Certainly there's room for improvement in sensitivity, sample rate, and resolution, as well as in miniaturization.

    When they can reduce this from a trolley-cart-sized instrument to something one can support on one's head, then we'll see some more practical and less academic applications. (Yes, like porn. And games. And real virtual reality control of UAVs and waldoes.) Keep in mind that in the '80s, real-time heads-up displays were this large and cumbersome... now look at them.

    It really is illuminating to see how little we know about the nature of consciousness and thought, and how far away we still are from technologically-aided introspection.
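    For the curious, here's a toy sketch of the measurement cycle quoted above: 32 electrodes, drive a small current across one pair, read the differential voltages off the remaining pairs, then rotate and repeat hundreds of times a second. Plain Python; the adjacent-pair geometry, the drive schedule, and the read_voltage() callback are my own assumptions, not the actual device protocol.

        # Toy model of one fEITER-style measurement frame. The electrode count (32)
        # and "drive one pair, read the others" step come from TFA; everything else
        # here (adjacent-pair layout, drive schedule, read_voltage) is assumed.
        N_ELECTRODES = 32

        def measurement_frame(read_voltage, drive_pair=(0, 16)):
            """Inject current across drive_pair, then record the differential
            voltage across every other adjacent electrode pair around the head."""
            frame = {}
            for a in range(N_ELECTRODES):
                b = (a + 1) % N_ELECTRODES          # neighbouring electrode
                if a in drive_pair or b in drive_pair:
                    continue                        # skip pairs sharing a drive electrode
                frame[(a, b)] = read_voltage(a, b)  # hypothetical hardware read-out
            return frame

        # With a dummy read-out this yields 28 voltage readings per frame; a real
        # scan would rotate drive_pair and repeat hundreds of times per second.
        print(len(measurement_frame(lambda a, b: 0.0)))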

    • by yarnosh ( 2055818 ) on Saturday June 18, 2011 @04:14PM (#36486746)

      When they can reduce this from a trolley-cart-sized instrument to something one can support on one's head, then we'll see some more practical and less academic applications. (Yes, like porn. And games. And real virtual reality control of UAVs and waldoes.) Keep in mind that in the '80s, real-time heads-up displays were this large and cumbersome... now look at them.

      Are you, perhaps, confusing reading neural activity with sending specific information into the brain? As far as I can tell, the technology in the article is only for reading neuron activity, not altering it. And even at that, there's no indication that you can extract any real information out of the readings (thoughts, intentions, etc.). It is simply an image of activity. I think you're reading WAYYY more into this technology than is there.

      • No, not confusing -- just extrapolating.

        A lot, admittedly... like I said, it's a step in the right direction. A baby step.

        • It appears to be an interesting experimental system, because you can correlate structural changes with functional ones non-invasively and, seemingly, relatively inexpensively (from the description in TFA the electronics seem pretty straightforward). As with the 'other' popular way of looking at brain structure/function (fMRI [wikipedia.org]), we have a long way to go before it is terribly 'useful', but any decent non-invasive technology for understanding brain function is a good thing.

          Mwahahahaha....
      • And even at that, there's no indication that you can extract any real information out of the readings (thoughts, intentions, etc).

        Classifier technology is already advanced enough that making this jump shouldn't be too difficult. The real limitations are (a) the amount of data to be processed and (b) the resolution of the sensors.
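        To make that concrete, the "jump" would look something like this in practice: hand a stack of labelled voltage frames to an off-the-shelf classifier and see what it can separate. This is a minimal sketch on purely synthetic data (scikit-learn, made-up frame and channel counts); the hard part, as noted, is getting readings that actually contain the signal.

            import numpy as np
            from sklearn.linear_model import LogisticRegression
            from sklearn.model_selection import train_test_split

            rng = np.random.default_rng(0)
            n_frames, n_channels = 2000, 496  # e.g. one value per electrode pair (32 choose 2)

            # Two fake brain "states" that differ only by a tiny mean shift buried in
            # noise, a crude stand-in for the sensitivity/resolution limits above.
            X = rng.normal(size=(n_frames, n_channels))
            y = rng.integers(0, 2, size=n_frames)
            X[y == 1] += 0.05

            X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
            clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
            print("held-out accuracy:", clf.score(X_te, y_te))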

        • by ceoyoyo ( 59147 )

          the resolution of the sensors.

          Classification isn't the problem, and never really has been. It's VERY unlikely that you can get anything like the resolution you'd need for any sort of useful brain reading from surface sensors. Even if you're willing to undergo invasive surgery to implant electrodes, you'd almost certainly need more than are practical to implant at the moment; placement is also a problem; and you STILL might not get enough spatial resolution.

          For SF-style brain interfaces we're still lacking

        • The real limitations are (a) the amount of data to be processed and (b) the resolution of the sensors.

          Oh gee, is that all? That's a bit like saying the only thing stopping us from simulating a human brain inside a computer is processing power. Never mind the enormous lack of basic understanding about how a thought is actually processed. Just throw a bunch of processing power at it and everything will just magically work itself out. /rollseyes

    • like porn. And games. And real virtual reality control of UAVs and waldoes

      And the TSA.

  • "Method to increase of inhibitory assemblies across the brain's cortex using an imaging technique"

    aka 'C-SPAN'
  • by Anonymous Coward

    In the first Knights of the Old Republic game, there was an implant item whose flavor text said it kept over-stimulation from overloading sensory brain parts and causing damage or unconsciousness. This would be great for soldiers getting hit with IEDs, or whatnot.

    • by TheLink ( 130905 )
      Oh yeah, I'm sure the main problem with an IED is overloading sensory brain parts, and that soldiers would be quite happy to not be unconscious after getting hit with an IED and losing a limb or three. They'd rather be able to soberly savour the reality of their situation and the consequences. So most would gladly give an arm and leg for such an implant.
  • I would be interested to know what constitutes a "neural assembly". I suspect that some form of coherence is involved. The question is whether or not this would be quantum coherence. This would be very difficult to establish, of course, just as it was to establish quantum coherence in photosynthesis.
    • I think you're reading a little too much into this - they are basically talking about groups or networks of neurons. Not sure how quantum coherence [wikipedia.org] would play a role in this.

      • At a guess, he's probably trying to integrate the whole "quantum" thing into consciousness because he either thinks it's what permits free will (or else because he thinks it has some other magical properties resulting from mankind's mystical unity with the universe - but I'll assume he's not that dumb). The thing is, as explained by The Hammock Physicist (who runs a decent blog), "quantum free will" theories are lame and don't actually get you the sufficient conditions for very much free will. [science20.com] Fortunately,
        • It's not really anything to do with free will. It's more to do with the fact that I don't believe conscious experience, particularly its unitary aspect, can be adequately described by purely functional explanations under current theories. Quantum effects offer a route towards an understanding. Even with quantum effects there is still an explanatory gap; it's just that it is narrowed slightly.
  • by sgage ( 109086 ) on Saturday June 18, 2011 @04:55PM (#36486868)

    This is absurd. For a start, we don't have clue one about how to explain consciousness. Secondly, recording physical correlates to unconsciousness is not an explanation. Like so much of this stuff, it is description masquerading as explanation. Not bad as a start, perhaps, but don't call it "explanation".

    • It must be the LSD. /walterbishop

    • by Fred Ferrigno ( 122319 ) on Saturday June 18, 2011 @05:48PM (#36487106)

      You're upset that the researchers don't also assume that consciousness is some other kind of thing beyond material investigation. The researchers have no need for that assumption unless and until the evidence leads them there.

      • It sounds more like he is upset because the wording makes it sound as if consciousness is close to being fully explained by medical science, and because of the assumption that consciousness is just a physical thing. His being upset might be a misplaced feeling, though, since there seem to be two different definitions of what it means to research consciousness. Maybe I am wrong, but it seems to me that the two are not the same thing:

        1. One is used by medical researchers and means exploring the self-referential

      • Why assume a material existence, for that matter?

    • You're right, we don't have a clue how consciousness works. But this article is about the process of becoming unconscious, which is vastly different. It is significant that we can now associate precise physical components with the act of becoming unconscious. Since we are the ones putting the person under, we know that it's at least a partial explanation, which is something we didn't know before.

    • by ceoyoyo ( 59147 )

      Chill. We can explain consciousness as well as most people can explain how their cars work - it seems to arise from activity in the cortex. We even have a decent idea of the parts of the cortex that are most important. From this article it sounds like there are neurons that act to inhibit that activity and so inhibit consciousness. That does indeed explain unconsciousness. You don't need to know all about transistors and etching integrated circuits to explain why a computer that got dropped from a seco

    • I beg to differ greatly with your assumption that "we don't have clue one about how to explain consciousness".

      In fact we do know a few things...

      You are never "conscious" all the time.
      Every millisecond or so people are unconscious.
      When you sleep you are also unconscious.
      There can be many health related problems that would lead you to being unconscious.

      Conscious machine properties:

      Has minimums related to spatial volume and computational capacity for human-level "consciousness".

      • Being asleep does not mean being unconscious. If you can be awoken by sound, light, or movement you weren't unconscious.
        • You are unconscious while sleeping. If you can't be woken, that's called a coma. At least, it's a coma after some period of time; I'm not familiar with the exact technical meaning of the word. For example, being anesthetized is obviously not a coma. Let me simplify and put it another way. If you're awake, you're conscious. If you're not awake, you're unconscious. While sleeping, you are not awake, therefore you are unconscious.

    • by tgv ( 254536 )

      You're right: how the hell do they define unconsciousness without defining consciousness? And then they define it as the process that happens when you get anaesthetized? That's all about adding drugs to suppress brain functions. No wonder they find something like this.

      That's apart from the methodological errors that plague such studies. 3D imaging from EEG is at least 10 years old now; it relies on assumptions about the electrical structure of the places where the signal originates, and it can only see part of the brain.
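      To make the parent's point concrete: reconstructing sources from surface voltages is an under-determined inverse problem, so any 3D image leans on an assumed forward model plus regularization. Here's a toy minimum-norm estimate in Python; the lead field and sizes are made up, and this is not the actual fEITER reconstruction (TFA doesn't describe it).

          import numpy as np

          rng = np.random.default_rng(1)
          n_sensors, n_sources = 32, 500  # far more unknowns than measurements

          A = rng.normal(size=(n_sensors, n_sources))  # assumed forward model: sources -> sensor voltages
          true_src = np.zeros(n_sources)
          true_src[42] = 1.0                           # one active patch of cortex
          v = A @ true_src + 0.01 * rng.normal(size=n_sensors)

          # Tikhonov-regularized minimum-norm estimate: s = A^T (A A^T + lam*I)^{-1} v
          lam = 1.0
          s_hat = A.T @ np.linalg.solve(A @ A.T + lam * np.eye(n_sensors), v)

          # With only 32 sensors the estimate is smeared over many sources, which is
          # exactly the resolution/assumption problem being pointed out.
          print("strongest estimated source index:", int(np.argmax(np.abs(s_hat))))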

  • I'm having trouble seeing what's so exciting about explaining unconsciousness. Explaining consciousness would be exciting. I realize understanding what makes a person unconscious might help to understand what makes a person conscious. But not in this case. If they're just saying the presence of these inhibitors makes a person unconscious, then we're no closer to understanding consciousness. Because you can't just make an unconscious object become conscious by taking away these inhibitors. And you have no in
    • by SydShamino ( 547793 ) on Saturday June 18, 2011 @05:48PM (#36487110)

      Well if neural inhibitors (which interfere with the processing of certain parts of our neural network) cause us to lose consciousness, then one could hypothesize that those parts of our neural network must play a role in consciousness. And that makes us at least a little closer to understanding it.

      • I mean, a blow to the head will also cause us to lose consciousness. But it won't help us understand what makes us conscious.
        • Yes, but do you know why a blow to the head makes you unconscious? Apparently it's these neural inhibitors. The better you understand the brain, the closer you are to understanding consciousness. This is one step closer.
        • by TrekkieGod ( 627867 ) on Saturday June 18, 2011 @06:44PM (#36487344) Homepage Journal

          I mean, a blow to the head will also cause us to lose consciousness. But it won't help us understand what makes us conscious.

          Actually, it does. It tells us that the organ responsible for consciousness resides in the head. Similarly, we've discovered a lot about what different regions of the brain are responsible for by looking at people who received brain damage to different areas and seeing what they were then unable to do as a result. You know the brain is responsible for consciousness; this can help narrow down what brain activity is involved by looking at what activity is inhibited when you're unconscious.

    • I guess it really depends on how you define "consciousness". If by consciousness you mean the ability to think, perceive, and reason, then it's relatively easy. However, it sounds as if you're trying to equate consciousness with life (e.g., how do you make a toaster conscious). Well, if you add a microprocessor to a toaster, does it become conscious?

      • The distinction is between consciousness (or awareness) versus "conscious awareness", which is the awareness that one IS conscious.

        My garage door opener has an electric eye that makes it "aware" whether anything is blocking the path of the closing door. It is not aware that it is aware of this. I, on the other hand, am both aware if the path is blocked, and I am AWARE that I am aware of it.

        A toaster with a microprocessor could be called "aware" of specific info, but it's not aware that it's aware of it.

        Cons

        • The distinction is between consciousness (or awareness) versus "conscious awareness", which is the awareness that one IS conscious.

          Actually, what you're describing is called metacognition [wikipedia.org] by us cognitive scientists. I was trying to make the point that the GP seemed to be confusing the two.

        • You can brutalize your toaster, but not a ...

          I am a toaster, you insensitive clod! And when I am done toasting, I also help at the drive-in counter.

    • by ceoyoyo ( 59147 )

      Well, to start with, the actual mechanism by which general anaesthetic causes unconsciousness is unknown. It's interesting to know how something we use so much actually works. And it might be interesting to know whether it works by the same mechanism that falling asleep naturally does.

    • by sjames ( 1099 )

      For starters, it might allow us to develop safer anesthetics and a more effective measure of the depth of anesthesia during surgery so that patients can be given just enough without risk of awareness during the procedure.

      The current devices for that are known to have a significant margin for error.

  • by PPH ( 736903 )

    A Breathalyzer explains it just as well.

  • It looks like they tried to type "Feiter" as if they were normal people naming a thing, but forgot their caps lock was on.
  • 3-D movie shows what happens in the brain as it loses consciousness...

    http://www.youtube.com/watch?v=MzX7w2-FWAA [youtube.com] (24 seconds)
