The Ethical Issues Surrounding OSU's Lab-Grown Brains

TheAlexKnapp writes: Last month, researchers at Ohio State University announced they'd created "a nearly complete human brain in a dish that equals the brain maturity of a 5-week-old fetus." In the press release, the University hailed this as an "ethical" way to test drugs for neurological disorders. Philosopher Janet Stemwedel, who notes that she works in "the field where we've been thinking about brains in vats for a very long time," highlights some of the ethical issues around this new technology. "We should acknowledge," she says, "that the ethical use of lab-grown human brains is nothing like a no-brainer."
Comments Filter:
  • A quantum-computing bio-neural gel pack would be great.
    Photonic co-processor would be nice.
    (Life-support and control housing would be 3D printed, naturally.)

    http://en.memory-alpha.wikia.c... [wikia.com]

  • You are supposed to create headless bodies to perform experiments on and harvest organs from. The living brain is the *only* part we can't ethically do this kind of shit to, because it's the part that makes each of us a person.
    • Re: (Score:2, Insightful)

      by fyngyrz ( 762201 )

      Not at five weeks, it doesn't. It's not magic. It's biology.

      • I never said that a 5 week baby brain is a person. Regardless of whether a 5 week old baby brain is a person, growing a brain in a vat to do experiments on is stupid. It's either a morally abhorrent thing to do in the case of personhood, or completely pointless in the case of non-personhood (i.e. just use a regular 5 week old non-person fetus).

        There are no ethical dilemmas solved by this.

        • I never said that a 5 week baby brain is a person.

          Um, fetus != baby. A 5-week-old fetus's brain is quite different from a 5-week-old baby's brain. The article is about the former, not the latter.

          • The article is about the former, not the latter.

            And yet the point I am making (that the age of the brain doesn't matter) still holds.

            • The article is about the former, not the latter.

              And yet the point I am making (that the age of the brain doesn't matter) still holds.

              It really doesn't hold.

              • I am not even going to bother trying to explain anything to you until you can demonstrate you even know what I am arguing.
        • OK, I'll provide you with a problem this technology solves. Getting cells from non-defective fetuses pisses off many people. It is wrong to piss off people when there is an easy way to avoid doing so. This is an easy way to avoid pissing off some of those people.
        • by rtb61 ( 674572 )

          It is a collection of human brain cells and not a brain. The inputs provide the growth and development of brain cells, so that collection of brain cells with no inputs will be hugely if not totally non-functional beyond the simplest biological processes. Likely that would make the experiment largely pointless beyond those simple biological processes. This would mean it would make more sense to work with those specific developed portions of specific areas of the brain provided by voluntary human donors, not t

          • It is a collection of human brain cells and not a brain.

            What do you think a brain is? It may not be a brain, but the fact that it is a collection of human brain cells is certainly not what disqualifies it.

            The inputs provide the growth and development of brain cells, so that collection of brain cells with no inputs will be hugely if not totally non-functional beyond the simplest biological processes.

            Where is your proof that *only* inputs (as you describe them) can "provide growth"?

            Basically a whole bunch of areas of that brain would atrophy to nothing or, more accurately, never develop at all due to lack of stimulation.

            Neurons are connected to other neurons. They can stimulate each other.

            I have no idea how close what they have is to a real brain. It has basically nothing to do with my comment.

    • by PvtVoid ( 1252388 ) on Friday September 18, 2015 @07:27PM (#50553243)

      You are supposed to create headless bodies to perform experiments on and harvest organs from.

      Extra points if they're in topless bars.

    • The living brain is the *only* part we can't ethically do this kind of shit to, because it's the part that makes each of us a person.

      Could a brain without any sensory input ever develop intelligence? And if a brain had no motor outputs, how could we ever tell?

      • Could a brain without any sensory input ever develop intelligence?

        That's like asking "Could an airplane without wings ever fly?" It depends on your specific definition of "flying" and "wings". I suspect that even a brain in a vat has *some* input; whether or not you want to label that input "sensory" is another matter. Furthermore, I don't think sensory input is necessary for intelligence in general, even if it may incidentally be necessary for human intelligence as we know it.

        And if a brain had no motor outputs, how could we ever tell?

        Motor functionality (i.e. movement) is not necessary for intelligence. If we hooked up a ser

          • That's like asking "Could an airplane without wings ever fly?" It depends on your specific definition of "flying" and "wings".

          Well, "airplane" also depends on specific definitions of "flying" and "wings", doesn't it? Also, as far as I'm aware, all the alternative aircraft also have "flying" and "wings" of some sort. There are things like wings inside of jet turbines, so even if you just vectored the thrust off a jet and pointed its ass at the ground, you'd have things like wings.

          • Yes, most things that "fly" have something *like* "wings".

            What I am saying is that there is probably something *like* sensory input happening in a vat brain even if it doesn't have eyes, ears, nose, etc.

            Neurons just need other neurons to make a functioning brain. They can get sensory input from optic nerves in the form of electrical signals, but they can also get electrical signals from artificial sources. The human brain is very plastic. It will adapt to whatever input it can find.

            I don't expect a huma

        • by sjames ( 1099 )

          We do know that significant sensory input is a requirement to be conscious. Even the really incomplete deprivation in a sensory deprivation tank results in a dream like state in short order.

          Having no neural connection to sensory organs would be a much more complete deprivation.

          • We do know that significant sensory input is a requirement to be conscious.

            How do you know that?

            Even the really incomplete deprivation in a sensory deprivation tank results in a dream like state in short order.

            You seem to be conflating "consciousness" (i.e. sentience) vs. "consciousness" (i.e. wakefulness).

            Having no neural connection to sensory organs would be a much more complete deprivation.

            I am not saying that a brain can be awake without sensory input (although I don't accept this as an absolute truth either). I am saying that sensory input is not strictly necessary for sentience.

            • by sjames ( 1099 )

              I am suggesting that until something perturbs the neural net from its default state, there is no sentience there. It needs to have been awake to some degree for at least an instant at some time to be anything more than a blob of neural net.

              I suspect coordinated and consistent input would be required to get from sentience to intelligence. In order to reason, there must be something to reason about.

                • Consciousness (wakefulness) is a spectrum, not a binary quality.

                I am suggesting that until something perturbs the neural net from its default state

                I don't know why a neural net would need outside stimulus in order to progress beyond the initial state. This makes a lot of assumptions about the way that every neuron works. Computers don't need outside stimulus to operate; their "neurons", the transistors, can be triggered by the other transistors.

                I suspect coordinated and consistent input would be required to get from sentience to intelligence.

                This would certainly be useful in a consciousness learning about its external environment, but I don't think comprehension of one's external environment is necessary for sentience.

                • by sjames ( 1099 )

                  This would certainly be useful in a consciousness learning about its external environment, but I don't think comprehension of one's external environment is necessary for sentience.

                  Please read what I wrote carefully. I said:

                  I suspect coordinated and consistent input would be required to get from sentience to intelligence.

                  In other words, development of sentience doesn't necessarily require coordinated input but intelligence does.

                  Also, note that I said DEFAULT state, not INITIAL state.

                  As to the computer analogy: pull the boot ROM and turn it on, and the clock ticks, but nothing useful happens. Power on an untrained artificial neural net, and at most you get a meaningless oscillation (but if there is no form of output, you won't see it).

                  Consider, how can there be self if there isn't not-self?
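The untrained-net point above can be sketched numerically. This is my own toy illustration, not anything from the article or the study: a linear recurrent net with small random weights (the sizes and seed are arbitrary assumptions) receives one burst of random activity and no further input, and its activity damps away to effectively nothing — random spikes, then silence.

```python
# Toy sketch (my own illustration, not from the thread): an "untrained"
# recurrent net -- small random weights, no learned structure -- given one
# burst of random spikes and then no external input at all.
import random

random.seed(0)
N = 8  # arbitrary network size for illustration

# "Untrained": weights are small random values with no structure.
W = [[random.uniform(-0.2, 0.2) for _ in range(N)] for _ in range(N)]

state = [random.uniform(-1, 1) for _ in range(N)]  # one burst of random spikes
peak = max(abs(x) for x in state)

for t in range(50):  # let the net run with no external input
    state = [sum(W[i][j] * state[j] for j in range(N)) for i in range(N)]

print(max(abs(x) for x in state) < 1e-3 * peak)  # activity has damped away
```

With weights this small the update is contractive, so whatever the initial burst, activity decays toward zero rather than organizing into anything — which is the damped-spikes behavior the comment describes.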

                  • In other words, development of sentience doesn't necessarily require coordinated input but intelligence does.

                    Everyone seems to have their own definition of intelligence. The one I default to assumes sentience. You can't be sentient and not intelligent.

                    Also, note that I said DEFAULT state, not INITIAL state.

                    I don't know what the "default state" of a complex system even is. I know what an initial state is. It seems you can define any state you want to be the default.

                    As to the computer analogy: pull the boot ROM and turn it on, and the clock ticks, but nothing useful happens. Power on an untrained artificial neural net, and at most you get a meaningless oscillation (but if there is no form of output, you won't see it).

                    Consider, how can there be self if there isn't not-self? While Buddhism suggests there is a self-less state of being, it also indicates that there is no suffering in that state.

                    I don't accept Buddhism as evidence of anything scientific. And I don't find anything particularly compelling in a philosophical sense with Buddhism either.

                    A little googling shows [brainblogger.com] that in fact, before week 25, the fetal neural net does not oscillate. Going back to the computer analogy, imagine an old mini where the power is on but the CPU clock hasn't been started. The potential is there but it isn't actualized. Of course, a neural net is asynchronous (or at least can be), but some stimulus is still needed to get it going. Note too that the normal fetal brain is not completely sensory deprived once the peripheral nervous system begins to develop.

                    I think you are taking an example of the way o

                    • by sjames ( 1099 )

                      Intelligence is a slippery term to be sure. However, I would say that certainly sentience does not imply intelligence. Depending on your favorite definition, intelligence doesn't require sentience. For example, artificial image classification nets don't likely have any sense of self or subjective experience. Nor do expert systems.

                      The default state of a neural net is untrained, without memories. Fresh from the vat in the case of an organic one.

                      None of your confusion between the computer analogy you introduced and organic neural nets alters the fact that the fetal neural net does not oscillate before week 25. It shows only random spikes that damp to nothing in short order.

                    • Intelligence is a slippery term to be sure. However, I would say that certainly sentience does not imply intelligence. Depending on your favorite definition, intelligence doesn't require sentience. For example, artificial image classification nets don't likely have any sense of self or subjective experience. Nor do expert systems

                      I agree that intelligence doesn't imply sentience. I am saying sentience implies intelligence.

                      The default state of a neural net is untrained, without memories. Fresh from the vat in the case of an organic one.

                      And as soon as one neuron fires, it is no longer in the same state.

                      None of your confusion between the computer analogy you introduced and organic neural nets alters the fact that the fetal neural net does not oscillate before week 25. It shows only random spikes that damp to nothing in short order.

                      I don't see how this is relevant.

                      I am well aware of the history of computing and the halting problem, but I'm not sure how the halting problem or batch vs interactive computing has any bearing on the question at hand.

                      It is an example of how something can "think" in the absence of external input.

                      Regardless of the philosophy, the techniques provide experience that may have bearing on the question at hand.

                      They certainly show us that some things are possible. They do not show us what is impossible.

                      I invoke Buddhist thought primarily because meditation is the only way we are likely to experience a self-less state without very dangerous physical experimentation on the brain.

                      It feels a bit as if we are talking at cross purposes. I am here hoping to spur new ideas on the subject in myself and perhaps you. I may be misperceiving, but you seem to be here expecting to win an argument?

                      I am disputing the specific claim that external sensory input is necessary for consciousness. I don't think there is any conclusive evidence to

                    • by sjames ( 1099 )

                      And as soon as one neuron fires, it is no longer in the same state.

                      No. If the net has learned nothing, its behavior will remain indistinguishable from the default state. Random static electricity can cause a neon tube to fire as well, but that doesn't mean it's conscious, even if another tube fires due to the stimulus.

                      Certainly, no matter how many neurons randomly fire, it is not going to learn self vs. not-self. The concept won't be there because without external stimulus there is no information about not-self. No self, no sentience. Sentience is generally believed to be

                    • No. If the net has learned nothing, its behavior will remain indistinguishable from the default state.

                      Its behavior could be indistinguishable even if it has learned something. Or its behavior could be distinguishable even if it hasn't learned anything. It all depends on what you consider to be learning, what you consider a distinguishable difference, and how good you are at distinguishing that difference.

                      Random static electricity can cause a neon tube to fire as well, but that doesn't mean it's conscious, even if another tube fires due to the stimulus.

                      I never said that neurons firing implies consciousness. I said that I haven't seen any evidence that external sensory stimulus is a prerequisite for consciousness.

                      Certainly, no matter how many neurons randomly fire, it is not going to learn self vs. not-self.

                      Who says the neuron firings are random?

                    • by sjames ( 1099 )

                      Alas, you seem willing to redefine terms into meaninglessness so you can claim a disconnected neural net has that mysterious thing. We'll call it quigby. It affects nothing and changes nothing and it can't be detected. Yes, I'll agree that neural nets may have quigby.

                      However, if sentience, consciousness, and intelligence have any sort of meaning that at all coincides with commonly accepted definitions, please do explain scientifically how a neural net might have those traits if it has never had connections to the outside world and doesn't even show signs of oscillation.

                    • However, if sentience, consciousness, and intelligence have any sort of meaning that at all coincides with commonly accepted definitions, please do explain scientifically how a neural net might have those traits if it has never had connections to the outside world and doesn't even show signs of oscillation.

                      I didn't say that this neural net had *no* connections to the outside world. I said it would be lacking external sensory inputs. The outputs are the only way you *could* measure anything, including sentience, consciousness, etc.

                      And originally this meant a brain that lacked traditional sensory organs (e.g. eyes, ears, etc.). In fact, in my original reply to drinkypoo, I referenced a two-way communication line to the brain. But I actually don't think two-way communication is necessary for sentience, intellige

      • No.

        Neurons do not fire without stimuli, so a brain without sensory input by definition isn't thinking.

        • Sensory input is not the only stimulus. The neurons are connected to each other, and can stimulate each other. This is analogous to a computer that can run a program when started without the need for any keyboard/mouse/network/etc. input. Transistors do not work without input either, but the transistors in the system do have input: they have each other.
        • A small amount of neural firing is "random", akin to electronic noise in a radio with a shorted input. This is certainly neural firing without external stimulus; whether it can properly be called firing "without stimulus" is debatable, but I'd argue that it can.
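The transistor analogy above can be made concrete with a toy sketch. This is my own illustration under an assumed wiring (a simple ring of five "neurons", nothing from the article): a single noise-like spike at the start is enough to keep activity circulating indefinitely, with no external sensory input at all.

```python
# Toy sketch (my own illustration, not from the thread): five "neurons"
# wired in a ring, where neuron i excites neuron (i+1) % N. One spontaneous
# spike at t=0 -- akin to electronic noise -- keeps activity circulating
# with no external sensory input.
N = 5

def step(active):
    """A neuron fires this tick iff its ring predecessor fired last tick."""
    return [active[(i - 1) % N] for i in range(N)]

state = [False] * N
state[0] = True  # one spontaneous firing, no sensory organ involved

history = []
for t in range(10):
    history.append(state.index(True))  # which neuron is firing this tick
    state = step(state)

print(history)  # activity circulates: [0, 1, 2, 3, 4, 0, 1, 2, 3, 4]
```

Whether circulating activity like this counts as "thinking" is exactly what the thread disputes; the sketch only shows that sustained firing does not require outside input.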
    • Funny, I was under the impression it's the collection of thoughts, memories and emotions that makes us a person.

  • by Anonymous Coward on Friday September 18, 2015 @07:37PM (#50553295)

    ... I much prefer free range brains. These GMO brains contain too many death-threatening chemical properties. The last thing I want to do is wake up one morning alive because of my diet.

    • I don't even see the need to argue or worry about the artificial additives of vat brains, the superior taste of cage-free grey matter is reason enough for the discriminating zombie palate.

    • ... I much prefer free range brains.

      Agreed: Give me cage free brains or give me death. I mean, Death^2.

  • Run the program at Wright State and declare the disembodied brains to be exempt immigrant workers. Then nobody will care that you're making them do 168 hour work weeks, and that termination of employment is literal termination.

  • We need new Ethics (Score:3, Insightful)

    by Dorianny ( 1847922 ) on Friday September 18, 2015 @08:01PM (#50553383) Journal

    I am tired of religious beliefs dictating ethics. This is especially true for stem-cell research.

    An embryo can grow into a human given the right conditions, namely being carried to term by the host.

    A zygote can grow into a human given the right conditions, namely attaching to the uterus and being carried to term by the host.

    An egg can do the same given the right conditions, namely getting fertilized and then attaching to the uterus and being carried to term by the host.

    None of them is a human being, despite your religious convictions.

  • by PopeRatzo ( 965947 ) on Friday September 18, 2015 @09:52PM (#50553851) Journal

    Last month, researchers at Ohio State University announced they'd created a "a nearly complete human brain in a dish that equals the brain maturity of a 5-week-old fetus."

    In other news, the brain has announced its candidacy for the 2016 Republican presidential nomination, and is currently polling at 21% of likely voters.

  • "that the ethical use of lab-grown human brains is nothing like a no-brainer."

    Really? That JOKE is unethical.

    --PeterM

  • I haven't read Forbes in a long time, because my popup blocker breaks their "quote of the day" splash screen. And nothing of value was lost.

  • That is, it can't hold a conscious being in it. I think the information about how many neurons are in other tissue, like the heart or even the digestive system, will have a bearing on how 'self' is defined, one day.
  • A simple brain in a jar is not that many steps removed from glass-domed brains betting quatloos on battles between their slaves.

  • by EnsilZah ( 575600 ) <EnsilZah.Gmail@com> on Saturday September 19, 2015 @09:01AM (#50555471)

    So when are we going to see the show about zombies coming out of hiding now that they have an artificial source of nourishment, and their various sexy adventures?

  • "that the ethical use of lab-grown human brains is nothing like a no-brainer."

    There goes my coffee.
