Biotech Technology

Download Your Brain

Posted by CmdrTaco
from the gonna-need-a-bigger-drive dept.
Nicholas Roussos writes "Futurologist Dr. Ian Pearson predicts that death will be avoidable in the year 2050 by downloading your brain to a computer. Unfortunately, he is also predicting that the process will be only available to the wealthy for years after its release. I guess we should all start saving our pennies now."
This discussion has been archived. No new comments can be posted.

  • by suso (153703) * on Monday May 23, 2005 @01:24PM (#12613840) Homepage Journal
    The question on my mind is: how can your conscious self be in two places at once? If that were ever possible, then I would think that the real power would lie not in longevity of life but in being able to copy oneself and retain a kind of collective consciousness over a large array of machines.
    This is too much in the realm of metaphysics to discuss now. There is not enough factual data yet. We need to learn much, much more about the human brain before we can approach such technology. Otherwise, talking about it sounds more like techno song lyrics than real science.
  • by YodaToo (776221) on Monday May 23, 2005 @01:24PM (#12613847) Homepage
    The new copy of your brain in the computer is just fine, but what about the human you that still suffers & dies?

    It's like the Star Trek transporter beam: the copy of you transported to the new location is fine, but what about the original, which is obliterated in the process?

  • He's wrong. (Score:5, Interesting)

    by podperson (592944) on Monday May 23, 2005 @01:25PM (#12613871) Homepage
    Making a copy of yourself doesn't avoid death for you, it just means ongoing life for a copy of you.

    This is not a subtle point.

    Anyone who cannot grasp this either hasn't thought deeply about the subject, or is an idiot. Anyone who uses the title "futurologist" is likely the latter.
  • News? (Score:5, Interesting)

    by TripMaster Monkey (862126) * on Monday May 23, 2005 @01:26PM (#12613894)

    I thought this was supposed to be 'News for Nerds', not 'Speculation for Halfwits'...

    From TFA:

    He thinks that today's younger generation will benefit from the advances in technology to the point that death will be effectively eliminated. He explains his logic with a simple example.
    "The new PlayStation is 1 per cent as powerful as a human brain," he said. "It is into supercomputer status compared to 10 years ago. PlayStation 5 will probably be as powerful as the human brain."
    Where does that put the Xbox?
    Seriously, this 'explanation' of his 'logic' leaves much to be desired...but there's more.
    Also from TFA:

    It [Pearson's AI] would definitely have emotions - that's one of the primary reasons for doing it. If I'm on an airplane I want the computer to be more terrified of crashing than I am so it does everything to stay in the air until it's supposed to be on the ground.

    Hmm...but what if the AI is a thrillseeker? Suicidal? Psychotic? What if it suddenly develops acrophobia? If we're going to have a true AI with emotions, these are issues that need to be addressed, don't you think?
    Here's another few nuggets from TFA:

    "You need a completely global debate. Whether we should be building machines as smart as people is a really big one. Whether we should be allowed to modify bacteria to assemble electronic circuitry and make themselves smart is already being researched."

    Well, that 'completely global debate' should be ready by the release of PlayStation 5...

    'We can already use DNA, for example, to make electronic circuits so it's possible to think of a smart yoghurt some time after 2020 or 2025, where the yoghurt has got a whole stack of electronics in every single bacterium. You could have a conversation with your strawberry yogurt before you eat it.'

    'Smart yoghurt'? Sure I guess it's possible to think of that...about as possible as it is to think of magical elves, unicorn-riding gnomes, and smart futurologists.

    One thing conspicuously missing from this article is speculation over the possible legal status of either a true AI or a downloaded brain. Apparently, that paragraph got bumped in favor of 'smart yoghurt'.

    In short, this is the dumbest thing I've heard all day (and I work in IT support). I'm sure that if Dr. Pearson didn't already have such a sweet position as 'head of the Futurology unit at BT', he could make good money writing speculative fiction...or reading palms.
  • Bunk (Score:1, Interesting)

    by Anonymous Coward on Monday May 23, 2005 @01:28PM (#12613917)
    There is no plausible way for replicating the structure and billions of individual minute biological connections present in the brain. Making such a promise is a good way to garner interest and sell your books and speeches to a gullible public. Particularly, a rich gullible public.

    Unlike ones and zeros represented on a medium for a computer's use, there is no steady-state representation for the human mind.
  • Questions (Score:3, Interesting)

    by teiresias (101481) on Monday May 23, 2005 @01:29PM (#12613936)
    Alright, so you've downloaded my brain into a giant (or, in my brain's case, perhaps small) computer bank. Sure, why not.

    Will I then be able to "upload" my brain into a new body? A new cloned body? A completely new body?

    If not, since my brain is just stored somewhere, is it completely read-only, or will my brain have an interface to the world, i.e. living through the computer? If not, why not? If so, why would I want to be uploaded back into a body?

    Sure, I'll nod my head and say why not, you'll be able to download the entire human brain into a computer. But there are far too many other questions, involving far too much more work, to say this is a viable alternative for the rich.

    And on another note: seeing as hard drives crash on me like nobody's business, we'd need a more reliable medium than anything currently available.
  • Re:It's a copy (Score:5, Interesting)

    by mrdaveb (239909) on Monday May 23, 2005 @01:30PM (#12613959) Homepage
    Consciousness is so poorly understood that I don't think you can even say that for sure. Am 'I' the matter, or the data, in my brain? If I go into a teleporter, do 'I' come out the other end?
  • by RealProgrammer (723725) on Monday May 23, 2005 @01:32PM (#12613997) Homepage Journal
    I lost my mom when I was in my early 20s, and my dad a few years ago.

    Every once in a while, I wish I could ask them what to do about this or that, what they did when such and such happened, and so forth.

    Sort of a Jor-El/Kal-El thing, though I usually don't need to save planets and such.

    And when a spouse of 50 years dies, the other would like to talk to them.

    It's no way to cheat death, but it is a way for those around you to avoid dealing with the fact that you're gone.

  • by m50d (797211) on Monday May 23, 2005 @01:34PM (#12614047) Homepage Journal
    What makes you the same person who went to sleep last night? Your conscious experience wasn't continuous, all that really makes you who you are is the things in your brain, the memories and personality and so on. If it's a perfect replica of your brain, the experience will be the same. If you were killed in your sleep last night and a replica made and put in your place, how would you even know?
  • From Neuromancer (Score:5, Interesting)

    by Dragon218 (139996) on Monday May 23, 2005 @01:36PM (#12614096) Homepage
    by William Gibson

    He turned on the tensor beside the Hosaka. The crisp circle of light fell directly on the Flatline's construct. He slotted some ice, connected the construct, and jacked in. It was exactly the sensation of someone reading over his
    He coughed. "Dix? McCoy? That you man?" His throat was tight.
    "Hey, bro," said a directionless voice.
    "It's Case, man. Remember?"
    "Miami, joeboy, quick study."
    "What's the last thing you remember before I spoke to you, Dix?"
    "Hang on." He disconnected the construct. The presence was gone. He reconnected it. "Dix? Who am I?"
    "You got me hung, Jack. Who the fuck are you?"
    "Ca--your buddy. Partner. What's happening, man?"
    "Good question."
    "Remember being here, a second ago?"

  • by Timesprout (579035) on Monday May 23, 2005 @01:38PM (#12614125)
    And what happens if you have a neurological problem or disease? Suppose you have Alzheimer's before your brain gets downloaded: what use is a program that can't remember what it was doing? And if you were happily on your way to insanity before downloading, would the descent into madness or senility continue in the downloaded version? That is, would the data be so mangled that it would gradually corrupt itself beyond salvation, or would some sysadmin have to keep rolling you back to version 1?
  • Re:It's a copy (Score:5, Interesting)

    by SpectreBlofeld (886224) on Monday May 23, 2005 @01:40PM (#12614165)
    Every cell in your body dies and is replaced over a scale of seven years or so. You're not the original you, having been replaced multiple times with a 'copy'. Care to redefine your idea of consciousness?

  • Re:It's a copy (Score:3, Interesting)

    by bill_mcgonigle (4333) * on Monday May 23, 2005 @01:44PM (#12614239) Homepage Journal
    If I go into a teleporter, do 'I' come out the other end?

    Depends on your SciFi. In Star Trek, absolutely. It "energizes" your matter into an energy stream and sends that actual energy to another place where it coalesces. It's your very quarks being transported.

    The Outer Limits did a good story once about the more likely form of teleportation. Some dinosaur-looking aliens made contact with Earth, and they had the technology. It worked by cooling the person to absolute zero, scanning the subatomic particles, transmitting the scan data over FTL links, and reassembling the body at the other end. The scanning process was non-destructive, so the original had to be terminated afterward; doing so was their highest law. There was no FTL travel in that universe, and passengers who used the system found it acceptable. The story revolved around the highly screened man who fell in love with one of his passengers and couldn't bring himself to delete the original.
  • Re:download? (Score:5, Interesting)

    by Eric Smith (4379) * < minus author> on Monday May 23, 2005 @01:46PM (#12614281) Homepage Journal
    Irrelevant. You have no more (or less) facility in your own brain for initiating download than upload.

    The extropians have been using the term "upload" for many years, as has science fiction. It's based on standard use of computer industry terminology.

    I routinely use my laptop to initiate either uploads to or downloads from a server. And sometimes the server initiates uploads from or downloads to my laptop (e.g., Z-modem). The terminology has nothing to do with which side initiates the transfer. It is a convention based on "up" being "to the (conceptually) bigger system". I certainly don't want to transfer my mind into a system that has less capacity than my current brain, so I want to upload it.

    And your "facility" claim doesn't even make sense. My brain does have the facility to initiate an upload, just as much as it has the facility to travel to Australia. My brain can choose to have my body buy an airline ticket and drive to the airport, or just as easily, to drive to an upload center, walk in the door, and sign the appropriate paperwork.

    The big questions are whether I will live long enough for the service to be available, and whether I'll be able to afford it. In his book "The Age of Spiritual Machines", Ray Kurzweil makes a reasonably convincing argument that I will, thanks to Moore's Law.

    Ray points out that even if Moore's Law runs out of steam with regard to MOSFET technology, there is good reason to believe it will apply equally well to new technologies, since the known laws of physics still have "lots of room at the bottom" (as observed by Richard Feynman). He shows that Moore's Law actually extrapolates fairly accurately all the way back to late-19th-century mechanical calculators.
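
    A quick back-of-the-envelope check of this extrapolation: if a console really is 100x short of brain-equivalent power and performance doubles every 18 months, the gap closes in about a decade. The figures below are illustrative assumptions, not numbers from the article or from Kurzweil:

```python
import math

# Illustrative assumptions, not figures from the article:
gap = 100.0             # "1 per cent as powerful" means a 100x shortfall
doubling_years = 1.5    # assumed Moore's-Law doubling time

doublings = math.log2(gap)          # doublings needed to close the gap
years = doublings * doubling_years  # calendar time at that pace

print(f"{doublings:.1f} doublings, roughly {years:.0f} years")
```

    At that pace the 2005-era claim lands in the mid-2010s, which mostly shows how sensitive such predictions are to the assumed doubling time.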

  • by Eric Smith (4379) * < minus author> on Monday May 23, 2005 @01:53PM (#12614383) Homepage Journal
    And what makes you think that it wouldn't be possible to experience those states while uploaded?
  • I'd never do it (Score:3, Interesting)

    by Rick.C (626083) on Monday May 23, 2005 @01:56PM (#12614443)
    So, who gets to decide how your "brain-dump" is used? Certainly not you. Sure, you could "wish" for various things to happen, but without a body to command, you'd have no enforcement powers.

    Why would I want to give my neural contents to someone I don't know, who could later sell them to someone I dislike, to be used as a "mental slave"?

    I can think of no better definition of hell than if I were somehow "aware" of what was going on, but powerless to stop it.
  • Re:Soulless (Score:5, Interesting)

    by bill_mcgonigle (4333) * on Monday May 23, 2005 @01:57PM (#12614451) Homepage Journal
    Uh, your memory engrams may be downloadable, but your consciousness and soul will die right along with your body.

    Doesn't that imply your soul is organic? I thought the point of a soul was to provide a mechanism for an afterlife?

    Here's an interesting thought experiment. Say you have very good prosthetics and nanotechnology available. As you age, your natural body starts to fail. You have organs, limbs, bones, even blood replaced over time. As your skin fails, a nice polymer replaces it (with excellent nerve replacements, of course, so you don't notice a difference).

    Do you still have a soul at that point?

    OK, now your body is failing even more. Over another couple decades you've replaced everything in your body except for your brain with mechanical systems.

    Do you still have a soul at that point?

    Now, your nerves start to degenerate and your brain isn't signaling well. You get some nano-bots in there to replace the dendrites and get the neurons signalling right again.

    Do you still have a soul at that point?

    Finally the neurons are starting to go and you get some more nanotech in there that can replace failing ones on the fly as they go with more stable structures. Over the next 20 years all of your neurons are slowly replaced by nanotech, but it's very gradual so you don't ever notice it.

    Do you still have a soul at that point?

    The trick in this experiment is picking the point at which you don't have a soul, if ever, and identifying the change that caused the soul. Of course, if you can identify the change that lost the soul, it follows you've identified the temple of the soul.

    Discussion encouraged.
  • Re:It's a copy (Score:5, Interesting)

    by Conspiracy_Of_Doves (236787) on Monday May 23, 2005 @01:58PM (#12614458)
    The difference is continuity. If you replace a tiny piece of yourself, you are still the same person. The new piece is integrated into the rest of your previous self. Do it again, and you are still the same person. Regardless of how many times it is done, you are still the same person, even if every original piece of you is replaced. However, if you replace everything at once, there is no longer any 'previous self' for the new pieces to be integrated into, and continuity is lost.
  • Re:It's a copy (Score:3, Interesting)

    by Saeger (456549) < minus math_god> on Monday May 23, 2005 @02:01PM (#12614499) Homepage
    Simply because your mind isn't operating on the slow organic substrate we evolved with is no reason to think you'd be "dead" when transferred to better, faster artificial substrates, whether in a traditional meatspace vessel, or VR worlds.

    To clarify:

    • "You" are your emergent pattern of mind: Software.
    • "You" are NOT necessarily what composes your operating substrate: Hardware.
    I understand the cognitive dissonance a lot of people have toward the idea of transhumanism, but that doesn't make it valid. People just tend to anthropomorphize the future because it's what we're used to. Case in point: most people are planetary chauvinists in thinking that Mars is a great gravity well to terraform, when what we'll probably end up doing is tearing the planet(s) apart to create much more efficient substrates and infrastructure (not bound by gravity wells).
  • Re:Soulless (Score:3, Interesting)

    by cHALiTO (101461) <> on Monday May 23, 2005 @02:01PM (#12614502) Homepage
    Well, that depends on whether you believe in a 'soul' or not. Consciousness, OTOH, could be emulated if we had some interpreter software (and the necessary HW) to interpret and 'run' your encoded brain.

    Much like in "Ghost in the Shell". Even if you believe in the 'soul' (however you define that), what if your brain, whichever hard- (or wet-)ware it runs on, is able to generate a 'soul'? Is the soul a product of the brain, or the other way around?
  • by Rei (128717) on Monday May 23, 2005 @02:01PM (#12614509) Homepage
    Actually, that's not what I'd worry about. I'd worry more about getting something like:

    Sep 19 13:51:22: hda: dma_intr: status=0x51 { DriveReady SeekComplete Error }

    Better use RAID, and have a trust set up for maintenance... ;)
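
    Rei's log line points at the practical problem: any archived mind would need the same integrity machinery as any other long-lived data. Below is a minimal sketch of checksummed, majority-vote replication; the function names and the three-replica scheme are hypothetical, not from any real backup product:

```python
import hashlib
from collections import Counter

def store(snapshot: bytes, copies: int = 3) -> list[bytes]:
    """Write N independent replicas of the snapshot."""
    return [bytes(snapshot) for _ in range(copies)]

def recover(replicas: list[bytes]) -> bytes:
    """Majority vote: return the snapshot most replicas agree on."""
    votes = Counter(hashlib.sha256(r).digest() for r in replicas)
    winner, _ = votes.most_common(1)[0]
    for r in replicas:
        if hashlib.sha256(r).digest() == winner:
            return r
    raise IOError("no intact replica")

replicas = store(b"engrams...", copies=3)
replicas[1] = b"corrupted"              # simulate one failed drive
assert recover(replicas) == b"engrams..."
```

    Real systems would use parity or erasure coding rather than whole copies, but the principle of detecting and outvoting a corrupted replica is the same.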
  • Re:"The wealthy... (Score:3, Interesting)

    by EricTheGreen (223110) on Monday May 23, 2005 @02:05PM (#12614580) Homepage
    Is this comment supposed to be an opinion or literature?
  • by solios (53048) on Monday May 23, 2005 @02:06PM (#12614600) Homepage
    Max Headroom dealt with this, both as an overall concept and as a specific episode of the TV series.

    Personally, I think it would be handy - dupe the skillset into a ROM construct and cut the sucker loose on photoshop. He can sit on IRC and CG my comic pages while I write and ink the sucker. Perfect division of labor, creatively speaking.... but I'm one of those creative types who needs multiple instances of himself, not collaborators clouding the idea stream. :P
  • Re:It's a copy (Score:1, Interesting)

    by nido (102070) <nido56@yaho o . com> on Monday May 23, 2005 @02:07PM (#12614605) Homepage
    Consciousness is so poorly understood...

    You mean that it's poorly understood by most people. Every now and then individuals get a clue: the Buddha, various monks following in his footsteps, Jesus, etc. We've seen a rash of people in 20th-century America who came to understand what it means to be human: Edgar Cayce, Jose Silva, etc. I'd advise looking into Robert Monroe's three books (Journeys Out of the Body, Far Journeys, Ultimate Journey; the last two more so than the first). Robert even started his own private research institute for studying consciousness: The Monroe Institute. Think about it: what if it were true that we are all more than our physical bodies?

    Let the materialistic flamefests begin... :)
  • Re:download? (Score:4, Interesting)

    by HTH NE1 (675604) on Monday May 23, 2005 @02:07PM (#12614612)
    You have no more (or less) facility in your own brain for initiating download than upload.

    But you do, by manipulating a remote device to pull the data from your brain. Your brain does not need to push. Upload and download are just fancy terms for the pushing and pulling of data from one system to another.

    The extropians have been using the term "upload" for many years, as has science fiction. It's based on standard use of computer industry terminology.

    Actually, it's based more on a misunderstanding of computer industry terminology. The lesser/greater-system explanation originated with people who didn't understand upload/download and were trying to explain it, poorly, to laymen. At the time it looked correct, because those were the common types of systems between which uploading and downloading were performed, but it was never the nature nor the capacity of the machines involved that determined the terms.

    FTP's GET is always a download and its PUT is always an upload, even if the FTP server was on your laptop and you're directing it from a mainframe, and even if that direction is through a Telnet connection from your laptop.

    Thus it is also incorrect to say the RIAA and MPAA are only going after "uploaders". Everyone on P2P is downloading, pulling data toward themselves. They are going after servers, just as the ATF would go after someone who put free alcohol, tobacco, or firearms out for unregulated taking by any member of the public. They aren't pushing those products to people, only making them available to be taken in a manner contrary to law.
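
    The point that the verb names the data's direction relative to the initiator, not the size of either machine, can be made mechanical. A toy illustration (the function is hypothetical, not part of any real FTP client):

```python
def direction(verb: str) -> str:
    """FTP-style semantics: GET pulls data toward the initiator
    (a download); PUT pushes data away from it (an upload).
    Machine size and capacity never enter into it."""
    return {"GET": "download", "PUT": "upload"}[verb.upper()]

# True whether the server is a mainframe or your own laptop:
assert direction("get") == "download"
assert direction("put") == "upload"
```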
  • by MOBE2001 (263700) on Monday May 23, 2005 @02:17PM (#12614798) Homepage Journal
    Consciousness is so poorly understood that I don't think you can even say that for sure.

    Yep. What if you made a copy of yourself: which one is the real you? What if the copy decides you're the one who's not the real you and should be destroyed?

    This whole business of uploading the mind onto a computer is so much unmitigated crackpottery. Star Trek voodoo science, that's all. You can copy the brain, but you can't copy consciousness. For one thing, you don't know what it is. Until you do, you're up shit creek. And when you find out what it is, you'll realize that it can be neither copied, nor created, nor destroyed.

    Having said that, if someone found a way to copy the brain and move your unique consciousness into the copy, now that would be cool!
  • Re:It's a copy (Score:5, Interesting)

    by EnderWigginsXenocide (852478) on Monday May 23, 2005 @02:30PM (#12614997) Homepage
    After much searching, I'm thinking of James Patrick Kelly and his story "Think Like a Dinosaur", first published in August of 1997. A quoted summary: ""Think Like a Dinosaur" uses two props of the genre, aliens and matter transmitters, to set up the narrator's moral dilemma. Michael Burr works for the hanen, an alien race resembling dinosaurs: he guides infrequent human star-travellers through the 'migration' process. In the course of the transfer, the humans are copied, one of the copies travelling on to their stellar destination, while the other is exterminated before regaining consciousness - the hanen way of thinking (hence the story's title) allows no sentimentality over the eradication of the copy left behind. When Burr releases a traveller from a malfunctioning device, only to discover that the transfer has actually been effected, he must end the life of the copy he can only view as human... In this story, the technology is not cutting edge but a device of artistic licence, which aficionados of Hard SF might deplore - a clever method of achieving an artistic end: the unflinching examination of the human psyche, and Kelly does it brilliantly."
  • Molecular Nanotech gives the ability to place every atom where it is needed
    Certainly. But it doesn't guarantee that it stays there, or that it moves where and when it is supposed to. Errors can still arise due to thermal energy, external radiation sources, contamination, etc.
    or rates so low that through redundancy failure can be avoided.
    Use of redundancy to decrease failure rate does not require nanotechnology.

    Nanotechnology has many potential advantages, but a zero failure rate is not one of them.

  • Re:subtle points (Score:3, Interesting)

    by ShieldWolf (20476) <> on Monday May 23, 2005 @02:52PM (#12615364)
    Some have moved on, but only to materialism, which has _0_ ability to explain the experience of consciousness.

    Let's do a little Gedanken experiment, shall we?

    Let's say your consciousness IS reducible to bits and bytes, and you download it. Once you have downloaded your brain, there is NOTHING stopping a third party from then copying the result to a SECOND computer. Can two entities share a consciousness and still each be just like 'you'? Any answer other than 'No' is clearly absurd, so something went wrong along the way during our experiment: namely, our assumption that unique human consciousness is machine-reducible.
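
    There is a mundane software analogue to this two-computer scenario: a deep copy is bit-identical at the instant of copying but is a separate object, and the two diverge independently afterward; nothing is "shared". A sketch, purely as an analogy:

```python
import copy

original = {"memories": ["first day of school"], "mood": "calm"}
replica = copy.deepcopy(original)

assert replica == original          # identical state at copy time
assert replica is not original      # ...but a distinct entity

# After the copy, each instance evolves on its own:
replica["memories"].append("waking up as the copy")
assert original["memories"] == ["first day of school"]  # no shared state
```

    Whether divergence-after-copy counts as two people, one person, or neither is exactly the question the thread is arguing about; the code only shows that "sharing" is not required for the copy to start out identical.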
  • Re:It's a copy (Score:5, Interesting)

    by lobsterGun (415085) on Monday May 23, 2005 @03:03PM (#12615530)
    How about this:

    A person has a special chip inserted in their skull that records their brain state over the course of their lifetime. The chip is wirelessly connected to the backup system and keeps it constantly updated. Would that be a valid backup?

    Or how about this:

    Over the course of a lifetime, a person has various parts of their brain replaced/augmented with technology.

    Some of the implants replaced damaged brain functions (damage from a stroke).

    Some augment the senses (heads up display).

    Some add new capability (robo-telepathy).

    Eventually, the person replaces their entire brain to the point that they no longer need a body and can exist in a virtual world.

    When do they cease to be human?
    Is it when the last brain cell is replaced?
    Is it when the first one gets replaced?
    Is it somewhere in the middle?
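
    The recording-chip idea in the first scenario is, in software terms, ordinary incremental snapshotting: keep a baseline plus an append-only journal of changes, and replay the journal to reconstruct the latest state. A toy sketch with invented names:

```python
class BrainBackup:
    """Baseline snapshot plus an append-only journal of deltas."""
    def __init__(self, baseline: dict):
        self.baseline = dict(baseline)
        self.journal = []           # list of (key, value) changes

    def record(self, key, value):
        """Append one observed change to the journal."""
        self.journal.append((key, value))

    def restore(self) -> dict:
        """Replay the journal over the baseline."""
        state = dict(self.baseline)
        for key, value in self.journal:
            state[key] = value
        return state

backup = BrainBackup({"age": 30, "skill": "novice"})
backup.record("skill", "expert")
backup.record("age", 75)
assert backup.restore() == {"age": 75, "skill": "expert"}
```

    The open question in the comment, whether such a log is the person or merely a record of them, is untouched by the mechanics.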
  • No, it's the Heechee (Score:1, Interesting)

    by Anonymous Coward on Monday May 23, 2005 @03:05PM (#12615563)
    Frederik Pohl used the idea before Gibson, in the Gateway trilogy. (Neuromancer is from 1984; Gateway is from 1976, Beyond the Blue Event Horizon from 1980.) BTW, "Gateway" won the Hugo, the Nebula, and the Locus Award.

  • Re:It's a copy (Score:2, Interesting)

    by egomaniac (105476) on Monday May 23, 2005 @03:07PM (#12615599) Homepage
    So your opinion is that psychics and two-thousand-year-old mythological characters better understand the nature of consciousness than modern neuroscientists?

    Pray tell, what great scientific achievements did Edgar Cayce contribute to the world? I'll even let all of his missed and failed predictions slide, such as his belief that we would discover the death ray used in Atlantis back in 1958. I just want to know how he contributed to our understanding of consciousness.

    As for Jesus, he was a fictional character. There is no more reason to believe that Jesus really existed than that Zeus, Achilles, or Hercules really existed. On top of that, none of the sayings attributed to him in any way contributes to our understanding of consciousness.

    Psychics and fictional characters didn't invent the light bulb or the rocket ship. They aren't going to decipher the mysteries of the conscious mind either. We got to where we are today because of science, not religion.
  • by Anonymous Coward on Monday May 23, 2005 @03:09PM (#12615624)
    The living body (including the brain) is not static over any period of time, so a perfect copy is impossible.

    In addition the living brain/body system is not linear, which means that even slight differences in the initial states will likely lead to radically different results over time.

    In addition, the act of observing (copying) the matter of the brain resolves and therefore alters its quantum state.

    Identical twins are the closest thing to copying a computer program, because they start with identical sets of static source code. While many identical twins share similarities, they are demonstrably not the same person.
  • Re:subtle points (Score:3, Interesting)

    by Hortensia Patel (101296) on Monday May 23, 2005 @03:10PM (#12615639)
    Can two entities share a conciousness and still be just like 'you'? Any answer other than 'No' is clearly absurd so something went wrong along the way during our experiment

    Good to see you're approaching the question with an open mind.

    Personally I see nothing at all absurd about multiple copies of a conscious mind-state, each of which is (initially at least) just like me. There's no "sharing" going on.

    As the grandparent poster says, I don't see how you can deny such a possibility without lapsing into Cartesian dualism. Did you actually have an argument, or were you just hoping that we'd take your "clearly absurd" on faith?
  • by Anonymous Coward on Monday May 23, 2005 @03:12PM (#12615674)
    Anyone who assumes a copy of oneself is as good as eternal life assumes a soul.

    This is a simple logical test: say you manage to make a copy of yourself. Would you then have no fear shooting yourself in the head? Do you believe your consciousness would be transferred to the copy at the point of your death? That could only happen if there was some immaterial link between you and your copy. A soul, if you will.

    How would you even go about downloading your brain to hardware, no matter how sophisticated? Consciousness is not something separable from the body. You can't just plug a wire in your brain and have your consciousness dance along that wire to a new host. The only way I can see you might even possibly do it is, as suggested, to replace the organics gradually with something else. Would you still be you after that? Who's to say...

    All in all, if you create a copy of yourself and can't sense what the copy is feeling, that is, don't suddenly have some kind of telepathic link with that copy, I wouldn't trust it to be a path to immortality for individuals. Mankind as a whole, maybe, if you accept that machines can be children as much as babies.
  • Re:Haha (Score:2, Interesting)

    by fullpunk (518331) on Monday May 23, 2005 @03:39PM (#12616035)
    The new PlayStation is 1 per cent as powerful as a human brain
    Come on, where does that come from? And even if it was possible, you'd still have to find a person who is willing to give you his body. I think this man has seen/read too many sci-fi movies/books.
    If I'm on an airplane I want the computer to be more terrified of crashing than I am so it does everything to stay in the air until it's supposed to be on the ground.
    Not me, I prefer the computer to do what it is programmed for instead of being frozen by fear :)
  • Re:download? (Score:2, Interesting)

    by Anonymous Coward on Monday May 23, 2005 @03:43PM (#12616103)
    It is a convention based on "up" being "to the (conceptually) bigger system".

    Upload means to transmit from you to elsewhere.
    Download means to receive from elsewhere to you.

    Upload is to put.
    Download is to get.

    There is no "convention" here; the meanings are quite clear.
  • Re:It's a copy (Score:3, Interesting)

    by |/|/||| (179020) on Monday May 23, 2005 @03:47PM (#12616162)
    How would he/I know which was the copy?
    Well, I assume that the copy would either be limited to a videocamera, microphone, and speakers as an interface with the outside world, or else would be inside of a simulation. Most likely you would be able to tell it was a simulation, even if it was a very good one. If you were actually able to make a perfect copy of the entire human, I would suggest affixing a post-it note to the door of the "construction" chamber.
    Your use of the word "just" reflects an unwarranted value judgement.
    I didn't intend to make a value judgement. I used "just" because seeing humans as machines is a generalization. Seeing them as humans is more specific. You also make a good point that, depending on how you define "machine," it could include everything in the universe. The generalization that I was pointing out, however, is that animals are made up of very complex molecular hardware. These act like traditional mechanical "machines" in many ways - basically riding atop chemistry and physics in order to perform complex functions.

  • Re:He's wrong. (Score:5, Interesting)

    by Eric Smith (4379) * on Monday May 23, 2005 @03:49PM (#12616181) Homepage Journal
    How do you know when you wake up in the morning that you are really the same person that went to bed the previous night? You don't have continuity of consciousness through the entire night. Maybe the "you" of yesterday died, and you are just a copy; how would you know? ("I'd know the difference." "No you wouldn't, you'd be programmed not to.")

    If you went to the "uploading clinic", and they put you under a general anaesthetic, uploaded you, and terminated the leftover hunk of meat, how would that be different than simply going to sleep and waking up (albeit in a new "body")?

    As you said,

    This is not a subtle point.

    Anyone who cannot grasp this either hasn't thought deeply about a subject, or is an idiot.

  • by denissmith (31123) on Monday May 23, 2005 @03:50PM (#12616200)
    The interesting question, to me, has always been what will happen when we can extend life semi-indefinitely. How does society determine who gets to live? If he is correct that money will be the determinant, how long can that society last? I don't see roughly 6 billion people docilely going to their death when real alternatives exist.
  • by hackus (159037) on Monday May 23, 2005 @04:01PM (#12616337) Homepage
    Sorry, you only get one license and it isn't transferable.

    Besides, how many quacks have been saying this sort of thing over the past 50 years???

    I think Mother Nature's copy protection is quite effective. Although I have no doubt we will be able to genetically modify the human race to extend lifespan significantly (i.e. the wealthy and the powerful, that is...), I doubt forever.

    Thinking we can build a machine to do it speaks volumes about our ignorance of how the brain really works and whether it truly is the part that provides "conscious" thought.

    Note, I am not sure if we REALLY understand the difference between conscious thought and intelligence.

    Do the two require each other for example?

    Exactly what IS UNconscious thought, if so?

    We have lots of crackpot organizations right now that measure intelligence, like MENSA for example.

    I am not even sure we know what intelligence is let alone how to measure it.

    I have a PhD sitting next to me who I think is clueless half the time, and I do not find him intelligent. Meanwhile, the guy who used to do tattoos for people has written genuinely interesting and useful software for our customers and is self-taught. His work pays for the overinflated EGO and salary of the PhD guy.


    So what is intelligence?

    I think it is any organism's ability to modify its environment to an extreme (i.e. make its own environment to sustain itself even when the outside environment would kill it.)

    So if you build a house in response to winter, or air conditioning units in response to heat I would consider that intelligent.
    (if you move into outer space and do it, you're not just intelligent, you're going to likely live forever...)

    However, I do not think you need to be conscious to do these things and explore the Universe, simply intelligent.

    Sort of like the creatures in the new War of the Worlds remake.

  • by exp(pi*sqrt(163)) (613870) on Monday May 23, 2005 @04:07PM (#12616407) Journal
    We're all copies. Of the hundred trillion cells in our body a large proportion are replaced, or have a large proportion of their atoms replaced, over time. Even if this weren't the case, I can't see what harm would be done by replacing all of the atoms in someone's body with identical ones. Because of this I find it hard to put value on the specific atoms that make up my body but instead I value their functionality. If that functionality can be (destructively) reproduced by a machine I'm happy to walk into that machine. If other people aren't, then they can choose not to use it. But someone claiming to be me will thumb my nose at them from its shiny new robot body at their funerals.
  • by Tony (765) on Monday May 23, 2005 @04:22PM (#12616600) Journal
    Don't forget "The Ophiuchi Hotline," by John Varley, 1977, in which people downloaded their brains for backup, and if they died, dumped it back into a clone.

    Very good book, almost as good as "Gateway," and *way* better than "Neuromancer."
  • Re:It's a copy (Score:3, Interesting)

    by alw53 (702722) on Monday May 23, 2005 @04:22PM (#12616618)
    I think in order to make any progress on this issue, you're going to have to stop using the word "is".


  • by orfanotna (813716) on Monday May 23, 2005 @04:27PM (#12616715)
    When you go to sleep, the electrical activity in your brain doesn't stop.

    However, I've read that in certain types of brain surgery, all electrical activity in the brain must be stopped for some period of time, and then "restarted". The person thus loses all short-term memory, but keeps the long-term, because that isn't dependent on continuous electrical activity. When that person wakes up, is he still considered the same old person, or just a "replica"?

    by Eric Smith (4379) * on Monday May 23, 2005 @04:30PM (#12616757) Homepage Journal
    Your computer doesn't think
    It hasn't been programmed to do so. Are you asserting that it is impossible for a computer to think? If so, what is your basis for making such an assertion?
    your brain doesn't compute.
    I suspect that you can win a Nobel prize for proving that assertion.
  • by NoImNotNineVolt (832851) on Monday May 23, 2005 @04:34PM (#12616795) Homepage
    See, that's not entirely accurate. As Raymond Kurzweil has pointed out, technology advances at a very predictable rate. His paper "The Law of Accelerating Returns" does a good job documenting evidence of this fact spanning the entire last -century-.

    The consistent exponential trend observed, when extrapolated, is what the claims of these futurologists stem from. They're not picking a wild fantasy and claiming they know when it will come to pass. They're making reasonable predictions based on consistent observed trends.

    Also addressed in this paper, coincidentally, is the idea of uploading human consciousness, along with other common themes of futurism.
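
    That extrapolation step can be sketched numerically. The data points below are made up for illustration (they are not Kurzweil's actual dataset); the method is just a log-linear least-squares fit, extended forward:

    ```python
    import math

    # Hypothetical data: (year, device count), roughly exponential growth.
    # These numbers are illustrative only, not Kurzweil's measurements.
    data = [(1971, 2.3e3), (1981, 3.0e5), (1991, 1.2e6), (2001, 4.2e7)]

    # Fit log(count) = a + b*year by ordinary least squares.
    xs = [year for year, _ in data]
    ys = [math.log(count) for _, count in data]
    n = len(data)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    a = ybar - b * xbar

    def extrapolate(year):
        """Predicted count if the exponential trend simply continues."""
        return math.exp(a + b * year)

    doubling_time = math.log(2) / b  # years per doubling under the fitted trend
    ```

    The futurologists' move is exactly the last line: the fitted trend, not a wild guess, is what gets projected decades ahead. Whether the trend actually holds that long is the part that's open to argument.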
  • by snooo53 (663796) * on Monday May 23, 2005 @04:56PM (#12617044) Journal
    My thinking is that we could avoid this whole problem of whether consciousness is transferred to the copy by simply doing the process gradually.

    Replace the neurons, or areas of the brain one at a time, by directly connecting them to the rest of the functioning brain. The remaining part would treat the new electronic parts the same as the old one, and consciousness would remain intact.

    If you wanted to make a copy, then perhaps the new parts could be connected in parallel with the old parts. When your brain signals a group of neurons, it also signals the electronic copy. Eventually you've got an entire brain connected in parallel. Disconnect that and you have two functioning brains still.

    Of course with the millions of neurons this gradual replacement would have to happen pretty fast, but I would think as long as you give the brain the time to communicate with the new parts and adjust, you could do it successfully.
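
    A toy model of that incremental scheme (everything here is a stand-in: components play the role of neuron groups, and "behaviour" is just the sum of their outputs):

    ```python
    # Replace each "biological" component with an "electronic" one that has
    # identical input/output behaviour, verifying after every single swap
    # that the system as a whole responds exactly as before.

    def make_bio_component(weight):
        return lambda x: weight * x

    def make_electronic_component(weight):
        # Different substrate, same input/output behaviour.
        return lambda x: weight * x

    weights = [2, 3, 5, 7]
    brain = [make_bio_component(w) for w in weights]

    def respond(brain, stimulus):
        # The system's overall behaviour: sum of all component outputs.
        return sum(part(stimulus) for part in brain)

    baseline = respond(brain, 1)

    for i, w in enumerate(weights):
        brain[i] = make_electronic_component(w)  # swap one component
        assert respond(brain, 1) == baseline     # behaviour intact at each step
    ```

    The open question the thread is arguing about is, of course, whether preserving behaviour at every step is the same thing as preserving the person.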

  • Re:It's a copy (Score:2, Interesting)

    by irefay (785141) on Monday May 23, 2005 @04:59PM (#12617072) Journal
    I disagree. As of right now, we do not understand how memory is even stored. After a time the synapses' distance grows and the neurons are no longer used for a specific memory. Thus the memory should no longer be accessible... BUT IT IS. What happens? We don't know. It could move to a different part of the brain, but then again we have found no "instructions" on where to move it.
  • Re:It's a copy (Score:3, Interesting)

    by Ralph Yarro (704772) on Monday May 23, 2005 @05:04PM (#12617113) Homepage
    The question is, is it deterministic? There are theories of the brain that postulate quantum effects as being important, i.e. in the folding of proteins and DNA in neurons. The romantic in me hopes this to be true, because I can't do the math.

    There are three basic models: deterministic, random and probabilistic.

    The three can be described by reference to a theoretical ability to form a perfect simulation of a person at a precise moment in time, including environmental and sensory data. In effect this is used to replay a moment of decision over and over again.

    Deterministic: the person behaves in exactly the same way each time the moment is replayed.

    Random: the person behaves in a random manner each time the moment is replayed - he could do anything.

    Probabilistic: the person's behaviour changes each time but within a restricted scope e.g. if we could replay my replying to your post then we might find that 90% of the time I write something along the lines of this post, 9% I write "me too!" and 1% I post about how BSD is dying.

    The random model doesn't match what we perceive as reality. I don't believe there was as much chance of me trying to eat the keyboard as of typing on it. So that leaves deterministic and probabilistic. Either of these could be valid.

    Of the two, I think deterministic is far more emotionally satisfying. That doesn't make it true of course but I'd much rather think that there is something that is ME with a strong sense of identity that in situation X having had day Y and being worried about factors Z I will lose my temper. The less well defined probabilistic alternative me that if the situation was replayed would sometimes get angry and sometimes not is far less appealing.

    These emotional preferences don't change reality of course, maybe we are probabilistic in nature, but I really can't understand why anyone would prefer to think of themselves that way.
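
    The three replay models are easy to state as code. A minimal sketch, using the hypothetical 90% / 9% / 1% split from the post:

    ```python
    import random

    # Replaying the same "moment of decision" many times under each model.

    def deterministic_replay(_rng):
        return "thoughtful reply"                 # same outcome every time

    def random_replay(rng):
        # Anything at all, with equal likelihood.
        return rng.choice(["thoughtful reply", "me too!",
                           "BSD is dying", "eat the keyboard"])

    def probabilistic_replay(rng):
        # Restricted scope with fixed probabilities (90% / 9% / 1%).
        r = rng.random()
        if r < 0.90:
            return "thoughtful reply"
        elif r < 0.99:
            return "me too!"
        else:
            return "BSD is dying"

    rng = random.Random(0)
    outcomes = [probabilistic_replay(rng) for _ in range(10_000)]
    freq = {o: outcomes.count(o) / len(outcomes) for o in set(outcomes)}
    # freq["thoughtful reply"] comes out close to 0.90 over many replays.
    ```

    The point of the sketch: under the random model "eat the keyboard" is as likely as anything else, which is exactly why that model fails the common-sense test in the post.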
  • Re:Yeah, whatever. (Score:2, Interesting)

    by Eukariote (881204) on Monday May 23, 2005 @05:09PM (#12617175)
    But how would you download the chemical state? [...] And once you have the chemical composition and the electrical composition, you ALSO need to know the wiring - the wiring between the neurons is unique to an individual, and isn't going to be easy to determine.

    There is a shortcut around the problems you mention: freeze the brain and scan it in destructively, sub-micron layer by sub-micron layer. For this to work two problems need to be solved:

    1. Freezing the brain without destroying its microscopic structure to the extent that the connectivity and thresholds of neurons can no longer be inferred. A simple freezing regimen will not work because of ice crystal formation.

    2. Making the scanning and required storage economical. With current tools (AFM, STM, electron microscopy) you'd do well to accurately map a single neuron, never mind 100e9 neurons.

    The good news is that #1 is pretty low tech and might be made feasible with current technology while #2 need only become practical eventually, because while frozen you've got all the time in the world.
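
    For a sense of scale on #2, a back-of-envelope estimate (every parameter here is a rough assumption, not a measured figure):

    ```python
    # Storage needed just for the connectivity map of a scanned brain.
    # All values are rough assumptions for illustration.

    NEURONS = 100e9              # ~100 billion neurons, as in the post
    SYNAPSES_PER_NEURON = 1e4    # commonly cited rough average
    BYTES_PER_SYNAPSE = 8        # assume: target id plus weight/threshold

    total_bytes = NEURONS * SYNAPSES_PER_NEURON * BYTES_PER_SYNAPSE
    petabytes = total_bytes / 1e15
    print(f"Connectivity map alone: ~{petabytes:.0f} PB")  # → ~8 PB
    ```

    Petabytes are only the storage side; the scanning throughput needed to fill them is the part that, as noted, need only become practical eventually.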

  • Re:He's wrong. (Score:2, Interesting)

    by esampson (223745) on Monday May 23, 2005 @05:49PM (#12617570) Homepage
    If you went to the "uploading clinic", and they put you under a general anaesthetic, uploaded you, and terminated the leftover hunk of meat, how would that be different than simply going to sleep and waking up (albeit in a new "body")?

    Well, I suppose for starters one difference would be the fact that I was dead.

    Just because a computer has a copy of my memories that doesn't mean I am now inside that computer. It just means there is a copy of me in that computer. Any clever games that are played so that the copy is unaware of the fact that it's a copy don't alter the fact that it's a copy.

    The proof of this can be seen in the fact that if I make a photocopy of a page in a book the photocopy is, obviously, unaware that it is a copy. Does that mean that what came out of the photocopier is now the original simply because it doesn't know any better? Of course not.

    The idea that somehow a copy of your mind is actually you can be easily disproven with the following example: Your engrams are uploaded into a clone. Because of the process the clone is completely unaware of the fact that it is a clone, and it has been told that you, in fact, are the clone. As the two of you leave, the clone is killed by a speeding bus. According to your logic, you, rather than some copy of you, have just died, which is of course complete nonsense.

  • Re:He's wrong. (Score:3, Interesting)

    by Squiffy (242681) on Monday May 23, 2005 @07:23PM (#12618495) Homepage
    You seem to accept the notion that the substrate doesn't matter -- that both the original human and the machine-borne copy are equally "alive" and identify themselves as the same person. On the other hand, you also seem to believe that the original consciousness does not continue in the machine -- that even though the copy does experience the illusion of continuity, somehow the original does not experience any such thing.

    This leads me to ask, why not? If you accept that information encoded as various memories completely defines identity, what part of the original person's identity does not survive?

    If something else is necessary to define identity in addition to memories, what is it? If it is measurable, why can't it be transferred to the machine? If it is not measurable, how can you be so sure that it exists?
