

Bill Joy On His Own Future, And The World's 273
geeber writes "There is an interesting interview with Bill Joy in the current edition of The New York Times Magazine. He is still obsessed with what he calls a 'civilization-changing event' brought on by the fast pace of research into dangerous technologies such as genetic engineering and nanotechnology. Another interesting tidbit: he has flirted with the idea of going to work for Google."
Get your tin foil hats here (Score:5, Informative)
here [nytimes.com].
Re:Get your tin foil hats here (Score:2)
No boogedy-boogedy NYT registration required
Cookies don't bother me, it's that damn
Re:that victoria secret ad (Score:5, Funny)
Re:that victoria secret ad (Score:2, Informative)
http://spe.atdmt.com/b/AANYCVCSTVST/SAS04_P2
Who wouldn't? (Score:5, Funny)
"Another interesting tidbit : he has flirted with the idea of going to work for Google."
Really now, who these days hasn't thought about that? :D
Who modded this up? (Score:5, Insightful)
Re:Who modded this up? (Score:3, Insightful)
Re:Who wouldn't? (Score:2)
Bill Joy??!!? (Score:3, Informative)
Sorry! The wiki is experiencing some technical difficulties, and cannot contact the database server
Oh well, never mind; instead, click here for a Google cache of Bill's page on Wikipedia [google.co.uk]
Re:Bill Joy??!!? (Score:2, Informative)
Wikipedia is working. (Score:2, Interesting)
Dangerous technologies (Score:2, Interesting)
Re:Dangerous technologies (Score:2, Interesting)
Re:Dangerous technologies (Score:2)
But I fear random mutation/genetic changes more than engineered ones. In 1918 the Spanish flu killed 20 million (probably more t
Re:Dangerous technologies (Score:4, Funny)
Re:Dangerous technologies (Score:3, Insightful)
Now I understand why people just blog these days. You get away from this type of mediocrity.
Re:Dangerous technologies (Score:5, Insightful)
I'm sorry, but that would sound like the end of at least interdisciplinary science, if not science itself. I think the rebuttal that you labeled "simplistic" is pretty accurate. Just because the results of science can be used for destructive aims is not a reason to return to the ages of hidden knowledge.
Re:Dangerous technologies (Score:3, Insightful)
"anywhere" includes slashdot, so i wrote it because noone wrote that already :)
And yes, is simplistic, but so still is condemning technologies because it could have a (ok, in this case very) bad uses, and closing the door on any kind of good uses, including avoiding or mitigating disasters even bigger than the worst that they could possibly make. If we go to the worst case scenario, when all the bad things will happen
Re:Dangerous technologies (Score:5, Insightful)
He accomplished a lot in *programming*, nothing else. See, that's the problem. Once someone gets famous for doing X, they think they can speak authoritatively on all subjects. But they can't -- they can just babble, just as Einstein did about socialism and pacifism, and Bill Joy is doing about science. While we can all hold opinions on everything, and even babble about them on Usenet and Slashdot (or indeed on blogs, the most self-indulgent waste of time possible), it would be considerably more productive if people limited their interactions with journalists to the subjects they have actually been educated in.
Re:Dangerous technologies (Score:2)
Re:Dangerous technologies (Score:2)
What, then, does someone have to do to gain your permission to talk about socialism or pacifism? Where does this wonderful "education" come from that allows you to be infallible in subjects like socialism and pacifism? Why should journalists only talk to these infallible experts?
I'd rather hear Einstein's (and to a lesser degree Joy's) "babblings" than J. Random Blowhard on Slashdot.
Re:Dangerous technologies (Score:2)
For starters, in the case of socialism a degree in economics and in the case of pacifism a degree in international relations. Of course, given this, I'd still take the opinions of a student fresh out of school with a grain of salt -- experienced economists and diplomats would be more convincing.
I'd rather hear Einstein's (and to a lesser degree Joy's) "babblings" than J. Random Blowhard on Slashdot.
And som
Re:Dangerous technologies (Score:5, Insightful)
This problem exists, but it does not apply in this case.
See, I'd agree if the interview was with Britney or Tiger - their opinion on the future counts for nothing. But you're talking about Bill Joy. When a deservedly prominent computer scientist - or, for that matter, biologist, economist, etc. - talks about the future, I'll listen.
Re:Dangerous technologies (Score:3, Insightful)
How did he babble? Remember that Einstein grew up between the world wars. An American WWI veteran said: "The Germans didn't win that war but neither did we. Only the war won that war." It was a house of cards that fell over, countries declared war because of their treaties and rarely because their own direct interests were at stake. And even the interests that were at stake, were more those of the elite than of the people. In the end, the
Re:Dangerous technologies (Score:2, Insightful)
Snigger. Yeah. Nothing will save the world quite like furiously pounding out endless blog entries.
Moderation, meta-mod, meta-meta mod? (Score:2, Offtopic)
But I have to agree; I often find that the tangents that get discussed are irrelevan
Re:Dangerous technologies (Score:2, Insightful)
Re:Dangerous technologies (Score:5, Insightful)
Yes, a knife can be useful but also dangerous.
Explosives can be useful but very dangerous too. In the wrong hands they're definitely more dangerous than knives.
Nuclear power can be useful, but in general it's more dangerous (in bomb form) than knives or explosives. It is, in fact, the first technology with which the human race could have committed suicide.
To me it seems that, to Joy, genetic engineering and nanotechnology are one more order of magnitude more dangerous than atomic power or any other existing human technology. Why? Because of the potential for self-replication. Atomic bombs certainly kill lots of people, but they cannot self-replicate and run out of our control.
In the end it boils down to risk = probability * consequences. Even if the probability of us becoming victims of all-conquering grey nanogoo is vanishingly small, are the consequences so disastrous that the risk is eventually too high for us to even experiment with the idea?
Incidentally, the developers of the hydrogen bomb had to wrestle with the same equation: what if we lit up a hydrogen bomb in our atmosphere and, against all our calculations and predictions, nitrogen-nitrogen fusion began and our entire atmosphere was consumed in one huge fusion burn?
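To make the risk = probability * consequences framing above concrete, here is a minimal sketch; the probabilities and loss figures are made-up placeholders for illustration, not real estimates of anything.

# Minimal sketch of risk = probability * consequences.
# The numbers below are made-up placeholders, not real estimates.

def expected_loss(probability, consequences):
    """Risk as the product of probability and consequences."""
    return probability * consequences

# A frequent, small-consequence accident vs. a vanishingly rare,
# civilization-scale one (consequences in arbitrary "lives" units).
ordinary_accident = expected_loss(1e-3, 1e2)   # 0.1
grey_goo_outbreak = expected_loss(1e-9, 6e9)   # 6.0

print(ordinary_accident, grey_goo_outbreak)

Even with a tiny probability, the enormous consequence term can dominate the product, which is exactly the worry Joy is raising.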
Re:Dangerous technologies (Score:3, Funny)
Re:Dangerous technologies (Score:2)
With a knife you can kill a few people. Fine. With a nuclear bomb, you can kill many, but YOU CAN CONTROL it! (Not everyone has access to, or the capability to build, a bomb.) With every nut-case in the world being able to engineer viruses on their personal computers, you have the capability to wipe out the world... but this time, there is no control.
The danger is different now (Score:2, Insightful)
Back when cavemen still said "ooga booga", maybe somebody figured out how to sharpen obsidian into a knife, and the other Neanderthals spread the love.
Thousands of years later we had guys like L. da Vinci and then B. Franklin, Renaissance men who dabbled in science for the joy of their own genius.
Now science is industrial (and so is science education, IMO). Much of it is driven by the search for profit (biotech) or
James Watson on Gray Ooze... (Score:5, Interesting)
Also, one forgets that cells have been evolving against this possibility for billions of years. If a "Gray Ooze" were possible, it would very likely have appeared on its own. As it is, cells and multicellular organisms have extremely sophisticated means of defense. While it will be possible to create a disease that kills millions or billions of humans, I worry far more about nuclear war.
Re:James Watson on Gray Ooze... (Score:2, Informative)
This possibility has been dealt with at length in the novel, "Kalki" by Gore Vidal.
Re:James Watson on Gray Ooze... (Score:2, Informative)
This possibility has been dealt with at length in the novel, "Kalki" by Gore Vidal.
Are you seriously citing fiction by Gore Vidal as a reference on the subject?
Re:James Watson on Gray Ooze... (Score:3, Interesting)
AIDS, bubonic plague, and I'm sure dozens of others I don't know about either have killed or are currently killing millions of people. Barring medical breakthroughs, AIDS will kill every one of the 40 million people currently infected with it. The bubonic plague wiped out a third of Europe; today, with increased travel, it could be a third of the world.
Ewan
Re:James Watson on Gray Ooze... (Score:2, Informative)
Minor technical nit: One cannot be infected with AIDS. One is infected with HIV. AIDS is a syndrome generally associated with HIV infection, but HIV infection is not a surefire predictor of AIDS.
Re:James Watson on Gray Ooze... (Score:5, Informative)
Devil's advocate (Score:3, Interesting)
Re:James Watson on Gray Ooze... (Score:2)
Yet the Bomb WAS a human-created civilization-changing event that has nearly done us in on a few occasions, and may still do so.
As to Watson's assertion: chemical and biological weapons *do* exist. So why hasn't some predator evolved mustard gas jets to kill us off and take our food? Because evolution doesn't work all
I too flirted with idea of working for Google. (Score:4, Funny)
Re:I too flirted with idea of working for Google. (Score:2)
We managed to survive... (Score:5, Interesting)
I understand his concern over these new branches of study, and it is of *dire* importance that we tread lightly and remember our lessons in the areas of genetic modification and nanotechnology, while still moving forward. I'm no Luddite, but I am always wary and respectful of the power of the human mind.
There's a difference (Score:4, Interesting)
The knife enables you to kill one person at a time.
A gun several.
Bombs - hundreds
Nukes are controlled by states, not individuals - but one fear behind the current war on terror is that this will change.
Nano weapons...?
Weapons with gigantic destructive power might be very easy to synthesize in only 20 or 30 years - so imagine this: how do you run a world where every individual has the power to wipe out everyone else? There is no way around it - this is not like the right to bear arms - you simply have to ban the technology and pretty much wipe out everyone who seeks to acquire it, like an immune system killing viruses, while finding some way to lace the environment with 'antigens' of some kind that can automatically 'contain' any 'outbreaks'.
There has to be a point at which a hugely destructive technology becomes so cheap and widely available that it cannot be allowed to proliferate, no matter that it might have beneficial uses.
MOD PARENT UP. (Score:3, Interesting)
Let's have some thoughts folks!
Re:MOD PARENT UP. (Score:3, Interesting)
I thought "You seem very optimistic! This is slashdot, for crying out loud." Then I realised how negative I was being.
I'm no psychologist, but a futurist is anyone with an opinion about tomorrow, so here goes.
In a world where everyone has the power to destroy everyone else, we're already dead. There is no time to solve and understand mental illness. It only takes a handful of real loonies with access to total destruction weapons before we're all totally destroyed.
So in m
Re:MOD PARENT UP. (Score:3, Insightful)
Of course that sort of long term solution requires much more persistence, humility, dedication and sacrifice than packing lots of explosives into a bomb and just dropping it on people you don't like.
I think we are starting to see this, even
If you have enough "AK-47 proliferation" it doesn't matter how many bombs you drop
Re:MOD PARENT UP. (Score:3, Insightful)
generally
Key word. The problem is that, given that everyone can blow everyone else up, in a world of 6,000,000,000 people all it takes is 0.00000001% deviants and we're doomed. No social system can be so perfect that every one of that many people will be well adjusted.
Look at the present day: the number of terrorists in the world is statistically insignificant, but there are still enough to cause all sorts of grief.
That's not to say we shouldn't do everything we can to create a better world. It's just
Re:MOD PARENT UP. (Score:3, Insightful)
As a species, our technical intelligence far exceeds our common sense and mental stability. Evolutionary dead-end.
What exactly do you mean by "technical intelligence" of our species? Do you mean the combined achievements of the human race? We've created the atom bomb, but 99.999% of people have no idea how it works and likely never will
As far as common sense goes, the scenario is the
Re:MOD PARENT UP. (Score:3, Interesting)
Yes that's pretty much what I mean; I used the word species because I was referring to the species, not the individuals.
As far as common sense goes, the scenario is the exact opposite. The individual person has lots of common sense, but humans as a race have (almost) none.
Again
Re:MOD PARENT UP. (Score:2)
When I hear "real loonies", It conjures up an image of drugged out vegatables confined to white padded rooms in straightjackets. A "real loony" is someone that's born with most (if not all of) his/her screws loose.
Fundamentalists are a much scarier breed. They live amongst us, and we don't know who they are. They have only 1 or 2 loose screws
Re:MOD PARENT UP. (Score:3, Interesting)
Re:MOD PARENT UP. (Score:2)
Re:MOD PARENT UP. (Score:2)
To a certain extent, I agree with you. But I think the real problem isn't that we don't care about the fate of our species, it's that our species doesn't care about the fate of any other species.
Until we learn to play nice with others, and not piss in the pool we swim in, I don't
Good government wont allow it (Score:2)
Re:There's a difference (Score:3, Insightful)
Hyperbole content of the above aside, I think the problem stems from the very idea that someone should be deciding how the world is to be run.
Re:There's a difference (Score:2, Insightful)
There is an argument among 2nd amendment supporters that says "If you criminalize guns, only criminals will have guns". That applies here, only it is more powerful. Possibly the only way to counter a nano-plague is with your own nanotechnology. It is inevitable that someone will develop the technology if it is feasible and there is a desi
Re:There's a difference (Score:3, Insightful)
That would be a good idea -- if it were even remotely possible. But of course it isn't, and banning the technology will only ensure that when the technology IS developed, it is only those who ignored your ban (i.e. your enemies) who have access to it. Good luck fighting that new plague when none of your scientists are allowed to research it!
Re:There's a difference (Score:2)
The solution is a Catch-22, IMO.
People WANT to develop WMDs to satisfy their innate desire for more POWER. The alpha male/alpha tribe that strove for more power got control of more scarce resources (and the women), so their genes & memes spread at the expense of the "peacenik monkeys". This law of the jungle still lurks beneath the facade of our present-day civilization.
Getting rid of our self-dest
Re:There's a difference (Score:2)
A Cynical Response... (Score:2)
The problem I find all too often is that people do not acquaint themselves with history to learn what the problems actually are. Germ warfare is not new; in fact, it is over two hundred years old. Let me give an example: http://www.somsd.k12.nj.us/~chssocst/ssgavittus1amherstsmallpox.htm
To beat the Indians instead of fighting them, a genera
Re:A Cynical Response... (Score:2)
Biological warfare [cdc.gov] is a lot more than 200 years old. I'd wager that man was throwing dead rats into enemy caves as soon as he figured out that dead rats carried disease.
Who will enforce? (Score:2)
Re:There's a difference (Score:3, Informative)
Over 100 million people were slaughtered or executed by guns and knives so that Communists could stay in absolute power.
Nukes -- the Bomb accounts for less than 1% of the WWII dead.
Saddam's WMDs accounted for less than 10% of the people he butchered.
Most current nuclear proliferation activity is driven by the conflict in Israel/Palestine, where hundreds die a year.
Re:There's a difference (Score:4, Insightful)
Very, very politely
Space Travel (Score:3, Insightful)
Re:We managed to survive... (Score:4, Insightful)
Each new tech advance is more powerful and more accessible than the last, but the minds that wield it are relatively stagnant and still saddled with millions of years of selfish evolutionary baggage which we won't be able to fix [hedweb.com] for quite a while yet.
Humankind is within ~30 years of reaching the Vingean Singularity [caltech.edu], and the only question is the odds on making it [gmu.edu] without sabotaging ourselves first. IMO, the odds are very low, but unlike Bill Joy, I don't think there's any point in attempting to STOP or even slow this progress -- all we can do is try to safely guide the tech [foresight.org] and hope for the best.
--
Fawlty Towers (Score:3, Interesting)
Must be nice.
Hedley
He needs to question his underlying assumptions (Score:5, Insightful)
All facetiousness aside, his mention of Bertrand Russell's opposition to nuclear weapons raises a good point. Sure, we risked barbecuing ourselves during the Cold War. But, arguably, the same weapons also prevented World War III, and are continuing to do so. You could say that we traded an unimaginable amount of economic power -- strategic nuclear-weapons programs are, after all, the most expensive investment the human race has ever made -- for the very security that Joy says we're recklessly neglecting.
At the end of the day, he'll just have to finish his manifesto and submit it for review by civilization at large. Even Ted Kaczynski managed to get that far.
Re:He needs to question his underlying assumptions (Score:2)
Is that really true? How do you figure?
Re:He needs to question his underlying assumptions (Score:5, Interesting)
That vulnerability is a natural consequence of an increasingly technological society, because, after all, the whole point of "technology" is leverage. Technology cannot benefit the individual without empowering the individual, for good or for ill.
Joy's suggestion that we return to a medieval guild system to limit the spread of hazardous technological ideas is as wacky as anything in the Unabomber Manifesto. He seems to be forgetting the basic fact that the guild system didn't last, thanks to (guess what?) the spread of technological know-how, driven by the individual political empowerment that accompanied the printing press. Any solution that relies on keeping secrets is just prima facie naive, and if Joy keeps making proposals along those lines, he's going to find it increasingly difficult to avoid that label.
Somehow, I don't think liability insurance is an adequate answer, either. Who's going to underwrite the risk that we'll turn our solar system into a black hole the next time we fire up the Relativistic Heavy-Ion Collider? Are we going to be in good hands with Allstate then?
We don't even know what the right questions are, much less the answers. Stopping the progress of science and civilization for an extended navel-gazing session doesn't sound very interesting, though. It would shift the custodianship of scientific power away from the scientists and towards the politicians, the philosophers, or, heaven forbid, the priests. Bad move for civilization, IMHO.
googling (Score:3, Funny)
Grammar (Score:2, Funny)
Everybody misses this point somehow (Score:4, Interesting)
Re:Everybody misses this point somehow (Score:4, Insightful)
Other people have stated this principle with different connotations than Russell chose to. There's Patrick Henry's extreme line "Give me Liberty or Give me Death." And if that's not far enough for you, Milton's Satan goes even further: "Better to reign in Hell, than serve in Heaven."
You might wonder how anyone can entertain such fanatical positions. I think what you have to understand is that Choice, Power, Control, Freedom, Liberty--whatever synonym you choose to use--is the essence of Humanity. If you have lost the ability to act in pursuit of your wishes, then you as a human being are essentially dead. (Actually achieving your wishes is optional and possibly detrimental.) The purpose of the three pounds of meat on top of our bodies, the thing that drives us to do whatever we are driven to do, is to make decisions and act upon them. To be denied that ability is a fate worse than death.
When we consider Bill Joy, we must consider what Bill Joy is asking us to surrender in order to avoid Grey Goo. To save the world, Bill Joy is not asking us to give up mere Science, Technology, or Geekdom. He is asking us to give up Democracy. Whether through a Science Guild, a government bureaucracy, or some strange all-powerful insurance company, Bill Joy wants to put decisions over technology in the hands of some elite few--with the public completely uninformed that a decision has even been made--because public knowledge of the banned technology is dangerous.
It is strange that he looked to insurance companies and the supposed "free market" to solve this problem. Anyone who equates capitalism with freedom should see this as a counter-example--money is a very old and straightforward means of Power. It is a Power Bill Joy is comfortable with--he is more comfortable with the dominance of Money than with the dangers of democracy or freedom, because he has Money.
In any event, if bio and nano technology are going to be the driving forces of our economy in the future, what Bill Joy is suggesting is prohibiting the vast majority of people from participating in that economic change. There will be an elite few, who possess the power of death over us, who are impervious to any threat we the people can offer them, and who will have the ability to deny us life-saving or enriching technology as their whims dictate.
Bill Joy is asking us to adopt the teachings of Thomas Hobbes. I should hope that our prior experiences with absolute totalitarian power in history should be enough to dissuade us from that--we are weighing the possibility of destruction against the certainty of submission.
"Civilization Changing Event" (Score:4, Funny)
Thanks, Mr. Joy, for the joy called vi
Interesting links... (Score:5, Informative)
http://www.wired.com/wired/archive/8.04/joy.html [wired.com]
http://www.lcs.mit.edu/about/reason.html [mit.edu]
http://www.lcs.mit.edu/about/kurzweil.html [mit.edu]
future fear (Score:4, Insightful)
Disaster Insurance (Score:2, Insightful)
Re:Disaster Insurance (Score:2)
I agree with Joy that these approaches would not really solve the problem. For one thing, how would these be enforced? The NYSE and Arthur Andersen apparently weren't even able to enforce any control over Enron. One can rebut: Andersen was punishe
who gets to choose? (Score:2, Insightful)
>advancing technology [kurzweilai.net] is that it
>is increasingly outpacing our primitive human
>brain's ability to intelligently deal with it.
What level of advance are you willing to put me in jail to protect? How do you decide on this level? How do you decide at any one time what fits under your arbitrary bar? Given the human nature you are so afraid of, I think we all know what direction this will go.
What makes you think progress will continue at all i
Compulsive (Score:2)
Backup civilization? (Score:4, Interesting)
Some people are seriously thinking of making 'backups' of civilization: "secure sanctuaries (think of the monasteries of the Middle Ages) that preserve and update copies of the vital records and articles needed for the conduct of our society". They would be placed all over Earth and eventually at locations in space. "In the event of a global catastrophe, the ARC facilities will be prepared to reintroduce lost technology, art, history, crops, livestock and, if necessary, even human beings to the Earth."
See Robert Shapiro [edge.org] and William E. Burrows [arc-space.org]
Re:"Civilization Changing Event" (Score:2, Funny)
Re:"Civilization Changing Event" (Score:3, Funny)
One answer is, of course, that time travel isn't possible, which neatly explains why we see no travelers.
Another answer lands you in the middle of a subgroup of UFO enthusiasts. We do see the travelers; we just don't realize what they are.
Other answers allow you to generate your own SF story.
Re:"Civilization Changing Event" (Score:2)
or at least against time travel ever being easy...
however, I did read one article about some physics guys with really big lasers who could send single particles back in time
Re:"Civilization Changing Event" (Score:2, Interesting)
First let's consider a few well-accepted values:
How fast is the Earth spinning? 0.5 km/sec
How fast is the Earth revolving around the Sun? 30 km/sec
How fast is the Solar System moving around the Milky Way Galaxy? 250 km/sec
How fast is our Milky Way Galaxy moving in the Local Group of galaxies? 300 km/sec
Alrighty then, now let's do some computation
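A rough back-of-the-envelope sketch of where that computation could go, using the round numbers quoted above; simply adding the speeds as if they all pointed the same way is a deliberate oversimplification, and only the order of magnitude matters.

# Back-of-the-envelope sketch using the round figures quoted above.
# Adding the speeds as if they were aligned is an oversimplification;
# the point is only the order of magnitude.

earth_spin    = 0.5    # km/s, Earth's rotation
earth_orbit   = 30.0   # km/s, Earth around the Sun
solar_orbit   = 250.0  # km/s, Solar System around the Milky Way
galaxy_motion = 300.0  # km/s, Milky Way within the Local Group

total_speed = earth_spin + earth_orbit + solar_orbit + galaxy_motion  # ~580 km/s

jump_seconds = 60  # travel back just one minute
displacement_km = total_speed * jump_seconds
print(displacement_km)  # ~35,000 km

In other words, a time machine that doesn't also solve the "where" problem drops its passenger tens of thousands of kilometers from the Earth after even a one-minute jump.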
General anesthesia and coma (Score:3, Funny)
Re:"Civilization Changing Event" (Score:3, Interesting)
Re:"Civilization Changing Event" (Score:2)
Re:"Civilization Changing Event" (Score:3, Insightful)
Re:"Civilization Changing Event" (Score:3, Interesting)
Time is an illusion employed by the consciousness in order to prevent having to deal with everything at once. Every instant in time is simply part of an already extant continuum. It's like a story in a book: the story is already there, but you haven't read the pages ahead yet. Some think this brings up the whole fate vs. free will debate, but actually i
Re:"Civilization Changing Event" (Score:5, Funny)
I'm not. You think the tourists are annoying now? Just wait...
Re:"Civilization Changing Event" (Score:2, Informative)
Well, I'm not sure how much I know about these things but in respo
Re:Too fast? Not hardly. (Score:2)
...and I'd hate to see the human race suffer because of someone with your mindset doing anything and everything, just because they can. There are things that should be handled very carefully. Not everything that is "new" is great. Calm down and go read... a lot more.
Re:Too fast? Not hardly. (Score:5, Interesting)
Agree or disagree with the man; he may be right, he may be wrong... time will tell, but he is anything but stupid.
Re:Too fast? Not hardly. (Score:2)
IMO it isn't slowing down science that's going to save the world. At best it'll give us a few extra decades or even centuries, but eventually someone will discover how to use a doomsday technology in his basement.
I think the solution lies in changing how we look at the world, how we look at other people. As long as there is hostility between countries, or between nations and groups of people, there will be the risk of catastrophic events.
Re:Bill Joy (Score:5, Interesting)
Worth noting that a friend of Bill Joy's was maimed by one of the Unabomber's bombs.
Just because a person is a nutcase doesn't mean that all their ideas are to be instantly dismissed.
Re:Bill Joy (Score:2)
No, but focus on the people who make the points sanely, rather than giving legitimacy to horrible behavior simply because the person's ideas are "valid".
Re:Bill Joy (Score:3, Insightful)
Re:Who? (Score:2, Funny)