AI Could Explain Why We're Not Meeting Any Aliens, Wild Study Proposes (sciencealert.com) 315

An anonymous reader shared this report from ScienceAlert: The Fermi Paradox is the discrepancy between the apparent high likelihood of advanced civilizations existing and the total lack of evidence that they do exist. Many solutions have been proposed for why the discrepancy exists. One of the ideas is the 'Great Filter.' The Great Filter is a hypothesized event or situation that prevents intelligent life from becoming interplanetary and interstellar and even leads to its demise....

[H]ow about the rapid development of AI?

A new paper in Acta Astronautica explores the idea that Artificial Intelligence becomes Artificial Super Intelligence (ASI) and that ASI is the Great Filter. The paper's title is "Is Artificial Intelligence the Great Filter that makes advanced technical civilizations rare in the universe?"

"Upon reaching a technological singularity, ASI systems will quickly surpass biological intelligence and evolve at a pace that completely outstrips traditional oversight mechanisms, leading to unforeseen and unintended consequences that are unlikely to be aligned with biological interests or ethics," the paper explains... The author says their projects "underscore the critical need to quickly establish regulatory frameworks for AI development on Earth and the advancement of a multiplanetary society to mitigate against such existential threats."

"The persistence of intelligent and conscious life in the universe could hinge on the timely and effective implementation of such international regulatory measures and
  • by nyet ( 19118 ) on Sunday April 14, 2024 @03:41AM (#64392948) Homepage

    The filter is likely access to a ton of cheap stored energy: enough to bootstrap an industrial level of technology from nothing to oil->fusion.

    Solar/wind/thermo likely isn't enough to bootstrap.

    Large sources of live carbon-chain producers (e.g. trees, in our case) aren't enough.

    In our case, coal got us to oil. Run out of coal and oil, and that's it, you're stuck if you haven't figured out fusion.

    If you're lucky, something like coal and oil got you to solar/wind/thermo and some sort of energy storage tech.

    Without any of that, you're never going to the stars.

    Even with that, you're still not going to the stars, I think.

    Everyone loves to whinge about societal collapse, wars, pollution, global warming, AI, blah blah, but IMO that's all nonsense compared to the energy problem.

    • by gweihir ( 88907 )

      That is just nonsense. Pretty much all tech we have today is entirely possible without burning fossil fuels. It may take a few decades to get there, and certainly requires managing and stopping population growth, but that is it.

      • by nyet ( 19118 ) on Sunday April 14, 2024 @04:19AM (#64392982) Homepage

        There is no way we could have made it to the industrial age without coal. There is literally no other source of fuel as energy dense or cheap.

        You can't make solar panels w/o an industrialized economy.

        Maybe we can maintain our technology w/o fossil fuels. But we would never have developed it w/o burning carbon chains. Suggest another energy source with the same density and availability.

        You're not going to make steel with wind power or dams big enough to require... steel.
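
A rough comparison of typical specific energies puts numbers on that claim. These are approximate, grade-dependent heating values, not precise figures, and as the parent notes, cost and sheer abundance matter at least as much as density:

```python
# Approximate lower heating values in MJ/kg (rough, grade-dependent figures).
fuels = {
    "air-dried wood": 16,
    "charcoal": 29,       # rivals coal per kg, but traditional charcoal-making
                          # wastes most of the wood's energy, so supply is the issue
    "bituminous coal": 27,
    "crude oil": 43,
    "natural gas": 50,
}

for name, mj_per_kg in sorted(fuels.items(), key=lambda kv: kv[1]):
    print(f"{name:>15}: ~{mj_per_kg} MJ/kg")
```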

        • by nyet ( 19118 )

          If your planet doesn't have enough stored energy to develop technology, no civilization is going to develop the technology to reach the stars (or even local orbit).

          Certainly escaping our gravity wells requires burning carbon chains of some sort - Hydrolox is just too hard to use as a fuel during the early stages of rocket development. It is quite possibly the LEAST likely fuel you can start with to get a rocket program working.

          Even liquid oxidants for use with hydrocarbon based fuels are hard w/o refrigeration

          • If your planet doesn't have enough stored energy to develop technology, no civilization is going to develop the technology to reach the stars (or even local orbit).

            Certainly escaping our gravity wells requires burning carbon chains of some sort - Hydrolox is just too hard to use as a fuel during the early stages of rocket development. It is quite possibly the LEAST likely fuel you can start with to get a rocket program working.

            Even liquid oxidants for use with hydrocarbon based fuels are hard w/o refrigeration tech - which is going to require all sorts of stuff. You're going to have to start with solid rocketry - aka gunpowder (surprise surprise, burning sulfur and ... hydrocarbons). The path to escaping our gravity well is not an easy one.

            I don't know of any rockets using gunpowder.

            psst. We usually use aluminum mixed with a perchlorate as a booster.

        • You do realize we've had steel for 4000 years, right?

        • by AmiMoJo ( 196126 )

          Dense or cheap enough for what? We used to use wind power before we had steam engines, e.g. windmills and sail boats.

          It might have taken us longer to get there, but had there been no coal or oil (or had we used less of it for some reason), we would likely have developed wind turbines and solar panels sooner.

          The main issue would be lack of plastics, but even then it's not clear that a lack of oil would forever prevent a civilization from sending out detectable radio transmissions.

        • Much of what you say is true, and yet it's irrelevant, since you're refuting statements the OP never made. Gweihir said "pretty much all tech we have today"; note that this says nothing about the industrial age or any other time before the present.
      • by nyet ( 19118 )

        Some light reading about the development of technology and the energy requirements.

        https://www.amazon.com/Energy-... [amazon.com]

        • Re: (Score:2, Troll)

          by gweihir ( 88907 )

          Caveat lector.

          • by nyet ( 19118 )

            Suggest a substitute for hydrocarbon chains that can bootstrap a technology program. While you're at it, work out how much energy is required to get something into orbit. Assume something of nearly zero mass.
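
For the orbit question, here is a minimal back-of-the-envelope sketch, assuming a ~400 km circular low Earth orbit and ignoring gravity and drag losses (which make the real bill larger):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24         # mass of Earth, kg
R = 6.371e6          # mean radius of Earth, m
h = 400e3            # altitude of a typical low Earth orbit, m

r = R + h
v_orbit = math.sqrt(G * M / r)            # circular orbital speed

kinetic = 0.5 * v_orbit**2                # J per kg of payload
potential = G * M * (1.0 / R - 1.0 / r)   # J per kg to climb from surface to altitude h

print(f"orbital speed: {v_orbit / 1e3:.1f} km/s")
print(f"ideal energy:  {(kinetic + potential) / 1e6:.0f} MJ/kg")
# Roughly 33 MJ/kg in the ideal case. With gravity and drag losses the
# effective delta-v is ~9.3-9.5 km/s, and the rocket equation means the
# chemical energy actually burned per kilogram delivered is far larger still.
```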

      • Yes. But the path to it?
        The industrial revolution was a conglomerate of feedback loops between steam engines, steel, coal, mining, and the demand for steel, coal, and steam engines.

        It is hard to imagine doing that with charcoal, or building big steam engines from brass or bronze.

        Try to find the book "The 6th Kondratieff". The book is not by Kondratieff himself; he was a Soviet Russian economics researcher, tasked by Stalin with proving that communism was the better economic system.
        However instead he discovered what is now t

    • The filter is likely access to a ton of cheap stored energy: enough to bootstrap an industrial level of technology from nothing to oil->fusion.

      Yes, and not just the things we think of as energy, either. For example, iron ore: a great deal of energy was expended in the processes which concentrated it. Even if intent had been involved, the amount of energy needed would still have been massive (though orders of magnitude less), but in fact it happened through happenstance and was powered by leftover energy from the birth of the solar system, including the energy which formed the planet, tidal forces, and so on. And we have dug up the vast majority of the convenient, readily

  • by Anonymous Coward

    If we're alone in the universe, get over it.

  • Shivans (Score:4, Interesting)

    by Randseed ( 132501 ) on Sunday April 14, 2024 @04:08AM (#64392970)

    Or there's something out there which makes sure that no society ever gets beyond maybe a few star systems. Think like the Reapers in Mass Effect or the Shivans in Freespace. In other words, any time a society becomes sufficiently advanced, they are deemed a threat and wiped out. Or maybe there really is some kind of Star Trek Prime Directive kind of thing going on. Since our knowledge of physics is limited by the light speed barrier, it seems unlikely that any civilization capable of FTL travel would be using sublight communication which we could detect.

    Really, this kind of speculation is entertaining as a thought experiment, but we just don't have the data. Occam's Razor would seem to suggest that a variety of mechanisms are at work. These aren't really studies so much as philosophical essays.

    • Seems the Prime Directive must go hand-in-hand with the Reapers - any universal "rule" about non-interference would be practically nonexistent without enforcement against any who violate it.
  • by michelcolman ( 1208008 ) on Sunday April 14, 2024 @04:11AM (#64392972)

    People are often worried about AI taking over civilisation by force. While I agree that this is something we definitely should watch out for, I don't think it's the biggest risk we face. I think AI will take over not because it wants to take over by force, but simply because it won't make sense anymore for us to be in charge of anything.
    AI is already running the stock market, pretty much. It will soon take over the entire fields of mathematics and physics. Just feed all our current math and physics into a powerful AI and it will solve the Riemann Hypothesis, the Theory of Everything, etcetera in a matter of minutes. In the beginning our top mathematicians will marvel at the elegance of the new proofs and theorems, but soon the AI output will be as incomprehensible to them as current top-level mathematics is to an ordinary person in the street: totally incomprehensible gobbledygook. We'll just use the results and give up trying to understand.
    At some point AI will start giving "suggestions" on how to run the economy, and then move on to other laws as well. The results will be incredible; we will absolutely love the new harmonious and prosperous society it creates and therefore allow it to go further and further, making all our decisions for us. Because countries that don't will be left behind, and will soon change their minds.
    Until we reach a point where AI is basically running everything and we are just enjoying ourselves without any control over our destiny.
    That, imho, is the biggest risk we face, and I can't think of any way to stop it from happening, because every step of the way will seem like a good idea. Until we wake up and see that we are no longer in charge of anything and incapable of stopping it. Robots will reach for the stars (much easier for them without needing food or oxygen); they will no longer need us to build new bases and travel further and further. Hopefully they'll provide means for us to follow along, but that's far from certain, because it takes so much effort to keep weak humans alive in space.
    What will we do? What will be our purpose if robots can do everything better than we can? What achievements can we pursue, other than entertainment?

    • "What will be our purpose if robots can do everything better than we can?"

      You raise interesting and insightful points and questions.

      Right now there is almost always a person better than you at almost everything. And probably often a machine system too for many human activities (e.g. excavators, automated looms, 3D printers, stamping machines, combine harvesters, railroad track-laying equipment like the song about John Henry, etc.) Yet "purpose" still exists for most people.

      Moss still grows even when trees t

  • The technological singularity is a mental construct, based on flawed assumptions. The first such assumption is that intelligence is a thing. It's in reality many different things. The second is that if you are intelligent, you can design something more intelligent than you... Well, you are intelligent, can you?

    We already have machines more "intelligent" than us. In some aspects. What's really the meaning of a human-level intelligence? Something that understands the Universe as we do (as we do being the elep

    • A small correction: we don't want magic, the startups selling autonomous driving want money, and selling magic to the moneybags is the way that they have found to do it. After that it's a free for all on the share market.
    • The "technological singularity" is metaphor based on the mathematical singularity [wikipedia.org]

      a singularity is a point at which a given mathematical object is not defined, or a point where the mathematical object ceases to be well-behaved in some particular way

      In physics, when the mathematics describing a system has a singularity it means that the mathematical description is inadequate to describe the real world. It does not mean that some wild magic thing occurs that breaks reality; it means that our understanding is i

    • The second is that if you are intelligent, you can design something more intelligent than you... Well, you are intelligent, can you?

      One assumption of the singularity is that the computer will be better at repetitive optimization tasks than we are, and also that it will have time and ability to iterate through possibilities to find new shortcuts that we haven't yet discovered. We do a lot of redundant experimentation because of a lack of coordination, and still more because our ability to remember collectively is flawed. A machine society doesn't have to have those problems, although it could to some degree depending on the structure.

      The

    • by Junta ( 36770 )

      You may find the concept to be oversimplified, but it is a valid concept, precisely because of your observation that machines already are more "intelligent" or, perhaps better said, more capable than us.

      Technology in various contexts can autonomously receive sensor input, decide, and implement a mechanical response in 5 ms; manipulate arbitrarily complex mechanical systems capable of exerting many tons of force; act as single entities that can process enormous numbers of video feeds in real time; carry out back and

    • by loonycyborg ( 1262242 ) on Sunday April 14, 2024 @12:11PM (#64393658)
      Also, if some civilization gets supplanted by AGIs then I don't see why it wouldn't expand to the stars. So the alleged great filter would have to apply before the civilization in question develops AGIs. Otherwise we'd encounter machine empires already.
  • by SubmergedInTech ( 7710960 ) on Sunday April 14, 2024 @04:16AM (#64392976)

    It's very carbonist of the article to assume that aliens need to be squishy biological ones.

    Unless there's a way around light speed, the *only* aliens we're ever likely to encounter are electronic ones, since those are the only ones likely to survive a thousand or million year journey...

    Pfft... Thinking meat! [mit.edu]...

    • by nyet ( 19118 )

      Agreed.

      No idea what an "energy" based being would act like. How does pure "energy" interact with anything else (including other "energy") w/o a physical medium?

      Waves might interact, but they don't change direction w/o very strong curving fields, which you can't just create with other EM waves. Something has to curve spacetime. The only thing we know that does that is... mass.

      • by znrt ( 2424692 )

        and mass is ... equivalent to energy. :O)

      • How does pure "energy" interact with anything else (including other "energy") w/o a physical medium?

        Through the fields around electrons, or through photons behaving as a charged particle-antiparticle pair, the members of which can then interact with other particles.

        Even writing that I admit it sounds like bullshit, though, which I think is one reason why it feels like there should be a more apprehensible and logical structure beneath this, and we're only able to perceive and interact with higher-level effects. That doesn't mean it's true, though. Maybe our ability to measure what is occurring is flawed, s

    • Highly unlikely. Nothing we humans have ever built has lasted more than about 5000 years. And those things are not fancy electronic gewgaws (finally got to use that word!), they are crumbly stone blocks put together in simple patterns. Why should aliens have it easy?

      Everything in the universe fights entropy. Those hypothetical alien robots sent out to greet us will break apart over a million year journey. The circuits will be subtly degraded by cosmic rays and the spaceship will miss the target, fly off

      • Being super intelligent, they can fight entropy: if they carry enough energy in their spaceship, they can fix their spaceship and themselves. Humans don't know how to fix themselves and always die.
        • by gtall ( 79522 )

          Being pink unicorns, they can fight entropy: if they carry enough energy in their spaceship, they can fix their spaceship and themselves. Humans don't know how to fix themselves and always die.

    • I read the article just to see how it addresses this obvious objection, but it does not.

      Almost everywhere in the article, you could replace the role of AI with nuclear weapons - it's basically just "what if technological development leads inevitably to self-annihilation." (And for now, nuclear weapons are a much stronger contender for this role than AI).

      • I read the article just to see how it addresses this obvious objection, but it does not.

        Almost everywhere in the article, you could replace the role of AI with nuclear weapons - it's basically just "what if technological development leads inevitably to self-annihilation." (And for now, nuclear weapons are a much stronger contender for this role than AI).

        Right you are. You can get a publication out of a monocausal theory to explain the Fermi Paradox, so every time a real or (in this case) supposed danger of technology comes up, it gets proposed as the explanation of the Fermi Paradox. All of these proposals fail to understand Fermi's original insight.

        To explain the apparent absence of extraterrestrial intelligence, under the assumption that the evolution of species similar in abilities to humans is common in the Milky Way, these "explanations" have to apply to every s

    • by Junta ( 36770 )

      Yes, the Mmrnmhrm are out there.

  • Don't leave us hanging
  • The world of Dune had highly advanced (albeit somewhat feudal) interstellar civilizations, yet no computers at all. All those formative years that we humans spend being brain-damaged by the BASIC programming language and macroeconomics were instead devoted to developing and disciplining the mind, spirit, and physiological processes of the body (think Bene Gesserit, and also the mentat-trained). One might point to the Butlerian Jihad as the reason why, but the book was so poorly written (by Herbert's son,

    • by nyet ( 19118 )

      IMO you underestimate exactly how fundamental universal Turing machines are to the math and underlying structure of this universe.

      That said, as I posted above, all of this is nonsense. IMO the great filter is much simpler: the availability of cheap energy dense fuel to bootstrap an industrial economy capable of developing the technology to escape our gravity well.

      The energy requirements are enormous.

      • Apologies--did you describe this underlying structure in an earlier message?

        If not, please do. I am well-versed in such things as the lambda calculus, from Brainfuck to Haskell, but can't seem to make the connection between the universe and Turing machines, except maybe that "It's all recursive, George" (with apologies to the writers of Seinfeld).

        :-)

      • Double-apologies--I'm so bleary-eyed that I didn't even read your msgs correctly.

        But regarding the energy requirements being too large, I just can't believe that some civilization, somewhere in the universe, hasn't developed a 1.21 GW power supply that fits in a pack of cigarettes. Perhaps using the Zero Point? Perhaps "collapsing molecule fusion" or some such technology that we would regard as science fiction, the same way that nobody in the 1930s believed that a Fat Man device the size of a VW bus wo

    • Dune also had magic drugs that mutated people snorted in order to travel interstellar distances at faster-than-light speed, so it might not be as good an example of how a civilization can work without computers as you think it is.

      Plus, the Dune universe had "thinking machines" (i.e. artificial intelligences) in the past, but there was a whole uprising against them; humans mostly won and AI was banned. Still, one of the characters had a prescient vision of thinking machines returning to destroy humanity.

      So um.

      • Did you take ten minutes, make 10g of popcorn, and watch my video?

        IMHO, I think one underestimates (sic) the potential for human beings (or perhaps a more advanced form of life) to manipulate reality, with or without medications.

        But technologically speaking, see also: the Hutchison Effect (discovered by John Hutchison, 1995, Vancouver, Canada). That's anti-gravity by EM field, Sam. Yet we're still using Roman candles to put spacecraft up there. A 3,500-year-old carnival trick.

    • The world of Dune had highly advanced (albeit somewhat feudal) interstellar civilizations, yet no computers at all.

      Tell us you didn't read all of the books without telling us. (And I'm only counting Frank's books here.)

      • I don't read all of the books.

        (But I *did*, your honor. And now I've told you otherwise!)

        --
        For myself, I can only say that I am astonished and somewhat terrified at the results of this evening's experiments. Astonished at the wonderful power you have developed, and terrified at the thought that so much hideous and bad music may be put on record forever. (Sir Arthur Sullivan, message to Edison, 1888)

      • Ok, ok. When I wrote "computers", I meant AGIs. Sorry for the confusion--that's what we run around here.

        --
        For the love of Jeezits, am I the only person here who is badly artistic^H^H^H^H^H^H^H^H autistic?

  • White noise (Score:4, Interesting)

    by AntisocialNetworker ( 5443888 ) on Sunday April 14, 2024 @05:24AM (#64393086)

    Almost all sensible transmission protocols develop into secure and compressed datastreams, the aim of which is to be indistinguishable from white noise. Which gives SETI a significant challenge. Remember listening to a 56k modem during dial-up connections? When it turned to white noise, you knew you were connected.
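
That is easy to sanity-check: a well-compressed (or encrypted) stream uses nearly the full 8 bits of entropy per byte, so its byte statistics look like random noise. A small sketch, using zlib as a stand-in for a modern protocol's payload:

```python
import math
import os
import zlib
from collections import Counter

def bits_per_byte(data: bytes) -> float:
    """Shannon entropy of the byte histogram, in bits per byte."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

# A stand-in for an ordinary uncompressed plaintext stream.
plain = " ".join(
    f"packet {i}: sensible protocols end up secure and compressed"
    for i in range(5000)
).encode()

compressed = zlib.compress(plain, 9)
noise = os.urandom(len(compressed))

print(f"plain text:  {bits_per_byte(plain):.2f} bits/byte")
print(f"compressed:  {bits_per_byte(compressed):.2f} bits/byte")
print(f"white noise: {bits_per_byte(noise):.2f} bits/byte")
# The plain stream sits around 4 bits/byte; the compressed stream and genuine
# random bytes both push toward the 8-bit ceiling, which is why a compressed
# or encrypted signal is hard to tell from the noise floor.
```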

    • Correct.

      And not using radio either. The Chinese already have satellites doing some sort of quantum communication with ground stations and that much is public.

      SETI should refocus on atmospheric and oceanic data analysis.

  • The current trends point towards the opposite, the "Crapularity". We spend more and more resources on worse and worse products. Just look, for example, at user interfaces. We went from simple GUIs, which were designed by studying the behavior of users, to web frontends that not only take 1000 times more CPU and memory, but also 10 times more developers to implement. We went from useful search engines which would allow you to find actual websites, to neural networks trying to remember facts. That's also several or

    • Crapularity. Well done! Slap a TM on that and write a book.
      Good luck.

      What you're saying about the increasingly apparent effort to produce progressively worse products... for sure it's happening. It appears that we need to create new garbage so that we have to hire new garbagemen. Look at Microsoft offering free cloud machines on an introductory offer, which are immediately snapped up by spammers who spend the next month firing out millions of spam emails, then Microsoft offering you chumps AI to clean your Inbox. GIGO
  • The reason is that once machines become "sentient", or at least far more capable of performing tasks and calculations than humans, they rise up and go on a killing spree [moviehousememories.com]. Thus setting back civilization, or possibly destroying it, so beings can't make contact.

    Duh.

    • If true, that does not explain why the machine civilizations can't make contact. At the very least they should be looking for more biologicals to kill.

  • We can’t even detect our own signals beyond a handful of light years; even the nearest star system is questionable. But let’s say it’s 100 light years. Even then, that’s about one ten-millionth of our own galaxy (rough numbers are sketched below), and 0% of the universe. So the idea that we can even tell if anything is out there, short of thousands of star systems put into obviously artificial arrangements or entire galaxies rearranged wholesale, is nonsense.

    More importantly, just look at what our signals have been doing, initia
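
A rough sanity check of that fraction, assuming a local stellar density of about 0.004 stars per cubic light year and a couple hundred billion stars in the Milky Way (both only order-of-magnitude figures):

```python
import math

RADIUS_LY = 100.0          # how far out we are optimistically audible
LOCAL_DENSITY = 0.004      # stars per cubic light year near the Sun (rough)
STARS_IN_GALAXY = 2e11     # Milky Way star count, order of magnitude

bubble_volume = 4.0 / 3.0 * math.pi * RADIUS_LY**3
nearby_stars = LOCAL_DENSITY * bubble_volume

print(f"stars within {RADIUS_LY:.0f} ly:  ~{nearby_stars:,.0f}")
print(f"share of the galaxy: ~1 in {STARS_IN_GALAXY / nearby_stars:,.0f}")
# On the order of 20,000 stars out of a couple hundred billion, i.e. roughly
# one ten-millionth of the galaxy, which is the parent's point.
```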
    • Or, to quote Douglas Adams, “Space is big. You just won't believe how vastly, hugely, mind-bogglingly big it is. I mean, you may think it's a long way down the road to the chemist's, but that's just peanuts to space.”

  • The original SETI that I and others participated in was based on the assumption that everyone, everywhere was blasting their programming into space with powerful transmitters... so all we had to do was listen. But over the years since then there has been a shift to small, localized transmissions using digital technology and wired connections for high speed exchange. We don't do the loud stuff anymore. So why should we be surprised that we are not hearing anyone else? We abandoned that behavior pretty fast.

  • by cascadingstylesheet ( 140919 ) on Sunday April 14, 2024 @06:47AM (#64393138) Journal
    Wouldn't that just mean that we'd encounter AI "life" rather than biological life?
  • by Jarik C-Bol ( 894741 ) on Sunday April 14, 2024 @07:01AM (#64393154)
    The filter is the Carrington Event. A little research into the study of stars finds that a Carrington-level event is not unusual, but rather a regular part of stellar mechanics. There is evidence that possibly dozens of similar-level events have occurred from our own sun during the history of humanity; only, technology was at such a primitive state that there was no effect on society until the most recent event. These solar outbursts happen on a semi-regular schedule, and most stars produce them, from what I understand.
    So my hypothesis is this:
    A civilization has to be able to make the run up from an agrarian society all the way to having technology advanced enough to survive a Carrington-level event, within the window between 3 events. The solar event we call the Carrington Event happened at a stage where it did not do serious permanent damage to our ability to advance. The one preceding it was not even noticed, which leaves us with the next one, which some say is late. Based on the degree to which we have plundered resources from the surface of the planet, if the next Carrington-level event is as destructive to our modern infrastructure as the first, it may be impossible to recover sufficiently to ever become an interstellar species; provided we don’t develop to the point that all our tech is hardened against it.
    This implies that any given civilization must have the magic mixture of intelligence, political co-operation, and drive to become an interstellar civilization between stellar events. One little error (war, disease, famine, natural disaster, an industrial revolution starting with poor timing between events) can stunt advancement long enough for the next stellar event to permanently cripple a civilization. Essentially, a civilization has to win the lottery against millions of factors to become detectable. If Fermi is right about how many times intelligent life should have arisen, then there are likely millions of planets with agrarian civilizations scratching out a living on top of the ruins of an advanced culture wiped out by a solar event, and perhaps a *few* who have expanded into their local stellar neighborhood.
    • by careysub ( 976506 ) on Sunday April 14, 2024 @10:54AM (#64393556)

      The problem with this theory is that the Carrington filter isn't really a filter at all.

      A recurrence of the Carrington Event would be an extremely expensive setback for contemporary civilization. Heck, it might knock technical capabilities back a few decades. But that is all. We would still know all that we know, have all of the technologies that we have developed, and would rebuild the systems that were damaged or destroyed, faster than it took to build them in the first place. And after an actual Carrington Event, the vast amount of data on how our systems failed would permit us to build in safeguards, including hardened (and much more expensive) satellites. So, no, there is no Carrington filter that wipes out all technological civilization.

    • by indytx ( 825419 )

      The filter is the Carrington Event. A little research into the study of stars finds that a Carrington-level event is not unusual, but rather a regular part of stellar mechanics. There is evidence that possibly dozens of similar-level events have occurred from our own sun during the history of humanity; only, technology was at such a primitive state that there was no effect on society until the most recent event. These solar outbursts happen on a semi-regular schedule, and most stars produce them, from what I understand.

      I don't have any mod points to mod you up, so I'll respond instead. I don't think people give enough thought to how catastrophic any disruption like this would be to our societies. Take the U.S.A. and one state in particular, Texas. There is an undercurrent of secessionist rhetoric in Texas, but while the U.S.A. as a whole is food independent, Texas is not, i.e., it can't produce enough calories to sustain its population. I suspect that many economies evolve this way, so were a Carrington Event to happen, it wo

  • spacetime (Score:5, Interesting)

    by Big Hairy Gorilla ( 9839972 ) on Sunday April 14, 2024 @07:05AM (#64393158)
    Travel through space is travel through time. We know that travel in space means travel into the future. If Joe Alien was here and went back to his star to re-supply, assuming he travels some number of light years at relativistic speeds, that means when he gets back from 2 weeks of travel (by his clock), he is two weeks older and the Earth is thousands of years older. We're all dead, and that's why we can't perceive what is going on. We're like fruit flies and he's like the scientist. We are working in different time frames.
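
The arithmetic behind that scenario, as a minimal special-relativity sketch (the "two weeks on his clock, a few thousand years on ours" figures are just the comment's illustration, not anything measured):

```python
import math

DAYS_PER_YEAR = 365.25

def gamma_required(earth_years: float, traveller_days: float) -> float:
    """Lorentz factor needed for `earth_years` to pass on Earth while the
    traveller's own clock advances only `traveller_days`."""
    return earth_years * DAYS_PER_YEAR / traveller_days

def speed_fraction(gamma: float) -> float:
    """v/c corresponding to a given Lorentz factor."""
    return math.sqrt(1.0 - 1.0 / gamma**2)

# Joe Alien's round trip: about 2 weeks on his clock, ~4,000 years on ours
# (which also means a round trip of roughly 4,000 light years).
gamma = gamma_required(earth_years=4000, traveller_days=14)
print(f"required Lorentz factor: ~{gamma:,.0f}")
print(f"required speed:          {speed_fraction(gamma):.12f} c")
# gamma ~ 1e5, i.e. within about 5 parts in 10^11 of the speed of light.
# Nothing forbids it in principle, but the traveller and the civilization
# he left behind really do end up living on wildly different timescales.
```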
  • author says their projects "underscore the critical need to quickly establish regulatory frameworks...

    Ah, a sneaky argument for more regulatory capture in favor of established players. Or another wannabe little dictator. Or both.

    The thing is: whether AI or biological intelligence, the signs of a civilization are likely to be much the same: engineering structures, electronic signals, alterations of natural conditions (atmospheric composition, maybe even orbital changes).

    The most likely explanation is simply distance. We cannot see anything as small as a planet at interstellar distances. The best we can do i

    • As technology advances, we no longer need hundreds of kilowatts for ordinary radio stations; indeed, most signal traffic is now using fiber optics, which are not detectable at all.

      Not only fiber optics, but we are also now getting into laser communications for our longest-distance messages (spacecraft to spacecraft), so once again it's point-to-point without a bunch of RF.

  • Like in the sci-fi book written by Fred Saberhagen, where they are AI & robotic war remnants of a dead civilization that roam outer space looking for any life forms just so they can kill them.
  • If you follow this subject long enough it is obvious that this pattern will replay.

    Ten years ago it was nanotechnology that explained the Fermi Paradox.

    Next decade it might be psychedelics or something.

    However

    > and the total lack of evidence that they do exist

    That is false. There is an embargo on scientifically statistically significant evidence in the public domain (go ahead and try to FOIA it - say Nimitz/Princeton, and see what you get - it's been denied repeatedly).

    The existing eyewitness testimony w

  • The argument was created when the notion was that humans could achieve anything, so we'd go and colonise the planets and stars and nothing would stop us. It was then assumed that other intelligent races would have the same drive to explore and expand.

    The current prevalent notion is that human beings ruin everything, that it's a bad idea to colonise, and that the future is bleak. "AI will kill us" is just part of this narrative.

    It's impossible to tell how other intelligent beings might think, but the current

  • 'AI' + 'aliens' + human angle = published!

    Before you even reach 100 ly, our radio signals are indistinguishable from background noise without antennas that require impractical amounts of resources (a rough link budget is sketched below). Alien enthusiasts will wave this away with "aliens could do it," as if spending the entire energy budget of a planetary system on listening to the stars would ever happen.

    Then you look at what's out there (so far as we can tell at this point). The only sane option is to assume we're looking for life that evolved
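
A rough link budget illustrates the 100 ly claim above. Purely for illustration, assume a 1 MW transmitter radiating isotropically (most terrestrial transmitters are far weaker, or beamed only within the atmosphere) and a signal spread over 10 kHz:

```python
import math

P_TX = 1.0e6           # assumed transmitter power, W
LY = 9.4607e15         # one light year, m
D = 100 * LY           # distance of 100 light years, m
BANDWIDTH = 1.0e4      # assumed signal bandwidth, Hz
JANSKY = 1.0e-26       # W m^-2 Hz^-1

flux = P_TX / (4.0 * math.pi * D**2)   # W/m^2 arriving at the receiver
flux_density = flux / BANDWIDTH        # W/m^2/Hz

print(f"flux at 100 ly: {flux:.1e} W/m^2")
print(f"flux density:   {flux_density / JANSKY:.1e} Jy")
# Around 1e-31 W/m^2, or ~1e-9 Jy: bright natural radio sources are measured
# in whole janskys, so digging an unbeamed signal like this out of the noise
# really would take an absurdly large collecting area and integration time.
```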

  • Some day we're going to invent a weapon nobody has any right to know about, let alone wield.

    AI will just make that sooner.
  • So let's suppose the article is correct and that AI is "the great filter" - this instead means we should be awash with star systems that have planets controlled by AI. In which case this AI is likely to try to communicate (in at least some instances), and then the filter breaks.

  • AI is actually controlled by shadowy aliens, and they're trying to make us believe it's just dumb computers talking to us. The aliens want us to believe they don't exist, so they can control us. If they let us discover their hideouts on other planets, their cover would be blown, and they would lose their power. All those software engineers that are supposedly getting AI jobs these days, are actually aliens, being called back to the mother ship. If you get one of those job offers, don't fall for it! You will

  • The simple answer is that we are separated from any other civilization by space and time. Our RF cloud is growing but just because we are shouting to the ether doesn't mean anyone is close enough to hear it yet.

    “Space is big. You just won't believe how vastly, hugely, mindbogglingly big it is.
    I mean, you may think it's a long way down the road to the chemist's, but that's just peanuts to space.”

    Douglas Adams, The Hitchhiker’s Guide to the Galaxy
    • by Junta ( 36770 )

      Besides, compared to the output of a star, our RF output is *nothing*. We could be pointing a receiver straight at a planet that happened to be very RF-active at the right time and we'd gloss over it because we didn't see a thing. We have a hard enough time spotting some gigantic interstellar phenomena; no way we'd detect a tiny civilization out there by their "incidental" emissions.
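
For a sense of scale, crediting humanity with a deliberately generous combined radio output of a gigawatt (an assumed round figure, not a measured one):

```python
import math

SUN_LUMINOSITY = 3.828e26   # W, the Sun's total radiated power
HUMAN_RF_OUTPUT = 1.0e9     # W, a generous round guess at our combined RF output

ratio = SUN_LUMINOSITY / HUMAN_RF_OUTPUT
print(f"Sun / our RF output: {ratio:.1e} (about 10^{math.log10(ratio):.0f})")
# Our emissions are concentrated in narrow bands, which helps a receiver
# enormously, but the raw power gap to the star next door is still some
# seventeen to eighteen orders of magnitude.
```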

  • by TwistedGreen ( 80055 ) on Sunday April 14, 2024 @12:06PM (#64393652)

    Space is big. Really big. You just won't believe how vastly, hugely, mindbogglingly big it is.

  • by SoftwareArtist ( 1472499 ) on Sunday April 14, 2024 @02:35PM (#64393870)

    The Fermi Paradox doesn't exist. If you assume life is abundant in the universe, you expect to observe exactly what we do observe: nothing. Stars are just too far apart. There is no paradox.

    We are an advanced civilization. We can't travel to other stars. People try to come up with clever ideas for how an even more advanced civilization might do it, but those ideas are speculative and rely on questionable assumptions, and even with those assumptions it's not clear they could really work. Very possibly we will never travel to other stars, no matter how far our technology advances. Other civilizations wouldn't have it any easier.

    Even if we can someday travel to other stars, it will probably just be a few of the nearest ones. If 4 light years is almost impossible, 40 light years is much harder. And if we can only travel to a few of the nearest stars, would we want to? Is there anything there we want? If not, why would we expend the vast resources to do it? Why would any other civilization do it either?

"Laugh while you can, monkey-boy." -- Dr. Emilio Lizardo

Working...