
Exoplanet Hunting NGTS Telescope Array Achieves First Light

Zothecula writes: The Next-Generation Transit Survey (NGTS) array, built by a UK, German and Swiss consortium, has achieved first light at the Paranal Observatory in Chile. The installation is designed to search for exoplanets between two and eight times the size of Earth, studying them as they pass in front of their parent star.
  • Cool!! (Score:4, Interesting)

    by gstoddart ( 321705 ) on Thursday January 15, 2015 @12:23PM (#48820499) Homepage

    Many years ago when I was in university and hung out with astronomy nerds ... the notion of discovering an exoplanet was still speculative science, and it was largely thought there wouldn't be many planets.

    Flash forward, and exoplanets are real, documented, and numerous.

    It's awesome to see how much our understanding has changed in the last 25 years or so. And the more we discover about the universe, the bigger and cooler it actually is.

    And, just think, only a few hundred years ago you'd be burned at the stake (or whatever) for saying the Earth goes around the sun.

    • by Anonymous Coward

      And, just think, only a few hundred years ago you'd be burned at the stake (or whatever) for saying the Earth goes around the sun.

      Although a popular belief, no one died at the stake for espousing heliocentrism. The Catholic church burned folks for challenging their authority, which is disgusting, but not because of a view that the Sun is the center of the solar system.

    • by delt0r ( 999393 )

      Many years ago when I was in university and hung out with astronomy nerds ... the notion of discovering an exoplanet was still speculative science, and it was largely thought there wouldn't be many planets.

      What? My masters was in Astrophysics. It was widely speculated that planets are common, but the size distribution, distance from the parent star, and all that was a bit of an unknown. We don't believe we (the solar system) are special.

  • by butalearner ( 1235200 ) on Thursday January 15, 2015 @12:27PM (#48820551)
    I wonder what the real step forward is (field of view? accuracy? software?), because that is not much aperture. 1.5 square meters in all, compared to 6 square meters on Kepler and 18 square meters on Hubble. You can get a very basic 200mm reflector on a manual Dobsonian mount for less than $400, but even top of the line custom telescopes could not have been terribly expensive compared to just building the facility.
    • by gstoddart ( 321705 ) on Thursday January 15, 2015 @12:37PM (#48820723) Homepage

      I think software went a long way.

      Before Hubble, the only way was huge optics and high elevations.

      Somewhere in the middle, the software compensating for the atmosphere got much better, and suddenly ground-based stuff was as good as Hubble.

      They also do the neat trick of very-long-baseline interferometry, where several smaller telescopes are combined into what is effectively a huge telescope, but is much cheaper to build than a single massive one.

      I kind of get the impression the astronomers have been busy over the last few decades ... CCDs used to be expensive and exotic. Now everybody carries one in their pocket -- so both the hardware and the software have given them huge strides over the last bunch of years.

      Almost to the point that by the time someone has built the next big thing, someone has come close to beating it for a fraction of the cost.

      • by RoLi ( 141856 )

        Ground-based telescopes are "as good as Hubble" for some applications, better at some and worse at others.

        Hubble is still one of the best telescopes ever.

        For example the "Hubble deep field" would not be possible with a ground-based telescope, regardless of how good the software may be.

    • I'd guess the real step forward is workflow. The array can view twelve targets at once, with no dependencies (except they all have to be visible from the site). It can presumably shift from target to target pretty quickly, so I guess they'd sample many objects per telescope per night -- after all, transits happen with a timescale on the order of hours (possibly minutes in extreme cases), not seconds.

      The big deal, though, seems like the relatively high-resolution brightness measurements (one part per thousand).

      • Re:Not much aperture (Score:5, Informative)

        by Whiternoise ( 1408981 ) on Thursday January 15, 2015 @01:36PM (#48821479)

        See my other post for more info - particularly the bit about why we'd use this over a satellite.

        A major pro for a dedicated array is that it doesn't have anything else it should be observing. Normally these things are very wide-field, for telescopes. SuperWASP used off-the-shelf SLR lenses (good ones, mind: Canon 200mm f/1.8s) to create a mosaiced wide view of the sky. They also used a lot of very expensive (Andor) CCDs. The smaller amateur telescopes, e.g. a 3" refractor, might have a focal length of 400mm or so. The field of view of SuperWASP is around 22 x 22 degrees - that is ridiculously wide. The CCDs were 2048px square so we're not talking about high magnifications on deep objects here. These systems are not fast point-to-point scanners. They're huge eyes watching large chunks of the sky continually, pumping out gigabytes of data every night.

        NGTS has similar specs to SuperWASP, 200mm focal length covering a field of around 10 x 10 degrees. http://www.ngtransits.org/tele... [ngtransits.org]. Note that the mounts are also off the shelf, but super expensive for amateurs http://www.astrosysteme.at/ [astrosysteme.at].

        As I mentioned, 1/1000 isn't that amazing. If you expose so your target gives you 15,000 counts and you take a measurement per second, then you can easily get a nice high signal-to-noise over a timescale of minutes. The star, once you correct it with some stable reference target and allow for atmospheric extinction, should have essentially a flat brightness, so any dip is noticeable.
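        A back-of-the-envelope sketch of that argument (Python; the exposure count and cadence are assumed numbers, not NGTS specs), showing why 15,000 counts per second approaches 1/1000 precision over minutes:

```python
import math

counts_per_exposure = 15_000   # assumed counts from the target per exposure
exposures = 300                # five minutes of one-second measurements

total = counts_per_exposure * exposures
shot_noise = math.sqrt(total)       # Poisson (photon-counting) noise
precision = shot_noise / total      # fractional photometric precision

dip = total / 1000                  # a 1/1000 transit dip in these counts
print(f"precision ~ {precision:.1e}, dip SNR ~ {dip / shot_noise:.1f}")
# precision ~ 4.7e-04, dip SNR ~ 2.1
```

        In practice scintillation, sky background and readout noise all add to the error budget, so the real precision is worse than pure shot noise -- hence binning over minutes rather than trusting single exposures.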

        After this it's down to PhDs and Postdocs to sift through all the data, write automatic routines to generate light curves for all the stars and so on. Google sextractor, don't worry, it's SFW ;) .

        • Also remember that these are typically aperture photometry measurements, so the peak pixel could be 20,000 counts and you have an 8-16 pixel neighbourhood that also contributes so could easily get 100,000 counts within your aperture for a single exposure. The dark noise on the SuperWASP CCDs is extremely low: 72 electrons per pixel per hour.
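          A minimal sketch of aperture photometry on a synthetic star (NumPy; the PSF width, counts and sky level are made up to mirror the figures above):

```python
import numpy as np

# Synthetic 9x9 postage stamp: a Gaussian star PSF (sigma = 1 px,
# ~100,000 total counts) on a flat sky background of 50 counts/px.
y, x = np.mgrid[0:9, 0:9]
star = 100_000 / (2 * np.pi) * np.exp(-((x - 4) ** 2 + (y - 4) ** 2) / 2)
image = star + 50.0

# Aperture photometry: sum every pixel within radius 3 of the star,
# then subtract the sky contribution inside the aperture.
r = np.hypot(x - 4, y - 4)
aperture = r <= 3
flux = image[aperture].sum() - 50.0 * aperture.sum()

print(f"peak pixel: {image.max():.0f}  aperture flux: {flux:.0f}")
```

          The peak pixel holds only ~16,000 counts, but the aperture collects nearly the full ~100,000 -- exactly the effect described above.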
          • That's very, very cool. How long are exposures? Are these devices effectively counting photons at each pixel?

            I'm still waiting for that sensor that reports timestamp, energy, angle of incidence, and x/y coordinates for each photon that hits it. THEN the fun can start.

            • Exposure times on SuperWASP are around 30 seconds according to them. The sensor quantum efficiency is 90% so it's close to counting photons (don't quote me!), I think in practice it's a bit more complex. They're multi-stage-Peltier cooled, backthinned, e2v, blah blah blah. Plus other amazing things like 1% linearity over the whole dynamic range, around 20 electrons readout noise and so on. http://arxiv.org/abs/astro-ph/... [arxiv.org]
      • Re:Not much aperture (Score:4, Interesting)

        by MouseR ( 3264 ) on Thursday January 15, 2015 @04:31PM (#48823647) Homepage

        Just a note regarding "can view twelve targets at once".

        That's just not the point of a telescope array; rather the contrary. The point is to utilise a large number of smaller telescopes pointed at the same object to gather more light. This simulates a larger mirror without the greater atmospheric distortion a big mirror suffers. Anything above 12" gets really finicky about distortion, requiring a laser guide star to help compensate: the laser creates a reference point in the upper atmosphere that the adaptive optics compare against to correct the image. All that is terribly expensive.

        The real advancement is in software, where all of the (in this case 12) telescopes in the array are composited into a single image of greater accuracy and resolution.
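        A toy sketch of the compositing idea (Python/NumPy; the frame size and noise level are hypothetical): averaging frames from N telescopes beats the per-frame noise down by sqrt(N):

```python
import numpy as np

rng = np.random.default_rng(1)

scene = np.zeros((32, 32))
scene[16, 16] = 1.0            # a faint point source

# Twelve telescopes image the same field; each frame carries
# independent Gaussian noise of standard deviation 0.5.
frames = [scene + rng.normal(0, 0.5, scene.shape) for _ in range(12)]
stacked = np.mean(frames, axis=0)

# Background noise in the stack should be ~0.5 / sqrt(12) ~ 0.14
background = stacked[scene == 0]
print(f"stacked background noise: {background.std():.3f}")
```

        This is stacking for signal-to-noise only; true interferometric combination (combining the light coherently for resolution) is a far harder problem.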

        • In the case of NGTS and SuperWASP most of the time the telescopes aren't looking at the same target. The purpose of this array is to observe large swathes of the sky simultaneously so each camera has a distinct field of view of around 8x8 degrees which can be mosaiced together. In principle they could also observe a target simultaneously in different filter bands, but I think normally they would pass that duty over to the VLT to gather much more light.

          Also there are plenty of telescopes in the 1-2m class.

    • Not much aperture (Score:5, Interesting)

      by Whiternoise ( 1408981 ) on Thursday January 15, 2015 @01:22PM (#48821261)

      I would say it's observation time on thousands of potential targets. Who's going to do it?

      You don't need adaptive optics or anything fancy, exoplanet hunting is (mostly) measuring quantities of light. Whether that light's been bent a little through the atmosphere and lands on a nearby pixel makes little difference. All you end up doing is using a larger photometric aperture (a circle of pixels that you consider to be the star). Adaptive optics is useful for other things, but for transit detection, meh. Observatories regularly defocus stars (into donut shapes) if they're getting too much light from a star in the field - this is a surprisingly common problem with huge mirrors.

      You can observe exoplanet transits with a DSLR and a small telescope if you have the patience. It's a matter of finding bright stars. Again, you're not going for high resolution or magnification, you're just measuring light. By taking repeated observations, binning your data, and phase-folding (plotting the data as a function of orbital phase) you can increase your signal to noise. The signal is maybe 0.1% of the light, but if you measure 1,000,000 counts then that 1,000-count dip is probably above the noise.
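      The fold-and-bin step can be sketched like this (Python; the period, transit depth and noise figures are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

period = 3.5                                  # assumed orbital period, days
t = rng.uniform(0, 70, 20_000)                # ~20 orbits of observations

# Simulated normalized light curve: flat star with a 0.1% transit dip,
# buried under per-point noise twice as deep as the dip itself.
phase = (t % period) / period
flux = np.ones_like(t)
flux[np.abs(phase - 0.5) < 0.02] -= 0.001     # the transit
flux += rng.normal(0, 0.002, t.size)

# Fold on the period and bin: averaging ~400 points per bin cuts the
# noise by sqrt(400) = 20, pulling the dip clear of the scatter.
edges = np.linspace(0, 1, 51)
idx = np.digitize(phase, edges) - 1
binned = np.array([flux[idx == i].mean() for i in range(50)])

print(f"deepest bin at phase {edges[binned.argmin()]:.2f}: {binned.min():.4f}")
```

      The deepest bins land at the transit phase even though no individual measurement shows the dip -- which is the whole trick.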

      Big observatories cost a lot of money to run and are highly competitive. If you have an extremely strong case for a follow-up observation (e.g. Kepler spotted something and you want to observe it further) then you can get time, but really we'd like surveys that will stare at hundreds of thousands of stars for months on end. Amateur networks like the AAVSO (variable stars) are very valuable because they provide free, virtually continuous data for hundreds of stars. It's simple, boring work that isn't feasible with big-shot observatories; it would be a waste of instrument capabilities.

      Satellites can do this, but they can't store the data, they normally only provide flags that say "this star looks like a good candidate". So the benefit of something like this telescope array is that it can generate vast amounts of data (continuously) and we can actually store it for processing later.

    • by starless ( 60879 )

      I wonder what the real step forward is (field of view? accuracy? software?), because that is not much aperture. 1.5 square meters in all, compared to 6 square meters on Kepler and 18 square meters on Hubble.

      The collecting area of HST is ~4.5 m^2. (2.4/2)^2 x pi
      Collecting area of Kepler is ~0.7 m^2
      http://en.wikipedia.org/wiki/H... [wikipedia.org]
      http://en.wikipedia.org/wiki/K... [wikipedia.org]

      You also have to consider e.g. field of view and observation durations for use in planet searches.
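      The corrected areas follow directly from the aperture diameters (a quick check in Python, taking Kepler's 0.95 m photometer aperture):

```python
import math

def collecting_area(diameter_m: float) -> float:
    """Geometric area of a circular aperture, ignoring any central obstruction."""
    return math.pi * (diameter_m / 2) ** 2

print(f"HST (2.4 m):     {collecting_area(2.4):.2f} m^2")    # 4.52
print(f"Kepler (0.95 m): {collecting_area(0.95):.2f} m^2")   # 0.71
```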
