
6.7 Meter Telescope To Capture 30 Terabytes Per Night 67

Lumenary7204 writes "The Register has a story about the Large Synoptic Survey Telescope, a project to build a 6.7 meter effective-diameter ground-based telescope that will be used to map some of the faintest objects in the night sky. Jeff Kantor, the LSST Project Data Manager, indicates that the telescope should be in operation by 2016, will generate around 30 terabytes of data per night, and will 'open a movie-like window on objects that change or move on rapid timescales: exploding supernovae, potentially hazardous near-Earth asteroids, and distant Kuiper Belt Objects.' The end result will be a 150 petabyte database containing one of the most detailed surveys of the universe ever undertaken by a ground-based telescope. The telescope's 8.4 meter mirror blank was recently unveiled at the University of Arizona's Mirror Lab in Tucson."
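As a quick sanity check on the summary's figures, the 30 TB/night rate is consistent with a 150 PB final database once processed products are included. The nights-per-year and survey-length values below are assumptions for illustration, not from the article:

```python
# Back-of-envelope check of the data volumes quoted in the summary.
# Observing cadence and survey length are assumptions, not from the article.
TB_PER_NIGHT = 30
NIGHTS_PER_YEAR = 300          # assumed usable observing nights per year
SURVEY_YEARS = 10              # assumed survey length

raw_tb = TB_PER_NIGHT * NIGHTS_PER_YEAR * SURVEY_YEARS
raw_pb = raw_tb / 1000
print(f"Raw pixel data over the survey: ~{raw_pb:.0f} PB")
# ~90 PB of raw pixels; the quoted 150 PB database presumably also
# holds calibrated images and object catalogs on top of the raw data.
```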
  • 30TB raw? (Score:3, Interesting)

    by Jah-Wren Ryel ( 80510 ) on Saturday October 04, 2008 @01:57AM (#25254467)

    When I worked at the CFHT a few decades ago, they had a bunch of "data reduction" algorithms they ran on each night's data that reduced the amount they needed to store by at least a factor of 10.
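    One simple form such a reduction can take is co-adding repeated exposures of the same field into a single frame, cutting storage by the number of exposures. A toy sketch (illustrative only; real pipelines also do calibration, cosmic-ray rejection, and so on):

```python
import random

# Toy "data reduction": co-add N exposures of the same field into one
# frame, reducing storage by a factor of N. Field size and exposure
# count here are arbitrary illustration values.
random.seed(0)
WIDTH, N_EXPOSURES = 8, 10
exposures = [[random.gauss(100.0, 5.0) for _ in range(WIDTH)]
             for _ in range(N_EXPOSURES)]

# Average each pixel position across all exposures.
coadd = [sum(col) / N_EXPOSURES for col in zip(*exposures)]

raw_pixels = N_EXPOSURES * WIDTH
reduced_pixels = len(coadd)
print(f"{raw_pixels} pixels in -> {reduced_pixels} pixels out "
      f"({raw_pixels // reduced_pixels}x reduction)")
```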

  • Re:Why not... (Score:5, Interesting)

    by Nyeerrmm ( 940927 ) on Saturday October 04, 2008 @03:23AM (#25254705)

    The basic problem is that a 6.4 meter aperture can't fit in a launch vehicle; Ares V is only slated to be 5.5 meters.

    Hubble was built at the diameter it was (2.4 meters) because that's about the largest rigid mirror you could build that held its shape well enough through launch to remain optically sound on orbit. When you require 10-nm-level precision, it takes a hefty structure to keep things that stiff.

    In order to go bigger, the methods being used for James Webb double the aperture while halving the weight. They do this with active controls and sensors that correct errors, rather than relying on avoiding all errors. But notice that James Webb focuses on the IR, which is very hard to observe from Earth, while no comparable optical-band concept is out there. This is because the new, big Earth-bound scopes use adaptive optics to eliminate seeing errors (the atmospheric variations that Hubble avoids), and with larger mirrors they can potentially get better images than Hubble.

    Of course, if the money shows up, there are other advantages to a space-based observatory, particularly access time and not having to worry about the effectiveness of the adaptive elements. So I'm sure we'll see a proper Hubble replacement eventually, but it's certainly not as critical for scientific progress as some might think.

  • by lysergic.acid ( 845423 ) on Saturday October 04, 2008 @03:48AM (#25254773) Homepage

    we might be reaching the physical (or practical) limit of data density for hard disk platters, but we'll probably just move to new technologies. it's very unlikely that magnetic disk drives are the pinnacle of data storage technology. there are probably more efficient storage media in the works already.

    i don't know what can currently match magnetic disk drives in terms of IO speed, but holographic storage [wikipedia.org] shows a lot of promise. in theory, holographic storage can read/write millions of bits of data in parallel rather than one bit at a time as with conventional optical or magnetic media. the theoretical limit of holographic storage density is tens of Tb (terabits) per cm^3. and already commercial efforts have achieved 500 Gb per square inch (about 5x the density achievable on magnetic hard disks).
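    To put that areal-density figure in perspective, here's what 500 Gb per square inch works out to for a CD-sized disc. The disc geometry is an assumption for illustration, not a figure from the comment:

```python
import math

# Rough capacity of a CD-sized holographic disc at the areal density
# quoted in the comment (500 Gb per square inch). Disc radius is an
# assumed illustration value; the hub hole is ignored.
GBIT_PER_SQ_IN = 500
DISC_RADIUS_CM = 6.0           # ~12 cm disc
CM2_PER_IN2 = 2.54 ** 2

area_in2 = math.pi * DISC_RADIUS_CM ** 2 / CM2_PER_IN2
capacity_tbyte = GBIT_PER_SQ_IN * area_in2 / 8 / 1000  # Gbit -> TByte
print(f"~{capacity_tbyte:.1f} TB per disc")
```

    That's roughly a terabyte per disc at 2008-era densities, before the volumetric (per-cm^3) potential even enters the picture.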

    holograms also have some very interesting properties that may or may not transfer to digital data storage. for instance, if you record a hologram of a 3D object onto a photographic plate, you can in essence reproduce a 3D image of the whole object with any piece of that photographic plate. Wikipedia explains this phenomenon thusly:

    Since each point in the hologram contains light from the whole of the original scene, the whole scene can, in principle, be re-constructed from an arbitrarily small part of the hologram. To demonstrate this concept, the hologram can be broken into small pieces and the entire object can still be seen from each small piece. If one envisions the hologram as a "window" on the object, then each small piece of hologram is just a part of the window from which it can still be viewed, even if the rest of the window is blocked off.

    since holographic data storage also uses optical interference patterns to store information, i guess it's possible that this phenomenon would transfer over, though it might not, since we're talking about digital data in this case rather than analog data. with analog data, losing part of the interference pattern simply reduces the resolution of the holographic image, which remains whole; with digital data, that loss of resolution could simply corrupt the data. but i don't know, i'm not a holography expert.
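    The corruption worry is exactly what erasure coding addresses, independent of the storage medium. A minimal sketch (XOR parity, the simplest possible scheme): one parity block lets you rebuild any single lost data block.

```python
# Minimal erasure-coding sketch: a single XOR parity block allows
# recovery of any one lost data block. Block contents are arbitrary
# example bytes; real systems use stronger codes (e.g. Reed-Solomon).
data_blocks = [b"spec", b"trum", b"data"]
parity = bytes(a ^ b ^ c for a, b, c in zip(*data_blocks))

# Simulate losing block 1, then rebuild it from the survivors + parity.
survivors = [data_blocks[0], data_blocks[2], parity]
rebuilt = bytes(a ^ b ^ c for a, b, c in zip(*survivors))
print(rebuilt)  # b'trum'
```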

  • by NixieBunny ( 859050 ) on Saturday October 04, 2008 @03:02PM (#25257967) Homepage
    I used to think that until I started to work for astronomers. They actually take photos of noise, and then add noisy images together ("stacking") until pictures of interesting faraway things emerge from the noise. That's why a nightly sky survey is so useful - you can add together a few months of images and see stuff that you would never have seen in a single image.
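    The statistics behind stacking are simple: averaging N frames shrinks the noise by a factor of sqrt(N), so a source far below the single-frame noise floor eventually emerges. A toy demo with made-up signal and noise levels:

```python
import random
import statistics

# Toy "stacking" demo: a faint source at +0.5 counts is buried in
# per-frame noise (sigma = 5), but averaging N frames cuts the noise
# by sqrt(N) until the source stands out. All values are illustrative.
random.seed(42)
SIGNAL, SIGMA, N_FRAMES = 0.5, 5.0, 10_000

frames = [SIGNAL + random.gauss(0.0, SIGMA) for _ in range(N_FRAMES)]
stacked = statistics.fmean(frames)
stacked_sigma = SIGMA / N_FRAMES ** 0.5   # expected noise after stacking

print(f"stacked value ~{stacked:.2f}, stacked noise ~{stacked_sigma:.2f}")
# The 0.5-count source, invisible in any single frame, now sits about
# 10 sigma above the stacked noise level.
```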
