World's Largest Digital Camera Project Passes Critical Milestone

An anonymous reader writes in with a link about the progress of one of the coolest astronomy projects around. "A 3.2 billion-pixel digital camera designed by SLAC National Accelerator Laboratory is now one step closer to reality. The Large Synoptic Survey Telescope camera, which will capture the widest, fastest and deepest view of the night sky ever observed, has received 'Critical Decision 1' approval by the U.S. Department of Energy (DOE) to move into the next stage of the project. The Large Synoptic Survey Telescope (LSST) will survey the entire visible sky every week, creating an unprecedented public archive of data – about 6 million gigabytes per year, the equivalent of shooting roughly 800,000 images with a regular eight-megapixel digital camera every night, but of much higher quality and scientific value. Its deep and frequent cosmic vistas will help answer critical questions about the nature of dark energy and dark matter and aid studies of near-Earth asteroids, Kuiper belt objects, the structure of our galaxy and many other areas of astronomy and fundamental physics."
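The data-rate comparison in the summary is easy to sanity-check in a few lines. The bytes-per-pixel and nights-per-year figures below are my own illustrative assumptions, not LSST specifications:

```python
# Rough check of the claim: ~6 million GB/year vs. ~800,000
# eight-megapixel frames per night. Assumptions (not LSST specs):
# 16-bit raw pixels, observing every night of the year.
BYTES_PER_PIXEL = 2
NIGHTS_PER_YEAR = 365

nightly_bytes = 800_000 * 8_000_000 * BYTES_PER_PIXEL  # ~12.8 TB/night
yearly_gb = nightly_bytes * NIGHTS_PER_YEAR / 1e9      # gigabytes per year

print(f"~{yearly_gb / 1e6:.1f} million GB/year")       # ~4.7 million GB/year
```

That lands in the same ballpark as the quoted 6 million gigabytes; the gap plausibly comes from calibration frames and overheads beyond the raw pixel data.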
  • Re:Sad for NASA (Score:5, Informative)

    by Jeng ( 926980 ) on Wednesday April 25, 2012 @04:23PM (#39799479)

    It is politically beneficial for politicians to cut NASA's funding, but other agencies want these projects done, so they do them themselves because they actually have the funding for it.

    So yea, it is sad for NASA, but it's not NASA's fault.

  • Re:WOW (Score:4, Informative)

    by Savantissimo ( 893682 ) on Wednesday April 25, 2012 @05:22PM (#39800101) Journal

    More to be said - here's the scientific FAQ: []
    Choice bits:

    ...That combination is unique: wide field of view (10 square degrees), short exposures (pairs of 15-second exposures), and sensitive camera (24th magnitude single images, 27th magnitude stacked). ...
    The etendue of LSST is 320 square meters square degrees. A primary mirror diameter of 8.4 m (effective aperture 6.7 m due to obscuration) is the minimum diameter that simultaneously satisfies the depth (24.5 mag depth per single visit and 27.5 mag for coadded depth) and cadence (revisit time of 3-4 days, with 30 seconds per visit) constraints....
    The nominal high-SNR sample (defined by i < 25 for point sources) will include four billion galaxies (55 per square arcminute) with the mean photometric redshift accuracy of 1-2% (relative error for 1+z), and with only 10% of the sample with errors larger than 4%. The median redshift for this sample will be z=1.2, with the third quartile at z=2. ...
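    The étendue quoted in the FAQ is just collecting area times field of view, which is easy to check. The 9.6 deg² effective field of view below is my assumption; the FAQ quotes a ~10 deg² field and a 6.7 m effective aperture:

```python
import math

effective_aperture_m = 6.7    # effective aperture from the FAQ
fov_deg2 = 9.6                # assumed effective field of view

area_m2 = math.pi * (effective_aperture_m / 2) ** 2  # ~35.3 m^2
etendue = area_m2 * fov_deg2                         # ~338 m^2 deg^2
print(f"etendue ~ {etendue:.0f} m^2 deg^2")
```

    This lands near the quoted 320 square meters square degrees; the remaining difference presumably reflects vignetting losses beyond the simple obscuration correction.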

    Q: Will the full resolution, full depth image data be available to download?

    A: Yes. There will be a range of data products and download portals. The LSST data system is being designed to enable as wide a range of science as possible. Standard data products, including calibrated images and catalogs of detected objects and their attributes, will be provided both for individual exposures and the deep incremental data coaddition. For the "static" sky, there will be yearly database releases listing many attributes for billions of objects. This database will grow in size to about 30 PB and about 20 billion objects.
    As in the SDSS, we expect a power law of user interactions with the data. At one end of this distribution are simple lookup queries or color jpeg cutout downloads. At the other end are huge statistical calculations over the entire database, and image operation scripts on billions of objects. The data management system is budgeted to handle most but not all of that distribution. Institutions joining LSST early, and members of the LSST Science Collaborations, will have the customary advantage of deep familiarity with the LSST system and survey.

  • by MetricT ( 128876 ) on Wednesday April 25, 2012 @05:37PM (#39800291)

    I'm with the group at Vanderbilt developing the storage filesystem for LSST, and it has some interesting challenges.

    1. It requires redundancy at the server, rack, and site level.
    2. Both data and metadata have to scale in volume and in throughput (GB/sec or transactions/sec) independently of each other.
    3. It has to work on the WAN level (GPFS & Lustre don't scale beyond the LAN yet).
    4. It should optionally have HSM functionality so you can offload stuff to tape.
    5. The data must be maintained in perpetuity so researchers years/decades from now can use it.
    6. It must be portable across operating systems so Windows/Mac/Linux/etc users can all access the data.
    7. All of this should be completely transparent to the user.
    8. And it has to be done on the cheap (a scientist's definition of cheap, not a CIO's).

    Yeah, it can be (and is being) done. We're already using our filesystem to store 2+ PB of data for the CERN CMS-HI experiment on commodity hardware. But I can tell you it is a substantially harder problem than you think.
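    As an illustration of requirement 1, here's a minimal failure-domain-aware replica-placement sketch. The names and the greedy policy are hypothetical, not the actual Vanderbilt/LSST filesystem code:

```python
from collections import namedtuple

# A storage node identified by its three failure domains.
Node = namedtuple("Node", ["server", "rack", "site"])

def place_replicas(nodes, n_replicas=3):
    """Greedily pick n_replicas nodes so that no two chosen nodes
    share a server, a rack, or a site."""
    chosen = []
    for node in nodes:
        if all(node.server != c.server and node.rack != c.rack
               and node.site != c.site for c in chosen):
            chosen.append(node)
            if len(chosen) == n_replicas:
                return chosen
    raise RuntimeError("not enough independent failure domains")

cluster = [
    Node("srv1", "rack1", "siteA"),
    Node("srv2", "rack1", "siteA"),  # shares rack1/siteA: skipped
    Node("srv3", "rack2", "siteB"),
    Node("srv4", "rack3", "siteC"),
]
replicas = place_replicas(cluster)
print([n.server for n in replicas])  # ['srv1', 'srv3', 'srv4']
```

    Production systems (e.g. Ceph's CRUSH placement) solve the same problem deterministically and handle rebalancing, rather than picking greedily.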