World's Largest Digital Camera Project Passes Critical Milestone
An anonymous reader writes in with a link about the progress of one of the coolest astronomy projects around. "A 3.2 billion-pixel digital camera designed by SLAC National Accelerator Laboratory is now one step closer to reality. The Large Synoptic Survey Telescope camera, which will capture the widest, fastest and deepest view of the night sky ever observed, has received 'Critical Decision 1' approval by the U.S. Department of Energy (DOE) to move into the next stage of the project. The Large Synoptic Survey Telescope (LSST) will survey the entire visible sky every week, creating an unprecedented public archive of data – about 6 million gigabytes per year, the equivalent of shooting roughly 800,000 images with a regular eight-megapixel digital camera every night, but of much higher quality and scientific value. Its deep and frequent cosmic vistas will help answer critical questions about the nature of dark energy and dark matter and aid studies of near-Earth asteroids, Kuiper belt objects, the structure of our galaxy and many other areas of astronomy and fundamental physics."
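A quick sanity check of those numbers (the assumptions here are mine, not from the summary: 16-bit pixels, uncompressed frames, and roughly 365 observing nights per year):

# Rough check of the data-rate comparison in the summary above.
ARCHIVE_PER_YEAR_GB = 6_000_000   # "about 6 million gigabytes per year"
IMAGES_PER_NIGHT = 800_000        # "roughly 800,000 images ... every night"
MEGAPIXELS = 8                    # "a regular eight-megapixel digital camera"
BYTES_PER_PIXEL = 2               # assumed 16-bit samples

archive_tb_per_night = ARCHIVE_PER_YEAR_GB / 365 / 1000
frames_tb_per_night = IMAGES_PER_NIGHT * MEGAPIXELS * 1e6 * BYTES_PER_PIXEL / 1e12

print(f"archive rate:          ~{archive_tb_per_night:.1f} TB/night")
print(f"800k 8-MP raw frames:  ~{frames_tb_per_night:.1f} TB/night")
# ~16 TB vs ~13 TB per night -- the comparison is the right ballpark.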
Re:Sad for NASA (Score:5, Informative)
It is politically beneficial for politicians to cut NASA's funding, but other agencies still want these projects done, so they take them on because they actually have the funding to do them.
So yea, it is sad for NASA, but it's not NASA's fault.
Re:WOW (Score:4, Informative)
More to be said - here's the scientific FAQ: http://www.lsst.org/lsst/faq-science [lsst.org]
Choice bits:
Re:Opening the JPEG takes Eternity (Score:5, Informative)
I'm with the group at Vanderbilt developing the storage filesystem for LSST, and it has some interesting challenges.
1. It requires redundancy at the server, rack, and site level (see the placement sketch at the end of this comment).
2. Both data and metadata have to scale both in volume and in throughput (GB/sec or transactions/sec), independently of each other.
3. It has to work at the WAN level (GPFS & Lustre don't scale beyond the LAN yet).
4. It should optionally have HSM (hierarchical storage management) functionality so you can offload stuff to tape.
5. The data must be maintained in perpetuity so researchers years/decades from now can use it.
6. It must be portable across operating systems so Windows/Mac/Linux/etc users can all access the data.
7. All of this should be completely transparent to the user.
8. And it has to be done on the cheap (a scientist's definition of cheap, not a CIO's).
Yeah, it can be done (and is being done). We're already using our filesystem to store 2+ PB of data for the CERN CMS-HI experiment on commodity hardware. But I can tell you it is a substantially harder problem than you think.
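For anyone curious what point 1 looks like in practice, here's a minimal sketch in Python. It is entirely hypothetical (not our filesystem's actual policy, and every name in it is made up): pick replicas so copies spread across sites first, then racks, then individual servers, which is the same basic idea behind Ceph's CRUSH map or HDFS rack awareness.

import random
from dataclasses import dataclass

@dataclass(frozen=True)
class Server:
    name: str
    rack: str
    site: str

def place_replicas(servers, n_replicas=3):
    """Greedily pick replicas, preferring candidates that share the fewest
    failure domains (site, then rack, then server) with replicas already chosen."""
    pool = list(servers)
    random.shuffle(pool)              # spread load across equivalent candidates
    chosen = [pool.pop()]
    while len(chosen) < n_replicas and pool:
        def overlap(s):
            # weighted count of shared failure domains; lower is better
            return sum((s.site == c.site) * 4 + (s.rack == c.rack) * 2 + (s.name == c.name)
                       for c in chosen)
        best = min(pool, key=overlap)
        pool.remove(best)
        chosen.append(best)
    return chosen

# Hypothetical four-node, two-site cluster:
cluster = [
    Server("n01", rack="r1", site="site-a"),
    Server("n02", rack="r1", site="site-a"),
    Server("n03", rack="r2", site="site-a"),
    Server("n04", rack="r7", site="site-b"),
]
print(place_replicas(cluster))   # 3 replicas on 3 different racks, spanning both sites

The real system also has to do the equivalent for metadata, keep working when a whole site drops off the WAN, and stay cheap, which is where the hard part actually is.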