LHC Data Generation Expected To Scale Up To 400PB a Year
DW100 writes: CERN has said it expects its experiments with the Large Hadron Collider to generate as much as 400PB of information per year by 2023 as the scope of its work continues to expand. Currently LHC experiments have generated an archive of 100PB, and this is growing by 27PB per year. CERN infrastructure manager Tim Bell, speaking at the OpenStack Summit in Paris, said the organization is using OpenStack to underpin this huge data growth, hoping it can handle such vast reams of potentially universe-altering information.
universe-altering information? (Score:3)
You mean how we see the universe? Because I doubt the universe cares much about the data we generate....
A new theory (Score:1)
Another approach is to conceive of a completely different model. I have come up with a model different from the Standard Model, Quantum Mechanics, and String Theory.
Spring-And-Loop Theory [just-think-it.com] resolves issues the other three theories are stuck on. It is also simpler. Unifies the four forces. And works from the very small to the very large.
But it is a different approach. Most are not ready for this.
Re: (Score:2)
If you want people to take you seriously then you need to show that you understand the current knowledge of physics. If you cannot do that then how ca
Re: (Score:2)
Thanks for your reply.
Regarding my mention of the Standard Model, you are quite right. Please replace that with "classic physics" or "old physics" or "non-relativistic non-quantum-mechanical" physics, or "the physics of Newton". It doesn't change my point, or my theory, one iota.
As to my comments about light, in my first COASALT talk, Mr. David Thornley [slashdot.org] made this very same observation. It is rather like complaining that the numbers I chose for the street address of my building are a little too
Re: (Score:1)
You're a crackpot lunatic that has no understanding of physics whatsoever.
Re: (Score:2)
(2) I think you failed to grasp what I meant by "absurdly limiting notions". Say you have an idea, that does not appear to be testable. The consensus today would be your idea is worthless because it is untestable. I think that is absurdly limiting because (a) it could still allow you to see how something works more clearly, (b) it might become testable in the fut
Re: (Score:2)
Completely agree.
Agreed also. I wouldn't want you to think I am just dumping my theory on the world and walking away. For anything to have a value, it must grow, and grow healthily.
Re: (Score:2)
Now, rather than waste my time giving a blow by blow account of all your other nonsense claims, how
Re: (Score:2)
The observed CMB "temperature" is indeed uniform, but this destroys the Big Bang theory [wikipedia.org], not my theory.
Thanks for the link to the Gamow paper. I'll have a look at it but really, if scientists are out by 10^120 in their measure of the background energy of space, how likely are they to d
Re: (Score:2)
Yes you can argue that the effect is too small to measure but at this point you might as well be claiming that the universe is full of flying pigs that ju
Re: (Score:2)
I'm going to look at your Lorentz paper.
And you are going to relax. Don't worry about me. Stick to physics, it is clearly what you do best.
Best,
Floyd
Re: (Score:1)
But it is a different approach. Most are not ready for this.
Even if right, you're not ready for it, unless I missed pages with precise calculations of various values current theories can predict to high accuracy. Random ideas of how to do things differently are kind of a dime a dozen, to the point most physicists in such fields have more ideas than they have grad students to do the grunt work of getting quantitative predictions out of it. And some numeric predictions are easy to make, considering how simple the Rydberg formula for hydrogen-like spectral lines is,
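For what it's worth, the Rydberg formula the parent mentions really is simple enough to check in a few lines. A sketch (using the standard Rydberg constant for infinite nuclear mass; the n=3 to n=2 transition is hydrogen's red H-alpha line):

```python
# Rydberg formula for hydrogen-like spectral lines:
#   1/lambda = R * Z^2 * (1/n1^2 - 1/n2^2)
R = 1.0973731568e7  # Rydberg constant, in m^-1

def wavelength_nm(n1, n2, Z=1):
    """Wavelength of the n2 -> n1 transition, in nanometres."""
    inv_lambda = R * Z**2 * (1.0 / n1**2 - 1.0 / n2**2)
    return 1e9 / inv_lambda

# Balmer series, n=3 -> n=2: the H-alpha line
print(round(wavelength_nm(2, 3), 1))  # 656.1 nm
```

That is within a fraction of a nanometre of the measured line, which is exactly the kind of quantitative prediction a new theory would need to reproduce.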
Re: (Score:3)
(2) As to "new theories are a dime a dozen, I get two new ones a week." Do you ever ask yourself why this is so? Do chemists get two new theories of chemistry a week? No. Because they have a good base model. I maintain that physics lacks a good base model.
The most obviously broken parts of physics, like the inflat
Re: (Score:1)
(1) You want specific predictions?....
That was a single calculation, yet you are claiming vast implications. If you discuss the impact on the structure of an atom, then you should be able to present calculations of the atomic spectrum. There would be no need then to ramble on about more complex issues or problems associated with measurements in deep space etc.; instead you could work with data that at the simplest level can be collected in a high school physics lab, although it has been done in detail to very high precision in bette
Re: (Score:2)
(2) Books and books have indeed been written about the Big Bang, etc. And I imagine in Ptolemy's time the same was true. It was certainly true with Newton. And of course Einstein. While String Theory probably caused bookstores to open up whole new wings. Is "number of books" your metric?
Re: (Score:2)
I am not sure why the original AC mentioned "spectroscopic data", nor am I sure how it is to be used to test my theory.
Spring-And-Loop Theory has a Planck-scale basis. Hence my repeated point that it will need to be simulated to allow bigger things (like atoms) and much bigger things (like Earth) and still bigger things (like Solar Systems) and the biggest things (like galaxies) to be modeled.
Spring-And-Loop Theory is like LEGO. It is a building system, not a smashing-things-
Re: (Score:2)
The Planck scale is 25 [wikipedia.org] orders of magnitude smaller.
10 million million million million times smaller.
If present spectroscopy's best resolution were the 15 billion light years we can see back in time, Planck scale resolution would be seeing things the size of a tree.
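The analogy is easy to check arithmetically. A sketch, taking 15 billion light years as the look-back distance and the 25-order-of-magnitude gap from the parent posts:

```python
# Scale the 15-billion-light-year look-back distance down by
# 25 orders of magnitude, as the analogy does.
LY_IN_M = 9.4607e15           # metres in one light year
lookback_m = 15e9 * LY_IN_M   # ~1.4e26 m
scaled_m = lookback_m / 1e25  # ~14 m: roughly the size of a tree
print(round(scaled_m, 1))     # 14.2
```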
Re: (Score:1)
Who cares? Whether it is right or wrong is all that matters.
In response to a post that asks exactly where the numbers are, tries to get to that very point, and gives a suggestion on how to do so in as straightforward a way as possible, you instead go off on a tangent waffling about people being scared. Whether intentional or not, you are being very obtuse in both your comments and on your website. If all that matters is whether it is right or wrong, you're impeding any communication efforts by burying actual claims among a lot of fluff. The testable implications is
Re: (Score:2)
Out of curiosity, why is it you post as an AC? In fact, most of the nastiest posts in this sub-thread are from ACs. What do you hope to accomplish by this? With me, your attacks are, if anything, proof of the value of my theory.
If what I was proposing was truly nonsense, the proper response would be to ignore it or offer a kindly word of condolence.
Your level of anger an
Re: (Score:2)
It is that the periodic changes and refinements are not Earth-shattering.
When you get things right with the foundation of your house, you can build a house that will last. You will still have issues along the way, and there will be maintenance. But you won't have to build 10^500 houses, or anti-houses, or a house that becomes a cloud that becomes a house again.
Re: (Score:2)
I've answered this, in the original paper, when I explained what Spring-And-Loop Theory thinks "dark energy" is.
By the way, my theory's explanation of dark energy also happ
Re: (Score:3)
I sat through a lecture on the Higgs Boson. It explained why they were expecting it -- basically the final jigsaw puzzle piece to a long-time theory. If the theory was correct, they would be able to find the Higgs Boson at certain energy levels. If they didn't find it, then it's back to the drawing board to figure out what they missed. So no, they weren't necessarily doing basic "Let's ram particles together and see what we get" science -- we've been doing that for decades. This was more of a "If we ra
Re: (Score:2)
you shouldn't be expecting anything, but looking for something that has nothing to do with previous human theories.
I can't agree with you. Science is completely dependent on falsifying theories. If Einstein made a theory, then it is highly scientific to try to falsify that theory, especially at its margins and where results have never been confirmed. If it holds up, great! He's right again. If it fails, great! Time for an improved theory. That's the fun thing about science: the learning about the natural world never ends. Any result of an experiment always leads to more work, more learning. And no one says wh
Re: (Score:2)
Unless the data is so heavy, that they warp space.
HEAVY! Space warps! [strat-talk.com]
Re: (Score:2)
Well, universe damn well should care! Once we trigger the next phase change in the currently metastable quantum space, there'll be no going back. There won't even be time so even talking about going back makes no sense.
It's a bit like kids, once you get one started, you won't have much time (or any, depending on your moral values and local laws) to get rid of it, before it will trigger irreversible phase change in your life.
If only someone had explained these things to our universe, before it started to exp
Re: (Score:2)
Technically speaking, everything we generate, including data, alters the universe.
Re: (Score:2)
Oh, I don't know. Eventually we'll have so many hard drives dedicated to it that it'll collapse into a black hole.
Or - wait for it - the computing power requirements scale so large that the only way to keep the whole enterprise going is to build a Dyson sphere.
Maybe the universe won't care even then, but we'll at least come closer to leaving our mark!
Compared to Facebook (Score:4, Informative)
To put this in perspective, Facebook states [facebook.com] that it generates 4 PB per day, about 3.6 times more than the LHC. Does anybody know of anything generating more data than that?
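The arithmetic behind that ratio, for anyone checking (a sketch using the figures quoted in this thread: 4 PB/day for Facebook, 400 PB/year projected for the LHC by 2023):

```python
# Annualise Facebook's quoted daily rate and compare with the
# LHC's projected yearly rate.
facebook_pb_per_year = 4 * 365  # 1460 PB/year
lhc_pb_per_year = 400           # projected for 2023
ratio = facebook_pb_per_year / lhc_pb_per_year
print(ratio)                    # 3.65, i.e. "3.6 times"
```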
Actually the LHC is bigger! (Score:3)
While Facebook generates 4 PB of new data per day they o
Re: (Score:2)
NSA, they generate all the data of everyone combined.
Re: (Score:2)
Facebook is generating 4 PB per day *now*, while the LHC will be generating 400PB per year by *2023*. 27PB to 400PB in 9 years is MUCH slower than Moore's Law, so their annual storage costs/space requirements will decrease each year.
With the highest density servers I know of (1U 136TB SSD servers), LHC generates around five racks of data per year today. By 2023, they will only be generating around one rack of data per year, based on an 18-month Moore's Law.
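The parent's estimate can be sanity-checked (a sketch; the 42U-per-rack figure is my assumption, not stated in the post, alongside its 136 TB/U servers, decimal PB, and an 18-month density doubling):

```python
# Racks of storage needed per year, now and in 2023, under the
# post's assumptions plus a hypothetical 42U rack.
tb_per_rack_today = 136 * 42                   # 5712 TB, ~5.7 PB/rack
racks_today = (27 * 1000) / tb_per_rack_today  # 27 PB/year today
density_gain = 2 ** (9 / 1.5)                  # 64x density in 9 years
racks_2023 = (400 * 1000) / (tb_per_rack_today * density_gain)
data_growth = (400 / 27) ** (1 / 9)            # ~1.35x per year
moore_growth = 2 ** (1 / 1.5)                  # ~1.59x per year
print(round(racks_today, 1), round(racks_2023, 1))  # 4.7 1.1
```

So both of the post's claims hold up: roughly five racks per year today, roughly one rack per year by 2023, because the data grows more slowly than the assumed density curve.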
Re: (Score:2)
I'm not doubting or challenging you, but I'm interested in knowing about your 1U 136TB SSD servers. Can you suggest some specs?
The highest-density boxes I get to have some familiarity with are Netflix's OpenConnect caches, described at https://openconnect.itp.netfli... [netflix.com] -- where it's mentioned that they fit 36 6TB drives in a 2U chassis, for a total of 216TB, or 108TB/U. You're beating that, and with SSDs, which is ... impressive.
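For comparison, here is the simple arithmetic on the OpenConnect figures quoted above:

```python
# Netflix OpenConnect: 36 drives of 6 TB in a 2U chassis.
netflix_tb = 36 * 6                # 216 TB per chassis
netflix_tb_per_u = netflix_tb / 2  # 108 TB per rack unit
claimed_ssd_tb_per_u = 136         # the parent's 1U SSD server
print(netflix_tb_per_u, claimed_ssd_tb_per_u)
```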
Re: (Score:2)
For that kind of money, you could hire the equivalent of two full-time engineers for a year to design that thing from scratch for you, and you'd probably get a couple of production units out of that deal, too. You'd need an ME to do thermal and case design; that shouldn't take longer than a couple of months. An EE to do any custom mezzanine boards that one might need, plus wiring and overall electrical design. Finally, a software guy to make a config console etc. I assume that project management is not incl
Re: (Score:2)
On the cheap, there are always Backblaze's storage pods. They take up more than 1 RU, but at about $10,000, the price is right.
This is tier 3 storage, though. If you want actual enterprise-grade stuff, it costs a lot more, but it will come with enterprise-grade performance and enterprise-grade warranties.
Of course, for long term storage for a lot of data, it is hard to beat LTO-6 for I/O speed and cheap capacity. After the drives and silos are in, if another PB is needed, that
Re:Compared to Facebook (Score:5, Funny)
So, the LHC should just create a Facebook profile and store all the data steganographically in selfies and baby pictures.
Re: (Score:2)
Eventually, sure, but NAND hasn't reached that point yet, and we're just starting to see 3D flash memory hit the market, offering dramatic increases in density.
Re: (Score:2)
Very true, but consider the sources and what is generating it.
Facebook is a large percentage of the Internet.
Cern is ONE project (with multiple experiments).
Also, this data has to be ARCHIVED and ACCESSIBLE for all time so that scientists can go back and compare/research past experiments.
Although I'm sure Facebook is archiving a large portion of its data, I doubt they archive ALL of it for all time.
Oracle (Score:2)
Re: (Score:2)
OpenStack is simply a cloud framework. What does any of that have to do with Oracle? In any case, this would be a great test case for a ginormous Ceph cluster. I use Ceph in conjunction with approximately 10PB of storage and am looking to increase that by at least an order of magnitude over the next year or two.
More info on ceph: http://en.wikipedia.org/wiki/C... [wikipedia.org]
Re: (Score:2)
Unless they are capturing pure noise, it should be compressible.
Well, it's raw data from physical instruments. It probably is quite noisy, so it will not compress much, except with jpeg-like lossy compression, but that would kinda defeat the purpose I think.
Re: (Score:2, Informative)
The raw data rate that's generated by the particle detectors themselves is unreal. Based on some poorly remembered numbers of the this-many-of-that variety, I think it's in the region of 10TB/second: 144 2.5 Gsps 8-bit channels per card, a few dozen cards per cartridge, and some dozens of cartridges that run like a ring around the ATLAS detector's front.
The first level trigger/filter rejects the 99.5% of events that are boring and dumb (two protons strike a glancing blow and emit photons; Two protons' quarks
Just to be pedantic... (Score:1)
(Disclaimer: Grad student)
ATLAS generates O(PB) of raw data per second, but we only trigger on events that look interesting (e.g. have an isolated lepton, a sign that something more than QCD background happened in the event), and save those for offline analysis. That works out to something on the order of 100s of MBps being saved during run time. I assume the other experiments have similar data rates.
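Those two figures imply an enormous rejection factor at the trigger. An order-of-magnitude sketch using the rates quoted above (actual rates vary with run conditions):

```python
# O(1 PB/s) off the detector vs O(100 MB/s) written to storage.
raw_bytes_per_s = 1e15    # ~1 PB/s raw
saved_bytes_per_s = 1e8   # ~100 MB/s kept for offline analysis
rejection = raw_bytes_per_s / saved_bytes_per_s
print(f"{rejection:.0e}")  # ~1e7: only about 1 part in 10 million kept
```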
Thank God for the dead (Score:2)
hmm... let's see
COBOL.. dead .. dead
BSD.. dead
TAPE
And yet there would be no LHC datacenter without tape.
ref:
http://information-technology.... [web.cern.ch]
http://www.economist.com/blogs... [economist.com]
http://storageservers.wordpres... [wordpress.com]
http://scribol.com/science/hal... [scribol.com]
This sounds like ... (Score:2)
Huh (Score:1)
That amount of data is something only an AI could get through.
Re: (Score:2)
There is no AI in high energy physics and none is planned; the usual methods of filtering and statistics will be used instead, and they will suffice.
Using other people's money (Score:1)
An archive of 100PB (Score:2)
Torrent link?