Quake-Catcher Aims to be Largest Distributed Seismometer Network 75
Nature is reporting that a new distributed computing application is looking to monitor earthquake data using the accelerometers built into many computing devices. In the long run, "Quake-Catcher" will hopefully be fast enough to give warning before major earthquakes. "If it works, it will be the cheapest seismic network on the planet and could operate in any country. It wouldn't be as sensitive as traditional networks of seismometers, but Lawrence says that's not the point. 'If you have only two sensors in an area, you have to have a perfect system. If you have 15 sensors in a system it [can] be less perfect. One hundred, one thousand, ten thousand -- your need for the system to be perfect becomes much smaller,' he says. 'That's really our approach -- just to have massive numbers.'"
Quake Catcher... (Score:3, Funny)
Re: (Score:1)
Re: (Score:1)
Re:Great vaporware application (Score:5, Interesting)
Re: (Score:2, Interesting)
I'm more skeptical as to how accurately he can geolocate each laptop. I've had IP-geolocation tools tell me I'm in a city 500 km away...
Re: (Score:2)
Re: (Score:2)
If you volunteered your machine for this, why would you have a problem telling them (in general terms) where you live?
It seems to me that precision down to the city would be close enough, but I suspect most people would have no problem giving a somewhat more precise location, such as from Google Maps or some such.
Re: (Score:3, Interesting)
Plus it doesn't tear up your CPU at night
The article is thin on details, but I think this might kill your network instead of killing your CPU.
The idea here is to detect subtle movements of the laptop (which are small enough not to require shutting the laptop down). Apparently whenever the accelerometer senses a motion it will communicate to a central server within a second. Imagine using one of these in a train or a bus... the laptop would be constantly pinging the server. A quake of magnitude 4 is not going to feel any stronger than the movement of the vehicle itself.
Re:Great vaporware application (Score:5, Insightful)
1,000 laptop accelerometers cannot do what a single seismic sensor can, because they are orders of magnitude less sensitive. You can't take 1,000 sensors, add the data together, and say it is 1,000 times more effective than a single device. If the sensor granularity is not sufficient to detect what you are trying to detect, then one or one million will not be able to detect your subject. It'd be like using one cheap VGA webcam to try to photograph surface topography on Pluto, and when that didn't work, trying the same thing by using 1,000 cheap VGA webcams together.
Stupid.
Re: (Score:3, Insightful)
Lucky Imaging uses multiple high-accuracy devices that are accurate enough to capture the granularity required, but are otherwise limited by extraneous transient factors. By using multiple devices the chance of achieving an optimum reading vis-à-vis those extraneous factors is maximised. This situational opportunism is why it is called "Lucky Imaging", and it cannot be applied to a scenario where the device itself is not capable of making the necessary reading, even under optimum conditions.
A
Re:Great vaporware application (Score:5, Informative)
To extend this to a domain where you don't have that effective control, you have to automatically detect where the different pictures fit. I remember having seen somebody do this; I can't remember where, though.
Eivind.
Re: (Score:2)
There are techniques for extracting higher quality data from overlapping low-resolution data sets.
Yes and no. If your low-resolution images are properly acquired, that is, with no aliasing, you're fucked. Aliasing means frequency components higher than half the sampling rate/camera resolution are not being filtered out prior to sampling by, in our example, the camera's CCD. When your image is anti-aliased, it looks good, but those higher frequency components have also been filtered out. Whatever information they carried is gone.
Re: (Score:2)
There are techniques for extracting higher quality data from overlapping low-resolution data sets.
Yes and no. If your low-resolution images are properly acquired, that is, with no aliasing, you're fucked. Aliasing means frequency components higher than half the sampling rate/camera resolution are not being filtered out prior to sampling by, in our example, the camera's CCD.
You're usually still good - the average energy is still distributed to the correct place. Aliasing is only an issue for "single point" sampling; if each sample covers an average over a time period or an area, you're still getting an energy increase in the average for that area.
Here's a paper covering the area: High-resolution image reconstruction from multiple low-resolution images [ieee.org]; it's the 6th hit on Google for a search for "high res from many low res images". Note that you can even do this from JPEG-compressed images.
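To make the mechanism concrete, here's a minimal shift-and-add sketch in Python (my own illustration, not code from the linked paper): it assumes the sub-pixel offsets between frames are already known, whereas real super-resolution methods estimate them by registration and usually also deconvolve the sensor blur.

```python
import numpy as np

def shift_and_add(frames, shifts, factor):
    """Naive super-resolution: drop each low-res frame onto a finer grid
    at its known sub-pixel offset and average the contributions.
    frames: list of (h, w) arrays; shifts: list of (dy, dx) in low-res pixels."""
    h, w = frames[0].shape
    hi = np.zeros((h * factor, w * factor))
    hits = np.zeros_like(hi)
    for frame, (dy, dx) in zip(frames, shifts):
        oy = int(round(dy * factor)) % factor   # offset on the fine grid
        ox = int(round(dx * factor)) % factor
        hi[oy::factor, ox::factor] += frame
        hits[oy::factor, ox::factor] += 1
    return hi / np.maximum(hits, 1)             # unfilled cells stay zero
```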
Re: (Score:2)
You can't take 1,000 sensors, add the data together, and say it is 1,000 times more effective than a single device. If the sensor granularity is not sufficient to detect what you are trying to detect, then one or one million will not be able to detect your subject.
Well, actually that's quite wrong, to a certain extent. If we assume that these sensors always detect something (be it noise or parasitic vibrations, which you can consider noise), then by averaging all their signals together you can actually reduce the noise.
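For what it's worth, that averaging intuition is easy to check numerically; this is only a toy sketch (identical signal at every sensor, independent Gaussian noise), not a claim about what laptop accelerometers actually record:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)
signal = 0.05 * np.sin(2 * np.pi * 3 * t)   # weak common "ground motion"
noise_sigma = 1.0                            # per-sensor noise, 20x larger

for n in (1, 100, 10000):
    # every sensor sees the same signal plus its own independent noise
    readings = signal + rng.normal(0.0, noise_sigma, (n, t.size))
    residual = readings.mean(axis=0) - signal
    print(n, round(residual.std(), 4))       # shrinks roughly as 1/sqrt(n)
```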
Re: (Score:2)
My whole point was that they won't. Accelerometers in laptops will register 0 (as in the discrete digital zero value) when a seismic event occurs, making any interpolation or extrapolation from the data impossible.
If what you are trying to detect is a reading on something that is beneath the Nyquist threshold for your sensors, then it matters not how many readings, or how many sub-sample deviations you can collect, you'll still end up with nothing but noise.
Re: (Score:2)
Accelerometers in laptops will register 0 when a seismic event occurs
If you *do* know that then fine (do you actually know that?) but if you don't I'd think twice before assuming it.
Noise averaging does *not* work the way you describe, because white-noise signals averaged do not produce lower noise, they produce more white noise.
Hahahahahahahahaha. That's funny because you seem to be serious. lol, seriously man, pick up a book on signal processing basics or something.
Given that you completely missed my
Re: (Score:2)
Noise is cancelled using lowpass or highpass filters around the signal. Take 10 series of 10 random numbers each. Average the first one in each series, the second one in each series, the third one in each series to get the "average series". What do you get? A set of 10 more random numbers. Unless there is already a common bias in the series (otherwise known as an analo
Re: (Score:2)
Noise is cancelled using lowpass or highpass filters around the signal.
Hahahaha. Come back when you get a clue, sucker. Signal processing 101: averaging is the most basic way to get rid of noise when you have many copies of the same signal, each with different noise. lol, low-pass and high-pass filters? And how the hell do you do that if your signal covers the entire spectrum? There's a shitload of ways to reduce noise, but in that case you could try profiling the noise in the frequency domain, then subtracting that profile from the noisy spectrum.
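The comment is cut off, but the frequency-domain idea it gestures at is usually plain spectral subtraction; here's a rough sketch under that assumption (the function name and the need for a noise-only segment of the same length are my additions):

```python
import numpy as np

def spectral_subtract(noisy, noise_only):
    """Estimate the noise magnitude spectrum from a signal-free stretch of
    the same length and subtract it from the noisy spectrum, keeping the
    noisy phase. Crude, but it's the textbook starting point."""
    spec = np.fft.rfft(noisy)
    noise_mag = np.abs(np.fft.rfft(noise_only))
    cleaned_mag = np.maximum(np.abs(spec) - noise_mag, 0.0)
    return np.fft.irfft(cleaned_mag * np.exp(1j * np.angle(spec)), n=len(noisy))
```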
Re: (Score:2)
Narrower at a median band with no data.
Look here, I'm going to say th
Re: (Score:2)
LAPTOP. SENSORS. WILL. GET. NO SIGNAL. Not a little, not below the noise threshold, not an eensy weensy bit. NONE. NADA. ZIP.
Read my lips: you're an idiot. Sensors return noise for many reasons, either because of their own noise level or because they're close to a source of noise, like a hard drive spinning next to them. Besides, they're more sensitive than you claim; here's some of what I've found to illustrate the point:
"Place your laptop on a table and see the seismic waves from tapping your toe on the floor."
Re: (Score:2)
That's a point too: how do you make sure that your signals are synchronous?
You can use incoherent averaging, which is the same thing as coherent averaging except in the frequency domain, it seems (I just read about it in a book, so I'm not too familiar with it). However, it seems that incoherent averaging is less efficient at improving the SNR, so if you have an easily detectable event you care about, you could use cross-correlation between the different signals to synchronize them with respect to the event studied.
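A sketch of that cross-correlation alignment step (assuming equal-length traces and an event strong enough to correlate on; the circular np.roll is a shortcut a real implementation would avoid):

```python
import numpy as np

def align_and_average(traces, ref_index=0):
    """Shift each trace by the lag that maximizes its cross-correlation
    with a reference trace, then average the aligned traces coherently."""
    ref = traces[ref_index]
    aligned = []
    for trace in traces:
        corr = np.correlate(trace, ref, mode="full")
        lag = corr.argmax() - (len(ref) - 1)   # samples this trace lags the reference
        aligned.append(np.roll(trace, -lag))    # undo the lag (circularly)
    return np.mean(aligned, axis=0)
```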
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Pointlessly wear out his wallet maybe?
Hint: Electricity costs money.
A CPU in hibernate takes next to nothing. A modern CPU 'tearing' along at max utilisation 24x7 will make a noticeable bump in your utility bill.
Figure 24 h * 30 days = 720 hours/month, so a 300-watt PC going full tilt for a month burns 216 kWh. The average kWh in the US is about 10 cents.
How about an embedded system? (Score:2)
Figure 24 h * 30 days = 720 hours/month, so a 300-watt PC going full tilt for a month burns 216 kWh. The average kWh in the US is about 10 cents, so that's roughly $21 a month on average just to 'tear up your CPU' (or about $250 a year on average... up to $500 a year in California or New York, where electricity is more expensive...).
I would be surprised if this application couldn't be made to fit on something like a Technologic Systems http://www.embeddedarm.com/ [embeddedarm.com] TS-7400 (which comes with a 12 Mb/s USB port and costs about $100 in quantity). The AC power draw for the TS-7400 plus accelerometer would be 6 watts max ($5 to $10/yr), so it would pay for itself in less than a year. In addition, the TS-7400 is fanless and diskless, so there would be much less extraneous vibration.
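Redoing the arithmetic from the two comments above (same assumed figures: $0.10/kWh, 300 W for a busy PC, 6 W for the embedded board):

```python
def monthly_cost(watts, dollars_per_kwh=0.10, hours=24 * 30):
    """Electricity cost of running a device flat out for a 30-day month."""
    return watts * hours / 1000.0 * dollars_per_kwh

print(monthly_cost(300))  # ~21.6 -> roughly $21/month, ~$260/year for a 300 W PC
print(monthly_cost(6))    # ~0.43 -> about $5/year for a 6 W embedded board
```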
Research Project's Website (Score:2)
If they're inexpensive enough, I wouldn't mind dropping $15 on a USB accelerometer. Heck, I'd drop $25 if it were at all accurate, as I'm highly interested to see how sensitive it is and what kinds of vibrations it actually picks up.
Re: (Score:1)
I definitely don't mind anyone spying on my accelerometer.
How kind of you! Think: if I also wanted to share my accelerometer, the two of us could be predicting an earthquake somewhere, right? No, the system would also need to know the exact (GPS) location. Would you be so kind as to submit your location to some honest researchers at any given moment? After all, you have done nothing wrong, and... have nothing to fear. Or have you?
How cute is that: for the benefit of humanity you will let the terr..quake-catchers spy on your accelerometer.
Wait, I have a
Re: (Score:2)
There is no requirement for precise location. You overstate your case.
Simply giving your zip code (or your country's equivalent) would be quite sufficient. When all the laptops in 98210 start signaling, that's a pretty good hint. Most people would have no problem giving that level of precision, or vagueness as the case may be.
Traditional seismometer technology can produce better r
Re: (Score:2)
Re: (Score:1, Interesting)
I definitely don't mind anyone spying on my accelerometer.
Don't be hasty. Your accelerometer could be used to profile you at home. No, I'm not kidding. When you sit down at your computer there will be an event. When you shut your doors there will be an event. By analyzing your accelerometer data I could probably get a good idea of what your daily schedule is. That might be helpful if I wanted to break into your house (the coordinates of which I would know from the seismology network).
Re: (Score:2)
And your basis for this sweeping statement is
Accelerometers (Score:2, Interesting)
Re: (Score:2, Offtopic)
Re:Accelerometers (Score:5, Insightful)
Re:Accelerometers (Score:4, Interesting)
Re: (Score:3, Funny)
Re:Accelerometers (Score:4, Informative)
The fact that you could have corroboration from 1500 points in a 75 square mile area is quite an improvement on what they have now, and at a much cheaper price.
If you spend time analyzing data, it's amazing what you can find. That is one of the reasons that the US government wants to monitor everyone's communications... to spot small trends... and of course to gather evidence to use against political rivals thus ensuring their unending reign of
Back on track. The sensitivity of things like the Wiimote adds huge potential to such an endeavor. With sheer numbers you can see the size of the area that is shaking, and that makes a big difference to the impact or relevance of the seismic event. It's physics: if you are trying to see the true graph of something, the more data points you have to plot, the more informative it is. Even if some of the sensors are unreliable, they have the ability to ignore anomalous readings and use those that match others. Since you can be certain that there is an event happening (the old system is still in place), you can ignore or throw out data from sensors that are TOO active or not active at all, then sift through what is left to see what you find.
I'm reasonably certain that they will see a lot when they learn the true extent of the area affected by any particular event. For example, if the event stays limited to only the fault area, that would be much different than if an entire area outside the fault line were affected. Having thousands of sensors will help show that. Perhaps through this they will learn that certain geologic structures actually do redirect the energy to other areas, allowing predictions of damage to match what were previously unpredictable events, thus perhaps adding minutes to the warning times. That would save lives, and that is what they want to do. Mapping effects through an area will help. Thousands of sensors will help achieve that despite the seeming unreliability of the sensors themselves.
There are millions of ants in an ant hill, kill a couple hundred and they carry on. This is the same sort of idea.
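One toy way to implement that "throw out the too-active and dead sensors, keep the consensus" step; the quantile cut-offs here are arbitrary illustration, not anything from the project:

```python
import numpy as np

def consensus_amplitude(peak_readings, low_q=0.05, high_q=0.95):
    """Given peak shaking amplitudes reported by many sensors in one area,
    discard the quietest and noisiest outliers and average what remains."""
    readings = np.asarray(peak_readings, dtype=float)
    lo, hi = np.quantile(readings, [low_q, high_q])
    kept = readings[(readings >= lo) & (readings <= hi)]
    return kept.mean() if kept.size else float("nan")
```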
Re: (Score:2)
Re: (Score:2)
That'd be awesome! They might even bring down a bridge or two...
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Similar Project? (Score:2, Interesting)
I signed up for the Tsunami Harddisk Detector project, but don't know if they are related.
"Thanks for your interest in the Tsunami Harddisk Detector project. We are currently installing the system on a world wide basis. To keep the system in a stable state, further installation is an incremental process. We have put you on our mailing list and will inform you as soon as we can make the software available to you. Best regards, Michael Stadler ____________________ www.ninsight.at "
So I sent off an email
Hope is not a plan (Score:3, Insightful)
and the scientific basis for prediction is what, exactly?
a meaningful prediction has to be precise in location and in time.
time is the enemy:
the thirty second warning is little better than "duck and cover" if it cannot be communicated effectively.
Re: (Score:3, Informative)
and the scientific basis for prediction is what, exactly?
Re:Hope is not a plan (Score:5, Insightful)
the thirty second warning is little better than "duck and cover" if it cannot be communicated effectively.
Actually, a 30 second warning is quite useful, but not to humans. There are such warning systems in California. When the warning system trips, elevators stop at the nearest floor, subways and BART trains stop, gas valves at schools and mobile home parks close, and some hazardous processes shut down.
But the data from that comes from fixed seismic stations, not somebody's random accelerometer.
Re: (Score:2)
I was aware that this was being tried.
But has anything been proven in the real world? Has a 30 second warning ever stopped a train?
It's not much time to communicate anything useful to a system as mechanically constricted as a passenger elevator. You can't change speed or direction instantaneously. Level the cars. Open the doors.
There are quite simple cup-and-ball solutions to shutting off the gas.
Re: (Score:2)
Re: (Score:2)
can you link to a single prediction - or even a theoretical basis for prediction - that has stood up to critical examination? yielded the right time? the right place? the right magnitude?
Re: (Score:2)
Except that "duck and cover" can be fairly useful, and it is a hell of a lot better than nothing. Insofar as earthquakes go, thirty seconds warning could get me out out of this room, filled with bookcases and other missile hazards, and into a small hallway that's nothing but doorways (and thus extremely strong and safer in a quake).
Re: (Score:2)
Thirty seconds would also be enough time to bring most rapid transit systems to a halt - though I wouldn't want to be halted in the middle of a long tunnel, especially BART's transbay tunnel.
Yup (Score:1)
Tag: Gamespy (Score:2)
OMG (Score:2)
Re: (Score:2)
Now we can prove the existence of Graboids! (Score:1)
Sweet. (Score:2)
oblig (Score:1, Funny)
WII quake (Score:1)
Tagging (Score:2)
Tagged: 'noclip'
Human (birth) error (Score:1)
Don't put anything in the hands of people unless you're ready to deal with the collective stupidity of said people.
It won't work (Score:1)
Give credit to Jesse Lawrence (Score:2)
http://slashdot.org/comments.pl?sid=440258&cid=22283136 [slashdot.org]
I do not see how anyone's privacy is violated if the government monitored the Internet and looked for patterns of computers going offline. A disk-shaped pattern expanding at about 5000 m/s would be one pattern to look for.
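A back-of-the-envelope version of that test (purely hypothetical names; it assumes each dropout has already been reduced to a time and a distance from some candidate epicentre): a genuine expanding disk of outages should make distance grow roughly linearly with time at a few kilometres per second.

```python
import numpy as np

def wavefront_speed(times_s, distances_m):
    """Least-squares fit of distance vs. time for dropout events.
    A slope of a few thousand m/s would be consistent with a seismic
    wavefront knocking machines offline as it expands."""
    times = np.asarray(times_s, dtype=float)
    dists = np.asarray(distances_m, dtype=float)
    speed, intercept = np.polyfit(times, dists, 1)
    return speed, intercept
```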