Earlier this month you asked Dr. Ramsey Faragher, Principal Scientist at the BAE Systems Advanced Technology Centre, about his NAVSOP navigation system and the future of positioning systems in general. Below you'll find his answers.

Automation?
How much of this can be done automatically, and how much must be hand guided? For example, you talked about fingerprints changing over time and being used only as a guide. Is there a measurement or confidence variable you can employ to automate deciding when a fingerprint is still valid or has morphed too much? Or is that something a human overlord must monitor, doing research to notice that a new apartment building has just opened and there are now hundreds of new signals? It feels like you are using an open domain that could have outliers and irregularities requiring a human to clean the data before it can be trusted to give you low false-positive and false-negative rates. What statistical methods do you use to overcome these sorts of real-world problems so that your system can be put anywhere and work?
Dr. Faragher: In the outdoor environment, timing measurements are used. In the indoor environment, however, RSSI is the primary measurement metric. The indoor positioning system exploits low-cost inertial measurements to provide the initial push around the environment, with a gradual decrease in positioning accuracy over time if no corrections are applied. By monitoring the radio signal fingerprints and magnetic anomalies (and probably more metrics in the future), the system can recognize when it is back in a location it has visited before, and this provides corrections to the system and increases the accuracy. Over time, the indoor environment is mapped out automatically as the user moves around, rather than requiring manual surveying. The system runs in a Bayesian framework, specifically DPSLAM (be careful googling that, DP means things other than “Distributed Particle” out in cyberspace!), so there is a kernel used to score the fingerprints when comparing current measurements with old ones, and a particle filter tracking a large set of hypotheses. For more details, see my ION paper.

Fingerprints can indeed change over time – people might move a filing cabinet, or suddenly lots of people might fill a room for an announcement, or someone might accidentally put their car through a wall. This is accounted for in two ways. First, the fingerprints naturally age over time, so the longer it has been since you visited a location, the less confidence is placed in that fingerprint. Second, the system is driven by the smartphone-grade inertial measurement set, so if the fingerprints in a given region have changed, the position estimate can still freewheel through that location and provide an estimate to the user, and the old measurements can be replaced by the new ones.
In standard smartphone-based fingerprinting, the current measurement set is compared to a huge database and the user is hurled to whatever location has the best fit. So a sudden change in the real fingerprints means you might not find a match, or might suddenly be hurled to the wrong place. This can’t happen with indoor NAVSOP: the core inertial sensing set drives the user around, and the radio and magnetic measurements provide corrections when applicable – a change to the fingerprints in a region results in the system freewheeling through that location and replacing the old fingerprints with the new ones.
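The aging-plus-kernel idea described above can be sketched numerically. The Gaussian RSSI kernel, the noise figure `sigma`, and the aging time constant `tau` below are illustrative assumptions for the sketch, not NAVSOP's actual scoring function:

```python
import math

def fingerprint_score(current_rssi, stored_rssi, age_seconds,
                      sigma=4.0, tau=7 * 24 * 3600.0):
    """Score how well a current RSSI measurement matches a stored
    fingerprint, discounted by the fingerprint's age.

    sigma: assumed RSSI noise (dB); tau: assumed aging time constant (s).
    Both values are invented for illustration."""
    # Gaussian kernel on the RSSI difference (dB)
    likelihood = math.exp(-0.5 * ((current_rssi - stored_rssi) / sigma) ** 2)
    # Confidence decays the longer it has been since the last visit
    confidence = math.exp(-age_seconds / tau)
    return likelihood * confidence

# A fresh, well-matched fingerprint scores near 1.0
fresh = fingerprint_score(-61.0, -60.0, age_seconds=60.0)
# The same match, a month since the last visit, scores far lower
stale = fingerprint_score(-61.0, -60.0, age_seconds=30 * 24 * 3600.0)
print(fresh, stale)
```

In a particle filter each hypothesis would be reweighted by a product of such scores, so stale fingerprints naturally lose influence without any manual curation.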
It would seem that to use this technology, the client would need a much larger datastore than with GPS: whereas only the positions of the GPS satellites need to be known to make a calculation, the dataset here runs to many thousands or millions of entries. In addition to the data required for map storage, it would seem any implementation of this would require an internet connection to download the data in a geographically restricted fashion. This opens the door to privacy issues that standalone GPS clients do not have. How do you plan on addressing the privacy issue with your product?
Dr. Faragher: The idea is that you don’t need to rely on an existing database, or a datalink to access it: the system can build up its own understanding of the environment as it learns. It is possible to encrypt this information to prevent a user from accessing, sharing, or reading the content. The system only uses the publicly accessible downlink synchronization information broadcast by all fixed masts to allow normal devices (DAB radios, DVB-T televisions, etc.) to synchronise with them and start using them. No data content is gathered by the system, and there is nothing interesting in the data content for radio positioning anyway, as by definition it is unpredictable. NAVSOP relies on transmitters broadcasting repetitive and known structures in their synchronization fields, equivalent to the PRN sequences from GPS satellites. NAVSOP goes looking for these repetitive structures in the radio bands expected to be the downlink bands for the different signal types. Signal content that doesn’t repeat in a predictable fashion (i.e. the data content) is useless and not captured.
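The hunt for repetitive synchronization structures can be illustrated with a toy autocorrelation search. The Barker-13 sync word, the 100-sample period, and the noise level are all invented for the example and do not correspond to any real broadcast format:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy baseband stream: background "data" (here just noise) with a known
# synchronization word repeated every `period` samples. The Barker-13
# code stands in for a broadcast sync field purely for illustration.
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], float)
period = 100
stream = rng.normal(0.0, 0.05, size=20 * period)
for k in range(0, len(stream) - len(barker13), period):
    stream[k:k + len(barker13)] = barker13

def repetition_period(x, max_lag):
    """Estimate the dominant repetition period by finding the lag
    with the strongest autocorrelation."""
    acf = [np.dot(x[:-lag], x[lag:]) for lag in range(1, max_lag)]
    return int(np.argmax(acf)) + 1

print(repetition_period(stream, 300))  # recovers the 100-sample period
```

The search never needs to decode anything: only the repeating sync structure correlates with itself across frames, which is why the unpredictable payload contributes nothing and need not be captured.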
What is the range of frequencies?
by Anonymous Coward
Years ago during World War 2, pilots flying at night used dead reckoning and followed two other signals, one from the north of England and one from the south. One was called Cat and the other Mouse; Cat chased Mouse. Hyperbolic curves plotted on maps meant that when the signals were a certain distance apart, you were a certain distance from them (and using loop antennas gave a rough direction to each, given that the signal is strongest when the antenna is orthogonal to the EM radiation). It eventually became LORAN. HF radio direction finding usually involves "elephant cages" like the Pusher HF-DF Wullenweber "BULLSEYE" antennas, etc. These are very large because of the physics involved. To keep your unit small, how do you get around these challenges? Can you use HF frequencies, or are you limited to VHF/UHF? I ask because the range of some of these signals (WiFi) is very small, cell phone signals are likewise good for less than 20 miles, and TV is good for about 50 miles. When you are out at sea, unless you get backscattering, you don't get any of these (and the frequencies are too high to refract off the ionosphere, i.e. no 'skip'). HF will refract, but then we are talking about physics and size. An aircraft can get signals from much further, but you would still rely on a lot of dead reckoning over much of the world's oceans.
Dr. Faragher: World War Two did indeed result in a host of radio positioning technologies, and anyone wishing to learn more about them and about the developments in radar and electronic warfare should read R.V. Jones’ excellent book “Most Secret War”. NAVSOP certainly has its roots in these early technologies. We have exploited signals as low as Long Wave, and looked into the use of VLF signals for very long range radio navigation, although the accuracy drops off considerably at these wavelengths. The old Russian ALPHA positioning system at around 10 kHz also appears to still be functioning, and that provides extensive coverage across more than half of the globe. eLoran operates at 100 kHz and there are some quite compact H-field antennas for those receivers. We have used an active E-field whip antenna for LW and MW (just like the antenna on your car that can pick up AM MW radio). There are useable signals from LEO satellites and airliners when in particularly remote locations. It is also possible to determine the time of arrival of signals from a digital transmitter such as a DVB TV transmitter at much greater distances than it is possible to decode data, due to coherent integration of repetitive timing markers. Consider that GPS signals are transmitted at around 50-100 W from 20,200 km away, and thanks to coherent integration and correlation gain they are useable by the time you pick them up (at around a quadrillionth of a Watt). The same principles can be applied to DAB, DVB, and cellular signals to permit their use at very large distances, given line of sight (and so enough altitude).
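The arithmetic behind that GPS figure can be checked with a quick sketch. The Friis free-space model with unity antenna gains is a deliberate simplification, so the answer is an order-of-magnitude estimate only:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def free_space_rx_power_w(tx_power_w, freq_hz, distance_m):
    """Received power under the free-space (Friis) model with
    unity-gain antennas: Pr = Pt * (lambda / (4*pi*d))**2."""
    lam = C / freq_hz
    return tx_power_w * (lam / (4 * math.pi * distance_m)) ** 2

def correlation_gain_db(n_samples):
    """Coherently integrating N repetitions of a known structure
    improves SNR by roughly 10*log10(N)."""
    return 10 * math.log10(n_samples)

# GPS L1: ~50 W from 20,200 km arrives at a tiny fraction of a
# femtowatt under this simple model (real antenna gains raise it)
p_rx = free_space_rx_power_w(50.0, 1575.42e6, 20_200e3)
print(p_rx)
# One period of the 1023-chip GPS C/A code gives ~30 dB of gain,
# which is what digs the signal out of the noise floor
print(correlation_gain_db(1023))
```

The same correlation-gain argument explains why a DVB sync marker can be timed far beyond the range at which the data payload is decodable.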
Following the downing of an American drone in Iran, the hypothesis was put forward that the Iranians spoofed the GPS signal and convinced the drone it was somewhere other than where it actually was in order to get it to land in Iran (I'm not sure if this was ever confirmed). A recent issue of Aviation Week reported on a group, I believe in the U.S., working on the same idea: spoofing the GPS signal in a transparent manner to convince an autonomous vehicle that it was somewhere other than its actual location. Would NAVSOP make it more difficult to accomplish this sort of spoofing?
Dr. Faragher: As NAVSOP learns about the opportunistic signals in the environment, and calibrates them for use, it becomes less dependent on GPS. It reaches a point where it treats GPS with an air of suspicion and is capable of flagging up spoofing. If GPS suggests I am going one way, but DAB, DVB, cellular, MW, etc all suggest I’m going a different way (and suddenly there is much more power in the GPS band than there should be) then it is a strong indicator of GPS spoofing.
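That cross-checking logic can be sketched as a toy consistency test. The consensus rule, the thresholds, and the power figures below are invented for illustration and are not how NAVSOP actually scores spoofing:

```python
import math

def bearing_diff_deg(a, b):
    """Smallest absolute difference between two bearings, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def gps_suspicious(gps_heading_deg, other_headings_deg,
                   gps_band_power_db, expected_power_db,
                   heading_tol_deg=30.0, power_tol_db=10.0):
    """Flag GPS when it disagrees with the consensus of independent
    signals AND the GPS band is anomalously strong. Thresholds are
    illustrative guesses."""
    # Consensus heading of the non-GPS sources (circular mean)
    x = sum(math.cos(math.radians(h)) for h in other_headings_deg)
    y = sum(math.sin(math.radians(h)) for h in other_headings_deg)
    consensus = math.degrees(math.atan2(y, x)) % 360.0
    disagrees = bearing_diff_deg(gps_heading_deg, consensus) > heading_tol_deg
    too_strong = gps_band_power_db - expected_power_db > power_tol_db
    return disagrees and too_strong

# DAB/DVB/cellular all say ~90 degrees; GPS says 270 and its band is hot
print(gps_suspicious(270.0, [88.0, 92.0, 90.0], -120.0, -155.0))  # True
```

Requiring both the heading disagreement and the power anomaly keeps a single noisy aiding source from raising false alarms.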
When technology changes
by Anonymous Coward
Seems like a great alternative to GPS in most casual situations, or as an addition to it for faster position locks while driving. The question I have is: as technology changes, such as cell phone signals moving to different frequencies as speeds increase (4G, LTE, etc.) or OTA TV switching to digital, will this still work? Or would people have to replace their "non-GPS navigation unit" when various signals it relies on for positioning stop, or change frequencies? We've seen a lot of changes already, so we know things can change very rapidly and perhaps in unexpected directions.
Dr. Faragher: Changing frequencies is not a big deal, as there are already a lot of frequencies to search over all the time anyway, but the appearance of a new synchronization marker (e.g. DVB-T changing to T2) or appearance of a brand new network (e.g. LTE) will be important. It is certainly sensible in this day and age to permit a device like this to be upgradeable via software or firmware tweaks. In principle a device could employ a full “blind search” mode where it searches everywhere for new repetitive structures and creates clean templates of new structures for itself, but the cost of the development and deployment of this piece of code would greatly outweigh its usefulness in the commercial sector, since that bit of code may only need to be run once or twice every ten years. An upgrade over the internet is much more sensible!
Galactic GPS using pulsars
So I seem to remember a proposal to use pulsars to provide a sort of galactic GPS. (Pulsars, spinning neutron stars, are extremely stable periodic emitters of radio waves at interstellar distances). I think this might be what an earlier poster was referring to for spacecraft navigation, I believe they were used on the famous Pioneer 10 plaque (with the naked humans) to show aliens where we live.
Anyway, what's the accuracy for this (the previous poster mentioned several hundred meters over hundreds of kilometers, but I don't know if it's the same system)? Is it as good as (terrestrial) GPS? Will it be good enough to use for the upcoming GAIA mission, which will map the 3D locations of a billion stars in our galaxy?* (The positioning requirements of that mission are borderline insane!) Is there any way to use these celestial beacons as (another) GPS backup, or are the signals far too weak (or unstable, or blocked by our atmosphere, or in already-used radio bands)? Sorry for asking more than one question, but they're all related. :)
*actually since most (all?) of these pulsars are within our galaxy maybe they are not far enough away to have no apparent motion (in which case they would be hard for GAIA to use as a reference). Are there any extra-galactic sources (Quasars?) that could serve a similar function?
Dr. Faragher: Professor Tony Hewish, Nobel-Prize-winning co-discoverer of the pulsar (and coincidentally my old PhD supervisor’s old PhD supervisor!), proposed the use of pulsars for extraterrestrial navigation over three decades ago, so like terrestrial radio positioning, pulsar navigation is certainly not a new idea. The expected accuracy is around the 10 km mark. Pulsar navigation could not be exploited on Earth because radio pulsars require very large antenna arrays for detection (Hewish’s and Bell’s array was the size of two football pitches), and the signals from X-ray pulsars (which need smaller antennas) do not penetrate the atmosphere. They are going to stick GAIA at one of the Earth-Sun Lagrange points (L2), and from there the stellar map will be generated using parallax as the observatory orbits the Sun with us every year. GAIA will probably be tracked from Earth using Delta-DOR or even just traditional radio ranging to provide positioning relative to Earth with much higher accuracy than pulsar positioning could. Delta-DOR exploits quasars rather than pulsars to provide its differential positioning corrections.
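The parallax principle GAIA relies on reduces to a one-line relation: a star that shifts by p arcseconds against the background as the observer moves across a 1 AU baseline lies 1/p parsecs away. A minimal sketch of that relation:

```python
import math

AU_M = 1.495978707e11  # astronomical unit, metres
# A parsec is by definition the distance at which 1 AU subtends
# one arcsecond: about 3.086e16 m
PARSEC_M = AU_M / math.tan(math.radians(1.0 / 3600.0))

def distance_parsecs(parallax_arcsec):
    """Annual-parallax distance: d [pc] = 1 / p [arcsec]."""
    return 1.0 / parallax_arcsec

# A 10-milliarcsecond parallax puts the star 100 parsecs away
print(distance_parsecs(0.010))  # 100.0
```

The tiny angles involved (milliarcseconds and below for most of the billion stars) are why the mission's pointing and positioning requirements are so demanding.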
On the fly mapping with environmental data
You mentioned earlier the domination of signal strength when indoors. Can you also use patterns in observed environmental data for automated mapping and exploration?
For example a robot exploring a cave or a large indoor structure like a power plant might be able to even use information such as ambient temperature / humidity, echoic nature of surroundings, or patterns in ambient air pressure / acoustic input from machinery or the sound of treads against floor.
Also someone was skeptical about using stars to navigate in the day. However radio telescopes can make observations in the daytime, which seems to be the ultimate sensor for your platform. Would your system work to find landmarks underwater too?
Dr. Faragher: Other metrics could indeed be useful if they can be assumed to be relatively stable over time while varying on a fine spatial scale. Stellar cameras can provide positioning estimates accurate to a few hundred metres today, and still operate in the daytime by exploiting infrared sensing, in fact they can even see through thin cloud in this mode (an example of a modern stellar navigator). Radio sensing of stars requires quite big antennas, although of course when you are listening to static on your FM radio, some of what you are hearing is the cosmic microwave background of the universe!
The same learning principles exploited by NAVSOP have already been exploited in the underwater domain – have a google for bathymetric simultaneous localization and mapping.
by Arthur B.
In order to combine all the sources of information, are you relying on a messy approach, something based on stitching together many machine learning algorithms (think boosting, SVMs, random forests, etc.), or are you writing an explicit generative model for the noise and then applying filtering to it, with a particle filter for instance?
Dr. Faragher: We use a mixture of bespoke Bayesian estimators for multipath mitigation, Error State/Unscented/Extended/Gaussian-Mixture-Model-multiple-hypotheses Kalman filters, particle filters, and batch processing methods, depending on the exact application, set of signals, and hardware.
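For readers unfamiliar with the filters named above: the measurement-update step they all share reduces, in one dimension, to a two-line fusion rule. This is the generic textbook form, not BAE's implementation:

```python
def kf_update(x, p, z, r):
    """One scalar Kalman measurement update: fuse a state estimate x
    (with variance p) with a measurement z (with variance r)."""
    k = p / (p + r)                    # Kalman gain: trust ratio
    return x + k * (z - x), (1 - k) * p  # updated mean and variance

# Fuse a position estimate of 10.0 m (variance 4.0) with a radio
# measurement of 12.0 m (variance 1.0): the result moves mostly
# toward the more trusted measurement, and the variance shrinks
x, p = kf_update(10.0, 4.0, 12.0, 1.0)
print(x, p)  # 11.6 0.8
```

The EKF/UKF and Gaussian-mixture variants generalize this same gain computation to nonlinear, multi-dimensional, and multi-hypothesis settings.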
Wouldn't this thing require a whole slew of regulatory approvals since you'd be fishing for different types of signals? Or would this involve mere processing of data already available to, say, the smartphone armed with this technology?
Dr. Faragher: The measurements we make are the same performed by a cellphone during the initial stages of registering on a network, or a DAB/DVB receiver starting up and picking up the beginning of message frames to decode data. These synchronization procedures are all freely accessible within the publicly-available specifications documents defining how a receiver for each signal type works. Your cellphone already makes the same main measurements we do, as do your DAB and DVB tuners, we just make more precise measurements and process them within a positioning engine. We don’t access any of the data content (see related previous question).
Your best guess on the GPS successor?
Hi, Dr. Ramsey! What is your best estimate as to the US DoD’s current GPS backup system?
IIRC Obama cut the budget for LORAN around 2010, and until then the system was financed with the explicit purpose of serving as a GPS backup. But no more... I am currently teaching ECDIS systems to mariners and I always emphasize the weaknesses of GPS under jamming. Ever since Selective Availability was switched off, the jamming topic pops up more and more as a soft spot of the whole process, so I think we are not fooling ourselves that the US would leave such a gaping hole in its systems uncovered...
Dr. Faragher: I’m afraid I can’t speculate on what anyone else might be using within their multisensor navigation systems. Typical sources that everyone in the navigation game knows about are inertial measurements, non-satellite radio aids, visual aiding (inferring velocity and heading changes by tracking the scenery with cameras as you move), stellar cameras, and map matching. It is certainly critical to have redundancy, and multiple aiding sensors with different failure modes. I think for low-value assets a combination of visual aiding and non-GNSS radio sources could provide good GPS backup in many scenarios. I also think gravitational aiding (see first question) is a really exciting area of research and development in this field and maturing the cold atom technology will be a really important development for GPS-denied navigation.
Most Surprising Correction?
I'd imagine a lot of positioning calculations involve accounting for or adjusting for known effects or noise. For example, accounting for general relativity in GPS. What is the most surprising correction you've ever come across (even on an exam or done in theory)? Have you ever found yourself saying "I didn't think that could affect the calculations so much."
Dr. Faragher: I would say the most fascinating aspect I have come across is the modern method of accounting for the Earth’s gravitational field properly. In really long distance navigation, such as for submarines or intercontinental aircraft, if you are relying on just inertial measurements then accounting for the Earth’s gravitational field properly can make a huge difference to the performance of the navigation solution.

Basically, the effect of gravity must be removed from the measurements by using the gyroscopes to determine your orientation relative to the surface of the Earth, then applying a gravity correction to the accelerometer measurements. There are a few issues resulting from this: your estimate of “up” from the gyros carries an error, and your estimate of the exact magnitude and direction of the local gravitational field will also be erroneous. The result is a significant contribution to the increase in error over time suffered by inertial navigation systems.

In the last couple of decades people have looked at using gravitational gradiometers and gravimeters to provide a set of measurements directly to an inertial navigation system to take some of the guesswork out of the process of accounting for gravity, with excellent results (look up the work on gravitational INS aiding on the USS Memphis, e.g. http://www.navy.mil/navydata/cno/n87/usw/winter99/waterfront.htm), including the development of a nearly-unbelievable “gravitational sonar” system, using gravity to sense and image the terrain around the submarine. The initial research was performed in the 1990s, but there is currently a big push in developing cold atom interferometer inertial navigation systems to provide a big step forward in inertial navigation based on a combined gyro+accelerometer+gravitational sensing technology.
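The error growth described above can be put in rough numbers: a small tilt error leaks a slice of gravity into the horizontal accelerometer channels, and double integration turns that bias into quadratic position drift. The figures below are back-of-envelope and ignore the Schuler-loop dynamics that bound the error in a real INS:

```python
import math

G = 9.80665  # nominal gravity magnitude, m/s^2

def horizontal_accel_error(tilt_error_rad):
    """A small attitude (tilt) error leaks gravity into the
    horizontal channels as roughly g * sin(tilt_error)."""
    return G * math.sin(tilt_error_rad)

def position_error_after(tilt_error_rad, seconds):
    """Double-integrating that constant bias gives ~0.5*a*t^2 of
    position drift (a deliberate simplification: real systems see
    Schuler oscillation with an ~84-minute period instead of
    unbounded quadratic growth)."""
    return 0.5 * horizontal_accel_error(tilt_error_rad) * seconds ** 2

# Back-of-envelope: a mere 0.01-degree tilt error, left
# uncorrected for ten minutes, produces roughly 300 m of drift
tilt = math.radians(0.01)
print(position_error_after(tilt, 600.0))
```

This is why direct gravity measurements from gradiometers and gravimeters help so much: they remove part of the guesswork in the gravity correction rather than letting its error integrate into position.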