## How To Build a Quantum Telescope

Posted by samzenpus

from the all-the-better-to-see-you-with dept.


KentuckyFC (1144503) writes

*"The resolving power of telescopes is limited by the diffraction limit, a natural bound on resolution caused by the way light diffracts as it passes through a lens. But in recent years, physicists have worked out how to use quantum techniques to beat the diffraction limit. The trick is to create a pair of entangled photons, use one to illuminate the target and the other to increase the information you have about the first. All this is possible in the lab because physicists can use their own sources of light. Indeed, last month, physicists unveiled the first entanglement-enhanced microscope that beats the diffraction limit. But what about astronomy, where the light comes from distant astrophysical sources? Now one physicist has worked out how to use quantum techniques to beat the diffraction limit in telescopes too. Her idea is to insert a crystalline sheet of excited atoms into the aperture of the telescope. When astrophysical photons hit this sheet, they generate an entangled pair of photons. One of these photons then passes through the telescope to create an image, while the other is used to improve the information known about the first and so beat the diffraction limit. Of course, all this depends on improved techniques for increasing the efficiency of the process and removing noise that might otherwise swamp the astrophysical signal. But these are still early days in the world of quantum imaging, and at least astronomers now know they're not going to be excluded from the fun."*
## The image formation process is still the same (Score:4, Informative)

The field being produced by the telescope optics is still the same, since the same primary mirror, secondary, etc. are being used to form the image. Yes, you can use multiphoton processes, even ones promoted by entangled photons, to produce apparently narrowed two-photon wavefunctions. However, this two-photon wavefunction is still derived from the ordinary-resolution field created by the telescope optics. Therefore it seems to me that little is to be gained by using entangled-photon detection to augment a process of image formation that is still fundamentally limited by the telescope aperture size.

Similar arguments have been made for other forms of imaging (e.g. microscopy and optical coherence tomography), and they all have this issue, as the image formation process is still essentially a linear scattering process. There was some excitement around quantum lithography; however, even that has the problem that the probability of two-photon processes can be quite small even with entangled photons. For inherently multiphoton processes, such as two-photon absorption, stimulated Raman scattering, etc., there may be an advantage in increasing resolution and lowering dose, but I don't see much benefit in applying this to an instrument whose image formation is a linear imaging process.

## A question about the microscope (Score:5, Informative)

There are also already fluorescence-based microscopy techniques which surpass the diffraction limit. [wikipedia.org]

I'm not going to say it doesn't sound useful, since most of the time you only realize how useful a thing is once you already have it. I'll just say that if the microscope mentioned here doesn't do fluorescence, I can't think of anything one would use it for that couldn't be done better with EM or fluorescence microscopy.

## Re:The image formation process is still the same (Score:5, Informative)

The effect of a telescope's point spread function [wikipedia.org] is to convolve [wikipedia.org] the image. A raw image f(x) is turned into f'(x) = int dy f(x-y) g(y), where g is the point spread function. By the convolution theorem, a convolution is simply a product in Fourier space [wikipedia.org], so F'(k) = F(k)*G(k), where uppercase functions are Fourier-transformed versions of lowercase ones, and k is the wavenumber [wikipedia.org]. From this you can see that recovering the raw, unblurred image (i.e. overcoming the diffraction limit) is just a question of computing F(k) = F'(k)/G(k), i.e. dividing by the point spread function in Fourier space, or deconvolving it in normal coordinates.
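To make the convolution-theorem argument concrete, here is a minimal numpy sketch. The image, the Gaussian PSF, and all sizes are toy assumptions, not any real telescope's: blurring is a product F(k)*G(k) in Fourier space, and in the noise-free case dividing by G(k) recovers the original essentially exactly.

```python
import numpy as np

n = 256
x = np.arange(n)

# Toy 1D "image" f: two nearby point sources.
f = np.zeros(n)
f[100], f[110] = 1.0, 0.7

# Hypothetical Gaussian point spread function g, normalized to unit area.
g = np.exp(-0.5 * ((x - n // 2) / 2.0) ** 2)
g /= g.sum()
G = np.fft.fft(np.fft.ifftshift(g))  # transfer function G(k)

# Convolution theorem: blurring f by g is the product F(k) * G(k).
f_blurred = np.real(np.fft.ifft(np.fft.fft(f) * G))

# Noise-free deconvolution: divide by G(k) and transform back.
f_recovered = np.real(np.fft.ifft(np.fft.fft(f_blurred) / G))

print(np.max(np.abs(f_recovered - f)))  # tiny: recovery is exact up to roundoff
```

With no noise term, the division undoes the blur completely, which is exactly why the interesting limit below comes from noise rather than from G(k) itself.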

What limits our ability to do so is the presence of noise in the final image. So a more realistic data model is F'(k) = F(k)*G(k) + N(k), where N is the noise, and when we now try to remove the point spread function, we get F_estimate(k) = F'(k)/G(k) = F(k) + N(k)/G(k). So we get back our unblurred original image, but with an extra noise term. And since G(k) usually falls exponentially with k [wikipedia.org] (meaning that high wavenumbers = small scales in the image are progressively blurred), N(k)/G(k) will grow exponentially with k, making the noise ever greater as one tries to recover smaller and smaller scales. This puts a limit on how small a scale one can recover. But as you can see, that limit is not set by G(k) alone, i.e. not just by the diffraction limit; it is set by a combination of the telescope's noise level and the point spread function. In the absence of noise we could fully recover all the small scales. And even with noise one can push a bit below the diffraction limit with enough data. But due to that exponential rise of the noise with wavenumber, it's an extreme case of diminishing returns. It is much cheaper to make the telescope bigger.
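The runaway N(k)/G(k) term is easy to see numerically. The following numpy sketch uses a toy image, a hypothetical Gaussian PSF, and a made-up noise level: naive division by G(k) turns tiny pixel noise into enormous artifacts, while damping the division where G(k) is small (a Wiener-style filter with a hand-picked regularization, one standard way to trade resolution for noise) keeps the estimate sane.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256
x = np.arange(n)

# Toy image: two point sources.
f = np.zeros(n)
f[100], f[110] = 1.0, 0.7

# Hypothetical Gaussian PSF and its transfer function G(k).
g = np.exp(-0.5 * ((x - n // 2) / 2.0) ** 2)
g /= g.sum()
G = np.fft.fft(np.fft.ifftshift(g))

# Observed image: F'(k) = F(k) G(k) + N(k), with tiny white noise.
f_obs = np.real(np.fft.ifft(np.fft.fft(f) * G)) + 1e-6 * rng.standard_normal(n)

# Naive deconvolution: N(k)/G(k) explodes where G(k) is exponentially small.
f_naive = np.real(np.fft.ifft(np.fft.fft(f_obs) / G))

# Wiener-style filter: damp the division where |G(k)|^2 drops below eps.
eps = 1e-10  # hand-tuned noise-to-signal regularization
f_wiener = np.real(np.fft.ifft(np.fft.fft(f_obs) * np.conj(G) / (np.abs(G) ** 2 + eps)))

print(np.max(np.abs(f_naive - f)))   # large: noise amplified by ~1/G(k)
print(np.max(np.abs(f_wiener - f)))  # modest: noise damped, some blur remains
```

The damping threshold eps plays exactly the role described above: it decides at which wavenumber you stop trusting the data, i.e. how far below the diffraction limit your noise level lets you push.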