
Nano-Viewing Record Broken

smitty777 writes "Wired magazine reports on a new nanoviewing lens that is capable of viewing objects less than 100 nm across. Rather than attempting to use a 'perfect' lens, this technology uses a porous surface that actually scatters the light. By measuring how it is scattered and setting up lasers to compensate, they're able to 'steer' the light back to the right spot. The abstract from the Physical Review Letters reads: 'The smallest structures that conventional lenses are able to optically resolve are of the order of 200 nm. We introduce a new type of lens that exploits multiple scattering of light to generate a scanning nanosized optical focus. With an experimental realization of this lens in gallium phosphide we imaged gold nanoparticles at 97 nm optical resolution. Our work is the first lens that provides a resolution better than 100 nm at visible wavelengths.'"
Comments:
  • pics or it didn't happen
    • pics or it didn't happen

      There are pics in TFA.

      • by Anonymous Coward

        pics or it didn't happen

        There are pics in TFA.

        Just where in the hell do you think you are?

  • by Anonymous Coward on Friday May 20, 2011 @03:09PM (#36194486)

... now we might be able to read all the fine print in those EULAs...

  • Can someone in the know help interpret the article? Is this an engineering breakthrough or a scientific breakthrough? From my understanding the wavelength of light is a physical limitation to optically viewing small objects. Does this somehow provide us a way to go beyond that limit or is this simply getting closer to it?
    • "Our work is the first lens that provides a resolution better than 100 nm at visible wavelengths."

      Emphasis added. The wavelength of visible light ranges from roughly 400 to 800 nm (round numbers). Apparently they've smashed that limitation rather impressively.

    • by Muerte23 ( 178626 ) on Friday May 20, 2011 @03:20PM (#36194568) Journal

      You can download the article from Arxiv for free here:

      Basically, the imaging resolution of a lens (typically) has to do with its numerical aperture (NA). A small lens far away has terrible resolution, and vice versa. The trouble with really high-NA lenses is that they are hard to make without distortions. It's easy to make spherical shapes, but the aptly named spherical aberration starts to ruin your image once the NA gets high. So what they've done is taken a ground glass surface and put it really close to the object, so that the "scattering lens" subtends close to 2pi steradians. Then they use a spatial light modulator (transmissive LCD screen) to control the phase of their laser beam across many domains to sort of pick out the random scattering elements on the frosted screen that give them the best image. Sort of. There is much additional trickery, but I think that's the gist of it.
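      The phase-control trick described above can be sketched in a few lines of Python. This is a toy model, not the paper's actual procedure: the scattering medium is reduced to a made-up random transmission vector, and the stepwise optimization just tries a set of test phases per SLM segment and keeps whichever maximizes intensity at the target spot.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64                       # number of SLM phase segments (assumed)
# Hypothetical scattering medium: random complex transmission
# coefficients from each SLM segment to the desired focus spot.
t = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2 * N)

def focus_intensity(phases):
    """Field at the target is the phased sum over all segments."""
    return abs(np.sum(t * np.exp(1j * phases))) ** 2

phases = np.zeros(N)
before = focus_intensity(phases)

# Stepwise sequential optimization: for each segment, try a set of
# test phases and keep the one that brightens the target the most.
test_phases = np.linspace(0, 2 * np.pi, 16, endpoint=False)
for k in range(N):
    def intensity_with(p, k=k):
        trial = phases.copy()
        trial[k] = p
        return focus_intensity(trial)
    phases[k] = max(test_phases, key=intensity_with)

after = focus_intensity(phases)
print(f"focus enhancement: {after / before:.1f}x")
```

      The enhancement you get this way scales with the number of controlled segments, which is why the real experiment uses thousands of them.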

      • by Fauxbo ( 1393095 )

        So what they've done is taken a ground glass surface and put it really close to the object, so that the "scattering lens" subtends close to 2pi steradians. Then they use a spatial light modulator...

        Ooooh yeah... that makes sense now.

    • First off, physics says this is rubbish. They just re-invented super-resolution enhancement of point sources.

      First you need to know why a "perfect" lens is special. When light leaves a small region, the shape of the wavefront can be described, in a Fourier-transform sense, as a set of plane waves with various K vectors. Now it turns out that not all K vectors can propagate to the far field. Ones with K vectors greater than the reciprocal wavelength simply decay a short distance from the source and never reach the far field.
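      To put a number on how fast those high-K components die off, here's a quick back-of-the-envelope in Python (the wavelength and feature size are assumed for illustration, not taken from the paper):

```python
import numpy as np

wavelength = 500e-9              # visible light, assumed value
k0 = 2 * np.pi / wavelength      # free-space wavenumber

# A feature of size d carries spatial frequencies up to about kx = pi/d.
d = 50e-9                        # sub-wavelength feature, assumed value
kx = np.pi / d

# For kx > k0 the axial wavenumber is imaginary: the wave is evanescent
# and decays as exp(-z / L) with decay length L = 1 / sqrt(kx^2 - k0^2).
L = 1 / np.sqrt(kx**2 - k0**2)
print(f"evanescent decay length: {L * 1e9:.1f} nm")
```

      So the information about 50 nm features is essentially gone a few tens of nanometers from the source, which is why no far-field lens can recover it.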

      • by osu-neko ( 2604 )

        They are claiming they can reconstruct the missing k-vectors. they can't.

        That's not what they're claiming.

        If you know something about the source...

        They do, after the "first pass", and adjust the lasers accordingly.

        That cannot be done if the thing you are imaging is arbitrary.

        It can, it just can't be done instantaneously.

        You have to know something to make up the missing information.

        ...and you can determine information in the "first pass". Since this is essentially a kind of adaptive optics, you can start with something arbitrary, but end up knowing a lot about what you're imaging, and using that information to adjust the lens or light to reveal further information.

      • by Khyber ( 864651 )

        "But you can "fake" it. this is called super-resolution."

        No, it's called interpolation.

        • Interpolation is faking it. Interpolation is just assuming a cap on a derivative. It's an assumption. Perhaps a good one but it's not imaging.

    • by Morty ( 32057 )

      Is this an engineering breakthrough or a scientific breakthrough?

      Engineering breakthrough. They have a new technique for getting close to the theoretical limits without changing the theoretical limits. That's engineering.
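      For a rough sense of where the theoretical limit sits (the wavelength and refractive index below are assumed illustrative values, not figures from the paper), the Abbe limit d = λ / (2·NA) inside a high-index medium like gallium phosphide works out to:

```python
# Abbe diffraction limit: d = wavelength / (2 * NA), with NA at most
# equal to the refractive index n of the medium around the sample.
wavelength_nm = 561    # assumed green laser line
n_gap = 3.4            # approximate refractive index of GaP (assumed)

abbe_limit = wavelength_nm / (2 * n_gap)
print(f"best possible resolution inside GaP: {abbe_limit:.0f} nm")

# In air (n = 1) the same wavelength can do no better than ~280 nm,
# which is why a high-index scattering lens beats conventional optics.
print(f"limit in air: {wavelength_nm / 2:.0f} nm")
```

      On these assumed numbers, their 97 nm result is close to the in-medium limit rather than beyond it, consistent with the parent's "engineering breakthrough" reading.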

  • by TheDarAve ( 513675 ) on Friday May 20, 2011 @03:15PM (#36194522)

    Wait wait wait... How are you able to get "visible wavelengths" out of something whose size corresponds to light deep in the ultraviolet range of the electromagnetic spectrum?

    Serious question here, as I'd like to know if this means they're looking at quarterwave light or what...

    • by _0xd0ad ( 1974778 ) on Friday May 20, 2011 @03:23PM (#36194596) Journal

      In simple terms, I think they're carefully aligning the incoming photons.

      It's like trying to hit a target with a bullet that travels along a sine wave; you have to determine its phase at the point where it hits the target to figure out where it will end up.

    • I think you're mixing up the length of a wave period (the wavelength) with the size of the objects being viewed.

      Remember that light is made of photons, which are much much smaller than 1 nm. It's a quantum particle.

      So even if something is somewhat smaller than the visible wave length, it will still reflect these waves, although it probably will cause diffraction...

      I may be wrong, I don't grasp the wave functions very well...

      • It does cause diffraction, and that's the killer; when you enlarge it, you just enlarge the diffraction pattern.

        • What about this? []

          I'd interpret the fact that we see the laser through the atmosphere as "seeing" the particles in the air, even though I can't make out their form since they're so small. I think it's safe to suppose that those particles are much smaller than the wavelength of the laser itself.

          • "Seeing" would seem to imply the photon had "bounced off" of them, though, when in actuality it's more like it's slingshotted around them without actually touching them.

            Of course at the atomic level there's truly no difference between the two because, regardless of the size of the object, a photon will almost never actually touch a particle of mass - on the atomic level everything is pretty much just open space filled with electromagnetic fields.

            • That deviates from the original question a bit, but if the photon never "touches" a particle, what about black body radiation? Or are you only referring to reflection/diffraction/etc when you say that photons almost never touch a particle of mass?

        • by Lumpy ( 12016 )

          As shown in the photos.. you can see the diffraction pattern.

      • by ortholattice ( 175065 ) on Friday May 20, 2011 @04:10PM (#36195178)

        Remember that light is made of photons, which are much much smaller than 1 nm. It's a quantum particle.

        It's not as simple as that. In the double-slit experiment, which gives an interference pattern even if you fire one photon at a time, the photon is influenced by both slits (several hundred nm apart or more). If you cover one slit, the interference disappears.

    • by Lumpy ( 12016 )

      Which raises the question: why are they bothering with visible light at all? Let's go really, really deep UV and get a decent image. And why can't we use really low-power X-rays or gamma rays to do imaging without damaging the itty bitty viruses?

      • And why can't we use really low-power X-rays or gamma rays to do imaging without damaging the itty bitty viruses?

        You sort of answered your own question, there...

        4 possibilities: reflect, refract, transmit, absorb. There's a fundamental difference between a microscope that illuminates the sample from above/beside (you see mostly reflected light) and one that illuminates it from below (you see mostly transmitted light and a little refracted light).

        Different stuff reacts differently to different wavelengths, and the "absorb" thing tends to cook the sample. Visible light tends to reflect better. X-rays tend to absorb.

        • Also, diffract:

          At the atomic level, X-rays at the right wavelength diffract around atoms (really, at that level, it's "electron clouds"), and you can use the diffraction pattern to estimate the localized density of the electron clouds, in an attempt to figure out which atoms go where (heavier atoms have heavier electron clouds). However, hydrogen atoms (protons) also tend to form a sort of cloud, but that's more of a physics limitation than a measurement one. And yes, the sample does "cook" in the process.
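          The diffraction geometry above can be made concrete with Bragg's law, n·λ = 2·d·sin θ. The wavelength and plane spacing below are assumed illustrative values (a common copper-anode lab source and a typical few-angstrom crystal spacing):

```python
import math

# Bragg's law, n * lambda = 2 * d * sin(theta): the angle at which
# x-rays of a given wavelength diffract off atomic planes spaced d apart.
wavelength = 0.154e-9   # Cu K-alpha x-rays, a common lab source
d = 0.3e-9              # assumed plane spacing, about 3 angstroms

theta = math.asin(wavelength / (2 * d))   # first-order reflection (n = 1)
print(f"Bragg angle: {math.degrees(theta):.1f} degrees")
```

          The point is that the wavelength has to be comparable to the atomic spacing for a measurable diffraction angle to exist at all, which is why this works with x-rays and not visible light.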
  • by Anonymous Coward

    Just curious, hope someone knows. How are they able to make transistors that are 32nm when they can only see 100nm? How do they verify?

    • Grow a lot of them in a crystalline structure and use the ones that work.

    • There's a difference between an optical lens, which this is supposed to be, and an electron lens like what you find in specialized microscopes. Quoting from Wikipedia:
      "An electron microscope is a type of microscope that uses a particle beam of electrons to illuminate the specimen and produce a magnified image. Electron microscopes (EM) have a greater resolving power than a light-powered optical microscope, because electrons have wavelengths about 100,000 times shorter than visible light (photons)..."

    • by Amouth ( 879122 )

      Difference between methods of viewing: this is looking at the visible light spectrum. We can get down to around single-atom resolution with scanning electron microscopes.
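      For scale, here is the de Broglie wavelength of the imaging electrons, λ = h / √(2·m·e·V), in the non-relativistic approximation (the accelerating voltages are just typical examples, not specific to any instrument):

```python
import math

# De Broglie wavelength of accelerated electrons,
# lambda = h / sqrt(2 * m_e * e * V), non-relativistic approximation.
h = 6.626e-34      # Planck constant, J*s
m_e = 9.109e-31    # electron rest mass, kg
e = 1.602e-19      # elementary charge, C

for kV in (10, 100):
    V = kV * 1e3
    lam = h / math.sqrt(2 * m_e * e * V)
    print(f"{kV:3d} kV -> {lam * 1e12:.2f} pm")
```

      Even at modest voltages the electron wavelength is a few picometers, thousands of times smaller than a 32 nm transistor feature, so diffraction isn't the bottleneck there.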

  • Does this mean that if we achieve a tighter concentration of pores on the lens, we will achieve gradually smaller scales?
  • What record was broken, a Blu-ray disc? That was how I first read the title, confusing "record" with some kind of disc.

  • There are my keys...

  • SNOM (scanning near-field optical microscopes) can easily resolve images at 100 nm at visible wavelengths and have done so for some years now. You can actually buy these microscopes commercially. I'm sure this new method is better than SNOM in some regard, or has the potential to be, but the resolution they achieved is not really a "Nano-Viewing Record". More a lens-building record.

    Non-optical methods like scanning force microscopy have resolved far better than that for years now, of course.
  • So the sample must sit in a very narrow area on the plate with the diffuser. Looks like we did not break free from the real limitations, then. If we could do this in the middle of a cell, that might be something. The point they make is that you can use this diffuser as a "perfect" lens because one can compensate for the phase distortion really well. The image they get is close in resolution to the theoretical wave limit. The surprise is that this works better than the classical approach to making optics.
