Breaks classical wavelength diffraction limit
December 11, 2012 12:50 AM

New plasmon technique breaks classical diffraction limit of light. Plasmon techniques have been showing up in some strange newer tech (solar cells, metamaterials). This time researchers at Caltech have found a wave (using plasmons) to focus laser light (with a wavelength of hundreds of nanometers) into a point "a few nanometers" across. Among the possible applications listed is a new kind of microscope that could image cell features, a 50x increase in disk drive density, and the usual increases in bandwidth and fiber-optic communication capacity.
posted by aleph (17 comments total) 8 users marked this as a favorite
 
found a *way* (using plasmons)...
posted by aleph at 1:12 AM on December 11, 2012


Screw microscopes, I wanna see *telescopes* with mega-resolution. More accurately, I wanna use these telescopes to read alien newspapers from the Nebulon Galaxy.
posted by DU at 4:51 AM on December 11, 2012 [1 favorite]


Great, so what color are we going to use for the new MetaMaterials site then?
posted by The 10th Regiment of Foot at 4:58 AM on December 11, 2012 [1 favorite]


Among the possible applications listed is a new kind of microscope that could image cell features

How much power is concentrated in the focused region? Wouldn't focusing a laser beam in this way damage (living) cells, making imaging a bit difficult? I think this is a problem with confocal microscopy, for example.
posted by Blazecock Pileon at 5:03 AM on December 11, 2012


Just had a thought, reading this... that it seems like every breathless announcement about the world changing is met with tons of excitement, and then the invention not really doing what was claimed, at least not at the time.

It's not until later, when the invention isn't exciting anymore, and when nobody is really paying attention, that things really change. Two examples I can think of; full-motion video on the computer, which at one time was exceedingly difficult, and had a lot of people tremendously excited for the possibilities, and the Internet, which had all of society wound up tight as a tick.

But the changes only started to get major when people weren't noticing them anymore. In August, for instance, I was watching two live video streams over the Internet, along with a sophisticated 3D simulation of a craft that was landing on another planet. I remember when having just ONE was a big deal, but here I was easily watching two in very high resolution, being pulled in live over a worldwide data network, plus a 3D model, and I could easily have opened two more streams without a problem, maybe more.

And I compared that to my first memory of a Moon landing (one of the later ones) on a tiny television, sitting by the staircase; I'm not even sure if it was color or not. The changes between those two times are immense, but they didn't happen when everyone was all het up about them, they happened quietly, a little at a time.

My overall conclusion: plasmons are very exciting and interesting, and because of that, they're not very important yet. It's only when they're routine that they're really going to shake things up in a serious way. It's mostly boring things that change the world.
posted by Malor at 5:14 AM on December 11, 2012 [8 favorites]


The problem with confocal is that you can bleach the fluorophore, which releases toxic free radicals into the cell. The laser light itself is not harmful. If you're focusing light better, presumably you'd just decrease the laser power to maintain a similar intensity in the region of interest. As you'd have less off-focus light, there would be less bleaching and the cell would stay healthier.
posted by Humanzee at 5:16 AM on December 11, 2012 [2 favorites]
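
Humanzee's point is just an area argument: intensity is power divided by spot area, so a tighter focus lets you turn the laser power down in proportion. A minimal Python sketch with purely illustrative numbers (none of these values come from the paper):

import math

def intensity(power_w, spot_radius_m):
    """Average intensity (W/m^2) of a beam focused into a circular spot."""
    return power_w / (math.pi * spot_radius_m ** 2)

# Illustrative only: 1 mW focused to a 250 nm spot vs. a hypothetical 5 nm plasmonic spot.
p_conventional, r_conventional = 1e-3, 250e-9
r_plasmonic = 5e-9

# Power needed at the tighter focus to keep the same average intensity:
p_plasmonic = p_conventional * (r_plasmonic / r_conventional) ** 2

print(intensity(p_conventional, r_conventional))  # ~5.1e9 W/m^2
print(intensity(p_plasmonic, r_plasmonic))        # same ~5.1e9 W/m^2, from only ~0.4 uW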


Ah, got it. Thanks.
posted by Blazecock Pileon at 5:19 AM on December 11, 2012


It's only when they're routine that they're really going to shake things up in a serious way. It's mostly boring things that change the world.

It's only people that change the world. And people need tools and information to do that with. When only a few people have some tools/information, then the chances that one of those few has a world-changing idea are fairly low. Tools and ideas have to be widespread (i.e. well-known and "boring") to have a good chance of triggering something.
posted by DU at 5:27 AM on December 11, 2012


Could this be used to project a low-powered laser directly onto the retina?
posted by empath at 6:46 AM on December 11, 2012


Malor: "It's not until later, when the invention isn't exciting anymore, and when nobody is really paying attention, that things really change. Two examples I can think of; full-motion video on the computer, which at one time was exceedingly difficult, and had a lot of people tremendously excited for the possibilities, and the Internet, which had all of society wound up tight as a tick. "

You've got a great point, but I don't think that your example quite applies here.

Science is difficult and takes a lot of time. New discoveries are pretty few and far between.

In order for these discoveries to be made useful, engineers must first determine the "real-world" applicability of these discoveries, then figure out how to build the things economically, and then package them into marketable products. Quite often, this is also difficult, takes a lot of time, and cannot even begin until all of the science stuff has been sorted out.

Even more often, new technologies need to be sent back to the lab to be re-envisioned using cheaper and/or less complicated materials and manufacturing processes. While the scientists in the lab use gold conductors and germanium semiconductors, the engineers usually need to achieve the same results using readily-available materials.

Derail ahead:

The example you gave -- full-screen digital video -- is a little different. There were no fundamental new technologies that were invented to enable the proliferation of digital video content. Rather, the widespread adoption of digital video was achieved through the gradual improvement of several different technologies (all of which had surprisingly well-defined thresholds for widespread adoption).

First, we needed to develop processing and graphics hardware that could handle the high sustained data rates necessary to do full-screen digital video. Along the way, we needed to develop even faster processors so that we could "cheat" by compressing the video to take the data rates down to something more manageable. Even today, uncompressed HD bitrates are too high to be handled by anything other than specialized hardware. Once we overcame this hurdle, we started seeing computers used in high-end production facilities.

Secondly, we needed a way to store all this data. Tapes (e.g. DV/DigiBeta) were the first technology to become economical, followed by optical discs (CD/DVD), spinning platters (hard drives), and now, finally, solid-state storage. All of these technologies were developed and evolved independently, outside the world of video. Each of these steps brought something new to the table -- optical discs can do random-access reads, and hard drives can do random-access reads and writes.

Finally, we needed to develop and build out the network bandwidth necessary to transmit digital video across the interwebs in real time. Separately from the internet, we've actually been broadcasting digital video for quite some time via satellite, cable, etc, as those technologies matured before MPEG compression did. Once we could economically encode and decode digital video, we already had quite a few suitable ways to beam it through the air.

As far as web video goes, I think that there was considerable debate about the quality thresholds that were necessary to achieve widespread adoption, and I think there were as many people surprised by how early Netflix skyrocketed to popularity as there were people surprised by how long it took. (TV executives really don't understand the web, and they drove a lot of that debate...) However, again, we gradually inched up to that threshold, and there was no single scientific breakthrough that enabled it. There were a lot of tiny steps that were necessary to enable the widespread adoption of that technology, and it was difficult to predict exactly where those steps would be.
posted by schmod at 7:07 AM on December 11, 2012 [1 favorite]
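
The uncompressed-bitrate point above is easy to sanity-check with back-of-the-envelope arithmetic; a quick sketch using 1080p at 8 bits per RGB channel and 30 fps as a representative example (these parameters are illustrative, not from the comment):

# Rough uncompressed bitrate: width x height x bits per pixel x frames per second.
width, height = 1920, 1080
bits_per_pixel = 24   # 8 bits per channel, RGB, no chroma subsampling
fps = 30

bits_per_second = width * height * bits_per_pixel * fps
print(f"{bits_per_second / 1e9:.2f} Gbit/s")  # ~1.49 Gbit/s, versus a few Mbit/s for a typical compressed stream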


empath: Could this be used to project a low-powered laser directly onto the retina?

It could, potentially. However, there's no real reason to do that, since the diffraction-limited size of a light beam on your eye in normal conditions is on the order of microns (this is just due to the optics of the eye). We can already scan lasers across the eye to get pictures of it (that's exactly what I research), and it's very easy to turn an imaging setup into a projection setup. The real issue is getting a large enough field of view to simulate vision; the optics of the eye far from the vision center get really tricky, and you're also limited by scanning speed. This is an active area of research, though.

There's also another reason I'm skeptical about this being used to image anything. Sure, the light is concentrated to a tiny region in the semiconductor device, but you can't put semiconductors inside the human eye. So you then have to get the light from the semiconductor to your sample, probably using lenses. And those lenses, even for an infinitesimally small source, will produce a laser spot that is again on the order of microns. So the potential for microscopy seems limited.

Still, it's a really cool paper, and very impressive work. The paper is here; if you want it but don't have access, memail me.
posted by Maecenas at 7:37 AM on December 11, 2012
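
Maecenas' micron-scale figure falls out of the standard Airy-disk formula for a diffraction-limited spot. A rough sketch with nominal textbook values for the eye (the wavelength, focal length, and pupil diameter below are assumptions for illustration, not numbers from the comment or the paper):

# Airy-disk diameter at the focus: ~2.44 * wavelength * focal_length / aperture_diameter
wavelength = 550e-9      # green light, meters
focal_length = 17e-3     # rough effective focal length of the human eye, meters
pupil_diameter = 3e-3    # meters

airy_diameter = 2.44 * wavelength * focal_length / pupil_diameter
print(f"{airy_diameter * 1e6:.1f} um")  # ~7.6 um -- on the order of microns, as stated above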


That said, there are probably many who would not consider this to be basic science. The line between science and engineering is blurry, especially in cases like this.

The Caltech researchers did not discover any new properties of matter. Rather, they devised and demonstrated a novel device that "sidesteps" a well-known physical limit, using already-known laws of physics and properties of matter*.

This is by no means trivial, especially given that this is the first practical device of its kind. However, I'm sure that you could start a lively debate about whether this is 'Physics' or 'Applied Physics.'

The synopsis of the paper even notes that this technique is a (significant) improvement on one that was already known, and that efficiency and ease of fabrication were both goals (and achievements) of the project. Those are not phrases that you often see in Physics journals.

*Caveat: I can't read the whole paper. The researchers on this project may have also discovered some new or novel properties of silicon dioxide that enabled them to build their device. If this is the case, the research fits firmly in the "hard science" category.
posted by schmod at 7:40 AM on December 11, 2012


The research was funded by the Defense Advanced Research Projects Agency (DARPA) Science and Technology Surface-Enhanced Raman Spectroscopy program, the Department of Energy, and the Division of Engineering and Applied Science at Caltech.
Cost to read the article: $32.

So, that's pretty upsetting....
posted by schmod at 7:42 AM on December 11, 2012 [1 favorite]


There were a lot of tiny steps that were necessary to enable the widespread adoption of that technology, and it was difficult to predict exactly where those steps would be.

Sure, absolutely, but it's precisely the fact that those steps got taken that made the technology boring -- and the now boring technology then had a gigantic impact. It was easy, and thus became cheap and ubiquitous.

I guess what I'm trying to say is that exciting and important are, at least in technology, often (usually?) diametrically opposed to one another. Hmm, maybe important is the wrong word. Impactful, maybe? Significant? Maybe this was already obvious to everyone else, but it was brand-new to me when I made the last post. The more fascinating something is, the more likely it is to be a long time before it will actually alter your life significantly.

That doesn't make it less exciting. Plasmons are cool. They're even better than bowties.
posted by Malor at 7:50 AM on December 11, 2012 [1 favorite]


This is a pretty cool paper, but it's not all that revolutionary. Its main limitation is that it's a near-field approach to focusing light - the light is focused to a point at the end of the waveguide. So, as Maecenas said, you need to have the sample you want to image very close (probably a micron or less) to the end of the device. This is distinct from ordinary focusing with a lens, where your focus can be arbitrarily far from the lens (hence, this is known as far-field imaging).

We've known for a long time that you can break the diffraction limit with near-field imaging (see NSOM, near-field scanning optical microscopy, developed in the '80s) or with the more recent metamaterial superlens imaging (which also works in the near-field regime). It sounds (having only read the press release) like the main improvements here are technical: the device is fabricated using conventional nanofabrication techniques, meaning it's easy to mass-produce, and it's more efficient than previous devices.

Finally, breaking the diffraction limit in light microscopy is a popular thing to do these days. In none of these cases do they really break the diffraction limit; as was mentioned above, they sidestep it. Truly breaking the diffraction limit (for example, focusing a laser beam to an arbitrarily small spot in the far field) probably can't be done. But like all laws there are loopholes, and you can get a lot of useful information out of a microscope that uses them. For more info, the Wikipedia page on super-resolution microscopy is pretty good, and MicroscopyU has a good page as well.
posted by pombe at 10:16 AM on December 11, 2012 [2 favorites]
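
For reference, the far-field limit pombe describes is the Abbe criterion, d = wavelength / (2 * NA). A quick sketch with an illustrative visible wavelength and a high-NA oil-immersion objective (neither value is taken from the paper):

# Abbe far-field diffraction limit: smallest resolvable spacing d = wavelength / (2 * NA)
wavelength = 650e-9          # illustrative red laser, meters
numerical_aperture = 1.4     # high-end oil-immersion objective

d = wavelength / (2 * numerical_aperture)
print(f"{d * 1e9:.0f} nm")  # ~232 nm: hundreds of nanometers, versus the few-nanometer near-field spot described in the post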


Oh, and Malor: Plasmonic Bowties!
posted by pombe at 10:20 AM on December 11, 2012 [1 favorite]


The research was funded by the Defense Advanced Research Projects Agency (DARPA) Science and Technology Surface-Enhanced Raman Spectroscopy program, the Department of Energy, and the Division of Engineering and Applied Science at Caltech.

Cost to read the article: $32.

So, that's pretty upsetting....

As it stands, journals are required to allow access to NIH-funded research articles after 12 months. The pressure on Nature et al. to go open access with publicly funded science is increasing, but they seem unlikely to budge barring passage of something like FRPAA, which could very well happen in the near future.

For now, I'm happy to publish in PLOS.
posted by StrangerInAStrainedLand at 3:38 PM on December 11, 2012




This thread has been archived and is closed to new comments