We need to sample the incident light with picosecond resolution and be highly sensitive to low photon arrival rates. Our depth resolution is limited by the response time of the detector and digitizer (250 ps, or 7.5 cm of light travel). The high peak power of our laser was critical for registering SNR above the dark current of our photosensor. Also, our STIR acquisition times are in nanoseconds, which lets us take a large number of exposures and time-average them to reduce Gaussian noise.
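The two numbers in that explanation are easy to check yourself. This is just a back-of-envelope sketch (the function names are mine, not the researchers'): light-travel distance during the detector's response time bounds the depth resolution, and averaging N independent exposures shrinks Gaussian noise by a factor of sqrt(N).

```python
import math

C = 3.0e8  # approximate speed of light, m/s

def light_travel_distance(response_time_s):
    """Distance light covers during the detector's response time, in meters."""
    return C * response_time_s

def averaged_noise_sigma(sigma_single, n_exposures):
    """Std. dev. of the mean of n independent exposures with Gaussian noise."""
    return sigma_single / math.sqrt(n_exposures)

# A 250 ps response time corresponds to ~7.5 cm of light travel,
# which is the quoted depth-resolution limit.
print(light_travel_distance(250e-12))   # ~0.075 m

# Averaging 10,000 exposures cuts Gaussian noise by a factor of 100.
print(averaged_noise_sigma(1.0, 10_000))
```

With nanosecond-scale acquisitions you can stack thousands of exposures per second, which is why the averaging trick is practical here.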
That's exactly where I think their visualization breaks down. This is a problem with that pesky particle/wave duality thing. I see it more as a wave.
I see it as illuminating the hidden object with a point source from the laser hitting the back wall. Then they're reading the reflected light off the back wall, and reconstructing the received wavefront with interferometry. You can see that happening a little, near the end, as the laptop shows layers of wavefronts overlaying and building the object.