Why can't we purchase a perfect single shot 3D-sensor?
Institute of Optics, Information and Photonics, University of Erlangen-Nuremberg
gerd.haeusler@physik.uni-erlangen.de
Abstract
Many high-quality 3D-sensors are available. Surprisingly, there is no sensor on the market that delivers "single-shot acquisition" and, at the same time, measurement of "dense surface data" in 3D-space. "Single-shot acquisition" means the option to acquire 3D-data within one single exposure, which would allow for real-time 3D-data acquisition of moving objects by moving sensors. By "dense 3D-data" we mean that each pixel carries individually measured distance information, neither interpolated from its neighboring pixels nor obtained from any other kind of lateral (bandwidth-limiting) context information. To our knowledge, a sensor that combines both features is not yet available. Why? Is such a "pixel-dense single-shot 3D-camera" possible at all? The discussion will reveal surprising insights about information-theoretic differences between optical 3D-sensing and 2D-imaging. Understanding these differences and limits provokes thoughts about a novel single-shot sensor principle that may overcome the fundamental difficulty.