Autonomous vehicles rely on perception systems to understand their
surroundings and support downstream navigation tasks. Cameras are essential
to these systems because modern computer vision algorithms provide strong
object detection and recognition capabilities compared with other sensors
such as LiDARs and radars. However, owing to its inherent imaging principle,
a standard RGB camera may perform poorly in a variety of adverse scenarios,
including low illumination, high contrast, and bad weather such as fog,
rain, and snow. Moreover, estimating 3D information from 2D image detections
is generally more difficult than with LiDARs or radars.
Several new sensing technologies have emerged in recent years to address the
limitations of conventional RGB cameras. In this paper, we review the
principles of four novel image sensors: infrared cameras, range-gated cameras,
polarization cameras, and event cameras. Their comparative advantages, existing
or potential applications, and corresponding data processing algorithms are all
presented in a systematic manner. We expect that this study will provide
practitioners in the autonomous driving community with new perspectives and
insights.