The Evolution
Over the past 150 years the camera has evolved from the wet-plate camera to the film camera and now to the digital camera. The key technology transition happened in the sensing part of the system: from a chemical to an electronic focal plane. A camera is therefore now better defined as "a device that encodes images" rather than by its original definition as a device that records images.
When digital image signal processing (ISP) chips became available some 50 years ago, they were added to the camera system as a third component. It then became possible to implement analog-to-digital conversion directly on the sensor chip. But since the encoding algorithms are most often implemented as fixed-function ASICs, the adjustable parameters have remained relatively constrained. The basic goal of the ISP (and of the camera) has remained to capture and encode the physical image seen by the sensor.
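To make the encoding role of the ISP concrete, here is a minimal sketch of a fixed-function pipeline in Python/NumPy. The stage names and parameters (black level, white-balance gains, gamma, an RGGB Bayer layout, a naive half-resolution demosaic) are generic assumptions for illustration, not the pipeline of any particular camera.

```python
# Illustrative only: a toy fixed-function ISP with a few hard-wired parameters.
import numpy as np

def simple_isp(raw, black_level=64, wb_gains=(1.8, 1.0, 1.6), gamma=2.2):
    """Convert a Bayer RGGB raw frame (uint16) into an 8-bit RGB image."""
    raw = np.clip(raw.astype(np.float32) - black_level, 0, None)
    raw /= raw.max() + 1e-6                       # normalize to [0, 1]

    h, w = raw.shape
    rgb = np.zeros((h // 2, w // 2, 3), np.float32)
    rgb[..., 0] = raw[0::2, 0::2] * wb_gains[0]                            # R
    rgb[..., 1] = 0.5 * (raw[0::2, 1::2] + raw[1::2, 0::2]) * wb_gains[1]  # G
    rgb[..., 2] = raw[1::2, 1::2] * wb_gains[2]                            # B

    rgb = np.clip(rgb, 0, 1) ** (1.0 / gamma)     # gamma encode
    return (rgb * 255).astype(np.uint8)

# Example: run the pipeline on a synthetic raw frame.
raw_frame = np.random.randint(0, 1023, (480, 640)).astype(np.uint16)
print(simple_isp(raw_frame).shape)                # (240, 320, 3)
```

As in a real ASIC pipeline, the processing steps are fixed; only a handful of parameters such as the white-balance gains and gamma can be tuned.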
Recent developments in camera design lead to a third definition: a camera is now "a device that calculates images." This new definition emerges from the complementary fields of computational imaging and computational photography, which combine physical camera design with computational processing to improve estimation of the physical scene.
Computational Imaging
Computational imaging" consists of integrated design of physical data capture and digital image estimation systems. While all digital cameras necessarily rely on algorithms to control focus, exposure and other parameters, computational photography has focused on building platforms wherein such parameters may be controlled by sophisticated algorithms integrated with image estimation. Thus the computational imaging consists of joint design of image capture and image estimation.
Development Focus for AV Cameras
The camera system has become the most important and indispensable sensor for ADAS and autonomous vehicles. Before an autonomous vehicle decides how to act on a foreseen event, correct calculation and understanding of the sensor outputs are the key to the right actions. Generally speaking, the camera system operates in three main steps: data acquisition, data processing, and data analysis. So there are three stages in the camera system's operation: CAPTURE => PROCESS => UNDERSTANDING (a minimal sketch follows below). Which capabilities matter most among the three?
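A minimal sketch of the three stages as plain Python stubs, assuming simulated data; the function names and the toy statistics in the "understanding" step are illustrative, not an actual AV perception stack.

```python
# Illustrative only: CAPTURE => PROCESS => UNDERSTANDING as three stubs.
import numpy as np

def capture() -> np.ndarray:
    """CAPTURE: acquire a raw frame from the sensor (simulated here)."""
    return np.random.randint(0, 1023, (480, 640), dtype=np.uint16)

def process(raw: np.ndarray) -> np.ndarray:
    """PROCESS: ISP-style normalization into a viewable 8-bit frame."""
    return (raw / raw.max() * 255).astype(np.uint8)

def understand(frame: np.ndarray) -> dict:
    """UNDERSTANDING: extract toy scene statistics for downstream decisions."""
    return {"mean_brightness": float(frame.mean()),
            "bright_pixel_ratio": float((frame > 200).mean())}

# Run the pipeline end to end.
print(understand(process(capture())))
```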
At the beginning of the digital camera era, optics were the most important part of the system: good optics produce sharp, aberration-free images, and the camera was mainly used for photography. After that, and even until now, the center of development shifted to sensors: low-noise, high-dynamic-range, high-quantum-efficiency sensors that offer a high potential for capturing good-quality images, for photography and for industrial applications. Most recently, computational imaging gives us the opportunity to influence and control the capture parameters up front in the acquisition process, and the center of development of the camera system has again shifted to processing.