The role of camera sensors in machine vision applications

2022-11-11 浙江蓝海光学科技有限公司

A camera sensor is a photosensitive detector that converts incident photons into electrical signals that can be read by digital devices. Most cameras use 2D array detectors, and choosing the right sensor type for a given application usually requires a trade-off between cost, the required final image resolution, and the necessary readout speed.
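As a rough illustration of that photon-to-signal conversion, the sketch below models a single pixel: incident photons become photoelectrons according to a quantum efficiency, the charge saturates at a full-well capacity, and a conversion gain turns the charge into a digital number. This is a minimal sketch, not a model of any real sensor; the quantum efficiency, full-well, and gain values are assumptions chosen for illustration.

```python
# Minimal single-pixel model: photons in, digital number (DN) out.
# All parameter values are illustrative assumptions, not real specifications.

def pixel_signal(photons: float,
                 quantum_efficiency: float = 0.6,  # fraction of photons converted to electrons (assumed)
                 full_well: float = 20_000.0,      # max electrons the pixel can hold (assumed)
                 gain_e_per_dn: float = 2.0) -> int:
    """Convert photons incident on one pixel into a digital number."""
    electrons = photons * quantum_efficiency
    electrons = min(electrons, full_well)   # pixel saturates at full-well capacity
    return int(electrons / gain_e_per_dn)   # amplifier/ADC conversion gain

print(pixel_signal(10_000))  # -> 3000 DN for 10,000 incident photons
```

With these assumed numbers, the pixel saturates at 20,000 electrons, which is what bounds the brightest signal the sensor can distinguish.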


The most common types of camera sensor are the charge-coupled device (CCD) and the complementary metal-oxide-semiconductor (CMOS) image sensor. These are two distinct sensor technologies, each with advantages and disadvantages depending on the operating conditions of the final equipment.


Historically, CCD sensors have been the sensor of choice for high-resolution and low-light applications. A CCD sensor is an array of pixels made of a material that produces electrons when struck by photons. A series of electrodes collects the generated charge and reads it out, usually through a single amplifier: each pixel is read by shifting its charge across neighboring pixels to the output. Because of the design of this amplification stage, CCDs usually have very low read noise. CCDs can also be designed with excellent dynamic range and good linearity over most of the detection range, which simplifies many measurements. However, since each pixel must be read out separately, readout time can become problematic when building particularly large array detectors with this architecture.
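To make those two points concrete, the toy calculation below estimates how long a full-frame serial readout takes when every pixel passes through a single output amplifier, and computes the dynamic range implied by a given full-well capacity and read noise (dynamic range in dB is 20·log10(full well / read noise)). The array size, pixel rate, full-well, and noise figures are assumptions for illustration, not specifications of any particular CCD.

```python
import math

# Serial readout time for a CCD that shifts every pixel through
# one output amplifier. Figures below are illustrative assumptions.
width, height = 4096, 4096     # pixel array dimensions (assumed)
pixel_rate_hz = 20e6           # single-amplifier read rate, 20 Mpixel/s (assumed)

readout_s = (width * height) / pixel_rate_hz
print(f"Full-frame readout: {readout_s:.2f} s")  # ~0.84 s per frame

# Dynamic range implied by full-well capacity and read noise.
full_well_e = 100_000.0        # electrons (assumed)
read_noise_e = 5.0             # electrons RMS (assumed)
dr_db = 20 * math.log10(full_well_e / read_noise_e)
print(f"Dynamic range: {dr_db:.1f} dB")          # ~86 dB
```

Note that readout time grows linearly with pixel count in this single-output architecture, which is why very large arrays built this way can be slow to read.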


In recent years, however, CMOS technology has advanced considerably, and CMOS sensors have become the most widely used technology, especially for small, lightweight "camera-on-chip" applications. Manufacturing improvements have helped reduce noise levels, and although few CMOS sensors match the dynamic range of CCD sensors, their power efficiency and cost-effectiveness usually compensate for this.