Top two technology challenges for autonomous vehicle image sensors

A new Smithers Apex state-of-the-art report identifies the top technology challenges facing autonomous vehicle image sensors to 2023.

The automotive market is demanding a wide span of semiconductor-based developments, driven by the rapid advance towards automated driving systems (ADS) and advanced driver-assistance systems (ADAS). These require powerful, compact technologies based on vision, night vision and time-of-flight 3D sensing using lasers and radar, all able to operate across a wide range of temperatures.

As the automotive industry moves towards higher levels of ADAS integration and autonomy, the automotive semiconductor content will rise.

As per the internationally accepted standard SAE J3016, the level of driving automation of a road vehicle is categorised in six levels:

  • Level 0: No automation: the human driver controls everything (steering, throttle, brakes, etc.)
  • Level 1: Driver assistance: an assistance system supports either steering or speed control (e.g. adaptive cruise control)
  • Level 2: Partial automation: the system controls both steering and speed (e.g. cruise control with lane centring), while the driver must monitor it at all times
  • Level 3: Conditional automation: the system monitors the driving environment under certain circumstances; the driver must be ready to intervene when requested
  • Level 4: High automation: safety-critical functions are shifted entirely to the vehicle, though only within certain roads, weather conditions and road conditions
  • Level 5: Full automation: driver-free operation under all conditions

Each level sets different requirements on the sensor systems of a vehicle. The transition from Level 2 to Level 3 constitutes a major breakpoint, as from Level 3 onwards the sensor systems must monitor their own operation. The Audi A8 was pivotal in this development, being the first production car designed for Level 3 operation.
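The Level 2 to Level 3 breakpoint can be sketched as a small lookup of who monitors the driving environment at each level. This is an illustrative sketch; the table structure and function names are my own, not from the report:

```python
# Sketch: SAE J3016 levels and who monitors the driving environment.
# From Level 3 the system, not the driver, monitors the environment,
# so the sensor systems must also monitor their own operation.
SAE_LEVELS = {
    0: ("No automation", "driver"),
    1: ("Driver assistance", "driver"),
    2: ("Partial automation", "driver"),
    3: ("Conditional automation", "system"),
    4: ("High automation", "system"),
    5: ("Full automation", "system"),
}

def needs_self_monitoring_sensors(level: int) -> bool:
    """True from Level 3 onwards, where the system monitors the environment."""
    return SAE_LEVELS[level][1] == "system"

print(needs_self_monitoring_sensors(2))  # False
print(needs_self_monitoring_sensors(3))  # True
```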

Smithers Apex has identified the top 20 technology challenges facing autonomous vehicle sensors to 2023 with ‘compact and durable sensor package’ and ‘thermal stability’ ranking joint top.

Compact and durable sensor package

Building a sensor into a camera housing is a challenge for developers. Size is crucial for the camera, though the demand is not as extreme as in the mobile phone space. Ultimately, pixel size and pixel count determine the optical diagonal, which in turn sets the housing size and, in parallel, the diameter of the optics.
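The relationship between pixel geometry and the optical diagonal can be sketched in a few lines; the pixel pitch and resolution below are illustrative assumptions, not figures from the report:

```python
import math

def optical_diagonal_mm(pixel_pitch_um: float, width_px: int, height_px: int) -> float:
    """Optical diagonal of the active pixel array in millimetres."""
    diag_px = math.hypot(width_px, height_px)  # diagonal in pixels
    return pixel_pitch_um * diag_px / 1000.0   # um -> mm

# Example: an assumed 3.0 um pixel at 1920x1080
d = optical_diagonal_mm(3.0, 1920, 1080)
print(f"optical diagonal: {d:.2f} mm")  # optical diagonal: 6.61 mm
```

The optics must cover at least this diagonal, which is why pixel size and count propagate directly into the housing and lens diameter.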

The trend of putting more and more functionality on the sensor is long gone. Today, an external image signal processor (ISP) or similar processing unit receives the raw pixels; colour processing is one of its main tasks. This leaves the actual silicon with plenty of space for the pixel array. The bare die is the smallest use case and requires only minimal housing.

The current trend towards global-shutter pixel sensors and column-parallel analogue-to-digital converters (ADCs) again creates an area requirement. This sensor class has its own ADC per pixel column, and on some sensors this complete unit sits only along one longitudinal edge, at the top or bottom. Other sensors use black reference pixels on the left side of the array. The result is the same: no geometrical centring. Since the bare die typically sits centrally in the housing, these restrictions shift the optical centre out of the sensor package centre, so the sensor must be placed off-centre in the camera design. This complicates the rest of the construction considerably, because seals to the optics cannot rest completely on the sensor cover glass.
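The resulting offset of the optical centre can be illustrated with simple geometry. All dimensions below are assumed values for illustration, and the single-edge ADC strip plus left-side reference columns follow the layout described above:

```python
def optical_centre_offset_mm(array_w: float, array_h: float,
                             adc_strip_h: float, black_ref_w: float):
    """
    Offset of the optical (active array) centre from the die centre when a
    column-ADC strip runs along the bottom edge and black reference pixels
    occupy the left edge. All dimensions in mm. Returns (dx, dy).
    """
    die_w = array_w + black_ref_w
    die_h = array_h + adc_strip_h
    # Active array centre measured from the die's lower-left corner
    array_cx = black_ref_w + array_w / 2
    array_cy = adc_strip_h + array_h / 2
    return (array_cx - die_w / 2, array_cy - die_h / 2)

# Assumed: 6 x 4 mm array, 0.5 mm ADC strip, 0.2 mm reference columns
dx, dy = optical_centre_offset_mm(6.0, 4.0, 0.5, 0.2)
print(dx, dy)  # 0.1 0.25 -- half the strip widths, as geometry dictates
```

Even fractions of a millimetre matter here, since the optics and seals are designed around the package centre, not the array centre.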

Thermal stability

The overall camera system has enormous life expectancy challenges to deal with. The temperature has a significant impact on this life expectancy and thus failure rate.

Any image sensor fitted to a vehicle must continue to function across a range of challenging environmental conditions. The sensor itself, however, is also a source of heat. In principle, it is simple: the camera, and especially the sensor, must not be exposed to large fluctuations and extremes. Unfortunately, in winter regions temperatures below -20°C can occur overnight, while on very sunny days in Europe air temperatures can exceed 45°C. In direct sunshine, a camera in a digital exterior mirror can heat up to well over 100°C due to radiant heat. A smartphone issues a warning message and denies functionality if it is left in the sun; a camera designed for an ADAS system can report its status, but cannot deny functionality.

Each image sensor delivers its best image quality at a given temperature and given read speed. Outside this range, the sensor is still functional, but with limited image quality or even visible artefacts in the picture.

A cold camera can also be heated at start-up; the vehicle interior and the heating quickly bring it to the recommended operating temperature. When a car must start at -40°C, some assistance systems are initially only partially functional.

At lower temperatures, image quality is usually better, and this should be the design goal. At high temperatures, heat generates additional electrons, which then appear as bright pixels in the image, distorting the original optical information.

The dark current in a pixel doubles with every 7°C rise in temperature. This means the black level increases, and some pixels become extra bright without any exposure: the so-called warm or hot pixels. The effect becomes stronger with longer exposure times. A hot camera generates more image noise. When a car drives with a hot camera in sunshine and then enters a dark tunnel, where it has to recognise a traffic sign or lane marking with a longer exposure time, the increased noise in the image can negatively affect the image processing and possibly lead to incorrect interpretation of a scene.
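The doubling rule can be written as a simple scaling law. The reference temperature and normalised reference current below are illustrative assumptions:

```python
def dark_current(temp_c: float, ref_current: float = 1.0,
                 ref_temp_c: float = 25.0, doubling_c: float = 7.0) -> float:
    """Dark current relative to a reference value, doubling every
    `doubling_c` degrees Celsius (7°C per the rule above)."""
    return ref_current * 2 ** ((temp_c - ref_temp_c) / doubling_c)

# A sensor at 60°C vs. 25°C: (60 - 25) / 7 = 5 doublings
print(dark_current(60.0))  # 32.0 -- thirty-two times the reference dark current
```

Five doublings over a 35°C rise shows why a camera baking in the sun accumulates visible hot pixels during the longer exposures a dark tunnel demands.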

Sensors have control loops that correct the internal reference value for black in the picture, and they generate different clock signals for operation and output. These control loops are always temperature-dependent.

The shielded pixels are used as an internal reference offset for each row and column, to correct non-uniformities. When the temperature drifts, this reference may need to be readjusted. Under heat, the internal zero reference may rise so high that the output pixel value no longer reaches saturation, restricting the in-picture dynamic range. For colour sensors with active colour balance, false colours then occur in bright image areas.
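The loss of headroom when the black reference rises can be put in numbers. The bit depth and black-level values below are assumed for illustration:

```python
def usable_output_codes(full_scale: int, black_level: int) -> int:
    """Output codes remaining between the black reference and saturation."""
    return max(full_scale - black_level, 0)

# Assumed 12-bit sensor (4095 full scale): a black level drifting from
# 64 codes (cold) to 512 codes (hot) eats ~11% of the output range.
cold = usable_output_codes(4095, 64)   # 4031 codes
hot = usable_output_codes(4095, 512)   # 3583 codes
print(cold, hot)
```

This is the "restricted in-picture dynamics" effect: the same scene is squeezed into fewer output codes, and any colour balance applied on top can push bright areas into false colours.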

The internal signal timing in the sensor can change with temperature: as it gets hotter, the internal signals become slower and noisier. Direct sunlight generates temperatures of well over 100°C in an active camera. The big challenge remains to dissipate this intrinsic heat, as well as ensuring there are no mechanical shifts, for example in the optical path. Cameras mounted externally, for example in the side mirrors, are especially challenging. In cold and rainy conditions these are heated, with an element to keep the optics clear and preheat the electronics; in sunlight, only the airstream during driving helps to cool them down.

If the sensor detects via a built-in temperature sensor that the upper operating temperature is exceeded, a warning message can be issued. Optionally, the frame rate can then be halved, and unnecessary function blocks switched to standby in the pause between two images, reducing the energy requirement and heat development.
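The staged response described above could be sketched as a small policy function. The threshold, base frame rate and field names are assumptions for illustration, not a vendor API:

```python
def thermal_policy(temp_c: float, base_fps: int = 60,
                   warn_c: float = 95.0) -> dict:
    """Sketch of a staged thermal response: issue a warning, halve the
    frame rate, and put idle function blocks into standby between frames."""
    if temp_c <= warn_c:
        return {"warning": False, "fps": base_fps, "standby_idle_blocks": False}
    return {"warning": True, "fps": base_fps // 2, "standby_idle_blocks": True}

print(thermal_policy(70.0))   # normal operation at full frame rate
print(thermal_policy(105.0))  # warning, 30 fps, idle blocks in standby
```

Halving the frame rate lengthens the inter-frame pause, which is exactly where the standby blocks save power and therefore heat.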

At high temperatures more and more visible pixel defects appear, so a dynamic correction of these anomalies is highly beneficial. Static tables initialised in the sensor during production are only of limited help.
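One common form of dynamic correction is median replacement of outlier pixels. This is a minimal sketch, not a production ISP stage; the threshold is an assumed value:

```python
def correct_hot_pixels(frame, threshold=200):
    """Replace interior pixels that sit far above the median of their
    eight neighbours. `frame` is a 2D list of integer pixel values."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neigh = sorted(frame[y + dy][x + dx]
                           for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                           if (dy, dx) != (0, 0))
            med = (neigh[3] + neigh[4]) / 2  # median of the 8 neighbours
            if frame[y][x] - med > threshold:
                out[y][x] = int(med)
    return out

# A single hot pixel in a flat dark frame is pulled back to its surroundings
dark = [[10] * 5 for _ in range(5)]
dark[2][2] = 900
fixed = correct_hot_pixels(dark)
print(fixed[2][2])  # 10
```

Because the correction runs on live frames, it catches pixels that only turn hot at temperature, which a static production-time table cannot.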

Temperature can shift the timing of the serial pixel outputs. Here, interfaces with an embedded clock have clear advantages over, for example, a parallel interface.

In the medium term, a pixel structure that is largely temperature-stable is desirable. This is especially important for the storage node of the global-shutter pixel.

The physical effects should be fully compensated by the surrounding correction circuits on the sensor across the entire usage range; the goal is a homogeneous picture in all areas.

A sensor can consume up to 1.5W depending on the operating mode, with frame rate and bit depth the decisive factors. This power is converted entirely into heat, which must then be dissipated from the silicon to the environment.
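The steady-state temperature rise that this heat produces follows from the package's junction-to-ambient thermal resistance. The thermal resistance value below is an assumed figure for illustration, not sensor data from the report:

```python
def junction_temp_rise_c(power_w: float, theta_ja_c_per_w: float) -> float:
    """Steady-state die temperature rise above ambient: dT = P * theta_JA."""
    return power_w * theta_ja_c_per_w

# 1.5 W into an assumed 30 °C/W junction-to-ambient resistance
print(junction_temp_rise_c(1.5, 30.0))  # 45.0 -- degrees above ambient
```

A 45°C rise on top of a 45°C summer ambient already approaches the critical region, which is why frame-rate throttling and standby modes matter.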

Autonomous Vehicle Image Sensors to 2023 – A state of the art report offers expert analysis into the future of image sensors.