Choose The Right Sensors For Autonomous Vehicles

2022-06-02, by Ms. Vivian Sun

Identifying the right combination of sensors requires consideration of roles, capabilities, and limitations.

When the world’s first “motorwagen” was introduced in 1885, the notion that a car would one day drive itself was laughable. Today, assisted and autonomous vehicles are the reality of an age where digital sensors can outperform human ability to perceive motion, distance, and speed.

When used together, sensor technologies including cameras, lidar, radar, and ultrasonic sensors give vehicles a complete understanding of the world, enabling them to navigate safely with little or no human intervention.

But for engineers and designers, identifying the right combination of these sensors to satisfy the end user's needs (including safety, functional performance, and price) requires thoughtful consideration of each sensor type's roles, capabilities, and limitations.

Examples of sensor applications in vehicles include:

High-resolution digital cameras help a vehicle “see” its environment and interpret the world around it. When multiple cameras are installed around the vehicle, a 360° view allows the vehicle to detect objects in its proximity, like other cars, pedestrians, road markings, and traffic signs.

There are several types of cameras to consider for different design needs, including near-infrared (NIR) cameras, visible-spectrum (VIS) cameras, thermal cameras, and time-of-flight cameras. As with most sensors, cameras work best when used to complement one another.

Cameras are ideal for tasks such as maneuvering and parking, lane-departure warning, and detecting driver distraction.
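The 360° coverage described above can be roughed out with simple arithmetic: each camera contributes its horizontal field of view minus whatever overlap the image stitching needs. A minimal sketch (the field-of-view and overlap figures below are hypothetical, not from any particular camera):

```python
import math

def min_cameras(fov_deg: float, overlap_deg: float) -> int:
    """Minimum number of cameras for full 360-degree horizontal coverage.

    Each camera contributes (fov - overlap) degrees of unique
    coverage once adjacent views are stitched together.
    """
    effective = fov_deg - overlap_deg
    if effective <= 0:
        raise ValueError("overlap must be smaller than the field of view")
    return math.ceil(360.0 / effective)

# Hypothetical example: 120-degree cameras with 10 degrees of stitching
# overlap require four units for a full surround view.
print(min_cameras(120, 10))  # -> 4
```

In practice the count also depends on mounting positions and occlusion by the vehicle body, which is one reason surround-view systems often use more cameras than this lower bound suggests.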

Lidar stands for "light detection and ranging," and is a remote sensing technology that uses light pulses to scan an environment and produce a three-dimensional map of it. It works on the same principle as sonar, except lidar uses light instead of sound waves. In autonomous vehicles, lidar scans the surroundings in real time, allowing cars to avoid collisions.

Lidar is highly accurate at depth perception and at detecting the presence of an object. Because it supplies its own illumination, it can see at long distances and works just as well at night; heavy rain and fog, however, scatter the light pulses and degrade its performance. The dense point clouds it produces allow perception software to recognize and categorize objects, telling the difference between, say, a squirrel and a stone, and predicting behavior accordingly.
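Lidar ranging itself reduces to a time-of-flight calculation: the distance to a target is half the pulse's round-trip time multiplied by the speed of light. A minimal sketch (the timing value is illustrative, not from any particular sensor):

```python
SPEED_OF_LIGHT = 3.0e8  # metres per second, approximate

def lidar_range_m(round_trip_s: float) -> float:
    """Distance to a target from a lidar pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is
    half of speed times time.
    """
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# A pulse that returns after 1 microsecond indicates a target 150 m away.
print(lidar_range_m(1e-6))  # -> 150.0
```

The nanosecond-scale timing this requires is why lidar units pair the laser with very fast photodetector and timing electronics.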

Radar stands for "radio detection and ranging." This sensor emits short pulses of electromagnetic waves to detect objects in the environment. When the waves hit an object, they reflect back to the sensor, and the delay reveals the object's distance. In autonomous vehicles, radar is used to identify other vehicles and large obstacles.

Because it does not rely on light, radar performs well regardless of weather conditions and is most commonly used to enable adaptive cruise control and collision-avoidance systems.
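Adaptive cruise control also needs the target's relative speed, which radar can read directly from the Doppler shift of the returned wave: speed equals frequency shift times wave speed divided by twice the carrier frequency. A sketch under assumed values for a 77 GHz automotive radar (the frequency-shift figure is hypothetical):

```python
SPEED_OF_LIGHT = 3.0e8  # metres per second, approximate

def doppler_speed_mps(freq_shift_hz: float, carrier_hz: float) -> float:
    """Relative radial speed of a target from the Doppler shift
    of a reflected radar wave (positive means closing)."""
    return freq_shift_hz * SPEED_OF_LIGHT / (2.0 * carrier_hz)

# Hypothetical: a 10.26 kHz shift on a 77 GHz carrier is about 20 m/s
# (72 km/h) of closing speed.
speed = doppler_speed_mps(10_260, 77e9)
print(f"{speed:.1f} m/s")  # -> 20.0 m/s
```

This direct velocity measurement is a key advantage radar holds over cameras, which must infer speed from frame-to-frame position changes.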

While radar uses radio waves and lidar uses light pulses, ultrasonic sensors evaluate objects in an environment by sending out short ultrasonic pulses that reflect back to the sensor. They are very cost-effective, excellent at detecting solid obstacles, and are typically mounted in car bumpers to alert drivers to hazards while parking. For best results in assisted-driving applications, ultrasonic sensors are commonly combined with cameras.
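Ultrasonic ranging works the same way as lidar ranging but with the speed of sound, which is why these sensors suit the short distances involved in parking. A minimal sketch (sound speed in air at roughly 20 °C; the echo time is hypothetical):

```python
SPEED_OF_SOUND = 343.0  # metres per second in air at ~20 degrees C

def ultrasonic_range_m(echo_time_s: float) -> float:
    """Distance to an obstacle from an ultrasonic pulse's round-trip time."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

# An echo arriving after about 5.83 milliseconds puts the obstacle
# roughly one metre away.
print(round(ultrasonic_range_m(5.83e-3), 2))  # -> 1.0
```

Because sound travels about a million times slower than light, the millisecond-scale echo times are easy to measure with inexpensive electronics, which keeps these sensors cheap.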

Interesting fact: Many of the best ultrasonic sensors are found in nature. Bats, dolphins, and narwhals all use ultrasonic waves to identify objects (echolocation).

While individual sensors each have their strengths, the interaction of sensor information makes assisted driving possible. And as vehicles move towards total independence, choosing the right combination of sensors becomes even more critical to achieving the safety standards required for autonomy.

For the highest level of safety and performance, sensor fusion across cameras, radar, lidar, and ultrasonic sensors maximizes each sensor type's strengths while compensating for the others' weaknesses. For example, lidar alone provides poor results for lane tracking, but the combination of lidar and camera is very effective at this function.
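One simple way to combine readings from several sensors is inverse-variance weighting: each sensor's estimate is weighted by how precise it is, so a noisy camera depth estimate refines, rather than corrupts, a precise lidar range. Production systems use Kalman filters and richer models, and the variances below are hypothetical, but the sketch shows the core idea:

```python
def fuse_estimates(estimates):
    """Fuse independent (value, variance) measurements of one quantity
    by inverse-variance weighting. Returns (fused_value, fused_variance).
    The fused variance is never worse than the best single sensor's."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused_value = sum(w * val for w, (val, _) in zip(weights, estimates)) / total
    return fused_value, 1.0 / total

# Hypothetical range-to-vehicle readings as (metres, variance in m^2):
camera = (41.0, 4.0)   # cameras estimate depth least precisely here
radar = (40.2, 0.25)   # radar ranges precisely
lidar = (40.0, 0.04)   # lidar is the most precise of the three
value, variance = fuse_estimates([camera, radar, lidar])
print(round(value, 2), round(variance, 3))
```

The fused estimate lands close to the lidar reading, as expected, while the fused variance drops below even lidar's own, which is the quantitative payoff of fusion.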

The right combination depends on several factors, including the vehicle's safety requirements, its functional performance targets, and its price point.

To ensure prototype vehicles are ready for the real world, sensor designs must be validated against an immense variety of test cases.

By providing realistic, physics-based sensor responses in real time for camera, lidar, radar, and ultrasonic sensors, the Ansys simulation platform gives engineers the information needed to validate the safety of their autonomous driving system designs.

Use Ansys simulations at the beginning of your design exploration to accurately see how each combination of sensors will perform in the real world. Then, based on your goals, you can evaluate the right sensor combination for your project.
