MVIS 2024 Q3 AI Chat

Sumit Sharma Blog on LinkedIn: Camera-Only Systems Aren’t the Answer: Why Consumer Safety Requires More

 

By Sumit Sharma, CEO of MicroVision 

“Vision is the art of seeing what is invisible to others.” That quote from Jonathan Swift rings especially true after reading the NHTSA’s most recent report on crashes involving vehicles equipped with SAE Level 2 Advanced Driver Assistance Systems (ADAS). The report highlights the need both for clearer reporting of crash data and for continued development of leading-edge ADAS features that avoid collisions and improve safety. True vision, in the case of ADAS, means a vehicle can “see” in real time and take rapid, proactive action faster than a human ever could. That’s a tall order for camera-based systems, as the report makes clear.

Cameras are not enough.

Data from the NHTSA report indicated that many of the reported crashes involved camera-only technology, illustrating that cameras are simply not enough. Just because cameras have been popularized through extensive use in early electric vehicles doesn’t mean they should be the standard. For all their low cost and small size, cameras do not provide the level of data, range, and resolution that a lidar solution does, and they don’t deliver on the promise of safety that drivers need and expect. Here are some issues that arise from a camera-only system:

  • Camera-only systems are limited to a two-dimensional view of a three-dimensional world
  • Camera-only systems fail at highway speeds
  • Camera-only systems struggle to perform in rain and fog
  • Camera-only systems are unreliable in changing lighting conditions
  • Current camera module technology is not suitable for long ranges and wide fields of view
  • Camera-only systems require extensive training on object classification and significant computing power
  • Camera-only systems lose track of moving objects on the road, for instance when a vehicle becomes obscured or changes lanes

Lidar makes the invisible visible. 

The very nature of lidar excels at making the invisible visible: it can “see” farther, faster, and in real time. A lidar system sends out laser pulses and processes the returns to detect obstacles, pedestrians, other vehicles, and more, giving a vehicle’s ADAS near-instant distance and object measurements. Just as important, lidar performs more accurate three-dimensional mapping of a vehicle’s surroundings and processes that information quickly and predictably. And lidar works well across lighting conditions, from direct sunlight to complete darkness, while accurately measuring distance regardless of road composition.
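The “near-instant readings” come from simple time-of-flight geometry: a laser pulse travels to the target and back at the speed of light, so range is half the round trip. A minimal sketch (the function name and the numbers are illustrative, not from any MicroVision product):

```python
# Time-of-flight range estimate: a lidar pulse travels to the target
# and back at the speed of light, so range = c * t / 2.
C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(round_trip_s: float) -> float:
    """Return target distance in meters for a measured round-trip time."""
    return C * round_trip_s / 2.0

# A target roughly 250 m ahead returns the pulse in under 2 microseconds,
# which is why lidar readings feel instantaneous to an ADAS.
print(round(range_from_round_trip(1.668e-6), 1))
```

Even at highway closing speeds, the scene barely moves during one pulse’s flight, so each measurement is effectively a snapshot.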

Beyond three-dimensional mapping, dynamic range, and robustness to lighting conditions, lidar has the edge in many of the scenarios where camera-only systems struggle. Lidar provides:

  • Instant velocity of moving objects, not only relative to the vehicle but also relative to objects around it
  • An ultra-high-resolution point cloud showing drivable and non-drivable areas of the road ahead without needing to classify objects
  • Dynamic views at high speeds
  • Consistently faster response times
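One way to picture the “drivable without classifying objects” point: with a 3D point cloud, anything protruding above the road plane is an obstacle by geometry alone, so no recognition model is needed. A toy sketch under that assumption (the cell size, height threshold, and sample points are invented for illustration):

```python
# Toy drivable-area check on a 3D point cloud: any point rising more
# than a height threshold above the road plane marks its ground cell
# as non-drivable -- no object classification involved.
from collections import defaultdict

def drivable_cells(points, cell_size=0.5, max_height=0.15):
    """points: iterable of (x, y, z) in meters, z measured from the road plane.
    Returns {(cell_x, cell_y): bool} -- True if the cell looks drivable."""
    cells = defaultdict(lambda: True)
    for x, y, z in points:
        key = (int(x // cell_size), int(y // cell_size))
        if z > max_height:
            cells[key] = False   # something sticks up here: not drivable
        else:
            cells[key]           # touch the key so flat cells are recorded
    return dict(cells)

cloud = [(0.1, 0.1, 0.02),   # road surface
         (0.2, 0.3, 0.01),   # road surface
         (1.2, 0.1, 0.90)]   # debris or a tire in the lane
print(drivable_cells(cloud))
```

The point is the absence of a classifier: the system doesn’t need to know *what* the 0.9 m return is, only that the lane ahead isn’t flat road.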

Perfecting the art of vision.

Perfecting the art of vision, whether for Level 2 ADAS or fully autonomous vehicles, will require a solution that combines cameras, radar, and lidar. It will be a solution that delivers dynamic views at high speeds, covering short-, medium-, and long-range views to produce an ultra-high-resolution point cloud showing drivable and non-drivable areas of the road ahead. And it will enable ADAS to respond quickly and take action every single time, no matter the lighting, weather conditions, or speed.
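The camera-plus-radar-plus-lidar argument is, at heart, a redundancy argument: independent sensors cross-check each other, and a fused estimate can weight each sensor by how much it can be trusted under current conditions. A deliberately simplified sketch of that idea (the weights and readings are invented; production fusion uses probabilistic filters such as Kalman filters, not a bare weighted average):

```python
# Simplified sensor fusion: weight each sensor's range estimate by a
# confidence score and take the weighted average. Real ADAS stacks use
# probabilistic filters (e.g. Kalman filters); this only shows the idea.
def fused_range(readings):
    """readings: list of (range_m, confidence) pairs, confidence > 0."""
    total_weight = sum(conf for _, conf in readings)
    return sum(r * conf for r, conf in readings) / total_weight

# At night in rain, the camera's confidence drops while lidar and radar
# still contribute strongly (all numbers are illustrative).
readings = [(49.0, 0.2),   # camera, low confidence in the dark
            (50.2, 0.9),   # lidar
            (50.0, 0.7)]   # radar
print(round(fused_range(readings), 2))
```

A degraded camera then pulls the fused estimate only slightly, rather than blinding the system outright, which is the practical meaning of “take action every single time.”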

Future NHTSA reports will hopefully reflect accurate and useful data that can be used to create safer, higher-performing ADAS technology. And when autonomous driving is ready to deploy, its AI will require data points not just from cameras but from lidar and radar in order to make the best decisions. For now, the takeaway from this report reflects what MicroVision has always believed: the safest, most accurate, and highest-performing ADAS needs an exceptional lidar system in place to deliver a complete picture of the road ahead at highway speeds and in all weather conditions.
