iDAR’s biomimicry enables autonomous perception engineers to create situationally specific scan patterns that can search a scene 4x-5x faster than the human eye. This scanning speed is matched by superior spatial coverage: iDAR breaks a scene down into Dynamic Vixels, a data type unique to iDAR that combines 3D (X, Y, Z) and color (R, G, B) data.
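AEye has not published the internal layout of a Dynamic Vixel; as a minimal sketch, the idea of pairing each 3D point with a color value could look like the following (all names here are hypothetical, for illustration only):

```python
from dataclasses import dataclass

@dataclass
class DynamicVixel:
    """Hypothetical sketch of a fused 3D + color sample.

    Illustrates the concept of combining X, Y, Z position data
    with R, G, B color data in a single record; not AEye's
    actual data format.
    """
    x: float  # 3D position (e.g., meters, sensor frame)
    y: float
    z: float
    r: int    # color channels, 0-255
    g: int
    b: int

# A scene could then be represented as a collection of such samples:
scene = [DynamicVixel(12.4, -0.8, 1.5, 200, 180, 90)]
print(scene[0].z)  # -> 1.5
```

Keeping geometry and color in one record is what lets downstream perception code reason about both at once, rather than fusing separate LiDAR and camera streams after the fact.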
By finding and locating objects as fast as or faster than a human, iDAR enables perception systems that can intelligently classify and track objects at unprecedented rates, including the unique ability to calculate the vector and velocity of each object within a frame. Much of this can be done at the sensor level within the same frame, bypassing the hundreds of milliseconds of latency seen in currently deployed systems. This ability to modulate both spatial and temporal dimensions simultaneously, as humans do, is what is needed to achieve Level 4 and Level 5 autonomy.
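iDAR's exact in-sensor method is not public, but as a rough illustration of the underlying idea, an object's velocity vector and speed can be estimated from two position samples taken a known interval apart (function and parameter names here are assumptions, not AEye's API):

```python
import math

def estimate_velocity(p0, p1, dt):
    """Estimate a velocity vector (vx, vy, vz) and scalar speed
    from two (x, y, z) positions sampled dt seconds apart.

    Illustrative only: a simple finite-difference estimate, not
    the in-sensor, single-frame calculation the article describes.
    """
    vx, vy, vz = ((b - a) / dt for a, b in zip(p0, p1))
    speed = math.sqrt(vx**2 + vy**2 + vz**2)
    return (vx, vy, vz), speed

# An object that moved 1.5 m laterally over half a second:
v, s = estimate_velocity((10.0, 0.0, 0.0), (10.0, 1.5, 0.0), 0.5)
print(v, s)  # -> (0.0, 3.0, 0.0) 3.0
```

The point of doing this at the sensor level, per the article, is latency: deriving the estimate within the same frame avoids the round trip through a separate fusion stage.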
“iDAR is based on a revolutionary new agile LiDAR design that allows autonomous vehicles to perceive far beyond the limits of human perception,” said Blair LaCorte, AEye’s Chief of Staff. “This powerful software-driven sensor system allows vehicle perception engines to actively interrogate their environment to identify the precise information they need at speeds that will radically improve safety.”
On Monday afternoon in Mountain View, CA, Mr. LaCorte and Dr. James Doty, Clinical Professor of Neurosurgery at Stanford University, will present a discussion on “Making Sense of the Sensor: Applying Biomimicry to Vehicle Autonomy.” In this session, they will explore why the human brain and visual cortex are the ideal models for autonomous perception, and how their performance could best be replicated with existing sensor technologies.
“AEye has taken some of the most elegant lessons from human brain science and combined them with cutting-edge technology,” said Doty. “This integration created something that I believe will allow autonomous vehicles to process data like a computer but perceive like a human.”