Researcher aims to help autonomous vehicles make faster, safer decisions in real time

“Shared perception” system uses AI to improve reliability and efficiency of self-driving vehicle technology.

EDMONTON — Imagine a child jumping out in front of your car as you drive down a residential street. Now count to three. That’s how long it could take a self-driving car with insufficient data — confused over whether to brake or swerve — to turn control over to the driver. By then it’s too late to react.

In dynamic urban environments, three seconds is an eternity, says Ehsan Hashemi, an expert on autonomous navigation in the University of Alberta’s Faculty of Engineering.

“It’s called a handover situation, when the autonomous driving system realizes it’s not capable of driving the vehicle,” he says. “The current standard transition of between three and 10 seconds is huge, and really scary.”

The ability of an autonomous vehicle to quickly perceive and understand the scene around it remains a major hurdle to safe commercialization of the technology, says Hashemi, mainly because of computer processing constraints and uncertainties in the environment.

He and his team aim to dramatically improve that processing speed by limiting the information a vehicle receives to what is most relevant. That allows them to set the processing budget, including decision making, at about 20 milliseconds per cycle, so vehicles can refresh available data and make faster, safer decisions in real time.
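The release doesn’t describe the software design, but a 20-millisecond refresh target suggests a fixed-budget perceive-decide-act loop. The Python sketch below is only an illustration of that idea; get_relevant_objects, decide and act are hypothetical stand-ins, not names from Hashemi’s system.

```python
import time

CYCLE_BUDGET_S = 0.020  # the ~20 ms refresh target described above

def perception_decision_loop(get_relevant_objects, decide, act):
    """Run one perceive-decide-act cycle within each ~20 ms budget."""
    while True:
        start = time.monotonic()
        objects = get_relevant_objects()  # data already filtered to what's relevant
        command = decide(objects)         # e.g. brake, steer or continue
        act(command)
        elapsed = time.monotonic() - start
        if elapsed < CYCLE_BUDGET_S:
            # Sleep off the remaining budget so the loop ticks at ~50 Hz
            time.sleep(CYCLE_BUDGET_S - elapsed)
```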

Hashemi’s “shared perception” system combines on-board sensors, which read the vehicle’s immediate surroundings and track moving objects, with remote sensing units: cameras and light detection and ranging (LIDAR) sensors mounted on nearby fixed objects, such as lampposts and buildings at intersections, that provide a much wider view.

Together, they provide a more comprehensive view of the vehicle’s environment, catching obstructed or unexpected moving objects and reducing the need to hand control over to the driver. If a child chases a ball across a residential street, a remote sensor could detect that movement from as far as 100 meters away, before the car even enters the scene.

“This would allow for proactive measures, such as braking or steering ahead of time, rather than waiting for a driver to see a moving object and understand the circumstances,” he says.
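The release doesn’t say how the two sensor streams are merged. As a loose illustration, assuming all detections are reported in a common map frame, a shared view could be assembled by keeping the vehicle’s own tracks and adding roadside detections it cannot yet see. Detection, shared_view and merge_radius are invented for this sketch.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """A tracked object reported in a shared map frame (assumed convention)."""
    x: float    # position, metres
    y: float
    vx: float   # velocity, metres per second
    vy: float

def shared_view(onboard, roadside, merge_radius=1.0):
    """Keep every on-board detection; add any roadside detection that
    no on-board sensor already covers (within merge_radius metres)."""
    merged = list(onboard)
    for r in roadside:
        covered = any(
            (r.x - o.x) ** 2 + (r.y - o.y) ** 2 < merge_radius ** 2
            for o in onboard
        )
        if not covered:
            merged.append(r)  # e.g. a child occluded from the car's cameras
    return merged
```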

A key innovation in Hashemi’s work is the filtering of information using artificial intelligence. Rather than accounting for everything in a vehicle’s field of view, as would a human driver, the system’s algorithm targets only relevant data — such as objects moving in its path and important features like traffic lights and signs.
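The press release doesn’t detail the filtering algorithm, so the sketch below shows just one plausible form of the idea: discard everything except objects that are in, or predicted to enter, a corridor around the vehicle’s planned path. It reuses the Detection objects from the earlier sketch; near_path, margin and horizon_s are assumptions, and a real system would also retain fixed features like traffic lights and signs.

```python
import math

def near_path(path, point, margin):
    """Coarse corridor test: is `point` within `margin` metres of any
    waypoint on the planned path? (A stand-in for a real geometric check.)"""
    px, py = point
    return any(math.hypot(px - wx, py - wy) <= margin for wx, wy in path)

def filter_relevant(detections, path, margin=2.0, horizon_s=3.0):
    """Keep objects in, or predicted to enter, a corridor around the
    planned path, assuming constant velocity over horizon_s seconds."""
    kept = []
    for d in detections:
        now = (d.x, d.y)
        later = (d.x + d.vx * horizon_s, d.y + d.vy * horizon_s)
        if near_path(path, now, margin) or near_path(path, later, margin):
            kept.append(d)
    return kept
```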

Hashemi's team has received funding to test its system on the U of A’s North Campus. He hopes to expand the research in the future through a potential pilot project with the City of Edmonton.

To arrange an interview or opportunities to gather visuals, please contact:

Ross Neitz | U of A media strategist | ross.neitz@ualberta.ca | 780-297-8354