Is AI the Solution to Robotaxi Woes?

It's no secret that robotaxis haven't gotten off to a great start this year. One only has to look at how things have unraveled in San Francisco (where a robotaxi ran someone over) to see that Level 4 autonomous driving tech is not the golden goose people hoped it would be when it first launched.

So, what's the solution for the tech companies and investors throwing eye-watering sums of money into this floundering tech?

Keep throwing money at it and add artificial intelligence (AI) to the mix, of course.

But Will AI Be Able To Improve Autonomous Vehicle Tech?

It sounds like a great idea if you're a tech startup or a ridesharing company struggling to reach profitability. However, you might be scratching your head if you're just a regular person. So the solution for autonomous tech that can't adequately make real-world driving decisions is just more autonomous technology?

That's exactly what they're doing.

Tech developer Ghost Autonomy has plans to use multi-modal large language models (MLLMs) to enhance Level 4 autonomous driving systems.

Ghost Autonomy's founder and CEO, John Hayes, believes coupling AI with Level 4 autonomous driving systems will be a “holy grail” caliber “breakthrough.” According to Hayes, it will enable “everyday consumer vehicles to reason about and navigate through the toughest scenarios” because “LLMs have already proven valuable” for “data labeling and simulation.”

In layman's terms, that means that because a chatbot can write a B-minus caliber high school English paper, it will be able to improve Level 4 autonomous driving technology drastically.

For a quick refresher, robotaxis operating in San Francisco have caused many issues because these autonomous vehicles don't know how to properly navigate the “complex” driving scenarios that come with encountering active emergency vehicles or emergency scenes.

Many videos have popped up across the internet and social media of police officers or firefighters being forced to smash autonomous vehicles' windows to move them out of intersections after the cars became confused and froze in the middle of the road.

Additionally, autonomous vehicles have repeatedly blocked emergency vehicles on roadways and driven through emergency scenes that were taped off by the police. Beyond emergency scenes, Level 4 autonomous vehicles also have issues recognizing verbal commands from emergency services, hand signals from other motorists, and less-than-perfect road markings.

So, unless driving and traffic conditions are ideal, Level 4 autonomous vehicles have not proven themselves up to the task of safely navigating busy urban thoroughfares. The situation in San Francisco became so problematic that the California DMV suspended Cruise's permits to operate driverless robotaxis in the city.

Ghost Autonomy seeks to make a difference by using its MLLMs to enhance a Level 4 autonomous vehicle's ability to interpret and navigate what's happening around it on the road. Apparently, the “complex” real-world traffic and emergency scenarios unfolding in real time, where public safety is at stake, are similar enough to “data labeling and simulation” that Ghost Autonomy believes its MLLMs can be the solution to all these autonomous driving woes.

The Solution May Not Be So Simple, Though

While it's an intriguing idea in theory, there's no guarantee that MLLMs can help autonomous vehicles more effectively “interpret visual data” about the traffic situations around them from their audio, radar, lidar, and video feeds.

Whether or not the use of MLLMs proves successful, the reality is that remote human drivers will still need to be on standby to intervene when necessary. On the bright side, this would create new jobs.

On the other hand, if a human driver has to intervene every time a “complex” situation unfolds on the road, it calls into question whether the vast sums of money being poured into developing all this autonomous technology are worth it.

There's an apt metaphor here involving an antiquated mode of transportation that says the four-legged animal pulling the wooden cart can't be behind it.

Considering that, even with AI, human intervention could still be necessary for autonomous vehicles to operate safely, we wonder whether all these tech companies pouring money into rapidly expanding autonomous driving tech and services shouldn't stop and take a long, hard look at a horse standing next to a wagon.

There's a reason that metaphor lives on while that mode of transportation died out long ago.

Source: Autoweek.