Human-like sensing and perception for autonomy

No way around that. It’s a requirement to enable the autonomous future we want.

And let’s be honest: we’re not there yet.

Yes, we’ve made tremendous progress, but full autonomy (the real stuff, SAE Level 5, the world we all dream about) is still a moving target, seemingly out of reach, always a few years away. Many believe that with more compute, more training data, and more powerful sensors, we can brute-force our way to a solution. But that’s not the way forward.

The existing path will not get us there.

Why? Because the architecture of today’s sensing and machine perception stacks does not, and cannot, understand the world in the same rich, meaningful, interconnected way that the human brain does. The problem with the existing approach is not inadequate computational resources, an insufficient number of sensors, or limited sensor power.
The problem is one of method and approach.

Something fundamental is amiss.

Before we can build safe, truly autonomous vehicles, we need to rethink sensing and perception.

This is precisely what we’re doing at Perceptive.

[Figure: lemniscate (infinity symbol) labeled “Perception” and “Sensing”]

A unified solution for sensing and perception.

In humans, sensing and perception are one and the same: sensing is perceiving. For autonomous vehicles to understand the world as humans do – to maneuver in unfamiliar environments, to make decisions based on intuition – they need to perceive the world as humans do: make split-second decisions like we do, and navigate the world like we do.
