Erin Linebarger
In the autonomous vehicle industry, we are often asked, “I want to make my {insert vehicle type here} autonomous… how much does your autonomy kit cost?” A “plug and play” autonomy kit that could be installed in any vehicle, without modification, and make that vehicle autonomous would be awesome.
Unfortunately, no such technology exists and likely never will. The reason is simple: different vehicle types and use cases require different sensors, sensor configurations, electrical and mechanical interfaces, and capabilities. Autonomous systems must be individually crafted for a given use case.
So what exactly is an “autonomous system”? It can be described in many ways, but for our purposes here, it is sufficient to define it by its physical components, which consist of the following subsystems:
| Subsystem | Description |
| --- | --- |
| Drive-By-Wire Kit (or “B-kit”) | The hardware and software system that allows seamless electronic control of a vehicle’s brake, throttle, steering, and transmission. |
| Autonomy Hardware (or “A-kit hardware”) | The rugged computers that run the autonomy software, the sensors for perception (e.g. cameras, lidars, radar that “see” the world) and localization (e.g. GPS, IMU, encoders which allow the vehicle to understand where it is in the world). |
| Autonomy Software (or “A-kit software”) | The autonomous behaviors the vehicle must perform according to the customer needs (e.g. path following, object detection and avoidance, person following). |
| Vehicle Control Interfaces | Intermediary components that enable software-to-hardware communication between the A-kit and the B-kit. |
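To make the layering concrete, here is a minimal sketch in Python of how these pieces might talk to each other. Everything in it (the `DriveCommand` fields, the `DebugBus` stand-in, the `VehicleControlInterface` class, the clamping rules) is illustrative and hypothetical, not the API of any particular product:

```python
from dataclasses import dataclass


@dataclass
class DriveCommand:
    """A single actuation request passed from the A-kit software toward the B-kit."""
    throttle: float        # 0.0 (coast) to 1.0 (full throttle)
    brake: float           # 0.0 (released) to 1.0 (full braking)
    steering_angle: float  # radians; sign convention is vehicle-specific
    gear: str              # e.g. "drive", "reverse", "park"


class DebugBus:
    """Stand-in for a real vehicle bus (e.g. CAN); it just prints what it receives."""
    def write(self, frame: dict) -> None:
        print("B-kit frame:", frame)


class VehicleControlInterface:
    """Intermediary layer: translates A-kit software commands into frames
    that this particular vehicle's drive-by-wire kit (B-kit) understands."""
    def __init__(self, bus) -> None:
        self.bus = bus

    def send(self, cmd: DriveCommand) -> None:
        # Clamp to safe ranges before anything reaches the actuators.
        frame = {
            "throttle": min(max(cmd.throttle, 0.0), 1.0),
            "brake": min(max(cmd.brake, 0.0), 1.0),
            "steering": cmd.steering_angle,
            "gear": cmd.gear,
        }
        self.bus.write(frame)


# Usage: the A-kit software plans a maneuver; the interface delivers it.
interface = VehicleControlInterface(DebugBus())
interface.send(DriveCommand(throttle=0.2, brake=0.0, steering_angle=0.05, gear="drive"))
```

The point of the sketch is the middle layer: translating what the autonomy software wants into what a specific vehicle’s drive-by-wire hardware accepts is exactly the part that must be rebuilt, and re-validated, for every new platform.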
To achieve optimal performance, each of these “layers” must be customized for each new autonomous platform and use case. Consider the vast differences in the capabilities required of autonomous robots in the following scenarios:
- A quadruped (canine) robot inspecting valves in a factory.
- An all-terrain vehicle autonomously following a human leader on foot through a forest.
- A casualty evacuation vehicle navigating an urban battlefield to reach a wounded soldier.
A helpful way to contrast these scenarios, and to appreciate just how different they are, is to consider how important various evaluation metrics are for each. Although numerous evaluation metrics exist, we will consider only:
- GPS-denied localization precision
- Classification accuracy (i.e., is that a person, is that a gauge, does the gauge need work?)
- Object avoidance accuracy
- Object tracking precision
- Risk-averse emergency stop
- Ability to operate over uneven terrain
- Time to accomplish its task
- Ability to detect and avoid adversaries
The robot-dog must navigate indoors, with no access to a global positioning system, locate gauges, and signal when a valve requires attention. Meanwhile, it must safely and efficiently avoid obstacles, including moving factory workers and equipment; a failure to detect these “obstacles” and make an appropriate decision can be fatal. Thus, the key metrics include GPS-denied localization precision, classification accuracy, and object avoidance accuracy.
The leader-follower vehicle must track and maintain an appropriate distance from its human leader, without locking onto a different object as its leader. It must have an emergency stop mechanism that allows immediate, fail-safe system shutdown in case of emergency. (Imagine a 3-ton vehicle following you on your next hike down a mountainside, and you’ll likely agree that an emergency stop is a vital component!) Lastly, it must follow the human regardless of terrain difficulty. Thus, the key metrics include classification accuracy, object tracking precision, risk-averse emergency stop, and the ability to operate over uneven terrain.
Finally, the casualty evacuation vehicle must reach the soldier as quickly as possible to increase their chance of survival. The vehicle must avoid both obstacles and enemy fire, and then keep the soldier safe during evacuation. Thus, the key metrics include time to accomplish the task, object tracking precision, and the ability to detect and avoid adversaries.
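One way to see how little these requirements overlap is to write the mapping down explicitly. The sketch below simply restates the discussion above as data; the scenario names and groupings are illustrative, not measured results:

```python
# Illustrative only: which evaluation metrics matter most for each scenario,
# mirroring the discussion above rather than any benchmark data.
priority_metrics = {
    "factory inspection quadruped": [
        "GPS-denied localization precision",
        "classification accuracy",
        "object avoidance accuracy",
    ],
    "leader-follower vehicle": [
        "classification accuracy",
        "object tracking precision",
        "risk-averse emergency stop",
        "ability to operate over uneven terrain",
    ],
    "casualty evacuation vehicle": [
        "time to accomplish the task",
        "object tracking precision",
        "ability to detect and avoid adversaries",
    ],
}

# Metrics shared by all three use cases: the intersection is empty.
shared = set.intersection(*(set(m) for m in priority_metrics.values()))
print(shared)  # -> set(): no single metric tops the list for every scenario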
Because each situation has its own set of performance metrics, the required autonomous capabilities differ significantly, which in turn means each requires different sensors, perception algorithms, and path planners. And while these three scenarios were chosen precisely because they are so different from one another, even use cases with far more in common have requirements that are not available “out of the box.” So the answer, unfortunately, is no: “Plug and play” autonomy does not exist.