An autonomous vehicle is a vehicle that can drive itself without input from a human driver. These types of vehicles are also known as self-driving cars, driverless cars, or robotic cars.
The term self-driving car is becoming the standard as these technologies continue to mature.
In some ways, the story of the autonomous vehicle is a tale of technological evolution that is not finished yet, a gradual advance of integrated artificial intelligence services and sensor-based driver safety features that may or may not eventually replace human drivers entirely.
Although the autonomous vehicle has come a long way, it is not yet a common mode of transportation, and several obstacles to adoption remain.
The earliest edge of the autonomous driving frontier was the evolution of multiple driver safety features that are now standard in many new vehicles. For example, a lane departure warning system alerts drivers if the vehicle seems to be drifting out of its lane on a multilane road. Parking assist, automated braking, and similar features fall into the same category. Each has its own specialized function and handles a single, well-defined task.
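To make the "single, well-defined task" point concrete, the core logic of a lane departure warning can be sketched in a few lines. This is a simplified, hypothetical illustration: real systems derive lane position from camera data, while here the vehicle's lateral offset from the lane center is assumed to be given, and the function name and thresholds are invented for the example.

```python
def lane_departure_warning(lateral_offset_m: float,
                           lane_width_m: float = 3.7,
                           margin_m: float = 0.3) -> bool:
    """Return True if the vehicle has drifted close enough to a lane
    boundary to warrant a warning.

    lateral_offset_m: signed distance from the lane center (meters);
                      positive drifts right, negative drifts left.
    lane_width_m:     assumed lane width (3.7 m is a common highway figure).
    margin_m:         warn when within this distance of a boundary.
    """
    distance_to_boundary = lane_width_m / 2 - abs(lateral_offset_m)
    return distance_to_boundary < margin_m

# Centered in the lane: no warning.
print(lane_departure_warning(0.0))   # False
# 1.6 m off-center in a 3.7 m lane leaves only 0.25 m: warn.
print(lane_departure_warning(1.6))   # True
```

The point of the sketch is that the system only observes and alerts; it never steers, which is exactly why such features fall short of autonomy.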
What these systems have in common is that, although they promote the idea of autonomous vehicle design, they do not approach what most would call a self-driving car. The human driver is still in control, and needs to be in control, but uses the warnings and alerts to make better driving decisions.
As we progress toward autonomous vehicle design, we've seen some successes and some tragic failures in intermediate types of autopilot systems.
Perhaps the most prominent is Tesla's Autopilot system. The company has urged drivers not to hop in the back seat and let Autopilot take control, but some have sadly done so anyway, with fatal results. In one case, Autopilot failed to recognize a wedge-shaped barrier dividing two lanes, and with no human intervention, the driver was killed.
These autopilot systems show how dangerous a halfway approach to autonomous driving can be. Drivers with a false sense of security may cede too much control to the computer, leaving them extremely vulnerable if and when the system fails to match a human driver's responses.
Some of the most common conversations around autonomous cars today center on the inability of the industry to move forward in the short term.
Many auto companies have adjusted their outlooks to avoid promising consumers self-driving cars within a set number of years. The general consensus is that the industry is moving slowly and cautiously in the face of very real obstacles to fully autonomous vehicles. Fatalities seen in road tests have heightened concerns about liability, since today's system places responsibility on the human driver.
Scaling has also proven difficult, and a fatal Uber road test has set the industry back. Federal agencies are asking automakers to provide more information about some autonomous vehicle road tests.
In this context, although agencies point out that self-driving cars have the potential to "revolutionize" transportation, widespread adoption of autonomous vehicles is still some way off.