In June of this year, a Tesla operating on Autopilot along a Florida highway drove beneath the trailer of a semi-truck crossing the road, tearing the roof off the Tesla and killing the driver. Because the trailer was white and the sun was bright, the Tesla's camera failed to "see" the trailer, and because of the trailer's high profile, the forward radar classified it as an elevated road sign. Ironically, Florida is the only state that allows driverless car operation.
This tragedy raises difficult legal questions. Had the driver survived, would he have been charged with careless driving and found at fault in the crash, even though he was not actually in control of the vehicle? Was he allowing the car to exceed the speed limit, or was he unaware of the vehicle's speed?
Although autonomous vehicles (AVs) are being designed with the intention of making the roads safer, their rapid evolution brings with it an evolving entanglement of legal issues. If a driverless vehicle speeds or parks itself illegally, who receives the ticket? If a driverless vehicle is involved in an accident, who is at fault: the owner or the manufacturer? If not the owner, is it the maker of the operational software, or the maker of the vehicle itself? What happens when the vehicle's software executes a maneuver to avoid one potentially dangerous situation and creates another, such as swerving into an occupied lane or driving onto a sidewalk? How will an insurance company handle a claim against a car with no driver? And in many states, holding a vehicle's owner responsible for an act committed with the vehicle, without proof that the owner was driving, is a denial of due process.
In one incident in San Francisco, a Nissan operating with "Cruise Automation" struck a parked car as its driver was trying to regain control of the vehicle. Did the driver have control at the moment of impact, and for how long before it? Would the accident have been avoided if he had trusted the technology and not reacted at all?
A survey by the Pew Research Center indicated that nearly half of all Americans would ride in a driverless car. Despite that indication of the anticipated popularity of autonomous vehicles, few states have addressed the legal issues that are certain to arise. In fact, only nine states and the District of Columbia have. California is one of them.
In December of 2015, the California Department of Motor Vehicles published draft regulations requiring all driverless vehicles to have steering wheels and brake pedals, along with a licensed human on board who is responsible for the vehicle's operation. As predicted, AV manufacturers responded negatively. Google, the leader in the driverless car movement, claimed that California was "writing a ceiling" on the potential for fully self-driving cars; Google's self-driving vehicle has no steering wheel or controls other than a black push-button starter and a red push-button emergency stop. Disabled persons and disability advocates joined Google in opposition, though their objection was specifically to the requirement of a responsible licensed driver on board. Austin, Texas, by contrast, would be "thrilled" to host driverless car development, according to a spokesperson for the mayor. And as mentioned earlier, Florida allows the operation of autonomous vehicles as long as a remote operator can take control of the vehicle if a failure occurs.
Many states have laws that would seemingly eliminate the possibility of driverless car operation. New York requires at least one hand on the steering wheel at all times. Seatbelts are required for drivers in most states. Licenses are also required for "persons" driving motor vehicles on public roads. If the software is the "driver," it cannot hold the wheel, wear a seatbelt, or become a person and pass a driver's license test.
Autonomous vehicles are, in effect, a form of robot. A UK government agency, the Engineering and Physical Sciences Research Council (EPSRC), compiled a list of design principles governing the use of robotic devices. These principles, although not law, make perfect sense, especially the following:
“Responsibility for a robot’s actions lie with its designer/manufacturer and the user when the robot is designed to allow human intervention. Legal liability must be shared or transferred e.g. both designer and user might share fault where a robot malfunctions during use due to a mixture of design problems, or user-control failures.”
As more manufacturers create and test new designs, states must address the legal issues that will arise regarding the safety and liability of AVs. Manufacturers, in turn, must develop software that can correctly direct the vehicle's actions in every foreseeable situation before AVs are permitted to join the daily flow of traffic. If state regulations and proven safety programs are not in place before AVs share the roads with conventional vehicles, the results could be disastrous.
Aside from the obvious privacy concerns that come along with such advanced vehicles, there are a number of other issues that must be addressed at some point. For instance:
- What happens if the police need to stop one of these vehicles?
- Would the vehicle be too polite to aggressive human drivers, allowing them to cut through traffic unimpeded?
- What type of insurance would need to be taken out and by whom?
Dolman Law Group is a Florida personal injury law firm. With Florida being the only state that allows non-test operation of autonomous vehicles, Dolman Law Group is prepared for the complexity of representing anyone harmed in an accident involving a driverless vehicle. For a free consultation on any personal injury case, contact Dolman Law Group at 727-451-6900.
Dolman Law Group
800 North Belcher Road
Clearwater, FL 33765
Sources: The Legal Intelligencer; Lexology