A Data-Sharing Framework To Ensure Self-Driving Cars Are Road-Ready

Rapid advances in technology have made autonomous vehicles (AVs)—once the realm of science fiction—into an emerging reality. Auto industry newcomers such as Waymo and Tesla now compete with incumbent car manufacturing giants in a race to be the first to produce a commercially viable AV. In their haste to beat competitors to market, these companies fiercely guard the data on which their automation algorithms are built. Jesse Krompier, an associate in the Commercial & Business Litigation department of Michelman & Robinson, LLP, addressed the safety issues that may arise from competition between for-profit AV companies in a recent article for the Journal of Law, Technology & Policy. He argued that when AV companies treat safety data as proprietary trade secrets, the algorithms they design may not handle unusual driving scenarios appropriately, producing self-driving cars that make unsafe decisions and even cause accidents. To prevent these outcomes, he advocated a new federal safety standard built on a legal framework for data sharing between original equipment manufacturers (OEMs).

OEMs collect vast amounts of data in the process of designing the algorithms that underlie their vehicles’ autonomous functions. In addition to motion data such as vehicle speed, acceleration, and position, this information includes audio and video records of the vehicle’s surroundings, as well as behavioral data on other entities on the road, such as pedestrians and other vehicles. These data are used to teach AVs to drive more safely. Because safety is the most important component of an AV’s commercial viability, OEMs treat their proprietary safety data sets as a crucial competitive advantage that they are loath to share.
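To make the scope of such data concrete, the sketch below outlines one way a single safety-data record might be structured. It is purely illustrative; the field names and types are assumptions for this article, not a description of any OEM’s actual logging format.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class DetectedObject:
    """Hypothetical record of another road user observed by the AV's sensors."""
    object_type: str                      # e.g. "pedestrian", "vehicle", "cyclist"
    relative_position_m: Tuple[float, float]  # (x, y) offset from the AV, in meters
    relative_speed_mps: float             # closing speed, in meters per second


@dataclass
class SafetyDataRecord:
    """Hypothetical time-stamped snapshot of the kinds of data an OEM might log."""
    timestamp_s: float                    # seconds since start of trip
    speed_mps: float                      # vehicle speed
    acceleration_mps2: float              # vehicle acceleration
    position: Tuple[float, float]         # (latitude, longitude)
    camera_frame_ref: str                 # pointer to a stored video frame
    audio_clip_ref: str                   # pointer to a stored audio segment
    nearby_objects: List[DetectedObject] = field(default_factory=list)
```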

However, AV accidents can occur in complex or ambiguous driving situations where safety data sets have gaps or shortcomings. Krompier described a 2016 accident in which a Tesla Model S operating in autopilot mode crashed into a truck, killing the Tesla’s driver, when its “sensors failed to distinguish the 18-wheeler’s white trailer from the bright sky and attempted to drive full speed underneath the trailer.” In the aftermath of the incident, Tesla examined the crash data and claimed to have fixed the problem but did not specify what the issue was or how it was addressed. Krompier argued that if OEMs continue to address such cases in piecemeal fashion, the result will be an ecosystem of sub-optimal AVs, each equipped to handle certain unusual driving scenarios but not others. The issue is compounded by vehicle-to-vehicle communication, in which AVs from different companies, operating under different algorithms, encounter one another on the road, potentially producing dangerous or unpredictable interactions.

Krompier noted that some states, such as California and Nevada, have passed legislation that attempts to mitigate the problem by placing requirements on the type of data that AVs must collect and to whom that data must be available in the event of an accident. However, in Krompier’s view, these laws are patchwork fixes that don’t go far enough in mandating data sharing among OEMs. He contended that state legislatures fear driving OEMs out of their states with heavy-handed regulation that is perceived to stifle innovation.

Given this dynamic, Krompier posited that the situation is ripe for modest regulation at the federal level by the National Highway Traffic Safety Administration (NHTSA). Key components of his proposal included identifying the safety-critical data to be shared and protecting OEMs’ proprietary machine-learning algorithms. He suggested two possibilities for data sharing. Under the first, a “compulsory license” regime, the NHTSA would set forth minimum standards for the volume and categories of safety data each OEM must use in designing a self-driving car that can be considered suitably safe; an OEM that fell short of those standards would have to pay to access another OEM’s data. Under the second, a “pay-to-play” model, all OEMs would contribute their safety data to a shared pool and pay access fees to participate, then recoup those fees through royalties based on the volume of data they contributed and the extent to which other OEMs used it.
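As a rough illustration of how the “pay-to-play” accounting might work, the sketch below apportions a pool’s access fees back to OEMs in proportion to how much data each contributed and how often other OEMs drew on it. The 50/50 weighting and the function itself are assumptions made for illustration; Krompier’s article does not prescribe a specific formula.

```python
def apportion_royalties(total_fees, contributions, usage_by_others):
    """
    Split pooled access fees among OEMs.

    total_fees       -- total access fees paid into the shared pool
    contributions    -- dict: OEM name -> volume of data contributed (e.g. GB)
    usage_by_others  -- dict: OEM name -> how often other OEMs accessed its data

    Each OEM's royalty weights its share of contributed data and its share of
    usage by other OEMs equally (a hypothetical 50/50 split).
    """
    total_contrib = sum(contributions.values())
    total_usage = sum(usage_by_others.values())
    royalties = {}
    for oem in contributions:
        contrib_share = contributions[oem] / total_contrib if total_contrib else 0.0
        usage_share = usage_by_others.get(oem, 0) / total_usage if total_usage else 0.0
        royalties[oem] = total_fees * (0.5 * contrib_share + 0.5 * usage_share)
    return royalties


# Example: three hypothetical OEMs paying into a shared safety-data pool.
fees = 300_000.0
contributed = {"OEM_A": 500, "OEM_B": 300, "OEM_C": 200}   # volume of data contributed
used = {"OEM_A": 40, "OEM_B": 50, "OEM_C": 10}             # access counts by other OEMs
print(apportion_royalties(fees, contributed, used))
```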

Krompier admitted that each method has its drawbacks, but he maintained that AV safety is a critical public health issue and that formalizing a mandatory data-sharing framework at the federal level is necessary to ensure this revolutionary new technology is deployed as safely as possible.

Article source: Krompier, Jesse, “Safety First: The Case for Mandatory Data Sharing as a Federal Safety Standard for Self-Driving Cars,” Journal of Law, Technology & Policy 2017(2): 439-468.

Featured photo: cc/(Scharfsinn86, photo ID: 880913608, from iStock by Getty Images)
