The Daily Tar Heel
Editorial: Driverless cars make driving easier and ethical decision-making harder

Self-driving cars are on the way (Mayor Pam Hemminger even mentioned their possible arrival in Chapel Hill during an interview with this board last year), and for the most part, they are a good thing. They will make driving dramatically safer, commute times shorter and traffic less punishing. Yet driverless cars present one glaring issue to the public: Who decides what happens when an accident involving a self-driving car, rare as such a case may be, is unavoidable?

As much as it’s become a meme, the classic trolley problem actually provides a perfect example of this dilemma. What happens when a person walks out onto the interstate, and the only way for the car to avoid hitting them is by steering into the median and probably killing its passenger? 

Patrick Lin, writing in The Atlantic, puts it this way: “If you complain here that robot cars would probably never be in the trolley scenario — that the odds of having to make such a decision are minuscule and not worth discussing — then you’re missing the point.”

At some point in the development of one of these vehicles, a programming team will account for this ever-so-rare possibility and decide how the self-driving car’s system will react.
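To make that concrete, here is a deliberately toy sketch, in Python, of the kind of hard-coded rule a team might write. It does not reflect any real manufacturer’s software; the Scenario fields and the choose_maneuver function are invented for illustration, and the point is only that someone has to decide the order of the branches.

    # Hypothetical sketch only; not any carmaker's actual decision logic.
    from dataclasses import dataclass

    @dataclass
    class Scenario:
        pedestrians_in_path: int    # people the car hits if it holds its course
        passengers_at_risk: int     # people in the car if it swerves into the median
        swerve_is_survivable: bool  # rough estimate from the car's sensors

    def choose_maneuver(s: Scenario) -> str:
        """Return 'brake', 'swerve' or 'stay' for an unavoidable collision."""
        if s.pedestrians_in_path == 0:
            return "brake"          # no one ahead: just stop as hard as possible
        if s.swerve_is_survivable and s.pedestrians_in_path > s.passengers_at_risk:
            return "swerve"         # one possible rule: minimize total harm
        return "stay"               # a different team might invert this choice

    print(choose_maneuver(Scenario(pedestrians_in_path=1,
                                   passengers_at_risk=1,
                                   swerve_is_survivable=False)))

Whether that middle branch should exist at all, and who gets to say so, is exactly the question at issue.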

Should we entrust the federal government with regulating this decision-making process, and in doing so, accept the additional time necessary to establish and staff an entirely new agency within the Department of Transportation? Time is already short: some American cities, including Austin, Nashville and San Francisco, have self-driving cars on their roads.

Compounding the problem, the Trump administration has left an unprecedented number of sub-Cabinet-level positions unfilled almost a year into the president’s term. Even if a new agency could be established relatively quickly, would we be comfortable with these rules being written by an administration notorious for placing unqualified individuals in high-level roles?

Or would we rather shift the responsibility to the tech industry and the dozens of companies developing driverless car technologies? While private companies are arguably less susceptible to the bureaucratic atrophy that many federal agencies fall victim to, they are not without their fair share of unethical practices. 

Uber, whose Advanced Technologies Group is currently testing its autonomous cars in Arizona, has been caught circumventing municipal ride-sharing laws on multiple occasions. Google, along with Facebook and Twitter, has recently been scrutinized for hosting election advertisements financed by Russian firms during the 2016 campaign season. Google broke ground on its own self-driving car project in 2009, an effort that has since been taken over by Waymo, a subsidiary of the online tech giant’s parent company, Alphabet Inc.

The responsibility for making these potentially life-and-death decisions doesn’t necessarily need to fall on each private developer of self-driving cars or on the local, state or federal agencies eager to regulate them. Still, alternatives to these two approaches have yet to gain traction in the media or in legislatures. As for what those alternatives might look like? Who knows.

As self-driving cars come closer to hitting the streets of Chapel Hill and of cities around the country, the public must stay alert to the moral questions that arise when we put our lives in the hands of the tech industry. Driverless cars are the future — and that future, while attractive, is also relentlessly complicated.
