Mae Capozzi

Should Software Engineers Care About Ethics?

Prometheus

In an essay published on Medium called “Design’s Lost Generation,” Mike Monteiro describes how he shocked a crowd of designers at a San Francisco tech conference by suggesting that design, like medicine, law, and even driving, should be regulated.

A year ago I was in the audience at a gathering of designers in San Francisco...at some point in the discussion I raised my hand and suggested...that modern design problems were very complex. And we ought to need a license to solve them. (Monteiro)

In response to the designers’ surprise, Monteiro asked:

“How many of you would go to an unlicensed doctor?” I asked. And the room got very quiet. “How many of you would go to an unaccredited college?” I asked. And the room got even quieter. (Monteiro)

Monteiro understands that software has become too deeply embedded in everything we do for it to remain unregulated. Doctors, journalists, and 911 call-center workers all rely on technology. We’ve been sharing our data with Facebook since 2004. We get in Ubers and assume we’ll safely arrive at our destination.

Companies like Facebook, Uber, and Volkswagen have proven that they do not value the safety of their users and that making money trumps following the law. Facebook has revealed massive amounts of user data to advertisers and to companies like Cambridge Analytica without user consent. Uber built software that helped it evade officials in cities where its service was still illegal. Volkswagen wrote software to cheat emissions tests that were meant to protect the environment and the people who live in it. We have placed our data and our lives in the hands of people who are not armed with a guiding set of ethical principles for weighing the consequences of the features they build. Software engineers are the last line of defense against corrupt products. We need to develop an ethical framework that prevents us from harming our users; otherwise, we are complicit in that corruption.

Businesspeople and product managers hold most of the decision-making power over whether a company builds a product or feature. Unfortunately, we’ve become obsessed with shipping features as quickly as possible in a system that values speed over quality. CEOs frequently make decisions based on the promise of more funding from venture capitalists, not the safety and health of their user base. If you’re an engineer at a company that values ethics, good for you! Most engineers don’t have that experience. I’d wager that many of us will find ourselves in at least one situation where we must choose between saying no (and potentially losing our jobs) and enabling the production of features that will hurt our users. Uber, Volkswagen, and Facebook offer valuable case studies that software engineers can use as they evaluate products they may find themselves asked to build.

In March of 2017, The New York Times reported that Uber had written software called Greyball, which enabled the company to operate illegally in cities like Portland, Oregon. It worked like this: when an official trying to crack down on Uber’s illegal activity hailed a ride, Greyball spun up a fake instance of the application populated with “ghost” cars. If a driver accepted a ride from one of those officials, the driver would receive a call instructing them to cancel it.
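
The reporting makes the pattern easy to imagine. The sketch below is a hypothetical illustration, not Uber’s actual code: every signal, name, and threshold is invented. What it shows is how a handful of individually mundane checks can add up to a tool for deceiving regulators.

```ts
// Hypothetical sketch of Greyball-style gating logic. All signals and
// names here are invented for illustration; this is not Uber's code.

interface RideRequest {
  userId: string;
  openedNearCityHall: boolean;       // geofence around government offices
  paidWithGovernmentCard: boolean;   // credit card tied to a public agency
  usesBulkBoughtPhoneModel: boolean; // cheap handsets officials bought in bulk
}

function looksLikeEnforcementOfficial(req: RideRequest): boolean {
  // Each check is innocuous on its own; together they flag a user.
  return (
    req.openedNearCityHall &&
    (req.paidWithGovernmentCard || req.usesBulkBoughtPhoneModel)
  );
}

function serveMap(req: RideRequest): string {
  // Flagged users see a fake map of "ghost" cars that never arrive.
  return looksLikeEnforcementOfficial(req)
    ? "ghost view: fake cars, rides silently cancelled"
    : "real view: actual nearby drivers";
}
```
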
After the press caught on, Uber tried to spin Greyball as software designed to protect its drivers from potential threats, not just to evade officials:

This program denies ride requests to users who are violating our terms of service — whether that’s people aiming to physically harm drivers, competitors looking to disrupt our operations, or opponents who collude with officials on secret ‘stings’ meant to entrap drivers. (Isaac)

The statement tries to make it seem like Uber is the victim. It has a tone of snide derision, especially with the word ‘stings’ in quotation marks. Uber doesn’t apologize or admit to any wrongdoing, even though the software was used to break the law. And it offers the protection of drivers as an excuse for evading that law, even though it’s well known that Uber does little to protect or support its drivers.

Volkswagen also resorted to illegal behavior to gain a leg up on its competition. In September of 2015, the Environmental Protection Agency found that many diesel cars built by Volkswagen sidestepped environmental regulations. The BBC reported that “the EPA has said that the engines had computer software that could sense test scenarios by monitoring speed, engine operation, air pressure and even the position of the steering wheel” (Hotten). Volkswagen’s software could detect when a vehicle was undergoing testing and lower its diesel emissions for the duration of the test, in violation of the Clean Air Act.
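
Going by the signals the EPA described, the core of such a “defeat device” can be startlingly small. The sketch below is hypothetical, with invented names and thresholds rather than Volkswagen’s actual code, but it captures the shape of the logic:

```ts
// Hypothetical sketch of defeat-device logic, based only on the signals
// the EPA described. Invented names and thresholds; not Volkswagen's code.

interface SensorFrame {
  speedKmh: number;          // vehicle speed
  steeringAngleDeg: number;  // on a test dynamometer the wheel never turns
  engineRpm: number;
  barometricPressure: number;
}

function looksLikeEmissionsTest(frame: SensorFrame): boolean {
  // A standard test cycle drives a fixed speed profile with the car
  // strapped to a dynamometer, so the wheels spin while the steering
  // wheel stays centered.
  return frame.speedKmh > 0 && Math.abs(frame.steeringAngleDeg) < 1;
}

function exhaustTreatmentMode(frame: SensorFrame): "full-controls" | "reduced" {
  // Run full emissions controls only while the car believes it is being
  // tested; run dirtier the rest of the time.
  return looksLikeEmissionsTest(frame) ? "full-controls" : "reduced";
}
```
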

The scandal had consequences for the engineers and managers who developed the software. Oliver Schmidt, the former general manager of Volkswagen’s Engineering and Environmental Office, was sentenced to seven years in prison for his involvement, and James Liang received 40 months for helping to develop the software that allowed Volkswagen to cheat emissions tests (Vlasic). According to MIT News, the “excess emissions generated by 482,000 affected vehicles sold in the U.S. will cause approximately 60 premature deaths across the U.S” (Chu). The software Volkswagen built had fatal consequences. Engineers did not step up to block the corporation’s decisions, and the software they built has taken lives.

While Facebook’s actions have not directly taken lives, the company has been collecting and sharing private user data for years. In 2018, the story broke that a data analytics firm called Cambridge Analytica had used a loophole in Facebook’s API to access data from 50 million user accounts (Chang). The API exposed not only the data a user had freely given, but also data about that user’s “friends,” who had never consented. Making matters worse, Facebook’s documentation acknowledged that accessing user data without consent was possible and merely told application developers not to do it.
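
The consent gap is easy to see in miniature. The sketch below is hypothetical: the types, fields, and function are invented stand-ins, not the real Graph API. One user authorizes an app, and the API hands back data about friends who never did:

```ts
// Hypothetical sketch of the consent gap; invented types and fields,
// not the real Facebook Graph API.

interface Profile {
  id: string;
  likes: string[];     // data the user shared with the platform
  friendIds: string[]; // edges in the social graph
}

// Stand-in for the platform's social graph.
const graph = new Map<string, Profile>();

function dataVisibleToApp(consentingUserId: string): Profile[] {
  const user = graph.get(consentingUserId);
  if (!user) return [];
  // The consenting user's own data is freely given...
  const harvested: Profile[] = [user];
  // ...but their friends' data comes along without those friends'
  // knowledge or consent.
  for (const friendId of user.friendIds) {
    const friend = graph.get(friendId);
    if (friend) harvested.push(friend);
  }
  return harvested;
}
```

With one consenting user yielding data on hundreds of friends, an app needed only a few hundred thousand installs to reach tens of millions of profiles, which is how the numbers in the Cambridge Analytica story scale.
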

Cambridge Analytica should not have taken advantage of the loophole, but the brunt of the blame lies with Facebook. Facebook’s developers exposed user data that the users had never freely given. They enabled Cambridge Analytica’s ethical breach, allowing the firm to build a massive marketing database, target those users, and share the data with political campaigns.

Vox’s Aja Romano explains how the breach of user trust happened:

The factors that allowed Cambridge Analytica to hijack Facebook user data boiled down to one thing: no one involved in the development and deployment of this technology stopped to consider what the results might look like at scale. (Romano)

This tends to be the central problem when news like this breaks: the software engineers who built the API, the product managers who came up with the feature, and the management at Facebook did not think about the ethical implications of the software they were building. They had no framework to turn to, did not know how to determine whether what they were building was ethical, and did not try to find out. The time has come for engineers to take responsibility for their creations. We can no longer sustain a myopic view of the products we create. We no longer have the luxury of focusing only on whether our code is elegant and well-tested, or whether we shipped the feature on time. We need to understand the context of what we’re building and validate that it won’t harm our users.

In Mary Shelley’s Frankenstein, Victor Frankenstein describes the horror he felt as his monster woke up:

How can I describe my emotions at this catastrophe, or how delineate the wretch whom, with such infinite pains and care, I had endeavored to form? His limbs were in proportion, and I had selected his features as beautiful. Beautiful! Great God! His yellow skin scarcely covered the work of muscles and arteries beneath; his hair was of a lustrous black, and flowing; his teeth of a pearly whiteness; but these luxuriances only formed a more horrid contrast with his watery eyes, that seemed almost of the same color as the dun white sockets in which they were set, his shrivelled complexion, and straight black lips. (45)

Like Frankenstein, we carefully select the parts that make up our creations. We might choose a JavaScript framework that helps us write elegant code, and we might carefully write tests to keep that code from breaking. But what do we see when we step back and evaluate our feature in the context of the larger application? Does the application protect and serve its users? Or does it shave years off of people’s lives? Does it negatively influence an election? If the whole is a monster, it doesn’t matter that the parts are beautiful.

We need software engineers to start thinking deeply about the ethical implications of the world they are creating. Computer science majors should have to take courses in the humanities. Software should be regulated. Engineers should develop a shared set of ethics they can turn to when they are asked to build something illegal or unethical, as engineers at Uber, Volkswagen, Facebook, and many other companies have been. Lives depend on it.

Works Cited

Chang, Alvin. “The Facebook and Cambridge Analytica Scandal, Explained with a Simple Diagram.” Vox, 23 Mar. 2018, www.vox.com/policy-and-politics/2018/3/23/17151916/facebook-cambridge-analytica-trump-diagram.

Chu, Jennifer. “Study: Volkswagen's Excess Emissions Will Lead to 1,200 Premature Deaths in Europe.” MIT News, 3 Mar. 2017, news.mit.edu/2017/volkswagen-emissions-premature-deaths-europe-0303.

Hotten, Russell. “Volkswagen: The Scandal Explained.” BBC News, 10 Dec. 2015, www.bbc.com/news/business-34324772.

Isaac, Mike. “How Uber Deceives the Authorities Worldwide.” The New York Times, 3 Mar. 2017, www.nytimes.com/2017/03/03/technology/uber-greyball-program-evade-authorities.html.

Monteiro, Mike. “Design's Lost Generation.” Medium, 19 Feb. 2018, medium.com/@monteiro/designs-lost-generation-ac7289549017.

Romano, Aja. “The Facebook Data Breach Wasn't a Hack. It Was a Wake-up Call.” Vox, 20 Mar. 2018, www.vox.com/2018/3/20/17138756/facebook-data-breach-cambridge-analytica-explained.

Shelley, Mary. Frankenstein; or, The Modern Prometheus. Google Books, books.google.com/books?id=2Zc3AAAAYAAJ&printsec=frontcover&source=gbsgesummary_r&cad=0#v=onepage&q&f=false.

Vlasic, Bill. “Volkswagen Engineer Gets Prison in Diesel Cheating Case.” The New York Times, 25 Aug. 2017, www.nytimes.com/2017/08/25/business/volkswagen-engineer-prison-diesel-cheating.html.

License: CC BY 4.0