As self-driving cars advance toward mainstream markets and phones erode individual privacy, programmers are becoming the new arbiters of ethics. Elaine Rich, a UT computer science senior lecturer, discussed these issues in a talk last Thursday at the Gates Dell Complex.
The talk, which covered the unintended consequences of technology, was hosted by the UT chapter of the Association for Computing Machinery. This was the first lecture the student organization hosted in more than five years, and the first of three presentations this semester, the next of which will be Oct. 24.
From tobacco to the Model T Ford, technology throughout history has had unintended consequences, Rich said. She added that this is especially true of computers.
“As we get new technology, we end up with policy vacuums,” Rich said. “We have something, but we don’t really know what it is or how to classify it, so we don’t know what to do with it [from a legal standpoint].”
For example, the government can not only look through suitcases as people cross the border, but can also search devices like phones and laptops under similar circumstances. Many feel that these devices are more an extension of a person’s brain than simple property and should not be subject to searches, Rich said.
More specifically, Rich said the effect of policy vacuums has been seen in the FBI’s investigation of the San Bernardino shooter, when the bureau wanted Apple to create a backdoor into the shooter’s phone. According to Rich, it is important for computer scientists to get involved in situations like this because others may not understand the consequences.
“One of the things I’d like to convince you of is that you are a citizen of the world who knows stuff about this technology,” Rich said. “We would like to have decisions not just made by politicians who may mean well but don’t understand technology.”
Rich said computing affects almost every aspect of society, including economics. Machines have allowed factories to increase output with fewer workers, and artificial intelligence may begin taking more complicated jobs as well. According to Rich, this could have a large societal impact.
“People get very scared when life as they know it is threatened by technology and they don’t know how to respond,” Rich said. “I think we are already seeing [the impact] in the ‘jobless recovery’ and the associated social unrest among segments of our population who haven’t benefited from it. We ignore this at our peril.”
Additionally, as AI becomes capable of making decisions with ethical stakes, Rich questioned how computers should be programmed to act. A self-driving car, for example, may have to choose between saving the life of a passenger and that of a pedestrian.
“Those kinds of decisions already exist, but on an individual level, so we don’t all have to agree,” Rich said. “Now we do have to agree on it and on regulations and on who will get sued if something goes wrong. It’s not so simple as to just say, ‘Build better vision algorithms.’ We have to be careful that we don’t cede moral responsibility when we write algorithms.”
Stas Ilinsky, vice president of academics for the UT chapter of ACM, said he found Rich's presentation fascinating.
“I was pleasantly surprised at how encouraging she was of us to think about different issues,” he said. “She provided context for questions but never actually gave us the answers.”
Ultimately, Rich said her goal for the presentation was to encourage students to become engaged, responsible professionals later in life.
“I want you to think when you’re building something that there is the reason you built it in the first place, but there may also be unintended consequences,” Rich said. “When you’re doing your job, how do you ask yourself what the right thing to do is? As a person controlling new technology, do you have special responsibilities as a citizen of the planet?”