German President Frank-Walter Steinmeier was among the distinguished panelists who gathered at Harvard Law School for a panel and webcast, “Ethics of the Digital Transformation.” Presented by the Berkman Klein Center for Internet & Society at Harvard University, the event brought together academics, philosophers, and scientists to ponder moral and practical issues related to the rise of technology and artificial intelligence.
Introduced by Berkman Klein Center Executive Director Urs Gasser LL.M. ’03, Steinmeier began his talk with a question that he said is key to his presidency and to the future of liberal democracy: How do social media and the internet, particularly Twitter and Facebook, change the democratic culture of debate?
“Despite the daily waves of outrage that you have to live with, how can we make sure that we keep a general overview?” he asked through a translator. “How can we distinguish what is important from what is unimportant, and does this culture of thinking in simple opposites take away from our ability to see the nuances? Do we continue to be capable of entering into compromises, which I believe to be vital for any democracy … if we no longer have time to differentiate because it’s no longer popular?”
Steinmeier was particularly concerned with the place of the individual in an age of technological advance. He saw an ethical dilemma in the practice of social credit scoring, which is now being widely used in China. Companies there are keeping extensive databases that quantify the economic trustworthiness of individuals, rewarding some and blacklisting others. This, he said, sparked some debate with the Chinese government during a recent visit.
“We who live under different political circumstances are scared and shocked by the idea of having to submit to total surveillance of all aspects of our lives,” he said. “No matter what we did, it might be linked up to a system that assesses our performance in a negative or positive way, and this of course has an effect on the way we develop as human beings. Hopes, wishes and dreams are becoming externalized; they are stored in software that I no longer have any influence over. Our concept of individual responsibility and personal freedom is being called into question by such an approach.”
Steinmeier warned that German and American companies are likely to employ the same practice in the future, making it necessary to examine the morality of the digital age. “That brings me to the question of whether we really need a more intensive exchange between the tech community and the political scientists about the philosophy of the individual. I would like to leave Boston today having received the confirmation from all of you that I need not be afraid, that the debate is taking place.”
Other panelists explored issues related to the individual and the algorithm. HLS Professor Crystal Yang ’13 looked at the criminal justice system, where algorithms are now used in pretrial bail, sentencing, and parole decisions to perform “risk assessment,” predicting someone’s future criminality. In Los Angeles, historical data on crimes in specific locations is also used to predict where future crimes may occur. Another algorithm, called COMPAS, uses a subject’s answers to a survey to estimate their likelihood of recidivism. “These raise a huge host of issues and challenges,” Yang said, including the possibility that using algorithms to predict future risk “will only entrench or exacerbate the inequality that we see in society at large.” On the other hand, she noted, human decision-makers may exhibit more bias than the algorithms do.
Matthew Liao, director of the Center for Bioethics at New York University, explored the mixed blessing of AI in health care. He noted recent medical advances using AI, which can detect cancer cells, determine the viability of an embryo, and even detect suicidal thoughts in the brain, reducing health care costs in the process. Yet these advances also raise ethical concerns, particularly around the collection of personal data. “One of the things we need to worry about is whether they are collecting the data appropriately,” he said. Just as troubling, the machines themselves can be fallible: they can solve complex problems using images, but researchers have found that changing a single pixel can upset the whole equation, causing a machine to perceive a picture of a car as a picture of a dog. “Just imagine deploying that kind of machine learning in health care or self-driving cars, where people’s lives are at stake.”
Wolfgang Schulz, a professor of media and public law at the University of Hamburg, suggested that the discussion on humans and technology needs to be reframed. “We often have a distinction: Here is the society and here is the technology. That can be a dangerous thing because it frames technology as a natural disaster that is coming and we have to build walls to protect against it. Technology forces us to ask those hard questions about societal values and to better understand what makes human decision-making so special.”
Early in the panel, the audience also got a glimpse of the German president’s humorous side. He recalled a previous trip to Boston, where he visited Fenway Park accompanied by Condoleezza Rice and agreed to throw a pitch. “I was extremely naïve; never before had I held a baseball in my hands. My colleague was aghast when she heard what I was about to do. But this is typical of us Germans: I had accepted and didn’t want to go back on my promise. The stadium was filled with people, and I had a certain feeling that they hadn’t come because of me. The Red Sox were playing the Yankees and I realized this was not just any match; it was really about religious issues.”