Microsoft’s President on Privacy, Artificial Intelligence, and Human Rights

Brad Smith ’84 returns to the Law School for a conversation with Dean Gillian Lester and Professor Tim Wu about his new book, “Tools and Weapons: The Promise and the Peril of the Digital Age.”

Dean Gillian Lester, Microsoft President Brad Smith '84, and Professor Tim Wu discuss Smith's new book on October 1 at the Law School, as part of the Dean's Distinguished Speaker Series.

Before a rapt, standing-room-only audience of more than 300 students, faculty, and other members of the Law School community, Microsoft President and Chief Legal Officer Brad Smith ’84 returned to campus on October 1 to discuss his new book, Tools and Weapons: The Promise and the Peril of the Digital Age (cowritten with Carol Ann Browne).

The event with Gillian Lester, Dean and the Lucy G. Moses Professor of Law at Columbia Law School, and Professor Tim Wu, a leading authority on antitrust law who advocates for breaking up Big Tech companies, was the season’s first installment of the Dean’s Distinguished Speaker Series. “Brad may be the only tech executive who would willingly share the stage with Professor Wu, given Tim’s strong and well-articulated position on the perils associated with the bigness of today’s technology companies,” said Dean Lester in her introduction.

The conversation touched on a number of pressing concerns, including cybersecurity, government regulation, ethics, and human rights.


Consequences of Digital Technology

Smith’s book addresses the untold ramifications of digital technology’s ubiquity in our personal lives, our societies, and our economies. “It has reached the point where, as the title suggests, it actually is both a tool and a weapon,” he said. “We use the analogy of a broom: You can sweep the floor or hit somebody over the head. The more powerful the tool, the more powerful the weapon.”

The unintended consequences of digital technology, he added, threaten consumer privacy, jobs, public safety, and democracy. “There are real problems that need to be solved,” said Smith. “They require that the tech sector do more to step up to solve them. It requires that governments move faster and that we do enter a new phase with more law and regulation.”

However, the current lack of government oversight, said Smith, is unprecedented—and so is the industry’s new willingness to work with the government to police itself. “I’m not sure that any fundamentally important technology has gone so many decades with so little regulation as digital technology,” he said. When the internet exploded in the 1990s, Smith explained, it was seen as an extension of telecommunications, which was being deregulated in the U.S. “The first instinct of Congress was, Let’s give this technology immunity from the law and that way we will have more of an opportunity to flourish.” But that was 23 years ago, before Google and Facebook dominated the internet.

In 2005, Smith gave a speech on Capitol Hill advocating for a comprehensive national privacy law. He recalled warning Congress that it needed to act before individual states passed their own privacy legislation. When the first such state law—the California Consumer Privacy Act of 2018—takes effect in 2020, Smith predicts it will function as a de facto federal law, because tech companies will find it simpler to apply California’s rules nationwide than to maintain separate standards. “Which really means on a de facto basis, the capital of the United States for privacy in 2020 may not be Washington, D.C. It may be Sacramento.”


The Ethics of AI

Does the tech sector need its own version of the Hippocratic oath? Smith thinks so. When Wu, the Julius Silver Professor of Law, Science and Technology, asked him about the idea, Smith pointed to his concerns about ethical questions raised by artificial intelligence. “I think we should consider [AI] to be the rapidly emerging dominant economic force of the next three decades,” he said. “Here we are fundamentally equipping machines with the power to make decisions that previously were only made by human beings. So you have to ask, ‘How do we want machines to make these decisions?’ And as soon as you ask that question, I think one of the things you realize is we probably want to make these decisions based on more than what people who study computer or data science learn in their disciplines.”

As for how to make these decisions, Smith said, “I think there is a much broader role for every discipline in a university. . . . I do think we need a cultural change across the tech sector in part to imbue not just the sense of responsibility but a more multidisciplinary approach.”


Opening Up Open Data

During a Q&A, Columbia Law Professor Eben Moglen, the founder of the Software Freedom Law Center, praised his longtime colleague and onetime adversary for leading Microsoft to embrace open source code after many years of opposition. He asked Smith how they could join forces today.

Smith suggested they could advocate for open data “because the whole technology field has really evolved from being about code to now being about code and data. . . . I do believe [open data] is something that’s badly needed to equalize the playing field. I think it speaks to the economic concerns of large companies. I think it also speaks to a strategy that’s needed for overall U.S. competitiveness.”


Ensuring Human Rights

Nina Gardner ’86, adjunct professor at Johns Hopkins School of Advanced International Studies and director of Strategy International, asked Smith about the due diligence needed to ensure that new products or services—such as facial recognition technology—don’t adversely impact human rights. He responded by saying it’s become essential to think about the confluence of human rights issues and technology trends that “give governments the potential to surveil citizens on the streets and in their homes and the risks of putting data centers in a country that does not respect human rights.”

Smith is a longtime supporter of human rights programs at Columbia Law School. In 2017, he and his wife, Kathy Surace-Smith ’84, pledged $1.25 million to support the Human Rights Clinic. The couple previously established the Smith Family Opportunity Scholarship for students from countries that are rarely represented at the Law School, and they are co-chairs of The Campaign for Columbia Law. Recently, Microsoft developed an app for the groundbreaking TrialWatch program, a collaboration between the Law School’s Human Rights Institute and the Clooney Foundation for Justice, which trains and sends monitors to observe legal proceedings in nations where human rights may be at risk.

However, Smith doesn’t claim there are easy solutions for safeguarding people’s liberties and fundamental freedoms: “There are going to be very hard questions that companies and then ultimately governments are going to have to address about this new world and how we’re going to do everything we can, I believe, to avoid the future that George Orwell warned about in his book 1984.”



# # #

Published October 9, 2019
