On Tuesday, the Tsai Center for Innovative Thinking at Yale hosted a conversation at the School of Management with venture capitalist Roger McNamee ’80, an early investor in Facebook and author of “Zucked: Waking Up to the Facebook Catastrophe.”

In his talk, McNamee discussed how his view of Facebook changed in the aftermath of the 2016 election upon witnessing what he called “the failure of Facebook senior executives to take responsibility for the exploitation of their platform for unethical pursuits involving private user data.” The event attracted around 120 attendees.

McNamee was an early mentor to current Facebook CEO Mark Zuckerberg and even played a role in recruiting Sheryl Sandberg as Facebook’s COO. But he came to believe that the trust he put in Facebook was misplaced, McNamee said.

“Basically all the business plans [in Silicon Valley] at that time were sociopathic. I thought Mark was different. But he wasn’t. I think I wanted to believe that Mark was still the right guy for the opportunity,” he said.

McNamee explained various ways in which tech companies such as Google and Facebook manipulate users by using their data. The examples he cited included selling the data of African American activists who expressed interest in the Black Lives Matter movement to police departments.

In addition, McNamee voiced concerns about how Facebook’s algorithms are designed to favor the dissemination of the most polarizing content. According to McNamee, engineers at companies such as Facebook aim to maximize the time users spend on the platform. As a result, the engineers’ algorithms surface the content that engages users the most — which is often content that triggers public outrage.

“It is easy to provoke [someone] with controversial issues like climate change or anti-Semitism. The whole system is designed to capture hate speech because it gets to the essence of better behavioral prediction,” he said.

According to Yale psychology professor Molly Crockett, who moderated the conversation, psychology research suggests that social media amplifies moral outrage. In response to McNamee’s point that users feel a “herd benefit” to sharing outrage, Crockett added that there is also an individual benefit to sharing outrage since it can earn that individual trust within their community.

Still, McNamee’s concerns were not exclusive to Facebook.

“Google terrifies me so much more,” he said.

According to McNamee, Google has surveilled user emails and messages under the guise of “targeted advertisement.” For Google, analyzing user emails is a particularly effective method of behavioral prediction because users communicate a great deal of information about future activities through email.

The use of “targeted advertisement” in the conventional sense — advertisements that analyze information about a particular user’s interests and past purchases to show customized ads that align with the user’s interests — is fairly uncontroversial among legislators and users, according to McNamee. The problem emerges when this data is used beyond advertisements, transforming into much more intimate and large-scale psychological manipulation.

McNamee said that Google has experimented with its capacity to influence user behavior in ways that some people may perceive as “frightening.” When Google started acquiring massive quantities of information about the location of users through Google Maps, the company ran behavioral modification experiments to see if they could alter a user’s behavior through different algorithms.

For example, given users’ locations, Google executives tested whether they could steer a user headed to a particular Starbucks toward a different Starbucks three blocks away by offering a negligible discount. Users behaved exactly as intended, McNamee said. McNamee expressed concern that this level of control over consumers could erode consumer choice, posing a threat of monopolization.

In another example, McNamee explained that when Pokemon Go was launched and achieved immense popularity, Google experimented to see whether it could manipulate users into going to certain places they would not normally go.

“People are playing Pokemon Go. A Google algorithm would place a Pokemon across the side of a fence, and you would find people going over the fence into other people’s property to get the Pokemon,” he said.

“That sounds like an episode of Black Mirror,” Crockett added.

Nevertheless, McNamee said that he is “immensely optimistic” about potential changes in legislation regarding data protection as a result of increased civic engagement. He said that the key to solving data privacy problems is reducing the influence of technology companies such as Google and Facebook through legislation, civic engagement and conversations at universities, conferences and the workplace.

“Universities serve an important role in facilitating conversations on important issues and considering a range of views. I was glad to contribute to this discussion on technology and society,” Crockett said.

McNamee emphasized the importance of data protection legislation, which would act similarly to antitrust laws — promoting fair competition for the benefit of consumers. Under such legal constraints, tech companies would have to develop new business models that ensure fairer competition, according to McNamee.

“We can reproduce current businesses with good business models,” he said.

Tech executives have begun publicly responding to these concerns. Last month, Zuckerberg expressed support for data regulation in an opinion article for the Washington Post, saying that “we need a more active role for governments and regulators.”

According to an official Facebook investor relations report, Facebook had 2.32 billion active users as of December 2018.

Viola Lee | kyounga.lee@yale.edu