NewsGuard Technologies, a new company co-founded by Steven Brill ’72 LAW ’75 and Gordon Crovitz LAW ’86, recently launched its first product — a web browser extension that flags unreliable news sources and misinformation posted online.

Brill and Crovitz founded the company in March with the aim of restoring trust and accountability in journalism. Last month, NewsGuard launched its free web browser extension, which rates websites that appear in searches based on their reliability. The company employs experienced journalists — rather than a computer-based algorithm — to assess the credibility and transparency of each website before assigning a rating.

“The goal is to take advantage of the fact that, every once in a while, human intelligence is better than artificial intelligence,” Brill said in an interview with the News. “We are using journalists to fight unreliable news and disinformation by applying nine basic journalistic criteria to the thousands of websites responsible for 98 percent of the news and information that is shared online in the United States.”

The NewsGuard browser plug-in displays reliability ratings through a shield icon that appears next to articles on social media and news websites. A green shield indicates that a source upholds NewsGuard's basic standards of accuracy and accountability, while a red shield designates a source that fails to meet them. The plug-in is available for installation on Google Chrome, Microsoft Edge and Mozilla Firefox.

By hovering over the colored shield, users can also access NewsGuard’s “nutrition labels,” which offer thorough explanations of each news site’s ownership and financing, content, credibility, transparency and history. The nutrition labels display NewsGuard’s nine weighted criteria — ranging from the responsible presentation of information to the use of deceptive headlines — that are used to evaluate each website.

Brill emphasized that NewsGuard is human-driven rather than algorithm-based. A team of trained analysts and experienced journalists — seven of whom have attended or are attending Yale College — meticulously reviews each website before drafting a nutrition label. And each nutrition label is reviewed by at least two senior editors. The News’ current editor in chief, Rachel Treisman, worked for NewsGuard over the summer.

Multiple companies have emerged over the past year with the aim of combating online misinformation. Most notably, Facebook recently announced that it will use artificial intelligence to identify unreliable articles and to prevent its users from accessing them.

“The whole point of what we’re trying to do is to use intelligent humans,” Brill said. “Artificial intelligence is very good at identifying websites with hate speech or pornography, … but the whole idea of fake news is to make it look like real news. Spotting it takes reporters.”

Since NewsGuard launched its first product, some users have taken to Twitter to express their disapproval of the company’s method of rating websites. One user with the handle @SnowyEvergreen lamented that the company’s employees could characterize a website as “fake” just because they disagreed with its content.

“What criteria [does NewsGuard] use to judge the difference between #FakeNews and #Truth, #false and #reality?” the user asked.

But Anna-Sophie Harling ’16, NewsGuard’s vice president of business development, maintained that the company’s ratings are objective and unbiased, partly because every review is subjected to multiple rounds of drafting and editing by experienced journalists and senior editors. She added that “red ratings are spread equally across the political spectrum,” which may serve as evidence that NewsGuard does not lean one way politically.

Harling stressed that the rating criteria, as well as the names of the writers and editors responsible for the rating, are all clearly stated on each nutrition label. This transparency not only gives NewsGuard more credibility, Harling argued, but also differentiates the company from algorithms that leave social media users wondering what causes certain articles to appear on their feeds.

“What really separates NewsGuard from its competitors is that it’s not just an alternative to algorithms — it’s an alternative to censorship,” Harling told the News. “Rather than removing websites from a person’s feed or search results, the browser extension gives them information about whether the information it publishes is credible by basic journalistic standards. We don’t tell people what to read, or what not to read. Instead, we tell people who’s behind the news they’re reading and let them decide whether to keep reading.”

NewsGuard is working with Microsoft as part of the tech company’s Defending Democracy Program, which aims to combat cyber-enabled interference in democratic countries. Though the company offers its browser extension free of charge, it plans to make a profit by licensing its product to social media sites. NewsGuard is also seeking to affect the economic underpinning of news sites by selling a product that will allow advertising agencies to place their advertisements only on reliable sites, which will help secure more ads for legitimate news organizations.

According to Brill, 10 to 15 percent of sites shared online do not meet NewsGuard’s basic standards of reliability and transparency.

Lorenzo Arvanitis | lorenzo.arvanitis@yale.edu
