Richard Gingras on The Open Mind: News, Disinformation, and Free Expression

HEFFNER: I'm Alexander Heffner, your host on The Open Mind. On the day of this recording, our guest, distinguished public digital and new media executive, Newsgeist organizer, and Google News Vice President Richard Gingras, announced a $25 million investment to expand support for the news ecosystem's long-term success in online video, specifically to improve the news experience on YouTube, including features to give viewers more context on issues that have been subject to misinformation. Richard oversees pages that connect more than a billion unique readers each week to articles from journalists in 72 countries and 45 languages, and we're so grateful that you joined us here today. Thank you, Richard.

GINGRAS: Thank you. It's a pleasure and an honor.

HEFFNER: You just came from an event where
you told journalists and your fellow Googlers that you wanted to enshrine Google News and
information that's disseminated via Google, and to protect it from disinformation and misinformation
by investing those dollars. There is an urgency because we are told by
scholars like Zeynep Tufekci and others that the algorithms of Google, of YouTube, are
engendering cycles of disinformation or misinformation. How are we gonna address this fundamental problem?

GINGRAS: Well, you know, there are obviously many challenges, I think, in addressing our very new world today, and I think it's important to set that stage. We live in a dramatically different world. We as human beings consume news and information
in dramatically different ways. Interestingly, we consume more news and information than ever before, from many, many more sources, right? I mean the Internet in effect put the printing press in everyone's hands. There's more knowledge and information available than has ever been the case in the history of civilization. At the same time, of course, as is generally
the case with free expression, there is also bad information out there, what people might refer to as misinformation and disinformation. So clearly it's in our interest, I think it's in all of our interests for that matter, to see how we can address the evolving ecosystem of news and, specifically from Google's perspective, you know, what we can do from our position in the ecosystem, indeed as an operator of extremely popular services like Google Search, Google News, and YouTube, to use our capabilities and know-how to help put the ecosystem in a better place.

HEFFNER: And what is the central challenge right now in protecting the integrity of fact on YouTube and Google more broadly?

GINGRAS: Well, in fact, I think there are
many dimensions to that as well. You know, it starts, frankly, with efforts to help educate our own populations about how to make better determinations about what they trust and don't trust. Media literacy is more important than it's
ever been, and not just in schools. Right? That's an important element. How can we at Google, with Google Search and Google News, continue our best efforts to make sure that we're not surfacing or recommending information that is inaccurate or not to be trusted? But there are challenges there too,
which we could get into. Google Search is a search tool after all,
it’s designed to help you find information including the information in the darker corners
of the web. But the third point is this: I have long believed in the precept that the best way to fight bad information is with more good information. So a big part of our efforts with the Google News Initiative, one component of which was what we announced this morning with YouTube, is how do we enable the ecosystem itself, news organizations, both legacy news organizations as well as new digital news organizations, to evolve what they do to produce good quality journalism, such that we as a society, and Google as part of its results in search, can have that information available to users to help them be in a more informed place.

HEFFNER: And how can we help them?

GINGRAS: We can help them in many ways. I think, frankly, and again on all of those
dimensions, we can help with media literacy. We can help our institutions. Each of our institutions, I think, deserves at this point a reconsideration of its role in today's broad information society, right. As you and I have discussed, I think the most significant question facing us today, and I say us in the global sense, is how do open
societies and open democracies continue to thrive in an environment of unfettered free
expression, right? We've never had truly unfettered free expression, and, as I said, a printing press in everyone's hands. That's a remarkable thing. I don't think any of us would want to wind back the clock on that. We put the First Amendment in everyone's hands. It's what allowed the World Wide Web to become what it is, to be such an extraordinary resource of information for people around the world, on subjects ranging from news to medical information, health information, and so on and so forth, right? That's extraordinary. But in consequence of that, in consequence
of the fact that people now can find information to suit their own perspectives, to suit their own biases, I think it is ultimately upon us all as institutions, whether they be technology companies like Google or news institutions or governmental institutions, to think about how we evolve what we do to address these challenges. How do news organizations evolve their adherence to the norms of journalism to help people understand what is indeed fact-based information versus opinion, versus perspective, right? How do we at Google Search and Google News do our best to sort through that, so that we can give our users, as I like to put it, the tools and information they need to develop their own critical thinking about a subject and form what I hope will be their more informed, thoughtful conclusions about it, and do that in an assiduously apolitical way? These are not simple questions.

HEFFNER: In Google News, you want verified
sources. And that continues to be the aspiration: that you are discriminating against all the other pieces of information, so that if you Google a particular term, person, or idea, and you click News instead of a more wide-ranging search, it will give you the news. And isn't that really important to preserve?

GINGRAS: There are many things that are important
to preserve, as you point out. You know, we live in a society in the United States that has, constitutionally, an extraordinarily well-crafted principle of free expression in the First Amendment, which I would say is globally at the far extreme in terms of free expression. There are many countries, obviously, that constrain expression significantly. So in that regard, we're at the extreme, which also means that we're very accommodating of information that all of us, in our own ways, are uncomfortable with. In fact, you know, if you believe in the First Amendment, then you have to accept the fact that there will be expression you don't like.

HEFFNER: America has a distinctive climate
that protects constitutionally authorized speech. In effect, most anything is authorized from the user's perspective. It can be most counterproductive to society, and still it's something that you may find in a Google search. But do we want those dark corners of the web to be highlighted in News?

GINGRAS: Well, yes. As you point out, the First Amendment in truth guarantees even unauthorized speech, which means yes, we will
have bad speech. I think in terms of, you know, obviously when
people come to Google Search or come to Google News, they’re looking for what they hope will
be the right answer, you know, and in many cases we know the right answer. If you ask us how tall Theresa May is, we will come back and tell you she's five feet nine inches tall, but obviously for so many questions and so many issues, there is no singular right answer. And so we see it as our role, again, how do
we give people multiple perspectives, multiple sources of expression, so they can come to
their own opinion. What we're very cautious of is two things. We want to, on the one hand, do as great a job as we can at surfacing authoritative information from authoritative sources, right, with each one of those words chosen particularly. We want to do that. At the same time, as I pointed out, it is a search engine, for instance. You should be able to find even the bad stuff
and often you can. You know, the example I sometimes use is, if you do a query for peach pits as a cancer cure, then you will find documents out on the web that say, yeah, maybe it can be, in fact maybe some sites that want to sell you the powder, because, guess what, there don't happen to be recent, fresh articles from the New York Times or the New England Journal of Medicine saying, oh, maybe it doesn't, right? You will find this stuff. We're also cautious in recognizing that, in serving that role of being a search engine and in serving that role of identifying authoritative
information, we also don't want to be the ultimate determiners of what is acceptable or unacceptable free expression. Right? I think that's a very, very important distinction for us to make: do our best to surface the best possible information that's out there
as best we can determine, but allow you to find anything.

HEFFNER: But you were saying to me before that you visited countries, Singapore and Mexico, that each have their own character of speech and discourse that they seek to preserve. Mustn't we too preserve a character in our discourse, one that is not bigoted, that is not marketing or monetizing disinformation, bigotry, hate?

GINGRAS: Well, I think we as a society, obviously
I would hope that we would strive towards a society with cultural norms like those you suggest. But here too, I want to be cautious and clear that it's not the role of any particular tech company, or Google, to decide what the societal norms are or are not. You know, actually, when I talk about Google
search, the way I phrase it around the world is find anything that’s findable in the corpus
of legal expression because in different parts of the world, what’s illegal is different,
right? In Germany for instance, suggesting that the
Holocaust did not happen is against the law, right? We obviously respect that in Google search
results. In the United States the word truth isn’t
in the First Amendment, right? So here too, we're very careful about how we
put our thumb on the scale to determine what is appropriate or inappropriate for citizens
to find and consume from the information online. But there are many dimensions to this. You mentioned monetizing content. Yes. We have, for instance, ad platforms that are used by 2 million publishers around the world. We do make our best efforts to make sure those tools aren't being used by producers of content who are misrepresenting themselves or misrepresenting content as being something that it's not. So we can, in different parts of our business, operate in different ways to try to have an effective influence on the ecosystem. But at core, it really is appropriate for it to reflect society as it is, right?

HEFFNER: And so the argument would be that
in order for American humanity to be reflected, as in the German example, the law is what you ultimately will heed as it relates to free expression, and if there is a legal statute that is ratified that bans certain speech, whether it's the KKK or any particular hate group, then that speech would not be authorized in Google, and Google News would heed that accordingly. So it takes proactive steps from society to then reflect what may or may not be permissible on your platform.

GINGRAS: That's true. And of course that always gets into the questions for a society, for an individual society, our own, for instance. Again, the First Amendment
here is very broad, right? Hate speech, for instance, there's a very, very high bar for what is considered hate speech in the United States. So theoretically one could attempt to pass laws to constrain free expression. A lot of people would obviously argue with that because they would fear it's a slippery slope. What do we decide is acceptable versus not?

HEFFNER: To that end, how can your current project expand on, sort of, the better angels of our discourse in providing people context? You see now on Twitter and Facebook and on Google platforms that Wikipedia, for instance, is integrated, so you have a more reliable stream of information, and you can also see whether that indexed Google outlet is, in effect, verified, beyond what we perceive as really the important verifications that were needed during the 2016 campaign and were absent. What next?

GINGRAS: Well, I think, as you point out, you
know, many of our efforts have been how can we collaborate with the industry, with the
news industry, with the journalism community to move things forward. You know, one key effort that I was engaged in founding is an effort called the Trust Project, run by a brilliant woman by the name of Sally Lehrman. And it's an effort of the journalism community to basically reconsider the norms and how those norms are presented to users. It asks the basic question: in an information world as chaotic as ours, should I have a better sense of why this piece of information should be deemed credible? Would it not be helpful for me to have a better sense from a news organization as to whether this is an opinion piece or fact-based coverage? What do we know about the author? What about the author would help me get comfortable
that they might know what they know? Expertise matters; expertise signals authority. So we do think that there are institutional steps that can be taken, as institutions, organically, to better address what journalism is and how it presents itself, and thus, obviously, from Google's perspective, allow us to do a better job of understanding what is fact-based, what's
opinion, how do we present that to users such that they can have a better understanding
of what it is they're consuming.

HEFFNER: And how do we transcend from users to citizens in the engagement, and how do we do that?

GINGRAS: Well, to me it comes back down to journalism again, and our role certainly in the ecosystem as well, but you know, there are many definitions of journalism. My favorite definition, and my personal definition, of journalism is: how do we give citizens the tools and information they need to be good citizens, right? How do we give them the knowledge and enlightenment to go to the polls and make good judgments? And there's a lot more we can do, right? I feel there's so much more we can do in reinventing
and rethinking what journalism is. I’ll give you an example. Data journalism I think has enormous potential
to help us have a better sense of context about stories. Right? Too often today, you know, I'll give you an example. Last year we had the unfortunate attack on the British Parliament, right? Our cable news networks here went wall to wall for two, three days in their coverage of this event. A sad event; four people died. On each of those days in the United States, there were mass murders of four or more people that didn't get covered. How do we give our citizens a sense of context
about what’s important and what’s not? How do we give people in our communities,
for instance, an understanding of the key metrics of their communities beyond the weather, for them to understand to what extent crime is an issue, or graduation from schools an issue, or air quality an issue, or housing costs an issue, so that when they go to the polls, they're going to the polls informed and ready to vote about issues that really matter to their community, not based on perspectives that were driven by fears.

HEFFNER: Well, those salient details, it is in your discretion and due diligence to elevate them for Google readers.

GINGRAS: And I would agree. And I would love to do so.

HEFFNER: You are doing so.

GINGRAS: We are, and we strive to do more. And part of it, again, is how do we simply
evolve and I say we, the community of journalists, the publishing news organizations and so on,
evolve their own practices. How do they get more data into their coverage so that when they cover an incident, they actually give you the context that says this is not a one-time thing, this happens a lot, it's an issue we should consider, or, by the way, this is an anomaly, right? I mean, news by definition tends to cover anomalistic events, right; they're notable because they're anomalistic events. But data and statistics can help us get a sense of whether it's an anomaly or not. Do I need to be concerned or not?

HEFFNER: In terms of engagement, Jay Rosen
says the most important words to a journalist or to a reader are “help me investigate”
and I think that's a piece of this too, so that Google and those other social networks that are the aggregators, that are the hosts of this information, are not viewed as a non-engaging party but are interactive with readers and citizens.

GINGRAS: I think that's very true. And I think even Jay sometimes uses that term in different ways. "Help me investigate" can mean help the journalist investigate the problem, because that too can be a factor: is this a problem in your community? Help us understand its true nature. But it also can mean, how does Google, in representing that corpus of news information, give users, again, the tools they need to investigate and understand an issue?

HEFFNER: And what do you find to be the unifying
need on the part of journalists in Singapore, Mexico, the Scandinavian countries that you visited recently? Is there an overwhelming unified need, given that these tech platforms have in effect co-opted the news industry, or at least are the hosts of the news content?

GINGRAS: Well, I would disagree with the notion of co-opted, but I think the key thing is to understand how dramatically the world has changed, why it's changed, and how one might respond to that, right? How have the business models changed? Why have the business models changed? How have information consumption practices changed, and therefore how do I need to think about how I present information to people going forward? As I said, these are culturally significant
impacts that we're seeing, and we can't address them until we understand them. That to me continues to be the biggest challenge. We're 25 years into the Internet and our level of understanding is still significantly low. Not surprisingly, right. I mean, there was a sociologist in the early fifties who surfaced the notion: he said that with any technological change, the inventors of that technology had a particular purpose in mind, but there are often secondary consequences, always secondary consequences. Then he said there is always a cultural lag in our understanding of the impact of technology, and that certainly has been the case with the Internet, right? We're experiencing that cultural lag between the idea of putting the printing press in everyone's hands and the true impact on our society, in both positive and negative ways. How it changes marketplaces for information,
how it changes marketplaces for ideas.

HEFFNER: I want to return to the central issue
we started with, which was the algorithms that do produce a vicious cycle sometimes on YouTube of misinformation and sometimes hate-mongering or bigotry associated with particular users of YouTube. There's a campaign, Sleeping Giants, that wants to make bigotry less profitable and is petitioning Google and YouTube every day for certain accounts to be removed. Knowing that you stipulated what Google's position is, which is much like the position that Jack Dorsey has taken at Twitter, in that climate, when we feel like the commenters on news stories have hijacked the discourse so that anti-Semitism or bigotry have equal weight to pro-social ideals, tolerance, understanding, how can our audience, and how can you, address the problem so that we can have the unfettered expression but not feel as though hate is monopolizing the content?

GINGRAS: I think you start off with simply
recognizing the challenge, and recognizing, by the way, that it's a very, very complex challenge. The algorithm, you mentioned the algorithm on YouTube. Yes, it will look to satisfy your interest as a woodworker. I look at a woodworking video. Guess what? It's going to recommend more woodworking videos, right? It's part of the nature of what YouTube is, and clearly that can happen with controversial content as well. And here too, we try our best to address that. We try to make sure that people on YouTube are satisfying YouTube's policies, while also being careful not to exercise a particularly heavy hand in determining what free expression is or is not. There are, interestingly, troubling secondary
consequences. Right? In the last year or so there have been these controversies about controversial content on YouTube, right? Many major brands said, we don't want to advertise against controversial content. Well, guess what? Controversial content includes people sharing videos about transgender rights, about human rights, about all kinds of powerful issues that are also, in the minds of others, controversial, and they don't get funded either, because the big advertiser says, I don't want my ads next to controversial content. So these are very, very tricky challenges, because there are always, as I mentioned before, secondary consequences: how do you look to theoretically address this perceived ill behavior and not untowardly address other forms of behavior that some may or may not think are ill behavior as well?

HEFFNER: Richard, I think of Martin Luther
King Jr. The internet is vast, and you have, I think, a really essential role in bending the internet towards justice, not barring speech, but bending the internet towards justice. Can we together embark on that mission? Is that…

GINGRAS: I sure hope so, and honestly it's the mission I'm on. It's a mission we're on, and for good reason. By the way, some people ask, why does Google do this, right? Do you just want to make friends with the publishing industry or keep your critics from criticizing you? Actually, no. I mean, that's not a bad thing to accomplish, by the way, but if you think about our business: people talk about platforms today, and it's a dangerous word because platforms are very different. For Google specifically, our platform is the open web. The value of Google Search would diminish to the extent there was not a rich, knowledgeable ecosystem called the web. Our ad technologies would not be as successful as they are if publishers didn't find success on the web, so we have an intrinsic business interest in making sure that the open web continues to thrive and be successful. I'm optimistic about that. I'm optimistic about the future of journalism,
the future of news, the future of open societies, but only if we all step forward in our own ways and recognize the challenges and address them: with evolving journalistic norms, with our own evolving technological approaches to how we address these issues, right, and with our own, hopefully, governmental and political wisdom to be careful about the extent to which we use regulation to impose on these problems, right? The biggest concern is with the whole notion of fake news, and don't get me wrong, fake news is not a good thing. Misinformation is not a good thing. But in too many places around the world right now, fake news is simply a very, very attractive pawn for some politicians to take steps towards constraining free expression, right? There are a lot of people out there looking to do very good things in the policy arena, but there are also some who would just prefer to say, well, maybe we should constrain it. You know, I don't like those independent journalists, those bloggers over there who are constantly criticizing me. Right? It requires, I think, really thoughtful judgment on all our parts: technologically, journalistically, and in the public policy sphere. If we take that wisdom forward, I think we'll
be okay.

HEFFNER: I think about sunlight as that disinfectant, that the gods are watching us as we decide, with our due diligence, like I said before, how we respond.

GINGRAS: Thank you.

HEFFNER: And thanks to you in the audience. I hope you join us again next time for a thoughtful excursion into the world of ideas. Until then, keep an open mind. Please visit The Open Mind website at Thirteen.org/OpenMind to view this program online or to access over 1,500 other interviews, and do check us out on Twitter and Facebook @OpenMindTV for updates on future programming.
