Facebook CEO Mark Zuckerberg testifies before Congress on data scandal

GRASSLEY: The Committees on the Judiciary and Commerce, Science, and Transportation will come to order. We welcome everyone to today’s hearing on Facebook’s social media privacy and the use and abuse of data. Although not unprecedented, this is a unique hearing. The issues we will consider
range from data privacy and security to consumer protection
and the Federal Trade Commission enforcement touching on
jurisdictions of these two committees. We have 44 members
between our two committees. That may not seem like a large
group by Facebook standards … (LAUGHTER) … but it is significant here for
a hearing in the United States Senate. We will do our best to keep
things moving efficiently given our circumstances. We will begin with opening
statements from the chairmen and ranking members
of each committee, starting with Chairman
Thune, and then proceed to Mr. Zuckerberg’s opening statement. We will then move
onto questioning. Each member will have five
minutes to question witnesses. I’d like to remind the members
of both committees that time limits will be and must be
strictly enforced given the numbers that we have here today. If you’re over your time,
Chairman Thune and I will make sure to let you know. There will not be a
second round, either. Of course there will be
the usual follow-up written questions for the record. Questioning will alternate
between majority and minority and between committees. We will proceed in order
based on respective committee seniority. We will anticipate a couple
short breaks later in the afternoon. And so it’s my pleasure to
recognize the chairman of the Commerce
Committee, Chairman Thune, for his opening statement. SEN. JOHN THUNE (R-S.D.):
Thank you, Chairman Grassley. Today’s hearing
is extraordinary. It’s extraordinary to hold
a joint committee hearing. It’s even more extraordinary to
have a single CEO testify before nearly half of the
United States Senate. But then, Facebook is
pretty extraordinary. More than 2 billion people
use Facebook every month. 1.4 billion people
use it every day; more than the population of any
country on Earth except China, and more than four times the
population of the United States. It’s also more than 1,500 times
the population of my home state of South Dakota. Plus, roughly 45 percent of
American adults report getting at least some of
their news from Facebook. In many respects, Facebook’s
incredible reach is why we’re here today. We’re here because of
what you, Mr. Zuckerberg, have described as
a breach of trust. A quiz app used by approximately
300,000 people led to information about 87 million
Facebook users being obtained by the company Cambridge Analytica. There are plenty of questions
about the behavior of Cambridge Analytica and we expect to hold
a future hearing on Cambridge and similar firms. But as you’ve said, this is
not likely to be an isolated incident, a fact demonstrated by
Facebook’s suspension of another firm just this past weekend. THUNE: You’ve promised that when
Facebook discovers other apps that had access to
large amounts of user data, you will ban them and
tell those affected. And that’s appropriate, but it’s
unlikely to be enough for the 2 billion Facebook users. One reason that so many people
are worried about this incident is what it says about
how Facebook works. The idea that for every person
who decided to try an app, information about nearly 300
other people was scraped from your service is,
to put it mildly, disturbing. And the fact that those
87 million people may have technically consented to making
their data available doesn’t make those people
feel any better. The recent revelation that
malicious actors were able to utilize Facebook’s default
privacy settings to match email addresses and phone numbers
found on the so-called Dark Web to public Facebook profiles
potentially affecting all Facebook users only
adds fuel to the fire. What binds these two incidents
is that they don’t appear to be caused by the kind of negligence
that allows typical data breaches to happen. Instead they both appear to be
the result of people exploiting the very tools that you
created to manipulate users’ information. I know Facebook has
taken several steps, and intends to take more,
to address these issues. Nevertheless, some have warned
that the actions Facebook is taking to ensure that third
parties do not obtain data from unsuspecting
users, while necessary, will actually serve to enhance
Facebook’s own ability to market such data exclusively. Most of us understand that
whether you are using Facebook or Google or some
other online services, we are trading certain
information about ourselves for free or low-cost services. But for this model to persist,
both sides of the bargain need to know the stakes
that are involved. Right now I am not convinced
that Facebook’s users have the information that they need
to make meaningful choices. In the past, many of my
colleagues on both sides of the aisle have been willing to defer
to tech companies’ efforts to regulate themselves,
but this may be changing. Just last month, in
overwhelming bipartisan fashion, Congress voted to make it easier
for prosecutors and victims to go after websites that knowingly
facilitate sex trafficking. This should be a wake-up
call for the tech community. We want to hear
more, without delay, about what Facebook and other
companies plan to do to take greater responsibility for
what happens on their platforms. How will you
protect users’ data? How will you inform users about
the changes that you are making? And how do you intend to
proactively stop harmful conduct instead of being forced to
respond to it months or years later? Mr. Zuckerberg, in many ways
you and the company that you created, the story that you’ve
created represents the American Dream. Many are incredibly
inspired by what you’ve done. At the same time, you
have an obligation, and it’s up to you, to ensure
that that dream does not become a privacy nightmare for
the scores of people who use Facebook. This hearing is an opportunity
to speak to those who believe in Facebook and those who are
deeply skeptical about it. We are listening, America is
listening and quite possibly the world is listening, too. GRASSLEY: Thank you. Now Ranking Member Feinstein. DIANNE FEINSTEIN
(D-CALIF.): Thank you very much, Mr. Chairman. Chairman
Grassley, Chairman Thune, thank you both for
holding this hearing. Mr. Zuckerberg, thank
you for being here. You have a real opportunity this
afternoon to lead the industry and demonstrate a meaningful
commitment to protecting individual privacy. We have learned over
the past few months, and we’ve learned a
great deal that’s alarming. We’ve seen how foreign
actors are abusing social media platforms like Facebook to
interfere in elections and take millions of Americans’ personal
information without their knowledge in order to manipulate
public opinion and target individual voters. Specifically, on
February the 16th, Special Counsel Mueller
issued an indictment against the Russia-based Internet Research
Agency and 13 of its employees for interference operations
targeting the United States. Through this 37-page indictment,
we learned that the IRA ran a coordinated campaign through
470 Facebook accounts and pages. The campaign included ads and
false information to create discord and harm
Secretary Clinton’s campaign, and the content was seen by an
estimated 157 million Americans. A month later, on March
17th, news broke that Cambridge Analytica exploited the personal
information of approximately 50 million Facebook users without
their knowledge or permission. And, last week, we learned
that number was even higher: 87 million Facebook users who had
their private information taken without their consent. Specifically, using a
personality quiz he created, Professor Kogan collected the
personal information of 300,000 Facebook users, and then
collected data on millions of their friends. It appears the information
collected included everything these individuals had on
their Facebook pages and, according to some reports, even
included private direct messages between users. Professor Kogan is said to have
taken data from over 70 million Americans. It has also been reported that
he sold this data to Cambridge Analytica for $800,000. Cambridge Analytica then
took this data and created a psychological warfare tool
to influence United States elections. In fact, the CEO, Alexander
Nix, declared that Cambridge Analytica ran all
the digital campaign, the television campaign, and its
data informed all the strategy for the Trump campaign. The reporting has also
speculated that Cambridge Analytica worked with the
Internet Research Agency to help Russia identify which
American voters to target, which its – with its propaganda. I’m concerned that press reports
indicate Facebook learned about this breach in 2015, but appears
not to have taken significant steps to address
it until this year. So this hearing is
important, and I appreciate the conversation we had yesterday. And I believe that Facebook,
through your presence here today and the words
you’re about to tell us, will indicate how strongly your
industry will regulate and/or reform the platforms
that they control. FEINSTEIN: I believe this is
extraordinarily important. You lead a big company
with 27,000 employees, and we very much look
forward to your comments. Thank you, Mr. Chairman. GRASSLEY: Thank you,
Senator Feinstein. The history and growth of
Facebook mirrors that of many of our technological giants. Founded by Mr.
Zuckerberg in 2004, Facebook has exploded
over the past 14 years. Facebook currently has over 2
billion monthly active users across the world,
over 25,000 employees, and offices in 13 U.S. cities and various
other countries. Like their expanding user base,
the data collected on Facebook users has also skyrocketed. They have moved on from schools,
likes and relationship statuses. Today, Facebook has
access to data points, ranging from ads
that you’ve clicked on, events you’ve
attended and your location, based upon your mobile device. It is no secret that Facebook
makes money off this data through advertising revenue,
although many seem confused by or altogether
unaware of this fact. Facebook generates – generated
$40 billion in revenue in 2017, with about 98 percent coming
from advertising across Facebook and Instagram. Significant data collection
is also occurring at Google, Twitter, Apple, and Amazon. And even – an ever-expanding
portfolio of products and services offered by these
companies grant endless opportunities to
collect increasing amounts of information on their customers. As we get more free or
extremely low-cost services, the trade-off for the American
consumer is to provide more personal data. The potential for further
growth and innovation based on collection of data
is seemingly limitless. However, the potential for
abuse is also significant. While the contours of the
Cambridge Analytica situation are still coming to light, there
was clearly a breach of consumer trust and a likely
improper transfer of data. The Judiciary Committee
will hold a separate hearing exploring Cambridge and
other data privacy issues. More importantly, though, these
events have ignited a larger discussion on consumers’
expectations and the future of data privacy in our society. It has exposed that consumers
may not fully understand or appreciate the extent to
which their data is collected, protected,
transferred, used and misused. Data has been used in
advertising and political campaigns for decades. The amount and type
of data obtained, however, has seen a
very dramatic change. Campaigns
including Presidents Bush, Obama and Trump all used these
increasing amounts of data to focus on microtargeting and
personalization over numerous social media platforms,
and especially Facebook. In fact, Presidents – Obama’s
campaign developed an app utilizing the same Facebook
feature as Cambridge Analytica to capture the information
of not just the app’s users, but millions of their friends. GRASSLEY: The digital director
for that campaign for 2012 described the data-scraping
app as something that would, quote, “wind up being the most groundbreaking piece of technology developed for this campaign,” end of quote. So the effectiveness of these
social media tactics can be debated. But their use
over the past years, across the political spectrum,
and their increased significance cannot be ignored. Our policy towards data privacy
and security must keep pace with these changes. Data privacy should be
tethered to consumer needs and expectations. Now, at a minimum, consumers
must have the transparency necessary to make an informed
decision about whether to share their data and
how it can be used. Consumers ought to
have clearer information, not opaque policies and complex
click-through consent pages. The tech industry has an
obligation to respond to widespread and growing concerns
over data privacy and security and to restore
the public’s trust. The status quo no longer works. Moreover, Congress must
determine if and how we need to strengthen privacy standards
to ensure transparency and understanding for the billions
of consumers who utilize these products. Senator Nelson. BILL NELSON (D-FLA.):
Thank you, Mr. Chairman. Mr. Zuckerberg, good afternoon. Let me just cut to the chase. If you and other social media
companies do not get your act in order, none of us are going
to have any privacy anymore. That’s what we’re facing. We’re talking about personally
identifiable information that, if not kept by the social media
– media companies from theft, a value that we have in America,
being our personal privacy – we won’t have it anymore. It’s the advent of technology. And, of course, all
of us are part of it. From the moment that we
wake up in the morning, until we go to bed, we’re
on those handheld tablets. And online companies like
Facebook are tracking our activities and
collecting information. Facebook has a responsibility
to protect this personal information. We had a good
discussion yesterday. We went over all of this. You told me that the
company had failed to do so. It’s not the first time that
Facebook has mishandled its users’ information. The FTC found that Facebook’s
privacy policies had deceived users in the past. And, in the present case,
we recognize that Cambridge Analytica and an app developer
lied to consumers and lied to you, lied to Facebook. But did Facebook watch
over the operations? We want to know that. And why didn’t Facebook notify
87 million users that their personally identifiable
information had been taken, and it was being also used –
why were they not informed – for unauthorized political purposes? NELSON: So, only now – and I
appreciate our conversation – only now, Facebook has pledged
to inform those consumers whose accounts were compromised. I think you are genuine. I got that sense in
conversing with you. You want to do the right thing. You want to enact reforms. We want to know if
it’s going to be enough. And I hope that will be
in the answers today. Now, since we still don’t know
what Cambridge Analytica has done with this data, you
heard Chairman Thune say, as we have discussed, we want to
haul Cambridge Analytica in to answer these questions
at a separate hearing. I want to thank Chairman Thune
for working with all of us on scheduling a hearing. There’s obviously a great deal
of interest in this subject. I hope we can get to
the bottom of this. And, if Facebook and other
online companies will not or cannot fix the
privacy invasions, then we are going
to have to – we, the Congress. How can American consumers trust
folks like your company to be caretakers of their most
personal and identifiable information? And that’s the question. Thank you. GRASSLEY: Thank you, my
colleagues and Senator Nelson. Our witness today
is Mark Zuckerberg, founder, chairman, chief
executive officer of Facebook. Mr. Zuckerberg launched
Facebook February 4th, 2004, at the age of 19. And, at that time, he was a
student at Harvard University. As I mentioned previously, his
company now has over $40 billion of annual revenue
and over 2 billion monthly active users. Mr. Zuckerberg,
along with his wife, also established the Chan
Zuckerberg Initiative to further philanthropic causes. I now turn to you. Welcome to the committee,
and, whatever your statement is orally – if you
have a longer one, it’ll be included in the record. So, proceed, sir. MARK ZUCKERBERG:
Chairman Grassley, Chairman Thune,
Ranking Member Feinstein, Ranking Member Nelson and
members of the committee, we face a number of
important issues around privacy, safety and democracy. And you will rightfully have
some hard questions for me to answer. Before I talk about the steps
we’re taking to address them, I want to talk
about how we got here. Facebook is an idealistic
and optimistic company. For most of our existence, we
focused on all of the good that connecting people can do. And, as Facebook has grown,
people everywhere have gotten a powerful new tool for staying
connected to the people they love, for making their
voices heard and for building communities and businesses. Just recently, we’ve seen the
“Me Too” movement and the March for Our Lives
organized, at least in part, on Facebook. After Hurricane Harvey, people
came together to raise more than $20 million for relief. And more than 70 million
businesses – small businesses use Facebook to
create jobs and grow. But it’s clear now that we
didn’t do enough to prevent these tools from
being used for harm, as well. And that goes for fake news,
for foreign interference in elections, and hate speech,
as well as developers and data privacy. ZUCKERBERG: We didn’t take
a broad enough view of our responsibility, and
that was a big mistake. And it was my mistake. And I’m sorry. I started Facebook, I run it,
and I’m responsible for what happens here. So, now, we have to go through
our – all of our relationship with people and make sure that
we’re taking a broad enough view of our responsibility. It’s not enough to
just connect people. We have to make sure that
those connections are positive. It’s not enough to
just give people a voice. We need to make sure that people
aren’t using it to harm other people or to
spread misinformation. And it’s not enough to just
give people control over their information. We need to make sure that the
developers they share it with protect their information, too. Across the board, we have a
responsibility to not just build tools, but to make sure
that they’re used for good. It will take some time to work
through all the changes we need to make across the company, but
I’m committed to getting this right. This includes the basic
responsibility of protecting people’s information, which
we failed to do with Cambridge Analytica. So here are a few things that we
are doing to address this and to prevent it from happening again. First, we’re getting to the
bottom of exactly what Cambridge Analytica did, and
telling everyone affected. What we know now is that
Cambridge Analytica improperly accessed some information about
millions of Facebook members by buying it from an app developer. That information – this
was information that people generally share publicly
on their Facebook pages, like names and their profile
picture and the pages they follow. When we first
contacted Cambridge Analytica, they told us that they
had deleted the data. About a month ago, we heard
new reports that suggested that wasn’t true. And, now, we’re working
with governments in the U.S., the U.K. and around the world to do a
full audit of what they’ve done and to make sure they get rid
of any data they may still have. Second, to make sure no other
app developers out there are misusing data, we’re now
investigating every single app that had access to a large
amount of information in the past. And, if we find that
someone improperly used data, we’re going to ban them from
Facebook and tell everyone affected. Third, to prevent this
from ever happening again, going forward, we’re making sure
that developers can’t access as much information now. The good news here is that we
already made big changes to our platform in 2014 that would
have prevented this specific situation with Cambridge
Analytica from occurring again today. But there’s more to do, and
you can find more details on the steps we’re taking in
my written statement. My top priority has always been
our social mission of connecting people, building community
and bringing the world closer together. Advertisers and developers will
never take priority over that, as long as I am
running Facebook. I started Facebook
when I was in college. We’ve come a long
way since then. We now serve more than 2
billion people around the world. And, every day, people use our
services to stay connected with the people that
matter to them most. I believe deeply in
what we are doing. And I know that, when we address
these challenges we’ll look back and view helping people connect
and giving more people a voice as a positive
force in the world. I realize the issues we’re
talking about today aren’t just issues for Facebook
and our community. They’re issues and challenges
for all of us as Americans. Thank you for
having me here today, and I’m ready to
take your questions. GRASSLEY: I’ll
remind members that, maybe, weren’t here when I had
my opening comments that we are operating under the five-year
– the five-minute rule. And that applies to … (LAUGHTER) … those of us
who are chairing the committee, as well. GRASSLEY: I’ll start with you. Facebook handles extensive
amounts of personal data for billions of users. A significant amount of that
data is shared with third-party developers, who
utilize your platform. As of this – early this year,
you did not actively monitor whether that data was
transferred by such developers to other parties. Moreover, your policies only
prohibit transfers by developers to parties seeking to
profit from such data. Number one, besides Professor
Kogan’s transfer and now, potentially, Cubeyou, do you
know of any instances where user data was improperly transferred
to third party in breach of Facebook’s terms? If so, how many
times has that happened, and was Facebook only made aware
of that transfer by some third party? ZUCKERBERG: Mr.
Chairman, thank you. As I mentioned, we’re now
conducting a full investigation into every single app that had
a – access to a large amount of information, before we
locked down the platform to prevent developers from accessing
this information around 2014. We believe that we’re going
to be investigating many apps, tens of thousands of apps. And, if we find any
suspicious activity, we’re going to conduct a
full audit of those apps to understand how they’re using
their data and if they’re doing anything improper. If we find that they’re
doing anything improper, we’ll ban them from Facebook and
we will tell everyone affected. As for past activity, I don’t
have all the examples of apps that we’ve banned here,
but if you would like, I can have my team follow
up with you after this. GRASSLEY: Okay. Have you ever required an
audit to ensure the deletion of improperly transferred data? And, if so, how many times? ZUCKERBERG: Mr.
Chairman, yes we have. I don’t have the exact figure
on how many times we have. But, overall, the way we’ve
enforced our platform policies in the past is we have looked at
patterns of how apps have used our APIs and
accessed information, as well as looked into reports
that people have made to us about apps that might
be doing sketchy things. Going forward, we’re going to
take a more proactive position on this and do much more regular
spot checks and other reviews of apps, as well as increasing
the amount of audits that we do. And, again, I can make sure that
our team follows up with you on anything about the specific past
stats that would be interesting. GRASSLEY: I was
going to assume that, sitting here today, you have no
idea – and if I’m wrong on that, that you’re able –
you were telling me, I think, that you’re able to
supply those figures to us, at least as of this point. ZUCKERBERG: Mr. Chairman, I will
have my team follow up with you on what information we have. GRASSLEY: Okay but, right now,
you have no certainty of whether or not – how much
of that’s going on, right? Okay. Facebook collects massive
amounts of data from consumers, including content,
networks, contact lists, device information, location,
and information from third parties, yet your data policy
is only a few pages long and provides consumers with
only a few examples of what is collected and how
it might be used. The examples given
emphasize benign uses, such as “connecting with friends,” but your policy does not give any indication of more controversial uses of such data. My question: Why doesn’t
Facebook disclose to its users all the ways that data might be
used by Facebook and other third parties? And what is Facebook’s
responsibility to inform users about that information? ZUCKERBERG: Mr. Chairman, I
believe it’s important to tell people exactly how the
information that they share on Facebook is going to be used. That’s why, every single time
you go to share something on Facebook, whether it’s
a photo in Facebook, or a message – in
Messenger or WhatsApp, every single time, there’s a
control right there about who you’re going to be sharing it
with – whether it’s your friends or public or a specific group
– and you can – you can change that and control that in line. To your broader point
about the privacy policy, this gets into an – an issue
that I – I think we and others in the tech industry
have found challenging, which is that long privacy
policies are very confusing. And if you make it long and
spell out all the detail, then you’re probably going to
reduce the percent of people who read it and make it
accessible to them. So, one of the things that –
that we’ve struggled with over time is to make something
that is as simple as possible so people can understand it, as
well as giving them controls in line in the product in the
context of when they’re trying to actually use them, taking
into account that we don’t expect that most people will
want to go through and read a full legal document. GRASSLEY: Senator Nelson? NELSON: Thank you, Mr. Chairman. Yesterday when we talked, I gave
the relatively harmless example that I’m communicating with my
friends on Facebook and indicate that I love a
certain kind of chocolate. And all of a sudden I start
receiving advertisements for chocolate. What if I don’t want to receive
those commercial advertisements? So your chief operating
officer, Ms. Sandberg, suggested on the NBC “Today Show” that Facebook users who do not want their personal
information used for advertising might have to pay
for that protection. Pay for it. Are you actually considering
having Facebook users pay for you not to use the information? ZUCKERBERG: Senator, people
have a control over how their information is used in
ads in the product today. So if you want to have an
experience where your ads aren’t – aren’t targeted using all
the information that we have available, you can turn
off third-party information. What we found is that even
though some people don’t like ads, people really don’t
like ads that aren’t relevant. And while there is some
discomfort for sure with using information in
making ads more relevant, the overwhelming feedback that
we get from our community is that people would rather have us
show relevant content there than not. So we offer this control
that – that you’re referencing. Some people use it. It’s not the majority
of people on Facebook. And – and I think that that’s –
that’s a good level of control to offer. I think what Sheryl
was saying was that, in order to not run ads at all,
we would still need some sort of business model. NELSON: And that is
your business model. So I take it that – and I
used the harmless example of chocolate. But if it got into
more personal thing, communicating with friends,
and I want to cut it off, I’m going to have to pay
you in order not to send me, using my personal information,
something that I don’t want. That in essence is what I
understood Ms. Sandberg to say. Is that correct? ZUCKERBERG: Yes, senator. Although to be clear, we don’t
offer an option today for people to pay to not show ads. We think offering an
ad-supported service is the most aligned with our mission of
trying to help connect everyone in the world, because we want
to offer a free service that everyone can afford. NELSON: Okay. ZUCKERBERG: That’s the only way
that we can reach billions of people. NELSON: But – so, therefore,
you consider my personally identifiable data
the company’s data, not my data. Is that it? ZUCKERBERG: No, senator. Actually, at – the first line
of our Terms of Service says that you control and own the
information and content that you put on Facebook. NELSON: Well, the recent scandal
is obviously frustrating, not only because it
affected 87 million, but because it seems to be
part of a pattern of lax data practices by the
company, going back years. So, back in 2011, it was a
settlement with the FTC. And, now, we discover yet
another instance where the data failed to be protected. When you discovered that
Cambridge Analytica – that had fraudulently obtained
all of this information, why didn’t you
inform those 87 million? ZUCKERBERG: When we learned in
2015 that Cambridge Analytica had bought data from an app
developer on Facebook that people had shared it
with, we did take action. We took down the app, and we
demanded that both the app developer and Cambridge
Analytica delete and stop using any data that they had. They told us that they did this. In retrospect, it was clearly
a mistake to believe them … NELSON: Yes. ZUCKERBERG: … and we should have followed
up and done a full audit then. And that is not a
mistake that we will make. NELSON: Yes, you did that,
and you apologized for it. But you didn’t notify them. And do you think that you have
an ethical obligation to notify 87 million Facebook users? ZUCKERBERG: Senator, when
we heard back from Cambridge Analytica that they had told us
that they weren’t using the data and had deleted it, we
considered it a closed case. In retrospect, that
was clearly a mistake. We shouldn’t have
taken their word for it, and we’ve updated our policies
and how we’re going to operate the company to make sure that we
don’t make that mistake again. NELSON: Did
anybody notify the FTC? ZUCKERBERG: No, senator, for
the same reason – that we’d considered it a
closed – a closed case. GRASSLEY: Senator Thune. THUNE: And – and,
Mr. Zuckerberg, would you that – do
that differently today, presumably? That – in response to
Senator Nelson’s question … ZUCKERBERG: Yes. THUNE: … having to do it over? This may be your first
appearance before Congress, but it’s not the first time
that Facebook has faced tough questions about its
privacy policies. Wired Magazine recently noted
that you have a 14-year history of apologizing for ill-advised
decisions regarding user privacy, not unlike the one
that you made just now in your opening statement. After more than a decade
of promises to do better, how is today’s
apology different? And why should we trust Facebook
to make the necessary changes to ensure user privacy and give
people a clearer picture of your privacy policies? ZUCKERBERG: Thank
you, Mr. Chairman. So we have made a lot of
mistakes in running the company. I think it’s – it’s
pretty much impossible, I – I believe, to start a
company in your dorm room and then grow it to be at the scale
that we’re at now without making some mistakes. And, because our service is
about helping people connect and information, those mistakes have
been different in – in how they – we try not to make the
same mistake multiple times. But in general, a lot of the
mistakes are around how people connect to each other, just
because of the nature of the service. ZUCKERBERG: Overall, I would
say that we’re going through a broader philosophical
shift in how we approach our responsibility as a company. For the first 10 or 12
years of the company, I viewed our responsibility as
primarily building tools that, if we could put those
tools in people’s hands, then that would empower
people to do good things. What I think we’ve learned now
across a number of issues – not just data privacy, but also fake
news and foreign interference in elections – is that we need to
take a more proactive role and a broader view of
our responsibility. It’s not enough to
just build tools. We need to make sure
that they’re used for good. And that means that we need to
now take a more active view in policing the ecosystem and in
watching and kind of looking out and making sure that all of
the members in our community are using these tools in a way
that’s going to be good and healthy. So, at the end of the day, this
is going to be something where people will measure us
by our results on this. It’s not that I expect anything
that I say here today – to necessarily
change people’s view. But I’m committed to
getting this right. And I believe that,
over the coming years, once we fully work all
these solutions through, people will see
real differences. THUNE: Well – and I’m glad
that you all have gotten that message. As we discussed in
my office yesterday, the line between legitimate
political discourse and hate speech can sometimes
be hard to identify, and especially when
you’re relying on artificial intelligence and other
technologies for the initial discovery. Can you discuss what steps that
Facebook currently takes when making these evaluations, the
challenges that you face and any examples of where you may draw
the line between what is and what is not hate speech? ZUCKERBERG: Yes, Mr. Chairman. I’ll speak to hate speech, and
then I’ll talk about enforcing our content
policies more broadly. So – actually, maybe, if
– if you’re okay with it, I’ll go in the other order. So, from the beginning of the
company in 2004 – I started in my dorm room; it was
me and my roommate. We didn’t have A.I. technology that could look at
the content that people were sharing. So – so we basically had to
enforce our content policies reactively. People could share
what they wanted, and then, if someone in
the community found it to be offensive or
against our policies, they’d flag it for us, and
we’d look at it reactively. Now, increasingly,
we’re developing A.I. tools that can identify
certain classes of bad activity proactively and flag it
for our team at Facebook. By the end of this
year, by the way, we’re going to have more
than 20,000 people working on security and content review,
working across all these things. So, when content
gets flagged to us, we have those –
those people look at it. And, if it
violates our policies, then we take it down. Some problems lend
themselves more easily to A.I. solutions than others. So hate speech is
one of the hardest, because determining if
something is hate speech is very linguistically nuanced, right? It’s – you need to
understand, you know, what is a slur and what –
whether something is hateful not just in English, but the
majority of people on Facebook use it in languages that are
different across the world. Contrast that, for example, with
an area like finding terrorist propaganda, which we’ve
actually been very successful at deploying A.I. tools on already. Today, as we sit here, 99
percent of the ISIS and Al Qaida content that we
take down on Facebook, our A.I. systems flag before
any human sees it. So that’s a success in
terms of rolling out A.I. tools that can proactively
police and enforce safety across the community. Hate speech – I am
optimistic that, over a 5 to 10-year
period, we will have A.I. tools that can get into some
of the nuances – the linguistic nuances of different types of
content to be more accurate in flagging things for our systems. But, today, we’re
just not there on that. So a lot of this
is still reactive. People flag it to us. We have people look at it. We have policies to try to
make it as not subjective as possible. But, until we get
it more automated, there is a higher error
rate than I’m happy with. THUNE: Thank you … (CROSSTALK)
GRASSLEY: Senator Feinstein? FEINSTEIN: Thanks, Mr. Chairman. Mr. Zuckerberg, what is Facebook
doing to prevent foreign actors from interfering in U.S. elections? ZUCKERBERG: Thank you, senator. This is one of my top priorities
in 2018 – is to get this right. I – one of my greatest regrets
in running the company is that we were slow in identifying the
Russian information operations in 2016. We expected them to do a
number of more traditional cyber attacks, which we did identify
and notify the campaigns that they were trying
to hack into them. But we were slow at identifying
the type of – of new information operations. FEINSTEIN: When did you
identify new operations? ZUCKERBERG: It was right around
the time of the 2016 election itself. So, since then, we – 2018 is –
is an incredibly important year for elections. Not just in – with the U.S. midterms, but, around the world,
there are important elections – in India, in Brazil, in Mexico,
in Pakistan and in Hungary, that – we want to make sure
that we do everything we can to protect the integrity
of those elections. Now, I have more confidence that
we’re going to get this right, because, since
the 2016 election, there have been several
important elections around the world where we’ve
had a better record. There was the French
presidential election. There was the German election. There was the U.S. Senate Alabama
special election last year. FEINSTEIN: Explain what is
better about the record. ZUCKERBERG: So
we’ve deployed new A.I. tools that do a better job of
identifying fake accounts that may be trying to
interfere in elections or spread misinformation. And, between those
three elections, we were able to proactively
remove tens of thousands of accounts that – before
they – they could contribute significant harm. And the nature of
these attacks, though, is that, you know, there are
people in Russia whose job it is – is to try to exploit our
systems and other Internet systems, and other
systems, as well. So this is an arms race, right? I mean, they’re going to
keep on getting better at this, and we need to invest in keeping
on getting better at this, too, which is why one of things
I mentioned before is we’re going to have more
than 20,000 people, by the end of this year, working
on security and content review across the company. FEINSTEIN: Speak for a moment
about automated bots that spread disinformation. What are you doing to punish
those who exploit your platform in that regard? ZUCKERBERG: Well, you’re not
allowed to have a fake account on Facebook. Your content has
to be authentic. So we build technical tools to
try to identify when people are creating fake accounts –
especially large networks of fake accounts, like the Russians
have – in order to remove all of that content. After the 2016 election, our
top priority was protecting the integrity of other
elections around the world. But, at the same time, we had a
parallel effort to trace back to Russia the IRA activity –
the Internet Research Agency activity that was – the part of
the Russian government that – that did this
activity in – in 2016. And, just last week, we were
able to determine that a number of Russian media organizations
that were sanctioned by the Russian regulator were operated
and controlled by this Internet Research Agency. So we took the step last week –
that was a pretty big step for us – of taking down sanctioned
news organizations in Russia as part of an operation to remove
270 fake accounts and pages, part of their
broader network in Russia, that was – that was actually
not targeting international interference as much as –
sorry, let me correct that. It was primarily targeting –
spreading misinformation in Russia itself, as well as
certain Russian-speaking neighboring countries. FEINSTEIN: How many accounts of
this type have you taken down? ZUCKERBERG: Across – in
the IRA specifically, the ones that we’ve
pegged back to the IRA, we can identify the 470 in the
American elections and the 270 that we specifically went
after in Russia last week. There were many others
that our systems catch, which are more difficult to
attribute specifically to Russian intelligence, but the
number would be in the tens of thousands of fake
accounts that we remove. And I’m happy to have my
team follow up with you on more information, if
that would be helpful. FEINSTEIN: Would you, please? I think this is very important. If you knew in 2015 that
Cambridge Analytica was using the information of
Professor Kogan’s, why didn’t Facebook
ban Cambridge in 2015? Why’d you wait another … (CROSSTALK) ZUCKERBERG: Senator,
that’s a – a great question. Cambridge Analytica wasn’t
using our services in 2015, as far as we can tell. So this is – this is clearly one
of the questions that I asked our team, as soon as I learned
about this – is why – why did we wait until we found out about
the reports last month to – to ban them. It’s because, as of the time
that we learned about their activity in 2015, they
weren’t an advertiser. They weren’t running pages. So we actually
had nothing to ban. FEINSTEIN: Thank you. Thank you, Mr. Chairman. GRASSLEY: No, thank
you, Senator Feinstein. Now, Senator Hatch. SEN. ORRIN G. HATCH (R-UTAH):
Well, in my opinion, this is the most – this is the
most intense public scrutiny I’ve seen for a tech-related
hearing since the Microsoft hearing that – that I
chaired back in the late 1990s. The recent stories about
Cambridge Analytica and data mining on social media have
raised serious concerns about consumer privacy,
and, naturally, I know you understand that. At the same time, these stories
touch on the very foundation of the Internet economy and the
way the websites that drive our Internet economy make money. Some have professed themselves
shocked – shocked that companies like Facebook and Google share
user data with advertisers. Did any of these individuals
ever stop to ask themselves why Facebook and Google didn’t –
don’t change – don’t charge for access? Nothing in life is free. Everything involves trade-offs. If you want something without
having to pay money for it, you’re going to have to pay
for it in some other way, it seems to me. And that’s where –
what we’re seeing here. And these great websites that
don’t charge for access – they extract value in some other way. And there’s
nothing wrong with that, as long as they’re upfront
about what they’re doing. To my mind, the issue
here is transparency. It’s consumer choice. Do users understand what they’re
agreeing to – to when they access a website or
agree to terms of service? Are websites upfront about how
they extract value from users, or do they hide the ball? Do consumers have the
information they need to make an informed choice regarding
whether or not to visit a particular website? To my – to my mind, these are
questions that we should ask or be focusing on. Now, Mr. Zuckerberg, I remember
well your first visit to Capitol Hill, back in 2010. You spoke to the Senate
Republican High-Tech Task Force, which I chair. You said back then that
Facebook would always be free. Is that still your objective? ZUCKERBERG: Senator, yes. There will always be a version
of Facebook that is free. It is our mission to try to
help connect everyone around the world and to bring the
world closer together. In order to do that, we believe
that we need to offer a service that everyone can afford, and
we’re committed to doing that. HATCH: Well, if so, how do
you sustain a business model in which users don’t
pay for your service? ZUCKERBERG: Senator, we run ads. HATCH: I see. That’s great. Whenever a
controversy like this arises, there’s always the danger that
Congress’s response will be to step in and overregulate. Now, that’s been the
experience that I’ve had, in my 42 years here. In your view, what sorts of
legislative changes would help to solve the problems the
Cambridge Analytica story has revealed? And what sorts of legislative
changes would not help to solve this issue? ZUCKERBERG: Senator, I think
that there are a few categories of legislation that –
that make sense to consider. Around privacy specifically,
there are a few principles that I think it would be useful to
– to discuss and potentially codify into law. One is around having a simple
and practical set of – of ways that you explain what
you are doing with data. And we talked a little bit
earlier around the complexity of laying out these
long privacy policies. It’s hard to say that people
fully understand something when it’s only written out in
a long legal document. This needs – the stuff needs to
be implemented in a way where people can
actually understand it, where consumers can
– can understand it, but that can also capture
all the nuances of how these services work in a way that
doesn’t – that’s not overly restrictive on – on
providing the services. That’s one. The second is around
giving people complete control. This is the most important
principle for Facebook: Every piece of content that
you share on Facebook, you own and you have complete
control over who sees it and – and how you share it, and
you can remove it at any time. That’s why every day,
about 100 billion times a day, people come to one of our
services and either post a photo or send a message to someone,
because they know that they have that control and that who they
say it’s going to go to is going to be who sees the content. And I think that that control is
something that’s important that I think should apply
to – to every service. And the third point is – is
just around enabling innovation. Because some of the abuse cases
that – that are very sensitive, like face recognition, for
example – and I feel there’s a balance that’s extremely
important to strike here, where you obtain special consent
for sensitive features like face recognition, but don’t – but we
still need to make it so that American companies can
innovate in those areas, or else we’re going to fall
behind Chinese competitors and others around the world who
have different regimes for – for different new
features like that. GRASSLEY: Senator Cantwell? SEN. MARIA CANTWELL
(D-WASH): Thank you, Mr. Chairman. Welcome Mr. Zuckerberg. Do you know who Palantir is? ZUCKERBERG: I do. CANTWELL: Some people refer to
them as a Stanford Analytica. Do you agree? ZUCKERBERG: Senator, I
have not heard that. CANTWELL: Okay. Do you think Palantir
taught Cambridge Analytica, as press reports are
saying, how to do these tactics? ZUCKERBERG:
Senator, I do not know. CANTWELL: Do you think that
Palantir has ever scraped data from Facebook? ZUCKERBERG: Senator,
I’m not aware of that. CANTWELL: Do you think that
during the 2016 campaign, as Cambridge Analytica was
providing support to the Trump campaign under Project Alamo,
were there any Facebook people involved in that sharing of
technique and information? ZUCKERBERG: Senator, we provided
support to the Trump campaign similar to what we provide to
any advertiser or campaign who asks for it. CANTWELL: So that was a yes. Was that a yes? ZUCKERBERG: Senator, can you
repeat the specific question? I just want to make sure I get
specifically what you’re asking. CANTWELL: During
the 2016 campaign, Cambridge Analytica worked with
the Trump campaign to refine tactics. And were Facebook
employees involved in that? ZUCKERBERG: Senator, I don’t
know that our employees were involved with
Cambridge Analytica. Although I know that we did help
out the Trump campaign overall in sales support in the same way
that we do with other companies. CANTWELL: So they may have
been involved and all working together during
that time period? Maybe that’s something your
investigation will find out. ZUCKERBERG: Senator, my – I can
certainly have my team get back to you on any specifics
there that I don’t know, sitting here today. CANTWELL: Have you heard of
Total Information Awareness? Do you know what
I’m talking about? ZUCKERBERG: No, I do not. CANTWELL: Okay. Total Information
Awareness was, 2003, John Ashcroft and others trying
to do similar things to what I think is behind all of this –
geopolitical forces trying to get data and information
to influence a process. So, when I look at
Palantir and what they’re doing; and I look at WhatsApp,
which is another acquisition; and I look at where you are,
from the 2011 consent decree, and where you are
today; I am thinking, “Is this guy
outfoxing the foxes? Or is he going along with
what is a major trend in an information age, to try
to harvest information for political forces?” And
so my question to you is, do you see that
those applications, that those companies – Palantir
and even WhatsApp – are going to fall into the same situation
that you’ve just fallen into, over the last several years? ZUCKERBERG: Senator,
I’m not – I’m not sure, specifically. Overall, I – I do think that
these issues around information access are challenging. To the specifics
about those apps, I’m not really that familiar
with what Palantir does. WhatsApp collects very
little information and, I – I think, is less likely to
have the kind of issues because of the way that the
service is architected. But, certainly, I think that
these are broad issues across the tech industry. CANTWELL: Well, I guess,
given the track record – where Facebook is and why
you’re here today, I guess people would say that
they didn’t act boldly enough. And the fact that
people like John Bolton, basically, was an investor – in
a New York Times article earlier – I guess it was actually last
month – that the Bolton PAC was obsessed with how America
was becoming limp-wristed and spineless, and it wanted
research and messaging for national security issues. So the fact that, you know,
there are a lot of people who are interested in this larger
effort – and what I think my constituents want to know is,
was this discussed at your board meetings? And what are the applications
and interests that are being discussed without
putting real teeth into this? We don’t want to come back
to this situation again. I believe you
have all the talent. My question is whether you have
all the will to help us solve this problem. ZUCKERBERG: Yes, Senator. So data privacy and foreign
interference in elections are certainly topics that we have
discussed at the board meeting. These are some of the biggest
issues that the company has faced, and we feel a huge
responsibility to get these right. CANTWELL: Do you believe
European regulations should be applied here in the U.S.? ZUCKERBERG: Senator, I think
everyone in the world deserves good privacy protection. And, regardless of whether
we implement the exact same regulation, I would guess that
it would be somewhat different, because we have somewhat
different sensibilities in the U.S. as to other countries. We’re committed to rolling out
the controls and the affirmative consent and the special controls
around sensitive types of technology, like
face recognition, that are required in GDPR. We’re doing that
around the world. So I think it’s certainly worth
discussing whether we should have something
similar in the U.S. But what I would like to say
today is that we’re going to go forward and implement
that, regardless of what the regulatory outcome is. GRASSLEY: Senator Wicker? Senator Thune will chair next. Senator Wicker? SEN. ROGER WICKER
(R-MISS): Thank you, Mr. Chairman. And, Mr. Zuckerberg,
thank you for being with us. My question is
going to be, sort of, a follow-up on what
Senator Hatch was talking about. And let me agree with
basically his – his advice, that we don’t want to
overregulate (inaudible) to the point where we’re stifling
innovation and investment. I understand with regard to
suggested rules or suggested legislation, there are at
least two schools of thought out there. One would be the ISPs, the
Internet service providers, who are advocating for privacy
protections for consumers that apply to all online entities
equally across the entire Internet ecosystem. Now, Facebook is an edge
provider on the other hand. It is my understanding
that many edge providers, such as Facebook, may
not support that effort, because edge providers have
different business models than the ISPs and should not be
considered like services. So, do you think we need
consistent privacy protections for consumers across the entire
Internet ecosystem that are based on the type of consumer
information being collected, used or shared, regardless of
the entity doing the collecting, reusing or sharing? ZUCKERBERG: Senator, this
is an important question. I would
differentiate between ISPs, which I consider to be
the pipes of the Internet, and the platforms like
Facebook or Google or Twitter, YouTube that are the apps
or platforms on top of that. I think in general, the
expectations that people have of the pipes are somewhat
different from the platforms. So there might be areas
where there needs to be more regulation in one
and less in the other, but I think that there are going
to be other places where there needs to be more
regulation of the other type. Specifically,
though, on the pipes, one of the important issues that
– that I think we face and have debated is … WICKER: When you – when you
say “pipes,” you mean … ZUCKERBERG: ISPs. WICKER: … the ISPs. ZUCKERBERG: Yeah. So I know net neutrality has
been a – a hotly debated topic, and one of the reasons why I
have been out there saying that I think that should be
the case is because, you know, I look at my own story
of when I was getting started building Facebook at
Harvard, you know, I only had one
option for an ISP to use. And if I had to pay extra in
order to make it so that my app could potentially be seen
or used by other people, then – then we probably
wouldn’t be here today. WICKER: Okay, well – but we’re
talking about privacy concerns. And let me just say, we’ll –
we’ll have to follow up on this. But I think you and I agree,
this is going to be one of the major items of debate if we have
to go forward and – and do this from a governmental standpoint. Let me just move on to
another couple of items. Is it true that – as
was recently publicized, that Facebook collects the call
and text histories of its users that use Android phones? ZUCKERBERG: Senator, we have an
app called Messenger for sending messages to your
Facebook friends. And that app offers people an
option to sync their – their text messages into
the messaging app, and to make it so that – so
basically so you can have one app where it has both your texts
and – and your Facebook messages in one place. We also allow
people the option of … WICKER: You can opt
in or out of that? ZUCKERBERG: Yes. It is opt-in. WICKER: It is easy to opt out? ZUCKERBERG: It is opt-in. You – you have to affirmatively
say that you want to sync that information before
we get access to it. WICKER: Unless you –
unless you opt in, you don’t collect that
call and text history? ZUCKERBERG: That is correct. WICKER: And is that true for –
is this practice done at all with minors, or do you make an
exception there for persons aged 13 to 17? ZUCKERBERG: I do not know. We can follow up with that. WICKER: Okay, do
that – let’s do that. One other thing: There have been
reports that Facebook can track a user’s Internet
browsing activity, even after that user has logged
off of the Facebook platform. Can you confirm
whether or not this is true? ZUCKERBERG: Senator – I – I
want to make sure I get this accurate, so it would probably
be better to have my team follow up afterwards. WICKER: You don’t know? ZUCKERBERG: I know that the
– people use cookies on the Internet, and that you can
probably correlate activity between – between sessions. We do that for a
number of reasons, including security, and
including measuring ads to make sure that the ad experiences
are the most effective, which, of course,
people can opt out of. But I want to make sure
that I’m precise in my answer, so let me … WICKER: When –
well, when you get … ZUCKERBERG: … follow up with you on that. WICKER: … when you get back to me, sir,
would you also let us know how Facebook’s – discloses to its
users that engaging in this type of tracking gives
us that result? ZUCKERBERG: Yes. WICKER: And thank you very much. GRASSLEY: Thank
you, Senator Wicker. Senator Leahy’s up next. SEN. PATRICK J. LEAHY (D-VT): Thank you. Mr. Zuckerberg, I – I assume
Facebook’s been served with subpoenas from the – Special
Counsel Mueller’s office. Is that correct? ZUCKERBERG: Yes. LEAHY: Have you or anyone at
Facebook been interviewed by the Special Counsel’s Office? ZUCKERBERG: Yes. LEAHY: Have you
been interviewed … ZUCKERBERG: I have not. I – I have not. LEAHY: Others have? ZUCKERBERG: I – I believe so. And I want to be careful here,
because that – our work with the special counsel is confidential,
and I want to make sure that, in an open session, I’m not
revealing something that’s confidential. LEAHY: I understand. I just want to make clear
that you have been contacted, you have had subpoenas. ZUCKERBERG: Actually,
let me clarify that. I actually am not
aware of – of a subpoena. I believe that there may be, but
I know we’re working with them. LEAHY: Thank you. Six months ago, your general
counsel promised us that you were taking steps to prevent
Facebook from serving as what I would call an unwitting co-conspirator
in Russian interference. But these – these unverified,
divisive pages are on Facebook today. They look a lot like the
anonymous groups that Russian agents used to spread propaganda
during the 2016 election. Are you able to confirm whether
they’re Russian-created groups? Yes or no? ZUCKERBERG: Senator, are you
asking about those specifically? LEAHY: Yes. ZUCKERBERG: Senator, last week,
we actually announced a major change to our ads and pages
policies: that we will be identifying the identity of
every single advertiser … LEAHY: I’m asking
about specific ones. Do you know whether they are? ZUCKERBERG: I am not familiar
with those pieces of content specifically. LEAHY: But, if you
decided this policy a week ago, you’d be able to verify them? ZUCKERBERG: We are
working on that now. What we’re doing is we’re going
to verify the identity of any advertiser who’s running a
political or issue-related ad – this is basically what the
Honest Ads Act is proposing, and we’re following that. And we’re also going
to do that for pages. So … LEAHY: But you
can’t answer on these? ZUCKERBERG: I – I’m not familiar
with those specific cases. LEAHY: Well, will you – will you
find out the answer and get back to me? ZUCKERBERG: I’ll have
my team get back to you. I do think it’s
worth adding, though, that we’re going to do the same
verification of identity and location of admins who
are running large pages. So, that way, even if they
aren’t going to be buying ads in our system, that will make it
significantly harder for Russian interference efforts or
other inauthentic efforts … LEAHY: Well, some … ZUCKERBERG: … to try to spread misinformation
through the network. LEAHY: … it’s a fight that’s been
going on for some time, so I might say it’s about time. You know, six months ago, I
asked your general counsel about Facebook’s role as a breeding
ground for hate speech against Rohingya refugees. Recently, U.N. investigators blamed Facebook
for playing a role in inciting possible genocide in Myanmar. And there has
been genocide there. You say you use A.I. to find this. This is the type of
content I’m referring to. It calls for the death
of a Muslim journalist. Now, that threat went straight
through your detection systems, it spread very quickly, and then
it took attempt after attempt after attempt, and the
involvement of civil society groups, to get you to remove it. Why couldn’t it be
removed within 24 hours? ZUCKERBERG: Senator, what’s
happening in Myanmar is a terrible tragedy, and
we need to do more … (CROSSTALK) LEAHY: We
all agree with that. ZUCKERBERG: Okay. LEAHY: But U.N. investigators have blamed you –
blamed Facebook for playing a role in the genocide. We all agree it’s terrible. How can you dedicate,
and will you dedicate, resources to make sure such hate
speech is taken down within 24 hours? ZUCKERBERG: Yes. We’re working on this. And there are three specific
things that we’re doing. One is we’re hiring dozens of
more Burmese-language content reviewers, because hate speech
is very language-specific. It’s hard to do it without
people who speak the local language, and we need to ramp up
our effort there dramatically. Second is we’re working with
civil society in Myanmar to identify specific hate figures
so we can take down their accounts, rather than
specific pieces of content. And third is we’re standing up
a product team to do specific product changes in Myanmar and
other countries that may have similar issues in the future
to prevent this from happening. LEAHY: Senator Cruz and I
sent a letter to Apple, asking what they’re going to
do about Chinese censorship. My question, I’ll place … THUNE: That’d be great. Thank you, Senator Leahy. LEAHY: … I’ll place for the record – I
want to know what you will do about Chinese censorship,
when they come to you. THUNE: Senator Graham’s up next. SEN. LINDSEY O. GRAHAM (R-S.C.): Thank you. Are you familiar
with Andrew Bosworth? ZUCKERBERG: Yes, senator, I am. GRAHAM: He said, “So
we connect more people. Maybe someone dies in a
terrorist attack coordinated on our tools. The ugly truth is that we
believe in connecting people so deeply that anything that allows
us to connect more people, more often, is de facto
good.” Do you agree with that? ZUCKERBERG: No,
senator, I do not. And, as context, Boz wrote
that – Boz is what we call him internally – he wrote
that as an internal note. We have a lot of
discussion internally. I disagreed with it at
the time that he wrote it. If you looked at the comments
on the internal discussion … GRAHAM: Would you say … ZUCKERBERG: … the vast majority of
people internally did, too. GRAHAM: … that you did a
poor job, as a CEO, communicating your
displeasure with such thoughts? Because, if he had understood
where you – where you were at, he would have never
said it to begin with. ZUCKERBERG: Well, senator, we
try to run our company in a way where people can express
different opinions internally. GRAHAM: Well, this is an
opinion that really disturbs me. And, if somebody worked
for me that said this, I’d fire them. Who’s your biggest competitor? ZUCKERBERG: Senator, we
have a lot of competitors. GRAHAM: Who’s your biggest? ZUCKERBERG: I think the
categories of – did you want just one? I’m not sure I can give
one, but can I give a bunch? GRAHAM: Yes. ZUCKERBERG: So there are three
categories that I would focus on. One are the other tech
platforms – so Google, Apple, Amazon, Microsoft – we
overlap with them in different ways. GRAHAM: Do they do – do they
provide the same service you provide? ZUCKERBERG: In different
ways – different parts of it, yes. GRAHAM: Let me put it this way. If I buy a Ford, and
it doesn’t work well, and I don’t like it,
I can buy a Chevy. If I’m upset with Facebook,
what’s the equivalent product that I can go sign up for? ZUCKERBERG: Well, there – the
second category that I was going to talk about are … (CROSSTALK) GRAHAM: I’m not
talking about categories. I’m talking about, is there
real competition you face? Because car companies
face a lot of competition. If they make a defective
car, it gets out in the world, people stop buying that
car; they buy another one. Is there an alternative to
Facebook in the private sector? ZUCKERBERG: Yes, Senator. The average American uses eight
different apps to communicate with their friends and
stay in touch with people … (CROSSTALK) GRAHAM: Okay. Which is … ZUCKERBERG: … ranging from
texting apps, to email, to … GRAHAM: … is the same service you provide? ZUCKERBERG: Well, we provide a
number of different services. GRAHAM: Is Twitter the
same as what you do? ZUCKERBERG: It overlaps
with a portion of what we do. GRAHAM: You don’t
think you have a monopoly? ZUCKERBERG: It certainly
doesn’t feel like that to me. GRAHAM: Okay. (LAUGHTER) So it doesn’t. So, Instagram – you
bought Instagram. Why did you buy Instagram? ZUCKERBERG: Because they were
very talented app developers who were making good use of our
platform and understood our values. GRAHAM: It is a good
business decision. My point is that one way to
regulate a company is through competition, over
government regulation. Here’s the question that all
of us got to answer: What do we tell our constituents,
given what’s happened here, why we should let
you self-regulate? What would you tell
people in South Carolina, that given all of the things
we’ve just discovered here, it’s a good idea for us to rely
upon you to regulate your own business practices? ZUCKERBERG: Well, senator,
my position is not that there should be no regulation. GRAHAM: Okay. ZUCKERBERG: I think the
Internet is increasingly … (CROSSTALK) GRAHAM:
You embrace regulation? ZUCKERBERG: I think
the real question, as the Internet becomes more
important in people’s lives, is what is the right regulation,
not whether there should be or not. GRAHAM: But – but
you, as a company, welcome regulation? ZUCKERBERG: I think, if
it’s the right regulation, then yes. GRAHAM: You think the
Europeans had it right? ZUCKERBERG: I think that
they get things right. GRAHAM: Have you
ever submitted … (LAUGHTER) That’s true. So would you work with us in
terms of what regulations you think are necessary
in your industry? ZUCKERBERG: Absolutely. GRAHAM: Okay. Would you submit to us
some proposed regulations? ZUCKERBERG: Yes. And I’ll have my team
follow up with you so, that way, we can have this
discussion across the different categories where I think that
this discussion needs to happen. GRAHAM: Look forward to it. When you sign up for Facebook,
you sign up for a terms of service. Are you familiar with that? ZUCKERBERG: Yes. GRAHAM: Okay. It says, “The terms govern
your use of Facebook and the products,
features, apps, services, technologies, software we
offer – Facebook’s products or products – except where we
expressly state that separate terms, and not these,
apply.” I’m a lawyer. I have no idea what that means. But, when you look
at terms of service, this is what you get. Do you think the average
consumer understands what they’re signing up for? ZUCKERBERG: I don’t think that
the average person likely reads that whole document. GRAHAM: Yeah. ZUCKERBERG: But I think that
there are different ways that we can communicate that, and
have a responsibility to do so. GRAHAM: Do you – do you agree
with me that you better come up with different ways,
because this ain’t working? ZUCKERBERG: Well,
senator, I think, in certain areas, that is true. And I think, in other areas,
like the core part of what we do – right, if you – if
you think about – just, at the most basic level,
people come to Facebook, Instagram, WhatsApp, Messenger,
about a hundred billion times a day to share a piece of content
or a message with a specific set of people. And I think that that basic
functionality people understand, because we have the
controls in line every time, and given the volume of
– of – of the activity, and the value that people tell
us that they’re getting from that, I think that that control
in line does seem to be working fairly well. Now we can always do better, and
there are other – the services are complex, and there is
more to it than just – you know, you go and you post a photo, so
I – I – I agree that – that in many places we could do better. But I think for the
core of the service, it actually is quite clear. GRASSLEY: Thank
you, Senator Graham. Senator Klobuchar. SEN. AMY KLOBUCHAR
(D-MINN): Thank you, Mr. Chairman. Mr. Zuckerberg, I think we all
agree that what happened here was bad. You acknowledged it
was a breach of trust. And the way I explain it to my
constituents is that if someone breaks into my apartment with
a crowbar and they take my stuff, it’s just like if the
manager gave them the keys or if they didn’t have any
locks on the doors, it’s still a breach;
it’s still a break in. And I believe we need to
have laws and rules that are as sophisticated as the – the
brilliant products that you’ve developed here. And we just
haven’t done that yet. And one of the areas that
I’ve focused on is the election. And I appreciate the
support that you and Facebook, and now Twitter, actually, have
given to the Honest Ads Act bill that you mentioned, that I’m
leading with Senator McCain and Senator Warner. And I just want to be clear, as
we work to pass this law so that we have the same rules in place
to disclose political ads and issue ads as we do
for TV and radio, as well as disclaimers, that
you’re going to take early action, as soon as June I heard,
before this election so that people can view these
ads, including issue ads. Is that correct? ZUCKERBERG: That is
correct, senator. And I just want to take a moment
before I go into this in more detail to thank you for
your leadership on this. This, I think, is an important
area for the whole industry to move on. The two specific things that
we’re doing are – one is around transparency, so now you’re
going to be able to go and click on any advertiser or any page on
Facebook and see all of the ads that they’re running. So that actually brings
advertising online – on Facebook to an even higher standard than
what you would have on TV or print media, because there’s
nowhere where you can see all of the TV ads that
someone is running, for example. Whereas you will be able to
see now on Facebook whether this campaign or third party is
saying different messages to different types of people, and
I think that that’s a really important element
of transparency. But the other really important
piece is around verifying every single advertiser who’s going
to be running political or issue ads. KLOBUCHAR: I appreciate that. And Senator Warner and I have
also called on Google and the other platforms to do the same. So memo to the rest of you, we
have to get this done or we’re going to have a
patchwork of ads, and I hope that you’ll be
working with us to pass this bill. Is that right? ZUCKERBERG: We will. KLOBUCHAR: Okay, thank you. Now on the subject of
Cambridge Analytica, were these people,
the 87 million people, users, concentrated
in certain states? Are you able to figure
out where they’re from? ZUCKERBERG: I do not have
that information with me, but we can follow up
with your – your office. KLOBUCHAR: Okay,
because as we know, that election was close, and it
was only thousands of votes in certain states. You’ve also estimated that
roughly 126 million people may have been shown
content from a Facebook page associated with the
Internet Research Agency. Have you determined when –
whether any of those people were the same Facebook users whose
data was shared with Cambridge Analytica? Are you able to make
that determination? ZUCKERBERG: Senator,
we’re investigating that now. We believe that it is entirely
possible that there will be a connection there. KLOBUCHAR: Okay, that seems like
a big deal as we look back at that last election. Former Cambridge Analytica
employee Christopher Wylie has said that the data that it
improperly obtained – that Cambridge Analytica improperly
obtained from Facebook users could be stored in Russia. Do you agree that
that’s a possibility? ZUCKERBERG: Sorry, are you –
are you asking if Cambridge Analytica’s data – data
could be stored in Russia? KLOBUCHAR: That’s what he said
this weekend on a Sunday show. ZUCKERBERG: Senator, I don’t
have any specific knowledge that would suggest that. But one of the steps that we
need to take now is go do a full audit of all of
Cambridge Analytica’s systems to understand what they’re doing,
whether they still have any data, to make sure that
they remove all the data. If they don’t, we’re going to
take legal action against them to do so. That audit, we have temporarily
ceded that in order to let the U.K. government complete their
investigation first, because, of course, a government
investigation takes precedence over a company doing that. But we are committed to
completing this full audit and getting to the bottom
of what’s going on here, so that way we can have
more answers to this. KLOBUCHAR: Okay. You earlier stated publicly and
here that you would support some privacy rules so that everyone’s
playing by the same rules here. And you also said here that you
should have notified customers earlier. Would you support a rule that
would require you to notify your users of a breach
within 72 hours? ZUCKERBERG: Senator,
that makes sense to me. And I think we should have our
team follow up with – with yours to – to discuss the
details around that more. KLOBUCHAR: Thank you. I just think part of this was
when people don’t even know that their data’s been
breached, that’s a huge problem. And I also think we get to
solutions faster when we get that information out there. Thank you. And we look forward to passing
this bill – we’d love to pass it before the election
– on the honest ads. And I’m looking forward to
better disclosure this election. Thank you. THUNE: Thank you,
Senator Klobuchar. Senator Blunt’s up next. SEN. ROY BLUNT (R-MO):
Thank you, Mr. Chairman. Mr. Zuckerberg, nice to see you. When I saw you not too long
after I entered the Senate in 2011, I told you, when I sent
my business cards down to be printed, they came back from
the Senate print shop with the message that it was the
first business card they’d ever printed a Facebook address on. There are days when
I’ve regretted that, but more days when we get lots
of information that we need to get. There are days when I wonder if
“Facebook friends” is a little misstated. It doesn’t seem like I
have those every single day. But, you know, the – the
platform you’ve created is really important. And my son Charlie, who’s
13, is dedicated to Instagram. So he’d want to be sure I
mentioned him while I was here with – with you. I haven’t printed
that on my card yet, I – I will – will say that, but
I think we have that account as well. Lots of ways to connect people. And the – the
information, obviously, is an important commodity and
it’s what makes your business work. I get that. However, I wonder about some
of the collection efforts. And maybe we can go through
largely just even ìyesî and ìnoî and then we’ll get back to more
expansive discussion of this. But do you collect user data
through cross-device tracking? ZUCKERBERG: Senator, I believe
we do link people’s accounts between devices in order to make
sure that their Facebook and Instagram and their other
experiences can be synced between their devices. BLUNT: And that would
also include offline data, data that’s tracking that’s not
necessarily linked to Facebook, but linked to one – some device
they went through Facebook on, is that right? ZUCKERBERG: Senator, I want to
make sure we get this right. So I want to have my team follow
up with you on that afterwards. BLUNT: Well, now, that doesn’t
seem that complicated to me. Now, you – you understand
this better than I do, but maybe – maybe you can
explain to me why that’s that – why that’s complicated. Do you track devices that an
individual who uses Facebook has that is connected to the device
that they use for their Facebook connection, but not necessarily
connected to Facebook? ZUCKERBERG: I’m not – I’m
not sure of the answer to that question. BLUNT: Really? ZUCKERBERG: Yes. There – there may be some data
that is necessary to provide the service that we do. But I don’t – I don’t have
that on – sitting here today. So that’s something that I
would want to follow up on. BLUNT: Now, the FTC, last year,
flagged cross-device tracking as one of their
concerns – generally, that people are tracking devices
that the users of something like Facebook don’t know
they’re being tracked. How do you disclose your
collected – collection methods? Is that all in this document
that I would see and agree to before I entered into Facebook? ZUCKERBERG: Yes, senator. So there are – there are
two ways that we do this. One is we try to be exhaustive
in the legal documents, or on the terms of
service and privacy policies. But, more importantly, we try to
provide in-line controls so that – that are in plain English,
that people can understand. They can either go to settings,
or we can show them at the top of the app, periodically, so
that people understand all the controls and settings they have
and can – can configure their experience the
way that they want. BLUNT: So do people – do people
now give you permission to track specific devices
in their contract? And, if they do, is that a
relatively new addition to what you do? ZUCKERBERG: Senator, I’m sorry. I don’t have that. BLUNT: Am I able to –
am I able to opt out? Am I able to say, “It’s okay for
you to track what I’m saying on Facebook, but I don’t want you
to track what I’m texting to somebody else, off
Facebook, on an Android phone”? ZUCKERBERG: Okay. Yes, senator. In – in general, Facebook is not
collecting data from other apps that you use. There may be some specific
things about the device that you’re using that Facebook needs
to understand in order to offer the service. But, if you’re using Google or
you’re using some texting app, unless you specifically opt
in that you want to share the texting app information,
Facebook wouldn’t see that. BLUNT: Has it
always been that way? Or is that a recent addition to
how you deal with those other ways that I might communicate? ZUCKERBERG: Senator, my
understanding is that that is how the mobile operating
systems are architected. BLUNT: The – so do you – you
don’t have bundled permissions for how I can agree to
what devices I may use, that you may have contact with? Do you – do you
bundle that permission? Or am I able to, one at a –
individually say what I’m willing for you to – to watch,
and what I don’t want you to watch? And I think we might have
to take that for the record, based on everybody else’s time. THUNE: Thank you, Senator Blunt. Next up, Senator Durbin. SEN. RICHARD J. DURBIN (D-ILL):
Thanks very much, Mr. Chairman. Mr. Zuckerberg, would you be
comfortable sharing with us the name of the hotel you
stayed in last night? ZUCKERBERG: No. (LAUGHTER) DURBIN: If you
messaged anybody this week, would you share with us the
names of the people you’ve messaged? ZUCKERBERG: Senator, no. I would probably not
choose to do that publicly, here. DURBIN: I think that may be what
this is all about: your right to privacy, the limits of your
right to privacy and how much you give away in modern
America in the name of, quote, “connecting people
around the world”; a question, basically, of what information
Facebook’s collecting, who they’re sending it to and
whether they ever asked me, in advance, my
permission to do that. Is that a fair thing for the
user of Facebook to expect? ZUCKERBERG: Yes, senator. I think everyone should
have control over how their information is used. And as we’ve talked about in
some of the other questions, I think some of that is laid
out in the documents, but more importantly, you
want to give people control in the product itself. So the most important way that
this happens across our services is that every day, people come
to our services to choose to share photos or send messages,
and every single time they choose to share something, there
– they have a control right there about who they
want to share it with. But that level of control
is extremely important. DURBIN: They certainly know
within the Facebook pages who their friends are, but they may
not know as has happened – and you’ve conceded this
point in the past, that sometimes that information
is going way beyond their friends, and sometimes people
have made money off of sharing that information, correct? ZUCKERBERG: Senator, you
are referring I think to our developer platform, and it may
be useful for me to give some background on how
we set that up, if that’s useful. DURBIN: I have
three minutes left, so maybe you can do
that for the record, because I have couple other
questions I would like to ask. You have recently announced
something that is called Messenger Kids. Facebook created an app allowing
kids between the ages of 6 and 12 to send video and text
messages through Facebook as an extension of their
parent’s account. You have cartoonlike stickers,
and other features designed to appeal to little
kids – first-graders, kindergartners. On January 30th, the Campaign
for Commercial-Free Childhood and lots of other child
development organizations warned Facebook. They pointed to a wealth of
research demonstrating the excessive use of digital devices
and social media is harmful to kids, and argued that young
children simply are not ready to handle social media
accounts at age 6. In addition, there are concerns
about data that is being gathered about these kids. Now, there are
certain limits of the law, we know. There’s a Children’s
Online Privacy Protection Act. What guarantees can you give
us that no data from Messenger Kids is or will be collected
or shared with those who might violate that law? ZUCKERBERG: All right, senator,
so a number of things I think are – are important here. The background on
Messenger Kids is, we heard feedback from thousands
of parents that they want to be able to stay in touch with
their kids and call them, use apps like FaceTime when
they’re working late or not around and want to
communicate with their kids, but they want to have
complete control over that. So I think we can all agree that
if you – when your kid is 6 or 7, even if they have
access to a phone, you want to control
everyone who they can contact. And there wasn’t an app
out there that did that. So we built this
service to do that. The app collects a minimum
amount of information that is necessary to
operate the service. So, for example, the messages
that people send is something that we collect in order
to operate the service, but in general, that data is not
going to be shared with third parties, it is not connected
to the broader Facebook … DURBIN: Excuse me, as a lawyer,
I picked up on that word “in general,” the phrase “in
general.” It seems to suggest that in some circumstances
it will be shared with third parties. ZUCKERBERG: No. It will not. DURBIN: All right. Would you be open to the idea
that someone having reached adult age, having grown
up with Messenger Kids, should be allowed to delete
the data that you collected? ZUCKERBERG: Senator, yes. As a matter of fact,
when you become 13, which is our legal limit – our
limit – we don’t allow people under the age of 13 to
use Facebook – you don’t automatically go from having
a Messenger Kids account to a Facebook account. You have to start over
and get a Facebook account. So I think it’s a good idea to
consider making sure that all that information is
deleted, and in general, people are going to be starting
over when they get their – their Facebook or other accounts. DURBIN: I’ll close, because
I just have a few seconds. Illinois has a Biometric
Information Privacy Act, our state does, which is to
regulate the commercial use of facial, voice, finger and
iris scans and the like. We’re now in a
fulsome debate on that. And I’m afraid Facebook has come
down to the position of trying to carve out exceptions to that. I hope you’ll fill me in on
how that is consistent with protecting privacy. Thank you. THUNE: Thank you,
Senator Durbin. Senator Cornyn? SEN. JOHN CORNYN (R-TEX):
Thank you, Mr. Zuckerberg, for being here. I know in – up until 2014, a
mantra or motto of Facebook was move fast and break things. Is that correct? ZUCKERBERG: I don’t
know when we changed it, but the mantra is currently move
fast with stable infrastructure, which is a much
less sexy mantra. CORNYN: Sounds much more boring. But my question is, during
the time that it was Facebook’s mantra or motto to move
fast and break things, do you think some
of the misjudgments, perhaps mistakes that
you’ve admitted to here, were as a result of that
culture or that attitude, particularly as it regards
to personal privacy of the information of your subscribers? ZUCKERBERG: Senator, I do think
that we made mistakes because of that. But the broadest mistakes that
we made here are not taking a broad enough view of
our responsibility. And while that wasn’t a matter –
the ìmove fastî cultural value is more tactical around whether
engineers can ship things and – and different ways
that we operate. But I think the big mistake that
we’ve made looking back on this is viewing our responsibility
as just building tools, rather than viewing our whole
responsibility as making sure that those tools
are used for good. CORNYN: Well I – and
I appreciate that. Because previously, or
earlier in the past, we’ve been told that
platforms like Facebook, Twitter, Instagram, the
like are neutral platforms, and the people who own and run
those for profit – and I’m not criticizing doing something
for profit in this country. But they bore no
responsibility for the content. Do you agree now that Facebook
and the other social media platforms are not
neutral platforms, but bear some
responsibility for the content? ZUCKERBERG: I agree that we’re
responsible for the content, but I think that there’s – one
of the big societal questions that I think we’re going to
need to answer is the current framework that we have is
based on this reactive model, that assumed that
there weren’t A.I. tools that could
proactively tell, you know, whether something was
terrorist content or something bad, so it naturally relied on
requiring people to flag for a company, and then the company
needing to take reasonable action. In the future, we’re going to
have tools that are going to be able to identify more
types of bad content. And I think that there is –
there are moral and legal obligation questions that I
think we’ll have to wrestle with as a society about when we want
to require companies to take action proactively on
certain of those things, and when that gets
in the way of … CORNYN: I appreciate that,
I have two minutes left … ZUCKERBERG: All right. CORNYN: … to ask you questions. So you – you – interestingly,
the terms of the – what do you call it, the terms of service is
a legal document which discloses to your subscribers how their
information is going to be used, how Facebook is
going to operate. CORNYN: And – but you concede
that – you doubt everybody reads or understands that legalese,
those terms of service. So are – is that to suggest that
the consent that people give subject to that terms of
service is not informed consent? In other words,
they may not read it, and even if they read it,
they may not understand it? ZUCKERBERG: I just think we have
a broader responsibility than what the law requires. So I – what you … CORNYN: No, I’m talking – I’m
talking about – I appreciate that. What I’m asking about, in
terms of what your subscribers understand, in terms of how
their data is going to be used – but let me go to
the terms of service. Under paragraph
number two, you say, “You own all of the content
and information you post on Facebook.” That’s what
you’ve told us here today, a number of times. So, if I chose to
terminate my Facebook account, can I bar Facebook or any third
parties from using the data that I had previously supplied,
for any purpose whatsoever? ZUCKERBERG: Yes, senator. If you delete your account, we
should get rid of all of your information. CORNYN: You should? Or do you? ZUCKERBERG: We do. We do. CORNYN: How about third parties
that you have contracted with to use some of that
underlying information, perhaps to target
advertising for themselves? You can’t – do you – do you
call back that information, as well? Or does that remain
in their custody? ZUCKERBERG: Well, senator, this
is actually a very important question, and I’m glad
you brought this up, because there’s a very common
misperception about Facebook – that we sell data
to advertisers. And we do not sell
data to advertisers. We don’t sell data to anyone. CORNYN: Well, you
clearly rent it. ZUCKERBERG: What we allow is for
advertisers to tell us who they want to reach, and
then we do the placement. So, if an advertiser
comes to us and says, “All right, I am a ski shop and
I want to sell skis to women,” then we might have some
sense, because people shared skiing-related content, or said
they were interested in that, they shared
whether they’re a woman, and then we can show the ads to
the right people without that data ever changing hands
and going to the advertiser. That’s a very fundamental
part of how our model works and something that is
often misunderstood. So I’m – I appreciate
that you brought that up. THUNE: Thank you,
Senator Cornyn. We had indicated earlier on
that we would take a couple of breaks, give our
witness an opportunity. And I think we’ve
been going, now, for just under two hours. So I think what we’ll do is … (CROSSTALK) ZUCKERBERG:
You can do a few more. (LAUGHTER) THUNE: You –
you’re – you want to keep going? ZUCKERBERG: Maybe –
maybe 15 minutes. Does that work? THUNE: Okay. All right, we’ll keep going. Senator Blumenthal is up next. And we will commence. SEN. RICHARD BLUMENTHAL
(D-CONN): Thank you, Mr. Chairman. Thank you for being here
today, Mr. Zuckerberg. You have told us today – and
you’ve told the world – that Facebook was deceived by
Aleksandr Kogan when he sold user information to
Cambridge Analytica, correct? ZUCKERBERG: Yes. BLUMENTHAL: I want to show
you the terms of service that Aleksandr Kogan provided to
Facebook and note for you that, in fact, Facebook was on notice
that he could sell that user information. Have you seen these
terms of service before? ZUCKERBERG: I have not. BLUMENTHAL: Who in Facebook
was responsible for seeing those terms of service that put you
on notice that that information could be sold? ZUCKERBERG: Senator, our app
review team would be responsible for that. Had … BLUMENTHAL: Has anyone been
fired on that app review team? ZUCKERBERG: Senator,
not because of this. BLUMENTHAL: Doesn’t that term
of service conflict with the FTC order that Facebook was under at
that very time that this term of service was, in fact,
provided to Facebook? And you’ll note that the Face
– the FTC order specifically requires Facebook
to protect privacy. Isn’t there a conflict there? ZUCKERBERG: Senator, it
certainly appears that we should have been aware that this app
developer submitted a term that was in conflict with the
rules of the platform. BLUMENTHAL: Well,
what happened here was, in effect, willful blindness. It was heedless
and reckless, which, in fact, amounted to a violation
of the FTC consent decree. Would you agree? ZUCKERBERG: No, senator. My understanding is that – is
not that this was a violation of the consent decree. But as I’ve said a
number of times today, I think we need to take
a broader view of our responsibility around privacy
than just what is mandated in the current law. BLUMENTHAL: Well,
here is my reservation, Mr. Zuckerberg. And I apologize for
interrupting you, but my time is limited. We’ve seen the
apology tours before. You have refused to acknowledge
even an ethical obligation to have reported this violation
of the FTC consent decree. And we have letters – we’ve
had contacts with Facebook employees. And I am going to submit a
letter for the record from Sandy Parakilas, with your permission,
that indicates not only a lack of resources, but lack
of attention to privacy. And so, my reservation about
your testimony today is that I don’t see how you can change
your business model unless there are specific rules of the road. Your business model is to
monetize user information to maximize profit over privacy. And unless there are specific
rules and requirements enforced by an outside agency, I have no
assurance that these kinds of vague commitments are
going to produce action. So I want to ask you a couple
of very specific questions. And they are based on
legislation that I’ve offered, the MY DATA Act; legislation
that Senator Markey is introducing today,
the CONSENT Act, which I’m joining. Don’t you agree that companies
ought to be required to provide users with clear, plain
information about how their data will be used, and specific
ability to consent to the use of that information? ZUCKERBERG: Senator, I do
generally agree with what you’re saying. And I laid that out earlier
when I talked about what … BLUMENTHAL: Would you agree
to an opt-in as opposed to an opt-out? ZUCKERBERG: Senator, I think
that – that certainly makes sense to discuss. And I think the details
around this matter a lot. BLUMENTHAL: Would you agree that
users should be able to access all of their information? ZUCKERBERG: Senator, yes. Of course. BLUMENTHAL: All of the
information that you collect as a result of
purchases from data brokers, as well as tracking them? ZUCKERBERG: Senator, we
have already a ìdownload your informationî tool that allows
people to see and to take out all of the information that
Facebook – that they’ve put into Facebook or that
Facebook knows about them. So, yes, I agree with that. We already have that. BLUMENTHAL: I have a number of
other specific requests that you agree to support as
part of legislation. I think
legislation is necessary. The rules of the road have to
be the result of congressional action. We have – Facebook has
participated recently in the fight against scourge – the
scourge of sex trafficking. And a bill that we’ve just
passed – it will be signed into law tomorrow – SESTA, the Stop
Enabling Sex Traffickers Act – was the result of
our cooperation. I hope that we can cooperate on
this kind of measure as well. ZUCKERBERG: Senator, I look
forward to having my team work with you on this. THUNE: Thank you,
Senator Blumenthal. Senator Cruz. SEN. TED CRUZ (R-TEX):
Thank you Mr. Chairman. Mr. Zuckerberg, welcome. Thank you for being here. Mr. Zuckerberg, does Facebook
consider itself a neutral public forum? ZUCKERBERG: Senator, we consider
ourselves to be a platform for all ideas. CRUZ: Let me ask
the question again. Does Facebook consider itself
to be a neutral public forum, and representatives of your
company are giving conflicting answers on this? Are you a … ZUCKERBERG: Well … CRUZ: … First Amendment speaker
expressing your views, or are you a neutral public
forum allowing everyone to speak? ZUCKERBERG: Senator, here’s
how we think about this: I don’t believe that – there are certain
content that clearly we do not allow, right? Hate speech,
terrorist content, nudity, anything that makes people
feel unsafe in the community. From that perspective, that’s
why we generally try to refer to what we do as
platform for all ideas … CRUZ: Let me try this,
because the time is constrained. It’s just a simple question. The predicate for Section 230
immunity under the CDA is that you’re a neutral public forum. Do you consider yourself
a neutral public forum, or are you engaged
in political speech, which is your right
under the First Amendment? ZUCKERBERG: Well, senator, our
goal is certainly not to engage in political speech. I am not that familiar with the
specific legal language of the – the law that you –
that you speak to. So I would need to
follow up with you on that. I’m just trying to lay out
how broadly I think about this. CRUZ: Mr. Zuckerberg, I will say
there are a great many Americans who I think are deeply concerned
that Facebook and other tech companies are engaged in a
pervasive pattern of bias and political censorship. There have been numerous
instances with Facebook. In May of 2016, Gizmodo reported
that Facebook had purposely and routinely suppressed
conservative stories from trending news,
including stories about CPAC, including stories
about Mitt Romney, including stories about
the Lois Lerner IRS scandal, including stories
about Glenn Beck. In addition to that, Facebook
has initially shut down the Chick-fil-A
Appreciation Day page, has blocked a post of
a Fox News reporter, has blocked over two
dozen Catholic pages, and most recently blocked Trump
supporters Diamond and Silk’s page, with 1.2
million Facebook followers, after determining their
content and brand were, quote, “unsafe to the
community.” To a great many Americans that appears to be a
pervasive pattern of political bias. Do you agree with
that assessment? ZUCKERBERG: Senator, let me
say a few things about this. First, I understand where
that concern is coming from, because Facebook and the tech
industry are located in Silicon Valley, which is an
extremely left-leaning place, and I – this is actually a
concern that I have and that I try to root out in the company,
is making sure that we do not have any bias in
the work that we do, and I think it is a fair concern
that people would at least wonder about. Now … CRUZ: Let me – let me ask this
question: Are you aware of any ad or page that has been taken
down from Planned Parenthood? ZUCKERBERG: Senator, I’m not. But let me just … CRUZ: How about moveon.org? ZUCKERBERG: Sorry. CRUZ: How about moveon.org? ZUCKERBERG: I’m not
specifically aware of those … CRUZ: How about any
Democratic candidate for office? ZUCKERBERG: I’m not
specifically aware. I mean, I’m not sure. CRUZ: In your testimony, you say
that you have 15,000 to 20,000 people working on
security and content review. Do you know the political
orientation of those 15,000 to 20,000 people engaged
in content review? ZUCKERBERG: No, senator. We do not generally ask
people about their political orientation when
they’re joining the company. CRUZ: So as CEO, have you ever
made hiring or firing decisions based on political positions or
what candidates they supported? ZUCKERBERG: No. CRUZ: Why was
Palmer Luckey fired? ZUCKERBERG: That is a specific
personnel matter that seems like it would be
inappropriate to speak to here. CRUZ: You just made a
specific representation, that you didn’t make decisions
based on political views. Is that accurate? ZUCKERBERG: Well, I can – I can
commit that it was not because of a political view. CRUZ: Do you know, of those
15 to 20,000 people engaged in content review,
how many, if any, have ever
supported, financially, a Republican
candidate for office? ZUCKERBERG: Senator,
I do not know that. CRUZ: Your testimony says,
“It is not enough that we just connect people. We have to make sure those
connections are positive.” It says, “We have to make sure
people aren’t using their voice to hurt people or
spread misinformation. We have a responsibility,
not just to build tools, to make sure those tools are
used for good.” Mr. Zuckerberg, do you feel it’s your
responsibility to assess users, whether they are good and
positive connections or ones that those 15 to 20,000 people
deem unacceptable or deplorable? ZUCKERBERG: Senator, you’re
asking about me personally? CRUZ: Facebook. ZUCKERBERG: Senator, I think
that there are a number of things that we would all
agree are clearly bad. Foreign
interference in our elections, terrorism, self-harm. Those are things … CRUZ: I’m talking
about censorship. ZUCKERBERG: Well, I – I think
that you would probably agree that we should remove terrorist
propaganda from the service. So that, I agree. I think it is – is clearly bad
activity that we want to get down. And we’re generally proud of
– of how well we – we do with that. Now what I can say – and I – and
I do want to get this in before the end, here – is that I am – I
am very committed to making sure that Facebook is a
platform for all ideas. That is a – a very important
founding principle of – of what we do. We’re proud of the discourse and
the different ideas that people can share on the service,
and that is something that, as long as I’m
running the company, I’m going to be committed
to making sure is the case. CRUZ: Thank you. THUNE: Thank you, Senator Cruz. Do you want to break now? (LAUGHTER) Or do you
want to keep going? ZUCKERBERG: Sure. I mean, that was –
that was pretty good. So. All right. THUNE: All right. We have – Senator
Whitehouse is up next. But if you want to take a … ZUCKERBERG: Yeah. THUNE: … a five-minute break right now,
we have now been going a good two hours, so … ZUCKERBERG: Thank you. THUNE: … I will be – we’ll recess for
five minutes and reconvene. (RECESS) GRASSLEY:
We’ll come to order. (CROSSTALK) GRASSLEY: Oh, okay. I want to read this first. Before I call on
Senator Whitehouse, Senator Feinstein asked
permission to put letters and statements in the record, and
without objection they will be put in from the ACLU, the
Electronic Privacy Information Center, the Association for
Computing – Computing Machinery Public Policy Council
and Public Knowledge. Senator Whitehouse? SEN. SHELDON WHITEHOUSE
(D-RI): Thank you, Chairman. ZUCKERBERG: Thank you. Mr. Chairman, I want to correct
one thing that I said earlier in response to a
question from Senator Leahy. He had asked if – why we didn’t
ban Cambridge Analytica at the time when we learned
about them in 2015. And I answered that what my
– what my understanding was, was that they were
not on the platform, were not an app
developer or advertiser. When I went back and met
with my team afterwards, they let me know that Cambridge
Analytica actually did start as an advertiser later in 2015. So we could have in
theory banned them then. We made a mistake
by not doing so. But I just wanted to make sure
that I updated that because I – I – I misspoke, or
got that wrong earlier. GRASSLEY: (OFF-MIKE) Whitehouse? WHITEHOUSE: Thank you, Chairman. Welcome back, Mr. Zuckerberg. On the subject of bans, I just
wanted to explore a little bit what these bans mean. Obviously Facebook has been done
considerable reputational damage by its association with
Aleksandr Kogan and with Cambridge Analytica, which is
one of the reasons you’re having this enjoyable
afternoon with us. Your testimony says that
Aleksandr Kogan’s app has been banned. Has he also been banned? ZUCKERBERG: Yes, my
understanding is he has. WHITEHOUSE: So if he were to
open up another account under a different name and you were able to find
that out, that would be taken – that would be closed down?
we – we are preventing him from building any more apps. WHITEHOUSE: Does he have
a Facebook account still? ZUCKERBERG: Senator, I believe
the answer to that is no, but I can follow up
with you afterwards. WHITEHOUSE: Okay. And with respect to
Cambridge Analytica, your testimony is that first
you required them to formally certify that they had deleted
all improperly acquired data. Where did that formal
certification take place? That sounds kind of like
a quasi-official thing, to formally certify. What did that entail? ZUCKERBERG: Senator, first they
sent us an email notice from their chief data officer telling
us that they didn’t have any of the data any more, that they
deleted it and weren’t using it. And then later we
followed up with, I believe, a full legal contract
where they certified that they had deleted the data. WHITEHOUSE: In a legal contract? ZUCKERBERG: Yes, I believe so. WHITEHOUSE: Okay. And then you ultimately said
that you have banned Cambridge Analytica. Who exactly is banned? What if they
opened up Princeton, Rhode Island Analytica? Different corporate
form, same enterprise. Would that
enterprise also be banned? ZUCKERBERG: Senator, that
is certainly the intent. Cambridge Analytica actually has
a parent company and we banned the parent company. And recently we also
banned a firm called AIQ, which I think is also
associated with them. And if we find other firms
that are associated with them, we will block them from
the platform as well. WHITEHOUSE: Are
individual principals – P-A-L-S, principals of the
firm also banned? ZUCKERBERG: Senator, my
understanding is we’re blocking them from doing
business on the platform, but I do not believe that
we’re blocking people’s personal accounts. WHITEHOUSE: Okay. Can any customer amend
your terms of service? Or is the terms of service a
take it or leave it proposition for the average customer? ZUCKERBERG: Senator, I think the
terms of service are what they are. But the service is
really defined by people. Because you get to choose
what information you share, and the whole service is about
what friends you connect to, which people you
choose to connect to … WHITEHOUSE: Yes, I guess my
question would relate to – Senator Graham held up
that big, fat document. It’s easy to put a lot of things
buried in a document that then later turn out to
be of consequence. And all I wanted to establish
with you is that that document that Senator Graham held up,
that is not a negotiable thing with individual customers;
that is a take it or leave it proposition for your
customers to sign up to, or not use the service. ZUCKERBERG: Senator, that’s
right on the terms of the service, although we offer a
lot of controls so people can configure the
experience how they want. WHITEHOUSE: So, last question,
on a different subject having to do with the authorization
process that you are undertaking for entities that are putting up
political content or so-called issue-ad content. You said that they all have
to go through an authorization process before they do it. You said here we will be
verifying the identity. How do you look behind a
shell corporation and find who’s really behind it through
your authorization process? Well, step back. Do you need to look behind shell
corporations in order to find out who is really behind the
content that’s being posted? And if you may need to look
behind a shell corporation, how will you go
about doing that? How will you get
back to the true, what lawyers would call,
beneficial owner of the site that is putting out
the political material? ZUCKERBERG: Senator, are –
are you referring to the verification of
political and issue ads? WHITEHOUSE: Yes,
and before that, political ads, yes. ZUCKERBERG: Yes. So what we’re going to do is
require a valid government identity and we’re going
to verify the location. So we’re going to do that so
that way someone sitting in Russia, for example, couldn’t
say that they’re in America and, therefore, able to
run an election ad. WHITEHOUSE: But if they were
running through a corporation domiciled in Delaware, you
wouldn’t know that they were actually a Russian owner. ZUCKERBERG: Senator,
that’s – that’s correct. WHITEHOUSE: Okay. Thank you, my time has expired
and I appreciate the courtesy of the chair for the extra seconds. Thank you, Mr. Zuckerberg. GRASSLEY: Senator Lee. SEN. MIKE LEE (R-UTAH):
Thank you, Mr. Chairman. Mr. Zuckerberg, I wanted to
follow up on a statement you made shortly before the
break just a few minutes ago. You said that there are
some categories of speech, some types of content that
Facebook would never want to have any part of and
takes active steps to avoid disseminating,
including hate speech, nudity, racist speech, I – I – I
assume you also meant terrorist acts, threats of
physical violence, things like that. Beyond that, would you agree
that Facebook ought not be putting its thumb on the scale
with regard to the content of speech, assuming it fits out of
one of those categories that – that’s prohibited? ZUCKERBERG: Senator, yes. There are generally two
categories of content that – that we’re very worried about. One are things that
could cause real world harm, so terrorism
certainly fits into that, self-harm fits into that,
I would consider election interference to fit into
that and those are the types of things that we – I – I don’t
really consider there to be much discussion around whether
those are good or bad topics. LEE: Sure, yes, and
I’m not disputing that. What I’m asking is, once you
get beyond those categories of things that are
prohibited, and should be, is it Facebook’s position that
it should not be putting its thumb on the scale; it should
not be favoring or disfavoring speech based on its content,
based on the viewpoint of that speech? ZUCKERBERG: Senator, in
general that’s our position. What we – one of the things that
is really important though is that in order to create a
service where everyone has a voice, we also need to make
sure that people aren’t bullied, or – or basically intimidated,
or the environment feels unsafe for them. LEE: Okay. So when you say in general,
that’s the – the exception that you’re referring to, the
exception being that if someone feels bullied, even if
it’s not a terrorist act, nudity, terrorist
threats, racist speech, or something like that
you might step in there. Beyond that, would you step in
and put your thumb on the scale as far as the viewpoint of
the content being posted? ZUCKERBERG: Senator, no. I mean, in general our – our
goal is to allow people to have as much expression as possible. LEE: Okay. So subject to the
exceptions we’ve discussed, you would stay out of that. Let me ask you this, isn’t
there a significant free market incentive that a
social media company, including yours, has, in order
to safeguard the data of your users? Don’t you have free market
incentives in that respect? ZUCKERBERG: Yes, senator. Yes. LEE: Does – don’t your interests
align with – with those of us here who want to
see data safeguarded? ZUCKERBERG: Absolutely. LEE: Do you have the
technological means available, at your disposal, to make sure
that that doesn’t happen and to – to protect, say, an app
developer from transferring Facebook data to a third party? ZUCKERBERG:
Senator, a lot of that, we do. And some of that happens outside
of our systems and will require new measures. And so, for example, what we saw
here was people chose to share information with
an app developer. That worked according to
how the system was designed. That information was then
transferred out of our system to servers that this
developer, Aleksandr Kogan, had. And then that person chose
to then go sell the data to Cambridge Analytica. That is going to require much
more active intervention and auditing from us to
prevent, going forward, because once it’s out of our
system it is a lot harder for us to have a full understanding
of what’s happening. LEE: From what
you’ve said today, and from previous statements
made by you and other officials at your company, data is at the
center of your business model. It’s how you make money. Your ability to run
your business effectively, given that you don’t
charge your users, is based on monetizing data. And so the real
issue, it seems to me, really comes down to
what you tell the public, what you tell users of Facebook,
about what you’re going to do with the data. About how you’re
going to use it. Can you – can you give
me a couple of examples, maybe two examples, of ways
in which data is collected by Facebook, in a way that
people are not aware of? Two examples of types of data
that Facebook collects that might be surprising
to Facebook users? ZUCKERBERG: Well, senator, I
would hope that what we do with data is not
surprising to people. LEE: And has it been at times? ZUCKERBERG: Well,
senator, I think in this case, people certainly didn’t expect
this developer to sell the data to Cambridge Analytica. In general, there are two
types of data that Facebook has. The vast majority – and
then the first category, is content that people chose to
share on the service themselves. So that’s all the
photos that you share, the posts that you make, what
you think of as the Facebook service, right? That’s – everyone has control
every single time that they go to share that. They can delete that
data any time they want; full control, the
majority of the data. The second category is around
specific data that we collect in order to make the
advertising experiences better, and more relevant, and
work for businesses. And those often
revolve around measuring, okay, if you – if
we showed you an ad, then you click through
and you go somewhere else, we can measure that you actually
– that the – that the ad worked. That helps make the experience
more relevant and better for – for people, who are
getting more relevant ads, and better for the businesses
because they perform better. You also have control completely
of that second type of data. You can turn off the ability for
Facebook to collect that – your ads will get worse, so a lot of
people don’t want to do that. But you have complete control
over what you do there as well. GRASSLEY: Senator Schatz? SEN. BRIAN SCHATZ
(D-HAWAII): Thank you, Mr. Chairman. I want to follow up on the
questions around the terms of service. Your terms of service are
about 3,200 words with 30 links. One of the links is
to your data policy, which is about 2,700
words with 22 links. And I think the point has been
well made that people really have no earthly idea of
what they’re signing up for. And I understand that,
at the present time, that’s legally binding. But I’m wondering if you can
explain to the billions of users, in plain language,
what are they signing up for? ZUCKERBERG: Senator, that’s a
good and important question here. In general, you know, you
sign up for the Facebook, you get the ability to share the
information that you want with – with people. That’s what the
service is, right? It’s that you can connect
with the people that you want, and you can share
whatever content matters to you, whether that’s
photos or links or posts, and you get control over it. SCHATZ: Who do
you share it with? ZUCKERBERG: And you can
take it down if you want, and you don’t need to put
anything up in the first place if you don’t want. SCHATZ: What about the part that
people are worried about, not the fun part? ZUCKERBERG: Well, what’s that? SCHATZ: The – the part that
people are worried about is that the data is going to
be improperly used. So people are trying to figure
out are your D.M.s informing the ads? Are your browsing
habits being collected? Everybody kind of understands
that when you click like on something or if you say you like
a certain movie or have a – a particular political proclivity,
that – I think that’s fair game; everybody understands that. What we don’t
understand exactly, because both as a matter of
practice and as a matter of not being able to decipher those
terms of service and the privacy policy is what exactly are you
doing with the data and do you draw a distinction between
data collected in the process of utilizing the platform, and that
which we clearly volunteer to the public to present ourselves
to other Facebook users? ZUCKERBERG: Senator, I’m not
sure I – I fully understand this. In – in general, you – your –
you – people come to Facebook to share content with other people. We use that in order to also
inform how we rank services like news feed and ads to provide
more relevant experiences. SCHATZ: Let me – let me try a
couple of specific examples. If I’m email – if I’m mailing
– emailing within WhatsApp, does that ever
inform your advertisers? ZUCKERBERG: No, we don’t see
any of the content in WhatsApp, it’s fully encrypted. SCHATZ: Right, but – but is
there some algorithm that spits out some information to your ad
platform and then let’s say I’m emailing about Black
Panther within WhatsApp, do I get a WhatsApp – do I
get a Black Panther banner ad? ZUCKERBERG: Senator, we don’t –
Facebook systems do not see the content of messages being
transferred over WhatsApp. SCHATZ: Yes, I know, but that’s
– that’s not what I’m asking. I’m asking about whether
these systems talk to each other without a human
being touching it. ZUCKERBERG: Senator, I think the
answer to your specific question is, if you message someone
about Black Panther in WhatsApp, it would not inform any ads. SCHATZ: Okay, I want to follow
up on Senator Nelson’s original question which is the question
of ownership of the data. And I understand as the
sort of matter of principle, you were saying, you know, we
want our customers to have more rather than less
control over the data. But I can’t imagine that it’s
true as a legal matter that I actually own my Facebook
data, because you’re the one monetizing it. Do you want to modify that
to sort of express that as a statement of principle, a
sort of aspirational goal, but it doesn’t seem to me
that we own our own data, otherwise we’d be getting a cut. ZUCKERBERG: Well, senator, you
own it in the sense that you chose to put it there, you
could take it down anytime, and you completely control the
terms under which it’s used. When you put it on Facebook, you
are granting us a license to be able to show it to other people. I mean, that’s necessary
in order for the service to operate. SCHATZ: Right, but the – so the
– the – so your definition of ownership is I sign up, I’ve
voluntarily – and I may delete my account if I wish,
but that’s basically it. ZUCKERBERG: Well, senator, I – I
think that the control is much more granular than that. You can chose each photo
that you want to put up or each message, and you
can delete those. And you don’t need to
delete your whole account, you have specific control. You can share different
posts with different people. SCHATZ: In the time I have left,
I want to – I want to propose something to you and
take it for the record. I read an interesting article
this week by Professor Jack Balkin at Yale that proposes
a concept of an information fiduciary. People think of fiduciaries as
responsible primarily in the economic sense, but this
is really about a trust relationship like
doctors and lawyers, tech companies should hold
in trust our personal data. Are you open to the idea of an
information fiduciary enshrined in statute? ZUCKERBERG: Senator, I think
it’s certainly an interesting idea, and Jack is very
thoughtful in this space, so I do think it
deserves consideration. SCHATZ: Thank you. THUNE: Senator Fischer? FISCHER: Thank
you, Mr. Chairman. FISCHER: Thank
you, Mr. Zuckerberg, for being here today. I appreciate your testimony. The full scope of Facebook
user’s activity can print a very personal picture I think. And additionally, you have those
2 billion users that are out there every month. And so we all know that’s larger
than the population of most countries. So how many data
categories do you store, does Facebook store, on the
categories that you collect? ZUCKERBERG: Senator, can you
clarify what you mean by data categories? FISCHER: Well, there’s – there’s
some past reports that have been out there that indicate that it
– that Facebook collects about 96 data categories for
those 2 billion active users. That’s 192 billion data
points that are being generated, I think, at any time
from consumers globally. So how many do – does
Facebook store out of that? Do you store any? ZUCKERBERG: Senator, I’m not
actually sure what that is referring to. FISCHER: On – on the points
that you collect information, if we call those categories, how
many do you store of information that you are collecting? ZUCKERBERG: Senator, the way I
think about this is there are two broad categories. This probably doesn’t line up
with whatever the – the specific report that you were seeing is. And I can make sure that we
follow-up with you afterwards to get you the
information you need on that. The two broad categories that I
think about are content that a person has chosen to share and
that they have complete control over, they get to control
when they put into the service, when they take it
down, who sees it. And then the other category
are data that are connected to making the ads relevant. You have complete
control over both. If you turn off the
data related to ads, you can choose not to share any
content or control exactly who sees it or take down the
content in the former category. FISCHER: And does
Facebook store any of that? ZUCKERBERG: Yes. FISCHER: How much do
you store of that? All of it? All of it? Everything we click on, is
that in storage somewhere? ZUCKERBERG: Senator, we store
data about what people share on the service and information
that’s required to do ranking better, to show you what
you care about in news feed. FISCHER: Do you – do
you store text history, user content,
activity, device location? ZUCKERBERG: Senator, some of
that content with people’s permission, we do store. FISCHER: Do you
disclose any of that? ZUCKERBERG: Yes, it – Senator,
in order to – for people to share that
information with Facebook, I believe that almost everything
that you just said would be opt in. FISCHER: And the
privacy settings, it’s my understanding that they
limit the sharing of that data with other Facebook
users, is that correct? ZUCKERBERG: Senator, yes. Every person gets to control
who gets to see their content. FISCHER: And does that also
limit the ability for Facebook to collect and use it? ZUCKERBERG: Senator, yes. There are other – there are
controls that determine what Facebook can do as well. So for example, people have a
control about face recognition. If people don’t want us to be
able to help identify when they are in photos that
their friends upload, then they can turn that off. FISCHER: Right. ZUCKERBERG: And then we won’t
store that kind of template for them. FISCHER: And – and there was
some action taken by the FTC in 2011. And you wrote a Facebook post at
the time that being on a public page on the Internet used
to seem scary to people, but as long as they
could make the page private, they felt safe sharing
with their friends online; control was key. And you just mentioned control. Senator Hatch asked you a
question and you responded there about complete control. So you and your company have
used that term repeatedly, and I believe you use
it to reassure users, is that correct? That you do have control and
complete control over this information? ZUCKERBERG: Well, senator,
this is how the service works. I mean, the core
thing that Facebook is, and all of our
services, WhatsApp, Instagram, Messenger. FISCHER: So is this – is then a
question of Facebook is about feeling safe, or are
users actually safe? Is Facebook – is
Facebook being safe? ZUCKERBERG: Senator, I
think Facebook is safe. I use it, my family uses it, and
all the people I love and care about use it all the time. These controls are not just
to make people feel safe; it’s actually what
people want in the product. The reality is, is that when you
– just think about how you use this yourself. You don’t want to share
it – if you take a photo, you’re not always going to
send that to the same people. Sometimes you’re going to
want to text it to one person. Sometimes you
might send it to a group. I bet you have a page. You’ll probably want to put some
stuff out there publicly so you can communicate with
your constituents. There are all these different
groups of people that someone might want to connect with,
and those controls are very important in practice for
the operation of the service. Not just to build trust,
although I think that the providing people with
control, also does that, but actually in order to make it
so that people can fulfill their goals of the service. GRASSLEY: Senator Coons. FISCHER: Thank you. SEN. CHRISTOPHER A. COONS (D-DEL): Thank
you, Chairman Grassley. Thank you, Mr. Zuckerberg,
for joining us today. I think the whole reason we’re
having this hearing is because of a tension between two basic
principles you have laid out. First you’ve said about the data
that users post on Facebook: “You control and own the
data that you put on Facebook.” You said some very positive,
optimistic things about privacy and data ownership. But it’s also the reality that
Facebook is a for-profit entity that generated $40 billion in ad
revenue last year by targeting ads. In fact, Facebook claims that
advertising makes it easy to find the right people, capture
their attention and get results and you recognize that an
ad-supported service is, as you said earlier today, best
aligned with your mission and values. But the reality is, there’s
a lot of examples where ad targeting has led to results
that I think we would all disagree with or
dislike or would concern us. You’ve already admitted that
Facebook’s own ad tools allow Russians to target users, voters
based on racist or anti-Muslim or anti-immigrant views, and
that that may have played a significant role in the
election here in the United States. Just today, Time magazine posted
a story saying that wildlife traffickers are continuing to
use Facebook tools to advertise illegal sales of
protected animal parts, and I am left questioning
whether your ad-targeting tools would allow other
concerning practices like diet pill manufacturers targeting
teenagers who are struggling with their weight, or allowing
a liquor distributor to target alcoholics or a gambling
organization to target those with gambling problems. I’ll give you one concrete
example I’m sure you are familiar with: ProPublica
back in 2016 highlighted that Facebook lets advertisers
exclude users by race in real estate advertising. There was a way that you could
say that this particular ad, I only want to be
seen by white folks, not by people of color, and that
clearly violates fair-housing laws and our basic sense of
fairness in the United States. And you promptly announced
that that was a bad idea, you were going to
change the tools, and that you would build a
new system to spot and reject discriminatory ads that violate
our commitment to fair housing. COONS: And yet a year later, a
follow-up story by ProPublica said that those changes
hadn’t fully been made; it was still possible to target
housing advertisement in a way that was racially
discriminatory. And my concern is that this
practice of making bold and – and engaging promises
about changes and practices, and then the reality of how
Facebook has operated in the real world, are in
persistent tension. Several different senators have
asked earlier today about the 2011 FTC consent decree that
required Facebook to better protect users’ privacy. And there are a whole series of
examples where there have been things brought to
your attention, where Facebook has apologized
and has said we’re going to change our
practices and our policies. And yet, there doesn’t seem to
have been as much follow up as would be called for. At the end of the day, policies
aren’t worth the paper they’re written on if Facebook
doesn’t enforce them. And I’ll close with a question
that’s really rooted in an experience I had today,
as an avid Facebook user. I woke up this morning and was
notified by a whole group of friends across the country,
asking if I had a new family, or if there was a fake
Facebook post of Chris Coons? I went to the one
they suggested. It had a different
middle initial than mine. And there’s my picture with
Senator Dan Sullivan’s family; same schools I went to, but a
whole lot of Russian friends. Dan Sullivan’s got a
very attractive family, by the way. SULLIVAN: Keep that
for the record there, Mr. Chairman. (LAUGHTER) COONS: The friends
who brought this to my attention included people I went to law
school with in Hawaii and our own attorney general in
the state of Delaware. And fortunately
I’ve got, you know, great folks who
work in my office. I brought it to their attention. They pushed Facebook and it
was taken down by midday. But I’m left worried about what
happens to Delawareans who don’t have these resources. It’s still possible to find
Russian trolls operating on the platform. Hate groups thrive in
some areas of Facebook, even though your
policies prohibit hate speech, and you’ve taken strong
steps against extremism and terrorists. But is a Delawarean who’s not in
the Senate going to get the same sort of quick response? I’ve already gotten input from
other friends who say they’ve had trouble getting a positive
response when they’ve brought to Facebook’s
attention a page that’s, frankly, clearly
violating your basic principles. My core question is, isn’t it
Facebook’s job to better protect its users? And why do you shift the burden
to users to flag inappropriate content and make
sure it’s taken down? ZUCKERBERG: Senator, there are
a number of important points in there. And I think it’s clear
that this is an area, content policy enforcement, that
we need to do a lot better on over time. The history of how we got here
is we started off in my dorm room with not a lot of resources
and not having the A.I. technology to be able to
proactively identify a lot of this stuff. So just because of the
sheer volume of content, the main way that this works
today is that people report things to us and then we
have our team review that. And as I said before,
by the end of this year, we’re going to have more than
20,000 people at the company working on security
and content review, because this is important. Over time, we’re going to shift
increasingly to a method where more of this content is
flagged up front by A.I. tools that we develop. We’ve prioritized the most
important types of content that we can build A.I. tools for today, like
terror related content, where I mentioned earlier
that the systems we deploy take down 99 percent of the ISIS and Al Qaida-related content that we remove before
a person even flags it to us. If we fast
forward 5 or 10 years, I think we’re going
to have more A.I. technology that can
do that in more areas. And I think we need to get
there as soon as possible, which is why we’re
investing in it. GRASSLEY: Senator Sasse. COONS: I couldn’t agree more. I just think we can’t
wait five years … GRASSLEY: Senator … COONS: … to get housing discrimination
and personally offensive material out of Facebook. ZUCKERBERG: I agree. GRASSLEY: Senator Sasse? SASSE: Thank you, Mr. Chairman. Mr. Zuckerberg,
thanks for being here. At current pace, you’re due to
be done with the first round of questioning by about 1:00
a.m., so congratulations. I – I like Chris Coons a
lot, with his own family, or with Dan Sullivan’s family. Both are great photos. But I want to ask a similar
set of questions from the other side, maybe. I think the line – the
conceptual line between a mere tech
company, mere tools, and an actual content company,
I think it’s really hard. I think you guys
have a hard challenge. I think regulation over time
will have a hard challenge. And you’re a private company so
you can make policies that may be less than First Amendment
full spirit embracing in my view. But I worry about that. I worry about a world where when
you go from violent groups to hate speech in a hurry – and one
of your responses to the opening questions, you may
decide, or Facebook may decide, it needs to police a
whole bunch of speech, that I think America might be
better off not having policed by one company that has a really
big and powerful platform. Can you define hate speech? ZUCKERBERG: Senator, I think
that this is a really hard question. And I think it’s one of the
reasons why we struggle with it. There are certain definitions
that – that we – that we have around, you know,
calling for violence or … SASSE: Let’s just agree on that. ZUCKERBERG: Yes. SASSE: If somebody’s
calling for violence, we – that shouldn’t be there. I’m worried about the
psychological categories around speech. You used language of
safety and protection earlier. We see this happening on college
campuses all across the country. It’s dangerous. Forty percent of Americans under
age 35 tell pollsters they think the First Amendment is dangerous
because you might use your freedom to say something that
hurts somebody else’s feelings. Guess what? There are some really
passionately held views about the abortion issue
on this panel today. Can you imagine a world where
you might decide that pro-lifers are prohibited from speaking
about their abortion views on your content – on your platform? ZUCKERBERG: I certainly would
not want that to be the case. SASSE: But it might really be
unsettling to people who’ve had an abortion to have an
open debate about that, wouldn’t it? ZUCKERBERG: It might be, but I
don’t think that that would – would fit any of the definitions
of – of what we have. But I do generally agree with
the point that you’re making, which is as we – as we’re able
to technologically shift towards especially having A.I. proactively look at content,
I think that that’s going to create massive questions for
society about what obligations we want to require
companies to – to fulfill. And I do think that that’s
a question that we need to struggle with as a country,
because I know other countries are, and they’re
putting laws in place. And I think that America needs
to figure out and create the set of principles that we want
American companies to operate under. SASSE: Thanks. I wouldn’t want you to leave
here today and think there’s sort of a unified view in the
Congress that you should be moving toward policing more
and more and more speech. I think violence has no
place on your platform. Sex traffickers and human
traffickers have no place on your platform. But vigorous debates? Adults need to engage
in vigorous debates. I have only a little
less than two minutes left, so I’m going to shift
gears a little bit. But that was about adults. You’re a dad. I’d like to talk a little bit
about social media addiction. You started your comments today
by talking about how Facebook is and was founded as an
optimistic company. You and I have had conversations
separate from here. I don’t want to put
words in your mouth, but I think as you’ve aged
you might be a little bit less idealistic and optimistic than
you were when you – when you started Facebook. As a dad, do you worry about
social media addiction as a problem for America’s teens? ZUCKERBERG: Well my hope is –
is that we can be idealistic but have a broad view of
our responsibility. To your – your
point about teens, this is certainly something that
I think any parent thinks about, is how much do you want
your kids using technology. It – at Facebook, specifically,
I view our responsibility as not just building
services that people like, but building services that are
good for people and good for society as well. So we study a lot of effects of
well being of our – of our tools and broader technology. And you know, like any tool,
there are good and – and bad uses of it. What we find in general is that
if you’re using social media in order to build
relationships, right? So you’re – you’re
sharing content with friends, you’re interacting, then that
is associated with all of the long-term measures of well-being
that you’d intuitively think of. Long-term health,
long-term happiness, long-term feeling
connected, feeling less lonely. But if you’re using the Internet
and social media primarily to just passively consume content,
and you’re not engaging with other people, then it doesn’t
have those positive effects and it could be negative. SASSE: We’re –
we’re almost at time, so I want to – I want
to ask you one more. Do social media companies hire
consulting firms to help them figure out how to get more
dopamine feedback loops so that people don’t want to
leave the platform? ZUCKERBERG: No, Senator. That’s not how we
talk about this, or – or how we set
up our product teams. We want our products to
be valuable to people. And if they’re valuable,
then people choose to use them. SASSE: Are you aware of other
social media companies that do hire such consultants? ZUCKERBERG: Not
sitting here today. SASSE: Thanks.
GRASSLEY: Senator Markey? MARKEY: Thank you, Mr. Chairman. In response to Senator
Blumenthal’s pointed questions, you refused to answer whether
Facebook should be required by law to obtain clear permission
from users before selling or sharing their
personal information. So I’m going to ask
it one more time. Yes or no. Should Facebook get clear
permission from users before selling or sharing sensitive
information about your health, your finances,
your relationships? Should you have to
get their permission? That’s, essentially, the consent
decree with the Federal Trade Commission that
you signed in 2011. Should you have
to get permission? Should the
consumer have to opt in? ZUCKERBERG: Senator, we do
require permission to use the – the system, and to – to
put information in there, and for – for all
the uses of it. I want to be clear. We don’t sell information. So regardless of whether we
could get permission to do that, that’s just not a thing
that we’re going to go do. MARKEY: So would you
support legislation? I have a bill, Senator
Blumenthal referred to it, The Consent Act, that would just
put on the books a law that said that Facebook, and any other
company that gathers information about Americans, has to
get their permission, their affirmative permission,
before it can be reused for other purposes. Would you support that
legislation to make it a national standard
for not just Facebook, but for all the other
companies out there? Some of them, bad actors. Would you support
that legislation? ZUCKERBERG: Senator,
I – I – in general, I think that that
principle is exactly right. And I think we should have a –
a discussion around how to best apply that. MARKEY: No, would you support
legislation to back that general principle, that opt-in,
that getting permission is the standard. Would you support legislation to
make that the American standard? Europeans have
passed that as a law. Facebook’s going to live with
that law beginning on May 25th. Would you support that as
the law in the United States? ZUCKERBERG:
Senator, as a principle, yes, I would. I think the
details matter a lot, and now that … MARKEY: Right. But assuming that we
work out the details, you do support
opt-in as the standard? Getting permission affirmatively
as the standard for the United States? Is that correct? ZUCKERBERG: Senator, I think
that that’s the right principle. And a hundred billion
times a day in our services, when people go to share content,
they choose who they want to share it with affirmatively. MARKEY: So you – you – you could
support a law that enshrines that as the promise that we
make to the American people, that permission has to
be obtained before their information is used. Is that correct? ZUCKERBERG: Senator, yes. I said that in principle I
think that that makes sense, and the details matter and I
look forward to having our team work with you on
fleshing that out. MARKEY: Right. So the next subject,
because I want to, again I want to make sure
that we kind of drill down here. You earlier made reference
to the Child Online Privacy Protection Act of 1999,
which I am the author of. So that is the constitution for
child privacy protection online in the country, and
I’m very proud of that. But, there are no protections
additionally for a 13, a 14, or a 15-year-old. They get the same protections
that a 30-year-old or a 50-year-old get. So I have a separate piece of
legislation to ensure that kids who are under 16 absolutely
have a privacy bill of rights, and that permission has to be
received from their parents for their children before any of
their information is reused for any other purpose other
than that which was originally intended. Would you support a child online
privacy bill of rights for kids under 16 to guarantee that that
information is not reused for any other purpose without
explicit permission from the parents for the kids? ZUCKERBERG: Senator, I
think, as a general principle, I think protecting minors and
protecting their privacy is extremely important, and we do a
number of things on Facebook to do that already,
which I am happy to … MARKEY: I appreciate that. I’m talking about a law. I’m talking about a law. Would you support a law to
ensure that kids under 16 have this privacy bill of rights? I had this conversation with you
in your office in Palo Alto seven years ago, on this
specific subject. And I think that’s really what
the American people want to know right now: What are the
protections here? What are the protections that
are going to be put on the books for their families, but
especially for their children? Would you support a privacy bill
of rights for kids where opt in is the standard? Yes or no? ZUCKERBERG: Senator, I
think that that’s an important principle and … MARKEY: I appreciate that. ZUCKERBERG: … and I think we should … MARKEY: But we need a law
to protect those children. That’s my question to you. Do you think we
need a law to do so? Yes or no? ZUCKERBERG: Senator, I’m
not sure if we need a law, but I think that this is
certainly a thing that deserves a lot of discussion. MARKEY: And again, I
couldn’t disagree with you more. We’re leaving these children to
the most rapacious commercial predators in the country, who will
exploit these children unless we absolutely have a
law on the books. And I think it’s … GRASSLEY: Please give a short
– please give a short answer. ZUCKERBERG: Senator, I look
forward to having my team follow up with you to flesh
out the details of it. GRASSLEY: Senator Flake? Senator Flake? (CROSSTALK) MARKEY: … issued to get a
correct answer to that. FLAKE: Thank you, Mr. Chairman. Thank you, Mr. Zuckerberg. Thanks for enduring so far, and
I’m sorry if I plow old ground; I had to be away for a bit. I, myself, and Senator
Coons, Senator Peters, and a few others were in the
country of Zimbabwe just a few days ago. We met with opposition figures
who had talked about you know their goal is to be able to
have access to state-run media. FLAKE: In many
African countries, many countries around the
world, third-world countries, small countries, the only
traditional media is state run, and we ask them how they
get their message out, and it’s through social media. Facebook provides a very
valuable service in many countries for opposition leaders
or others who simply don’t have access, unless maybe
just before an election, to traditional media. So that’s very valuable, and I
think we all recognize that. On the flip side,
we’ve seen with Rohingya, that example of, you know, where
the state could use similar data or use this platform
to go after people. You talked about what
you’re doing in that regard, hiring more, you
know, traditional – or, local-language speakers. What else are you doing in
that regard to ensure that these states don’t – or, these
governments go after opposition figures or others? ZUCKERBERG: Senator, there are
three main things that we’re doing, in Myanmar specifically,
and that will apply to – to other situations like that. The first is hiring enough
people to do local language support, because the definition
of hate speech or things that can be racially coded to
incite violence are very language-specific and we
can’t do that with just English speakers for people
around the world. So we need to grow that. The second is, in these
countries there tend to be active civil society, who can
help us identify the figures who are – who are spreading hate. And we can work with them in
order to make sure that those figures don’t have a
place on our platform. The third is that there are
specific product changes that we can make in order to – that –
that might be necessary in some countries but not others,
including things around news literacy – right. And, like, encouraging people in
– in different countries about, you know, ramping up or down. You know, things that we
might do around fact-checking of content, specific product-type
things that we would implement in different places. But I think that that’s
something that we’re going to have to do in a
number of countries. FLAKE: There are
obviously limits, you know, native speakers that
you can hire or people that have eyes on the page. Artificial intelligence is going
to have to take the bulk of this. How – how much are you investing
in working on – on that tool to – to do what, really, we don’t
have or can’t hire enough people to do? ZUCKERBERG: Senator, I think
you’re absolutely right that over the long
term, building A.I. tools is going to be the
scalable way to identify and root out most of
this harmful content. We’re investing a
lot in doing that, as well as scaling up the number
of people who are doing content review. One of the things that I’ve
mentioned is this year we’re – or, in the last year, we’ve
basically doubled the number of people doing security
and content review. We’re going to have more
than 20,000 people working on security and content
review by the end of this year. So it’s going to be coupling
continuing to grow the people who are doing review in
these places with building A.I. tools, which is – we’re – we’re
working as quickly as we can on that, but some of
this stuff is just hard. That, I think, is going to
help us get to a better place on eliminating more of
this harmful content. FLAKE: Thank you. You’ve talked some
about this, I know, do you believe that Russian
and/or Chinese governments have harvested Facebook data and have
detailed data sets on Facebook users? Has your forensic
analysis shown you who else, other than Cambridge Analytica,
downloaded this kind of data? ZUCKERBERG: Senator, we have
kicked-off an investigation of every app that had access to a
large amount of people’s data before we locked down
the platform in 2014. That’s underway, I
imagine we’ll find some things, and we are committed to telling
the people who were affected when we do. I don’t think,
sitting here today, that we have specific knowledge
of – of other efforts by – by those nation-states. But, in general, we assume that
a number of countries are trying to abuse our systems. FLAKE: Thank you. Thank you, Mr. Chairman. GRASSLEY: (Inaudible)
person is Senator Hirono. HIRONO: Thank you, Mr. Chairman. Mr. Zuckerberg, the U.S. Immigration and Customs
Enforcement has proposed a new extreme vetting initiative, which
they have renamed Visa Lifecycle Vetting, which
sounds less scary. They have already held an
industry day that they advertised on the federal contracting website
to get input from tech companies on the best way to,
among other things, and I’m quoting ICE, “exploit
publicly available information, such as media,
blogs, public hearings, conferences, academic websites,
social media websites such as Facebook, Twitter and LinkedIn
to extract pertinent information regarding targets.” And
basically what they – what they want to do with these
targets is to determine, and again, I’m
quoting ICE’s own document, they want – ICE has been
directed to develop processes that determine and
evaluate an applicant’s, i.e., target’s, probability of becoming
a positively contributing member of society as well as
their ability to contribute to national interest in order
to meet the executive order. That is the
president’s executive order. And then ICE must also develop
a mechanism or methodology that allows them to assess whether
an applicant intends to commit criminal or terrorist acts
after entering the United States. Question to you is, does
Facebook plan to cooperate with this extreme vetting
initiative, and help the Trump administration target people
for deportation or other ICE enforcement? ZUCKERBERG: Senator, I don’t
know that we’ve had specific conversations around that. In general … HIRONO: If you were asked to
provide or cooperate with ICE so that they could determine
whether somebody is going to commit a crime, for example, or
become fruitful members of our society, would you cooperate? ZUCKERBERG: We would
not proactively do that. We cooperate with law
enforcement in two cases. One is if we become aware of
an imminent threat of harm, then we will proactively
reach out to law enforcement, as we believe is our
responsibility to do. The other is when law
enforcement reaches out to us with a valid legal subpoena
or – or request for data. In those cases, if their request
is overly broad or we believe it’s not a legal request,
then we’re going to push back aggressively. HIRONO: Well, let’s assume that
ICE doesn’t have a – a – there’s no law or rule that requires
that Facebook cooperate to allow them to get this kind of
information so that they can make those kinds of assessments,
it sounds to me as though you would decline? ZUCKERBERG:
Senator, that is correct. HIRONO: Is there
some way that – well, I know that you determine what
kind of content would be deemed harmful, so do you believe that
ICE can even do what they are talking about? Namely, through a combination
of various kinds of information including information that
they would hope to obtain from entities such as yours, predict who
would commit crimes or present a national security problem. Do you think that –
that that’s even doable? ZUCKERBERG: Senator, I’m
not familiar enough with what they’re doing to offer an
informed opinion on that. HIRONO: Well you have to
make assessments as to what constitutes hate speech. That’s pretty hard to do. You have to assess what
election interference is. So these are rather
difficult to identify, but wouldn’t the – try to
predict whether somebody’s going to commit a crime fit into the
category of pretty difficult to assess? ZUCKERBERG: Senator, it
sounds difficult to me. All of these things,
like you’re saying, are difficult. I don’t know without having
worked on it or thinking about it … (CROSSTALK) HIRONO: I think
common sense would tell us that that’s pretty difficult. And yet, that’s what
ICE is proceeding to do. You were asked about
discriminatory advertising, and in February of 2017,
Facebook announced that it would no longer allow certain kinds
of ads that discriminated on the basis of race,
gender, family status, sexual orientation,
disability, or veteran status, all categories prohibited
by federal law and housing. And yet, after 2017, it was
discovered that you could in fact place those kinds of ads. So what is the status of whether
or not these ads can currently be placed on Facebook? And have you followed through
on your February 2017 promise to address this problem? And is there a way for the
public to verify that you have, or are – are we just expected
to trust that you’ve done this? ZUCKERBERG: Senator, those
– those are all important questions, and in general it
is against our policies to – to have any ads that
are discriminatory. Some of … HIRONO: Well, you said
that you wouldn’t allow it, but then – was it ProPublica –
could place these ads even after you said you would no
longer allow these kinds of ads. So what assurance do we have
from you that this is stop – going to stop? ZUCKERBERG: Well, two things. One is that we’ve removed the
ability to exclude ethnic groups and other sensitive
categories from ad targeting. So that just isn’t a feature
that’s even available anymore. For some of these cases, where
it may make sense to target proactively a group, the
enforcement today is – is still – we review ads, we
screen them up front, but most of the enforcement
today is still that our community flags issues
for us when they come up. So if the community
flags that issue for us, then our team, which has
thousands of people working on it, should take it down. We’ll make some mistakes, but we
try to make as few as possible. Over time, I think the strategy
would be to develop more A.I. tools that can more proactively
identify those types of content and do that filtering up front. (CROSSTALK) HIRONO: So
it’s a work in progress. ZUCKERBERG: Yes. THUNE: Thank you. Thank you, Senator Hirono. Senator Sullivan’s up next. HIRONO: Thank you. SULLIVAN: Thank
you, Mr. Chairman. And Mr.
Zuckerberg, quite a story, right? Dorm room to the global
behemoth that you guys are. Only in America, would
you agree with that? ZUCKERBERG: Senator,
mostly in America. SULLIVAN: You couldn’t – you
couldn’t do this in China, right? Or, what you did in 10 years. ZUCKERBERG: Well
– well, senator, there are – there are some
very strong Chinese Internet companies. SULLIVAN: Right but – you’re
supposed to answer “yes” to this question. (LAUGHTER) Okay, come on,
I’m trying to help you, right? (CROSSTALK) THUNE: This
is – this is the softball. SULLIVAN: I mean,
give me a break. You’re in front of a bunch of
– the answer is “yes,” okay, so thank you. (LAUGHTER) Now, your – your
testimony – you have talked about a lot of power – you’ve
been involved in elections. I thought your – your
testimony was very interesting. All – really all over the world,
the Facebook – 2 billion users, over 200 million Americans,
40 billion in revenue. I believe you and Google have
almost 75 percent of the digital advertising in the U.S. Is – one of the key issues
here, is Facebook too powerful? Are you too powerful? And do you think
you’re too powerful? ZUCKERBERG: Well, senator, I
think most of the time when people talk about our scale,
they’re referencing that we have two billion people
in our community. And I think one of the big
questions that we need to think through here is the vast
majority of those 2 billion people are outside of the U.S. And I think that
that’s something that, to your point, that
Americans should be proud of. (CROSSTALK) ZUCKERBERG: And
when I brought up the Chinese Internet companies, I think that
that’s a real – a real strategic and competitive threat that, in
American technology policy we (inaudible) should
be thinking about. (CROSSTALK) SULLIVAN: Let me
ask you another point here real quick. I – I want to – I – I
don’t want to interrupt, but you know, when you look
at kind of the history of this country and you look at
the history of these kind of hearings, right. You’re a smart guy. You read a lot of history. When companies become big and
powerful and accumulate a lot of wealth and power, what typically
happens from this body is there’s an – there is an instinct
to either regulate or break up, right. Look at the
history of this nation. You have any thoughts on
those two policy approaches? ZUCKERBERG: Well, senator,
I’m not the type of person that thinks that all
regulation is bad. So I think the Internet is
becoming increasingly important in people’s lives, and I
think we need to have a full conversation about what
is the right regulation, not whether it should
be or shouldn’t be. SULLIVAN: Let me – let me
talk about the tension there, because I – I think it’s a
good point and I appreciate you mentioning that. You know, my – one of my
worries on regulation, again, with a
company of your size, you’re saying, hey, we might be
interested in being regulated. But as you know regulations can
also cement the dominant power. So what do I mean by that? You know, you have
a lot of lobbyists, I think every lobbyist in town
is involved in this hearing in some way or another, a
lot of powerful interests. You look at what
happened with Dodd-Frank. That was supposed to be
aimed at the big banks. The regulations ended up
empowering the big banks and keeping the small banks down. Do you think that that’s a
risk given your influence, that if we regulate, we’re
actually going to regulate into – you into a position of
cemented authority when one of my biggest concerns about what
you guys are doing is that the next Facebook –
which we all want, the guy in the dorm room. We all want that to start it
– that you are becoming so dominant that we’re not able
to have that next Facebook. What – what – what
are your views on that? ZUCKERBERG: Well, senator, I
agree with the point that when you’re thinking
through regulation, across all industries, you need
to be careful that it doesn’t cement in the current companies
that are – that are winning. SULLIVAN: But would
you try to do that? Isn’t that the normal
inclination of a company, to say, hey, I’m going to hire
the best guys in town and I’m going to cement in an advantage. You wouldn’t do that if
we were regulating you. ZUCKERBERG: Senator, that –
that certainly wouldn’t be our approach. But – but I think – I think part
of the challenge with regulation in general is that when you add
more rules that companies need to follow, that’s something
that a larger company like ours inherently just has
the resources to go do, and that might just be harder
for a smaller company getting started to be
able to comply with. SULLIVAN: Correct. ZUCKERBERG: So it’s not
something that – like going into this, I would look at the
conversation as what is the right outcome. I think there are real
challenges that we face around content and privacy and
in a number of areas, ads transparency, elections … SULLIVAN: Let me – let me
get – I’m sorry to interrupt, but let me get to
one final question. It kind of relates to what
you’re talking about in terms of content regulation and what
exactly – what exactly Facebook is. You know, you – you
mention you’re a tech company, a platform, but there’s some
who are saying that you’re the world’s biggest publisher. I think about 140 million
Americans get their news from Facebook, and when you talk to –
when you mentioned that Senator Cornyn – Cornyn, he – you said
you are responsible for your content. So which are you, are you a tech
company or are you the world’s largest publisher, because I
think that goes to a really important question on what
form of regulation or government action, if any, we would take. ZUCKERBERG: Senator, this is
a – a really big question. I – I view us as a tech company
because the primary thing that we do is build
technology and products. SULLIVAN: But you said you’re
responsible for your content, which makes … ZUCKERBERG: Exactly. SULLIVAN: … you kind of a publisher, right? ZUCKERBERG: Well, I agree
that we’re responsible for the content, but we don’t
produce the content. I – I think that when people ask
us if we’re a media company or a publisher, my understanding of
what – the heart of what they’re really getting at, is do we feel
responsibility for the content on our platform. The answer to that, I think, is
clearly “yes.” And – but I don’t think that that’s
incompatible with fundamentally, at our core, being a technology
company where the main thing that we do is have
engineers and build products. THUNE: Thank you,
Senator Sullivan. Senator Udall? UDALL: Thank you, Mr. Chairman. And thank you very
much, Mr. Zuckerberg, for being here today. You – you spoke very
idealistically about your company, and you talked
about the strong values, and you said you wanted to be a
positive force in the community and the world. And you were hijacked by
Cambridge Analytica for political purposes. Are you angry about that? ZUCKERBERG: Absolutely. UDALL: And – and you’re
determined – and I assume you want changes made in the law? That’s what you’ve
talked about today. ZUCKERBERG: Senator, the most
important thing that I care about right now is making sure
that no one interferes in the various 2018
elections around the world. We have an
extremely important U.S. midterm. We have major
elections in India, Brazil, Mexico,
Pakistan, Hungary coming up. And we’re going to take a
– a number of measures, from building and
deploying new A.I. tools that take down fake news,
to growing our security team to more than 20,000 people, to
making it so that we verify every advertiser who’s
doing political and issue ads, to make sure that that kind of
interference that the Russians were able to do in 2016 is going
to be much harder for anyone to pull off in the future. UDALL: And – and I think you’ve
said earlier that you support the Honest Ads Act, and so I
assume that means you want changes in the law in order to –
to effectuate exactly what you talked about? ZUCKERBERG: Senator, yes. UDALL: Yeah, yeah. ZUCKERBERG: We
support the Honest Ads Act. We’re implementing it. UDALL: And so are you going to
– are you going to come back up here and be a strong advocate,
to see that that law is passed? ZUCKERBERG: Senator, the biggest
thing that I think we can do is implement it. And we’re doing that. UDALL: That’s a kind
of yes-or-no question, there. I hate to interrupt you, but are
you going to come back and be a strong advocate? You’re angry about this. You think there
ought to be change. There ought to be a
law put in place. Are you going to come
back and be an advocate, to get a law in place like that? ZUCKERBERG: Senator, our team is
certainly going to work on this. What I can say is, the
biggest thing that … (CROSSTALK) UDALL:
I’m talking about you, not your team. ZUCKERBERG: Well,
Senator, I try … (CROSSTALK) UDALL: (inaudible)
come back here and be … ZUCKERBERG: … not to come to D.C. UDALL: … an advocate for that law? That’s what I want to see. I mean, you’re upset about this. We’re upset about this. I – I’d like a
yes-or-no answer on that one. ZUCKERBERG: Senator, I’m –
I’m posting and speaking out publicly about how
important this is. I don’t come to
Washington, D.C., too often. I’m going to direct my
team to focus on this. And the biggest thing that I
feel like we can do is implement it, which we’re doing. UDALL: Well, the biggest thing
you can do is to be a strong advocate yourself,
personally, here in Washington. Just let me make that clear. But many of us have seen the
kinds of images shown earlier by Senator Leahy. You saw those
images that he held up. Can you guarantee that any
of those images that can be attributed or associated
with the Russian company, Internet Research Agency, have
been purged from your platform? ZUCKERBERG: Senator, no,
I can’t guarantee that. Because this is an
ongoing arms race. As long as there are people
sitting in Russia whose job it is, is to try to interfere
with elections around the world, this is going to be
an ongoing conflict. What I can commit is that we’re
going to invest significantly, because this is a top priority,
to make sure that people aren’t spreading misinformation or
trying to interfere in elections on Facebook. But I don’t think it would
be a realistic expectation, to assume that as long as there
are people who are employed in Russia, for whom
this is their job, that we’re going to
have zero amount of that, or that we’re going to be 100
percent successful at preventing that. UDALL: Now, beyond
disclosure of online ads, what specific steps are you
taking to ensure that foreign money is not financing political
or issue ads on Facebook in violation of U.S. law? Just because someone submits a
disclosure that says paid for by some 501(c)(3) or PAC, if that
group has no real person in the U.S., how can we ensure it
is not foreign – foreign interference? ZUCKERBERG: Senator, our
verification program involves two pieces. One is verifying the identity of
the person who’s buying the ads, that they have a valid
government identity. The second is
verifying their location. So if you’re sitting
in Russia, for example, and you say that
you’re in the U.S., then we’ll be able to – to
make it a lot harder to do that, because what we’re actually
going to do is mail a code to the address that
you say you’re at. And if you can’t get
access to that code, then you’re not going
to be able to run ads. UDALL: Yes. Now, Facebook is creating an
independent group to study the abuse of social
media in elections. You’ve talked about that. Will you commit that all
findings of this group are made public no matter what they say
about Facebook or its business model? Yes or no answer. ZUCKERBERG: Senator, that’s
the purpose of this group, is that Facebook does not get
to control what these folks publish. These are going to be
independent academics, and Facebook has no
prior publishing control. They’ll be able to do the
studies that – that – that they’re doing and
publish the results. UDALL: And you’re fine
with them being public? And what’s the timing
on getting those out? ZUCKERBERG: Senator, we’re –
we’re kicking off the research now. Our goal is to focus on both
providing ideas for preventing interference in 2018 and
beyond, and also for holding us accountable to making sure
that the measures that we put in place are
successful in doing that. So I would hope that we will
start to see the first results later this year. UDALL: Thank you, Mr. Chairman. THUNE: Thank you, Senator Udall. Senator Moran is up next,
and I would just say again, for the benefit of
those who are here, that after a couple
of more questions, we’ll probably give the
witness another short break. ZUCKERBERG: Thank you. THUNE: So we’re – we’re getting
about almost two thirds through the – the list of members
who are here to ask questions. Senator Moran. MORAN: Mr. Chairman, thank you. Mr. Zuckerberg, thank you
for your – I’m over here. Thank you for your testimony and
thank you for your presence here today. On March the 26th of this year,
the FTC confirmed that it was investigating Facebook to
determine whether its privacy practices violated the FTC
Act or the consent order that Facebook entered into
with the agency in 2011. I chair the Commerce committee
– subcommittee that has jurisdiction over the
Federal Trade Commission. I remain interested in
Facebook’s assertion that it rejects any suggestion of
violating that consent order. Part two of that consent
order requires that Facebook, quote, “clearly and prominently”
display notice and obtain users’ affirmative consent before
sharing their information with, quote, “any third party.” My question is how does the
case of approximately 87 million Facebook friends having their
data shared with a third party due to the consent of only
300,000 consenting users not violate that agreement? ZUCKERBERG: Well,
Senator, like I said earlier, I mean our view is that – is
that we believe that we are in compliance with
the consent order, but I think we have a broader
responsibility to protect people’s privacy
even beyond that. And in this specific case, the
way that the platform worked, that you could sign into an
app and bring some of your information and some of your
friends’ information is how we explained it would work. People had
settings to that effect. We explained and – and they
consented to – to it working that way. And the – the system basically
worked as it was designed. The issue is that we designed
the system in a way that wasn’t good. And now we – starting in 2014,
have changed the design of the system so that it just
massively restricts the amount of – of data access that
a developer could get. (CROSSTALK) MORAN:
The – I’m sorry, the 300,000 people, they were
treated in a way that – it was appropriate; they consented. But you’re not suggesting
that the friends consented? ZUCKERBERG: Senator, I believe
that – that we rolled out this developer platform, and that
we explained to people how it worked, and that
they did consent to it. It – it makes sense, I think, to – to
go through the way the platform works. I mean, it’s – in 2007, we – we
announced the Facebook developer platform, and the idea was
that you wanted to make more experiences social, right? So, for example, if you –
like, you might want to have a calendar that can have your
friends’ birthdays on it, or you might want your address
book to have your friends’ pictures in it, or you might
want a map that can show your friends’ addresses on it. In order to do that, we needed
to build a tool that allowed people to sign in to an app and
bring some of their information, and some of their
friends’ information, to those apps. We made it very clear
that this is how it worked, and – and when people
signed up for Facebook, they signed up for that as well. Now, a lot of good use
cases came from that. I mean, there were
games that were built. There were integrations
with companies that, I think, we’re familiar
with, like Netflix and Spotify. But over time, what became clear
was that that also enabled some abuse. And that’s why in 2014, we
took the step of changing the platform. So now, when people
sign in to an app, you do not bring some of your
friends’ information with you. You’re only bringing your own
information and you’re able to connect with friends who
have also authorized that app directly. MORAN: Let me turn to the
bug – your Bug Bounty program. Our subcommittee has had
hearings in – a hearing in regard to Bug Bounty. Your press release indicated
that was one of the six changes that Facebook initially offered
to crack down on platform abuses was to reward outside parties
who find vulnerabilities. One concern I have regarding the
utility of this approach is that the vulnerability disclosure
programs are normally geared toward identifying
unauthorized access to data, not pointing out data-sharing
arrangements that likely could harm someone, but technically
they abide by complex consent agreements. How do you see the Bug Bounty
program that you’ve announced addressing the issue of that? ZUCKERBERG: Sorry, could you –
could you clarify what – what specifically … MORAN: How do you – how do you
see that the Bug Bounty program that you are – have announced
will deal with the sharing of information not permissible, as
compared to just unauthorized access to data? ZUCKERBERG: Senator, I’m
not – I’m not too sure I – I understand this enough to –
to speak to – to that specific point, and I can have my
team follow up with you on the details of that. In general, bounty programs
are an important part of the security arsenal for
hardening a lot of systems. I – I think we should expect
that we’re going to invest a lot in hardening our
systems ourselves, and that we’re going to audit
and investigate a lot of the folks in our ecosystem. But even with that, having
the ability to enlist other third parties outside of the
company to be able to help us out by giving them an incentive
to point out when they see issues, I think is likely going
to help us improve the security of the platform overall,
which is why we did this. MORAN: Thank you,
Mr. Zuckerberg. THUNE: Thank you, Senator Moran. Next up is Senator Booker. BOOKER: Thank you, Mr. Chairman. Hello, Mr. Zuckerberg. As you know, much of my life
has been focused on low-income communities, poor communities,
working-class communities, and trying to make sure
they have a fair shake. This country has a very bad
history of discriminatory practices towards low-income
Americans and Americans of color, from the
redlining FHA practices, even to more recently really
just discriminatory practices in the mortgage business. I’ve always seen technology as
a promise to democratize our nation, expand access,
expand opportunities. But unfortunately, we’ve
also seen how platforms, technology
platforms like Facebook, can actually be used to double
down on discrimination and – and give people more sophisticated
tools with which to discriminate. Now in – in 19 –
in 2000 – in 2016, ProPublica revealed that
advertisers could use ethnic affinity, a users race to
market categories to potentially discriminate overall against
Facebook users in the areas of housing, employment and credit,
echoing a dark history in this country, and – and also in
violation of federal law. In 2016, Facebook
committed to fixing this, that the advertisers who
have access to this data, to fixing it. But unfortunately a year
later as – as – as ProPublica’s article showed, they found that
the system Facebook built was still allowing housing ads
without applying – to go forward without applying these new
restrictions that were put on. Facebook then opted into a system
that’s very similar to what we’ve been talking about
with Cambridge Analytica, that they could self certify
that they were not engaging in these practices and
complying with federal law, using this self certification
away and – and – to – to overcome and to comply
with, rather, Facebook’s anti-discrimination policy. Unfortunately,
a recent lawsuit alleges that, as of February 2018,
discriminatory ads were still being created on Facebook,
still disproportionately impacting low-income communities
and communities of color. Given the fact that you allow
Cambridge Analytica to self certify in a way that I
think – at least I think you’ve expressed regret over, is
self certification the best and strongest way to safeguard –
guard against the misuse of your platform and protect
the data of users, not let it be manipulated in
such a discriminatory fashion? ZUCKERBERG: Senator, this is a
– a – a very important question and, in general, I think over
time we’re going to move towards more proactive
review, with more A.I. tools to help flag
problematic content. In the near term, we have a
lot of content on the platform, and we – it’s – it’s hard to
review every single thing up front. We do a quick screen. But I – I agree with you that
I think in – in this specific case, I’m not happy
with where we are, and I – I think it makes sense
to – to really focus on making sure that these areas
get more reviews sooner. BOOKER: And I – and I know
you understand that there is a growing distrust and I
know a lot of civil rights organizations have met with
you about Facebook’s sense of urgency to address these issues. There’s a distrust that stems
from the fact and I know – I’ve had conversations with leaders
in Facebook about the lack of diversity in the
tech sector as well, people who are
writing these algorithms, people who are actually
policing for this data, or policing for these problems,
are they going to be a part of a more diverse group
that’s looking at this? You’re looking to
hire, as you said, 5,000 new positions for among
other things reviewing content, but we know in your
industry, the inclusivity, it – it’s a real serious problem
that you are an industry that lacks diversity in a
very dramatic fashion. It’s not just
true with Facebook; it’s true with the
tech area as well. And – and so it’s very important
for me to – to communicate that larger sense of urgency, and –
and what a lot of civil rights organizations are
concerned with, and – and we should be
working towards more – a more collaborative approach. BOOKER: And I’m wondering if
you’d be open to opening your platform for civil rights
organizations to really audit a lot of these companies dealing
in areas of credit and housing, to really audit what is actually
happening and better have more transparency in
working with your platform. ZUCKERBERG: Senator, I
think that’s a very good idea. And I think we should follow
up on the details of that. BOOKER: I also want to say
that – that there was an investigation. Something very
disturbing to me is the fact that there have been
law enforcement organizations that use Facebook’s platform
to – to – to surveil African American organizations
like Black Lives Matter. I know you’ve expressed
support for the group, and Philando Castile’s killing
was broadcast live on Facebook. But there are a lot of
communities of color worried that that data can be used to
surveil groups like Black Lives Matter, like folks who are
trying to organize against substantive issues of
discrimination in this country. Is this something that
you’re committed to addressing, and to ensuring that the
freedoms that civil rights activists and
others are not targeted, or their work not being
undermined or people not using your platform to unfairly
surveil and try to undermine the activities that
those groups are doing? ZUCKERBERG: Yes, Senator. I think that
that’s very important. We’re – we’re committed to that. And in general, unless law
enforcement has a very clear subpoena or ability or –
or reason to get access to information, we’re going to push
back on that across the board. BOOKER: And then I’d just like,
for the record – my time has expired … GRASSLEY: Yeah. BOOKER: … but there’s a lawsuit against
Facebook about discrimination. And you moved for the lawsuit to
be dismissed because no harm was shown. Could you please submit to the
record – do you believe that people of color who were not
recruited for various economic opportunities are being harmed? Can you please clarify why
you moved for – to dismiss that lawsuit, for the record? GRASSLEY: For the record. Senator Heller’s up next. I’ll go to you. HELLER: All right, Mr. Chairman. Thank you. Appreciate the time, and
thank you for being here. I’m over here. Thanks. And thank you for taking time. I know it’s been a long day, and
I think you’re at the – at the final stretch, here. But I’m glad that you are here. Yesterday Facebook sent out a
notification to 87 million users that information was given to
Cambridge Analytica without their consent. My daughter was one
of the 87 million, and six of my
staff, all from Nevada, received this notification. Can you tell me how many
Nevadans were among the 87 million that
received this notification? ZUCKERBERG: Senator, I don’t
have this broken out by state right now. But I can have my team follow
up with you to get you the information. HELLER: Okay, okay. I figured that
would be the answer. If, after hearing this –
going through this hearing and Nevadans no longer want to
have a Facebook account, if – if that’s the case, if a
Facebook user deletes their account, do you
delete their data? ZUCKERBERG: Yes. HELLER: My kids have been on
Facebook and Instagram for years. How long do you
keep a user’s data? ZUCKERBERG: Sorry, can … HELLER: How long do
you keep a user’s data, once they – after –
after they’ve left? If they – if they choose
to delete their account, how long do you keep their data? ZUCKERBERG: I don’t know the
answer to that off the top of my head. I know we try to delete it
as quickly as is reasonable. We have a lot of
complex systems, and it – it takes a while
to work through all that. But I think we try to
move as quickly as possible, and I can follow up or
have my team follow up … HELLER: Yeah. ZUCKERBERG: … to get you the –
the data on that. HELLER: Okay. Have you ever said that you
won’t sell an ad based on personal information? Simply that – that you wouldn’t
sell this data because the usage of it goes too far? ZUCKERBERG: Senator,
could you clarify that? HELLER: Have you ever drawn
the line on selling data to an advertiser? ZUCKERBERG: Yes, senator. We don’t sell data at all. So the – the way the ad system
works is advertisers can come to us and say, I – I have a message
that I’m trying to reach a certain type of people. They might be
interested in something, they might live in a place,
and then we help them get that message in front of people. But this is one of the – it’s
– it’s widely mischaracterized about our system
that we sell data. And it’s actually one of the
most important parts of how Facebook works is that
we do not sell data. Advertisers do not get access
to people’s individual data. HELLER: Have you ever collected
the content of phone calls or messages through any
Facebook application or service? ZUCKERBERG: Senator, I don’t
believe we have ever collected the content of – of phone calls. We have an app called Messenger
that allows people to message most of their Facebook friends. And we do on – in the Android
operating system allow people to use that app as their client
for both Facebook messages and texts. So we do allow people to
import their texts into that. HELLER: Okay. Let me ask you about
government surveillance. For years Facebook said that
there’d be – that there should be strict limits of the
information the government can access on Americans. And by the way, I agreed
with you that privacy – because privacy is
important to Nevadans. You argue that Facebook
users wouldn’t trust you if they thought you were giving
their private information to the intelligence community. Yet you use and sell the
same data to make money. And in the case of
Cambridge Analytica, you don’t even know how
it’s used after you sell it. Can you tell us why
this isn’t hypocritical? ZUCKERBERG: Well
senator, once again, we don’t sell any
data to anyone. We don’t sell it to advertisers,
and we don’t sell it to developers. What we do allow is for people
to sign in to apps and bring their data – and it used to be the
data of some of their friends, but now it isn’t – with them. And that I think makes sense. I mean, that’s
basic data portability. The ability that
you own the data, you should be able to take it
from one app to another if you’d like. HELLER: Do you believe you’re
more responsible with millions of Americans’ personal data than
the Federal government would be? ZUCKERBERG: Yes. But, senator, the – your
point about surveillance, I think that there’s a very
important distinction to draw here, which is that when – when
organizations do surveillance people don’t have
control over that. But on Facebook, everything that
you share there you have control over. You can – you can say I don’t
want this information to be there. You have full access
to understand all, every piece of information that
Facebook might know about you, and you can get
rid of all of it. And I – I don’t know of
any other – any surveillance organization in the world
that operates that way, which is why I think that
that comparison isn’t really apt here. HELLER: With you here today,
do you think you’re a victim? ZUCKERBERG: No. HELLER: Do you think Facebook
as a company is a victim? ZUCKERBERG: Senator, no. I think that we have a
responsibility to protect everyone in our community from
anyone in – in our ecosystem who is going to
potentially harm them. And I think that we haven’t
done enough historically … HELLER: Do you consider … ZUCKERBERG: … and we need to
step up and do more. HELLER: Do you consider the 87
million users – do you consider them victims? ZUCKERBERG:
Senator, I think yes. I mean, they – they did not want
their information to be sold to Cambridge
Analytica by a developer. And – and that happened,
and it happened on our watch. So even though we didn’t do it,
I think we have a responsibility to be able to prevent that and
be able to take action sooner. And we’re committing to make sure
that we do that going forward. ZUCKERBERG: Which is why the
steps that I – that I announced before are now, they’re the two
most important things that we’re doing are locking down the
platform to make sure that developers can’t get access to
that much data so this can’t happen again going forward,
which I think is largely the case since 2014, and going
backwards we need to investigate every single app that might have
had access to a large amount of people’s data to make sure that
no one else was misusing it. If we find that they are, we’re
going to get into their systems, do a full audit, make sure they
delete it and we’re going to tell everyone who’s affected. HELLER: Mr. Chairman, thank you. THUNE: Thank you,
Senator Heller. We’ll go to Senator Peters and
then into the break and then Senator Tillis
coming out of the break. So Senator Peters. PETERS: Thank you, Mr. Chairman. Mr. Zuckerberg, thank
you for being here today. You know, you’ve talked about
your very humble beginnings in starting Facebook in
– in your dorm room, which I appreciated that
story but certainly Facebook has changed an awful lot over a
relatively short period of time. When Facebook launched
its timeline feature,
friends post chronologically, was the process. But Facebook has since then
changed to a timeline driven by some very
sophisticated algorithms. And I think it has
left many people, as a result of
that, asking, you know, why – why am I seeing this –
this feed and why am I seeing this right now. And now, in light of the
Cambridge Analytica issue, Facebook users are
asking, I think, some new questions right now. Can I believe what I’m seeing
and who has access to this information about me? So I think it’s safe
to say, very simply, that Facebook is losing
the trust of an awful lot of Americans as a
result of this incident. And – and I think an example of
this is something that I’ve been hearing a lot from folks that
have been coming up to me and talking about really, kind of
the experience they’ve had, where they’re having a
conversation with friends. Not on the phone, just talking. And then they see ads popping
up fairly quickly on their Facebook. So I’ve heard constituents fear
that Facebook is mining audio from their mobile devices for
the purpose of ad targeting. Which I think speaks to this
lack of trust that we’re seeing here, but – and I understand
there’s some technical issues and logistical issues
for that to happen. But for the record, I
think it’s clear – see, I hear it all the time,
including from my own staff. Yes or no, does Facebook use
audio obtained from mobile devices to enrich personal
information about its users? ZUCKERBERG: No. PETERS: The … ZUCKERBERG: Well, senator, let
me be – let me be clear on this. So you’re – you’re talking about
this conspiracy theory that gets passed around that we
listen to what’s going on, on your microphone
and use that for ads. PETERS: Right. ZUCKERBERG: We don’t do that. To be clear, we do allow people
to take videos on their – on their devices and
– and share those. And of course
videos also have audio, so – so we do, while
you’re taking a video, record that and use that to make
the service better by making sure that your
videos have audio. But I – I mean that, I
think, is pretty clear, but I just wanted to make
sure I was exhaustive there. PETERS: Well, I appreciate that. And hopefully that’ll dispel a
lot of what I’ve been hearing, so thank you for saying that. Certainly the – today,
in the era of mega data, we are finding that
data drives everything, including consumer behavior. And so consumer information’s
probably the most valuable information you can get
in the data ecosystem. And certainly folks, as you’ve
mentioned in your testimony here, people like the fact that
they can have targeted ads that they’re going to be interested
in as opposed to being bombarded by a lot of ads that they
don’t have any interest in; and that consumer information
is important in order for you to tailor that. But also, people are now
beginning to wonder is there an expense to that when it comes to
perhaps exposing them to being manipulated or
through deception. You’ve talked about
artificial intelligence, you brought that up many
times during your testimony. And I know you’ve employed some
new algorithms to target bots, bring down fake
accounts, deal with terrorism, things that you’ve
talked about in this hearing. PETERS: But you also know that
artificial intelligence is not without its risk and that you
have to be very transparent about how those
algorithms are constructed. How do you see
artificial intelligence, more specifically, dealing with
the ecosystem by helping to get consumer insights, but also
keeping consumer privacy safe. ZUCKERBERG: Senator, I think
the – the core question you’re asking about, A.I. transparency, is a really
important one that people are just starting to
very seriously study, and that’s ramping up a lot. And I think this is going to be
a very central question for how we think about A.I. systems over the
next decade and beyond. Right now, a lot of our A.I. systems make decisions in
ways that people don’t really understand. PETERS: Right. ZUCKERBERG: And I don’t
think that in 10 or 20 years, in the future that
we all want to build, we want to end up with systems
that people don’t understand how they’re making decisions. So having – doing the research
now to make sure that the – that these systems can have those
principles as we’re developing them, I think is certainly a –
an extremely important thing. PETERS: Well, you bring
up the – the principles. Because, as you’re
well aware, A.I. systems, especially in very
complex environments when you have machine learning, it’s
sometimes very difficult to understand, as you mentioned,
exactly how those decisions were arrived at. There’s examples of how
decisions are made in a discriminatory basis, and they
can compound if you’re not very careful about how that occurs. And so, is your company –
you mentioned principles. Is your company developing a set
of principles that are going to guide that development? And would you provide details to
us as to what those principles are and how they will
help deal with this issue? ZUCKERBERG: Yes, senator. I can make sure that our team
follows up and gets you the information on that. And we have a whole A.I. ethics team that is working
on developing basically the technology. It’s not just about
philosophical principles; it’s also a technological
foundation for making sure that this goes in the
direction that we want. PETERS: Thank you. THUNE: Thank you,
Senator Peters. We’ll recess for
five, and come back in. So we’ll give Mr.
Zuckerberg a quick break here. Thanks. (RECESS) THUNE: We’re back. Final stretch. And Senator
Tillis is recognized. TILLIS: Thank
you, Mr. Zuckerberg, for being here. I think you’ve done a good job. I’ve been here for
most of it – the session, except for about 20 minutes I
watched on television back in my office. I was googling
earlier – actually, going on my Facebook
app on my phone earlier, and I found one of your
Facebook page – yeah, one of your Facebook presences is
– it was the same one on March 30th, I think you posted
a pic of a first stater. But further down, you listed out
the facts since the new platform was released in 2007;
sort of, a timeline. You start with 2007, then you
jump to the Cambridge Analytica issue. I actually think that we need
to fully examine what Cambridge Analytica did. They either broke a
kind of code of conduct. If they broke any other rules
or agreements with you all, I hope that they
suffer the consequences. TILLIS: But I think that
timeline needs to be updated. And it really needs to go back
– I’ve read a series of three articles that were published in
the MIT Technology Review back in 2012, and it talks about how
proud the Obama campaign was of exploiting data on
Facebook in the 2012 campaign. In fact, somebody asked you
earlier if it made you mad about what Cambridge Analytica did,
and you rightfully answered yes, but I think you should probably
be equally mad when a former campaign director of the
Obama campaign proudly tweeted “Facebook was surprised we
were able to suck out the social graph, but they didn’t stop us
once they realized that was what we were doing.” So you clearly
had some people in your employ that apparently knew it, at
least that’s what this person said on Twitter, and thank
goodness for Wayback and some of the other
history-grabber machines. I’m sure we can get this tweet
back and get it in the right context. I think when you
do your research, it’s important to
get the whole view. I’ve worked in data-analytics
practice for a good part of my career, and for anybody to
pretend that Cambridge Analytica was the first
person to exploit data, clearly doesn’t work or hasn’t
worked in the data-analytics field. So when you go back and do your
research on Cambridge Analytica, I would personally appreciate
it if you’d start back from the first known high-profile
national campaign that exploited Facebook data. In fact, they published an
app that said it would grab information about my
friends, their birth dates, locations and likes. So presumably if I downloaded
that app that was published by the Obama campaign, I’ve got
4,900 friends on my Facebook page. I delete the haters and save
room for family members and true friends on my personal page,
as I’m sure everybody does. And that means if I
clicked yes on that app, I would have approved
the access of birth dates, locations, and likes of some
4,900 people without their consent. So as you do the chronology, I
think it’d be very helpful so that we can take away the
partisan rhetoric that’s going on like this is a
Republican-only issue. It’s a – it’s a broad based
issue that needs to be fixed. And bad actors at either end of
the political spectrum need to be held accountable, and I – and
I trust that you all are going to work on that. I think the one thing
that I – so for that, I just want to get to the facts,
and there’s no way you could answer any of the questions,
I’m not going to burden you with that. But I think getting that
chronology would be very helpful. The one thing I would encourage
people to do is go to Facebook. I’m – I’m a proud
member of Facebook, just got a post from my sister
on this being National Sibling Day, so I’ve connected with four
or five of my staff while I was giving you my undivided – or
family undivided attention. But go to the privacy tab. If you don’t want
to share something, don’t share it. This is a free service. Go on there and say I don’t
want to allow third party search engines to get in
my Facebook page. Go on there and say only
my friends can look at it. Go on there and understand
what you’re signing up for. It’s a free app. Now you need to do more. And I think it would be helpful. I didn’t read your disclaimer
page or the terms of use, because there is nowhere in there that I could get an attorney and negotiate the terms. So it was a take-it-or-leave-it terms of use. I went on there, and then I used the
privacy settings to be as safe as I could be with a
presence on Facebook. Last thing, we talk about all
these proposed legislation, good ideas, but I have one
question for you: When you were developing this
app in your dorm, how many people did you have
in your regulatory affairs division? Exactly. So if government takes a handy
– heavy-handed approach to fix this problem, then we know very
well that the next Facebook – the next thing that you’re going to wake up and worry about as you try to stay relevant as the behemoth that you are today – is probably not going to happen. TILLIS: So we – I think that
there’s probably a place for some regulatory guidance here,
but there’s a huge place for Google, Snapchat, Twitter, all
the other social-media platforms to get together and
create standards. And I also believe that that
person who may have looked the other way when the whole social
graph was extracted for the Obama campaign, if
they’re still working for you, they probably shouldn’t, or at
least there should be a business code of conduct that says,
you don’t play favorites, you’re trying to create a fair
place for people to share their ideas. Thank you for being here. THUNE: Thank you,
Senator Tillis. Senator Harris. HARRIS: Thank you. Thank you for being here. I’ve been here for – on and off
for the last four hours since you’ve been testifying. And I have to tell you,
I’m concerned about how much Facebook values
trust and transparency, if we agree that a critical
component of relationship of trust and transparency is we
speak truth and we get to the truth. During the course
of this hearing, these last four hours, you
have been asked several critical questions for which
you don’t have answers. And those questions have
included whether Facebook can track users’ browsing activity
even after the user has logged off of Facebook, whether
Facebook can track your activity across devices even when you
are not logged into Facebook. Who is Facebook’s
biggest competition? Whether Facebook may store
up to 96 categories of users’ information. Whether you knew what Kogan’s terms of service said, and whether you knew if Kogan could sell or transfer data. And then another case in point,
specifically as it relates to Cambridge Analytica is,
and a concern of mine, is that you – meaning Facebook
– and I’m going to assume you personally as a CEO – became
aware in December 2015 that Dr. Kogan and Cambridge Analytica
misappropriated data from 87 million Facebook users. That’s 27 months ago that
you became – as Facebook – and perhaps you
personally became aware. However a decision was
made not to notify the users. So my question is, did anyone at
Facebook have a conversation at the time that you
became aware of this breach, and have a conversation wherein the decision was made not to contact the users? ZUCKERBERG: Senator, I
don’t know if there were any conversations at Facebook
overall because I wasn’t in a lot of them. But … HARRIS: On that subject. ZUCKERBERG: Yes. I mean, I’m not sure what
other people discussed. Are – at the time – in 2015
we heard the report that this developer, Aleksandr Kogan,
had sold data to Cambridge Analytica. That’s in
violation of our terms. HARRIS: Correct, and were you
a part of a decision – were you part of a discussion that
resulted in a decision not to inform your users? ZUCKERBERG: I don’t remember
a conversation like that. But the reason why … HARRIS: Are you aware of anyone
in the leadership at Facebook who was in a conversation
where a decision was made not to inform your users? Or do you believe no such
conversation ever took place? ZUCKERBERG: I’m not sure whether
there was a conversation about that. But I can tell you the thought
process at the time of the company, which was that in
2015, when we heard about this, we banned the developer and we
demanded that they delete all of the data and stop using it, and
same with Cambridge Analytica. (CROSSTALK) HARRIS: And I’ve
heard your testimony in that regard, but I’m talking
about notification of the users. And this relates to the
issue of transparency and the relationship of trust, informing
the user about what you know in terms of how their personal
information has been misused. And I’m also concerned that when
you personally became aware of this, did you or senior
leadership do an inquiry to find out who at Facebook
had this information, and did they not have a
discussion about whether or not the users should be
informed back in December 2015? ZUCKERBERG:
Senator, in retrospect, I think we clearly viewed it as
a mistake that we didn’t inform people and we did that based
on false information that we thought that the case was closed
and that the data had been deleted. HARRIS: So there was a decision
made on that basis not to inform the users. Is that correct? ZUCKERBERG: That’s
my understanding. Yes. HARRIS: Okay. And … ZUCKERBERG: But I – I – in
retrospect I think that was a mistake and knowing
what we know now, we should have handled a lot
of things here differently. HARRIS: I appreciate that point. Do you know when that decision
was made not to inform the users? ZUCKERBERG: I don’t. HARRIS: Okay. Last November the Senate
Intelligence Committee held a hearing on social
media influence. I was a part of that hearing. I submitted 50 written questions
to Facebook and other companies and the responses that we
received were unfortunately evasive and some were
frankly nonresponsive. So I’m going to ask
the question again here. How much revenue did Facebook
earn from the user engagement that resulted from
foreign propaganda? ZUCKERBERG: Well senator,
what we do know is that the IRA, the Internet Research Agency,
the – the Russian firm ran about $100,000 worth of ads. I can’t say that we’ve
identified all of the foreign actors who are involved here. So, I – I – I can’t say that
that’s all of the money but that is what we have identified. HARRIS: Okay. My time is up. I’ll submit more
questions for the record. Thank you. THUNE: Thank you Senator Harris. Next up is Senator Kennedy. KENNEDY: Mr.
Zuckerberg, I come in peace. (LAUGHTER) I – I don’t want
to have to vote to regulate Facebook, but by God I will. That – a lot of
that depends on you. I’m a little disappointed
in this hearing today. I just don’t feel like
that we’re connecting. So – so let me try to lay it out
for you from my point of view. I think you are a
really smart guy. And I think you have built an
extraordinary American company and you’ve done a lot of good. Some of the things that you’ve
been able to do are magical. But our – our promised digital
utopia we have discovered has minefields. There – there’s some impurities
in the Facebook punch bowl. And they’ve got to be fixed
and I think you can fix them. Now here – here’s
what’s going to happen. There are going to be a whole
bunch of bills introduced to regulate Facebook. It’s up to you
whether they pass or not. You can go back home, spend $10
million on lobbyists and fight us or you can go back home and
help us solve this problem, and there are two. One is a privacy problem; the
other one is what I call a propaganda problem. Let’s start with the
privacy problem first. Let’s start with
the user agreement. Here’s what everybody’s
been trying to tell you today, and – and I say this gently. Your user agreement sucks. (LAUGHTER) You’re – you –
you can spot me 75 IQ points, if I can figure it out,
you can figure it out. The purpose of that user
agreement is to cover Facebook’s rear end. It’s not to inform your
users about their rights. KENNEDY: Now, you know
that and I know that. I’m going to suggest to you that
you go back home and rewrite it. And tell your
$1,200 an hour lawyers, no disrespect. They’re good. But – but tell them you
want it written in English and non-Swahili, so the average
American can understand it. That would be a start. Are you willing –
as a Facebook user, are – are you willing to give
me more control over my data? ZUCKERBERG: Senator, as
someone who uses Facebook, I believe that you should have
complete control over your data. KENNEDY: Okay. Are – are you willing to go back
and – and work on – on giving me a greater right
to erase my data? ZUCKERBERG: Senator, you can
already delete any of the data that’s there, or
delete all of your data. KENNEDY: Are – are you
willing to expand that, work on expanding that? ZUCKERBERG: Senator, I think we
already do what you’re referring to. But certainly, we’re always
working on trying to make these controls easier. KENNEDY: Are – are you willing
to expand my right to know who you’re sharing my data with? ZUCKERBERG: Senator, we already
give you a list of apps that – that you’re using. And you signed
into those yourself, and provided
affirmative consent. As I’ve said before … KENNEDY: Right. But when I use – on that – on
that – on that user agreement … ZUCKERBERG: … we don’t share any data with … KENNEDY: … are – are you willing to expand
my right to prohibit you from sharing my data? ZUCKERBERG: Senator, again, I
believe that you already have that control. So, I mean, I think people have
that – that full control in the system already today. If we’re not
communicating this clearly, then that’s a big thing
that we should work on. Because I think the principles
that you’re articulating are the ones that we believe in and try
to codify in the product that we build. KENNEDY: Are – are you willing
to give me the right to take my data on Facebook and move it to
another social media platform? ZUCKERBERG: Senator,
you can already do that. We have a
download-your-information tool, where you can go get a file
of all the content there, and then do
whatever you want with it. KENNEDY: And you’re – are – then
I assume you’re willing to give me the right to say, “I’m going to go in your platform, and you’re going to be able to tell a lot about me as a result, but I don’t want you to share it with anybody”? ZUCKERBERG: Yes, senator. And I believe you already
have that ability today. People can sign on and
choose to not share things, and just follow some friends or
some pages and read content if that’s what they want to do. KENNEDY: Okay. Let me be sure I under –
I’m about out of time. Oh, it goes fast, doesn’t it? Let me ask you one final
question in my 12 seconds. Could somebody
call you up and say, “I want to see John Kennedy’s file”? ZUCKERBERG: Absolutely not. KENNEDY: Could you – if – not –
not – could you – not would you do it. Could you do it? ZUCKERBERG: In – in theory. KENNEDY: Do you have
the right to put my data, a name on my data and
share it with somebody? ZUCKERBERG: I do not believe
we have the right to do that. KENNEDY: Do you
have the ability? ZUCKERBERG: Senator, the
data is in the system. So … KENNEDY: Do you
have the ability? ZUCKERBERG: Technically, I
think someone could do that. But that would be
a massive breach. So we would never do that. KENNEDY: It would be a breach? Thank you, Mr. Chairman. THUNE: Thank you,
Senator Kennedy. Senator Baldwin’s up next. BALDWIN: Thank
you, Mr. Chairman. Thank you for being here
and enduring a long day, Mr. Zuckerberg. I want to start with what I hope
can be a quick round of – of questions, just so I make
sure I understand your previous testimony, specifically with
regard to the process by which Cambridge Analytica was able to
purchase Facebook users’ data. So it was an app
developer, Aleksandr Kogan. He collected data via
a personality quiz. Is that correct? ZUCKERBERG: Yes. BALDWIN: Okay. And he thereby is able to gain access to not only the people who took the quiz
but their network, is that correct, too? ZUCKERBERG: Senator, yes. The terms of the platform at the
time allowed for people to share their information and some basic
information about their friends as well. And we’ve since
changed that, as of 2014. BALDWIN: And … ZUCKERBERG: Now,
that’s not possible. BALDWIN: And so, in total
about 87 million Facebook users. You earlier testified about the
two types of ways you gain data. One is what is voluntarily
shared by Facebook members and users. And the other is in order to –
I think you said improve your advertising experience, whatever
that exactly means – the data that Facebook collects in order
to customize or focus on that. Did – was Aleksandr Kogan able
to get both of those sets of data, or just what was
voluntarily entered by the user? ZUCKERBERG: Yes,
that’s a good question. It was just a subset of what
was entered by the person. And … BALDWIN: So, a subset of the
95 categories of data that you keep? ZUCKERBERG: Yes, when
you sign into the app … BALDWIN: Okay. ZUCKERBERG: … you – the app
developer has to say, here are the types of data
from you that I’m asking for, including public information
like your name and profile, the pages you follow, other
interests on your profile, that kind of content. BALDWIN: Okay. ZUCKERBERG: The app developer
has to disclose that up front, and you agree with it. BALDWIN: Okay. So, in answer to a couple of
other senators’ questions, specifically Senator Fischer,
you talked about Facebook storing this data and I think
you just talked about this data being in the system. I wonder if, outside of the
way in which Aleksandr Kogan was able to access this data,
whether you – could Facebook be vulnerable to a
data breach or hack, why or why not? ZUCKERBERG: Well, there are many
kinds of security threats that a company like ours faces,
including people trying to break in to our security systems. BALDWIN: Okay. And if you believe
that you had been hacked, do you believe you would have
the duty to inform those who were impacted? ZUCKERBERG: Yes. BALDWIN: Okay. Do you know whether Aleksandr
Kogan sold any of the data he collected with anyone other
than Cambridge Analytica? ZUCKERBERG: Senator, yes, we do. He sold it to a
couple of other firms. BALDWIN: Can you identify them? ZUCKERBERG: Yes,
there’s one called Eunoia, and there may have been a
couple of others as well. And I can follow up with … BALDWIN: Can you
furnish that to me after? ZUCKERBERG: Yes. BALDWIN: Thank you. I appreciate that. And then, how much do you know,
or have you tried to find out how Cambridge Analytica
used the data while they had it, before you believe
they deleted it? ZUCKERBERG: Since we just heard
that they didn’t delete it about a month ago, we’ve kicked off an
internal investigation to see if they used that data
in any of their ads, for example. That investigation
is still underway, and we will – we can come back
to the results of that once we have that. BALDWIN: Okay. I want to switch to my
home state of Wisconsin. According to press reports, my
home state of Wisconsin was a major target of Russian-bought
ads on Facebook in the 2016 election. These divisive ads, touching
on a number of very polarizing issues, were designed to
interfere with our election. We’ve also learned that Russian
actors using another platform, Twitter, similarly targeted
Wisconsin with divisive content aimed at sowing
division and dissent, including in the wake of a
police-involved shooting in Milwaukee’s Sherman Park
neighborhood in August of 2016. Now I find some encouragement in
the steps you’ve outlined today to provide greater transparency
regarding political ads. I do want to get further
information on how you can be confident that you have excluded
entities based outside of the United States. ZUCKERBERG: We’ll
follow up on that. BALDWIN: And then, I
think on that topic, if you require disclosure
of a political ad’s sponsor, what sort of transparency
will you be able to provide with regard to people who weren’t the
subject of that ad seeing its content? ZUCKERBERG: Senator, you’ll be
able to go to any page and see all of the ads that
that page has run. So if someone is
running a political campaign, for example, and they’re
targeting one district with one ad and another
district with another, historically it has been
hard to track that down, but now it will be very easy. You’ll just be able to look at
all of the ads that they’ve run, the targeting associated with
each to see what they’re saying to different folks, and in some
cases how much they’re spending on the ads, and all of
the relevant information. This is an area where I think
more transparency will really help discourse overall and
root out foreign interference in elections. THUNE: Thank you,
Senator Baldwin. BALDWIN: And will you … THUNE: Senator Johnson. JOHNSON: Thank
you, Mr. Chairman. Thank you, Mr. Zuckerberg,
for testifying here today. Do you have any idea how many
of your users actually read the terms of service,
the privacy policy, the statement of rights
and responsibilities? I mean, actually read it? ZUCKERBERG: Senator, I do not. JOHNSON: Would you imagine
it’s a very small percentage? ZUCKERBERG: Senator,
who read the whole thing? I would imagine that probably
most people do not read the whole thing. But everyone has the opportunity
to and consents to it. JOHNSON: Well, I agree. But that’s kind of true of
every application where, you know, you want to get to
it and you have to agree to it, and people just press that
“agree,” the vast majority, correct? ZUCKERBERG: Senator, it’s really
hard for me to make a full assessment, but … JOHNSON: Common sense would tell
you that would be probably the case. With all this publicity,
have you documented any kind of backlash from Facebook users? I mean, has there been a
dramatic falloff in the number of people who utilize Facebook
because of these concerns? ZUCKERBERG:
Senator, there has not. JOHNSON: You haven’t
even witnessed any? ZUCKERBERG: Senator, there was a
movement where some people were encouraging their friends to
delete their account and I think that got shared a bunch. JOHNSON: So it’s kind of safe
to say that Facebook users don’t seem to be overly concerned
about all these revelations, although obviously
Congress apparently is. ZUCKERBERG: Well, senator, I
think people are concerned about it. And I think these are incredibly
important issues that people want us to address. And I think people have
told us that very clearly. JOHNSON: But it seems like
Facebook users still want to use the platform because they enjoy
sharing photos and they share the connectivity
with family members, that type of thing. And that overrides their
concerns about privacy. You talk about the
user owns the data, you know, there are a number –
have been a number of proposals of having that data stay with
the user and allow the user to monetize it themselves. Your COO, Ms.
Sandberg, mentioned possibly, if you can’t utilize that
data to sell advertising, perhaps we would charge
people to go onto Facebook. JOHNSON: Have you
thought about that model, where the user data is actually
monetized by the actual user? ZUCKERBERG: Senator, I’m not
sure exactly how – how it would work for it to be monetized
by the person directly. In general, where – we believe
that the ads model is the right one for us because it aligns
with our social mission of trying to connect everyone and
bring the world closer together. JOHNSON: But – but you’re aware
of people making that kind of proposal, correct? ZUCKERBERG: Yes. I – Senator, a number of people
suggest that – that we should offer a version where people
cannot have ads if they pay a monthly subscription, and
certainly we consider ideas like that. I think that they’re reasonable
ideas to – to think through. But overall, the – I think that
the ads experience is going to be the best one. I think in general, people like
not having to pay for a service. A lot of people can’t afford
to pay for a service around the world, and this aligns
with our mission the best. JOHNSON: You answered Senator
Graham when he asked you if you thought you were a monopoly. You said that you didn’t think so. You’re obviously a big
player in the space. That might be an
area for competition, correct, if somebody else wants
to create a social platform that allows a user to
monetize their own data? ZUCKERBERG: Senator, yes. There are lots of new
social apps all the time. And as I said before, the
average American I think uses eight different
communication and social apps. So there’s a lot of different
choice and a lot of innovation and activity going
on in this space. JOHNSON: I want – in a
very short period of time. You talked about the difference
between advertisers and application developers. Because those – again, you – you
said in earlier testimony that advertisers have no
access to data whatsoever. But application developers do? Now, is that only through their
own service agreement with their customers, or do they
actually access data as they’re developing applications? ZUCKERBERG: Senator, this
is an important distinction, so thanks for giving me the
opportunity to clarify this. People – we give people the
ability to take their data to another app if they want. And this is a question that
Senator Kennedy asked me just a few minutes ago. The reason why we designed the
platform that way is because we – we thought it would be very
useful to make it so that people could easily bring their data
to other – to other services. Some people inside the company
argued against that at the time because they were
worried that – they said hey, we should just make it so
that we can be the only ones who develop this stuff, but … JOHNSON: But again,
that’s – that’s the … ZUCKERBERG: … we thought that that was a – a
useful thing for people to … JOHNSON: … that’s the user agreeing to
allow you to share – when they’re using that app, to allow
Facebook to share that data. Does the developer ever have
access to that prior to users using it? Meaning developing
the application. Because you used the
term “scraped” data. What does that mean? Who scraped the data? ZUCKERBERG: Yes, senator. This is a good question. So there’s the
developer platform, which is the sanctioned way
that an app developer can ask a person to access information. We also have certain features
and certain things that are public, right? A lot of the information
that people choose to put on Facebook, they’re sharing
with everyone in the world. Not privately, but, you
know, you put your name, you put your profile picture,
that’s public information that people put out there. And sometimes people who
aren’t registered developers at Facebook try to load a lot of
pages in order to get access to a bunch of people’s public
information and aggregate it. We fight back hard against that,
because we don’t want anyone to aggregate information, even if
people made it public and chose to share it with everyone. JOHNSON: Okay. Thank you, Mr. Chairman. THUNE: Thank you,
Senator Johnson. Senator Hassan? HASSAN: Thank you, Mr. Chair. Thank you, Mr. Zuckerberg,
for being here today. I want to talk to a
couple of broader issues. I’m concerned that Facebook’s
profitability rests on two potentially
problematic foundations. And we’ve heard other senators
talk about this a little today. The foundations are maximizing
the amount of time people spend on your products and
collecting people’s data. HASSAN: I’ve looked at
Facebook’s 2017 corporate financial statement, where you
lay out some of the major risks to your business. One risk is a
decrease in, and I quote, “user engagement, including time spent on our products.” That concerns me because of the
research we’ve seen suggesting that too much time spent on
social media can hurt people’s mental health,
especially young people. Another major risk to your
business is the potential decline in – and here’s another
quote – “the effectiveness of our ad targeting or the
degree to which users opt out of certain types of ad targeting,
including as a result of changes that enhance the user’s
privacy.” There’s clearly tension, as other
senators have pointed out, between your bottom line and
what’s best for your users. You’ve said in your testimony
that Facebook’s mission is to bring the world closer together,
and you’ve said that you will never prioritize
advertisers over that mission. And I believe that
you believe that. But at the end of the day, your
business model does prioritize advertisers over the mission. Facebook is a
for-profit company, and as the CEO you have a legal
duty to do what’s best for your shareholders. So given all of that, why
should we think that Facebook, on its own, will ever truly be
able to make the changes that we need it to make to protect
Americans’ well-being and privacy? ZUCKERBERG: Well, senator,
you’ve raised a number of important points in there,
so just let me respond in … HASSAN: Sure. ZUCKERBERG: … in a couple of different ways. The first is that I think it’s
really important to think about what we’re doing, is building
this community over the long term. Any business has the opportunity
to do things that might increase revenue in the short term,
but at the expense of trust or building engagement over time. What we actually find is not
necessarily that increasing time spent, especially not
just in the short term, is going to be best
for our business. It actually – it aligns
very closely with – with the well-being
research that we’ve done. That when people are
interacting with other people, and posting and basically
building relationships, that is both correlated with
higher measures of well-being, health, happiness,
not feeling lonely, and that ends up being better
for the business than when they’re doing lower value things
like just passively consuming content. So I think that that’s – that’s
an important point to – to … HASSAN: Okay, but – and I
understand the point that you’re trying to make here, but
here’s what I’m concerned about. We have heard this point from
you over the last decade-plus. Since you’ve founded Facebook –
and I understand it – you’ve – you founded it pretty much as
a solo entrepreneur with your roommate. But now, you know, you’re
sitting here at the head of a bazillion dollar company,
and we’ve heard you apologize numerous times and
promise to change, but here we are again, right? So I really firmly
believe in free enterprise, but when private companies are
unwilling or unable to do what’s necessary, public
officials have, historically, in every industry,
stepped up to protect our constituents and consumers. You’ve supported
targeted regulations, such as the Honest Ads Act,
and that’s an important step for election integrity. I’m proud
to be a co-sponsor of that bill. But we need to address
other, broader issues as well. And today you’ve said you’d
be open to some regulation, but this has been a
pretty general conversation. So will you commit to working
with Congress to develop ways of protecting constituent
privacy and well-being, even if it means that that
results in some laws that will require you to adjust
your business model? ZUCKERBERG: Senator, yes. We will commit to that. I think that that’s an
important conversation to have. Our position is not
that regulation is bad. I think the Internet is so
important in people’s lives, and it’s getting more important. HASSAN: Yes. ZUCKERBERG: The expectations
on Internet companies and technology companies
overall are growing, and I think the
real question is, “What is the right framework for this?” not “Should there be one?” HASSAN: That
is very helpful, and I think the other question
– and it doesn’t just go to Facebook – is whether the
framework should include financial penalties
when large providers, like Facebook, are breached
and privacy is compromised as a result. Because right now, there is very
little incentive for whether it’s Facebook or Equifax to
actually be aggressive in protecting customer privacy and
looking for potential breaches or vulnerabilities in their systems. So what we hear after the fact – after people’s privacy has been breached, after they’ve taken the harm that comes with that, and considerable inconvenience in addition to the harm – is apologies. But there
is no financial incentive right now it seems to me for these
companies to aggressively stand in their consumers’ stead
and protect their privacy. And I would really look forward
to working with you on that, and getting your
considered opinion about it. ZUCKERBERG: Well senator,
we – we look forward to – to discussing that with you. I would disagree however that we
have no financial incentive or incentive overall to do this. This episode has
clearly hurt us, and has clearly made it harder
for us to achieve the social mission that we care about. And we now have to a lot of work
around building trust back which – which is – is a really
important part of this. HASSAN: Well, I thank you. My time is up and – and I’ll
follow up with you on that. GRASSLEY: Senator Capito. CAPITO: Thank you,
Chairman Grassley. Thank you, Mr. Zuckerberg,
for being here today. I – I want to ask just
kind of a process question. You’ve said more than a few
times that Facebook users can delete from their own
account at any time. Well, we know and of course I
do – I’ve got grandchildren now with children. You tell your children, once you
make that mark in – in or in – in the Internet system it
never really goes away. So my question to you is, if
once – and I think you answered that – that once an individual
deletes the information from their page it’s gone forever
from Facebook’s archives. Is that correct? ZUCKERBERG: Yes. And I think you raise
a good point though, which is that it is – we will
delete it from our systems but if you shared something to
someone else then we can’t guarantee that they don’t
have it somewhere else. CAPITO: Okay. So if somebody leaves Facebook
and then rejoins and asks Facebook, can you
recreate my past, your answer would be? ZUCKERBERG: If they
delete their account, the answer is no. That’s why we
actually offer two options. We offer deactivation, which
allows you to shut down or suspend your account, but
not delete the information. Because actually a lot of people
want to – at least for some period of time. I mean we hear students with
exams coming up want to not be on Facebook because they want to
make sure they can focus on the exam. So they deactivate
their account temporarily, but then want the ability to
turn it back on when they’re ready. You can also
delete your account, which is wiping everything. If you do that, then
you can’t get it back. CAPITO: You can’t get it back. It’s gone from your archives? ZUCKERBERG: Yes. CAPITO: But is it
ever really gone? ZUCKERBERG: From our systems? CAPITO: From – from the cloud
or wherever it – wherever it is. I mean, it always seems
to be able to reappear in investigations and other things. Not necessarily Facebook, but
some other emails and – and other things of that nature. What about the
information going from the past? The information that’s already
been in the Cambridge Analytica case? You can’t really go
back and redo that. So I’m going to assume that what
we’ve been talking with and with the improvements that you’re
making now at Facebook are from this point forward. Is that a correct assumption? ZUCKERBERG: Senator, I actually
do think we can go back in some cases. And that’s why one of the things
that I announced is that we’re going to be investigating every
single app that had access to a large amount of information before
we locked down the platform in 2014. And if we find any
pattern of suspicious activity, then we’re going to go do a
full audit of their systems. And if we find that
anyone’s improperly using data, then we’ll take action to make
sure that they delete the data, and we’ll inform everyone who
– who may have had their data misused. CAPITO: Okay, other –
other suggestion I would make, because we’re kind of
running out of time here, is you’ve heard more
than a few complaints, and I join the chorus, of the –
the lapse in the time of when you discovered and when
you became transparent. And I understand you sent out
two messages just today to – to users. So I would say – you say
you regret that decision, that you wish you’d been
more transparent at the time, so I would imagine if in the
course of your investigation, you find more
breaches so to speak, that you will be reinforming
your Facebook customers. ZUCKERBERG: Yes,
that is correct. We have already committed that
if we find any improper use, we will inform
everyone affected. CAPITO: Okay, thank you. You’ve said also that you
want to have an active view on controlling your ecosystem. Last week the FDA
Commissioner Scott Gottlieb, addressed the Drug Summit
in Atlanta and spoke on the national opioid epidemic. My state, I’m
from West Virginia, and thank you for visiting
and next time you visit, if you would please bring
some fiber because we don’t have connectivity in – in our
rural areas like we really need, and Facebook could
really help us with that. So – so Commissioner Gottlieb
called up – called upon social media and Internet
service providers, and he mentioned Facebook
when he talked about it, to try to disrupt the sale –
the sale of illegal drugs and particularly the
powerful opioid, fentanyl, which has been
advertised and sold online. I know you have
policies against this, the commissioner is announcing
his intention to convene a meeting of chief executives
and senior leaders and I want to know – can I get a commitment
from you today that Facebook will commit to having a
representative with Commissioner Gottlieb to finalize
with this meeting? ZUCKERBERG: Senator, that sounds
like an important initiative, and we will send someone. And let me also say that on
your point about connectivity, we do have a – a group at
Facebook that is working on trying to spread Internet
connectivity in rural areas, and we would be happy to follow
up with you on that as well. That’s something that
I’m very passionate about. CAPITO: That’s good. That’s good news. Last question I have,
just on the advertising, if somebody advertises on
Facebook and somebody purchases something, does Facebook get a
percentage or any kind of a fee associated with a successful
purchase from an advertiser? ZUCKERBERG: Senator, no. The way that the system works
is people – advertisers bid how much it’s worth it to them to
show an ad or when an action happens. So it’s not that we would
get a percent of the sale, but let’s – let’s
just use an example. So let’s say you have –
you’re an app developer, and you – your goal is you want
to get more people to install your app. You could bid in the ad system
and say I will pay $3 anytime someone installs this app. And then we basically calculate
on – on our side which ads are going to be relevant for people,
and we have an incentive to show people ads that are going to
be relevant because we only get paid when it
delivers a business result, and – and that’s
how the system works. CAPITO: So it – it could be
one – you could be paid for the advertisement. I mean for the sale. ZUCKERBERG: We – we get
paid when the action that the advertiser wants to –
to happen, happens. CAPITO: All right, thank you. THUNE: Senator –
Senator Cortez Masto? CORTEZ MASTO: Thank you. Mr. Zuckerberg, thank you. It’s been a long afternoon and
I – I appreciate you being here and – and taking the time
with every single one of us. I’m going to echo a lot of what
I’ve heard my colleagues say today as well. I appreciate you being
here, appreciate the apology, but stop apologizing and
let’s make the change. I – I think it’s time to
really change the conduct. I appreciate the fact that you
talked about your principles for Facebook: (inaudible)
users on the use of the data, and that users have
complete control of their data. CORTEZ MASTO: But the skepticism
that I have and I’m hoping you can help me with this
is over the last what, seven years, seven, 14
years – seven years, haven’t seen really much change
in ensuring that the privacy is there and that individual users
have control over their data. So – so let me –
let me ask you this. Back in 2009, you made two
changes to your privacy policy. And, in fact, prior to that,
most users could either identify only friends, or friends of
friends as part of their – their privacy, correct? If they wanted to
protect their data. They could identify only friends
or friends of friends who could see their data. Isn’t that correct? ZUCKERBERG: Senator, I believe
that we’ve had the option for people to share with
friends, friends of friends, a custom audience or
publicly for a long time. I – I don’t remember … CORTEZ MASTO: Okay. ZUCKERBERG: … exactly when we
put that in place, but I believe it
was before 2009. CORTEZ MASTO: So either you can
choose only friends or friends of friends to decide how you’re
going to share that – protect that data, correct? ZUCKERBERG: Those are
two of the options, yes. CORTEZ MASTO: Okay. And in 2011 when the FTC
started taking a look at this, they were concerned that if
somebody chose only friends, that the individual user was
under the impression they could continue to restrict sharing
of data to a limited audience, but that wasn’t the case. And, in fact, selecting friends
only did not prevent users’ information from being
shared with third – third-party applications their friend used. Isn’t that the case, and that’s
why the FTC was looking at – at you and making that change? Because there was concern that
if you had friends on your page, a third party could
access that information. Isn’t that correct? ZUCKERBERG: Senator, I don’t
remember the exact context that the … CORTEZ MASTO: So let me
– let me help you here. Because David Vladeck who was
– spent nearly four years as director of the Federal Trade
Commission’s Bureau of Consumer Protection, where he
worked, including on the FTC’s enforcement case
against Facebook, basically identifies in this
article that was the case. That not only did Facebook
misrepresent – and that’s why there were eight counts of
deceptive acts and practices – the actual FTC, in
its November 2011 decree, basically stated – required
Facebook to give users clear and conspicuous notice and to obtain
affirmative – let me jump back here – to do three things. The decree barred Facebook from
making any further deceptive privacy claims or – and it
required Facebook get consumers’ approval before changing
the way it shares their data. And most
importantly, the third thing, it required Facebook to give
users clear and conspicuous notice and to obtain affirmative
express consent before sharing their data with third parties. That was part of the
FTC consent decree, correct? ZUCKERBERG: Senator,
that sounds right to me. CORTEZ MASTO: Okay. So at that time, you’re on
notice that there were concerns about the sharing of data
and information – users’ data including those friends
– with third parties, correct? ZUCKERBERG: Senator,
my understanding … CORTEZ MASTO: Well,
let me ask you this. Let me do it this way. In response to the FTC
consent to make those changes, did you make those changes
and what did you do to ensure individuals’ user data was
protected and they had notice of that information and that
potentially third parties would be accessing that and they
had to give express consent? What did you specifically
do in response to that? ZUCKERBERG: Senator,
a number of things. One of the most important parts
of the FTC consent decree that we signed was establishing a
robust privacy program at the company, headed by our
chief privacy officer, Erin Egan. We’re now … CORTEZ MASTO: Can
you give me specifics? And I know – and – and I’ve
heard this over and over again. I’m running out of time. But here’s the
concern that I have. It can’t be a privacy policy
because that’s what the consent said it couldn’t be. It had to be
something very specific, something very simple, like
you’ve heard from my colleagues. And that did not occur. Had that occurred, we wouldn’t
be here today talking about Cambridge Analytica. CORTEZ MASTO: Isn’t
that really true? Had you addressed
those issues then, had you done an audit, had
you looked at not only the third-party applications, but
audited their associated data storage as well, you would have
known that this type of data information was being shared. And that’s our concern and
that’s what I’m saying now, time just to make a change. It’s time to really
address the privacy issue. It’s time to really come and
lead the country on this issue and how we can protect
individual user’s data and information. I know my time is running out,
but I appreciate you being here and I’m just hoping that you’re
committed to working with us in the future in
addressing these concerns. THUNE: Thank you,
Senator Cortez Masto. Senator Gardner? GARDNER: Thank
you, Mr. Chairman. And thank you, Mr. Zuckerberg,
for your patience and testimony today. The end is near, I
think, one, two, three or four people. So that’s good news, to
get out of this hearing. A couple questions for you, to
clarify one of the comments made about deleting
accounts from Facebook. In the user agreement it
says when you delete I.P. content, if – if it is deleted
in a manner similar to – it is deleted in a manner similar to
emptying the recycle bin on a computer. However, you understand that
removed content may persist in backup copies for a
reasonable period of time. How long is that? ZUCKERBERG:
Senator, I don’t know, sitting here, what our
current systems are on that. But the intent is to get all
the content out of the system as quickly as possible. GARDNER: And does that
mean your user data as well? It talks about I.P. content, is that the same
thing as your user data; it can sit in backup copies? ZUCKERBERG: Senator, I think
that that is probably right. I – I don’t – I’m not sitting
here today having full knowledge of – of our current state of the
systems around wiping all of the data out of backups. So I can follow up with
you on that afterwards, but what I can tell you … GARDNER: But all
backups get wiped? ZUCKERBERG: That is certainly
the way it’s – it – it’s supposed to work. GARDNER: Has there ever
been a failure of that? ZUCKERBERG:
Senator, I – I don’t know. But this is – if we tell people
that we’re going to delete their data then we need to do that. GARDNER: And you do, do that? ZUCKERBERG:
(OFF-MIKE) GARDNER: Thank you. Mr. Zuckerberg, a couple of
other questions I think that gets to the heart of this
expectation gap as I call it, with – with the users. Facebook, as I understand it,
if you’re logged in to Facebook with a separate browser and you
log in to another – log in to another article, open a new tab
in the browser while you have the Facebook tab open, and that
new tab has a Facebook button on it, you track the
article that you’re reading. Is that correct? ZUCKERBERG: Senator, I … GARDNER: In the new tab. ZUCKERBERG: … I think that there – there
is functionality like that, yes. GARDNER: Do you think
users understand that? ZUCKERBERG: Senator, I think
that they – that there is a reasonable – the – I think the
answer’s probably yes for the following reason, because when
we show a “Like” button on a website, we show
social context there. So, it says here are your
friends who liked that. So in order to do
that, we would have to … GARDNER: But if – but if you’ve
got your Facebook browser open and you open up the
article in the Denver Post, and it has a
Facebook button on it, you think they know –
consumers, users know, that Facebook now knows what
article you’re reading in the Denver Post? ZUCKERBERG: Well, we would need
to have that in order to serve up that – the – the like button
and show you who your friends were who had also liked that. GARDNER: So, I – I – I – and I
think that goes to the heart of this expectation gap
because I don’t think consumers, users necessarily
understand that. I mean, in going
through this user agreement, as others have, you do need
a lawyer to understand it. And I hope that you can
close that expectation gap by simplifying the user agreement,
making sure that people understand their privacy. Has there ever been a violation
outside of the – the – the talk about Cambridge Analytica
about the privacy settings? Has a privacy setting violation
ever occurred outside of Cambridge Analytica? ZUCKERBERG: I’m not aware that
we have had systems that have … GARDNER: So the privacy
setting a – a – a consumer, a user uses, have
always been respected? There’s never been an instance
where those privacy settings have been violated? ZUCKERBERG: That’s
my understanding. I mean, this is the core thing
that our company does is – you come to Facebook, you say, hey,
I want to share this photo or I want to send this
message to these people. And then … (CROSSTALK) GARDNER: Has there
ever been a breach of Facebook data or a hack? ZUCKERBERG: There have been – I
don’t believe there has been a breach of data
that we are aware of. GARDNER: Has there been a hack? ZUCKERBERG: Yes. GARDNER: Have those
hacks accessed user data? ZUCKERBERG: I don’t believe so. I think we had an instance in
2013 where someone was able to install some malware on a few
employees’ computers and had access to some
content on their computers, but I don’t believe … GARDNER: Never to
affect the user of the page? Never affected the user page? ZUCKERBERG: I do not believe so. GARDNER: Okay. Has the government ever
asked to remove a page, have a page removed? ZUCKERBERG:
Senator, I believe so. GARDNER: Okay, and has the
government ever – can you get a warrant to join a page to get to
be on a page – pretending you’re a separate user, to
be liked by that, to track what
that person’s doing. Do you need a warrant for that
or can the government just do that? The FBI? Anybody? ZUCKERBERG: Senator, I’m
not sure I fully understand. You’re saying … GARDNER: We can
follow up on that, because I do have one final
question I want to ask you. A couple days ago, I think
Facebook talked about that it would label traditional
advocacy as political ads. And for instance, if the Sierra
Club was to run a climate change ad that would be labeled
political – a political ad. If the Chamber of Commerce
wanted to run or place an ad as this would be a – this would
have an impact on – the climate change regulations would have
an impact to talk about that through an ad, that would
be labeled as political, which is different than current
standards of what is political and issue advocacy. Is it your intent to label
things political that would be in contradiction to federal law? ZUCKERBERG: Senator, the intent
of what we’re trying to get at is the foreign election
interference we’ve seen has taken more of the form of
issue ads than direct political electioneering advertising. So because of that, we think
it’s important to extend the verification and transparency to
issue ads in order to block the kind of interference that
the Russians attempted to do, and I think will likely
continue to attempt to do. That’s why I think those
measures are important to do. GARDNER: Thank you. THUNE: Thank
you, Senator Gardner. Senator Tester. TESTER: Thank you, Mr. Chairman. I want to thank you
for being here today, Mark. I appreciate you coming in. I hope this isn’t the last
time we see you in front of committee. I know this is – we’re
approaching five hours, so it’s been a little tenuous. Some mental
gymnastics for all of us, and I just want to
thank you for being here. Facebook is an American
company, and with that, I believe you’ve got a
responsibility to protect American liberties
central to our privacy. Facebook allowed a foreign
company to steal private information. They allowed a foreign company
to steal private information from tens of
millions of Americans, largely without any
knowledge of their own. Who and how we choose to share
opinions is a question of personal freedom. Who we share our likes and
dislikes with is a question of personal freedom. This is a troubling episode
that completely shatters that liberty, so that you understand
the magnitude of this. Montanans deeply concerned –
they are deeply concerned with this breach of
privacy and trust. TESTER: So you’ve been at
this nearly five hours today. So besides taking reactive
steps – and I want you to be as concise as you possibly can –
what are you doing to make sure what Cambridge Analytica
did, never happens again? ZUCKERBERG: Thank you, senator. There are three important
steps that we’re taking here. For Cambridge
Analytica, first of all, we need to finish resolving this
by doing a full audit of their systems to make sure that they
delete all the data that they have and so we can fully
understand what happened. There are two sets of steps that
we’re taking to make sure that this doesn’t happen again. The most important is
restricting the amount of access to information that
developers will have going forward. The good news here
is that back in 2014, we actually had already made a
large change to restrict access on the platform that would
have prevented this issue with Cambridge Analytica from
happening again today. Clearly we did not
do that soon enough. If we’d done it a
couple of years earlier, then we probably
wouldn’t be sitting here today. But this isn’t a change that
we had to take now in 2018, it’s largely a change
that we made back in 2014. TESTER: Okay. ZUCKERBERG: There were other
parts of the platform that we also similarly can lock down now
to make sure that other issues that might have been exploited,
in the future won’t be able to. And we’ve taken a number of
those steps and I’ve outlined those in – in my
written statement as well. TESTER: I appreciate that. And you feel confident that the
actions you’ve taken thus far – whether it was ones back in 2014
or the one that you just talked about, about locking the other
parts – will adequately protect the folks that use Facebook? ZUCKERBERG:
Senator, I believe so … TESTER: Okay. ZUCKERBERG: … although security is
never a solved problem. TESTER: That’s all I need. You talked about a full audit
of the – of Cambridge Analytica systems. Can you do a full audit if that
information’s stored somewhere – some other country? ZUCKERBERG:
Senator, if – right now, we’re waiting on the
audit because the U.K. government is doing a government
investigation of them. TESTER: Okay, but … ZUCKERBERG: And I do believe
that the government will have the ability to get into the
systems even if we can’t … TESTER: If information
is stored in the U.K., but what if it’s
stored in some other country? What if the information is
stored in some other country? Can – is – is an
audit even possible? ZUCKERBERG: Well, senator,
we believe a bunch of the information that we – that
we will be able to audit. I think you raise an important
question and if we have issues, then we – if we are not able to
do an audit to our satisfaction, we are going to take legal
action to enable us to do that. And if – and also, I
know that the U.K. and U.S. governments are also involved
in working on this as well. TESTER: Yes, I don’t – I
don’t really – I’m telling you, I – I have faith in the U.S. government. I really actually
have faith in the U.K. too. I – there have been claims that
this information is being stored in Russia. I don’t care, it could be
stored anywhere in the world. I don’t know how you get
access to that information. I’m not as smart as you
are about tech information. And so the question really
becomes – and I got to move on – but the question is I don’t see
how you can perform a full audit if they’ve got stuff stored
somewhere else that we can’t get access to. That’s all. Maybe you have other
ideas on how to do that. ZUCKERBERG: Well, I think
we’ll know once we get in there whether we feel like we can
fully investigate everything. TESTER: Just real quickly. Senator Schatz asked a question
earlier about – about data and who owns the data. I want to dig into
it a little bit more. You said – and I think multiple
times during this hearing – that I own the data on
Facebook if it’s my data. ZUCKERBERG: Yes. TESTER: And – and I’m going to
tell you that I think that that sounds really good to me. But in practice – let’s
think about this for a second. You’re making about $40 billion
bucks a year on the data. I’m not making any money on it. It feels like you own the data. And in fact, I would say that
the – the data that was – that was breached through
Cambridge Analytica, which impacted – and correct
me if these numbers are wrong – some 80 million Americans. TESTER: My guess is
that few, if any, knew that that
information was being breached. If I own that data, I
know it’s being breached. So could – could you give me
some sort of idea on how you can really honestly say
it’s my data when, quite frankly, they
may have goods on me. I don’t – I don’t want them
to have any information on me. ZUCKERBERG:
Senator, when I say … TESTER: Because if I
own it, I can stop it. ZUCKERBERG: Yes. So, senator, when I
say it’s your data, what we mean is that you have
control over how it’s used on Facebook. You clearly need to give
Facebook a license to use it within our system. TESTER: Yes. ZUCKERBERG: Or else – or
else the service doesn’t work. TESTER: Yes, I know and this
license has brought up – been brought up many times today, and
I’m going to be quiet in just one second, Mr. Chairman. But the fact is, is the
license is very thick, maybe intentionally, so
people get tired of reading it, and don’t want to. Look, Mark, I
appreciate you being here. I look forward to
having another hearing. Thank you. THUNE: Senator Young. YOUNG: Mr. Zuckerberg, thanks so
much for being here and enduring the many questions today. I think it’s
important you’re here, because social media – your
social media platform happens to be the ubiquitous
social media platform, and there’s not a senator that
you heard from today that isn’t on Facebook, that
doesn’t communicate with our constituents through Facebook. In a sense, we have to be on it,
and so I think it’s especially important that you’re
here, not just for Facebook, but really for our
country and beyond. The threshold question that –
that continues to emerge here today is what are the reasonable
expectations of privacy that users ought to have? And I’ll tell you my neighbors
are unsatisfied by an answer to that question that
involves, you know, “take a look at the user
agreement.” And I – I think there’s been a fair amount of
discussion here about whether or not people actually
read that user agreement. I would encourage
you to, you know, survey that, get all the
information you can with respect to that, and make sure that –
make sure that user agreement is easy to understand and
streamlined and so forth. Mr. Zuckerberg, earlier in
today’s hearing you drew a distinction that I
thought was interesting. It caught my attention. It was a distinction between the
consumer expectation of privacy depending upon whether they were
on an ISP or “the pipes of the Internet,” as you
characterized it, or on an Edge
platform, like Facebook. I find this distinction
somewhat unsatisfying, because most folks who use the
Internet just think of it as one place, if you will. They think of it as “the
Internet,” as opposed to various places requiring
different degrees of privacy. Could you – could you speak to
this issue and indicate whether you’d support a comprehensive
privacy policy that applies in the same manner to all entities
across the entire net – Internet ecosystem. ZUCKERBERG: Senator, sure. I think that people’s
expectations of how they use these different
systems are different. Some thing – some apps are very
lightweight and as are – and you can fully encrypt the data going
across them in a way that the app developer or
the – the pipes, in the ISP case, probably shouldn’t be
able to see any of the content, and I – I think you probably
should have a full expectation that no one is going to be
introspecting or looking at that content. (CROSSTALK) YOUNG: Give
me some quick examples, if you would kindly, sir. ZUCKERBERG: Sure. Well, when data is going
over the Verizon network, I think it would be good for
that to be as encrypted as possible, and such that
Verizon wouldn’t look at it, right? I think that’s
what people expect, and I don’t know that being able
to look at the data is required to – to deliver their service. That’s how WhatsApp
works too, so that’s an app. It’s a very lightweight app. It doesn’t require us to know a
lot of information about you, so we can offer that
with full encryption, and therefore, we’re not looking
– we don’t see the content. For a service like
Facebook or Instagram, where you’re sharing photos
and then they – people want to access them from lots
of different places. People kind of want to
store that in a central place, so that way they can go
access it from – from a lot of different devices. In order to do that, we need to
have an understanding of what that content is. So I think the – the
expectations of – of what Facebook will have knowledge
of versus what an ISP will have knowledge of are just different. YOUNG: I think that needs to
be clearly communicated to your users and – and
we’ll leave it at that. That – that those – those –
those different levels of privacy that the user can expect
to enjoy when they’re on your platform. I’d like to sort of take a
different tack on Internet privacy policy with you, sir. Might we create stronger privacy
rights for consumers either through creating a stronger
general property right regime online; say a new law that
states unequivocally something that you said before, that users
own their online data, or through stronger affirmative opt-in
requirements on platforms like yours. Now if we were to do that, would
you need to retool your model? If we were to adopt one
of those two approaches? ZUCKERBERG: Senator, could you
repeat what the approaches are again? YOUNG: Yes. So one is to create a
stronger property right for the individual online through a law,
that states unequivocally users own their data. The other one is a stronger
affirmative opt-in requirement to be a user on Facebook. Would you have to fundamentally
change the Facebook architecture to accommodate those policies? ZUCKERBERG: Senator, those
policies and the principles that you articulated are generally
how we view our service already. So depending on the details of
what – what your – the proposal actually ends up being – and the
details do just matter a huge amount here – it’s not clear
that it would be a fundamental shift. But the details really matter
and if this is something you’re considering or working on, we
would love to follow up with you on this because this is
very important to get right. YOUNG: I’d love
to work with you. I’m out of time. Thank you. GRASSLEY: Senator Thune
has a closing comment. THUNE: Just a … GRASSLEY: … and I have a process statement
for everybody to listen to. THUNE: Mr. Chairman thank you
and – and thanks to all of our members for their
patience; been a long hearing, particularly long hearing
for you Mr. Zuckerberg. Thank you for – for
sitting through this. But I think this is important. I do have a letter here from the
Motion Picture Association of America that I want to get into
the record without objection. GRASSLEY: Without
objection, so ordered. THUNE: And then – and just a
quick – quick sort of wrap-up question if you will and
maybe one quick comment. But you’ve answered several
questions about – today about efforts to keep bad actors,
whether that’s a terrorist group or a malicious foreign
agent off of your platform. You’ve also heard
concerns about bias at Facebook, particularly bias
against conservatives. And – and I just as
a final question, can you assure us that when you
are improving tools to stop bad actors, that you will err on
the side of protecting speech especially political speech
from all different corners? ZUCKERBERG: Senator, yes. That’s our –
that’s our approach. If there is an
imminent threat of harm,
position on that and make sure that we flag that and
understand that more broadly. But overall, I want to make sure
that we provide people with the most voice possible. I want the widest possible
expression and I don’t want anyone at our company to make
any decisions based on the – the political
ideology of the content. THUNE: Okay. And just one final
observation Chairman Grassley, Mr. Zuckerberg’s answered a lot
of questions today, but there are also a lot of promises to follow up with some
of our members and sometimes on questions about Facebook
practices that seem fairly straightforward, but I don’t
think we have – I think it’s going to be hard for us to
fashion solutions to – to solve some of this stuff until we
have some of those answers. And you had indicated earlier
that you’re continuing to try and find out who among these
other analytics companies may have had access to user data that
– that they were able to use. And hopefully as you
get those answers, you will be able to forward
those to – to us and it’ll help shape our thinking in terms of
how – where we go from here. So – but overall I think it’s
a very informative hearing, Mr. Chairman, and – and – so
I’m – I’m ready to wrap it up. GRASSLEY: Yes, I probably
wouldn’t make this comment, but your response to him in
regard to political speech, I won’t identify the CEO I had
a conversation with yesterday, but one of our platforms – and
he admitted to being more or left than right, or I mean being
left I guess is what he admitted and I don’t want to – I’m
not asking you what you are, but it – but just so you
understand that – that probably as liberals have a
lot of concerns about, you know, the leaning of – of
Fox News or conservatives have questions about the
leaning of – of MSNBC let’s say. It seems to me that when you –
when we get – whether it’s from the right or the left, so
I’m speaking to you for your platform, there’s a great deal
of cynicism in American society about government generally. And then when
there is suspicions, legitimate or not, that
maybe you’re playing on one way unfairly towards the other, it
seems to me that everything you can do to lean over backwards to
make sure that you are fair in protecting political
speech, right or left, that you ought to do it. And I’m not telling
you how to do it, and I’m not saying
you don’t do it, but we’ve – we got to do
something that reduces cynicism. At my town meetings in Iowa,
I always get this question, how come you guys in D.C. can’t get along? You know, meaning
Republicans and Democrats. Well I try to explain to them
that they kind of get an obtuse – what would you say –
review of what goes on here, because controversy makes news,
so if people are getting along, you never hear about that. So they get a
distorted view of it, and – and really we –
congressmen get along more than the public thinks. But these
attitudes of the public, we’ve got to change and
people of your position and your influence, you can do
a lot to change this. Whether I know you got plenty
time around your corporation, through your
corporation or privately, anything you can do to reduce
this cynicism because we have a – a perfect constitution,
maybe it’s not perfect, but we got a very good
constitution – the longest-standing written constitution in the
history of man – mankind. And – but if people don’t have
faith in the institutions of government, and then it’s – it’s
our responsibility to enhance that faith so they
have less cynicism in us, you know, we don’t have a very
strong democracy just because we’ve got a good constitution. GRASSLEY: So I hope that
everybody will do whatever they can to help enhance
respect for government, myself included. I've got to bend over backwards to
do what I can so I don't add to that cynicism. So, I'm sorry you
had to listen to me. (LAUGHTER) And so, this
concludes today’s hearing. Thanks to all the
witnesses (sic) for attending. The record will be open for 14
days for the members to submit additional written
questions and for the witness, Mr. Zuckerberg, to make any
corrections to his testimony. The hearing is adjourned.


100 thoughts on “Facebook CEO Mark Zuckerberg testifies before Congress on data scandal”

  • When did Americans stop being responsible for themselves? If you don’t want it out there…don’t put it in there!

  • If you don’t want to be “surveilled” (?), you’re probably doing something you shouldn’t be doing! Again… don’t put it out there!

  • https://youtu.be/u-FlWZ1BOcA?t=1h38m47s Wait for the moment when Mark realizes the senator is actually now just being an old dude that doesn't understand where his app went. You can see a clear relaxing in his shoulders when he does, classic!

  • Hmm, I was recommended to watch this movie… interesting watching the Facebook boss sweating for five hours… and for free

  • It is a comedy show: "Everybody deserves privacy" – says the guy, who then goes home, deletes someone's private messages, and looks at their nudes…

  • Privacy needs to be addressed here, because human nature, with even a small dose of sin, cannot be avoided. A privacy breach can lead to family breakups, relationship arguments, or even bloody confrontations over gossip and the like. Or the privacy of hiding something from someone for some reason which, if compromised, may lead to bad results for your life, health, career, well-being, etc.

  • The senators who personally use the app themselves still have no clue how the app works or anything about the data, lol. This is shocking, to be honest, when they bring Mr. Zuckerberg before Congress and Congress doesn't understand what they are asking.

  • Hey Senators, it's simple. You sign up for Facebook. You own your Facebook page and everything in it – photos, videos, etc. You can choose who you share it with, and the one you share it with can share it with others. Facebook, of course, can see all of your content. I don't mind that. I signed up!!! I know that!!!

  • CEO Mark should disallow the congresspeople from using Facebook. Take their accounts down. They totally don't understand how Facebook works. Don't waste your time with them. If the USA is not happy, China will welcome you.

  • Pamela Collins says:

    He sips water and is clueless when asked a question he does not like. The yes-and-no answers – he isn't going to do squat about any of these issues. He has five minutes to just let you know he will check with his team. This is unacceptable; we the people expect more than just taking the word of Zuckerberg and his clueless team.

  • Pamela Collins says:

    So yeah, I look to buy an item on Amazon; next thing I know, I go on Facebook and, lo and behold, the item I was interested in buying ends up on Facebook. That's an invasion of my privacy. Most of the time people should end up with a settlement. I say to the American people: he owes us all money, not just an apology.

  • Pamela Collins says:

    A hit in the wallet is the only action this turd knows. Never mind talking turkey – hit 'em in the wallet; that's the way he will make changes.

  • Pamela Collins says:

    Zuckerberg lets Facebook show animals being tortured, punched, kicked, drowned, burned, having booze forced down a dog's throat, a dog dragged behind a car, on and on. We are sick of this and want it removed from Facebook. He acts like he is clueless; his team has all the answers – well, I doubt it very much. So yeah, stop this criminal activity against animals; we are sick of it.

  • Michal Frnka, comment on ateismus.cz: "Facebook has in its community standards that, apart from the homosexual group, which it protects with fanatical fervor, it persecutes anyone who insults not even those unfortunate people themselves but merely the homo-lobby, which the USA gladly supports in Europe in order to destabilize it. Brexit, in my opinion deliberately provoked, is proof. A united Europe represents a possible problem both for the USA and for Russia. To the point: while the so-called homosexual community is protected – see the violations of the rules, or how do they put it, the community standards – Christians and Muslims are persecuted by Facebook. Is Facebook asking for another scandal? Should this get onto YouTube, where millions of people will see it and where their paws cannot reach to censor the post or block people? This very group, ateismus.cz, violates the aforementioned community standard. Christianity and God as such are insulted here, and if it concerned the other community I mentioned, you would all have your accounts blocked for at least 30 days. The group would be shut down, because the administrators not only cannot keep order, they themselves encourage insults against another community. Islam is not at issue, because you are all scared, afraid the Islamists might act – whom you conflate with Muslims anyway; in the sense that every Islamist is a Muslim, but not every Muslim is an Islamist. I have testimony from people persecuted by Facebook that insulting God, the Catholic Church, and Christianity brought not even a warning or deletion of the post! For insulting merely the homo-lobby: a ban. I am a witness, because I have several times been a victim of this persecution. I have a witness, Pavla Nytrová. So they do not measure with the same yardstick! Again the pattern: in the case of one community, a little is enough and the persecution is harsh. In the case of Christianity and Islam, sometimes when a specific person was vulgarly insulted, only the post was deleted and supposedly a notice sent that this had happened. When the faith as such, God and the Church, was insulted, I received no message from Facebook that any penalty would follow; it was fine.
To be sure, I reported myself to Facebook, so that I could speak only the truth and be certain. Facebook acted very quickly. But it did not silence me; I published everything on YouTube. Pavla Nytrová said that Facebook violates free-speech laws and already has enough problems. If it does not want economic ones as well, as users leave, let it continue insulting my person and its unjust persecutions. I am going to call on thousands of Muslims and Christians to leave Facebook. Unfortunately, psychological addiction will probably not let them do so. Let us be realists. Also, the boss is an atheist, which explains the support for groups like this one and the persecution of monotheistic believers. Please, if you want to support me, report this post to Facebook; but for them to deal with it at all, it must be framed as a violation of community standards, as an attack on sexual minorities. Submit it as anything else and they will not even look at it. If a ban comes, I will have further evidence against the way Facebook behaves. Obviously the USA takes great liberties and has more power in the Czech Republic than would be healthy. Also, whenever I wrote anything in this vein, nothing vulgar, it was immediately deleted and banned! The company leadership is touchy about that even in the Czech Republic – obviously instructions from the USA. This is what Ms. Nytrová was talking about, the violation of free speech in the Czech Republic. Against the USA, anyone may politely say whatever they like."

  • At fifty-six minutes in, she asked a question that Mark should say no to, because when she repeated it, she changed it. She left out Alamo

  • She asked if you think European regulations should be applied here. That is a wide location, as England is in Europe, and so are Italy and parts of Turkey. Europe is a continent with many countries, and they have euros and, I believe, the British pound for currencies. The euro or the pound is a currency we should consider doing trade with, as it is much higher and could bring us more money in business. We don't have to change our policies to theirs, because we can keep our own and do business with them, and both prosper in our own settings. The great thing about living in this country is we set the stage for ourselves; like on Facebook, we can choose our audience and our friends and our passwords. What he can't help is identity theft. That is another topic for discussion, as it would involve putting the police, not Mark Zuckerberg, on trial.

  • One hour and nine minutes into it. Someone dying is not his fault, first of all; it's the police that need to investigate. Just like at any public place, something could happen and then the police get involved to investigate. Facebook is a platform, and anytime you're outside in a public place it's like a platform: you have a possibility of getting attacked and hearing hate speech without Facebook. So it was there before Facebook, and it's there on TV, and it is also there in institutions like schools, hospitals, and prisons. Facebook provides help and understanding in dealing with these situations, but public places usually don't. That's the difference with Facebook, which is making the world a better place.

  • One hour and 26 minutes into it. Facebook doesn't collect data. They always ask permission if you're sharing something, which is what other apps don't always do, like Plenty of Fish and Craigslist, where they spy on you and sell your information. Facebook always asks you prior to taking actions like sharing messages, pictures, text, etc.

  • About Messenger Kids: anyone who has kids will know that sometimes getting babysitters is hard, or if you're in a line-up and you want to engage your youngster, then it's useful to have. Saying no is like saying no to giving kids candy: it's not the candy itself, it's the kids you teach about moderation.

  • Note: if you deleted your account from Facebook, you no longer have access or rights to that account. And that same rule is used on other apps, like Plenty of Fish.

  • I haven’t heard of any pages being blocked or removed from Facebook. In fact, I was trying to block Donald Trump, and he even became the president; so if anyone were going to be removed or blocked, he would have been. But since he's still on there and continues to show up, that means Facebook does not block pages – not from what I’ve seen, anyway. And a lot of people are complaining all the time, so I know Facebook doesn’t block people, because he's still on there.

  • I don’t see business pages getting blocked on Facebook; only a person can be blocked. Take Caster Communications, for example, who was copying me and my business practices: I tried to block them, but because they are listed as a business page, it wasn’t getting blocked. Same as Donald Trump: before he became president I tried to block him, but his page was a business and public-figure page, so it was overriding me, and all I could do was submit a complaint. I did, and submitted complaints. Those complaints were looked at. However, things have changed; since then Trump has apologized and we are starting to get along here and there, but not on all things. If people communicate about their problems, it might be something they can work out – not always, but sometimes, so it’s worth a try. So Facebook's impact is an important one, because it gives people a voice to be heard.

  • Clearly they know nothing about the internet or computers. This was almost as bad as the guy saying that the next mass shooting will happen with a musket.

    Search "Intelligence of people in congress" – 404 not found

  • I believe: if you don’t want it out there… don’t put it in there! The moment you put your information on the website, it already became public – no privacy anymore!

  • Zuckerberg's real name is Jacob Michael Greenberg; he is the grandson of one of the Rockefeller family members. We all know that the Rockefellers ran the New World Order gig. Learn the truth: put into the YouTube search engine, "Who is Jacob Michael Greenberg"

  • Where is the closed captioning for this video? Please add it using a professional captionist and don't rely on machine auto-captioning to do the work, due to its many errors. Thanks.

  • Federico Jacobitti says:

    Senator Tillis:
    What I found most interesting was the reminder about shutting down the possibility of data gathering.

  • Federico Jacobitti says:

    Senator Harris:
    She was a dragon… she made a summary of the most burning questions asked that didn't really get an answer. I found it worthwhile.

  • Federico Jacobitti says:

    Senator Hirono:
    On the identification of possible public threats through data collection.
    On the establishment of so-called "dangerous" content.

  • Maria Batida de coco says:

    I am Greek, and I realize how bad things are in my country, but I now understand that things in the US are even worse. I can't believe that these people rule the country and are the leaders of justice. The senators had already formed their opinions before the hearing, and their consultants framed the questions in such a way that they have no actual answer. I am not saying that Zuckerberg is a saint, but they don't get to undermine his intelligence and talent.

  • Daniel Faria Borges says:

    The NSA, FBI, and other agencies already have access, all by themselves, to a lot of the information in question. The case of the FBI versus Apple is a fantastic example, and it is unique precisely because they were not able to hack it. With the number of attacks and threats, would the FBI and other agencies constantly be asking these tech giants for info? Well, they don't, because they already have access to much of it. But in this case it is the government, and they want to protect us… Well, Cambridge Analytica used this data to create algorithms that analyze people and target them in order to make TRUMP win the election. And actually, for those who don't know, Cambridge Analytica, if I am not mistaken, worked for TED CRUZ before working for TRUMP.

    And Mr. Cruz was just sitting there asking questions about info that he himself used in his presidential attempt.

    They are there attacking Facebook and the tech giants when senators and candidates for the presidency used this data. What great Americans they are, to ask and pay Cambridge Analytica for data that could make them win elections.

    There should be more regulation, because now we have tech giants regulating themselves, the same as we had in the US under Bush and previous administrations with banks regulating themselves, which led to the 2008 crash. But I don't want the American government, or the Russian, or any other government controlling my data either.

    This was a security failure that was exploited by a company hired by Ted Cruz and Trump.
    And Ted Cruz's worry is the Republican Party? He was trying to use this info!!

  • Daniel Faria Borges says:

  • And then TRUMP uses Cambridge Analytica. The Republican won the presidential election due to illegal data. Russia interfered with the election. And the US Senate is worried and attacks the giant tech companies, when they just had an illegal, interfered-with election? (Big tech must be debated, but it is not just about confronting one CEO; and when you have people showing their bodies in underwear on public profiles, it is hard to understand what else they have to hide and keep private.)

  • They say Facebook raises its money on online ads. About 90 percent! Is this true? And they use our data for free. They sell it to advertisers. Is this all true?

  • What's with the dude's opening speech? Praising FB and stating that nothing malicious has been carried out by FB, before any testimony has been given? Telling this weasel Zuckerberg he is the shining light of the American Dream… I think we can see which way this is going, folks

  • 5 mins in… Russia fixing our elections? Cambridge what? Wake up, people; this is insulting to our intelligence and natural instincts. We can change this nonsense and regain our intellectual dignity and the ability to think for ourselves

  • Give someone access to your account, have them change your password, and THEN schedule your account for deletion… Make sure it's a person that WILL SAY "F-you, you weak-willed person" and, out of spite, will NEVER cave or give you access! WE ALL know at least one of these people!!!!! Fourteen – or, most likely, NINETY – days will pass and YOU WILL NOT have a Facebook! Give it a shot? I may try it… no username or password equals NO FACEBOOK!!!!!

  • Watch This With Concentration Programming Background Music ……….10x better understanding ……………

  • RomanceofParis says:

    People, please, for your own sake, delete your Facebook account!! There is no added value for you. The only added value is for Facebook (Zuckerberg is the third-richest man on Earth!). They use your personal info plus pictures for all kinds of surveys or for marketing companies. They make money on you! Plus, you know who your true friends are, right? Don't measure your life or happiness by the number of likes you get – these are virtual likes. Call your Facebook friends when you have a flat tire… see who is going to show up. Stop being a target; Facebook is the new Uncle Sam watching you, hoping for juicy info to exploit… Facebook doesn't make it easy to cancel a membership, but it is doable. Go on Google for explanations…

  • RomanceofParis says:

    Please get off Facebook for your own sake! Big Brother "Buddy" is watching you! What is wrong with this latest generation of airheads??? No, Africa is not a country!
    Read this: https://money.cnn.com/2018/07/10/technology/mailru-facebook-russia/index.html

  • Facebook crime: posting crime scenes… who is responsible? Where is the global police 👮‍♀️? Facebook has become easy access for crime uploads and sharing, as encouragement… of what? And where is it happening??!!

  • When will Facebook name the companies that have been stealing the private data of users, of their employees, and of friends of their employees???

  • With everything considered…
    Mark is quite composed, with a very pleasant personality, for a billionaire AI hell-bent on world domination.

  • I don't appreciate that, even though I don't personally have a Facebook account, other users who possess private information about me are exposing me to Facebook without my consent

  • Cortani Park says:

    When you stop using Facebook and social media you find yourself getting random phone calls from all over the country

  • As a businessman, I was one of the victims whose private messages were all stolen by the English company Cambridge Analytica (I am English, so I may pursue legal action against them). Facebook didn't report this, nor resolve it, and kept it quiet. I do not appreciate that.

  • William Steven says:

    Mark answers questions with pre-taught responses, as if he's been told what to say ahead of time. Mark also lightly smiles at any compliment toward him – for example, at 11:58.

Leave a Reply

Your email address will not be published. Required fields are marked *