WIRED Videos

NSA Director of Cybersecurity Anne Neuberger in Conversation with Garrett Graff

Anne Neuberger, Director of Cybersecurity at the National Security Agency, speaks with WIRED's Garrett Graff as part of WIRED25, WIRED's second annual conference in San Francisco.

Released on 11/8/2019

Transcript

00:00
Good morning.
00:01
I'm Garrett Graff, I'm a contributing editor
00:02
at Wired and cover national security,
00:04
and I'm excited to be joined here today
00:06
by Anne Neuberger, who is the head
00:08
of the Cybersecurity Directorate at the NSA.
00:11
And I'm gonna ask Anne a little bit about that in a minute,
00:15
but Anne is a longtime NSA official,
00:23
was appointed the first chief risk officer
00:26
at the NSA in the wake of the Edward Snowden revelations,
00:29
and had a background in the Navy,
00:30
in the private sector before that,
00:32
including as the deputy chief management officer
00:34
of the US Navy.
00:36
And one of the things, you know,
00:38
the NSA is an incredibly hard institution
00:41
to cover and understand sort of what they're doing,
00:43
but I've always appreciated that Anne has been,
00:46
throughout her career,
00:48
one of the people who has been very open,
00:50
or as open as she can be,
00:52
about trying to engage with the public
00:54
about the role that the organization plays.
00:57
So, Anne, I thought I would start by asking you this morning
01:00
about the new Cybersecurity Directorate, which you head.
01:05
The NSA has sort of two main missions,
01:08
signals intelligence and cybersecurity,
01:10
and as of October 1st, it has this new
01:13
Cybersecurity Directorate that knits together
01:16
various parts of that mission, which you now head
01:19
as sort of the top cybersecurity official of the whole NSA.
01:24
So tell us a little bit about what your new role is,
01:27
and what the goal of this directorate is.
01:30
Absolutely.
01:30
So first it's truly terrific to be here.
01:33
I really appreciated the invitation.
01:36
It wasn't lost on me that there was one government individual,
01:40
part of the 25,
01:41
so really, really appreciated the invitation,
01:43
the opportunity to be here, and the opportunity
01:46
to really listen to all the talks going on today.
01:48
So I'll start with that.
01:51
The director of NSA, who is a four-star Army general,
01:54
General Nakasone, decided to stand up the directorate
01:57
because we recognize that we're at a crossroads
02:01
from a national security perspective.
02:03
There are significant advancements in technology,
02:07
in economy, in society,
02:09
that are converging and coming together.
02:12
Really exciting, right?
02:13
We see 5G, we see Internet of Things,
02:16
autonomous vehicles, drones.
02:18
And with those significant advancements,
02:21
we also see malicious actors able to accomplish
02:24
strategic impact with individual tactical actions.
02:28
For example, theft of a company's research
02:31
and development investments, which can then allow
02:33
others to more quickly productize it.
02:36
It affects, like the discussion we just heard,
02:38
the future of jobs, the future of the economy here.
02:40
Similarly, we certainly see social media being used
02:43
to influence elections, so achieving
02:45
those strategic outcomes with tactical activities.
02:50
We felt the need to ensure that,
02:52
while using these technologies,
02:54
we understand the risks to our democracy, to society,
02:56
and economy, and that we're accounting for those risks
03:00
at the same pace as the advancements become reality.
03:04
So the directorate brings together thousands of people
03:06
to focus on those problems,
03:07
and to allow us to really not only recruit,
03:10
but keep the kinds of people we need
03:12
to ensure we're putting in place those defenses needed.
03:16
And as the public, what should we expect to be different
03:20
with the addition of the Cybersecurity Directorate
03:24
and sort of this specific focus on the mission now?
03:27
Absolutely, so first and above all,
03:29
our mission is to prevent and degrade cyber actors
03:34
on the most sensitive, critical, national networks.
03:37
What does that mean?
03:38
It means, for example, that you know, Garrett,
03:40
you mentioned earlier at the beginning,
03:41
NSA has two missions.
03:43
Well one of them is we build the cryptography
03:45
and distribute the keys for things
03:47
like nuclear command and control,
03:48
the military's secure communications,
03:51
both with itself and with allies all around the world.
03:54
So ensuring that those networks,
03:56
our critical infrastructure networks,
03:57
for example the defense weapons systems,
04:00
ballistic missile defense systems, working very closely
04:02
with the Department of Homeland Security
04:04
on securing critical infrastructure networks.
04:07
That's a key focus area for us,
04:09
and there's a lot of interesting areas
04:11
both in technology that we need to focus on
04:13
and modernize in that space.
04:15
Beyond that, we recognize that defenders operate
04:19
in unclassified space, and in order for us to be effective,
04:24
we need to operate more in unclassified space.
04:26
So in our first month, folks may have seen
04:29
we issued three advisories.
04:32
And in them we say, you know, there's a nation-state actor
04:35
using this CVE, this vulnerability, to do this,
04:38
and here's what we recommend you do
04:40
to secure your network to address it.
04:42
So those kinds of things where we tie together
04:43
a threat with the recommended defense are critical.
04:48
And then finally, the third thing,
04:50
is future technologies, and ensuring that
04:53
we're working in standards bodies,
04:54
we're working with companies to ensure those products
04:56
are as secure as possible.
04:58
Cloud security, Internet of Things, looking at
05:00
what we can do for low-compute, low-power devices,
05:03
what's the cryptography for that?
05:05
Quantum-resistant cryptography, quantum computing
05:07
has significant implications for the cryptography
05:11
we use today to secure a good deal of internet commerce.
05:13
So those three types of efforts are what you'll see more of.
05:16
I wanna come back in a little bit to talking more about
05:20
the emerging technology, but let me stay with the NSA,
05:24
and its mission for a minute longer.
05:27
You've had a front-row seat as General Nakasone
05:29
has settled in, he's been there
05:31
about two and a half years now,
05:33
and he's overseen sort of a...
05:37
Remarkable transformation of the NSA,
05:40
and he has this second role, the dual-hat,
05:44
of heading CYBERCOM, and we've seen
05:47
a much more vigorous engagement from CYBERCOM
05:51
in offensive cyber operations, against targets like Iran,
05:56
and actually against Russia during the midterm elections
05:59
last year, and you had an election security role previously,
06:04
and I wonder if you could talk a little bit about
06:06
how General Nakasone is approaching his mission,
06:10
and the NSA's mission, and sort of how, philosophically,
06:14
he's looking at the work that the NSA should be doing.
06:18
Absolutely, so great question there.
06:21
The key principle he brings to it is to say
06:23
it all starts with partners, and it starts with
06:27
operationalizing intelligence to better secure.
06:29
What does that mean?
06:30
So I'll use counter-terrorism as an example.
06:33
In the counter-terrorism mission, when we learn of a threat
06:37
across the intelligence community,
06:39
yes, certainly we issue that warning and awareness
06:42
in a top-secret report, but in addition to that,
06:44
we quickly make information available
06:46
to a local police force, to an allied police force
06:49
anywhere around the world, to ensure
06:51
that they can do something to save lives.
06:53
At the end of the day, it's not interesting for us
06:55
to write a classified report and then people are killed
06:58
in a terror attack, that's not the goal.
06:59
The goal is saving lives.
07:01
And he brings that same approach to cybersecurity,
07:05
election security.
07:06
So as you noted, I was the NSA lead, co-lead,
07:09
working very closely with a Cyber Command co-lead
07:11
for the 2018 midterm elections.
07:13
And the guidance he gave us was,
07:17
I don't wanna see a classified threat intelligence report
07:20
about what was done to shake confidence in the elections,
07:23
I wanna make sure there is a secure and safe election,
07:26
and that every American feels confident
07:28
in the integrity of the vote.
07:29
And with that goal in mind,
07:33
we laid out together three lines of effort.
07:35
The first was ensure we're building the insights
07:37
to understand who might have an interest
07:39
in shaping the election in some way,
07:42
in tampering with election infrastructure in some way.
07:46
The second was ensure that information gets to people
07:48
who can do something about it,
07:50
whether other elements in the U.S. government,
07:52
we partnered very closely with DHS and FBI,
07:54
partnered closely with social media companies,
07:56
make sure they can actually...
07:58
And you saw, for example, accelerated takedowns
08:01
of accounts that were conducting
08:04
malign social influence, social media influence,
08:07
in part because the social media companies really put
08:10
a greater focus on it on their own.
08:11
And then finally, there's the distinct authority
08:14
Cyber Command has, so this is not NSA,
08:16
but Cyber Command,
08:18
when authorized by the president,
08:19
can impose costs to ensure that, as has been stated,
08:25
U.S. government policy is that any attempt to interfere
08:27
in a U.S. election will be met with consequences.
08:30
And we're just about a year away
08:33
from the 2020 election now,
08:36
although it feels like it's been going on
08:37
for 72 years already.
08:39
For you too? [audience laughing]
08:41
Sitting here today, sort of knowing what you know,
08:43
surveying the horizon and the threat landscape as best you can,
08:48
do you think that Americans should feel confident
08:50
that we will have a secure and accurate
08:54
un-interfered-with election next year?
08:57
Yes, I think that there are...
08:58
Not I think, I know, that there are hundreds of people
09:03
across the intelligence community,
09:05
across DHS, FBI, working to do those three things
09:09
I talked about, right?
09:10
Ensure they understand what's potentially planned,
09:13
ensure they're making that information available,
09:16
and ensure that we're making clear to those
09:18
who might have an interest that there will be consequences.
09:21
But that being said, it is a challenging task.
09:26
And beyond the elections, one of the things we saw
09:30
in 2018, and I think we've seen
09:32
over the last number of years,
09:34
is attempts to heighten the polarization of our society.
09:40
As a democracy, what we rely on is,
09:42
I love the word that Nick used earlier,
09:44
which was civil discourse.
09:46
We don't have to agree on everything, in fact, we won't.
09:48
This is not...
09:50
The beauty of America is that people
09:52
from many different backgrounds can come together
09:54
and can live together with different ethnic backgrounds,
09:58
with different religious beliefs,
09:59
with different values potentially.
10:01
But there's more that unites us than divides us.
10:05
And I think that when we look at how social media
10:08
can be used today, and the degree to which individuals...
10:14
Influence operations have been around since the days
10:16
of Adam and Eve, right?
10:17
But what's changed is the way social media allows
10:20
for targeted yet broad messaging
10:23
that makes people believe, hey, this is a friend,
10:25
I can trust them, and then at the same time
10:28
those messages are integrated with,
10:31
for example, cyber attacks
10:33
to gain compromising information that's then leaked
10:35
at an inopportune time to shape thinking.
10:38
Right before a key event, a key vote, or a key decision.
10:43
So that's the piece that we worry about a great deal
10:45
as a trend, and that's not something
10:48
the government can take on alone at all.
10:50
That's the reason I so appreciate the invitation,
10:52
and that's the kind of work that needs to be done
10:55
between individuals committed to civic discourse
10:59
in our society.
11:01
So speaking of civil discourse,
11:03
the NSA being here in San Francisco,
11:06
in the Bay area and Silicon Valley,
11:09
that has not been necessarily the warmest relationship
11:13
over the last five years or so.
11:15
Garrett has such a nice way of putting things.
11:17
Which Edward Snowden has a little bit to do
11:21
with the soured civil discourse
11:24
between the NSA and the tech community,
11:27
and I wonder if you could sort of, sitting here,
11:30
sort of talk a little bit about what...
11:33
What you sort of wish that Silicon Valley understood
11:38
about the work that the NSA does on a daily basis,
11:41
and what, from the NSA's perspective,
11:44
you wish you could get from Silicon Valley.
11:48
So you are correct that the...
11:53
Discussions and the trust broke down a great deal
11:56
post-2013 media leaks, but my perspective is,
12:00
we had some responsibility for it long before that.
12:05
As human beings, we don't trust the black box.
12:07
We trust things we understand, we trust people we know,
12:10
we've met, we've heard their values.
12:13
And when you walk in, I don't know how many people here
12:17
have been to the National Security Agency,
12:18
but when you walk into NSA, and you pass all the turnstiles,
12:21
and you get to the elevator,
12:23
it's the way I walk in every single morning,
12:25
in front of you is a black granite wall,
12:27
and on the top of that wall it says,
12:29
"They Served in Silence," and it has the names,
12:32
most of them pretty young, of individuals who in the 60-odd years
12:37
of the National Security Agency have lost their lives.
12:39
Most are young, military.
12:42
The most recent one was actually just a few months ago.
12:45
And that culture of they served in silence
12:50
is one I think that didn't always do us well.
12:54
It's a culture of pride in saying,
12:56
we just want people to be safe,
12:57
we don't need people to understand our role in that,
13:00
but the flip side of that is that the citizens we serve
13:04
don't necessarily get to know who we are.
13:05
So I think we had some responsibility before Snowden.
13:09
And I think actually to your core question,
13:12
there is a tension in being an intelligence agency
13:15
in a democracy, and I think it's a good tension.
13:20
I've lived with that tension in my own life.
13:24
I was raised in a family with a deep fear of government.
13:28
My father grew up in Soviet-occupied Hungary.
13:32
He and my grandparents fled here
13:34
after the Hungarian Revolution.
13:36
My mom's mom wore long sleeves her whole life
13:39
to hide the tattoo on her arm from Auschwitz,
13:42
which was tattooed to make her not a human being,
13:45
to make her just a number.
13:47
You know, my great-grandparents and most of their children
13:49
were all murdered, gassed to death in the death camps.
13:52
So I grew up in a home which feared government
13:55
and the power of government.
13:57
And yet my parents talked about the tremendous gratitude
14:00
to have the opportunity to live in a democracy
14:02
where they could practice their faith
14:04
and live lives free of fear.
14:06
So the responsibilities that we have in this country
14:11
to use that force for good, to strike that balance,
14:15
to ensure that we can live safely,
14:18
to ensure that individuals all around the world
14:20
can live safely because of who we are,
14:25
and balance that carefully,
14:27
because there is no perfect security,
14:28
and there is no perfect privacy, you need both,
14:32
and there's tension in that, and that's good.
14:35
That's the goal for us.
14:37
And what we need, to your question,
14:39
from the Valley, and I think
14:40
from really every American citizen,
14:42
is to get involved in that, to figure out
14:47
whether we have that balance right, challenge it,
14:49
get involved in making that balance right.
14:51
Come into government for a few years,
14:53
or if you're working in Silicon Valley
14:54
and you have a way to both protect privacy,
14:57
and help folks understand that when there are threats,
15:00
whether that's child pornography, crime,
15:02
or a national security threat,
15:04
the best minds are working on those areas
15:07
that are really not black and white, they're gray.
15:09
And balancing the gray to get to the best outcomes
15:11
and protect who we are as a country
15:14
is I think the core challenge we have in front of us today,
15:16
and we need great, committed people being a part of that.
15:20
Do you feel like the relationship between the NSA
15:23
and the Valley, and the tech platforms,
15:25
has been repaired in a meaningful way
15:28
over the last five years?
15:30
I think it's become much better.
15:31
I look at the work we did together in the 2018 midterms
15:34
where we said, neither of us can do this alone.
15:36
The Valley made clear they wanted insights
15:38
about how platforms were used, how individuals
15:41
were appearing to look like U.S. individuals,
15:45
how people can anonymously connect to platforms,
15:48
how you can ensure that people have to validate in that way.
15:52
And we made a commitment to recognize that...
15:56
It wasn't enough to write a classified report,
15:58
and we had to ensure that information we had
16:00
made its way to people who could do something about it.
16:02
So I think repaired is a hard thing to say.
16:05
I think it's gotten better, and more importantly,
16:07
I think there are now enough conversations around values
16:11
and core shared goals to make...
16:16
To make what we need to happen more likely to happen.
16:19
And where do you see those shared values
16:21
between the NSA and the tech community?
16:24
Like, what is the overlap in where you can have
16:28
a conversation about values?
16:30
America is not a perfect country,
16:33
but I think America was a dream
16:36
that people from all different backgrounds
16:39
could live together, and respect each other enough
16:42
to give each other the freedom
16:43
to live their lives differently.
16:48
To me, the fear is what social media allows:
16:52
it allows those ties to be frayed.
16:55
It allows people to treat other people as 'other'
16:58
by creating communities that say,
17:00
we're all in this community, and we can talk anonymously
17:02
and safely, and that other community doesn't know
17:05
what we're saying, and that's dangerous
17:08
in a society that has to hold
17:10
shared values as paramount.
17:12
And I don't think we should think that the society
17:15
is just there without our working to keep it that way.
17:17
So I think the shared values are to say freedom of speech
17:20
means also freedom to respect each other's speech,
17:23
and respecting each other's speech
17:24
means engaging in that civil way
17:27
so you're not shutting people down by the incitement
17:31
and the anger in your speech.
17:32
And how to ensure that we protect speech
17:35
to enable freedom of speech.
17:37
I think there's more shared discussions on how we get there.
17:41
There's still more to be done,
17:43
but I do believe that we're closer to that point.
17:46
And I think one part...
17:50
One of the things we're trying to do at NSA
17:52
is bring more diversity into tech.
17:54
We run, for example, GenCyber camps, we call them,
17:57
which are camps for kids in high school.
18:01
There were 15,000 kids who participated, across 38 states,
18:04
which is super exciting, right?
18:06
'Cause if you can get people involved young in...
18:08
Cyber is something that can easily be...
18:11
Intelligence is a harder thing, right?
18:13
Intelligence, but cyber is defense, is a shared goal,
18:18
and by getting kids involved young
18:19
who can see that it's a shared goal
18:22
between government and the private sector, then...
18:25
And that diversity then allows that as well
18:27
by everybody seeing themselves in it.
18:29
We have about a minute left, so I wanna get to
18:33
emerging technologies and sort of what you're thinking about
18:35
in terms of threats that you see coming over the horizon.
18:38
Yep, three trends.
18:39
One is we certainly see nation-state actors
18:43
becoming more sophisticated.
18:44
We see more of a focus on disruption,
18:47
kind of what we saw with Iran and Saudi Aramco,
18:49
breaking tens of thousands of machines.
18:52
And that shift to disruption, as well as hacking
18:55
to gain compromising information
18:57
that's then leaked at an inconvenient time.
18:59
So those are certainly the trends we see overall in cyber.
19:01
What we're focused on is securing technologies
19:05
like I mentioned, like IoT, looking at cryptography,
19:07
high-speed cryptography at the 400G level,
19:10
low-power I mentioned, and certainly looking at
19:12
security standards to enable us all
19:14
to use those technologies safely.
19:16
So really excited about some of those areas,
19:18
distributed ledger is another area
19:20
we're putting some focus on.
19:22
Excited both for the partnership
19:23
and for what we can do with those technologies.
19:26
One of the things that sort of stands out to me
19:28
in covering cybersecurity is how each major attack
19:31
that we've faced from a nation-state adversary
19:34
has represented a failure of imagination,
19:36
whether the attack on Sony Pictures by North Korea
19:40
or Russia's attack on our confidence in our democratic process
19:44
in 2016. What do you think is sort of the next failure
19:48
of imagination that we're going to see
19:50
in terms of an attack on the U.S.?
19:56
That's a really good question.
19:58
I think it...
20:01
One of the things we've seen is the weaponization of drones
20:06
and low-orbit satellites, sensor platforms,
20:10
I think bringing together the intelligence
20:12
off those sensor platforms with potentially large numbers
20:16
of weaponized drones is a key concern.
20:19
With the technologies to defend against that,
20:22
the cost of defense far exceeds
20:24
the cost of just building and deploying a drone.
20:26
And we're seeing those kinds of technologies
20:29
in certain ungoverned areas around the world.
20:31
So that's something that we're thinking hard about
20:34
and about how to protect against it.
20:35
Great, thanks so much for joining us, Anne.
20:37
Thank you, Garrett.
20:38
[audience clapping]