2 October 2019

Edward Snowden in His Own Words: Why I Became a Whistle-Blower


At the age of 22, when I entered the American intelligence community, I didn't have any politics. Instead, like most young people, I had solid convictions that I refused to accept weren't truly mine but rather a contradictory cluster of inherited principles. My mind was a mash-up of the values I was raised with and the ideals I encountered online.

It took me until my late twenties to finally understand that so much of what I believed, or of what I thought I believed, was just youthful imprinting. We learn to speak by imitating the speech of the adults around us, and in the process of that learning we wind up also imitating their opinions, until we've deluded ourselves into thinking that the words we're using are our own.

My parents were, if not dismissive of politics in general, then certainly dismissive of politicians. To be sure, this dismissal had little in common with the disaffection of nonvoters or partisan disdain. Rather, it was a certain bemused detachment particular to their class, which nobler ages have called the federal civil service or the public sector, but which our own time tends to refer to as the deep state or the shadow government.


None of those epithets, however, really captures what it is: a class of career officials (incidentally, perhaps one of the last functional middle classes in American life) who—nonelected and nonappointed—serve or work in government, either at one of the independent agencies (from the CIA and NSA to the IRS, the FCC, and so on) or at one of the executive departments (State, Treasury, Defense, Justice, and the like).

These were my parents, these were my people: a nearly 3-million-strong professional government workforce dedicated to assisting the amateurs chosen by the electorate, and appointed by the elected, in fulfilling their political duties—or, in the words of the oath, in faithfully executing their offices. These civil servants, who stay in their positions even as administrations come and go, work as diligently under Republicans as under Democrats because they ultimately work for the government itself, providing core continuity and stability of rule.

These were also the people who, when their country went to war, answered the call. That's what I had done after 9/11, and I found that the patriotism my parents had taught me was easily converted into nationalist fervor. For a time, especially in my run-up to joining the Army, my sense of the world came to resemble the duality of the least sophisticated videogames, where good and evil are clearly defined and unquestionable.

However, once I returned from the Army and rededicated myself to computing, I gradually came to regret my martial fantasies. The more I developed my abilities, the more I matured and realized that the technology of communications had a chance of succeeding where the technology of violence had failed. Democracy could never be imposed at the point of a gun, but perhaps it could be sown by the spread of silicon and fiber.

In the early 2000s the internet was still just barely out of its formative period, and, to my mind at least, it offered a more authentic and complete incarnation of American ideals than even America itself. A place where everyone was equal? Check. A place dedicated to life, liberty, and the pursuit of happiness? Check, check, check.

It helped that nearly all of the major founding documents of internet culture framed it in terms reminiscent of American history: Here was this wild, open new frontier that belonged to anyone bold enough to settle it, swiftly becoming colonized by governments and corporate interests that were seeking to regulate it for power and profit. The large companies that were charging large fees—for hardware, for software, for the long-distance phone calls that you needed back then to get online, and for knowledge itself, which was humanity's common inheritance and so, by all rights, should have been available for free—were irresistible contemporary avatars of the British, whose harsh taxation ignited the fervor for independence.

This revolution wasn't happening in history textbooks but now, in my generation, and any of us could be part of it solely by dint of our abilities. This was thrilling—to participate in the founding of a new society, one based not on where we were born or how we grew up or our popularity at school but on our knowledge and technological ability.

In school, I'd had to memorize the preamble to the US Constitution: Now its words were lodged in my memory alongside John Perry Barlow's "A Declaration of the Independence of Cyberspace," which employed the same self-evident, self-elect plural pronoun: "We are creating a world that all may enter without privilege or prejudice accorded by race, economic power, military force, or station of birth. We are creating a world where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity."

This technological meritocracy was certainly empowering, but it could also be humbling, as I came to understand when I first went to work in the intelligence community. The decentralization of the internet merely emphasized the decentralization of computing expertise. I might have been the top computer person in my family, or in my Beltway neighborhood, but to work for the IC meant testing my skills against everyone in the country and the world. The internet showed me the sheer quantity and variety of talent that existed, and made clear that to flourish I had to specialize.

There were a few different careers available to me as a technologist. I could have become a software developer, or, as the job is more commonly called, a programmer, writing the code that makes computers work. Alternatively, I could have become a hardware or network specialist, setting up the servers in their racks and running the wires, weaving the massive fabric that connects every computer, every device, and every file.

Computers and computer programs were interesting to me, and so were the networks that linked them together. But I was most intrigued by their total functioning at a deeper level of abstraction, not as individual components but as an overarching system.

I thought about this a lot while I was driving, to and from Lindsay's house and to and from community college. Car time has always been thinking time for me, and commutes are long on the crowded Beltway. To be a software developer or programmer was to run the rest stops off the exits and to make sure that all the fast-food and gas-station franchises accorded with each other and with user expectations; to be a hardware specialist was to lay the infrastructure, to grade and pave the roads themselves; while to be a network specialist was to be responsible for traffic control, manipulating signs and lights to safely route the time-crunched hordes to their proper destinations.

To get into systems, however, was to be an urban planner, to take all of the components available and ensure their interaction to maximum effect. It was, pure and simple, like getting paid to play God, or at least a tinpot dictator.

There are two main ways to be a systems guy. One is that you take possession of the whole of an existing system and maintain it, gradually making it more efficient and fixing it when it breaks. That position is called a systems administrator, or sysadmin. The second is that you analyze a problem, such as how to store data or how to search across databases, and solve it by engineering a solution from a combination of existing components or by inventing entirely new ones.

This position, the most prestigious, is called a systems engineer. I eventually would do both of these, working my way into administration and from there into engineering, oblivious throughout to how this intense engagement with the deepest levels of integration of computing technology was exerting an influence on my political convictions.

I'll try not to be too abstract here, but I want you to imagine a system. It doesn't matter what system: It can be a computer system, a legal system, or even a system of government. Remember, a system is just a bunch of parts that function together as a whole, something most people are reminded of only when it breaks. It's one of the great chastening facts of working with systems that the part of a system that malfunctions is almost never the part in which you notice the malfunction. To find what caused the system to collapse, you have to start from the point where you spotted the problem and trace it logically through all of the system's components.

Because systems work according to instructions, or rules, such an analysis is ultimately a search for which rules failed, how, and why—an attempt to identify the specific points where the intention of a rule was not adequately expressed by its formulation or application. Did the system fail because something was not communicated or because someone abused the system by accessing a resource they weren't allowed to or by accessing a resource they were allowed to but using it exploitatively? Was the job of one component stopped or impeded by another? Did one program or computer or group of people take over more than their fair share of the system?

Over the course of my career, it became increasingly difficult for me to ask these questions about the technologies I was responsible for and not about my country. And it became increasingly frustrating to me that I was able to repair the former but not the latter. I ended my time in intelligence convinced that my country's operating system—its government—had decided that it functioned best when broken.
