9 March 2021

The danger in calling the SolarWinds breach an ‘act of war’

Tarah Wheeler

When news broke late last year that a massive, years-long Russian cyberespionage operation had penetrated large parts of the U.S. federal government and its information technology systems, policymakers were quick to describe the breach as “an act of war” and to insist that the United States must strike back. But the breach that leveraged weaknesses in software developed by the company SolarWinds was not an act of war. It was an act of espionage. The United States has experienced cycles of outrage over Russian espionage before, and mislabeling espionage as an act of war risks leading the country toward the wrong response.

To understand why the SolarWinds breach was an act of espionage, and not an act of war, it is worth considering the technical details of the breach. The breach occurred via the Orion IT network management software developed by the Texas company SolarWinds, a tool that was and remains widely deployed in U.S. federal systems. SolarWinds serves more than 300,000 customers, and to keep those running Orion on the latest version, the company would occasionally push out an update that client machines would receive and install. The server that held the updated software was compromised when Russian hackers found a hole in SolarWinds’ network security, pivoted through the network to the update server, broke into it, inserted a backdoor into the patch destined for customers, and recompiled the update to look innocent. When customers downloaded legitimate fixes from SolarWinds, they got a Russian wiretap along with them. The updated software contained a backdoor that permitted Russian eavesdropping on every computer running the Orion software.
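
To make the delivery mechanism concrete, here is a minimal sketch, in Python, of the kind of integrity check an update client can run: comparing a downloaded update file against a hash the vendor publishes. The filenames and hash value are hypothetical, not SolarWinds artifacts. Notably, because the attackers tampered with the software before it was built and signed, the poisoned Orion updates would have passed a check like this anyway; that is what makes a supply-chain compromise so hard to catch downstream.

```python
# Minimal sketch of a client-side update integrity check.
# Filenames and hashes below are hypothetical placeholders.
import hashlib


def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_update(update_path: str, published_hash: str) -> bool:
    """Compare the downloaded update's digest with the hash the vendor published."""
    return sha256_of(update_path) == published_hash


# Example usage (placeholder values):
# if verify_update("orion-update.msi", "<published sha256 hex digest>"):
#     print("Update matches the vendor's published hash")
# A trojanized update built and signed inside the vendor's own pipeline
# would still pass this check, since the published hash describes the
# already-compromised file.
```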

Such supply-chain attacks are particularly challenging to detect. Think of it this way: Most people need to buy milk but rarely think about the supply chain that delivers it. If you purchase a gallon of milk at the store under a common brand name, like Darigold or Lucerne, you know intuitively that all the milk in that gallon didn’t come from a single cow. In fact, it probably didn’t even come from a single dairy. It came from multiple suppliers via the distributors that make up the supply chain ending in your gallon of milk. SolarWinds is one of those original dairies, and it didn’t notice that a few of its cows had been swapped out for goats until, somewhere down the line, someone with a goat milk allergy recognized what was happening and people started asking why there was goat milk in the cow milk supply chain.

SolarWinds appears to have made it easy for the attacker to breach its supply chain. One SolarWinds server with administrative power over other company computers was protected with the password “solarwinds123.” A password this simple, just the company name plus a few predictable extra characters like “123,” falls to any standard hostile password-cracking attack; guessing it is something the most junior member of an internal red team or penetration test would try as part of a standard information security audit. Such a password is a hint that an attacker would find poor security practices in many other places, perhaps enough to compromise the entire system. Kevin Thompson, the former CEO of SolarWinds, recently claimed that an intern had set this password years ago, but he offered little explanation of why a single intern had that kind of access to company production servers in the first place. SolarWinds failed to lock the doors to the dairy, and anyone passing by could see it. But this breach was no black swan event; lapses like this exist everywhere, in tech and security companies alike, and the problem is not that the general public is unaware of them but that the companies themselves often do not know they are there. None of that, however, makes the Russian espionage operation targeting SolarWinds a cyberattack.
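
As an illustration of how little effort a password like this withstands, here is a minimal sketch, in Python, of the first pass a junior penetration tester might run: generating the obvious company-name-based guesses before reaching for any real cracking tool. The word forms and suffixes are illustrative assumptions, not drawn from any actual audit of SolarWinds.

```python
# Minimal sketch of the trivial first-pass password guesses a junior
# penetration tester would try during a routine audit. The suffix list
# is an illustrative assumption, not real audit tooling.
def trivial_guesses(company_name: str) -> list[str]:
    """Build obvious passwords derived from the company name."""
    forms = [company_name.lower(), company_name.capitalize(), company_name.upper()]
    suffixes = ["", "1", "12", "123", "1234", "!", "2020", "2021"]
    return [form + suffix for form in forms for suffix in suffixes]


if __name__ == "__main__":
    for guess in trivial_guesses("solarwinds"):
        print(guess)  # "solarwinds123" appears within the first handful of guesses
```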

It’s easy for people to mix up cyberespionage and cyberwarfare. Cyberwarfare is the use of computers to conduct an operation that is intended to have a kinetic effect, whether that is shutting down power grids, crashing airplanes, denying access to critical communications, attacking military infrastructure, or interrupting hospital operations. It is deception and destruction on foreign shores, and such acts belong under the laws of war; when it targets civilians, and especially health facilities, cyberwarfare can be a war crime. Cyberespionage, on the other hand, is the act of a government listening in on the activities of foreign computers, just as in-person espionage might involve listening at hotel keyholes or telephone espionage might mean a silent third listener on a phone call. To the best of my ability to tell, the SolarWinds operation did not do one dollar of physical damage to any computer system, nor did a single human so much as break a fingernail, and, as a result, this operation was an act of cyberespionage.

The sheer abundance of embarrassment at the extent of the SolarWinds breach does not make a wiretap an act of war, nor does calling this event a cyberattack make it one. As the country begins to heal from the last four years of “alternative facts,” we must return to calling things by their correct names. Under the Tallinn Manual 2.0, the description of the laws of war in cyberspace put out by NATO’s Cooperative Cyber Defence Centre of Excellence, this kind of espionage operation does not qualify as an attack: it is not an active attempt to deceive or destroy on the target’s own soil. Keeping an espionage operation like this below the boiling point of what could be called an act of war, such as the destruction of computer systems, can even be one of its key objectives. Once an attacker has access to a machine that holds sensitive information, why would she turn it off? I am an offensive security researcher, and when I break into a computer that contains sensitive information, I do nothing that is physically noticeable, and I certainly don’t turn it off. What I do is quietly listen, for as long as I possibly can.

Russia may have kept this operation in listening-only mode because history has taught the Kremlin that the fallout from an espionage operation that does not rise to the level of war will be minimal or nonexistent. Before and during World War II, Russia engaged in a years-long listening operation, deeply embedding listening devices and human assets in U.S. aerospace, chemical engineering, communications, and finance industries with minimal consequences. This pre-digital espionage operation involved Julius and Ethel Rosenberg and Alger Hiss, and it targeted U.S. officials up to and including FDR and Truman. As part of this operation, Julius Rosenberg transferred the designs of the Lockheed P-80 Shooting Star jet fighter to Moscow in 1943, and while Congress was made aware of the extent of Russian espionage, the consequences were token at best, partially because of the far greater threat posed by Nazi Germany and partially out of sheer embarrassment at the extent to which the United States had been spied upon.

Instead, Congress directed its attention inward. In response to the Alger Hiss scandal and the revelations of the Venona Project, Congress passed the 1950 Internal Security Act over Truman’s veto, a law that did little to prevent espionage and a great deal to enable domestic oppression. Truman argued, correctly, that the ISA would weaken our internal security far more than it would help, and that the data collection it required would aid foreign penetration of our military-industrial complex rather than hinder it. It was the Patriot Act of its day: instead of appropriately targeting Russia with relevant sanctions and international action to censure its espionage attempts, the legislative branch increased domestic surveillance to appear as though it were acting forcefully rather than out of fear.

Today, a number of different policy proposals have been floated in response to the SolarWinds breach. One former supply chain security official at the Department of Homeland Security has argued in favor of greater oversight of software suppliers. Others have proposed greater incentives for software firms to build secure products. The computer security expert Bruce Schneier recently noted that the economic incentives for companies to fix their own cybersecurity problems before they impact customers are misaligned: companies can transfer the risk of a data breach or cybersecurity incident to their customers or to taxpayers with little or no financial impact. When the customer is the U.S. federal government, that risk and damage are borne by the company’s customers and by taxpayers, not by shareholders, who will keep pouring dollars into the stocks of companies too big to fail. This is a natural place for government regulation to intervene, since the market will not solve this problem; indeed, the market rewards making it worse. Brad Smith of Microsoft and Kevin Mandia of FireEye have proposed a requirement for private companies to disclose cybersecurity incidents, which addresses some of the disincentives that lead companies to keep breaches secret. Mandia noted as well that there must be liability protection for companies that disclose in good faith. As someone who has had to justify a cybersecurity budget to corporate executives who often see cybersecurity as a cost center like janitorial work or legal fees, I can tell you that this will be a big help to people inside companies who want to do the right thing. While these are complex policy issues that will take time to implement well, they will make people safer and reduce the burden on taxpayers in the long run.

The time, care, and complexity required to formulate a good response are part of why we are at risk today of a misguided congressional reaction to the SolarWinds breach. Instead of increasing domestic surveillance, we should ask two important questions. First: U.S. policymakers should not be asking, “How did they break in?” but, “How do we know they’re all the way out?” This question (how do we know our networks are not being hacked?) is one of the most fiendishly difficult in corporate network security. When I am asked it, I can only answer with probabilities and best practices. Anne Neuberger, the deputy national security advisor for cyber and emerging technology and the person charged with the cleanup, recently observed that the breach was “focused on the identity part of the network, which is the hardest to clean up.” Basically, because this hack targeted the very software normally used to identify the people and computers permitted to audit and secure systems, it is even more difficult to make repairs and re-secure the network.

This is a complex issue requiring dedication and care from security professionals and policymakers, and it cannot get lost in complacency and a single news cycle. We must see a comprehensive technical report on SolarWinds, an appropriate congressional investigation, and appropriate executive branch responses in economic and foreign policy to penalize Russia for this operation. The United States must demonstrate that it has broken the historic cycle of complacency and recalibrated its global relationships with a focus on security, prosperity, and the integrity of U.S. federal communications and innovation.

Simple policy and cultural reforms can produce major improvements in U.S. cybersecurity. The single most important long-term domestic investment we can make is civilian and military cybersecurity defense spending that requires multiple bids and third-party audits, with an increased focus on security in government suppliers that goes beyond a simple compliance checklist. Also profoundly important is ensuring that companies are incentivized to build secure products and to disclose it when something goes awry. Infosec should not be an article of belief, but a boring repair job like highway maintenance. Technology is inherently political; cybersecurity should not be. Partisan politics should not dictate whether you, I, or members of Congress update our mobile devices and laptops, or whether our federal networks are properly secured and maintained.

The second key question we must ask is whether we are targeting our efforts at censuring Russia by working through the State Department and other appropriate civilian agencies. Using the U.S. military to respond to this operation would only be appropriate had it been an act of war, and increasing domestic surveillance in response to foreign surveillance is a mismatched response. Neuberger recently observed, correctly, that part of the technical problem is that “there’s a lack of domestic visibility,” but then implied that greater domestic surveillance could be a possible response: “As a country we have chosen to have both privacy and security. The intelligence community largely has no visibility into private sector networks.” Neuberger should absolutely be incentivizing private American companies to improve security, but the federal response should not be to target private American companies out of embarrassment over this surveillance operation; it should be continuous engagement with the Russian government that carried it out. This is the trap we fell into after discovering the extent of the pre-WWII Russian surveillance operations, and we can avoid it this time by learning history’s lessons.

If we ask and answer these two questions according to U.S. tradition and law, we can avoid the trap of increasing domestic surveillance and introducing counterproductive security measures. Economic sanctions offer a diplomatic toolkit for penalizing Russian espionage activity, and there is scope for a complex, appropriately targeted response. A sanctions package is currently being crafted by the Biden administration and will provide an early test of the administration’s ability to deliver a proportionate, targeted response to Russian espionage. The United States must build its response carefully: The people of Russia are not the enemies of the people of the United States, and sanctions can be a blunt stick if not carefully applied to the actual source of this operation.

Tarah Wheeler is a contributing editor to TechStream, a Cyber Project Fellow at the Belfer Center for Science and International Affairs at Harvard University’s Kennedy School of Government, an International Security Fellow at New America leading a new international cybersecurity capacity building project with the Hewlett Foundation’s Cyber Initiative, and a US/UK Fulbright Scholar in Cyber Security for the 2020/2021 year.

Microsoft provides financial support to the Brookings Institution, a nonprofit organization devoted to rigorous, independent, in-depth public policy research.
