6 December 2015

CYBER RESILIENCE IN THE AGE OF TERROR: THE FALSE CONUNDRUM OF ENCRYPTION

https://resilient.com/cyber-resilience-in-the-age-of-terror-the-false-conundrum-of-encryption/

Posted on November 30, 2015 by Thomas P.M. Barnett

THE ECONOMIST RUNS A GREAT "BRIEFING" ON CYBER SECURITY in its 28 November issue, asking "how to balance security with privacy after the Paris attacks?" It starts off by noting that the US and UK intelligence services proactively vacuum up the largest amounts of data thanks to the structural advantages they enjoy - namely, America is home to many of the world's largest Internet firms, and Britain sits astride some of the world's biggest undersea fiber-optic cables.

Neither condition is an accident of history, of course, and both, I would argue, should logically be exploited by the nations' respective security agencies - trusting that judicial and legislative oversight mechanisms, along with a free press and whistle-blowers, will continue doing their jobs. Yes, bad actors will continue to seek out and/or construct no-go zones beyond our reach, and advancing encryption technologies will enable that, but that shouldn't push free societies into accepting such "Wild West" dynamics ad infinitum.

Privacy, as the Economist notes, isn't the same thing as anonymity. As consumers, we trade privacy for convenience all the time. As citizens, we trade anonymity for security. If you want a truly anonymous, libertarian paradise, try living in a failed state or ungoverned space. Me? The near-certainties that I enjoy on a day-to-day basis justify my transparency before both my private- and public-sector service providers. Heck, if we all want to achieve the "smart" this and that in the emerging Internet of Things, then submitting to such transparency is a given. Simply in terms of long-term environmental sustainability, privacy must and will be sacrificed to network control functions.

For now, the countering IT industry argument is that we need more and stronger encryption technologies throughout our economies, lest we endanger our critical infrastructure. Hard to argue with that. The latest surveys indicate that less than half of our private enterprises consistently apply an encryption strategy (nice Economist chart on that below) - a scary deficiency when you remember that the vast majority of our critical infrastructure is owned and operated by private firms, which collectively are far more targeted by cybercriminals than is the public sector (with energy & utilities and financial services topping the list of those suffering the highest annual losses - according to another great Economist chart below).

[Economist charts: share of firms consistently applying an encryption strategy; annual cyber-crime losses by industry]

According to the article, national security experts seek four "powers" regarding current and future encryption technologies: 
1. Tech firms should store message traffic sent along their networks - just in case security agencies need to access it in response to an emerging or existing threat.
2. Firms must be willing to crack any code they produce and sell, if confronted with a legal warrant.
3. Firms shouldn't sell any code they cannot crack themselves.
4. Firms should build certain weaknesses into their code to allow security agencies to crack it as needed - and on their own.


My sense is that #1 is reasonable, so long as there are statutory limits per the emerging civil "right to be forgotten."


On the face of it, #2 is the same standard we've applied to previous generations of communication and information technologies, so it's hard to argue against that.


Number 3 strikes me as not just sensible but inevitable, not because I fear "Skynet" eventually killing us all but because of basic product liability issues that the courts will eventually have to address.


It's #4 where I most definitely hesitate, because it amounts to producing weak(er) encryption technologies out of a generalized fear that our oversight institutions aren't up to the task of doing the right thing - consistently - when judging surveillance requests made by security agencies. Plus, if the private sector agrees to #1-3, then the public sector should be satisfied with such access as can be legally obtained through established procedures. In practice, this sounds like tech firms leaving "backdoors" inside their code to allow for subsequent access - something I think is warranted (pun intended). But until they're proven unreliable on this score, the conservative in me says that the firms - and not the government - should own the "keys."
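
To make that last point concrete, here's a minimal, purely illustrative sketch - my own simplification in Python, using the open-source "cryptography" package, not any vendor's actual design - of what firm-held "keys" could look like: each message is encrypted once with a fresh key, and that key is wrapped both for the recipient and for an escrow key that the firm (not the government) retains, to be produced only in answer to a lawful warrant.

    # Illustrative sketch only: firm-held key escrow layered on ordinary hybrid
    # encryption. All names here (recipient_key, escrow_key, etc.) are hypothetical.
    # Requires the third-party "cryptography" package (pip install cryptography).
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # One key pair for the recipient, one escrow key pair held by the service provider.
    recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    escrow_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    def encrypt_message(plaintext: bytes):
        """Encrypt with a fresh AES-GCM key, then wrap that key for both parties."""
        msg_key = AESGCM.generate_key(bit_length=256)
        nonce = os.urandom(12)
        ciphertext = AESGCM(msg_key).encrypt(nonce, plaintext, None)
        for_recipient = recipient_key.public_key().encrypt(msg_key, OAEP)
        for_escrow = escrow_key.public_key().encrypt(msg_key, OAEP)
        return nonce, ciphertext, for_recipient, for_escrow

    def escrow_decrypt(nonce, ciphertext, for_escrow):
        """What the firm - and only the firm - could do when served with a warrant."""
        msg_key = escrow_key.decrypt(for_escrow, OAEP)
        return AESGCM(msg_key).decrypt(nonce, ciphertext, None)

    nonce, ct, _, wrapped = encrypt_message(b"routine customer message")
    assert escrow_decrypt(nonce, ct, wrapped) == b"routine customer message"

The point of the sketch is simply that who holds the escrow key is a design choice, not a law of nature - and the choice I'm arguing for keeps it with the firms.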


In sum, this is not a binary choice on encryption (making it stronger or weaker). Nor is it some mindless surrender to the "slippery slope" of Big Brother surveillance. Simply put, this is the next iteration of a balancing act we've long practiced in the United States.


Are we going to make mistakes in this long-term co-evolution between the private and public sectors? Absolutely. Resilience is less about avoiding mistakes than about surmounting them rapidly, with aplomb and grace. The US political system is built with a rock-paper-scissors structure for self-correction (our three branches - executive, legislative, judicial - consistently balancing one another's ambitions and manias). But in our private sector, such checks and balances are far less robust, primarily for competitive reasons. Yes, these self-regulating mechanisms are most advanced in the utilities sector. It's just that our definition of critical utilities and their infrastructures keeps expanding by the day.


This is why it's so crucial for such key industry sectors (18, by our count, all but one of them dominated by the private sector) to forge ahead on defining, evangelizing, and enforcing "best practices" for critical infrastructure resilience.
