25 April 2018

Future cyber threats will come from inside the architecture

By: Kelsey Atherton

“The Five Most Dangerous Attack Techniques” read the marquee guiding attendees of the RSA cybersecurity conference to this morning’s keynote panel. As the audience shuffled to find seats in the blue-lit room, the four panelists from the SANS Institute launched into a rapid-fire assessment of multiple threats, some of which certainly seemed dangerous. Alan Paller, research director and founder of the SANS Institute, together with SANS instructor Ed Skoudis, SANS Dean of Research Johannes Ullrich, and SANS Head of R&D James Lyne, are in the business of training people in threat response. So, what threats lurk in the shadows of 2018, waiting to ruin the lives of any government agency or contractor that uses computers? They ranged from de-anonymizing data stored in the cloud, to infiltrating server farms primarily to steal processing power, to finding new exploits against industrial controls.

Cloud City

The cloud, metaphorical repository of everything not on computers directly owned by users, is just a fancy term for “someone else’s computer.” Putting data on the computers of others necessarily means surrendering some control over the devices, so Skoudis suggested that when using cloud assets, agencies and contractors invest specifically in data inventories and have a person assigned as data curator. If the computers aren’t in-house, keeping an eye on the data still can be, and since so much of technology today comes down to the collection, maintenance, and use of data, it makes sense to invest resources in tracking data like any other inventory.
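
What such a data inventory looks like in practice was not spelled out on stage, but it does not need to be elaborate to be useful. The following is a minimal sketch in Python, with invented field names and an invented example entry, of the kind of record a data curator might keep for each cloud-hosted dataset.

```python
# Minimal sketch of a cloud data inventory record. Field names and the
# example entry are invented for illustration.
from dataclasses import dataclass
from datetime import date

@dataclass
class DataAsset:
    name: str             # e.g. "support-tickets"
    cloud_location: str   # bucket, database, or service holding the data
    curator: str          # the person assigned to watch over this data
    classification: str   # e.g. "public", "internal", "sensitive"
    contains_pii: bool    # does it hold personally identifiable information?
    last_reviewed: date

inventory = [
    DataAsset("support-tickets", "s3://example-bucket/tickets",
              "data-curator@example.com", "sensitive", True, date(2017, 11, 3)),
]

# With even this much structure, a curator can answer basic questions,
# such as which sensitive datasets have not been reviewed recently.
stale = [asset for asset in inventory
         if asset.contains_pii and asset.last_reviewed < date(2018, 1, 1)]
print(stale)
```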

To prevent developers from leaking credentials, Skoudis recommended git-seekret and git-secrets, and to find sensitive information already sitting in repositories, he pointed users to gitrob. Amazon, Microsoft, and Google all offer tools for threat detection and data loss prevention. And then there is the basic security task of penetration testing: finding vulnerabilities yourself before an adversary does.

“You can do a pen test in the cloud - you just need permission from the cloud provider,” said Skoudis. “The provider can do it themselves or you can do a pen test of the end system.”
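
At their core, repository scanners like git-secrets and gitrob work by matching file contents, and in the real tools the git history as well, against patterns of known secret formats. The sketch below only illustrates that idea and is not the implementation of either tool; the regular expressions are assumptions chosen for the example.

```python
# Illustrative secrets scan over a checked-out repository. The patterns
# (an AWS-style access key ID and a hard-coded password assignment) are
# examples, not an exhaustive or authoritative list.
import os
import re

SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),
    re.compile(r"password\s*=\s*['\"][^'\"]+['\"]", re.IGNORECASE),
]

def scan_repo(root="."):
    """Walk a repository working tree and flag lines that look like secrets."""
    for dirpath, dirnames, filenames in os.walk(root):
        dirnames[:] = [d for d in dirnames if d != ".git"]  # skip git metadata
        for filename in filenames:
            path = os.path.join(dirpath, filename)
            try:
                with open(path, "r", errors="ignore") as handle:
                    for lineno, line in enumerate(handle, 1):
                        if any(p.search(line) for p in SECRET_PATTERNS):
                            print(f"possible secret: {path}:{lineno}")
            except OSError:
                continue  # unreadable file; move on

if __name__ == "__main__":
    scan_repo()
```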

Besides tracking the data itself, Skoudis highlighted the risk that data, even anonymized data, could be correlated with open-source information and de-anonymized, citing the example of anonymized Netflix user information that, paired with data gleaned from IMDb, could reveal who those anonymized users were.
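
The technique behind that example is a linkage attack: join the released records with a public dataset on whatever attributes the two happen to share, such as titles, ratings, and approximate dates. The sketch below uses invented column names and toy data rather than the real Netflix or IMDb schemas, simply to show how few overlapping records it can take to tie an opaque user ID back to a name.

```python
# Toy linkage attack: match "anonymized" ratings against public reviews on
# shared quasi-identifiers (title, rating, approximate date). All data here
# is invented for illustration.
from datetime import date

anonymized = [  # released with an opaque user ID instead of a name
    {"user_id": "u1", "title": "Movie A", "rating": 5, "date": date(2006, 3, 1)},
    {"user_id": "u1", "title": "Movie B", "rating": 2, "date": date(2006, 3, 4)},
]

public_reviews = [  # scraped from a public site, posted under real names
    {"name": "Jane Doe", "title": "Movie A", "rating": 5, "date": date(2006, 3, 1)},
    {"name": "Jane Doe", "title": "Movie B", "rating": 2, "date": date(2006, 3, 5)},
]

def linkage_candidates(anon_rows, public_rows, date_slack_days=3):
    """Count how many public reviews line up with each anonymous user."""
    matches = {}
    for a in anon_rows:
        for p in public_rows:
            same_item = a["title"] == p["title"] and a["rating"] == p["rating"]
            close_date = abs((a["date"] - p["date"]).days) <= date_slack_days
            if same_item and close_date:
                key = (a["user_id"], p["name"])
                matches[key] = matches.get(key, 0) + 1
    return matches

# A handful of overlapping, distinctive ratings is enough to link the ID.
print(linkage_candidates(anonymized, public_reviews))  # {('u1', 'Jane Doe'): 2}
```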

Looming over this discussion of data security was the European Union’s General Data Protection Regulation (GDPR), a set of strong data privacy protections set to come into effect next month.

“I think it will lead to better protection of data,” said Skoudis. “I worry it will stay in Europe. I would like to see it here.”

Mine Craftiness

Pivoting sharply from Skoudis’ presentation on keeping cloud data secure, Ullrich began his presentation in front of a sign that blared “Nobody Wants Your Data Anymore.” He pointed to the life cycle of data theft: stealing information and selling it to others, then stealing information and locking it up with ransomware to sell back to users, and then skipping the data entirely and simply stealing the processing power of a company and its customers.

Why processing power? Bitcoin and other blockchain-mined digital commodities may offer the best economic return on investment for intruders, so sneaking in code that borrows a computer’s resources and sends the proceeds back to whoever planted it gives hackers a covert income stream. How can people spot miners? High CPU load, heavy network traffic, and high temperature are the main means of discovery, and they apply to both outsider and insider threats, since the compromised computers themselves will exhibit signs of misuse.
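
Those signals can be watched for automatically. The sketch below assumes the third-party psutil library and uses illustrative thresholds; it is a starting point for spotting the symptoms described on the panel, not a substitute for real endpoint monitoring.

```python
# Host check for cryptominer symptoms: sustained CPU load, hot sensors,
# and unusually busy processes. Assumes the third-party psutil library;
# thresholds are illustrative, not recommendations.
import time

import psutil

CPU_ALERT_PERCENT = 90.0
TEMP_ALERT_CELSIUS = 85.0

def check_host():
    findings = []

    # System-wide CPU load, sampled over five seconds.
    cpu = psutil.cpu_percent(interval=5)
    if cpu >= CPU_ALERT_PERCENT:
        findings.append(f"high CPU load: {cpu:.0f}%")

    # Temperature sensors, where the platform exposes them.
    temps = getattr(psutil, "sensors_temperatures", lambda: {})() or {}
    for chip, readings in temps.items():
        for reading in readings:
            if reading.current and reading.current >= TEMP_ALERT_CELSIUS:
                findings.append(f"hot sensor {chip}: {reading.current:.0f}C")

    # Per-process CPU use needs two samples: prime the counters, wait, read.
    procs = list(psutil.process_iter(["name"]))
    for proc in procs:
        try:
            proc.cpu_percent(None)
        except psutil.Error:
            continue
    time.sleep(2)
    for proc in procs:
        try:
            usage = proc.cpu_percent(None)
        except psutil.Error:
            continue
        if usage >= CPU_ALERT_PERCENT:
            findings.append(f"busy process {proc.info['name']}: {usage:.0f}%")

    return findings

if __name__ == "__main__":
    for finding in check_host():
        print(finding)
```

None of these readings proves mining on its own; they simply flag machines that deserve a closer look.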

“Someone brought a cryptocoin miner into a data center, put it under the floor,” said Ullrich. “Maybe use a thermal camera in data center security.”

In addition to checking computers for signs of hijacking by miners, and in light of major hardware security flaws like Spectre, security professionals should stop assuming that hardware is automatically trustworthy. Hardware can really only be trusted when it is physically isolated, and the more of it that sits inline on a network, the more opportunity there is for that network to be infiltrated. Going forward, Ullrich recommended not just encryption for communications between networks, but encryption and authentication “on the wire” within networks, between individual machines (an approach sketched at the end of this section). Fortunately, placing a miner on a computer isn’t the end of the world.

“[Miners] have to exfiltrate data,” said Ullrich. “Miners get found, and have simple fixes once found.”

That miners can be found, however, does not mean they will be as easy to find as they have been in the past.

“Coinjacker miners used to be greedy, clock things fast, create obvious problems,” said Ullrich. “Users quit jacked browsers. New miners don’t overclock as obviously.”
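
Ullrich’s recommendation of encryption and authentication “on the wire” within networks, between individual machines, is most commonly realized with mutual TLS, where both ends of a connection must present certificates signed by a trusted internal authority. Below is a minimal server-side sketch using Python’s standard ssl module; the certificate file names and port are placeholders, and a real deployment would also need certificate issuance, distribution, and rotation.

```python
# Machine-to-machine listener that both encrypts and authenticates traffic
# with mutual TLS. Certificate paths and the port are placeholders.
import socket
import ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="server.pem", keyfile="server.key")
context.load_verify_locations(cafile="internal-ca.pem")
context.verify_mode = ssl.CERT_REQUIRED  # reject peers without a valid client cert

with socket.create_server(("0.0.0.0", 8443)) as listener:
    with context.wrap_socket(listener, server_side=True) as tls_listener:
        conn, addr = tls_listener.accept()  # the handshake verifies the client cert
        with conn:
            peer_cert = conn.getpeercert()  # identity of the authenticated machine
            print("authenticated connection from", addr, peer_cert.get("subject"))
            conn.sendall(b"hello, trusted peer\n")
```

A matching client would load its own certificate and the same internal CA before connecting, so each machine proves its identity to the other.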

Power Overwhelming

Industrial and infrastructure systems are at a unique moment of risk, argued Lyne, because they are being moved online more and more, while the industrial control community lacks the decades of expertise in defending against attacks that commercial and personal computing companies have accumulated. The problem is a combination of increased vulnerability on the part of the targets and increased interest from attackers.

“I think it is inevitable we will see more focus on attacks from groups like nation states and others beyond those interested in money focused on industrial controls and memory overloads,” said Lyne. Sabotage as a rationale in itself, divorced from economic incentive, changes the calculus of security and likely requires different responses at the highest levels. Nations have already explored how to get into vulnerable infrastructure systems.

That vulnerability might come in the form of not just taking data from industrial controllers, but changing what data the controllers even receive.

“We’ve seen attacks on controllers; what scares me is when attacks move from controllers to sensors themselves,” said Lyne. “What happens when your source of truth is manipulated?”

Learn from the past to avoid fallout

Tying the various themes together, Paller asked the panel to consider the words of a speaker from earlier in the day, Cisco Senior Vice President John N. Stewart, about the need to shift the cybersecurity perspective from blaming users to blaming vendors.

“I wish we could stop patching vulns on default passwords,” said Ullrich, “and just send hardware back to vendors to fix.”

That vendors have shifted the burden of security onto end users, rather than taking on the work of securing their products themselves, is one hurdle for the future, but the other panelists suggested further lessons to learn.

“There’s lots of work to retrofit security today,” said Lyne, noting that in the commercial and personal computing worlds the common practice is to ship vulnerable and patch later. Perhaps the industrial computing world could learn from that lesson; it wouldn’t mean a world with no patches, but it would mean challenging the assumption that immediate patches are necessary because the product shipped insecure.
