Feds weigh economics, security of cloud computing


Panelists at Security Through Innovation Summit April 14, 2016. (L-R) Robert Klopp, SSA; Joseph Ronzio, Department of Veteran Affairs; Paul Stephenson, VMware; Claudio Belloli, FedRAMP. (FedScoop photo)

Federal IT and cybersecurity officials explored the cost-benefit equations inherent in cloud computing last week, weighing cost savings, security and flexibility against one another as a major security vendor urged attendees to get past “emotional” assessments of the cloud.

“A few years ago, when cloud first came out, there was a very emotional response to it,” said Brian Dye, corporate vice president of global products at Intel Security. “And the emotional response was: The things I control have better security.”

Dye, who spoke at a press lunch on the sidelines of the 2016 Security Through Innovation Summit sponsored by Intel Security, explained that in reality cloud environments tend to be more secure than corporate networks “because they’re incredibly more standardized.” Heterogeneity, such as a mix of endpoints with individualized configurations, has long been recognized as a factor that makes enterprises more vulnerable to cyberattacks.


In addition to a lingering fear of the unknown, speakers at a summit cloud adoption panel addressed common misunderstandings about the economics of the cloud, warning that agencies banking on savings from on-premises cloud computing are making a costly bet, and risk missing out on larger savings and agility available from large public providers.

“There is a false premise in the private cloud that you can get the same economic advantage if you take 200 racks of servers and create your own cloud,” said Robert Klopp, CIO and deputy commissioner for systems at the Social Security Administration.

Klopp acknowledged that private clouds still make sense for managing and safeguarding sensitive information. But they will never achieve the increasingly favorable economics of the large public and hybrid cloud service providers, such as Amazon, Microsoft and Google, that maintain hundreds of thousands of racks of servers, he said.

Even the best-managed private clouds typically use only about 60 percent of their CPU capacity, Klopp said. “Imagine how much CPU utilization is sitting there unused,” Klopp said. “Amazon says, ‘I can go sell that unused CPU utilization to people who just use tiny chunks.’”

But that efficiency, in addition to driving down costs, can create vulnerabilities, warned Steve Grobman, chief technology officer at Intel Security.


“Cloud is a great asset to us, it’s a great asset to our customers, but it’s also a great asset to the attackers,” he said.

Grobman said he was mainly talking about how the large public cloud providers, by allowing small, less security-conscious businesses to maintain a public-facing internet presence, create opportunities for hackers and online crime gangs to hijack cloud infrastructure and use its computing power for distributed denial-of-service attacks or other nefarious purposes.

Those kinds of attacks are distinct from what he referred to as “the next generation of privilege escalation attacks,” or “a virtual machine escape, where you have one cloud occupant essentially break out of their environment to infect the cloud provider” and potentially gain access to other cloud tenants’ data. Although vulnerabilities have been discovered in some cloud software that might in theory allow such an attack, he told FedScoop afterward that no example of it had ever actually been found in the wild.

“I’m a lot less worried about that, at least in the near term,” he told the press lunch.

In the meantime, panelists at the cloud discussion said, several factors are altering the economics of cloud computing.


Advances in application containerization and computing microservices are making it easier to break computing work into smaller payloads that can move in a “frictionless” way between hybrid and private cloud environments, Klopp said.

At the same time, the rise of dynamic pricing systems in the marketplace is making it easier to discover where and when unused computing capacity — and rock-bottom pricing — are available. When big cloud providers want to squeeze extra utilization out of their systems, the price of that spare computing power can fall to virtually zero for savvy users, Klopp explained.

In spite of the economic advantages that public cloud service providers can offer, Paul Stephenson, public sector field CTO at VMware, still suggested tech leaders in government agencies should consider a multi-cloud approach as the most practical way forward.

“We see that you need choice. Some people need to move in and back out,” Stephenson said. “How easy is it to get back [your data and applications] from Amazon right now? It’s not terribly easy.”

Stephenson, who previously worked with the Navy’s Space and Naval Warfare Systems Command, recalled trying to move about 200 applications out of a cloud service. Based on how long it took to move a single application at the time, he estimated the whole job would have taken five or six years.


While that job is getting easier as cloud computing matures, he urged agencies to create virtual, insulated operating layers in their stacks to more effectively deploy and manage the movement of their applications to and from cloud providers.

Joseph Ronzio, special assistant to the chief health technology officer at the Department of Veterans Affairs, added that in terms of system architecture, clouds are more effective as “the endpoint for millions of devices versus the endpoint of one.”

For doctors at the VA, he said it’s often more important to have data and computing power on a doctor’s device rather than in the cloud — and with iPhones now having the computing power of Cray supercomputers of 20 years ago, that’s both possible and practical.

Ronzio suggested starting out with virtual sandboxes in the cloud and making sure vendors can operate in them before moving to larger scale production.

“In health care, vendors promise their solutions will work,” he said. “I tell them, here’s the cloud, let’s see if it actually works.”


The panelists urged government IT attendees in the audience to get past the cultural barriers, security concerns and fear of the unknown that have held up adoption of the cloud, and to get projects underway.

“I would dare to say that the cloud today is often as secure as what you’re doing in a data center,” Stephenson said.

“We need to think about security differently,” added Intel Security’s Grobman, comparing the shift to the 1990s when businesses moved from mainframe to client-server architectures. “They needed to think about the way they secured their systems in a fundamentally different way.”

Wyatt Kash contributed to this report.
