Cloud security – How to balance cost vs loss of control over data

There’s a definite buzz of concern about cloud computing security as companies try to figure out when, how and whether they’re going to use public (as opposed to private or internal) cloud services. Companies want to know that cloud service providers will protect their information, and service-level agreements and SAS 70 audits may not offer them enough reassurance.

Not surprisingly, companies want to reduce risks and offset loss of control. And how best to do that was a hot topic at this week’s RSA Security Conference, as companies try to figure out how to bridge the gap between their reluctance to relinquish control over information security and the limited visibility cloud providers allow into their security architecture.

Perhaps the main issue is transparency: providers can offer strong assurances, but not the kind of accountability an enterprise can demand.

There was heated debate in one RSA session, as Eran Feigenbaum, director of security for Google Apps, said that cloud computing was being held to a standard that didn’t exist inside the enterprise, what he called “euphoric security states.” The panelists, including Feigenbaum, pushed for a standards-based approach to security that would meet the rigors posed by corporate governance and regulatory requirements.

Absent such standards, Feigenbaum noted that Google received SAS 70 certification and shares the audit results on its security controls with customers. Google is also now seeking certification to comply with the Federal Information Security Management Act (FISMA).

“The problem I have with SAS 70,” said Michelle Dennedy, Oracle vice president, “is that unless we make it like the 27000 series or publish the parameters of FISMA, the third-party attestation for one is an apple, the third-party attestation for another provider is a kumquat.” She urged greater transparency, suggesting that “while cloud providers can’t reveal their entire security architecture, they can use vectors of the ISO 27002 standard to reveal as much as they can.”

Jim Reavis, co-founder and director of the Cloud Security Alliance, which has issued a security guidance document for best practices, said the issue of transparency undercuts the question of whether information is any more or less secure in the public cloud than within the enterprise.

“The issue is that since we can’t prove [that the cloud is less secure] — and don’t have the compliance regimen we need to have done — we will require more transparency from cloud providers,” Reavis said.

Security pros are feeling the crunch. Even as companies push the potential cost savings in the cloud, IT departments worry about their ability to effectively mitigate risk or gain sufficient transparency into a cloud provider’s security.

As one conference attendee put it: “The execs and finance folks are banging the gong: go to the cloud, go to the cloud. [But] I would not trust my private data or my high-impact business data to contracts.”

It was not all fear and loathing in San Francisco, however. Amid the uncertainty and hand-wringing, analyst Rich Mogull, CEO of Securosis, and Chris Hoff, Cisco Systems’ director of cloud and virtualization solutions, argued that cloud computing is a rare opportunity to redefine security around the information itself. They call it “information centricity.”

“You should be delighted by disruptive innovation,” said Mogull. “It’s an opportunity.”

Hoff and Mogull argue that technology could soon allow companies to build security around the data itself, wherever it moves, protected based on its intrinsic value and the context in which it is used. For example, quarterly financial results are highly sensitive before they are released, but not once the quarterly report has been published.

“You have to adapt what you do and how; operationally, you may not be able to do what you do now,” Mogull said.

The information-centric approach requires understanding how information flows, and how to apply the appropriate controls based on the context in which it’s used.

A combination of technologies — data labels, encryption, enterprise digital rights management, data leakage prevention and identity and access control — are close to the point, they said, where data can be classified at the point of creation and evaluated and re-evaluated wherever it flows. The result: Concern over whether data is within the enterprise or in a public cloud would lose a lot of its sting.
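To make the information-centric idea concrete, here is a minimal sketch in Python of context-based classification, using the article’s quarterly-results example. The class names, labels, and release-date rule are illustrative assumptions, not any vendor’s actual product or API; real enterprise DRM and DLP systems implement far richer policies.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DataObject:
    """A piece of data labeled at the point of creation (illustrative model)."""
    name: str
    label: str          # classification attached when the data is created
    release_date: date  # context that changes the data's sensitivity over time

def effective_label(obj: DataObject, today: date) -> str:
    """Re-evaluate sensitivity from context, not just the static label.

    Hypothetical rule: quarterly results are confidential until published,
    then become public information.
    """
    if obj.label == "confidential-until-release" and today >= obj.release_date:
        return "public"
    return obj.label

def allowed_in_public_cloud(obj: DataObject, today: date) -> bool:
    """Example policy: only data whose effective label is 'public' may
    leave the enterprise for a public cloud service."""
    return effective_label(obj, today) == "public"

# The same data object yields different decisions in different contexts.
q_report = DataObject("Q2 results", "confidential-until-release", date(2010, 4, 15))
print(allowed_in_public_cloud(q_report, date(2010, 4, 1)))   # before release: False
print(allowed_in_public_cloud(q_report, date(2010, 5, 1)))   # after release: True
```

The point of the sketch is that the decision travels with the data: wherever the object flows, the policy check re-evaluates its label in the current context, rather than relying on where the data happens to be stored.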

The key is to be ready to anticipate change, they said.

“It’s not about perfectly predicting the future,” said Hoff, “but looking at the indicators and correcting course before it’s too late. You have to know what to put on the radar, what to embrace.”

Original article from Computerworld by Neil Roiter, a freelance writer who has covered technology and security issues, most recently for TechTarget.
