Companies high on virtualization despite fears of security breaches

April 19, 2016
Companies are feeling more comfortable with the cloud, virtualization and even software-defined data centers (SDDC) than ever before, despite their fears about security breaches, according to a study due out this month by technology companies HyTrust and Intel. While no one thinks security problems will go away, companies are willing to tolerate the risk in the name of agility, flexibility and lower costs.

Some 62 percent of the executives, network administrators and engineers surveyed expect broader adoption of SDDC in 2016, which can measurably drive up virtualization and server optimization, while 65 percent predict that those implementations will happen faster.

Still, there are no illusions about security. A quarter of those surveyed say security will still be an obstacle, and 54 percent predict more breaches this year. In fact, security concerns are the No. 1 reason that 47 percent of respondents avoid virtualization, according to the report. They have good reason for concern. A single point of failure in a virtualized platform, such as a hack into the hypervisor software that sits just above the hardware and acts like a shared kernel for everything on top of it, has the potential to compromise an entire network, not just a single system.


“There’s a strong desire, especially by senior-level executives, to move forward with these projects because there are tangible benefits,” says Eric Chiu, president and co-founder of HyTrust. The opportunity to increase agility, revenues and profits trumps making the virtual environment safer, he adds.

Meanwhile, in the IT department, staff tends to focus on what they know how to protect, not necessarily what they need to protect, according to a Kaspersky Lab report. Only a third of organizations surveyed possess strong knowledge of the virtualized solutions that they use, and around one quarter have either a weak understanding of them or none at all.

Dave Shackleford knows this all too well. He teaches a week-long course on virtualization and cloud security for the SANS Institute. By the end of the first day, he usually realizes that 90 percent of the students, a broad mix of system and virtualization/cloud administrators, network engineers and architects, have very little idea of exactly what they’re up against when it comes to securing virtual infrastructure.

“You’ve got organizations out there that are 90 percent virtualized, which means your whole data center is running in a box out of your storage environment. Nobody is thinking about it this way,” says Shackleford, who is also CEO of Voodoo Security. “It’s not uncommon to go into even really big, mature enterprises and find an enormous number of security controls that they’re unaware of or that are being overlooked in one way or another” in the virtual environment, he adds.

Adding to the confusion, virtualization has caused a shift in IT responsibilities in many organizations, says Greg Young, research vice president at Gartner. The data center usually includes teams trained in network and server ops, but virtualization projects are typically being led by the server team. “The network security issues are things they haven’t had to deal with before,” Young says.

The average cost of a data breach in a virtualized environment tops $800,000, according to Kaspersky Lab, and remediation costs push the total closer to $1 million, nearly double the cost of a physical infrastructure attack.

Companies don’t see technology as the sole answer to these security problems just yet, according to the HyTrust survey. About 44 percent of survey-takers criticize the lack of solutions from current vendors, the immaturity of vendors or new vendor offerings, or issues with cross-platform interoperability. Even as vendors like Illumio, Catbird, CloudPassage and Bracket Computing emerge with fixes to some virtualization security problems, companies can’t afford to wait for the next security solution.

“If you’re 50 percent virtualized today, in two years you’re going to be 70 percent to 90 percent virtualized, and it’s not going to get any easier to add security,” Shackleford says. “If you start moving things out to Amazon or Azure or any big cloud provider, you want to have your security at least thought through or ideally in place before you get there, where you’re going to have even less control than you may have had to date.”

These security pros agree that companies can indeed have a secure virtual environment today if they can gain a clear picture of their virtual infrastructure, use some of the technology and security tools they already have, and better align technology and security in the organization.

1. Get a grip on your virtual infrastructure

“You can have very good security just through planning – taking the steps and making sure the safeguards are there,” Young says. This starts with inventory management. “The security team needs to get the lay of the land with regards to virtualization,” Shackleford says. “You need to try to get a handle on where hypervisors are, where management consoles are, what’s in-house, where it lives, and what the operational processes are around maintaining those. Next, define standards for locking them down. If nothing else, at least lock down the hypervisors,” Shackleford adds. Major vendors such as VMware and Microsoft publish hardening guides to help, as does the Center for Internet Security.
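For VMware shops, that first inventory pass can be scripted. Below is a minimal sketch using the open-source pyVmomi SDK; the vCenter address and the read-only service account are placeholders, and TLS verification is disabled only to keep the example short.

```python
# Minimal hypervisor-inventory sketch using pyVmomi, VMware's open-source Python SDK.
# Assumptions: a reachable vCenter at vcenter.example.com and a read-only service
# account; both are placeholders. Skipping TLS verification is for brevity only.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()  # lab use only; verify certificates in production
si = SmartConnect(host="vcenter.example.com",
                  user="svc-inventory",
                  pwd="changeme",
                  sslContext=ctx)
try:
    content = si.RetrieveContent()
    # Enumerate every ESXi host (hypervisor) registered in this vCenter.
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.HostSystem], True)
    for host in view.view:
        product = host.summary.config.product
        print(host.name, "-", product.fullName if product else "unknown build")
finally:
    Disconnect(si)
```

A list like this, kept current, is the starting point for knowing which hosts still need to be locked down against the published hardening guides.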

2. Rethink the way you look at data and storage

People seriously need to think about their environment as a set of files, Shackleford says. “It’s a very big shift for security professionals to realize that your whole data center runs from your SAN – your storage network. So they need to at least get familiar with the types of controls that they’ve put in place.”

Vendors are also rethinking their security postures and welcoming third parties who can provide security fixes. “The problem before was, could I apply fine-grained network security to my virtualized environment? And in the past the network ops people said, ‘Absolutely not. We can’t support it,’” says Chris King, vice president in the networking and security business unit at VMware.

“Now there are technologies available that will enable them to revisit that request and cut the common thread in [these] breaches: once an attacker is inside, they’re stuck in that compartment and have to break through another wall in order to attack.”

3. Encrypt the data

It’s top of mind these days, but many companies are still not encrypting, Chiu says. “There’s this outdated thought process, which is ‘if it’s within my four walls, then I don’t need to worry about it,’ but that’s definitely not the case. You need to at least encrypt all customer data and all intellectual property wherever it is in your environment,” Chiu says. “Of course the cloud makes finding it worse because you don’t know for sure where that data is – but encrypting all that data should be a fundamental principle.”
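As a concrete illustration of that principle, the sketch below encrypts a record at rest with the Fernet construction from the widely used Python cryptography package (authenticated, AES-based encryption). The sample record is made up, and in practice the key would come from a key-management service rather than being generated inline.

```python
# Minimal encryption-at-rest sketch using the 'cryptography' package (Fernet).
# Assumption: in production the key would live in a KMS or HSM, not in the code.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # urlsafe base64-encoded 256-bit key material
cipher = Fernet(key)

# Hypothetical customer record; a stand-in for any data worth protecting.
record = b'{"name": "Jane Doe", "card_last4": "4242"}'

token = cipher.encrypt(record)             # authenticated ciphertext, safe to store
assert cipher.decrypt(token) == record     # raises InvalidToken if tampered with
```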

4. Coordinate security and infrastructure teams early on

There needs to be alignment and coordination between security and infrastructure teams at the beginning of virtualization projects, Chiu says. “It’s a lot easier to build in security controls and requirements in the beginning than to bolt something on later.”

Security also needs to map the requirements of the organization for the next several years, he adds. “Does the company plan to virtualize PCI data, HC data, or move to a shared environment where business units and application tiers are all going to get collapsed together? All those things matter because your requirements are going to be different.”

(www.csoonline.com)

Stacy Collett
