Key database considerations for hybrid cloud

02.03.2016
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.

Hybrid cloud implementations are becoming standard for companies building next-generation cloud applications, but their adoption raises questions about how to run and manage database operations that support both environments.

While hybrid cloud allows IT to expand infrastructure resources only when required (i.e. ‘bursting’), improves disaster prevention, and makes it possible to offload some hardware and operational responsibility and associated costs to others, database issues to consider include:

1. How Simple Is It? How easy will it be to have the database run across one or more public clouds and private data centers? If the idea is to have a single database utilize a public cloud as just another part of its IT infrastructure, businesses will want to avoid heavy lifting to incorporate one or more clouds into the database’s deployment topology.

The underlying architecture of a database plays a big role in how simple hybrid cloud is to achieve. For starters, a masterless architecture that treats every installation and running instance of the database software identically will inevitably be simpler to run and manage than a master-slave or similarly styled design. The latter will almost always dedicate parts of the deployment to different activities and functions (i.e., some parts handle write operations while others only handle reads or are reserved for failover), be more difficult to operate, and likely disappoint when it comes to distributing data over wide geographic areas.
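To make the contrast concrete, here is a minimal sketch, assuming a Cassandra-style masterless database and the DataStax Python driver (cassandra-driver); the contact points, data center name and table are hypothetical. Because every node can coordinate both reads and writes, the application simply routes to whichever data center is closest, and the same code works whether those nodes sit in a private data center or a public cloud.

    from cassandra.cluster import Cluster, ExecutionProfile, EXEC_PROFILE_DEFAULT
    from cassandra.policies import DCAwareRoundRobinPolicy, TokenAwarePolicy

    # Prefer nodes in the data center local to this application instance;
    # in a masterless design any of them can coordinate reads and writes.
    profile = ExecutionProfile(
        load_balancing_policy=TokenAwarePolicy(
            DCAwareRoundRobinPolicy(local_dc="onprem_dc1")))

    cluster = Cluster(["10.0.0.10", "10.0.0.11"],  # hypothetical on-premise seed nodes
                      execution_profiles={EXEC_PROFILE_DEFAULT: profile})
    session = cluster.connect()

    # Writes and reads go to the local data center; replication carries the
    # data to the other on-premise and cloud data centers behind the scenes.
    session.execute("INSERT INTO app.events (id, payload) VALUES (uuid(), 'hello')")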

The foundational architecture of a database is closely tied to its replication capabilities, which also play a key part in how simple the deployment is to run and maintain. Part of the idea in using a hybrid cloud database deployment is being able to keep multiple copies of the data in various locations that serve to: (1) provide consistent performance no matter where the web/mobile user is located; (2) deliver continuous uptime at the database level, in contrast to the legacy failover capabilities of centralized databases; (3) supply location independence for both write and read operations; (4) keep certain data local, other data cloud-only, and some data shared in order to satisfy legal and other business requirements.
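As a sketch of what point (4) and location-aware replication can look like in practice, the hypothetical keyspace below (again assuming a Cassandra-style database; the keyspace and data center names are invented) declares how many replicas should live in an on-premise data center and how many in a cloud region.

    from cassandra.cluster import Cluster

    session = Cluster(["10.0.0.10"]).connect()  # hypothetical on-premise node

    # Keep three replicas in the private data center and three in the cloud region,
    # so either location can serve reads and writes and survive the loss of the other.
    session.execute("""
        CREATE KEYSPACE IF NOT EXISTS app
        WITH replication = {
            'class': 'NetworkTopologyStrategy',
            'onprem_dc1': 3,
            'cloud_east': 3
        }
    """)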

Lastly, management and monitoring tools should seamlessly incorporate any cloud-based machines running the database alongside the on-premise hardware that houses the same database. To these tools, machines in the cloud should appear like any other data center in the enterprise’s IT infrastructure.

2. How Scalable Is It? One of the biggest draws of the hybrid cloud model is the ability to quickly scale and meet the rapid demands of more application users and growing data volumes. The key is to avoid idle compute resources and to elastically expand or shrink a database deployment so its capacity matches either current or forecasted demand.

While a nice idea, predictably scaling a database across on-premise infrastructure and the cloud is not an easy task. Many DBaaS offerings appear to provide easy scale, but underneath the covers they rely on a great deal of added structure that, in the end, will not deliver the performance and scalability needed to meet the growth requirements of successful cloud applications. The database’s architecture plays a key role in how scalable it will be across private data centers and cloud providers, and a masterless architecture lets IT predict with far more confidence what a given scale increase will deliver.
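One way to make "predictable" concrete: if every node in a masterless cluster contributes roughly the same throughput and storage, then sizing for a forecast is straightforward arithmetic. The sketch below is purely illustrative; the per-node figures, replica count and forecast numbers are hypothetical and would come from an organization's own benchmarks.

    import math

    # Hypothetical per-node capacity figures, taken from internal benchmarks.
    WRITES_PER_NODE_PER_SEC = 15_000
    STORAGE_PER_NODE_GB = 1_000

    def nodes_needed(peak_writes_per_sec, data_volume_gb, replicas=3):
        """Estimate cluster size when every node contributes equally (masterless)."""
        for_throughput = math.ceil(peak_writes_per_sec / WRITES_PER_NODE_PER_SEC)
        for_storage = math.ceil(data_volume_gb * replicas / STORAGE_PER_NODE_GB)
        return max(for_throughput, for_storage)

    # Today vs. a hypothetical forecast of doubled traffic and 60% more data.
    print(nodes_needed(peak_writes_per_sec=45_000, data_volume_gb=4_000))  # -> 12
    print(nodes_needed(peak_writes_per_sec=90_000, data_volume_gb=6_400))  # -> 20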

3. How Secure Is It? Security is top of mind for every organization considering hybrid cloud deployments. Some studies show that nearly 70 percent of respondents believe moving any part of their database to a cloud increases the chances of unauthorized access, with others pointing to common concerns that include account compromise, cloud malware, excessive data exposure and over-exposed personally identifiable information (PII).

To alleviate these worries, the database being targeted for hybrid cloud should have an enterprise-class security feature set that provides the same levels of protection and security management for data regardless of where it is housed. Encryption should be applied to all data in transit over the wire, both between clients and nodes and between nodes themselves, as well as to data at rest. Authentication and access authorization, whether internal to the database or handled by third-party systems (e.g., LDAP or Kerberos), should be enforced across all sites. Lastly, smart auditing functions should be applied so that database access can be monitored in both cloud and on-premise locations. All of these security controls should function uniformly across the hybrid cloud deployment.
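From the application side, a minimal sketch of those controls, again assuming the DataStax Python driver and using hypothetical hostnames, credentials and certificate path, might look like the following; the same connection code applies whether the contact points are on-premise or in a cloud.

    import ssl
    from cassandra.auth import PlainTextAuthProvider
    from cassandra.cluster import Cluster

    # Encrypt client-to-node traffic and verify the cluster's certificate.
    ssl_context = ssl.create_default_context(cafile="/etc/ssl/certs/cluster-ca.pem")

    # Authenticate with centrally managed credentials (which the server may validate
    # internally or against an external directory such as LDAP); same code for every site.
    auth = PlainTextAuthProvider(username="app_user", password="change_me")

    cluster = Cluster(
        ["cass-onprem-1.example.internal", "cass-cloud-1.example.net"],  # hypothetical nodes
        ssl_context=ssl_context,
        auth_provider=auth)
    session = cluster.connect()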

Beyond standard security management, there is another consideration that is becoming more prominent every day: data sovereignty (sometimes referred to as data residency). Data sovereignty addresses the idea that stored data is subject to the laws of the country in which it is located.

Many of the issues surrounding data sovereignty deal with enforcing privacy regulations and preventing data stored in a foreign country from being subpoenaed by that country’s government. Cloud and hybrid cloud computing have broken down traditional geopolitical barriers, so, in response, many countries have introduced new compliance requirements that mandate sensitive data be kept within the country in which it originates.

Adhering to data sovereignty requirements in a hybrid cloud database deployment comes down to having the flexibility to keep mandated data local-only while permitting other data, free of such mandates, to move between clouds in other geographies.
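In replication terms, that flexibility can be as simple as giving regulated and unregulated data different placement rules. A hedged sketch, once more assuming a Cassandra-style database with invented keyspace and data center names: one keyspace is pinned to a single in-country data center, while another is allowed to replicate to a cloud region as well.

    from cassandra.cluster import Cluster

    session = Cluster(["10.0.0.10"]).connect()  # hypothetical in-country node

    # Regulated data: replicas exist only in the in-country, on-premise data center.
    session.execute("""
        CREATE KEYSPACE IF NOT EXISTS regulated_pii
        WITH replication = {'class': 'NetworkTopologyStrategy', 'onprem_de': 3}
    """)

    # Unregulated data: free to replicate to a public cloud region as well.
    session.execute("""
        CREATE KEYSPACE IF NOT EXISTS product_catalog
        WITH replication = {'class': 'NetworkTopologyStrategy', 'onprem_de': 3, 'cloud_eu_west': 3}
    """)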

All signs point to hybrid cloud being here to stay. Because the database is the heart of nearly every application, it’s important to ensure that any database being considered for a hybrid cloud deployment is simple to operate in such environments, can scale in a predictable fashion, and provides solid data security.


By Robin Schumacher, VP Products, DataStax