The Three Reasons Hybrid Clouds Will Dominate

In the short term, hybrid cloud is going to be the cloud computing model of choice.

Amidst all the disconnect at CloudConnect regarding standards and where “cloud” is going was an undercurrent of adoption of what most have come to refer to as a “hybrid cloud computing” model. This model essentially “extends” the data center into “the cloud” and takes advantage of less expensive compute resources on-demand. What’s interesting about the use of this cheaper compute is the granularity of “on-demand”: the time interval for which resources are utilized is measured more in project timelines than in minutes or even hours. Organizations need additional compute for lab and quality assurance efforts, for certification testing, and for production applications for which budget is limited. These are not snap decisions but rather methodically planned steps along the project management lifecycle. It is on-demand in the sense that it’s “when the organization needs it,” and in the sense that it’s certainly faster than the traditional compute resource acquisition process, which can take weeks or even months.

Also mentioned more than once by multiple panelists and speakers was the notion of separating workload such that corporate data remains in the local data center while presentation layers and GUIs move into the cloud computing environment for optimal use of available compute resources. This model works well and addresses issues with data security and privacy, a constant top concern in surveys and polls regarding inhibitors of cloud computing.
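As a rough Python sketch of that split (the internal address, port, and data shape here are hypothetical), a cloud-hosted presentation tier might do little more than render what it fetches from a data API that never leaves the corporate network and is reachable only over the VPN:

import json
import urllib.request

# Hypothetical: the data service stays on the corporate network and is reachable
# from the cloud-hosted presentation tier only through the VPN tunnel.
INTERNAL_DATA_API = "http://10.1.20.15:8080/api/orders"

def render_order_summary() -> str:
    """Presentation logic runs in the cloud; sensitive data stays on-premises."""
    with urllib.request.urlopen(INTERNAL_DATA_API, timeout=5) as resp:
        orders = json.load(resp)
    rows = "".join(f"<tr><td>{o['id']}</td><td>{o['status']}</td></tr>" for o in orders)
    return f"<table>{rows}</table>"

if __name__ == "__main__":
    print(render_order_summary())

Nothing sensitive is stored at the provider; the presentation tier holds only what it needs to render a response.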

It’s not just the talk at the conference that makes such a conclusion probable. An Evans Data developer survey last year indicated that more than 60 percent of developers would be focusing on hybrid cloud computing in 2010.

Results of the Evans Data Cloud Development Survey, released Jan. 12, show that 61 percent of the more than 400 developers polled said some portion of their organizations' IT resources "will move to the public cloud within the next year," Evans Data said. "However, over 87 percent [of the developers] say half or less than half of their resources will move ... As a result, the hybrid cloud is set to dominate the coming IT landscape."

There are three reasons why this model will become the de facto standard strategy for leveraging cloud computing, at least in the short term and probably for longer than some pundits (and providers) hope.


HERE COMES THE LOGIC

If we recall the model, specifically when used as a virtual private cloud, you’ll note that what the model actually does is extend the data center into the cloud computing provider’s compute space. Using VPN technology, whether SSL or IPsec based, a set of cloud-based compute resources essentially becomes part of the organizational data center.

This becomes an important distinction because of the benefits associated with such a model. It is these benefits, in fact, that will drive the adoption of such a model faster than any other.

1. LEAST AMOUNT of NETWORK DISRUPTION
The use of VPN technology provides an extension of the data center through the extension of the network. Address schemes and routing are implemented in the same way they are implemented throughout the rest of the network, and aside from the challenges associated with managing performance over WAN links to remote sites, the cloud computing provider’s network becomes a part of the data center network.

This means the provisioning and subsequent use of cloud-based compute resources can be achieved with the least amount of network disruption. There are no integration or interoperability issues and no complex routing schemes. The fact that a well-understood and proven technology is used to connect the resources helps keep costs lower, because the necessary skill sets are already on hand and experienced staff will likely be able to adeptly navigate the challenges associated with such network configurations.
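One concrete consequence, sketched below in Python using only the standard ipaddress module: the cloud provider’s subnet can simply be carved out of the corporate address plan like any branch office and checked for overlap the same way. The subnets shown are hypothetical.

import ipaddress

# Hypothetical address plan: the VPN-attached cloud subnet is allocated from the
# same corporate supernet as every other site, so routing and ACLs work unchanged.
CORPORATE_SUPERNET = ipaddress.ip_network("10.0.0.0/8")
EXISTING_SUBNETS = [
    ipaddress.ip_network("10.1.0.0/16"),  # headquarters data center
    ipaddress.ip_network("10.2.0.0/16"),  # remote office
]
CLOUD_SUBNET = ipaddress.ip_network("10.9.0.0/16")  # cloud-resident resources

def validate_cloud_subnet() -> None:
    assert CLOUD_SUBNET.subnet_of(CORPORATE_SUPERNET), "not part of the address plan"
    for net in EXISTING_SUBNETS:
        assert not CLOUD_SUBNET.overlaps(net), f"overlaps existing subnet {net}"
    print(f"{CLOUD_SUBNET} fits the existing scheme; no re-addressing required")

if __name__ == "__main__":
    validate_cloud_subnet()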

2. LEVERAGES EXISTING INVESTMENTS in INFRASTRUCTURE
By simply extending the data center via existing standards and technology, it is possible for the organization to continue to leverage existing data center investments, and to do so in a consistent way. There is no need for new or additional solutions for managing traffic and performance or for addressing availability, and existing skill sets can again be leveraged to optimize the use of extended resources. Rather than incur the costs associated with duplicating infrastructure such as load balancing services in the cloud computing environment, the existing infrastructure can be leveraged to perform the same duties for the remote resources. This holds true for identity management and security infrastructure, which continues to serve applications regardless of their physical locality.
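To make the load balancing point concrete, here is a minimal sketch (the member addresses are hypothetical, and a real deployment would configure this in whatever application delivery controller the organization already runs): the existing pool simply gains members that happen to live at the cloud provider but are reached over the VPN like any local host.

import itertools

# Hypothetical pool: local and cloud-resident members share one scheduling policy.
POOL_MEMBERS = [
    "10.1.20.11:80",  # local data center
    "10.1.20.12:80",  # local data center
    "10.9.4.21:80",   # cloud-resident, reached through the VPN tunnel
    "10.9.4.22:80",   # cloud-resident, reached through the VPN tunnel
]

_round_robin = itertools.cycle(POOL_MEMBERS)

def next_backend() -> str:
    """Round-robin selection; member location is irrelevant to the policy."""
    return next(_round_robin)

if __name__ == "__main__":
    for _ in range(6):
        print(next_backend())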

3. VISIBILITY and CONTROL are MAINTAINED
Perhaps most importantly, visibility into the applications is maintained in this model because existing network management and reporting systems can continue to be used. Because all resources are technically on the same “network,” existing methods of management and reporting can be extended as well and incorporated exactly as if the resources were locally deployed. This visibility and control maintains the ability of IT staff to diagnose performance and availability issues and to respond according to organizational SLAs as expected.

blockquote "The hybrid Cloud presents a very reasonable model, which is easy to assimilate and provides a gateway to Cloud computing without the need to commit all resources or surrender all control and security to an outside vendor," said Janel Garvin, CEO of Evans Data. "Security and government compliance are primary obstacles to public cloud adoption, but a hybrid model allows for selective implementation so these barriers can be avoided." – Developers Will Focus on Hybrid Cloud in 2010, Survey Says


IT’S NOT ALL PUPPIES and RAINBOWS

The primary issue with this model is going to come from the link between the data center and the cloud computing provider. While most cloud computing providers are located near or on the Internet’s backbone, with peering agreements with most major providers, the link from your data center to that point may still end up congested or exhibiting poor performance. If you have only one link (of course you don’t, right?), remember that it must be shared with other organizational traffic as well as traffic to and from the cloud computing provider.

Thus consistent performance and utilization become important factors in ensuring this model is workable, especially as cloud-based resources will need to communicate with internal infrastructure. This may require some tweaking of infrastructure services such as monitoring systems to adjust for a longer time to respond than would be typical for resources located on the local network. WAN Application Delivery services may provide some relief and assurances by leveraging application and data optimization techniques that reduce both the chattiness of protocols and the amount of data traversing the link at any given time.
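One such tweak, sketched below in Python with hypothetical hosts and thresholds: run the same health check everywhere, but give cloud-resident targets a more forgiving timeout to account for WAN latency rather than flagging them as down.

import socket

# Hypothetical check targets: identical TCP health check, per-host timeouts.
CHECKS = [
    {"host": "10.1.20.11", "port": 80, "timeout": 1.0},  # local: LAN latency
    {"host": "10.9.4.21",  "port": 80, "timeout": 4.0},  # cloud: allow for the WAN link
]

def is_up(host: str, port: int, timeout: float) -> bool:
    """TCP connect check; returns False on timeout or connection refusal."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for check in CHECKS:
        state = "up" if is_up(**check) else "down"
        print(f"{check['host']}:{check['port']} ({check['timeout']}s timeout) -> {state}")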

While the hybrid model certainly is the most advantageous from a network and financial disruption perspective, it is not “plug-n-play” and will still require adjustments. It may be necessary, for example, to move an existing application deployment to the cloud computing provider in order to free up capacity in the data center for a new project that simply cannot tolerate the more volatile network performance of a WAN.

Still, the hybrid model is almost certainly the model that allows organizations to realize the benefits of cloud computing and take advantage of cheaper compute resources on-demand while maintaining the control and ability to enforce organizational security and performance policies on applications that will be deployed in this “extended” environment. If I were betting on a model to win over the others, I’d put my money on the hybrid model.


Related blogs & articles:

Follow me on Twitter    View Lori's profile on SlideShare  friendfeed icon_facebook

AddThis Feed Button Bookmark and Share

Read the original blog entry...

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.

IoT & Smart Cities Stories
The platform combines the strengths of Singtel's extensive, intelligent network capabilities with Microsoft's cloud expertise to create a unique solution that sets new standards for IoT applications," said Mr Diomedes Kastanis, Head of IoT at Singtel. "Our solution provides speed, transparency and flexibility, paving the way for a more pervasive use of IoT to accelerate enterprises' digitalisation efforts. AI-powered intelligent connectivity over Microsoft Azure will be the fastest connected pat...
CloudEXPO has been the M&A capital for Cloud companies for more than a decade with memorable acquisition news stories which came out of CloudEXPO expo floor. DevOpsSUMMIT New York faculty member Greg Bledsoe shared his views on IBM's Red Hat acquisition live from NASDAQ floor. Acquisition news was announced during CloudEXPO New York which took place November 12-13, 2019 in New York City.
BMC has unmatched experience in IT management, supporting 92 of the Forbes Global 100, and earning recognition as an ITSM Gartner Magic Quadrant Leader for five years running. Our solutions offer speed, agility, and efficiency to tackle business challenges in the areas of service management, automation, operations, and the mainframe.
Apptio fuels digital business transformation. Technology leaders use Apptio's machine learning to analyze and plan their technology spend so they can invest in products that increase the speed of business and deliver innovation. With Apptio, they translate raw costs, utilization, and billing data into business-centric views that help their organization optimize spending, plan strategically, and drive digital strategy that funds growth of the business. Technology leaders can gather instant recomm...
In an age of borderless networks, security for the cloud and security for the corporate network can no longer be separated. Security teams are now presented with the challenge of monitoring and controlling access to these cloud environments, at the same time that developers quickly spin up new cloud instances and executives push forwards new initiatives. The vulnerabilities created by migration to the cloud, such as misconfigurations and compromised credentials, require that security teams t...
AI and machine learning disruption for Enterprises started happening in the areas such as IT operations management (ITOPs) and Cloud management and SaaS apps. In 2019 CIOs will see disruptive solutions for Cloud & Devops, AI/ML driven IT Ops and Cloud Ops. Customers want AI-driven multi-cloud operations for monitoring, detection, prevention of disruptions. Disruptions cause revenue loss, unhappy users, impacts brand reputation etc.
As you know, enterprise IT conversation over the past year have often centered upon the open-source Kubernetes container orchestration system. In fact, Kubernetes has emerged as the key technology -- and even primary platform -- of cloud migrations for a wide variety of organizations. Kubernetes is critical to forward-looking enterprises that continue to push their IT infrastructures toward maximum functionality, scalability, and flexibility. As they do so, IT professionals are also embr...
@CloudEXPO and @ExpoDX, two of the most influential technology events in the world, have hosted hundreds of sponsors and exhibitors since our launch 10 years ago. @CloudEXPO and @ExpoDX New York and Silicon Valley provide a full year of face-to-face marketing opportunities for your company. Each sponsorship and exhibit package comes with pre and post-show marketing programs. By sponsoring and exhibiting in New York and Silicon Valley, you reach a full complement of decision makers and buyers in ...
While the focus and objectives of IoT initiatives are many and diverse, they all share a few common attributes, and one of those is the network. Commonly, that network includes the Internet, over which there isn't any real control for performance and availability. Or is there? The current state of the art for Big Data analytics, as applied to network telemetry, offers new opportunities for improving and assuring operational integrity. In his session at @ThingsExpo, Jim Frey, Vice President of S...
In his keynote at 18th Cloud Expo, Andrew Keys, Co-Founder of ConsenSys Enterprise, provided an overview of the evolution of the Internet and the Database and the future of their combination – the Blockchain. Andrew Keys is Co-Founder of ConsenSys Enterprise. He comes to ConsenSys Enterprise with capital markets, technology and entrepreneurial experience. Previously, he worked for UBS investment bank in equities analysis. Later, he was responsible for the creation and distribution of life settl...