Beware Virtualization Sprawl

Harnessing 'virtualization sprawl' requires managing an ecosystem of technologies, suppliers

Better managing server virtualization expansion across enterprises has become essential if the benefits of virtualization are to be preserved and enhanced at scale. I recently had a chance to examine ways that IT organizations can adopt virtualization at deeper levels, or across more systems, data and applications -- but at lower risk.

As more enterprises apply virtualization to more workloads to gain productivity from higher server utilization, we often see what can be called virtualization sprawl: a spreading mixture of hypervisors that leads to complexity and management concerns.

In order to ramp up to broader -- yet still advantageous -- use of virtualization, the pitfalls of heterogeneity need to be managed well. Yet no single hypervisor supplier is likely to deeply support any of the others.

So how do companies gain a top-down perspective of virtualization to encompass and manage the entire ecosystem, rather than just corralling the individual technologies? To better understand the risks of hypervisor sprawl and how to mitigate the pitfalls to preserve the economic benefits of virtualization, I recently interviewed Doug Strain, manager of Partner Virtualization Marketing at HP.

Here are some excerpts:

Strain: Virtualization has been growing very steeply in the last few years anyway, but with the economy, the economic reasons for it are really changing. Initially, companies were using it to do consolidation. They continue to do that, but now the big deal with the economy is consolidating to lower cost -- not only capital cost, but also operating expenses.

... There’s a lot of underutilized capacity out there, and, particularly as companies are having more difficulty getting funding for more capital expenses, they’ve got to figure out how to maximize the utilization of what they’ve already bought.

We’re seeing a little bit of a consolidation in the market, as we get to a handful of large players. Certainly, VMware has been early on in the market, has continued to grow, and has continued to add new capabilities. It's really the vendor to beat.

Of course, Microsoft is investing very heavily in this, and we’ve seen with Hyper-V, fairly good demand from the customers on that. And, with some of the things that Microsoft has already announced in their R2 version, they’re going to continue to catch up.

We’ve also got some players like Citrix, who really leverage their dominance in the Presentation Server (now XenApp) market and use that as a great foot in the door for virtualization.

Strain: Because of the fact that all the major vendors now have free hypervisor capabilities, it becomes so easy to virtualize, number one, and so easy to add additional virtual machines, that it can be difficult to manage if technology organizations don’t do that in a planned way.

Most of the virtualization vendors do have management tools, but those tools are really optimized for their particular virtualization ecosystem. In some cases, there is some ability to reach out to heterogeneous virtualization, but it’s clear that that’s not a focus for most of the virtualization players. They want to really focus on their environment.
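
To make that heterogeneity point concrete, here is a minimal sketch of a single inventory pulled across mixed hypervisors. It assumes the open-source libvirt Python bindings rather than any vendor's management suite, and the connection URIs and host names are placeholders; libvirt's drivers can reach KVM, VMware ESX and Microsoft Hyper-V through different URIs, which is one simple way to get one view of a mixed estate.

    # A minimal sketch, assuming the libvirt Python bindings are installed:
    # build one inventory across mixed hypervisors. URIs and host names
    # below are placeholders, not a recommended configuration.
    import libvirt

    HYPERVISOR_URIS = [
        "qemu+ssh://kvm-host-01/system",           # KVM
        "esx://vcenter.example.com/?no_verify=1",  # VMware ESX
        "hyperv://hyperv-host-01/?transport=http", # Microsoft Hyper-V
    ]

    def inventory(uri):
        """Return (name, vCPUs, memory MB, running?) for every domain on one connection."""
        conn = libvirt.openReadOnly(uri)
        rows = []
        for dom in conn.listAllDomains():
            _, max_mem_kib, _, vcpus, _ = dom.info()
            rows.append((dom.name(), vcpus, max_mem_kib // 1024, dom.isActive() == 1))
        conn.close()
        return rows

    if __name__ == "__main__":
        for uri in HYPERVISOR_URIS:
            print(uri)
            for name, vcpus, mem_mb, running in inventory(uri):
                print(f"  {name:<30} {vcpus:2d} vCPU {mem_mb:6d} MB {'up' if running else 'down'}")

A sketch like this only lists what exists; the vendor suites Strain mentions go much further into lifecycle, performance and hardware management.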

The other piece is that the hardware management is critical here. An example would be, if you’ve got a server that is having a problem, that could very well introduce downtime. You've got to have a way of migrating the virtual machines, so that those are moved off of that server.

That’s an area where HP has really tried to invest: pulling all that together, being able to do the physical management with our Insight Control tools, and then tying that into the virtualization management with multiple vendors, using Insight Dynamics - VSE. ... We think that having tools that work consistently both in physical and virtual environments, and allow you to easily transition between them, is really important to customers.
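
To illustrate the failing-server scenario Strain describes, here is a minimal sketch that evacuates running guests from one KVM host to another with libvirt live migration. It is not HP's Insight tooling; the host URIs are placeholders, and it assumes the two hosts share storage so only memory and device state need to move. In practice a management layer would trigger this from a hardware alert rather than an ad-hoc script.

    # A minimal sketch, assuming libvirt, shared storage and placeholder host names:
    # live-migrate every running guest off a host flagged by hardware monitoring.
    import libvirt

    FAILING_HOST = "qemu+ssh://kvm-host-01/system"
    SPARE_HOST   = "qemu+ssh://kvm-host-02/system"

    def evacuate(src_uri, dst_uri):
        src = libvirt.open(src_uri)
        dst = libvirt.open(dst_uri)
        flags = (libvirt.VIR_MIGRATE_LIVE                # keep guests running during the move
                 | libvirt.VIR_MIGRATE_PEER2PEER         # hosts stream the migration directly
                 | libvirt.VIR_MIGRATE_PERSIST_DEST      # define the guest on the target host
                 | libvirt.VIR_MIGRATE_UNDEFINE_SOURCE)  # drop it from the failing host
        for dom in src.listAllDomains(libvirt.VIR_CONNECT_LIST_DOMAINS_ACTIVE):
            print(f"migrating {dom.name()} ...")
            dom.migrate(dst, flags, None, None, 0)
        src.close()
        dst.close()

    if __name__ == "__main__":
        evacuate(FAILING_HOST, SPARE_HOST)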

There are a lot of ways that you can plan ahead on this, and be able to do this in a way that you don't have to pay a penalty later on.

Capacity assessment

It could be something as simple as doing a capacity assessment, a set of services that goes in and looks at what you’ve got today, how you can best use those resources, and how those can be transitioned. In most cases you’re going to want to have a set of tools like some of the ones I’ve talked about with Insight Control and Insight Dynamics VSE, so that you do have more control of the sprawl and, as you add new virtual machines, you do that in a more intelligent way.
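
The arithmetic behind a basic capacity assessment is easy to sketch. The figures below are illustrative rather than measured: peak CPU and memory demand per existing server go in, and a rough count of consolidation hosts comes out, with a headroom margin held back for spikes and failover.

    # Simplified, hypothetical capacity-assessment arithmetic: given peak CPU and
    # memory demand measured on each existing server, estimate how many identical
    # target hosts the consolidated workloads would need. All figures are made up.
    from math import ceil

    # (server name, peak CPU cores used, peak memory GB used) from monitoring data
    measured = [
        ("web-01",  1.2,  3.0),
        ("web-02",  0.8,  2.5),
        ("db-01",   3.5, 14.0),
        ("app-01",  2.1,  6.0),
        ("file-01", 0.4,  4.0),
    ]

    TARGET_CORES  = 16    # cores per consolidation host
    TARGET_MEM_GB = 64    # memory per consolidation host
    HEADROOM      = 0.30  # keep 30% free for spikes and failover

    cpu_demand = sum(c for _, c, _ in measured)
    mem_demand = sum(m for _, _, m in measured)

    usable_cores = TARGET_CORES * (1 - HEADROOM)
    usable_mem   = TARGET_MEM_GB * (1 - HEADROOM)

    hosts_needed = max(ceil(cpu_demand / usable_cores),
                       ceil(mem_demand / usable_mem))

    print(f"peak demand: {cpu_demand:.1f} cores, {mem_demand:.1f} GB")
    print(f"hosts needed after consolidation: {hosts_needed} "
          f"(vs. {len(measured)} physical servers today)")

A real assessment also weighs I/O, licensing and affinity constraints, which is why it is typically delivered as a service with tooling behind it rather than as back-of-the-envelope math.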

We invest very heavily in certifying across the virtualization vendors, across the broadest range of server and storage platforms. What we’re finding is that we can’t say that one particular server or one particular storage is right for everybody. We’ve got to meet the broadest needs for the customers.

...Virtualization is certainly not the only answer, nor the only component, of data center transformation, but it is a substantial one. And, it's one that companies of almost any size can take advantage of, particularly now, when some of the requirements for extensive shared storage have decreased. It's really something that almost anybody who's got even one or two servers can take advantage of, all the way up to the largest enterprises.

