Virtualization – The Easy Way

Virtualization is all about making the most efficient usage of your infrastructure while allowing for maximum flexibility

We all know that our computer systems are underutilized. Most new desktops come with far more storage than will ever be used (or, to be more exact, than should be used, given file servers). Processors in desktops and servers are busy only 20% of the time, and we always buy more memory than we need, just in case. We are left spending large amounts of capital on hardware that will sit idle most of the time.

The solution is Virtualization - one of the latest buzzwords floating around Information Technology. It seems that we have been talking about virtualization for a while now, but what does it actually mean and how do we get there? The key thing to know is that virtualization is all about making the most efficient usage of your infrastructure while allowing for maximum flexibility.

As in all things, we need to be clear on our terms - what is virtualization? In practice, "virtualization" covers many different but related concepts:

  • Server Virtualization. Taking several single-purpose servers and having them all run as separate "virtual" servers on the same physical server or set of physical servers
  • Network Virtualization. Virtualizing your network infrastructure so that single pipes (single pieces of copper or fiber) carry many different virtual Local Area Networks (VLANs) and/or different types of traffic (where data networks and storage networks are combined on the same physical pipe)
  • Storage Virtualization. Virtualizing your storage through the use of storage servers. Storage is then divided up and allocated to the physical servers, where each server thinks it has its own set of disk drives
  • Desktop Virtualization. Moving desktop processing to central servers (which may or may not themselves be virtualized), with each user given a virtual desktop

For the purposes of this article, we will focus on Server Virtualization. The other types of virtualization are sufficiently complex as to require their own articles.

At a high level, server virtualization can be accomplished using some very basic steps:

  • Inventory your existing servers - how many are there? For the sake of our examples, let's assume 20 servers today.
  • Compare the peak processor usage of all 20 servers. If you have good management software, it should be able to stack all 20 servers' processing times together so that you can see where your peak usage is (a sketch of this stacking calculation follows this list). For our example, let's say we determine that we need 8 CPUs at peak processing time and that the existing servers were all dual-core systems - for a total of 40 CPUs.
  • Compare the peak memory usage of all 20 servers. If you have good management software, it should be able to stack all 20 servers' memory usage together so that you can see where your peak usage is. For our example, let's say we determine that we need 32 GBytes at peak processing time and that the current systems have over 80 GBytes in them today.
  • Compare the peak network usage of all 20 servers. If you have good management software, it should be able to stack all 20 servers' network usage together so that you can see where your peak usage is. For our example, let's say we determine that we need 1.5 Gigabits/sec at peak network transfer (two network connections). Today we are using 20 network connections for the main traffic, plus another 20 for the management interface.
  • Determine how much actual disk space is being used. For our example, let's say we determine that 7 Terabytes are needed and that the systems have a total of 15 Terabytes worth of storage.
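
If your management software can export per-server usage samples, the "stacking" described above amounts to summing each metric across all servers at every sample time and taking the maximum of the combined curve. Here is a minimal sketch in Python; the server names and numbers below are invented purely for illustration:

```python
from typing import Dict, List

def combined_peak(samples_by_server: Dict[str, List[float]]) -> float:
    """Sum a metric across all servers at each sample time and return the peak."""
    series = list(samples_by_server.values())
    n = min(len(s) for s in series)                        # align on the shortest series
    totals = [sum(s[i] for s in series) for i in range(n)]
    return max(totals)

# Hourly CPU-core usage for three of the twenty servers (fabricated numbers).
cpu_cores = {
    "web01":  [0.4, 1.2, 1.8, 0.6],
    "db01":   [1.1, 2.5, 3.0, 1.4],
    "mail01": [0.3, 0.9, 1.1, 0.5],
}

print(round(combined_peak(cpu_cores), 2))   # peak simultaneous CPU cores needed -> 5.9
```

The same routine works for memory, network, and disk samples; only the input data changes.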

So we know that at peak we are using 7 Terabytes of disk space, 32 GBytes of memory, and eight CPUs - how does that help us? Well, since most sites are achieving between 10 and 20 virtual servers per physical server, we know that we should be able to reduce our datacenter from 20 servers to two servers. If each server had six CPU cores and 32 GBytes of memory, and was connected to a shared 10 Terabyte storage system, we would meet all of our current needs plus allow for reasonable growth.
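
To sanity-check that design, compare the stacked peaks against the capacity of the proposed two-host configuration. A rough sketch, using the figures from this example:

```python
# Capacity check for the proposed design: two hosts with six cores and
# 32 GBytes each, sharing a 10 Terabyte storage system. Peaks come from the
# stacking exercise above.

peak = {"cores": 8, "memory_gb": 32, "storage_tb": 7}

hosts = 2
capacity = {
    "cores": hosts * 6,        # 12 cores total
    "memory_gb": hosts * 32,   # 64 GBytes total
    "storage_tb": 10,          # shared storage system
}

for resource, need in peak.items():
    have = capacity[resource]
    headroom = (have - need) / need * 100
    print(f"{resource}: need {need}, have {have}, headroom {headroom:.0f}%")

# cores: need 8, have 12, headroom 50%
# memory_gb: need 32, have 64, headroom 100%
# storage_tb: need 7, have 10, headroom 43%
```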

I said before that "virtualization is all about making the most efficient usage of your infrastructure while allowing for maximum flexibility." Let's see how well we did.

  • We reduced our server count from 20 servers to two servers. This means that we are no longer trying to power 20 servers, trying to cool 20 servers, trying to cable 20 servers, etc.
  • We reduced the number of CPU cores from 40 to 12. CPUs are expensive - and we just saved a lot of money.
  • We reduced the amount of memory purchased from 80 Gbytes to 64 Gbytes.
  • We reduced the amount of disk space purchased from 15 Terabytes to 10 Terabytes (of which 7 Terabytes are in use today).
  • We reduced the number of network connections from 40 connections (almost a switch of their own!) to four connections. I should note that most servers actually use two network connections for the main traffic to allow for redundancy in case of a network failure - counting those, we went from 60 connections to six (see the sketch after this list).
  • We set up an environment that can accommodate a 50% increase in CPU requirements above today's usage without buying new servers or migrating applications to them.
  • We set up an environment that can accommodate additional memory.
  • We set up an environment that can accommodate a 40% increase in storage requirements - without requiring disks to be swapped out when more space is needed.
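
For the network item above, the before-and-after counts work out as follows, assuming each server carries two redundant data links plus one management link (the per-server link layout is an assumption for illustration):

```python
# Connection counts before and after consolidation, assuming two redundant
# data NICs plus one management NIC per server (an assumed layout).
servers_before, servers_after = 20, 2
links_per_server = 2 + 1          # two data links + one management link

print(servers_before * links_per_server, "->", servers_after * links_per_server)
# 60 -> 6 network connections
```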

We were able to use server virtualization to reduce our operating, maintenance, and cooling costs, and to create a more nimble architecture that can respond to changes in the business. We successfully created an infrastructure that is effectively utilized and gives us maximum flexibility.

Before we finish, let me give a final message about Desktop Virtualization. Desktop Virtualization can result in significant cost savings in both operating and capital budgets. Imagine the case where a company has 1,000 users. Best practice involves replacing desktops every three years - or 333 desktops a year. At $1,000 per desktop (adjust up or down based on your company's standard desktop), that is $333,000 in new PCs every year. At one desktop engineer for every 150 desktops, that is seven desktop engineers, plus one more just to handle the upgrades. Desktop Virtualization would reduce the number of engineers to about two (representing an operating savings of close to $600,000/year) and reduce the capital expenditures to about $50,000 a year. Do you know of any companies that would like to save close to $1,000,000 a year?
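
The arithmetic behind that claim, as a rough sketch - the $100,000 fully loaded annual cost per engineer is my assumption (it is roughly what the ~$600,000/year operating savings implies), so substitute your own figures:

```python
import math

# Back-of-the-envelope savings for the desktop example. The $100,000 fully
# loaded annual cost per engineer is an assumption; adjust for your company.

users = 1000
refresh_years = 3
desktop_cost = 1_000                    # dollars per replacement desktop
desktops_per_engineer = 150
engineer_cost = 100_000                 # assumed fully loaded annual cost

# Traditional desktops
desktops_per_year = users // refresh_years                          # 333
capital_before = desktops_per_year * desktop_cost                    # $333,000
engineers_before = math.ceil(users / desktops_per_engineer) + 1      # 7 + 1 for upgrades = 8

# Virtualized desktops (the article's estimates)
engineers_after = 2
capital_after = 50_000

operating_savings = (engineers_before - engineers_after) * engineer_cost   # $600,000
capital_savings = capital_before - capital_after                           # $283,000
print(f"Approximate annual savings: ${operating_savings + capital_savings:,}")
# Approximate annual savings: $883,000
```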

More Stories By Dean Nedelman

Dean Nedelman is Director of Professional Services at ASi Networks, a network and voice services firm in City of Industry, Calif. In this role, he supports the firm's major accounts and oversees the implementation of advanced Cisco solutions and services. He has over 25 years of experience in technical and executive-level technology roles, primarily in secure high-performance computing environments across multiple industries and sectors. Dean's background includes security, high availability, telephony, advanced networking, wide area application services, and storage area networks.
