Virtualization – The Easy Way

Virtualization is all about making the most efficient usage of your infrastructure while allowing for maximum flexibility

We all know that our computer systems are underutilized. Most new desktops come with far more storage than will ever be used (or, to be more exact, than should be used, given that files belong on file servers). Processors in desktops and servers are busy only about 20% of the time, and we always buy more memory than we need, just in case. We are left spending large amounts of capital on hardware that will sit idle most of the time.

The solution is virtualization - one of the latest buzzwords floating around Information Technology. It seems that we have been talking about virtualization for a while now, but what does it actually mean, and how do you get there? The key thing to know is that virtualization is all about making the most efficient usage of your infrastructure while allowing for maximum flexibility.

As in all things, we need to be clear on our terms - what is virtualization? In practice, "virtualization" covers many different but related concepts:

  • Server Virtualization. Taking several single-purpose servers and running them all as separate "virtual" servers on a single physical server or a set of physical servers
  • Network Virtualization. Virtualizing your network infrastructure so that single pipes (single runs of copper or fiber) carry many different virtual Local Area Networks (VLANs) and/or different types of traffic (for example, data networks and storage networks combined on the same physical pipe)
  • Storage Virtualization. Virtualizing your storage through the use of storage servers. Storage is then divided up and presented to the physical servers, where each server thinks it has its own set of disk drives
  • Desktop Virtualization. Moving desktop processing to central servers (which may or may not themselves be virtualized), with each user given a virtual desktop

For the purposes of this article, we will focus on Server Virtualization. The other types of virtualization are sufficiently complex as to require their own articles.

At a high level, server virtualization can be accomplished using some very basic steps:

  • Inventory your existing servers - how many are there? For the sake of our example, let's assume 20 servers today.
  • Compare the peak processor usage of all 20 servers. If you have good management software, it should be able to stack all 20 servers' processing loads together so that you can see where your peak usage is. For our example, let's say we determine that we need 8 CPUs at peak processing time and that the existing servers are all dual-core systems - a total of 40 CPUs.
  • Compare the peak memory usage of all 20 servers. If you have good management software, it should be able to stack all 20 servers' memory usage together so that you can see where your peak usage is. For our example, let's say we determine that we need 32 GBytes at peak processing time and that the current systems have over 80 GBytes in them today.
  • Compare the peak network usage of all 20 servers. If you have good management software, it should be able to stack all 20 servers' network usage together so that you can see where your peak usage is. For our example, let's say we determine that we need 1.5 Gigabits/sec at peak network transfer (two network connections). Today we are using 20 network connections for the main traffic, plus another 20 for the management interfaces.
  • Determine how much disk space is actually being used. For our example, let's say we determine that 7 Terabytes are needed and that the systems have a total of 15 Terabytes of storage. (The aggregation step is sketched just after this list.)
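
This kind of peak aggregation is normally done inside your monitoring or management tool, but a minimal Python sketch shows the idea. The per-server figures and field names below are hypothetical placeholders, not measurements from the example:

    # Minimal capacity-planning sketch: sum per-server peaks to size a
    # virtualization target. Sample data and field names are hypothetical;
    # real numbers would come from your monitoring/management software.

    servers = [
        # name, peak CPU cores used, peak memory (GB), peak network (Gbit/s), disk used (TB)
        {"name": "web01", "cpu_cores": 0.8, "mem_gb": 2.0, "net_gbps": 0.10, "disk_tb": 0.3},
        {"name": "db01",  "cpu_cores": 1.5, "mem_gb": 6.0, "net_gbps": 0.20, "disk_tb": 1.2},
        # ... one entry per inventoried server (20 in our example)
    ]

    def aggregate_peaks(servers):
        """Sum observed peaks across all servers.

        Summing individual peaks is conservative: if the peaks do not all
        occur at the same time, the true combined peak is lower.
        """
        return {
            "cpu_cores": sum(s["cpu_cores"] for s in servers),
            "mem_gb":    sum(s["mem_gb"]    for s in servers),
            "net_gbps":  sum(s["net_gbps"]  for s in servers),
            "disk_tb":   sum(s["disk_tb"]   for s in servers),
        }

    totals = aggregate_peaks(servers)
    print(f"Peak demand: {totals['cpu_cores']:.1f} cores, {totals['mem_gb']:.0f} GB RAM, "
          f"{totals['net_gbps']:.2f} Gbit/s, {totals['disk_tb']:.1f} TB disk")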

So we know that, at peak, we are using 7 Terabytes of disk space, 32 GBytes of memory, and eight processor cores - how does that help us? Well, since most sites achieve between 10 and 20 virtual servers per physical server, we know that we should be able to reduce our datacenter from 20 servers to two. If each new server has six CPU cores and 32 GBytes of memory, and both connect to a shared 10-Terabyte storage system, we meet all of our current needs and still allow for reasonable growth.
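
Turning those aggregated peaks into a host count is simple arithmetic. Here is a small sketch using the example figures; the host specification (six cores, 32 GBytes of RAM, two gigabit data connections) mirrors the configuration above, and the N+1 note is a common rule of thumb rather than a requirement:

    import math

    # Aggregated peak demand from the 20-server example.
    demand = {"cpu_cores": 8, "mem_gb": 32, "net_gbps": 1.5}

    # One virtualization host as sized above: six cores, 32 GB RAM,
    # two 1-Gbit network connections for data traffic.
    host = {"cpu_cores": 6, "mem_gb": 32, "net_gbps": 2.0}

    # Minimum hosts needed to cover the peak of each resource; the answer is
    # the largest of the three. Many shops would then add one more host (N+1)
    # for failover headroom.
    hosts_needed = max(
        math.ceil(demand[k] / host[k]) for k in ("cpu_cores", "mem_gb", "net_gbps")
    )
    print(f"Hosts required to cover peak demand: {hosts_needed}")  # -> 2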

I said before that "virtualization is all about making the most efficient usage of your infrastructure while allowing for maximum flexibility." Let's see how well we did.

  • We reduced our server count from 20 servers to two servers. This means that we are no longer trying to power 20 servers, trying to cool 20 servers, trying to cable 20 servers, etc.
  • We reduced the number of CPU cores from 40 to 12. CPUs are expensive - and we just saved a lot of money.
  • We reduced the amount of memory purchased from over 80 GBytes to 64 GBytes.
  • We reduced the amount of disk space purchased from 15 Terabytes to a 10-Terabyte shared array, of which 7 Terabytes are in use today.
  • We reduced the number of network connections from 40 (almost a switch's worth!) to four. I should add that most servers actually use two data connections to allow for redundancy in case of a network failure - counted that way, we went from 60 connections to six.
  • We set up an environment that can accommodate a 50% increase in CPU requirements above today's usage without requiring new servers or migrating applications to them.
  • We set up an environment that can accommodate additional memory.
  • We set up an environment that can accommodate roughly a 40% increase in storage requirements - without requiring disks to be swapped out when more space is needed. (The arithmetic behind these headroom figures is sketched just after this list.)
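
As a quick sanity check, the headroom figures above fall straight out of the example numbers (the exact storage figure is about 43%, rounded here to 40%):

    # Headroom check using the example numbers.
    peak_cores, new_cores = 8, 2 * 6    # peak demand vs. two 6-core hosts
    peak_mem,   new_mem   = 32, 2 * 32  # GBytes
    used_disk,  new_disk  = 7, 10       # Terabytes on the shared array

    print(f"CPU headroom:     {(new_cores - peak_cores) / peak_cores:.0%}")  # 50%
    print(f"Memory headroom:  {(new_mem   - peak_mem)   / peak_mem:.0%}")    # 100%
    print(f"Storage headroom: {(new_disk  - used_disk)  / used_disk:.0%}")   # ~43%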

We were able to use server virtualization to reduce our operating, maintenance, and cooling costs, and to create a more nimble architecture that can respond to changes in the business. We successfully created an infrastructure that is effectively utilized and gives us maximum flexibility.

Before we finish, let me give a final message about Desktop Virtualization. Desktop Virtualization can result in significant cost savings in both operating and capital budgets. Imagine a company with 1,000 users. Best practice is to replace desktops every three years - roughly 333 desktops a year. At $1,000 per desktop (adjust up or down based on your company's standard desktop), that is $333,000 in new PCs every year. At one desktop engineer per 150 desktops, that is seven desktop engineers, plus one more just to handle the upgrades. Desktop Virtualization would reduce the number of engineers to about two (an operating savings of close to $600,000/year) and reduce the capital expenditure to about $50,000. Do you know of any companies that would like to save close to $1,000,000 a year?
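
The arithmetic behind that claim is easy to reproduce. In the sketch below, the roughly $100,000 fully loaded annual cost per desktop engineer is an assumption implied by the $600,000 figure, not a number stated above:

    # Back-of-the-envelope desktop virtualization savings, following the
    # example above. The $100,000 fully loaded engineer cost is an
    # assumption implied by the quoted "$600,000/year" operating savings.
    users            = 1_000
    refresh_years    = 3
    cost_per_desktop = 1_000      # adjust for your standard desktop
    desktops_per_eng = 150
    engineer_cost    = 100_000    # assumed fully loaded annual cost

    capex_before     = users / refresh_years * cost_per_desktop   # ~$333,000/yr
    engineers_before = round(users / desktops_per_eng) + 1        # 7, plus 1 for upgrades
    opex_before      = engineers_before * engineer_cost

    capex_after      = 50_000     # estimated spend after virtualizing desktops
    engineers_after  = 2
    opex_after       = engineers_after * engineer_cost

    savings = (capex_before - capex_after) + (opex_before - opex_after)
    print(f"Estimated annual savings: ${savings:,.0f}")            # roughly $880,000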

More Stories By Dean Nedelman

Dean Nedelman is Director of Professional Services at ASi Networks, a network and voice services firm in City of Industry, Calif. In this role, he supports the firm's major accounts and oversees the implementation of advanced Cisco solutions and services. He has over 25 years of experience in technical and executive-level technology roles, primarily in secure, high-performance computing environments across multiple industries and sectors. Dean's background includes security, high availability, telephony, advanced networking, wide area application services, and storage area networks.
