Desktop Virtualization: Right Idea – Wrong Tool

Providing the centralized image management that IT needs as well as the real experience of a PC that users demand

Financial analysts, industry analysts, and CIO-focused publications all agree that Desktop Virtualization will be one of the most strategic business initiatives over the next few years. Many organizations have a VDI implementation or project in place solely because of the success they have had with server virtualization. However, Desktop Virtualization has a completely different value proposition than server virtualization. Server virtualization is all about CAPEX (capital expenditure) reduction while Desktop Virtualization is all about OPEX (operation expenditure) reduction. There are three dollars of OPEX spent for every PC acquisition dollar spent.
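The 3:1 ratio above can be turned into a back-of-envelope lifetime cost estimate. A minimal sketch, assuming a hypothetical $800 acquisition price (the function name and dollar figure are illustrative, not from the article):

```python
# Back-of-envelope PC total cost of ownership, illustrating the
# 3:1 OPEX-to-CAPEX ratio cited above. All figures are hypothetical.
def pc_lifetime_cost(acquisition_cost, opex_ratio=3.0):
    """Return (capex, opex, total) over the PC's useful life."""
    opex = acquisition_cost * opex_ratio
    return acquisition_cost, opex, acquisition_cost + opex

capex, opex, total = pc_lifetime_cost(800)
print(capex, opex, total)  # prints: 800 2400.0 3200.0
```

The point of the arithmetic: even a modest percentage reduction in OPEX dwarfs the same percentage saved on the purchase price.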

The whole value proposition of Desktop Virtualization is centralizing desktop images and their management in the data center or network operations center. In other words, IT manages one copy of Windows and one copy of each application centrally instead of thousands of copies of Windows and applications on individual PCs.

Because it was the first Desktop Virtualization technology to market, VDI became synonymous with Desktop Virtualization. VDI was an evolution of server virtualization (in the case of VMware) or application virtualization (in the case of Citrix) to host and run Windows on a VM centrally instead of leveraging the local computational power of a laptop or desktop. VDI is great for a specific use case: users on a high speed LAN that have relatively static images and are using thin client devices. For the 98% of users that are on laptops and desktops (410M PCs shipping in 2010 as compared to 6M thin clients), VDI is simply the wrong tool for the job.

Users want the same or better PC experience as the PCs they are used to having. They want to use multimedia apps and they want to install their own apps, all while needing to operate over a WAN type of connection or disconnected from the network completely. This is especially the case as PCs become more and more powerful (Moore's Law) for essentially the same price.

IT wants to centralize the management of PCs without having to control the hardware. This enables IT to act as a service provider and offer Desktop-as-a-Service. Since server virtualization provided massive CAPEX savings, there is a natural assumption that desktop virtualization will provide the same CAPEX savings, while also providing the OPEX savings that occur when you only have to manage one copy of Windows and one copy of each application for everyone in an organization.

Here's where the wrong tool for the right job enters the story. While VDI is able to provide centralized management, it does not get down to single-image management because it lacks support for user personalization. Further, the infrastructure required for VDI has left most IT groups with sticker shock.

With a current density of approximately 50 users per server, a VDI implementation for 5,000 users requires roughly 100 servers. High-speed storage and all the networking gear to connect everything together also adds up quickly. As you can probably imagine, the ongoing power and cooling costs of that much infrastructure are not cheap either.
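That sizing math can be sketched in a few lines. The density figure (50 users per server) comes from the article; the function name and any other parameters are illustrative assumptions:

```python
import math

# Rough VDI infrastructure sizing. Density of ~50 users/server is the
# figure cited above; ceil() rounds up because a partial server still
# has to be purchased, racked, powered, and cooled.
def vdi_servers_needed(users, users_per_server=50):
    return math.ceil(users / users_per_server)

print(vdi_servers_needed(5000))  # prints: 100
```

Note that storage, networking, power, and cooling scale with that server count, which is where the sticker shock comes from.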

That covers the IT side of things, but the most important element here is user experience. Simply put, if the user experience is not better than or equal to the experience they have today, the solution will never succeed outside of pilot mode. More and more information is being delivered via video. Video chat and softphones are becoming more prevalent. Users have a plethora of personal and work applications that may include design and graphical applications that simply do not work well in a VDI environment. Working over a low speed network connection or disconnected from the network is becoming more common and introduces even more challenges in creating a working VDI environment.

Today, nearly all laptops shipping have at least a dual core processor, 4GB of RAM, and plenty of hard drive space. Soon, they will have quad core processors, more RAM, and enough hard disk space for all your photos, movies, and music combined - all for relatively the same price point.

What's needed is a solution that provides IT with the ability to realize OPEX gains by centralizing desktop management, but not have to empty their CAPEX budget in the process. Of course this solution also has to work for the users by enabling them to take advantage of the native performance of a PC and have the ability to work from any location - connected via a network or not. This is really what Desktop Virtualization is all about and why Desktop Virtualization is much bigger than just one of its use cases, VDI.

Assuming that eventually all forms of Desktop Virtualization use the same images, and that VDI's issues around persistent personalization are solved without storing a separate copy of Windows and applications for each user, IT departments can centralize all forms of virtual desktops (VDI, laptops, and desktops) and manage one copy of Windows and one copy of each app for the entire organization.

At that point, Desktop Virtualization is really just a better way of doing desktop management. It's better not just because there is single image management instead of managing thousands of different laptops and desktops, but because of all the other benefits that occur when PC images are centrally stored and managed.

Disaster Recovery (DR) is one of those benefits as the primary copy of the PC image is always in the data center or network operations center. Additionally, DR with centralized images means an exact replica of the lost or damaged machine - not just the files. A PC image can even be moved temporarily to a virtual machine so that the user can continue to remain productive while his or her new machine is procured. An exact replica of the original PC image will be placed on the new PC along with any changes the user made while accessing via a virtual machine.

In addition, if the Desktop Virtualization solution integrates image layering into centralized image management, additional benefits and use cases are enabled, including in-place Windows 7 migration and help-desk break-fix operations.

With in-place Windows 7 migration, IT just assigns a new Windows 7 base image layer to a user and the new layer is sent down to the PC in the background. Once downloaded, the user will receive a message to reboot into Windows 7 and will have their Windows personalization and personal data in place. IT does not have to touch the PC and the user's downtime is only around 30 minutes.

Break-fix is accomplished on the centralized image by replacing the operating system or application layer while leaving all other layers on the PC intact. Since there is no troubleshooting involved in this procedure, it can normally be handled by a scripted "level one" help desk technician instead of requiring a Windows troubleshooting expert.
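The layering model behind both use cases can be sketched as a simple data structure. This is a conceptual illustration only, with assumed layer names and compose order, not any specific vendor's implementation:

```python
# Sketch of a layered desktop image: IT-managed layers (OS, apps) are
# shared and centrally versioned, while the user layer holds
# personalization and personal data and is never replaced by IT.
# Swapping the OS layer models in-place migration; re-assigning an
# app layer models scripted break-fix. All names are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class Layer:
    name: str
    version: str

@dataclass
class DesktopImage:
    os: Layer
    apps: Layer
    user: Layer  # stays intact across migrations and repairs

    def migrate_os(self, new_os: Layer) -> "DesktopImage":
        # In-place migration: only the base OS layer changes; the
        # user's personalization layer is carried over untouched.
        return DesktopImage(os=new_os, apps=self.apps, user=self.user)

img = DesktopImage(Layer("Windows XP", "SP3"),
                   Layer("corp-apps", "2.1"),
                   Layer("user-jdoe", "n/a"))
migrated = img.migrate_os(Layer("Windows 7", "RTM"))
print(migrated.os.name, migrated.user.name)  # prints: Windows 7 user-jdoe
```

Because each IT layer is one shared artifact, fixing or upgrading it centrally fixes it for every machine that composes it.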

In all of these cases, the image should be able to be run on the endpoint (not on a hypervisor), so users can work offline, use processor-intensive applications, and enjoy predictable, native PC performance regardless of network connectivity.

Desktop Virtualization is really a game-changing technology, but like other early technologies, it is going through some growing pains.

Centralizing image management via layering is definitely the next way of doing desktop management, and when done right it provides great use cases including patch and image management, complete PC back-up and recovery, in-place Windows 7 migration, and centralized break-fix capabilities. This is really what Desktop Virtualization is all about and why it is much bigger than just one of its use cases, VDI.

For Desktop Virtualization to be truly successful, it will need to provide the centralized image management that IT needs as well as the real experience of a PC that users demand - all without breaking the data center budget.

More Stories By Barry Phillips

Barry Phillips is a seasoned Marketing executive with experience in both large and small companies. He joins Maxta after being the CMO of Panzura, Egnyte and Wanova (acquired by VMware), where he led Marketing, Sales, and Business Development. He came to Wanova from Citrix Systems, where he was the Group Vice President and General Manager of the Delivery Center Product Group. He joined Citrix through the acquisition of Net6. He began his career in United States Naval Aviation where he logged over 1,000 hours in a P-3C Orion.

Barry holds a Bachelor's degree in Computer Science from the United States Naval Academy and a Master's degree in Computer Science from UCLA.
