Desktop Virtualization.... The Right Way (Part 1)

Applications Matter

Have you experienced this before? You need an application to help you with a project. You ask your manager if you can purchase the software and you get approval.  You go out and buy the software and install it onto your desktop and away you go to do your job. 

This is a common situation, one I've found myself in on many occasions. These applications make up the non-IT-delivered application set of every organization, and it is a massive list. This happens over and over again in every organization and in every department. So when you hear organizations say they have 10,000 or 20,000 applications, they are likely not exaggerating. Out of that massive list, only 500-1,000 of those applications are IT-managed.

This brings us to the main challenge with desktop virtualization: how do you deal with the non-IT-delivered applications? With Citrix XenDesktop, if you use the recommended strategy of a single image for many users, you lose the ability to install an application into the virtual desktop and have it persist across reboots. This is a major issue that must be dealt with, or users will not accept the virtual desktop.

First, you need an application assessment. You have a few options.

  • Entire-site assessment: By using a tool or doing a manual assessment, you can get a list of applications deployed throughout the organization. This gives you the data points, but the volume of data can be overwhelming. Imagine looking at a list of 20,000 applications. How do you even start determining your optimal solution? This is information overload.
  • Department-by-department assessment: By focusing at the departmental or group level, you get a better grasp of the applications without being overwhelmed from the start, and your application list should be far more manageable.
  • Survey: Leave it up to the departments to create a list of what their users NEED to do their jobs effectively, not what they HAVE. Many installed applications are outdated and unused. By identifying only what is needed, the number of applications becomes far easier to manage.

Regardless of the approach taken, the following is needed for each application:

  1. User
  2. Application
  3. Dependencies
  4. Mobility requirements
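As a rough sketch, each assessment record could be captured in a simple structure like the one below. The field names and sample entries are purely illustrative, not output from any Citrix or inventory tool:

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class AppRecord:
    """One row of the application assessment (illustrative fields)."""
    user: str                                          # who uses the application
    application: str                                   # application name
    dependencies: list = field(default_factory=list)   # other apps/runtimes it needs
    mobility_required: bool = False                    # needed while on the move?

# Hypothetical assessment data gathered from a department survey.
inventory = [
    AppRecord("jdoe", "Visio 2007", ["Office 2007"], mobility_required=True),
    AppRecord("asmith", "AutoCAD"),
    AppRecord("bnguyen", "Visio 2007", ["Office 2007"]),
]

# Count how widely each application is deployed -- a first step toward
# deciding install vs. host vs. stream.
usage = Counter(rec.application for rec in inventory)
```

Even this toy roll-up makes the "information overload" problem concrete: sorting `usage` by count immediately separates the handful of everyone-has-it applications from the long tail.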

Second, it's time for layoffs, but this time we need to lay off applications. If you ask your users what applications they have installed, they will miss most of them. In fact, many of the applications installed on a typical desktop are no longer needed. By laying off applications, we can start to get control of our application set and give our IT organizations an opportunity to succeed.
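One simple, hedged way to pick layoff candidates is to flag anything that hasn't been launched in a long time. The data source and the one-year cutoff below are assumptions for illustration; real last-used data would come from your inventory tool:

```python
from datetime import date, timedelta

# Hypothetical inventory: application name -> date it was last launched.
last_used = {
    "Visio 2007": date.today() - timedelta(days=10),
    "Old Report Tool": date.today() - timedelta(days=400),
    "Legacy CRM Client": date.today() - timedelta(days=700),
}

CUTOFF = timedelta(days=365)  # lay off anything unused for a year

# Applications to retire before designing the delivery strategy.
layoffs = [app for app, last in last_used.items()
           if date.today() - last > CUTOFF]
```

Reviewing the `layoffs` list with each department before deleting anything keeps the humans, not the script, making the final call.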

Third, develop an application delivery strategy. You can install, host, or stream. Do you need all three? Potentially. The point to remember is that you need to be flexible; certain strategies work better in certain situations. Think about it this way:

  • Certain applications will be used by 100% of your users. These are best served by installing them into the virtual desktop image. Why add another process (streaming or hosting) for an application that everyone uses every day?
  • Certain applications have a massive memory footprint. Executing such an application within each virtual desktop consumes a massive amount of RAM. However, if that application were hosted on XenApp, its DLLs and EXEs could be shared between users, reducing the overall memory footprint required.
  • Certain applications are used by a small group of users (1-2% of users). These might be best served via the hosting model on XenApp or via application streaming into the virtual desktop.
  • Certain applications go through constant updates (daily or weekly). Instead of maintaining hundreds or thousands of installations, a single application package that can be distributed to any device on demand would appear to be easier to maintain.
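The characteristics above can be sketched as a toy decision function. The thresholds are illustrative assumptions, not Citrix guidance, and a real strategy would weigh licensing, dependencies, and mobility as well:

```python
def delivery_strategy(pct_users: float, ram_mb: int, updates_per_month: int) -> str:
    """Pick a delivery model from rough application characteristics.

    Thresholds are hypothetical examples, not recommendations.
    """
    if pct_users >= 90:
        return "install in base image"     # used by nearly everyone, every day
    if ram_mb >= 2048:
        return "host on XenApp"            # share DLLs/EXEs across users
    if updates_per_month >= 4:
        return "stream single package"     # one package pushed to any device
    if pct_users <= 2:
        return "host on XenApp or stream"  # small user population
    return "evaluate case by case"
```

For example, a 95%-deployed office suite lands in the base image, while a weekly-patched line-of-business tool falls through to streaming.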

The point of all of this is that if you are going to be successful, you must have a strategy for delivering applications into the virtual desktop. That strategy also depends on how well your IT group can service user requests for all of these applications. If that is simply not possible, your alternative is to go down the Bring Your Own Computer (BYOC) route.

In the BYOC model, my physical desktop is maintained and managed by me. I'm not part of the domain, and I don't call support when I have an issue; I fix it myself. This also means the non-IT-delivered applications are installed on my own personal desktop. So far, this model has worked for me, but I'm a savvy user and know how to fix most of the issues I run into. This approach might be more difficult for those not used to self-supporting. Then again, if users installed their own applications, they are technically already self-supporting their non-IT-delivered applications.

Remember, the desktop is the easy part.  Spend your time looking at your application set and remember the following:

  1. Application Assessment
  2. Application Layoffs
  3. Application Delivery Strategy

What other application characteristics have you seen that would help determine your application delivery strategy?

More Stories By Daniel Feller

Daniel Feller, Lead Architect of Worldwide Consulting Solutions for Citrix, is responsible for providing enterprise-level architectures and recommendations for those interested in desktop virtualization and VDI. He is charged with helping organizations architect the next-generation desktop, including all flavors of desktop virtualization (hosted shared desktops, hosted VM-based desktops, hosted Blade PC desktops, local streamed desktops, and local VM-based desktops). Many of these desktop virtualization architecture decisions also focus on client hypervisors and application virtualization.

In his role, Daniel has provided insights and recommendations to many of the world's largest organizations.

In addition to private, customer-related work, Daniel's public initiatives include the creation of best practices, design recommendations, reference architectures, and training focused on core desktop virtualization concepts. Though he works behind the scenes, you can reach or follow Daniel via Twitter and on the Virtualize My Desktop site.

