
Balancing the Virtualization Equation

Get the most from your virtualized environment

Enterprises committed to a virtualization strategy need to ensure that management and automation of mission-critical IT systems and applications are included in their planning. Enterprises also need to establish procedures that allow them to maximize the benefits of consolidating to a virtualized platform and mitigate potential business risk across a landscape that has become abstract. Failure to do so will impact the success of projects and dilute the value of a virtualization strategy.

Spiraling energy costs, the need to squeeze extra IT power out of fixed data center footprints, and environmental concerns have shifted virtualization from a commodity tool to a center-stage role in the IT strategy of many organizations.

The history of virtualization can be traced back to the 1970s, when mainframe computers could be virtually partitioned to host multiple guest machines. It proved an ideal environment in which to install and configure new operating platforms, upgrade existing systems, and give software developers a sandbox for isolation testing. In its 21st-century incarnation, history has repeated itself, with virtualization usually starting life deep within the enterprise data center. IT operations and application development teams rapidly recognized the extra flexibility they could gain from not needing to procure extra hardware to service ad hoc processing demands or software testing.

With the shift from commodity to a center-stage role for virtualization, there is a corresponding shift in planning required to ensure that all IT layers in an enterprise are fully aligned to perform in a new virtualized landscape. In addition to ensuring that the underlying IT infrastructure components are in place each time a new virtual machine is provisioned, it's imperative that the business applications as well as the operational processes and procedures are fully established to provide the comprehensive set of services that end users rely on to do their jobs.

Factor
From an end-user or functional user perspective, whether an environment is virtualized or not is largely irrelevant. Such users simply expect their applications and programs to work - virtualization for them is a back-office, and therefore mostly unseen, technology. Planning for virtualization should strive to minimize apparent adverse impact on users' day-to-day activities.

Virtualization transforms a data center into a dynamic IT environment that can provide the flexibility and scalability needed to respond to the varying demands of a dynamic 24x7 global marketplace. However, while the ability to add and subtract processing capacity without needing to power up extra hardware offers enterprises greater agility, there are accompanying challenges that must be addressed.

Factor
An organization's current system monitoring tools are probably very good at monitoring server statistics (like CPU utilization, I/O, etc.) and raising alarms if certain thresholds are exceeded. In a virtualized environment, such alarms should be expected to initiate action that can start, stop, or move virtual machines within the environment to help alleviate the detected resource exception. Planning should consider how system monitors can take actions that modify the virtual environment.
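
As a minimal sketch of how a monitoring alarm might drive an action on the virtual environment, consider the following Python fragment; the `VirtualizationManager` class, the pool name, and the threshold value are hypothetical stand-ins for whatever hypervisor management API and policies an enterprise actually uses:

```python
# Minimal sketch of a monitor-to-action hook. VirtualizationManager is
# a hypothetical facade over a real hypervisor API (vSphere, libvirt, etc.).

CPU_THRESHOLD = 85.0  # percent; illustrative value only


class VirtualizationManager:
    """Hypothetical facade over a hypervisor's management API."""

    def start_standby_vm(self, pool: str) -> str:
        # In a real system this would call the hypervisor API.
        print(f"Starting standby VM from pool '{pool}'")
        return "vm-042"


def on_cpu_alarm(host: str, cpu_percent: float, mgr: VirtualizationManager) -> None:
    """React to a monitoring alarm by modifying the virtual environment."""
    if cpu_percent > CPU_THRESHOLD:
        vm_id = mgr.start_standby_vm(pool="overflow")
        print(f"Host {host} at {cpu_percent:.1f}% CPU; brought {vm_id} online")


on_cpu_alarm("host-01", 92.3, VirtualizationManager())
```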

As each new virtual machine is spawned, the IT operations team faces the challenge of recognizing that an extra machine is available and needs to be managed and monitored. This same team also assumes responsibility for manually routing workload to this additional resource, continually checking system performance, and being ready to respond to messages and resolve problems as and when they occur.

Factor
A long-running, complex business process is known to contain a large processing "spike" at a certain point. In a virtualized environment, additional virtual machines can be started just prior to the spike (and stopped just after) to provide additional processing horsepower. The orchestrator (personnel or product) of the business process should be expected to be sufficiently aware of the virtualized environment to note the additional virtual machine(s) and take advantage of them. Without that awareness, even with the flexibility to dynamically add horsepower, an important potential benefit of the virtualized environment is lost. Planning should look at how business process orchestrators can take actions that affect the virtual environment.
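
A sketch of what spike-aware orchestration could look like, assuming a simple sequential process; the step names, the provisioning calls, and the VM count are illustrative placeholders rather than any product's API:

```python
# Illustrative sketch of spike-aware orchestration. The process steps
# and the provision/decommission calls are hypothetical placeholders.

def run_step(name: str) -> None:
    print(f"Running step: {name}")


def provision_vms(count: int) -> list[str]:
    # Placeholder for a call to the hypervisor management layer.
    return [f"vm-{i}" for i in range(count)]


def decommission_vms(vms: list[str]) -> None:
    print(f"Stopping {len(vms)} temporary VMs")


STEPS = ["extract", "transform", "aggregate", "load"]
SPIKE_STEP = "aggregate"  # known heavy step in this process

for step in STEPS:
    if step == SPIKE_STEP:
        extra = provision_vms(count=3)  # add horsepower just before the spike
        run_step(step)
        decommission_vms(extra)         # release it immediately afterward
    else:
        run_step(step)
```

The point is the placement of the calls: capacity arrives just before the known spike and is released immediately after, which is only possible if the orchestrator knows the virtual environment exists.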

This increase in workload combined with the perennial lack of qualified, skilled personnel puts tremendous pressure on IT operations. Instead of continually trying to find, train, and retain staff, organizations need to incorporate the tribal operations management knowledge that has accumulated over many years into the fabric of their virtualized environments. Adopting an automated approach would not only reduce operational pressures; it would also mitigate business risk by reducing the exposure of critical systems and applications to unaccountable manual intervention.

Factor
Drilling down into the previous example - if personnel are responsible for orchestrating the business process, one can envision a very detailed and carefully written manual process document for them to follow to manage the spike, taking advantage of the established virtualized environment. The burden (what higher-value activity could a person be doing?) and risk (what if a person makes a mistake?) of such a manual procedure could be eliminated by using an automated orchestrator - but only insofar as the orchestrator is aware of and can interact with and control the virtualized environment. Again, without that awareness, an important potential benefit of the virtualized environment is lost. Planning should work to convert manual processes, to the greatest extent possible, into automated processes.

Ensuring that extra virtual machines are brought online to cater for peak processing demands, optimizing the distribution of batch jobs so they complete ahead of critical deadlines, and automatically responding to errors with corrective actions are just a few examples of workload management challenges in a virtualized world that automation can simplify. Beyond the infrastructure layer there's an equivalent set of tasks and procedures required to drive application processing, tasks that have traditionally relied on manual interaction by data center or end-user personnel. The virtualization of applications generates a similar set of challenges and requires equal attention if enterprises are to realize benefits throughout their IT landscape.

In virtualized environments, the fixed relationships between hardware, systems, and applications no longer exist. Hardwired, prescribed associations, ranging from a command sequence in an operations handbook to fixed parameters embedded in a piece of application code, can be interpreted differently in a virtualized world. Virtualization introduces an extra layer of abstraction between physical hardware devices and the software systems that an enterprise runs to support its business.

Factor
It's easy for a developer to write a program that runs well on a single server. However, without due consideration of the virtualized environment, it's all too likely that the same program won't run successfully across a landscape of virtual machines or hypervisors. Support for virtualized environments must be built into custom-developed code.
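
One hedged illustration of what "virtualization-aware" can mean at the code level: resolving execution targets at runtime instead of hardcoding a single host. The environment variable name here is an invented convention, not a standard:

```python
# Sketch: instead of hardcoding one server, resolve execution targets
# at runtime. APP_TARGET_HOSTS is a hypothetical naming convention.

import os


def execution_targets() -> list[str]:
    """Return the current set of VMs this program may run against."""
    # A virtualization-aware program reads its landscape from
    # configuration rather than assuming a fixed host.
    raw = os.environ.get("APP_TARGET_HOSTS", "localhost")
    return [h.strip() for h in raw.split(",") if h.strip()]


for host in execution_targets():
    print(f"Dispatching work unit to {host}")
```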

At the IT infrastructure management layer, there are IT housekeeping and administrative tasks that need to be executed: backups, snapshots, database clean-ups, file-transfer handling, and starting and stopping VMs. At the business application layer, there are functional processes and procedures that need to be undertaken: sales data uploads, order processing, invoicing, logistics, production, analytics and forecasting, finance and accounting, HR and customer care. Bringing together the execution of these activities ensures that everything around business and IT processes is properly managed and maintained. The scope of activities required will usually go well beyond the capability of an individual business application or systems management solution. Enterprises need to manage the suite of all interfaces around their virtual environments. They also need to be able to integrate the real and virtual environments in such a way that they can fully leverage the breadth and depth of functionality that can be derived from their core applications and operating platforms.
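
A toy sketch of what a single workflow definition spanning both layers might look like; the task names and their ordering are illustrative only:

```python
# Sketch of one workflow definition covering both IT housekeeping and
# business-application tasks; names and order are invented for illustration.

workflow = [
    {"layer": "infrastructure", "task": "snapshot_vms"},
    {"layer": "infrastructure", "task": "database_cleanup"},
    {"layer": "business",       "task": "sales_data_upload"},
    {"layer": "business",       "task": "order_processing"},
    {"layer": "business",       "task": "invoicing"},
]

for item in workflow:
    print(f"[{item['layer']}] executing {item['task']}")
```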

Factor
IT housekeeping and administrative applications certainly must be "virtualization-aware" - indeed, some of the IT housekeeping tasks listed above are included in various hypervisors (e.g., snapshots). Business applications such as ERP, CRM, BI and DW must also be aware - it would make no sense to bring another virtual machine online for a particular application if the application itself had no awareness of its virtualized environment. There's some opportunity for application consolidation in terms of the applications used for managing IT housekeeping, administration, and business applications. The distinctions have blurred between certain classes of applications (e.g., job schedulers, system managers, business process managers) to such a degree that one new application may be able to replace the functionality of two or more older applications (see the references to an "orchestrator" in other parts of this article). Planning must include the business applications and each one's unique requirements.

Forming logical associations and utilizing logical views when managing virtualized systems and applications will allow IT departments to achieve greater flexibility and agility. When automating anything from IT housekeeping procedures through to business processes such as a financial period-end close, creating a single, centralized set of policy definitions with embedded parameter variables not only ensures consistency and transparency across all virtual machines and hypervisors; it also reduces maintenance and administration overheads.
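
A minimal sketch of the idea, using Python's standard `string.Template` to stand in for a product's policy engine; the policy text and per-machine parameters are invented for illustration:

```python
# Sketch: one central policy definition with embedded parameter
# variables, resolved per machine at run time.

from string import Template

# Single policy definition maintained in one place...
BACKUP_POLICY = Template("backup --source $data_dir --target $backup_host --retain $days")

# ...instantiated per virtual machine with local parameter values.
machines = [
    {"data_dir": "/var/lib/app1", "backup_host": "bkp-eu", "days": "30"},
    {"data_dir": "/var/lib/app2", "backup_host": "bkp-us", "days": "7"},
]

for params in machines:
    print(BACKUP_POLICY.substitute(params))
```

Changing the policy then means editing one definition, not chasing copies across every virtual machine.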

Factor
Establishing a single metadata repository for such items as policy definitions, processing rules, and business processes is a positive step in any virtualized environment. If such a repository also holds data about the current state of the policies in force, which rules are in control, and processing status, then that data can be used predictively to determine what virtual resources might be needed in the near term and to take action to make those resources available. Effort should be spent planning how metadata can be used to allow proactive management of the virtual environment.
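
A simple sketch of predictive provisioning driven by repository state; the repository contents, the data layout, and the one-hour lead window are assumptions made for illustration:

```python
# Sketch: using repository state data predictively. The entries below
# stand in for what a real metadata repository might hold.

from datetime import datetime, timedelta

# Processes the repository knows are due to start, and the VMs each needs.
upcoming = [
    {"process": "period_end_close", "starts": datetime(2025, 1, 31, 22, 0), "vms_needed": 4},
    {"process": "nightly_backup",   "starts": datetime(2025, 1, 31, 23, 0), "vms_needed": 1},
]


def provision_ahead(now: datetime, lead: timedelta) -> None:
    """Bring resources online before they are needed, not after."""
    for entry in upcoming:
        if now <= entry["starts"] <= now + lead:
            print(f"Pre-provisioning {entry['vms_needed']} VMs for {entry['process']}")


provision_ahead(datetime(2025, 1, 31, 21, 30), lead=timedelta(hours=1))
```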

Establishing the availability of virtual resources, determining current system performance, and analyzing other metrics can all be used at runtime to optimize the routing and dispatching of workloads. Process definitions can be dynamically configured using parameter overrides to run on the hypervisor server best suited to ensure end-user SLAs are satisfied.
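
As a rough sketch, runtime dispatch might reduce to choosing the least-loaded hypervisor and overriding a static target parameter; the metric values and the override mechanism shown are illustrative, not a specific product's behavior:

```python
# Sketch: runtime dispatch to the least-loaded hypervisor. Metric values
# and the override mechanism are illustrative placeholders.

hypervisors = {
    "hv-east": {"cpu_load": 0.82, "free_mem_gb": 12},
    "hv-west": {"cpu_load": 0.35, "free_mem_gb": 48},
}


def best_target() -> str:
    """Pick the hypervisor currently best placed to meet the SLA."""
    return min(hypervisors, key=lambda h: hypervisors[h]["cpu_load"])


job = {"name": "order_batch", "target": "hv-east"}  # static default
job["target"] = best_target()                       # dynamic parameter override
print(f"Dispatching {job['name']} to {job['target']}")
```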

Factor
In the absence of an orchestrator to automate processing, system monitors can detect system events and raise alarms in a reactive fashion. Proactive and reactive attempts to modify the virtual environment are certainly valid. However, doing neither wastes some of the potential advantages of virtualization. Both proactive and reactive adjustments of the virtual environment should be planned for.

Securing and administering all process definitions in a centralized repository will support change control management. There's no need to manually check that script updates, necessary because a new version of a backup utility is being rolled out, have been propagated to all virtual machines. Critical activities that need to be run on virtual machines are protected against unauthorized updates and illegal use. Being able to maintain a record and report on all changes made to process definitions, as well as details of who executed what, where, when, and the outcome, supports enterprises in ensuring that their use of virtualization doesn't introduce additional operational risk and is compliant with IT governance strategy.

Factor
As highlighted earlier, automation provides a highly effective alternative to manual processes. If changes to the virtualized environment are automated (e.g., through predictive use of state data, automated responses to alarms, and planned changes in a business process), then one expectation should be a solid audit trail of actions taken by the automation orchestrator. Planning for compliance is a must.
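
A minimal sketch of such an audit trail, recording who executed what, where, when, and the outcome, as described above; the record fields mirror that list, while the in-memory list stands in for what would be a durable repository in practice:

```python
# Sketch: every automated change is recorded with who/what/where/when
# and the outcome. A plain list stands in for durable storage.

from datetime import datetime, timezone

audit_log: list[dict] = []


def record_action(actor: str, action: str, target: str, outcome: str) -> None:
    """Append one tamper-evident-style record per automated action."""
    audit_log.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "who": actor,
        "what": action,
        "where": target,
        "outcome": outcome,
    })


record_action("orchestrator", "start_vm", "hv-west/vm-042", "success")
for entry in audit_log:
    print(entry)
```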

Conclusion
Instead of dusting off an old IT operations run book and updating it to support a virtualization strategy, enterprises need to realize that embedding knowledge and experience into automated procedures not only simplifies management and control of a virtualized world; it can also ensure smart decisions are taken at the right time in the right context. An automated approach translates into improved throughput, greater accuracy, fewer errors, and less risk. Putting technology to work, allowing it to analyze resource utilization and respond instantaneously by provisioning extra resources in a virtualized environment, enhances productivity and throughput.

More Stories By Alex Givens

Alex Givens is a Senior Solutions Architect for UC4 Software, Inc., makers of UC4 Workload Automation Suite. For 13 years, Alex has helped organizations improve the efficiency and effectiveness of their business processing. Alex has spoken on business process automation at many international, national and regional conferences.
