Balancing the Virtualization Equation

Get the most from your virtualized environment

Enterprises committed to a virtualization strategy need to ensure that management and automation of mission-critical IT systems and applications are included in their planning. Enterprises also need to establish procedures that allow them to maximize the benefits of consolidating to a virtualized platform and mitigate potential business risk across a landscape that has become abstract. Failure to do so will impact the success of projects and dilute the value of a virtualization strategy.

Spiraling energy costs, the need to squeeze extra IT power out of a fixed data center footprint, and environmental concerns have shifted virtualization from a commodity tool to a center-stage role in the IT strategy of many organizations.

The history of virtualization can be traced back to the 1970s, when mainframe computers could be virtually partitioned to host multiple guest machines. It proved an ideal environment in which to install and configure new operating platforms, upgrade existing systems, and give software developers a sandbox for isolation testing. In its 21st-century incarnation, history has repeated itself, with virtualization usually starting life deep within the data center. IT operations and application development teams rapidly recognized the extra flexibility they could gain by not needing to procure extra hardware to service ad hoc processing demands or software testing.

With the shift from commodity to a center-stage role for virtualization, there is a corresponding shift in planning required to ensure that all IT layers in an enterprise are fully aligned to perform in a new virtualized landscape. In addition to ensuring that the underlying IT infrastructure components are in place each time a new virtual machine is provisioned, it's imperative that the business applications as well as the operational processes and procedures are fully established to provide the comprehensive set of services that end users rely on to do their jobs.

Factor
From an end-user or functional user perspective, whether an environment is virtualized or not is largely irrelevant. Such users simply expect their applications and programs to work - virtualization for them is a back-office, and therefore mostly unseen, technology. Planning for virtualization should strive to minimize apparent adverse impact on users' day-to-day activities.

Virtualization transforms a data center into a dynamic IT environment that can provide the flexibility and scalability capable of responding to the varying demands driven by a dynamic 24x7 global marketplace. However, while the ability to add and subtract processing capacity without needing to power up extra hardware offers enterprises greater agility, there are accompanying challenges that require addressing.

Factor
An organization's current system monitoring tools are probably very good at monitoring server statistics (like CPU utilization, I/O, etc.) and raising alarms if certain thresholds are exceeded. In a virtualized environment, such alarms should be expected to initiate action that can start, stop, or move virtual machines within the environment to help alleviate the detected resource exception. Planning should consider how system monitors can take actions that modify the virtual environment.
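
To make that concrete, the sketch below shows a monitoring loop that reacts to a CPU threshold breach by resizing a pool of virtual machines rather than just raising an alarm. The get_cpu_utilization, start_vm, and stop_vm calls are hypothetical placeholders for whatever a site's monitoring tool and hypervisor actually expose, and the thresholds are purely illustrative.

```python
# Minimal sketch: a reactive monitor that adjusts the virtual environment.
# get_cpu_utilization(), start_vm() and stop_vm() are hypothetical stand-ins
# for whatever the site's monitoring and hypervisor APIs actually provide.
import time

CPU_HIGH = 85.0      # percent - scale out above this
CPU_LOW = 20.0       # percent - scale in below this
CHECK_INTERVAL = 60  # seconds between samples


def get_cpu_utilization(pool: str) -> float:
    """Placeholder: return the average CPU utilization of a VM pool."""
    raise NotImplementedError("wire this to your monitoring tool")


def start_vm(pool: str) -> None:
    """Placeholder: provision one more virtual machine in the pool."""
    raise NotImplementedError("wire this to your hypervisor")


def stop_vm(pool: str) -> None:
    """Placeholder: retire one virtual machine from the pool."""
    raise NotImplementedError("wire this to your hypervisor")


def monitor(pool: str) -> None:
    """React to threshold breaches by resizing the pool, not just alarming."""
    while True:
        cpu = get_cpu_utilization(pool)
        if cpu > CPU_HIGH:
            start_vm(pool)   # alleviate the detected resource exception
        elif cpu < CPU_LOW:
            stop_vm(pool)    # release capacity that is no longer needed
        time.sleep(CHECK_INTERVAL)
```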

As each new virtual machine is spawned, the IT Operations team is left with the challenge of recognizing that there is an extra machine available that requires managing and monitoring. This same team also assumes responsibility for manually routing workload to this additional resource, continually checking systems performance and being ready to respond to messages and resolve problems as and when they occur.

Factor
A long-running, complex business process is known to contain a large processing "spike" at a certain point. In a virtualized environment, additional virtual machines can be started just prior to the spike (and stopped just after) to provide additional processing horsepower. The orchestrator (personnel or product) of the business process should be expected to be sufficiently aware of the virtualized environment to note the additional virtual machine(s) and take advantage of them. Without that awareness, even with the flexibility to dynamically add horsepower, an important potential benefit of the virtualized environment is lost. Planning should look at how business process orchestrators can take actions that affect the virtual environment.
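
A minimal sketch of that kind of awareness, with the process steps and the scale_out/scale_in helpers invented purely for illustration, might look like this:

```python
# Minimal sketch: an orchestrator that is aware of the virtualized environment
# and brackets a known processing spike with extra capacity.
# run_step(), scale_out() and scale_in() are hypothetical placeholders.

def run_step(name: str) -> None:
    """Placeholder: execute one step of the business process."""
    print(f"running step: {name}")


def scale_out(count: int) -> None:
    """Placeholder: ask the hypervisor layer for additional virtual machines."""
    print(f"starting {count} extra virtual machines")


def scale_in(count: int) -> None:
    """Placeholder: release the extra virtual machines."""
    print(f"stopping {count} extra virtual machines")


def run_month_end_process() -> None:
    run_step("extract sales data")
    run_step("validate and load")

    scale_out(4)                       # provision just prior to the known spike
    try:
        run_step("consolidation run")  # the processing spike
    finally:
        scale_in(4)                    # always release the capacity afterwards

    run_step("publish reports")


if __name__ == "__main__":
    run_month_end_process()
```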

This increase in workload combined with the perennial lack of qualified, skilled personnel puts tremendous pressure on IT operations. Instead of continually trying to find, train, and retain staff, organizations need to incorporate the tribal operations management knowledge that has accumulated over many years into the fabric of their virtualized environments. Adopting an automated approach would not only reduce operational pressures; it would also mitigate business risk by reducing the exposure of critical systems and applications to unaccountable manual intervention.

Factor
Drilling down into the previous example: if personnel are responsible for orchestrating the business process, one can envision a very detailed and carefully written manual process document for them to follow to manage the spike, taking advantage of the established virtualized environment. The burden (what higher-value activity could a person be doing?) and risk (what if a person makes a mistake?) of such a manual procedure could be eliminated by using an automated orchestrator - but only insofar as the orchestrator is aware of, and can interact with and control, the virtualized environment. Again, without that awareness, an important potential benefit of the virtualized environment is lost. Planning should work to convert or translate manual processes (to the greatest extent possible) into automated processes.

Ensuring that extra virtual machines are brought online to handle peak processing demands, optimizing the distribution of batch jobs so they complete ahead of critical deadlines, and automatically responding to errors with corrective actions are just a few examples of workload management challenges in a virtualized world that can be simplified through automation. Beyond the infrastructure layer there is an equivalent set of tasks and procedures required to drive application processing, which have traditionally relied on manual interaction by data center or end-user personnel. The virtualization of applications generates a similar set of challenges and requires equal attention if enterprises are to realize benefits throughout their IT landscape.

In virtualized environments, the fixed relationships between hardware, systems, and applications no longer exist. Hardwired, prescribed associations, ranging from a command sequence in an operations handbook to fixed parameters embedded in a piece of application code, can be interpreted differently in a virtualized world. Virtualization introduces an extra layer of abstraction between physical hardware devices and the software systems that an enterprise runs to support its business.

Factor
It's easy for a developer to write a program that runs well on a single server. However, without due consideration of the virtualized environment, it's all too likely that that same program won't run successfully across a landscape of virtual machines or hypervisors. Support for virtualized environments must be built into custom-developed code.
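
A simple illustration of the difference is a program that discovers its targets at run time instead of hardcoding a single server. The discover_workers and submit helpers below are hypothetical placeholders for however a given environment exposes its current set of virtual machines.

```python
# Minimal sketch: making custom code aware that its "server" may be any
# number of virtual machines. discover_workers() and submit() are
# hypothetical placeholders for the site's own environment services.
import os
from typing import List


def discover_workers() -> List[str]:
    """Placeholder: return the virtual machines currently available.

    Here it just reads a comma-separated environment variable so the
    list can change without touching the code.
    """
    return [h for h in os.environ.get("WORKER_HOSTS", "").split(",") if h]


def submit(host: str, task: str) -> None:
    """Placeholder: send a unit of work to one worker."""
    print(f"dispatching {task!r} to {host}")


def process(tasks: List[str]) -> None:
    workers = discover_workers()
    if not workers:
        raise RuntimeError("no workers available - environment not configured")
    # Spread the work across whatever machines exist right now,
    # instead of assuming a single, fixed server.
    for i, task in enumerate(tasks):
        submit(workers[i % len(workers)], task)


if __name__ == "__main__":
    os.environ.setdefault("WORKER_HOSTS", "vm-01,vm-02")  # demo values only
    process(["invoice-batch-1", "invoice-batch-2", "invoice-batch-3"])
```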

At the IT infrastructure management layer, there are housekeeping and administrative tasks that need to be executed: backups, snapshots, database clean-ups, file-transfer handling, and starting and stopping VMs. At the business application layer, there are functional processes and procedures that need to be undertaken: sales data uploads, order processing, invoicing, logistics, production, analytics and forecasting, finance and accounting, HR, and customer care. Bringing together the execution of these activities ensures that business and IT processes are properly managed and maintained. The scope of activities required will usually go well beyond the capability of an individual business application or systems management solution. Enterprises need to manage the full suite of interfaces around their virtual environments. They also need to be able to integrate the real and virtual environments in such a way that they can fully leverage the breadth and depth of functionality available from their core applications and operating platforms.
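
As a rough sketch of what chaining those layers together can look like, the example below puts infrastructure housekeeping steps and business application steps into a single workflow definition. The step names and the run helper are illustrative assumptions, not any particular scheduler's syntax.

```python
# Minimal sketch: one workflow definition spanning the infrastructure layer
# and the business application layer. The individual actions are hypothetical
# placeholders for real backup, file-transfer and ERP steps.

WORKFLOW = [
    ("snapshot_vm",         {"vm": "erp-app-01"}),         # infrastructure layer
    ("transfer_sales_file", {"source": "branch-upload"}),  # file-transfer handling
    ("load_sales_orders",   {"system": "ERP"}),            # business application layer
    ("run_invoicing",       {"system": "ERP"}),
    ("cleanup_staging_db",  {"database": "staging"}),      # housekeeping again
]


def run(action: str, params: dict) -> None:
    """Placeholder: hand one step to whatever executes it (script, API, agent)."""
    print(f"executing {action} with {params}")


def execute(workflow) -> None:
    """Run the steps in order; an exception stops the chain so nothing
    downstream operates on incomplete data."""
    for action, params in workflow:
        run(action, params)


if __name__ == "__main__":
    execute(WORKFLOW)
```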

Factor
IT housekeeping and administrative applications certainly must be "virtualization-aware" - indeed, some of the IT housekeeping tasks listed above are included in various hypervisors (e.g., snapshots). Business applications such as ERP, CRM, BI and DW must also be aware - it would make no sense to bring another virtual machine online for a particular application if the application itself had no awareness of its virtualized environment. There's some opportunity for application consolidation in terms of the applications used for managing IT housekeeping, administration, and business applications. The distinctions have blurred between certain classes of applications (e.g., job schedulers, system managers, business process managers) to such a degree that one new application may be able to replace the functionality of two or more older applications (see the references to an "orchestrator" in other parts of this article). Planning must include the business applications and each one's unique requirements.

Forming logical associations and utilizing logical views when managing virtualized systems and applications will allow IT departments to achieve greater flexibility and agility. When automating anything from IT housekeeping procedures through to business processes such as the financial period-end close, creating a single, centralized set of policy definitions with embedded parameter variables not only ensures consistency and transparency across all virtual machines and hypervisors - it also reduces maintenance and administration overheads.
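
One way to picture a single set of policy definitions with embedded parameter variables is a template maintained in one place and resolved per target at execution time. The variable names, paths, and targets below are invented for illustration only.

```python
# Minimal sketch: one central policy definition with embedded parameter
# variables, resolved per virtual machine at execution time.
from string import Template

# A single definition maintained in one place...
BACKUP_POLICY = Template(
    "backup --source $data_path --target $backup_share/$vm_name "
    "--retention $retention_days"
)

# ...and per-target variables kept as data, not as copied-and-edited scripts.
TARGETS = [
    {"vm_name": "erp-app-01", "data_path": "/var/erp",
     "backup_share": "/backups/prod", "retention_days": 30},
    {"vm_name": "crm-app-02", "data_path": "/var/crm",
     "backup_share": "/backups/prod", "retention_days": 14},
]

for target in TARGETS:
    command = BACKUP_POLICY.substitute(target)
    print(command)  # in practice this would be dispatched to the VM, not printed
```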

Factor
Establishing a single metadata repository for such items as policy definitions, processing rules, and business processes is a positive step in any virtualized environment. If that repository also holds data about the current state of play - which policies are in force, which rules are in control, and the processing status - then that data can be used predictively to determine what virtual resources might be needed in the near term and to take action to make those resources available. Effort should be spent planning how metadata can be used to allow proactive management of the virtual environment.
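
A sketch of that predictive use of repository state follows; the throughput figure, the shape of the state record, and the provision call are all assumptions made for illustration.

```python
# Minimal sketch: using repository state in a predictive way to pre-provision
# capacity before a deadline is missed. All figures and calls are illustrative.
import math

JOBS_PER_VM_PER_HOUR = 50   # assumed throughput of one virtual machine


def forecast_vms_needed(queued_jobs: int, deadline_hours: float) -> int:
    """VMs required to clear the queued work before the deadline."""
    return math.ceil(queued_jobs / (deadline_hours * JOBS_PER_VM_PER_HOUR))


def provision(count: int) -> None:
    """Placeholder: ask the hypervisor layer for additional machines."""
    print(f"provisioning {count} additional virtual machine(s)")


def plan_capacity(state: dict) -> None:
    """Compare the forecast against what is already running and act ahead of time."""
    needed = forecast_vms_needed(state["queued_jobs"], state["hours_to_deadline"])
    shortfall = needed - state["running_vms"]
    if shortfall > 0:
        provision(shortfall)


if __name__ == "__main__":
    # Example state as it might be read from the metadata repository.
    plan_capacity({"queued_jobs": 1200, "hours_to_deadline": 4, "running_vms": 3})
```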

The availability of virtual resources, current systems performance, and other metrics can be assessed at runtime to optimize the routing and dispatching of workloads. Process definitions can be dynamically configured using parameter overrides to run on the hypervisor server best suited to ensuring end-user SLAs are satisfied.
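
For example, a dispatcher might consult current host metrics at execution time and apply a parameter override so the job lands on the least-loaded hypervisor. The get_host_metrics and dispatch calls below are hypothetical placeholders for a site's monitoring and execution interfaces.

```python
# Minimal sketch: runtime dispatch of a workload to the hypervisor host
# best suited to meet the SLA. get_host_metrics() and dispatch() are
# hypothetical placeholders; the sample figures are illustrative.
from typing import Dict


def get_host_metrics() -> Dict[str, float]:
    """Placeholder: current CPU utilization (percent) per hypervisor host."""
    return {"hyp-a": 78.0, "hyp-b": 35.0, "hyp-c": 52.0}


def dispatch(job: str, host: str) -> None:
    """Placeholder: run the job on the chosen host via a parameter override."""
    print(f"running {job!r} on {host}")


def route(job: str) -> None:
    metrics = get_host_metrics()
    # Pick the least-loaded host at the moment of execution, rather than
    # a host fixed permanently in the process definition.
    best_host = min(metrics, key=metrics.get)
    dispatch(job, best_host)


if __name__ == "__main__":
    route("nightly-forecast")
```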

Factor
In the absence of an orchestrator to automate processing, system monitors can detect system events and raise alarms in a reactive fashion. Proactive and reactive attempts to modify the virtual environment are certainly valid. However, doing neither wastes some of the potential advantages of virtualization. Both proactive and reactive adjustments of the virtual environment should be planned for.

Securing and administering all process definitions in a centralized repository will support change control management. There's no need to manually check that script updates, necessary because a new version of a backup utility is being rolled out, have been propagated to all virtual machines. Critical activities that need to be run on virtual machines are protected against unauthorized updates and illegal use. Being able to maintain a record and report on all changes made to process definitions, as well as details of who executed what, where, when, and the outcome, supports enterprises in ensuring that their use of virtualization doesn't introduce additional operational risk and is compliant with IT governance strategy.
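
A minimal sketch of such an execution record follows; the field names and the append-only log file are assumptions for illustration, not a specific product's audit format.

```python
# Minimal sketch: recording who executed what, where, when, and the outcome,
# every time a process definition runs. Fields and storage are illustrative.
import getpass
import json
from datetime import datetime, timezone

AUDIT_LOG = "audit_trail.jsonl"   # append-only record, one JSON object per line


def record_execution(definition: str, target_vm: str, outcome: str) -> None:
    entry = {
        "who": getpass.getuser(),
        "what": definition,
        "where": target_vm,
        "when": datetime.now(timezone.utc).isoformat(),
        "outcome": outcome,
    }
    with open(AUDIT_LOG, "a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")


# Example usage after a housekeeping task completes on one virtual machine.
record_execution("db-cleanup-v3", "erp-app-01", "success")
```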

Factor
As highlighted earlier, automation provides a highly effective alternative to manual processes. If changes to the virtualized environment are automated (e.g., through predictive use of state data, automated response to alarms, and planned changes in a business process), then one expectation should be a solid audit trail of the actions taken by the automation orchestrator. Planning for compliance is a must.

Conclusion
Instead of dusting off an old IT operations run book and updating it to support a virtualization strategy, enterprises need to realize that embedding knowledge and experience into automated procedures not only simplifies management and control of a virtualized world; it also ensures smart decisions are taken at the right time in the right context. An automated approach translates into improved throughput, greater accuracy, fewer errors, and less risk. Putting technology to work - letting it analyze resource utilization and respond instantaneously by provisioning extra resources in the virtualized environment - enhances productivity and throughput.

More Stories By Alex Givens

Alex Givens is a Senior Solutions Architect for UC4 Software, Inc., makers of UC4 Workload Automation Suite. For 13 years, Alex has helped organizations improve the efficiency and effectiveness of their business processing. Alex has spoken on business process automation at many international, national and regional conferences.
