
Breaking Through the Virtual Glass Ceiling

'Virtualization…all the cool, smart kids are doing it!'

"Virtualization...all the cool, smart kids are doing it!" Or at least that is the message being pushed by the virtualization vendors. Mostly this is a true statement. There are plenty of market studies that show that more than three quarters of medium to large enterprises are leveraging server virtualization technologies of some kind. What is not evident in that statement is that for most enterprises, their overall use of the technology is not as widespread as is implied by the statement. For many organizations, virtualization utilization is in its infancy (from a maturity of use perspective) or not a material percentage of the overall IT infrastructure.

Even with the exponential growth of virtualization usage over the last few years, the majority of deployments have been justified by simple consolidation ROI and confined to low-risk areas such as internal IT assets or test and development environments. Recently the term "VM Stall" has emerged, along with research identifying the top reasons why organizations deploy virtualization at a high rate initially but then slow the pace significantly - a virtual "glass ceiling" built on a number of common factors. The good news is that all of these concerns are addressable: organizations can continue deploying virtualization and reap not only the simple consolidation ROI, but also start moving toward the operational efficiencies promised by utility or cloud computing. The following sections outline the most common areas of concern and offer guidance on how organizations can break through that ceiling and achieve their virtualization goals.

Cloud Confusion
Cloud computing (or rather the vendors pushing it) must take some responsibility for putting the brakes on virtualization deployments. For over a year now, VMware has shifted its marketing toward "cloud," and every ecosystem software vendor has followed suit with some angle on how its software enables cloud. The problem is that companies that are not sophisticated virtualization users receive confusing and sometimes conflicting messages, not just about what cloud is, but about how it can be used. As a result, some CIOs are pausing to examine their virtualization efforts and attempting to reconcile them with cloud. Do I even need cloud? When? Do my current virtualization plans conflict with or complement a cloud future? These are all valid questions that unfortunately end with answers like, "It depends." Pausing to ask them is not in itself a bad idea, but the mixed messages around cloud make reaching an informed decision complicated. My suggestion for organizations stuck in cloud confusion: move forward and keep virtualizing! Virtualization's benefits alone warrant moving forward, and based on the road maps of the major hypervisor vendors, the ability to leverage cloud from an existing virtualization deployment will be a key feature of each vendor's solution.

Security Concerns
Another key issue preventing some applications from being deployed in virtual environments revolves around security and compliance. Applications handling health care records or financial transactions, which fall under the regulatory control of HIPAA or PCI, have been the ones most affected by these concerns. This is partially due to a lack of specific requirements (in the case of HIPAA) or ambiguous language (PCI DSS) in the documentation. The PCI Council has addressed the virtualization concerns in the latest release of the DSS [1] and in its accompanying guidance document,[2] which clarifies that the use of virtualization is not excluded from PCI environments. There is also concern over virtualization being "different." Though it is true that virtualization adds a new layer to the IT stack, all of the traditional security domains still apply: access control, network inspection, segmentation, and separation of duties are all required in the virtual datacenter. For some of these, existing security measures and software will suffice. In addition, as virtualization has matured, the hypervisor vendors and the surrounding ecosystem have begun to offer solutions addressing these areas with specific attention to the virtual domain. Though virtualization is somewhat different, what's needed is not new security practices but new approaches to the same practices - approaches that may require virtualization-specific software and staff knowledgeable enough in both virtualization and security to apply the familiar concepts appropriately.
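
As a rough illustration of applying a traditional control at the virtual layer, the sketch below runs a classic segmentation audit against a VM inventory. It is a minimal Python example under stated assumptions: the VM records, the PCI-scope flag, and the segment names are hypothetical placeholders, and a real tool would pull this inventory from the hypervisor's management API rather than hard-coding it.

```python
# Minimal sketch: a segmentation audit moved from the physical firewall
# layer to the virtual switch layer. All names below are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class VirtualMachine:
    name: str
    port_group: str    # virtual network the VM's vNIC attaches to
    pci_scoped: bool   # flagged as in-scope for PCI DSS

# Port groups designated by policy as PCI-segmented (assumed names).
PCI_SEGMENTS = {"pci-dmz", "pci-app", "pci-db"}

def segmentation_findings(inventory):
    """Flag VMs whose compliance scope and network placement disagree."""
    findings = []
    for vm in inventory:
        on_pci_net = vm.port_group in PCI_SEGMENTS
        if vm.pci_scoped and not on_pci_net:
            findings.append(f"{vm.name}: PCI workload on open segment '{vm.port_group}'")
        elif not vm.pci_scoped and on_pci_net:
            findings.append(f"{vm.name}: out-of-scope VM inside PCI segment '{vm.port_group}'")
    return findings

if __name__ == "__main__":
    vms = [
        VirtualMachine("billing-db", "pci-db", True),   # correctly placed
        VirtualMachine("wiki", "pci-app", False),       # violates segmentation
        VirtualMachine("card-gateway", "dmz", True),    # violates segmentation
    ]
    for finding in segmentation_findings(vms):
        print(finding)
```

The point of the sketch is the one made above: the rule itself is the same one auditors have always applied to physical networks; only the layer it is enforced at, and the data source it reads from, are virtualization-specific.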

Staffing Issues
Staffing is next on the list of concerns that slow adoption, and it's not just security that demands a specific virtualization skill set. In many enterprises, virtualization knowledge is held within a small team that sits apart from the traditional data center groups for infrastructure, security, storage, and applications - yet virtualization is unique in that its domain crosses all of those groups. The typical scenario: even though an enterprise has been successfully growing its virtual environment in both size and sophistication, the team has not kept pace. At some point the team loses the ability to manage the environment to its fullest, which manifests as a poorly performing environment or a reluctance to take on more applications - either of which slows or stops growth. The way to prevent this situation is first to accept that successful virtualization requires much greater cooperation among what were traditionally siloed technology groups - easier said than done, but an absolute necessity. That cooperation allows responsibility to be shared and management of portions of the virtual domain to be distributed to the groups best positioned to own them. Over time, the virtualization team becomes the architects and conductors of the virtual environment, while day-to-day management passes to the sub-domain experts.

In addition to a shift in datacenter staffing strategy, it's important to adopt management tools that enable this cross-domain approach. The virtualization ecosystem is filled with software tailored to specific problems or tasks, and when there is no alternative, IT departments must purchase very specific point solutions. The downside to point solutions is that in quantity they become cumbersome to manage: each product carries its own learning curve, and the combined costs can reach the point where operational expense cuts too deeply into the expected returns of virtualization. The large software vendors and some forward-thinking smaller vendors are aware of this issue. While large vendors buy up point solutions to integrate into their management portfolios, smaller vendors are busy building virtualization-specific, cross-domain management tools. Both approaches are valid: there are many knobs and dials in virtual management, and different staff in very different roles will often need access to the same controls and information. The choice is between buying different tools for each group, driving up operating expense, or deploying software designed for the virtual datacenter that provides both consolidated management and domain-specific controls and views. When choosing tools, look for real integration of the tool sets in both user interface and reporting; the power of such tools lies in leveraging data across multiple control planes through a common interface, as the sketch below illustrates.
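
To make the "common interface over multiple control planes" idea concrete, here is a minimal Python sketch: each domain exposes its status through one shared shape, so a single report can span compute, storage, and network. The adapter classes and the data they return are hypothetical placeholders, not real vendor APIs.

```python
# Minimal sketch of cross-domain management: one shared interface,
# many domain-specific adapters. All adapters and data are hypothetical.

from typing import Dict, List, Protocol

class ControlPlane(Protocol):
    domain: str
    def health_report(self) -> Dict[str, str]: ...

class HypervisorAdapter:
    domain = "compute"
    def health_report(self) -> Dict[str, str]:
        return {"esx-01": "ok", "esx-02": "memory ballooning"}

class StorageAdapter:
    domain = "storage"
    def health_report(self) -> Dict[str, str]:
        return {"datastore-a": "ok", "datastore-b": "85% full"}

class NetworkAdapter:
    domain = "network"
    def health_report(self) -> Dict[str, str]:
        return {"dvswitch-1": "ok"}

def consolidated_view(planes: List[ControlPlane]) -> None:
    """One report across domains - what separate point tools would each
    show in their own console, correlated in a single place."""
    for plane in planes:
        for resource, status in plane.health_report().items():
            print(f"[{plane.domain}] {resource}: {status}")

if __name__ == "__main__":
    consolidated_view([HypervisorAdapter(), StorageAdapter(), NetworkAdapter()])
```

The design point is that a new domain plugs in by implementing the same small interface rather than by adding another standalone console - the integration "in both user interface and reporting" the paragraph above recommends looking for.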

There is strong evidence that enterprises reach a level of virtualization deployment at which the process slows down, and that cloud confusion, security and compliance, enterprise and staff maturity, and manageability are top of mind for many CIOs. Virtualization and cloud computing are a paradigm shift, and the common thread in all of the suggestions above is a change in approach and thinking about how datacenters are managed - a change that will enable organizations to break through the virtualization glass ceiling.



More Stories By Mike Wronski

Mike Wronski is Vice President of Product Management for Reflex Systems, bringing more than 15 years of industry experience to the role. His broad IT experience, ranging from large-carrier data networking to virtualization, stems from senior roles previously held at Starent Networks (now part of Cisco), Cambia Networks, 3Com and USRobotics. Wronski holds CISSP and Certified Ethical Hacker certifications, as well as an MBA and a Bachelor of Science in Computer Engineering from Florida International University.

