Virtualization: A Promising and Justifiable Investment

Exclusive Q&A with Bala Murugan, Chief Architect, eG Innovations

"There is a shift in focus and it is from technologies that enable virtualization to technologies that manage virtualization," says Bala Murugan, Chief Architect at eG Innovations, in this Exclusive Q&A with SYS-CON's Virtualization Journal. Overall, Murugan maintains, virtualization is "a promising and justifiable investment, particularly in the current economic downturn."

Virtualization Journal: Do you agree with the view that Virtualization is one of the most promising technology investments in the current economic downturn?

Bala Murugan: Virtualization, when done right, has been proven to deliver significant reductions in direct costs. It also reduces indirect costs by improving your IT’s performance, reliability and capacity management. So yes, I would say it is a promising and justifiable investment, particularly in the current economic downturn.


Virtualization Journal: How about your concept of “Virtualization 2.0” – doesn’t it implicitly suggest that Virtualization 1.0 has been deficient?

Murugan: On the contrary, it refers to the evolution of the virtualization industry. Virtualization 1.0 was a revelation; it introduced virtualization to the world, proved its power and showed everyone how much they could benefit from it. Virtualization 2.0 – which is already here – is about accepting virtualization as reality and moving on to how to do it right and how to get the most out of it. Essentially, there is a shift in focus and it is from technologies that enable virtualization to technologies that manage virtualization.

To be successful in Virtualization 2.0, organizations have to focus on technologies that help them manage their virtualization deployments better. As a monitoring technology provider, we understand the complexities of monitoring in Virtualization 2.0 and are well positioned to help these organizations realize the full potential of their virtualized infrastructures.


Virtualization Journal: Are you at all concerned that the “2.0” label might detract from the overall value proposition, given that it seems to be going down with the USS Economy? ;-)

Murugan: We view Virtualization 2.0 as an evolution – the next phase – not as a radical revamp of current virtualization deployments. In Virtualization 2.0, the focus is on how to make virtualization deployments more cost-effective and how to gain maximum benefit from them. So this will actually make virtualization a mandatory technology for most organizations that are dealing with tight budgets in the economic slowdown.


Virtualization Journal: How about interoperability – how important is that for the industry, do you think? What barriers persist?

Murugan: We live in an age of diverse infrastructures. Even before virtualization, the success of n-tier architectures and open systems made it impossible to have a homogeneous environment. Data centers today comprise diverse technologies that have to co-exist to deliver IT services. Virtualization has taken this another step down the evolutionary road; now we are talking about adding a couple more tiers to n-tier applications by separating the hardware from the OS. At this juncture, we believe that interoperability is not a “nice to have.” It is a “must have.”

In terms of barriers, the ones that remain are mostly technological, and people are working to overcome them. In principle, I believe everyone agrees interoperability is a must have. Organizations not only have to deal with a mix of virtual and non-virtual infrastructures, but also with different types of virtualization from different vendors. The key, we found, is to provide a unified, consistent view across this diverse landscape, which makes management that much easier for the end user.


Virtualization Journal: Do you think VMware needs to fear Microsoft’s belated entry into the virtualization marketplace?

Murugan: History has shown that Microsoft can be a significant threat in any endeavor it puts its mind to. They will have good technology, and they will resort to their favorite ploy – their licensing model – making virtualization more of a commodity than it already is.

VMware itself has recognized that the hypervisor is no longer going to be the differentiator, and that technologies that enable the effective use of virtualization (e.g., manageability) and new application deployment models (like virtual desktops) will be key to retaining its leadership position.

Competition in this space can only be good – innovation will be faster, and there is certainly room for multiple vendors in this fast-growing market.


Virtualization Journal: How about eG Innovations – what’s the background story to the company’s formation and growth to date?

Murugan: eG Innovations was founded by Srinivas Ramanathan, who is also our president and CEO. Prior to eG, he was a research scientist at HP and the chief architect of Firehunter, an ISP performance monitoring solution. His years at HP gave him a ringside seat to the real pain points customers have with monitoring their environments, and with the monitoring tools themselves. In 2000, he left HP to build the proverbial “better mousetrap,” and assembled a strong team, including myself, to take the concept from the ground up. That was the genesis of eG Innovations.

Our focus was on monitoring n-tier architectures by looking at them as business services as opposed to a collection of servers, networks and applications. Our key benefit to the customer was our ability to proactively identify the right problem – the true root cause of poor performance in their IT infrastructure. As a result, customers spent less time firefighting and finger-pointing, and more time improving their overall service levels. It took a couple of years to roll out the finished product, and we got VC funding from Singapore. We then opened up the US market in 2002 and found a receptive audience for the technology. We quickly became the premier monitoring solution for Citrix, which had all the classic n-tier architecture issues. We won many awards and saw the company grow across the globe.

We saw the opportunity in the virtualization space quite early and started working with early virtualization adopters to better understand their needs and to strengthen our technology. Our mastery of thin-client computing and shared-access technologies (Citrix, Microsoft Terminal Services, etc.) helped, because a virtualization ecosystem (one box – multiple OSs) is similar to a Citrix ecosystem (one OS – multiple users). Several awards later, we are now recognized as one of the industry leaders in the virtualization monitoring space, with support for different virtualization platforms including VMware, Citrix Xen, Solaris Containers/LDOMs and more.


Virtualization Journal: What are the main pain points that bring customers to you in search of a monitoring solution?

Murugan: The biggest single pain point is probably problem isolation. When there is a problem in your n-tier IT infrastructure, it is usually pretty hard to distinguish between the true root cause and the effects. With systems being interdependent, a single problem generally causes a ripple effect that flows through the entire environment, leading you to chase effects as opposed to pinpointing the root cause. In simple terms, this means you are wasting valuable IT resources in fire-fighting mode fixing effects, which leads to finger pointing inside the organization. Meanwhile, your customers are still facing the problem. Virtualization only increases the complexity of your n-tier IT delivery, which makes problem isolation even more difficult.
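The distinction Murugan draws between the true root cause and its ripple effects can be sketched in code. The following is an illustrative sketch only, not eG Enterprise's actual algorithm: given a hypothetical service dependency graph, an alerting component whose own dependencies are all healthy is a root-cause candidate, while alerts on components that depend on an alerting component are likely effects.

```python
# Illustrative root-cause isolation over a service dependency graph.
# The graph and alert set below are hypothetical examples.

def isolate_root_causes(depends_on, alerting):
    """depends_on maps component -> list of components it relies on.

    A component is a root-cause candidate if it is alerting but none
    of its own dependencies are alerting (i.e., it is the deepest
    alerting node on its dependency chain).
    """
    roots = []
    for comp in alerting:
        deps = depends_on.get(comp, [])
        if not any(d in alerting for d in deps):
            roots.append(comp)
    return sorted(roots)

# A web tier depends on an app tier, which depends on a database
# and the network; the database also depends on the network.
graph = {
    "web": ["app"],
    "app": ["db", "network"],
    "db": ["network"],
}
# A single network fault makes every tier alert at once.
alerts = {"web", "app", "db", "network"}
print(isolate_root_causes(graph, alerts))  # -> ['network']
```

Filtering alerts this way is what lets an operator fix the one real fault instead of chasing the three symptomatic ones.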

Another key pain point that we see customers face is lack of visibility into their IT infrastructures. Even though it sounds simple enough, more often than not customers today don’t have total visibility into what is going on within their virtualized infrastructures. When you are managing a virtualized environment you definitely need answers to questions like: “How many guests are you running?” “How many guests are just consuming resources without being used?” “Where are the bottlenecks in the environment?” “Where do you stand on capacity?” “How do applications running inside VMs compare to ones running on physical servers?” “Is VMotion happening? If yes, why?” and so on. When it comes to virtual environments, what you don’t know can hurt you badly.
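One of those visibility questions – how many guests are just consuming resources without being used – can be illustrated with a small sketch. The guest names, CPU samples and the 5% threshold below are hypothetical; a real deployment would pull these metrics from the hypervisor's monitoring interface rather than hard-code them.

```python
# Hypothetical sketch: flag guests whose average CPU utilization over
# the sampling window falls below a threshold, as candidates for
# reclaiming resources.

def idle_guests(cpu_samples, threshold=5.0):
    """cpu_samples maps guest name -> list of CPU% samples."""
    return sorted(
        name for name, samples in cpu_samples.items()
        if samples and sum(samples) / len(samples) < threshold
    )

samples = {
    "vm-web01": [42.0, 55.1, 38.9],
    "vm-build07": [1.2, 0.8, 2.1],   # provisioned but barely used
    "vm-db02": [60.3, 71.5, 66.0],
}
print(idle_guests(samples))  # -> ['vm-build07']
```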

Another common problem is the classic disconnect between business services and the IT infrastructure. For example, business users say they can’t process orders or things are too slow, while the IT side says the servers are running fine on CPU. Both are right from their own perspectives, but they are not on the same page – not even in the same book. This comes from the traditional IT view of looking at boxes and servers as opposed to the actual quality of the services being delivered.


Virtualization Journal: What are two of your favorite customer success stories?

Murugan: There are many, but a classic one was when we got called in by a customer who was deploying a new project with Citrix technologies in a heterogeneous infrastructure with physical and virtual servers. Their new service was not taking off. Users were complaining about severe slowdowns, and they had already spent weeks on the problem with no results. Before they came to us, they had changed the server hardware, the application software, and the client terminals and software, all to no effect. Within a couple of days of getting involved, we were able to pinpoint the source of the problem – network packet retransmissions between servers, due to issues with the way network teaming had been set up.

We had been working with the application and server teams, and these teams had no visibility into the network. All they had to go by was what the network team was telling them. Hence, when a problem happened they assumed it was a server or application issue, and spent weeks chasing it. Without any instrumentation on the network, our eG Enterprise solution was able to determine that the root cause of the problem was in the network, not in the VMs, Citrix or the other applications. This was a classic case of having to work with limited visibility into some domains, across different silos of the infrastructure, and yet being able to troubleshoot problems effectively. In the end, it took us just a few minutes to review the collected metrics and identify the root cause. Even after hundreds of customer installations, this remains a great example of a customer success.

Another very good example was a large financial institution where our technologies have delivered immense value. Before we got involved, they were very silo-based in their day-to-day firefighting and operations. We helped them streamline their operations, giving the helpdesk end-to-end visibility into key business services. As a result, when a problem occurs, the helpdesk knows exactly which expert to call to resolve it. This produced a significant improvement in service uptime and more effective use of their operations staff.


Virtualization Journal: What does the future hold, do you think, for VDI?

Murugan: VDI and its various technology cousins are definitely here to stay. The idea of a centralized desktop with the power of a localized desktop is extremely attractive. Some of the largest implementations have been VDI-related. Currently, Fortune 100 companies are leading the way, and I believe VDI will soon be commonplace even in mid-size companies. As a technology it has not yet fully matured, but once it does we see it becoming a much bigger market than server-based virtualization initiatives. It may become the de facto desktop platform in the near future.


Virtualization Journal: Do you agree that we are entering a new age of infrastructure – one in which it is back on the agenda of C-level execs (and not only the CTO)?

Murugan: I believe infrastructure has always been on the agenda of C-level execs, but with the success of virtualization there are definitely more conversations at the C-level about how to do this right.


Virtualization Journal: You were responsible for the design and development of one of the earliest J2EE portals in the late 90s; what role does Java play today in the enterprise technology landscape?

Murugan: The platform independence provided by Java was one of the key drivers behind the slew of web-facing, service-oriented applications of the last decade. Java and its sister technologies remain among the backbone technologies of web-based applications.

More Stories By Jeremy Geelan

Jeremy Geelan is Chairman & CEO of the 21st Century Internet Group, Inc. and an Executive Academy Member of the International Academy of Digital Arts & Sciences. Formerly he was President & COO at Cloud Expo, Inc. and Conference Chair of the worldwide Cloud Expo series. He appears regularly at conferences and trade shows, speaking to technology audiences across six continents. You can follow him on twitter: @jg21.

Most Recent Comments
rcjay2 01/23/09 01:38:00 PM EST

This is a great article and gives you insight into one of the leaders in enterprise monitoring solutions. I am a user who has had the pleasure of working with Bala and the folks at eG for some time now. I can honestly say that the product is amazing. It works in all environments, across all OSes, and the monitoring/reporting capabilities are extensive. Out of the box it monitors everything you can throw at it, and if you need to implement a custom monitoring solution for something not covered, it is easy to include custom scripts that eG can run and report on. Currently, I have the eG suite monitoring two complete virtual environments with XenServer 5 and ESX Infrastructure 3. Within each virtual environment I have multiple hosts with a range of operating systems – everything from Solaris and Fedora Core to all versions of Windows (2003/2008) is running and fully monitored. Not to mention that all the network devices (Cisco, Dell, and Linksys) and printers can be monitored via SNMP.

Furthermore, one of the key points is that with the newest version eG is now able to monitor the Solaris Sun Ray environment; everything surrounding DTU connectivity is readily available. I have found that it is easy to install and configure, and in the case of a disaster it is easy to get a backup up and going. One final note: support from the people at eG is second to none. I have spoken with them on numerous occasions and have never encountered anything but a genuine offer of help and a willingness to understand and pinpoint the issue until a resolution is found.

Rob Jaudon
Promptu Technologies
