By Jeremy Geelan
January 23, 2009 06:00 AM EST
"There is a shift in focus and it is from technologies that enable virtualization to technologies that manage virtualization," says Bala Murugan, Chief Architect at eG Innovations, in this Exclusive Q&A with SYS-CON's Virtualization Journal. Overall, Murugan maintains, virtualization is "a promising and justifiable investment, particularly in the current economic downturn."
Virtualization Journal: Do you agree with the view that Virtualization is one of the most promising technology investments in the current economic downturn?
Bala Murugan: Virtualization, when done right, has been proven to provide significant reductions in direct cost. It also helps with indirect costs by improving your IT’s performance, reliability and capacity management. So yes, I would say that it is a promising and justifiable investment, particularly in the current economic downturn.
Virtualization Journal: How about your concept of “Virtualization 2.0” – doesn’t it implicitly suggest that Virtualization 1.0 has been deficient?
Murugan: On the contrary, it is more in reference to the evolution of the Virtualization industry. Virtualization 1.0 was a revelation; it introduced virtualization to the world, proved its power and showed everyone how much they can benefit from it. Virtualization 2.0 – which is already here – is about accepting Virtualization as reality and moving on to how to do it right. How to get the most out of it. Essentially, there is a shift in focus and it is from technologies that enable virtualization to technologies that manage virtualization.
To be successful in Virtualization 2.0, organizations have to focus on technologies that help them manage their virtualization deployments better. As a monitoring technology provider, we understand the complexities of monitoring in Virtualization 2.0 and are well positioned to help these companies realize the full potential of their virtualized infrastructures.
Virtualization Journal: Are you concerned at all that the “2.0” label might detract from the overall value proposition, given that it seems to be going down with the USS Economy? ;-)
Murugan: We view Virtualization 2.0 as an evolution (next phase) – not as a radical revamp of current virtualization deployments. In Virtualization 2.0, the focus is on how to make virtualization deployments more cost-effective and how to gain maximum benefits. So this will actually make virtualization a mandatory technology for most organizations that are dealing with tight budgets in the economic slow-down.
Virtualization Journal: How about interoperability – how important is that for the industry, do you think? What barriers persist?
Murugan: We live in an age of diverse infrastructures. Even before virtualization, the success of n-tier architectures and open systems made it impossible to have a homogenous environment. Data centers today comprise diverse technologies that have to co-exist and work together to deliver IT services. Virtualization has taken this another step down the evolutionary road: now we are talking about adding a couple more tiers to the n-tier apps by separating the hardware from the OS. At this juncture, we believe that interoperability is not a “nice to have.” It is a “must have.”
In terms of barriers, the ones that still exist are mostly technological, and people are working to overcome them. In principle, I believe everyone agrees interoperability is a must have. Not only do customers have to deal with a mix of virtual and non-virtual infrastructures, but also with different types of virtualization from different vendors. The key, we found, is to provide a unified, consistent view across this diverse landscape, which makes management that much easier for the end user.
Virtualization Journal: Do you think VMware needs to fear Microsoft’s belated entry into the virtualization marketplace?
Murugan: History has shown that Microsoft can be a significant threat in any endeavor it puts its mind to. They will have good technology, resort to their favorite ploy – their licensing model – and make Virtualization more of a commodity than it already is.
VMware itself has recognized that the hypervisor is no longer going to be the differentiator, and that technologies that enable the effective use of virtualization (e.g., manageability), new application deployment models (like virtual desktops), and so on will be key to retaining its leadership position.
Competition in this space can only be good – innovation will be faster and certainly there is room for multiple vendors in this fast growing market.
Virtualization Journal: How about eG Innovations, what’s the background story to the company’s formation and growth to date?
Murugan: eG Innovations was founded by Srinivas Ramanathan, who is also our president and CEO. Prior to eG, he was a research scientist at HP and the chief architect of Firehunter, an ISP performance monitoring solution. His years at HP gave him a ringside seat to the real pain points that customers have with monitoring their environments and with the monitoring tools themselves. In 2000, he left HP to build the proverbial “better mousetrap,” and assembled a strong team, including myself, to take this concept from the ground up. That was the genesis of eG Innovations.
Our focus was on monitoring n-tier architectures by looking at them as business services as opposed to a collection of servers, networks and applications. Our key benefit to the customer was our ability to proactively identify the right problem – the true root cause of poor performance in their IT infrastructures. As a result, customers spent less time firefighting and finger pointing, and more time improving their overall service levels. It took a couple of years to roll out the finished product, and we got VC funding from Singapore. Then we opened up the US market in 2002 and found a receptive audience for the technology. We quickly became the premier Citrix monitoring solution, which had all the classic n-tier architecture issues. We won many awards and saw the company grow across the globe.
We saw the opportunity in the virtualization space quite early and started working with early virtualization adopters to better understand their needs and to strengthen our technology. Our mastery in thin-client computing and shared access technologies (Citrix, Microsoft Terminal Services, etc.) helped because a Virtualization ecosystem (one box – multiple OSs) is similar to a Citrix ecosystem (one OS – multiple users). More awards later, we are now recognized as one of the industry leaders in the Virtualization monitoring space, with support for different virtualization platforms including VMware, Citrix Xen, Solaris Containers/LDOMs and more.
Virtualization Journal: What are the main pain points that bring customers to you in search of a monitoring solution?
Murugan: The biggest single pain point is probably problem isolation. When there is a problem in your n-tier IT infrastructure, it is usually pretty hard to distinguish between the true root cause and the effects. With systems being interdependent, a single problem generally causes a ripple effect that flows through the entire environment, leading you to chase effects as opposed to pinpointing the root cause. In simple terms, this means you are wasting valuable IT resources in fire-fighting mode fixing effects, which leads to finger pointing inside the organization. Meanwhile, your customers are still facing the problem. Virtualization only increases the complexity of your n-tier IT delivery, which makes problem isolation even more difficult.
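The distinction between effects and the true root cause can be pictured with a small sketch. This is purely illustrative (a hypothetical model, not eG’s actual algorithm): given a dependency graph of components and the set currently raising alarms, an alarm is treated as a ripple effect if any component it depends on is also alarming; a root cause is an alarming component whose own dependencies are all healthy.

```python
# Illustrative sketch of dependency-aware problem isolation.
# Hypothetical model, not eG's actual algorithm: an alarming component
# is an "effect" if something it depends on is also alarming; it is a
# root-cause candidate only if all of its dependencies are healthy.

def root_causes(depends_on, alarming):
    """depends_on: dict mapping component -> components it relies on.
    alarming: set of components currently raising alerts.
    Returns alarming components whose dependencies are all healthy."""
    return {
        c for c in alarming
        if not any(d in alarming for d in depends_on.get(c, []))
    }

# A web tier depends on an app server, which depends on a database
# and the network; the database also depends on the network.
deps = {
    "web": ["app"],
    "app": ["db", "network"],
    "db": ["network"],
}
alerts = {"web", "app", "network"}
print(root_causes(deps, alerts))  # only the network survives as root cause
```

With the web tier, app server and network all alarming, chasing the web or app alerts would mean fixing effects; the sketch filters them out because each depends on another alarming component.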
Another key pain point that we see customers face is lack of visibility into their IT infrastructures. Even though it sounds simple enough, more often than not customers today don’t have total visibility into what is going on within their virtualized infrastructures. When you are managing a virtualized environment, you definitely need answers to questions like: “How many guests are you running?” “How many guests are just consuming resources without being used?” “Where are the bottlenecks in the environment?” “Where do you stand on capacity?” “How do applications running inside VMs compare to ones running on physical servers?” “Is VMotion happening? If so, why?” and so on. When it comes to virtual environments, what you don’t know can hurt you badly.
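The first two visibility questions above boil down to a simple inventory query once guest metrics are collected. The sketch below assumes a hypothetical record format (field names invented for illustration; this is not eG’s data model or API) to count running guests and flag ones that consume resources without being used:

```python
# Hypothetical guest inventory (illustrative field names, not eG's
# data model): answer "how many guests are running?" and "which guests
# consume resources without being used?".

guests = [
    {"name": "vm-web1",  "state": "running", "cpu_pct": 42.0, "logins_30d": 120},
    {"name": "vm-build", "state": "running", "cpu_pct": 1.5,  "logins_30d": 0},
    {"name": "vm-old",   "state": "stopped", "cpu_pct": 0.0,  "logins_30d": 0},
]

running = [g for g in guests if g["state"] == "running"]
# Powered on and consuming resources, but no user activity in 30 days.
idle = [g["name"] for g in running if g["logins_30d"] == 0]

print(len(running), idle)
```

In this toy inventory, two guests are running and one of them (vm-build) is burning host resources with no users, which is exactly the kind of waste Murugan says organizations cannot see without monitoring.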
Another common problem is the classic disconnect between business services and the IT infrastructure. For example, business users say they can’t process orders or things are too slow. The IT side says servers are running fine on CPU. Both of them are right from their own perspective, but they are not on the same page – not even in the same book. This comes from the traditional IT view of looking at boxes and servers as opposed to the actual quality of services being delivered.
Virtualization Journal: What are two of your favorite customer success stories?
Murugan: There are many, but a classic one was when we got called in by a customer who was deploying a new project with Citrix technologies in a heterogeneous infrastructure with physical and virtual servers. Their new service was not taking off. Users were complaining about severe slowdowns, and they had already spent weeks on this problem with no results. Before they came to us, they had changed the server hardware, the application software, and the client terminals and software, all to no effect. Within a couple of days of getting involved, we were able to pinpoint the source of the problem – network packet retransmissions between servers, caused by issues with the way network teaming had been set up. We had been working with the application and server teams, and these teams had no visibility into the network. All they had to go by was what the network team was telling them. Hence, when a problem happened they assumed it was a server or application issue, and spent weeks chasing this. Without any kind of instrumentation on the network, our eG Enterprise solution was able to determine that the root cause of the problem was in the network, not in the VMs, Citrix or other applications. This was a classic case of having to work with limited visibility into some domains, working with different silos of the infrastructure, and yet being able to effectively troubleshoot problems. In the end, it took us just minutes to review the collected metrics and identify the root cause. Even after hundreds of customer installations, this remains a great example of a customer success.
Another very good example was a large financial institution where our technologies have delivered immense value. Before we got involved, they were very silo-based in their day-to-day firefighting and operations. We helped them streamline their operations, providing the helpdesk with end-to-end visibility into key business services. As a result, when a problem occurs, the helpdesk knows exactly which expert to call to resolve it. This produced significant improvement in service uptime, and more effective use of their operations staff.
Virtualization Journal: What does the future hold do you think for VDI?
Murugan: VDI and its various technology cousins are definitely here to stay. The idea of a centralized desktop with the power of a localized desktop is extremely attractive. Some of the largest implementations have been VDI related. Currently we are seeing Fortune 100 companies leading the way on this, and I believe it will soon be commonplace even in mid-size companies. As a technology, it has not yet fully matured, but once it does we see it becoming a much bigger market than server-based virtualization initiatives. It may become the de facto desktop platform in the near future.
Virtualization Journal: Do you agree that we are entering a new age of infrastructure – one in which it is back on the agenda of C-level execs (and not only the CTO)?
Murugan: I believe infrastructure has always been on the agenda of C-level execs, but with the success of virtualization there are definitely more conversations at the C-level about how to do this right.
Virtualization Journal: You were responsible for the design and development of one of the earliest J2EE portals in the late 90s; what role does Java play today in the enterprise technology landscape?
Murugan: The platform independence provided by Java was one of the key drivers that enabled a slew of web-facing, service-oriented applications in the last decade. Java and its sister technologies remain among the backbone technologies of web-based applications.
rcjay2 01/23/09 01:38:00 PM EST
This is a great article and gives you insight into one of the leaders in enterprise monitoring solutions. I am a user who has had the pleasure of working with Bala and the folks at eG for some time now. I can honestly say that the product is amazing. It works in all environments across all OSs, and the monitoring/reporting capabilities are extensive. Out of the box it monitors everything you can throw at it, and if you need to implement a custom monitoring solution for something not covered, it is easy to include custom scripts eG can run and report on. Currently, I have the eG suite monitoring two complete virtual environments with XenServer 5 and ESX Infrastructure 3. Within each virtual environment I have multiple hosts with a range of operating systems. Everything from Solaris and Fedora Core to all versions of Windows (2003/2008) is running and fully monitored. Not to mention all the network devices (Cisco, Dell, and Linksys) and printers can be monitored via SNMP.
Furthermore, one of the key points is that with the newest version eG is now able to monitor the Solaris Sun Ray environment. All things surrounding DTU connectivity are readily available. I have found that it is easy to install and configure, and in the case of a disaster it is easy to get a backup up and going. One final note: support from the people at eG is second to none. I have spoken with them on numerous occasions and have never run into anything but a genuine offering of help and willingness to understand and pinpoint the issue until a resolution is discovered.