
"Virtualization Is Now a Key Strategic Theme," Says Citrix CTO

Exclusive Q&A with Simon Crosby, CTO of Citrix & Founder of XenSource

"Virtualization is already widely used, but primarily for the first-order benefit, namely server consolidation," notes Citrix CTO Simon Crosby, in this Exclusive Q&A with SYS-CON's Virtualization Journal. "The second-order benefits of agility, availability and manageability of the IT stack are now becoming better understood," Crosby continues, "and as a consequence virtualization has moved from a tactical tool for gaining immediate savings, to become a key strategic theme for every IT department."

About Simon Crosby:
Simon Crosby is CTO of Citrix Systems. He was founder and CTO of XenSource prior to its acquisition by Citrix, and before that a principal engineer at Intel, where he led strategic research in distributed autonomic computing, platform security and trust. In 2007, he was awarded a coveted spot as one of InfoWorld’s Top 25 CTOs.

Virtualization Journal: Starting at 35,000 ft… where does the Xen hypervisor fit in the virtualization universe?
Simon Crosby: The Xen hypervisor is the industry’s most strategic code base for virtualization. Why? First, it is a tiny, optimized, open source reference standard hypervisor for a wide range of CPU architectures, with extensive support for high performance virtualization-enhanced CPUs and I/O subsystems. Because it has multiple routes to market in any given year, the hardware manufacturers ensure that Xen has “first and best” support for the latest hardware, ensuring that it always leads the industry in scalability and performance. In addition:
  • It is collaboratively built by the industry’s leading IT vendors, led by Citrix and including Intel, AMD, IBM, HP, Novell, Red Hat, Sun, VA Linux and many others.
  • The Xen security architecture is contributed by the security community, including researchers, IBM’s secure hypervisor project, the NSA and DoD.
  • Xen is used in the world’s largest virtualization deployments, for example by Amazon, with a deployment of thousands of servers virtualized using Xen.
  • The Microsoft Hyper-V hypervisor is in fact an implementation of the Xen reference architecture, built by Microsoft, and compatible with Citrix XenServer.
Virtualization Journal: The first public release of Xen was made available in 2003; how long did it take for you and your Cambridge collaborators to get it to that stage?
Crosby: The Xen code base has been in development now for seven years. When we started XenSource, we had released Xen 2.5, and were working on Xen 3.0.

Virtualization Journal: What was and is the relationship between XenSource, Inc. and the Xen project?
Crosby: XenSource’s founders – all former University of Cambridge faculty who developed Xen in their research – decided that the Xen hypervisor needed a company to support its ongoing development. Large users of Xen told us that we needed to build a complete product offering based on Xen, so they could be confident that their commercial deployments would have the backing of a commercial entity.

Virtualization Journal: XenSource still hosts the xen.org site – what’s the situation there, since the Citrix acquisition? Will the community and its processes continue to be respected?
Simon Crosby: XenSource is part of Citrix – indeed XenSource is no longer a formal entity at all. Citrix hosts xen.org for the community, but it is run entirely separately from all of our product development activity. The community site at www.xen.org has its own program manager, tasked with serving the community and the Xen project Advisory Board.


The board oversees the day-to-day project management processes, and sets policies such as the trademark policy for the Xen® brand. The advisory board members come from Intel, IBM, HP, Novell, Red Hat and Sun, and the Chair is Ian Pratt, the Xen project leader, from Citrix. Citrix has already invested heavily in additional headcount on Xen, and is a sponsor of the upcoming Xen Summit, to be held in conjunction with Usenix in Boston, in June.


Virtualization Journal: If paravirtualization equals second-generation virtualization, what will third-generation virtualization look like?
Simon Crosby: From a hypervisor architecture perspective, there is very little left to “optimize away” in the way that paravirtualization allows us to slim down the code base. What will happen is that all of the data center infrastructure, from CPUs to memory management, to I/O chipsets and even storage subsystems will become “virtualization aware” and assist with the job of speeding up what formerly had to be done either in the hypervisor or the virtualization stack that drives it.
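The "virtualization aware" CPU assists Crosby describes are advertised as feature flags that a host can inspect. As a self-contained illustration (this is a generic sketch, not part of Xen or XenServer; the sample text and function name are invented for the example), here is how one might scan Linux `/proc/cpuinfo` output for hardware-assist flags:

```python
# Illustrative sketch (not a Xen/XenServer API): detect hardware
# virtualization assist by scanning CPU feature flags as Linux reports
# them. "vmx" = Intel VT-x, "svm" = AMD-V; "ept"/"npt" are the
# second-level address-translation assists that take memory management
# out of the hypervisor's hands.
def hw_virt_flags(cpuinfo_text):
    flags = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags.update(line.split(":", 1)[1].split())
    return sorted(f for f in ("vmx", "svm", "ept", "npt") if f in flags)

# On a real system you would pass open("/proc/cpuinfo").read(); a
# fabricated sample keeps the sketch self-contained.
sample = "processor : 0\nflags : fpu vme msr vmx ept\n"
print(hw_virt_flags(sample))  # ['ept', 'vmx']
```

The same pattern extends to I/O: as chipsets and storage subsystems grow equivalent capabilities, the hypervisor can delegate work it once emulated in software.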

Within the next year, I/O Virtualization (often called IOV) standardized by the PCI SIG will start to be supported by fabric and I/O card vendors. This allows optimized fast-path I/O between guests and hardware in a virtualization-safe manner, without needing to use the driver stack offered by the virtualization platform itself. This effectively removes most of the remaining overhead of virtualization. We recently demonstrated XenServer with a performance of about 10,000 iSCSI IOPS on a 10Gb/s IOV card from SolarFlare, for example. This means that the most challenging workloads can now be virtualized.
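To make the IOV idea concrete: under the PCI SIG's SR-IOV standard, the host asks an IOV-capable card to expose "virtual functions" that can be handed directly to guests. The sketch below is a hedged illustration only – the PCI address is made up, and the Linux sysfs knob shown here postdates this article – but it shows the shape of the host-side interface:

```python
# Hedged sketch of the Linux sysfs interface for SR-IOV (the PCI SIG
# IOV standard discussed above). The PCI address is hypothetical; this
# simply illustrates how a host asks an IOV-capable NIC to expose
# virtual functions.
from pathlib import PurePosixPath

def sriov_request(pci_addr, num_vfs):
    """Build the sysfs path and value a host would write to create
    num_vfs virtual functions on an SR-IOV capable device."""
    dev = PurePosixPath("/sys/bus/pci/devices") / pci_addr
    return str(dev / "sriov_numvfs"), str(num_vfs)

path, value = sriov_request("0000:04:00.0", 4)
print(path, value)
# /sys/bus/pci/devices/0000:04:00.0/sriov_numvfs 4
# On real hardware, writing the value creates 4 virtual functions, each
# appearing as its own PCI device assignable directly to a guest,
# bypassing the host's driver stack entirely.
```

Because each virtual function bypasses the platform's driver stack, the fast-path I/O Crosby describes carries almost none of the classic virtualization overhead.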


Virtualization Journal: How about embedded hypervisors, what’s the future trajectory there?
Crosby: Citrix XenServer is now an add-in option at point of sale on over 50 percent of x86 servers worldwide. We recently announced a jointly developed embedded product with HP for ProLiant servers that HP refers to as its “preferred embedded virtualization option for ProLiant.” Putting the virtualization platform in hardware is, in our view, the next natural progression for the industry, since OEMs can leverage the capabilities of the hardware through their add-on systems management stacks to offer customers powerful, seamless management for virtualization as a built-in component of the management stack.

For customers, this is the cheapest and highest performance virtualization offering available, and it has the full benefit of complete integration with all of HP’s management tools. At the same time, Microsoft with Hyper-V in the OS, and the Linux vendors with Xen have the opportunity to leverage the same code base through a different delivery model, where the OS virtualizes more instances of that OS, or other guests. This model is still in its early stages – the Linux vendors don’t virtualize Windows well, and Microsoft Hyper-V doesn’t support Linux particularly well.



Virtualization Journal: It has inevitably been said that 2008 is, at long last, The Year of Virtualization. What do you think took everyone in Enterprise IT so long?
Crosby: Enterprise IT has not been standing still. Indeed virtualization is already widely used, but primarily for the first-order benefit, namely server consolidation. The second-order benefits of agility, availability and manageability of the IT stack are now becoming better understood, and as a consequence virtualization has moved from a tactical tool for gaining immediate savings, to become a key strategic theme for every IT department.

But there is also another key factor that changes in 2008. Until this year the competition in the market was really only VMware and XenSource – a tiny startup. The acquisition by Citrix gives our product, XenServer, a huge channel, a large investment in features, additional value-added functions that leverage Xen, 24x7 worldwide support and all the clout needed to serve true enterprise customers and use cases. As we go to market with XenServer, we also collaborate closely with Microsoft, which will deliver Hyper-V to market in the summer. Our intention is to leverage both footprints to deliver powerful virtualization-optimized solutions to customers for data-center automation, virtual desktop infrastructure and application delivery. The Citrix products XenServer, XenDesktop and XenApp all contain virtualization as a core feature set (server, desktop and application virtualization, respectively).

Until 2008, VMware was the only choice, their hypervisor cost thousands of dollars, and they had the market to themselves. In 2008, Citrix and Microsoft bring customers an open architecture, a price/performance and feature set that is difficult to beat, and a powerful channel that can deliver customers a real choice for their virtual infrastructure for the first time. Importantly, our products will all also add value to VMware virtualized infrastructure, to fully support customers that have purchased VMware enterprise licenses.

It will be a very exciting year!

Virtualization Journal: What’s the risk of Virtualization becoming just another buzzword used in the attempt to get organizations to “sign a check”?
Crosby: The word is already over-used, and every vendor wants a “virtualization spin” on their product. Customers are smart though, and I think they understand that first and foremost they need to pick a hypervisor. There’s VMware, or the compatible pair of Xen and Microsoft Hyper-V. Second, there’s virtualized storage. We believe that there will be tremendous innovation in the storage area to optimize the management of storage for virtual machines in hardware, as opposed to doing this in software on the host, as VMware does. Third, there are dynamic infrastructure software services that:
  • Provision virtual machines – we offer dynamic provisioning on XenServer, Hyper-V, VMware and (crucially) on bare metal
  • Optimize performance of virtual machines on the infrastructure through workflow based automated provisioning
  • Protect virtual machines by offering them high-availability or even fault-tolerance
  • Manage VM lifecycle
This area of virtualization management is an area rich with innovation that can exploit XenServer or Hyper-V to deliver powerful new choices to customers. There is a lot of hot air in vendors’ pitches right now, and customers should really look under the covers to understand the ROI before they purchase new tools.
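The lifecycle-management service in the list above can be made concrete with a small sketch. The state and action names below are illustrative, not the XenServer or Hyper-V API; they show the kind of transition checking any provisioning service must enforce before it touches a VM:

```python
# Illustrative VM lifecycle state machine (hypothetical states/actions,
# not a Citrix or Microsoft API). A management service consults a table
# like this to reject invalid operations, e.g. resuming a VM that was
# never suspended.
ALLOWED = {
    "halted":    {"start": "running"},
    "running":   {"suspend": "suspended", "shutdown": "halted",
                  "migrate": "running"},   # live migration keeps the VM running
    "suspended": {"resume": "running"},
}

class VM:
    def __init__(self, name):
        self.name = name
        self.state = "halted"   # newly provisioned VMs begin halted

    def do(self, action):
        transitions = ALLOWED[self.state]
        if action not in transitions:
            raise ValueError(f"cannot {action} a {self.state} VM")
        self.state = transitions[action]
        return self.state

vm = VM("web01")
vm.do("start")      # halted -> running
vm.do("suspend")    # running -> suspended
vm.do("resume")     # suspended -> running
vm.do("shutdown")   # running -> halted
print(vm.state)     # halted
```

Layer workflow automation, high availability, and bare-metal provisioning on top of a core like this and you have the dynamic infrastructure services described above.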


Virtualization Journal: The Xen AB currently has members from Citrix, IBM, Intel, HP, Novell, Red Hat and Sun – is it likely that further companies would get onto the Advisory Board?
Crosby: The AB is drawn from the top contributors to Xen, and includes the first key vendors that delivered the Xen hypervisor to market. As new vendors join the ranks of those that ship Xen to customers, I expect the AB will grow, since those vendors have a strategic interest in Xen’s continued prominence.

Virtualization Journal: You’ve been hailed as one of the top 25 CTOs in the industry: what duty or duties of care do you feel such acclaim brings with it for a top software executive in the first decade of the twenty-first century? Is the ‘IT greening’ aspect of virtualization important to you, for example?
Crosby: I am incredibly fortunate to be in a position that allows me to advocate a technology and community that I find inspiring. The community builds the world’s best hypervisor using a development model based on collaborative contribution without charge. Xen is great because its community makes it great and makes it freely available.

The impact of Xen in a global sense, beyond vendors and products, has been to slash the price of virtualization, making it a free feature set available to everyone. Server consolidation should be free, because it makes a powerful contribution to the greening of IT. To the Xen community goes the credit for a powerful, open, collaborative development spirit that will have a tremendous worldwide impact on power consumption and therefore global warming.

More Stories By Jeremy Geelan

Jeremy Geelan is Chairman & CEO of the 21st Century Internet Group, Inc. and an Executive Academy Member of the International Academy of Digital Arts & Sciences. Formerly he was President & COO at Cloud Expo, Inc. and Conference Chair of the worldwide Cloud Expo series. He appears regularly at conferences and trade shows, speaking to technology audiences across six continents. You can follow him on Twitter: @jg21.
