
Cloud Expo: Article

Deploying the Cloud? Make 2013 the Year to Do It Right

Leverage the benefits of cloud technology across the enterprise

It's no secret that the cloud is growing at an exponential rate. According to Cisco's 2012 Cloud Index, less than half of the world's server workloads currently run in the cloud; by 2016, two-thirds will. Closing the gap between current capabilities and future requirements is a mission-critical priority for businesses across a range of industries. Without adequate planning and preparation, the race to the cloud can easily become a long slog through a minefield of missed opportunities, user failures and IT nightmares. As more workloads make their way to the cloud each year, enterprises have a vested interest in expanding network capabilities and evolving critical data center infrastructure to accommodate an ever-growing array of cloud-based applications and data storage requirements.

Key Trends in Cloud Technology
Several trends are driving the migration of applications and data to the cloud.

  1. Agility. Cloud deployment enables businesses to improve performance and functionality quickly, launching new applications without a corresponding need for additional infrastructure. Agility is especially important in new and young companies, many of which lack the time and resources to deploy a range of diverse applications internally.
  2. Consumerization of IT. The Bring Your Own Device (BYOD) trend is expanding the use of technology through employee-owned devices. The cloud plays an important role in helping organizations keep pace with BYOD and deliver anytime, anywhere access to workers.
  3. Cost Drivers. Financial metrics are also a motivating factor in the race to the cloud. In general, Software-as-a-Service (SaaS) is cheaper, faster and easier than traditional deployment models - reducing the cost of infrastructure, physical space and IT labor.

Preparing for the Cloud
Successful preparation for future cloud workloads requires planning. By strategically adapting your network capacity, data center and other critical IT functions, you can substantially improve your organization's ability to operate in the cloud.

Today's networks must be capable of handling constant interactions characterized by rich media and heavy content, particularly as users migrate away from email toward social media and other channels. Consequently, networks and data centers must expand to support instant access to many different types of content beyond email.

The first step of network expansion is a comprehensive assessment of your organization's application portfolio. In most cases, executive decision-makers are unaware of the full scope of applications running in the organization. Once all currently running applications have been identified, they need to be ranked and categorized according to future requirements. While some applications may need to remain in-house, others can be migrated to a public or secure private cloud environment. From there, the organization can begin to evaluate how to expand the network to manage future workloads.
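As a rough illustration of the ranking step, a portfolio assessment might score each application against a few migration criteria. The criteria, weights, and threshold below are purely hypothetical assumptions for the sketch, not part of any specific assessment framework:

```python
# Hypothetical sketch of ranking applications by cloud suitability.
# Criteria, weights, and the threshold are illustrative assumptions only.

def cloud_suitability(app):
    """Return a 0-10 suitability score; higher means easier to migrate."""
    score = 10
    score -= 3 * app["data_sensitivity"]        # 0 = public data, 1 = regulated
    score -= 2 * app["integration_complexity"]  # 0 = standalone, 1 = deeply coupled
    score -= 1 * app["latency_sensitivity"]     # 0 = tolerant, 1 = near real-time
    return max(score, 0)

def categorize(apps, threshold=6):
    """Split the portfolio into migrate-first and keep-in-house buckets."""
    migrate = [a["name"] for a in apps if cloud_suitability(a) >= threshold]
    retain = [a["name"] for a in apps if cloud_suitability(a) < threshold]
    return migrate, retain

portfolio = [
    {"name": "corporate-wiki", "data_sensitivity": 0,
     "integration_complexity": 0, "latency_sensitivity": 0},
    {"name": "payroll", "data_sensitivity": 1,
     "integration_complexity": 1, "latency_sensitivity": 0},
]
migrate, retain = categorize(portfolio)
print(migrate, retain)
```

A real assessment would use many more criteria (licensing, compliance, dependencies), but the output is the same: a ranked list separating in-house candidates from cloud candidates.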

As virtualization becomes more prevalent in data centers and companies adopt a cloud-based strategy, network architects need to rethink and redesign their current infrastructure to adapt to new traffic patterns. What used to be primarily "North-South" network traffic flow is now becoming "East-West." Environments are becoming highly dynamic; workloads move to different physical locations on the network as virtual servers are migrated and clients move about the building. The architectures and networking techniques of yesterday are not necessarily well suited to the architectures and applications of today and tomorrow.

A thorough understanding of networking and its underlying infrastructure, as well as its relationship to applications, is necessary when architecting a data center network capable of not just supporting but adapting to the challenges that arise from virtualization and cloud computing. The solution must address all aspects of application delivery - security, availability, performance, and visibility - while exhibiting the qualities that define cloud architectures, including affordability and elastic scalability. Data must be protected against attacks, intrusions, breaches, and leaks, and categorized based on its importance and network-resource needs using Quality-of-Service (QoS) capabilities.
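The QoS categorization mentioned above typically comes down to mapping application traffic to forwarding classes. The sketch below uses the common DiffServ conventions (EF for voice, AF classes for assured traffic), but the application-to-class policy itself is a hypothetical example; any real policy would be defined by the network team:

```python
# Minimal sketch of classifying traffic for QoS marking. The DSCP code
# points follow standard DiffServ conventions; the mapping of application
# types to classes is an illustrative assumption.

DSCP_POLICY = {
    "voice":  ("EF", 46),    # Expedited Forwarding: lowest latency/jitter
    "video":  ("AF41", 34),  # Assured Forwarding: high-priority media
    "backup": ("AF11", 10),  # bulk data: low priority, high tolerance
}
DEFAULT = ("BE", 0)          # Best Effort for unclassified traffic

def classify(app_type):
    """Return the (class name, DSCP value) a flow should be marked with."""
    return DSCP_POLICY.get(app_type, DEFAULT)

print(classify("voice"))  # ('EF', 46)
print(classify("email"))  # ('BE', 0)
```

In practice this marking is applied at the network edge by switches or hypervisor virtual switches, so that East-West storage and backup flows don't starve latency-sensitive traffic.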

Storage and backup are another key part of preparing for cloud migration, and security is a top-of-mind concern: although cloud deployments offer real benefits, your organization needs to know that sensitive data will remain secure. The process of preparing for cloud-based data storage and backup mirrors the process of evaluating network expansion requirements. Starting with an assessment of current data sources and storage routines, the organization needs to evaluate what types of data can eventually be integrated with, or completely migrated to, the cloud. Equipped with this information, the organization can begin to identify the technology gaps that must be addressed to meet future cloud storage and backup requirements.

Purpose-built appliances provide fast backup and restore, delivering local-like performance while using the cloud for secure off-site storage and avoiding the need to provision and manage a secondary site for Disaster Recovery (DR) or long-term storage. This can dramatically reduce capital spending, streamline IT infrastructure, and enable payback periods measured in months, not years. Appliances can be combined with existing data protection applications and private or public clouds, creating a low-cost, highly scalable storage tier for old or infrequently accessed data. They also allow organizations of all sizes to modernize their data protection architecture, eliminate tape, improve scalability, and improve DR readiness. Cloud storage allows organizations to leverage a pay-for-use pricing model and anywhere availability.
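The "months, not years" payback claim can be sanity-checked with a simple model: divide the appliance purchase price by the monthly savings from retiring the secondary site. All dollar figures below are hypothetical assumptions for the sketch, not vendor pricing:

```python
# Illustrative payback-period calculation for replacing a secondary DR
# site with a backup appliance plus cloud storage. All figures are
# hypothetical assumptions.

def payback_months(appliance_cost, monthly_cloud_cost, avoided_monthly_cost):
    """Months until cumulative savings cover the appliance purchase."""
    monthly_savings = avoided_monthly_cost - monthly_cloud_cost
    if monthly_savings <= 0:
        return None  # the move never pays for itself
    return appliance_cost / monthly_savings

# Assumed: $60k appliance, $4k/month cloud storage, $12k/month avoided
# secondary-site costs (space, power, tape handling, admin time).
months = payback_months(60_000, 4_000, 12_000)
print(f"{months:.1f} months")  # 7.5 months
```

Even with conservative assumptions, the model shows why eliminating a dedicated DR site shifts the economics from a multi-year capex recovery to a sub-year opex trade.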

Increased virtualization will alleviate some of the cloud migration challenges upfront. Cloud technology enables organizations to move servers to the cloud and back in an integrated and strategic manner. In fact, the use of virtualization can also play an important role in preparing the organization's culture and stakeholders for future cloud deployments. By increasing the use of virtualization now, you can encourage greater acceptance of the cloud across your enterprise. In essence, virtualization can serve as a bridge to the cloud.

Virtualization is the key technology that enables the cloud; without it, there is no cloud. The ability to separate the operating system and application from the hardware makes virtualization the foundation for on-demand cloud services, and the encapsulation and mobility it provides - moving a live virtual machine with no downtime for the application - is what the cloud is built on. Viewed as a whole, virtualization and cloud computing are not about a product but a journey. Companies initially enter the world of virtualization because they can't keep up with increasing scale, complexity, and management requirements on traditional infrastructure. This leads to the first step in the journey: consolidating resources and infrastructure to get better utilization of servers and to reduce energy costs. Higher levels of abstraction then let companies take advantage of the intelligence built into the virtualization software: High Availability (HA) and replication, load balancing, pooled resources, automation and orchestration, service definitions and profiles, templates, policies, a self-service portal, service catalogs, security and identity management, system monitoring and management, capacity planning, billing and chargeback, and licensing.

It's important to understand that the ability to handle future cloud-based workloads presents different challenges and concerns for the stakeholders in your organization: users are motivated by ease of use and increased access to applications and data; CIOs are focused on control, ownership, and data and application security; and CFOs are primarily concerned with cost savings, rate of return, and opex versus capex. By thoughtfully and strategically preparing for future cloud opportunities, your organization can address these concerns and fully leverage the benefits of cloud technology across the enterprise.

More Stories By Pete Schmitt

Pete Schmitt is Vice President of Engineering at Customer Storage Inc. Since 2002, cStor has helped companies strategize, create, and implement best-in-class data center solutions that address business needs. cStor's proven capabilities with key data center technologies provide clients with a fresh perspective, the ability to collaborate with recognized data center experts, and the confidence that goals will be met.


