How Data Virtualization Improves Business Agility – Part 2

Accelerate value with a streamlined, iterative approach that evolves easily

Business Agility Requires Multiple Approaches
Businesses achieve agility through a combination of business decision agility, time-to-solution agility and resource agility.

This article addresses how data virtualization delivers time-to-solution agility. Part 1 addressed business decision agility and Part 3 will address resource agility.

Time-To-Solution Agility = Business Value
When responding to new information needs, rapid time-to-solution is critically important and often results in significant bottom-line benefits.

Substantial time-to-solution improvements, proven time and again across multiple industries, can be seen in the ten case studies described in the recently published Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.

Consider this example: if the business wants to enter a new market, it must first financially justify the investment, including any new IT requirements. Thus, only the highest-ROI projects are approved and funded. Once the effort is approved, accelerating delivery of the IT solution also accelerates realization of the business benefits and ROI.

Therefore, if incremental revenues from the new market are $2 million per month, the business gains an additional $2 million for every month IT can save in the time needed to deliver the solution.
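
As a back-of-the-envelope check, here is that arithmetic in a few lines of Python. The $2 million figure comes from the example above; the number of months saved is a hypothetical input, not data from the article.

```python
# Incremental benefit of faster delivery, using the article's example figure.
monthly_incremental_revenue = 2_000_000  # $2M per month from the new market
months_saved = 3                         # hypothetical: IT delivers 3 months sooner

additional_benefit = monthly_incremental_revenue * months_saved
print(f"Additional revenue captured: ${additional_benefit:,}")  # $6,000,000
```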

Streamlined Approach to Data Integration
Data virtualization is significantly more agile and responsive than traditional data consolidation and ETL-based integration approaches because it uses a highly streamlined architecture and development process to build and deploy data integration solutions.

This approach greatly reduces complexity and minimizes or eliminates the need for data replication and data movement. As numerous data virtualization case studies demonstrate, this elegance of design and architecture makes it far easier and faster to develop and deploy data integration solutions using a data virtualization platform. The ultimate result is faster realization of business benefits.

To better understand the difference, let's contrast these methods. In both the traditional data warehouse/ETL approach and data virtualization, understanding the information requirements and reporting schema is the common first step.

Traditional Data Integration Has Many Moving Parts
Using the traditional approach, IT models and implements the data warehouse schema. ETL development follows to create the links between the sources and the warehouse. Finally, the ETL scripts are run to populate the warehouse. The metadata, data models/schemas and development tools used are unique to each activity.
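
To make these moving parts concrete, here is a minimal ETL sketch in Python using the standard-library sqlite3 module. The source table, warehouse schema and transformation are hypothetical, and a real pipeline would use a dedicated ETL tool; the point is that data is physically extracted, transformed and loaded into a copy before anyone can query it.

```python
import sqlite3

# Hypothetical source system and warehouse; in practice these are separate databases.
source = sqlite3.connect(":memory:")
warehouse = sqlite3.connect(":memory:")

# 1. A source system holding raw order data.
source.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount_cents INTEGER)")
source.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                   [(1, "EMEA", 125000), (2, "APAC", 98000), (3, "EMEA", 40000)])

# 2. Model and implement the warehouse schema (a separate design activity).
warehouse.execute("CREATE TABLE sales_by_region (region TEXT, total_dollars REAL)")

# 3. ETL: extract, transform (cents -> dollars, aggregate) and load a physical copy.
rows = source.execute(
    "SELECT region, SUM(amount_cents) / 100.0 FROM orders GROUP BY region").fetchall()
warehouse.executemany("INSERT INTO sales_by_region VALUES (?, ?)", rows)

# Consumers query the copy; any source change requires re-running the load.
print(warehouse.execute("SELECT * FROM sales_by_region").fetchall())
```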

This diverse environment of different metadata, data models/schemas and development tools is not only complex; it also requires coordinating and synchronizing efforts and objects across the activities.

Experienced BI and data integration practitioners readily acknowledge the long development times that result from this complexity. Forrester Research made the same observation in its 2011 report, Data Virtualization Reaches Critical Mass:

"Extract, transform, and load (ETL) approaches require one or more copies of data staged along the physical integration process flow. Creating, storing, and manipulating these copies can be complex and error prone."

Data Virtualization Has Fewer Moving Parts
Data virtualization uses a more streamlined architecture that simplifies development. Once the information requirements and reporting schema are understood, the next step is to develop the objects (views and data services) used to both model and query the required data.

These virtual equivalents of the warehouse schema and ETL routines and scripts are created within a single view or data service object using a unified data virtualization development environment. This approach leverages the same metadata, data models/schemas and tools.
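
By contrast, a single view object can answer the same question with no warehouse and no load step. The sketch below imitates the idea in plain Python against the same hypothetical source; a real platform would define the view in its own development environment, so treat these names as illustrative, not any vendor's actual API.

```python
import sqlite3

# The same hypothetical source as in the ETL sketch; no warehouse exists.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount_cents INTEGER)")
source.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                   [(1, "EMEA", 125000), (2, "APAC", 98000), (3, "EMEA", 40000)])

# A "view" is just a stored query: the model and the integration logic in one object.
SALES_BY_REGION_VIEW = """
    SELECT region, SUM(amount_cents) / 100.0 AS total_dollars
    FROM orders GROUP BY region
"""

def query_view(view_sql):
    """Run the view on demand; the result set lives only briefly in memory."""
    return source.execute(view_sql).fetchall()

print(query_view(SALES_BY_REGION_VIEW))  # same answer, no persisted copy
```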

Not only is it easier to build the data integration layer using data virtualization, but there are also fewer "moving parts," which reduces the need for coordination and synchronization activities. With data virtualization, there is no need to physically migrate data from the sources to a warehouse. The only data that is moved is the data delivered directly from the source to the consumer on-demand. These result sets persist in the data virtualization server's memory for only a short interval.

Avoiding data warehouse loads, reloads and updates further simplifies and streamlines solution deployment and thereby improves time-to-solution agility.

Iterative Development Process Is Better for Business Users
Another way data virtualization improves time-to-solution agility is through support for a fast, iterative development approach. Here, business users and IT collaborate to quickly define the initial solution requirements, followed by an iterative "develop, get feedback and refine" process until the solution meets the user need.

Most users prefer this type of development process. Because building views of existing data is simple and fast, IT can provide business users with prospective versions of new data sets in just a few hours, rather than making them wait months while IT develops detailed solution requirements. Business users can then react to these data sets and refine their requirements based on tangible insights, and IT can adjust the views and present the refined data sets.
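
Because the view is only a query definition, each iteration is a small edit. Continuing the hypothetical sketch above, refining the data set after user feedback is one changed string rather than a schema migration and reload:

```python
# Hypothetical feedback: "only show regions above $1,000 and include the order count."
SALES_BY_REGION_VIEW_V2 = """
    SELECT region,
           SUM(amount_cents) / 100.0 AS total_dollars,
           COUNT(*)                  AS order_count
    FROM orders
    GROUP BY region
    HAVING SUM(amount_cents) / 100.0 > 1000
"""
print(query_view(SALES_BY_REGION_VIEW_V2))  # a refined data set in hours, not months
```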

This iterative development approach enables the business and IT to home in on and deliver the needed information much faster than traditional integration methods.

Even in cases where a data warehouse solution is mandated by specific analytic needs, data virtualization can be used to support rapid prototyping of the solution. The initial solution is built using data virtualization's iterative development approach, with migration to the data warehouse approach once the business is fully satisfied with the information delivered.

In contrast, developing a new information solution using traditional data integration architecture is inherently more complex. Typically, business users must fully and accurately specify their information requirements prior to any development, with little change tolerated. Not only does the development process take longer, but there is a real risk that the resulting solution will not be what the users actually need and want.

Data virtualization offers significant value, and the opportunity to reduce risk and cost, by enabling IT to quickly deliver iterative results that help users understand their real information needs and arrive at a solution that meets them.

Data Virtualization's Ease of Change Keeps Pace with Business Change
The third way data virtualization improves time-to-solution agility is ease of change. Information needs evolve. So do the associated source systems and consuming applications. Data virtualization allows a more loosely coupled architecture between sources, consumers and the data virtualization objects and middleware that integrate them.

This level of independence makes it significantly easier to extend and adapt existing data virtualization solutions as business requirements or associated source and consumer system implementations change. In fact, changing an existing view, adding a new source or migrating from one source to another is often completed in hours or days, versus weeks or months in the traditional approach.
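
A minimal sketch of that loose coupling, again with hypothetical names: consumers program against a stable view interface, so migrating from one source to another means re-pointing the view, not changing any consumer.

```python
from typing import Callable, List, Tuple

# Consumers depend only on this view signature, never on the sources directly.
SalesView = Callable[[], List[Tuple[str, float]]]

def legacy_source_view() -> List[Tuple[str, float]]:
    # Hypothetical: served by the old ERP system.
    return [("EMEA", 1650.0), ("APAC", 980.0)]

def new_source_view() -> List[Tuple[str, float]]:
    # Hypothetical: same shape, now served by the replacement system.
    return [("EMEA", 1650.0), ("APAC", 980.0)]

def sales_report(view: SalesView) -> None:
    for region, total in view():
        print(f"{region}: ${total:,.2f}")

sales_report(legacy_source_view)  # before the migration
sales_report(new_source_view)     # after: the consumer code is untouched
```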

Conclusion
Data virtualization reduces complexity, data replication and data movement. Business users and IT collaborate to quickly define the initial solution requirements, followed by an iterative "develop, get feedback and refine" delivery process. Furthermore, its loosely coupled layers make it significantly easier to extend and adapt existing data virtualization solutions as business requirements or the associated source and consumer systems change.

These time-to-solution accelerators, as numerous data virtualization case studies demonstrate, make it far easier and faster to develop and deploy data integration solutions with a data virtualization platform than with other approaches. The result is faster realization of business benefits.

Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization. This series of three articles on How Data Virtualization Delivers Business Agility includes excerpts from the book.

About the Author

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
