
Data Virtualization Adoption Propelled by Significant Business Benefits

Faster, cheaper, better...data virtualization middleware platforms provide critical data integration capabilities

Enterprise adoption of data virtualization accelerated in 2011, propelled by organizations' growing need for greater business agility, lower costs and better performance.

These benefits were described in detail in an earlier series of articles.

The success of data virtualization can now be observed across hundreds of organizations and is clearly evident in the ten case studies described in the recently published Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.

In this article, I will describe data virtualization at a high level and explain how data virtualization technology works.

What Is Data Virtualization?

Data virtualization is a data integration approach and technology used by innovative organizations to achieve greater business agility and reduce costs.

Data virtualization technology is a form of middleware that leverages high-performance software and an advanced computing architecture to integrate and deliver to both internal and external consumers data from multiple, disparate sources in a loosely coupled, logically-federated manner.

By implementing a virtual data integration layer between data consumers and existing data sources, the organization avoids the need for physical data consolidation and replicated data storage. Thus, data virtualization enables the organization to accelerate delivery of new and revised business solutions while also reducing both initial and ongoing solution costs.

Most front-end business applications, including BI, analytics and transaction systems, can access data through the data virtualization layer. Consumption is on demand from the original data sources, including transaction systems, operational data stores, data warehouses and marts, big data, external data sources and more.
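The on-demand pattern above can be sketched in a few lines. This is a minimal, hypothetical illustration (the source and view names are invented for this example, not part of any real product): the "view" holds no data of its own and federates rows from the underlying sources only at query time.

```python
# Minimal sketch of a virtual data layer (hypothetical names): queries are
# answered on demand by federating rows from the underlying sources; no
# copy of the data is stored in the layer itself.

def orders_source():
    # Stand-in for a transaction system.
    return [{"order_id": 1, "customer_id": 10, "amount": 250.0}]

def customers_source():
    # Stand-in for an operational data store.
    return [{"customer_id": 10, "name": "Acme Corp"}]

def customer_orders_view():
    # The virtual "view": joins both sources at query time.
    customers = {c["customer_id"]: c for c in customers_source()}
    return [
        {"name": customers[o["customer_id"]]["name"], "amount": o["amount"]}
        for o in orders_source()
    ]
```

Each call to `customer_orders_view()` reflects whatever the sources return at that moment, which is the essence of up-to-the-minute delivery without replicated storage.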

High performance query algorithms and other optimization techniques ensure timely, up-to-the-minute data delivery.

Logical data models, in the form of tabular or hierarchical schemas, ensure data quality and completeness.

Standard APIs and an open architecture simplify the consumer-to-middleware-to-data source connections.

Data virtualization middleware platforms provide the functionality described above within integrated offerings that support the full software development life cycle, high-performance run-time execution and reliable, 24x7x365 operation.

How Data Virtualization Technology Works

The primary objects created and used in data virtualization are views and data services.

These objects encapsulate the logic necessary to access, federate, transform and abstract source data and deliver the data to consumers.

These objects can vary in scope and function depending on the business need, canonical information standards and other usage objectives. Individual objects can call other objects to perform additional functions. This is often done using a layered, or hierarchical, approach in which objects that perform application delivery functions call objects that perform transformation and conformance functions, which, in turn, call objects that perform source data access and validation functions.

The ability to reuse common objects in this way provides flexibility, accelerates new development and reduces costs.
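The layering described above can be sketched as three functions, one per layer (the layer and field names here are hypothetical, chosen only to illustrate the call chain):

```python
# Hypothetical sketch of the layered-object approach: a delivery-layer
# object calls a transformation-layer object, which calls a source-access
# object. Each layer can be reused by other delivery objects.

def source_access():
    # Source data access and validation layer: drop malformed rows.
    rows = [{"cust": " acme ", "usd": "100"}, {"cust": "globex", "usd": "x"}]
    return [r for r in rows if r["usd"].isdigit()]

def transform(rows):
    # Transformation and conformance layer: conform names and types.
    return [{"customer": r["cust"].strip().title(), "amount_usd": int(r["usd"])}
            for r in rows]

def delivery_view():
    # Application delivery layer: composes the layers below it.
    return transform(source_access())
```

A second delivery view could call the same `source_access` and `transform` objects, which is the reuse benefit the text describes.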

The grouping of objects related to a single domain or subject area, such as trades in financial services or projects in research and development, can be used to create the data virtualization equivalent of a subject-oriented data mart. Multiple domains can then be combined to create the virtual equivalent of a data warehouse.
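The mart-and-warehouse grouping can be pictured as nothing more than namespaces of views. In this hypothetical sketch (all names invented), a domain groups its views into a virtual "mart," and the marts combine into a virtual "warehouse":

```python
# Hypothetical sketch: views grouped by domain form a virtual data mart;
# multiple domain marts combine into a virtual data warehouse.

def trades_view():
    return [{"trade_id": 1, "symbol": "XYZ"}]

def positions_view():
    return [{"symbol": "XYZ", "qty": 500}]

def projects_view():
    return [{"project": "R&D-1"}]

# One domain's views = a virtual data mart.
trades_mart = {"trades": trades_view, "positions": positions_view}
projects_mart = {"projects": projects_view}

# Combining domain marts = a virtual data warehouse.
virtual_warehouse = {"trades": trades_mart, "projects": projects_mart}
```

New domains can be added to `virtual_warehouse` one at a time, which mirrors the phased, use-case-by-use-case adoption path described next.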

As a result, data virtualization can be adopted in a phased manner, starting with a narrow set of application use cases and expanding over time to a wider, enterprise-scale adoption.

A data virtualization platform consists of three primary middleware components that perform a full range of development, run time and management functions. These include:

  • Integrated Development Environment
  • Data Virtualization Server Environment
  • Management Environment

Integrated Development Environment

Data virtualization technology includes an integrated development environment (IDE) that can be used by a range of people, from business analysts to application developers, to define and implement the appropriate view and data service objects.

The foundation of these views and services is an underlying logical data model that is, in turn, based on either a tabular or hierarchical schema. Data quality requirements, such as standards conformance, enrichment, augmentation, validation and masking, as well as security controls (e.g., authentication and authorization), can also be implemented within these object definitions.
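A view definition that embeds both quality rules and security controls might look like the following minimal sketch (the field names, SSN format and privilege flag are assumptions made for illustration):

```python
import re

# Hypothetical sketch: validation, conformance and masking rules applied
# inside a view definition, with masking controlled by caller privilege.

def masked_customer_view(rows, caller_is_privileged=False):
    out = []
    for r in rows:
        # Validation: drop records whose SSN is malformed.
        if not re.fullmatch(r"\d{3}-\d{2}-\d{4}", r["ssn"]):
            continue
        # Masking: unprivileged callers see only the last four digits.
        ssn = r["ssn"] if caller_is_privileged else "***-**-" + r["ssn"][-4:]
        # Standards conformance: normalize the name's capitalization.
        out.append({"name": r["name"].title(), "ssn": ssn})
    return out
```

Because the rules live in the view definition, every consumer that queries the view gets the same validated, conformed and appropriately masked result.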

The IDE includes profiling-like introspection and relationship discovery capabilities designed to simplify each developer's understanding of existing data sources and jump-start the modeling process.

To limit the coding required and save development time, drag-and-drop modeling techniques and a rich set of pre-built, any-to-any transformations automatically generate view or data service objects. Multiple languages (SQL, XQuery, Java, etc.) can extend these capabilities to address more advanced data virtualization needs.

Standard source and consumer APIs, based on ODBC, JDBC, SOAP, REST, etc., simplify source data access and consumer delivery development activities.
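From the consumer's side, the key point is that a virtual view is reached through the same standard database APIs as any table. As a stand-in for an ODBC/JDBC connection, this sketch uses Python's built-in DB-API with an in-memory SQLite database; the table and view names are hypothetical:

```python
import sqlite3

# Sketch: a consumer queries a view through a standard database API.
# SQLite stands in here for an ODBC/JDBC connection to the middleware.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.execute("INSERT INTO orders VALUES (1, 250.0)")

# To the consumer, the view is indistinguishable from a physical table.
conn.execute(
    "CREATE VIEW order_totals AS SELECT SUM(amount) AS total FROM orders")
total = conn.execute("SELECT total FROM order_totals").fetchone()[0]
```

Swapping the connection for an ODBC or JDBC data source would leave the consuming SQL unchanged, which is what makes the open-API architecture simple to develop against.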

Integrated data governance capabilities, including lineage, where-used analysis, metadata asset management and versioning, provide the needed controls.

Data Virtualization Server Environment

In data virtualization, run-time activities are typically triggered by queries, or requests for data, from a consuming application. The data virtualization server is the component that executes these queries.

The query engine within the server, which is specifically designed to process federated queries across multiple sources in a wide-area network, optimizes and executes queries across one or more data sources as defined by the view or data service.

Cost- and rule-based optimizers automatically calculate the best query plan for each individual query from a wide variety of supported join techniques. Parallel processing, predicate push-down, scan multiplexing and constraint propagation techniques optimize database and network resources.
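Predicate push-down, one of the techniques named above, can be illustrated with a toy federated source (all data here is synthetic, invented for this sketch): instead of pulling every row across the network and filtering in the middleware, the optimizer ships the filter to the source so only matching rows are transferred.

```python
# Sketch of predicate push-down over a hypothetical remote source.

SOURCE = [{"region": "EU", "amount": i} for i in range(1000)] + \
         [{"region": "US", "amount": i} for i in range(10)]

def query_source(predicate=None):
    # If a predicate is supplied, it is evaluated "at the source",
    # so only matching rows cross the (simulated) network.
    rows = SOURCE
    if predicate:
        rows = [r for r in rows if predicate(r)]
    return rows

# Without push-down: all 1,010 rows transferred, then filtered locally.
naive = [r for r in query_source() if r["region"] == "US"]

# With push-down: only the 10 matching rows are transferred.
pushed = query_source(lambda r: r["region"] == "US")
```

Both approaches return the same answer; push-down simply moves the work to where the data lives, which is why it conserves database and network resources.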

The data virtualization server also does the following:

  • Transforms query result sets to ensure that the data is complete, high quality and consumable by the user.
  • Executes authentication and authorization security functions to protect data from improper use.
  • Caches appropriate data sets to enhance both performance and availability.
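The caching item above can be sketched as a simple time-to-live result cache (the function names and TTL policy here are hypothetical simplifications of what a real server would do):

```python
import time

# Hypothetical sketch of result-set caching: repeated queries within a
# time-to-live window are served from the cache instead of the sources.

_cache = {}

def cached_query(sql, fetch_from_source, ttl_seconds=60.0):
    now = time.monotonic()
    hit = _cache.get(sql)
    if hit and now - hit[0] < ttl_seconds:
        return hit[1]                  # cache hit: no source access
    rows = fetch_from_source(sql)      # cache miss: query the source
    _cache[sql] = (now, rows)
    return rows
```

Serving repeat queries from the cache improves both performance (no round trip to the source) and availability (recent results remain servable if a source is briefly unreachable).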

To complete the query, the server delivers the results directly to the consuming application and logs all activities.

Management Environment

Data virtualization servers are configured for development, testing, staging, production, back-up and failover operations.

To manage this topology, meet service-level agreements (SLAs) and ensure reliable 24x7x365 operations, the data virtualization platform also includes a complete set of integrated management tools.

These integrated tools support all the activities required to set up the data virtualization middleware and users, including provisioning the software, granting access to sources, integrating with LDAP and other security tools, etc.

System management tools manage server sessions and resources.

Monitoring tools log activities, monitor memory and CPU usage, as well as display key health indicators in dashboards.

Optional clustering tools improve workload sharing and synchronization across servers.

Data Virtualization Platform Examples

A number of enterprise software vendors provide data virtualization technology.

Several of these solutions are delivered as extensions to other technology platforms, such as BI, ETL or an enterprise service bus (ESB).

Others, such as the Composite Data Virtualization Platform from Composite Software, are complete, standalone data virtualization platforms.


With increasing pressure to move faster, save money and perform better, organizations have adopted data virtualization technology with successful results.

Data virtualization middleware platforms provide critical data integration capabilities that support the full software development life cycle, high-performance run-time execution and reliable, 24x7x365 operation.

When evaluating data virtualization offerings, keep in mind that different vendors have taken different approaches. The best selection requires that you consider not only functional capabilities, but also the domain expertise and complementary services each vendor can provide.

And finally, check references. Real users doing real work are the best test.

Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization.  This article includes excerpts from the book.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Masters of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.

