Data Virtualization Adoption Propelled by Significant Business Benefits

Faster, cheaper, better...data virtualization middleware platforms provide critical data integration capabilities

Enterprise adoption of data virtualization accelerated in 2011, propelled by organizations' growing need for greater business agility, lower costs and better performance.

These benefits were fully described in an earlier series of articles.
The success of data virtualization can now be observed across hundreds of organizations and is clearly evident in the ten case studies described in the recently published Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.

In this article, I will describe data virtualization at a high level and explain how data virtualization technology works.

What Is Data Virtualization?

Data virtualization is a data integration approach and technology used by innovative organizations to achieve greater business agility and reduce costs.

Data virtualization technology is a form of middleware that leverages high-performance software and an advanced computing architecture to integrate data from multiple, disparate sources and deliver it to both internal and external consumers in a loosely coupled, logically federated manner.

By implementing a virtual data integration layer between data consumers and existing data sources, the organization avoids the need for physical data consolidation and replicated data storage. Thus, data virtualization enables the organization to accelerate delivery of new and revised business solutions while also reducing both initial and ongoing solution costs.

Most front-end business applications, including BI, analytics and transaction systems, can access data through the data virtualization layer. Consumption is on demand from the original data sources, including transaction systems, operational data stores, data warehouses and marts, big data, external data sources and more.
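As a rough illustration of on-demand consumption through a virtual layer (the source names, schemas and the view function here are hypothetical, not any vendor's actual API), a virtual view joins data from disparate sources at query time without copying it into a new store:

```python
# Hypothetical sketch of a virtual view federating two sources on demand.
# No data is consolidated or replicated; the join happens at query time.

orders_source = [  # stands in for a transaction system
    {"order_id": 1, "customer_id": 10, "amount": 250.0},
    {"order_id": 2, "customer_id": 11, "amount": 75.0},
]
customers_source = [  # stands in for an operational data store
    {"customer_id": 10, "name": "Acme Corp"},
    {"customer_id": 11, "name": "Globex"},
]

def customer_orders_view():
    """Join the two sources on demand -- nothing is stored in the middle tier."""
    names = {c["customer_id"]: c["name"] for c in customers_source}
    for order in orders_source:
        yield {"customer": names[order["customer_id"]], "amount": order["amount"]}
```

A consuming application simply iterates the view; the underlying sources remain the system of record.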

High performance query algorithms and other optimization techniques ensure timely, up-to-the-minute data delivery.

Logical data models, in the form of tabular or hierarchical schemas, ensure data quality and completeness.

Standard APIs and an open architecture simplify the consumer-to-middleware-to-data source connections.

Data virtualization middleware platforms provide the functionality described above within integrated offerings that support the full software development life cycle, high-performance run-time execution and reliable, 24x7x365 operation.

How Data Virtualization Technology Works

The primary objects created and used in data virtualization are views and data services.

These objects encapsulate the logic necessary to access, federate, transform and abstract source data and deliver the data to consumers.

These objects can vary in scope and function depending on the business need, canonical information standards and other usage objectives. Individual objects can call other objects in order to perform additional functions. This is often done using a layered, or hierarchical, approach where objects that perform application delivery functions call objects that perform transformation and conformance functions which, in turn, call objects that perform source data access and validation functions.

The ability to reuse common objects in this way provides flexibility, accelerates new development and reduces costs.
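The layered approach described above can be sketched in Python, with each function standing in for a view or data service object (the trade data and layer names are illustrative assumptions, not a real implementation):

```python
# Illustrative three-layer object hierarchy: delivery -> conformance -> source access.

def source_access_view():
    """Bottom layer: raw source access and validation."""
    raw = [{"trade_id": "T1", "qty": "100"}, {"trade_id": "T2", "qty": "40"}]
    return [row for row in raw if row["trade_id"]]  # drop rows missing a key

def conformance_view():
    """Middle layer: transformation and conformance -- cast types, rename fields."""
    return [{"id": row["trade_id"], "quantity": int(row["qty"])}
            for row in source_access_view()]

def delivery_view(min_qty=0):
    """Top layer: application delivery -- shapes data for a specific consumer."""
    return [row for row in conformance_view() if row["quantity"] >= min_qty]
```

Because each layer is a separate object, the conformance and source-access layers can be reused by many delivery views, which is where the development acceleration comes from.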

The grouping of objects related to a single domain or subject area, such as trades in financial services or projects in research and development, can be used to create the data virtualization equivalent of a subject-oriented data mart. Multiple domains can then be combined to create the virtual equivalent of a data warehouse.

As a result, data virtualization can be adopted in a phased manner, starting with a narrow set of application use cases and expanding over time to a wider, enterprise-scale adoption.

A data virtualization platform consists of three primary middleware components that perform a full range of development, run time and management functions. These include:

  • Integrated Development Environment
  • Data Virtualization Server Environment
  • Management Environment

Integrated Development Environment

Data virtualization technology includes an integrated development environment (IDE) that can be used by a range of people, from business analysts to application developers, to define and implement the appropriate view and data service objects.

The foundation of these views and services is an underlying logical data model that is, in turn, based on either a tabular or hierarchical schema. Data quality requirements, such as standards conformance, enrichment, augmentation, validation and masking, as well as security controls (e.g., authentication and authorization), can also be implemented within these object definitions.

The IDE includes profiling-like introspection and relationship discovery capabilities designed to simplify each developer's understanding of existing data sources and jump-start the modeling process.

To limit the coding required and save development time, drag-and-drop modeling techniques and a rich set of pre-built, any-to-any transformations automatically generate view or data service objects. Multiple languages (SQL, XQuery, Java, etc.) can extend these capabilities to address more advanced data virtualization needs.

Standard source and consumer APIs, based on ODBC, JDBC, SOAP, REST, etc., simplify source data access and consumer delivery development activities.

Integrated data governance capabilities, including lineage and where-used analysis, metadata asset management and versioning, provide needed controls.

Data Virtualization Server Environment

In data virtualization, run-time activities are typically triggered by queries, or requests for data, from a consuming application. The data virtualization server is the component that executes these queries.

The query engine within the server, which is specifically designed to process federated queries across multiple sources in a wide-area network, optimizes and executes queries across one or more data sources as defined by the view or data service.

Cost- and rule-based optimizers automatically calculate the best query plan for each individual query from a wide variety of supported join techniques. Parallel processing, predicate push-down, scan multiplexing and constraint propagation techniques optimize database and network resources.
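Predicate push-down, one of the optimization techniques named above, can be illustrated with a simple planning function (the function and its SQL strings are a hypothetical sketch, not the actual query engine):

```python
# Illustrative predicate push-down decision: run the WHERE clause at the
# source when the source supports it, so less data crosses the network.

def plan_query(predicate_sql, source_supports_filter):
    """Return a tiny query plan: where does the filter execute?"""
    if source_supports_filter:
        # Push the predicate down into the source query itself.
        return {"source_query": f"SELECT * FROM trades WHERE {predicate_sql}",
                "local_filter": None}
    # Otherwise fetch everything and filter in the virtualization server.
    return {"source_query": "SELECT * FROM trades",
            "local_filter": predicate_sql}
```

A cost-based optimizer makes this kind of decision per source and per query, which is how network and database resources are conserved.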

The data virtualization server also does the following:

  • Transforms query result sets to ensure that the data is complete, high quality and consumable by the user.
  • Executes authentication and authorization security functions to protect data from improper use.
  • Caches appropriate data sets to enhance both performance and availability.

To complete the query, the server delivers the results directly to the consuming application and logs all activities.
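The caching behavior mentioned above can be sketched as a small time-to-live cache (a minimal illustration under assumed semantics, not the server's actual caching subsystem):

```python
import time

class ResultCache:
    """Cache federated query results so repeat queries can skip the sources."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._store = {}  # query text -> (timestamp, rows)

    def get(self, query):
        entry = self._store.get(query)
        if entry and time.time() - entry[0] < self.ttl:
            return entry[1]  # fresh hit: serve from cache
        return None          # miss or expired: caller re-queries the sources

    def put(self, query, rows):
        self._store[query] = (time.time(), rows)
```

Caching trades some data freshness for performance and availability, so it is typically applied selectively to the data sets where that trade-off is acceptable.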

Management Environment

Data virtualization servers are configured for development, testing, staging, production, back-up and failover operations.

To manage this topology, meet service-level agreements (SLAs) and ensure reliable 24x7x365 operations, the data virtualization platform also includes a complete set of integrated management tools.

These integrated tools support all the activities required to set up the data virtualization middleware and users, including provisioning the software, granting access to sources, integrating with LDAP and other security tools, etc.

System management tools manage server sessions and resources.

Monitoring tools log activities, monitor memory and CPU usage, as well as display key health indicators in dashboards.

Optional clustering tools improve workload sharing and synchronization across servers.

Data Virtualization Platform Examples

A number of enterprise software vendors provide data virtualization technology.

Several of these solutions are delivered as extensions to other technology platforms, such as BI, ETL or an enterprise service bus (ESB).

Others, such as the Composite Data Virtualization Platform from Composite Software, are complete, standalone data virtualization platforms.

Conclusion

With increasing pressure to move faster, save money and perform better, organizations have adopted data virtualization technology with successful results.

Data virtualization middleware platforms provide critical data integration capabilities that support the full software development life cycle, high-performance run-time execution and reliable, 24x7x365 operation.

When evaluating data virtualization offerings, keep in mind that different vendors have taken different approaches. The best selection will require you to consider not only functional capabilities, but also the domain expertise and complementary services that each vendor can provide.

And finally, check references. Real users doing real work are the best test.

Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization.  This article includes excerpts from the book.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Masters of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
