Five Ways Data Virtualization Improves Data Warehousing

Data virtualization fills the EDW agility gap

Business intelligence (BI), predictive analytics, data and content mining, portals and more tap a growing volume of information sourced from enterprise data warehouses (EDWs).  However, significant volumes of business-critical enterprise data reside outside the enterprise data warehouse.  To deliver the most comprehensive information to business decision-makers, IT teams are implementing data virtualization to preserve and extend their existing enterprise data warehouse investments.

This article discusses five integration patterns that combine enterprise data warehouses with data virtualization to solve real business and IT problems, along with examples from Composite Software's data virtualization customers.  The five patterns are:

  1. Data Warehouse Augmentation
  2. Data Warehouse Federation
  3. Data Warehouse Hub and Virtual Data Mart Spoke
  4. Complementing the ETL Process
  5. Data Warehouse Prototyping

Maximizing Value from Enterprise Data Warehouse Investments
Supporting critical, yet ever-changing information requirements in an environment of ever-increasing data volumes and complexity is a challenge well understood by large enterprises and government agencies today.

This inexorable pressure has driven, and will continue to drive, demand for enterprise data warehouses, as BI, predictive analytics, data and content mining, portals and other key applications rely on data sourced from them.

However, business change often outpaces enterprise data warehouse evolution.  And while enterprise data warehouses are useful for physically consolidating and transforming a large portion of enterprise data, significant volumes of enterprise data reside outside their confines.  Further, enterprise data warehouses themselves require support throughout their lifecycles, driving demand for solutions that prototype, migrate, extend, federate and leverage enterprise data warehouse assets.

Data virtualization middleware, an advanced version of earlier data federation or enterprise information integration (EII) middleware, complements enterprise data warehouses by providing a range of flexible data integration techniques that preserve, extend and thereby drive greater business value from existing enterprise data warehouse investments.

1. Data Warehouse Augmentation
Organizations overwhelmed by scattered data silos and exponentially growing data volumes have deployed data warehouses to meet many of their reporting requirements.  However, a number of data sources remain outside the warehouse.  Providing users with complete business insight in support of revenue, cost and risk management goals often requires the following:

  • Historical data from the warehouse and up-to-the-minute data from transaction systems or operational data stores;
  • Summarized data from the warehouse and drill-down detail from transaction systems or operational data stores;
  • Master customer, product or employee data from an MDM hub or warehouse and detail from transaction systems or operational data stores; and
  • Internal data from the warehouse and external data from outside sources including cloud computing.

Data virtualization effectively federates data warehouse information with additional sources, thereby extending existing data warehouse schemas and data.  These federated views make it straightforward to add current data to historical warehouse data, detailed data to summarized warehouse data, and external data to internal warehouse data.
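
The augmentation pattern can be sketched in miniature.  The snippet below uses SQLite (via Python's sqlite3 module) as a stand-in for the virtualization layer; the table, view and column names are invented for illustration, not drawn from any actual deployment.  The key idea is that consumers query a single federated view while the historical and up-to-the-minute rows stay in their separate stores.

```python
import sqlite3

# Historical rows live in the "warehouse", current rows in an "operational"
# store; a view unions them on demand so consumers never touch the silos.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE warehouse_sales (day TEXT, amount REAL);  -- historical
    CREATE TABLE ops_sales       (day TEXT, amount REAL);  -- up-to-the-minute
    INSERT INTO warehouse_sales VALUES ('2010-01-01', 100.0), ('2010-01-02', 150.0);
    INSERT INTO ops_sales       VALUES ('2010-01-03', 75.0);

    -- The "virtual" layer: one view extends the warehouse with current data.
    CREATE VIEW all_sales AS
        SELECT day, amount FROM warehouse_sales
        UNION ALL
        SELECT day, amount FROM ops_sales;
""")
rows = con.execute("SELECT day, amount FROM all_sales ORDER BY day").fetchall()
print(rows)  # historical plus current rows in one result set
```

In a real data virtualization deployment, the two tables would be a warehouse and a live transaction system on different servers; the principle of a single extended view is the same.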

Energy Company Combines Up-to-the-minute and Historical Data - To optimize deployment of repair crews and equipment across more than 10,000 production oil wells, an energy company uses data virtualization to federate real-time crew, equipment and well status data from their wells and SAP's maintenance management system with historical surface, subsurface and business data from their enterprise data warehouse.  The net result is faster repairs for more uptime and thus more revenue.

2. Data Warehouse Federation
A primary reason enterprises implement data warehouses is to overcome the transaction and analytic system silos typical of most large enterprises and government agencies today.  However, for a number of often pragmatic reasons, the single "enterprise" data warehouse remains elusive.  Instead, for these same reasons, multiple data warehouses and data marts have been developed and deployed, in effect perpetuating, rather than overcoming, the data silo problem.

Optimizing business performance requires data from across these various warehouses and marts.  But physically combining multiple marts and warehouses into a single, complete enterprise-wide data warehouse is often too costly and time-consuming.

Data virtualization federates multiple physical warehouses.  Two examples include combining data from the sales and financial warehouses, or combining two sales data warehouses after a corporate merger. This approach achieves logical consolidation of warehouses by creating an integrated view across them, using abstraction to rationalize the different schema designs.
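
The merger scenario gives a concrete shape to this pattern.  Below is a schematic sketch, again with SQLite standing in for the virtualization layer; the two "warehouses" are modeled as tables with deliberately different schemas, and every name is invented for illustration.  The abstraction layer rationalizes the two schema designs into one canonical shape.

```python
import sqlite3

# Two pre-merger sales warehouses with different schemas, modeled here as two
# tables in one SQLite database (in practice, separate physical systems).
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE us_sales_dw (cust_id INTEGER, rev_usd REAL);   -- firm A
    CREATE TABLE eu_sales_dw (customer INTEGER, revenue REAL);  -- firm B
    INSERT INTO us_sales_dw VALUES (1, 500.0);
    INSERT INTO eu_sales_dw VALUES (2, 480.0);

    -- Logical consolidation: one view rationalizes both schema designs
    -- into common column names, without physically merging the warehouses.
    CREATE VIEW all_sales AS
        SELECT cust_id  AS customer_id, rev_usd AS revenue, 'us' AS region FROM us_sales_dw
        UNION ALL
        SELECT customer AS customer_id, revenue AS revenue, 'eu' AS region FROM eu_sales_dw;
""")
total = con.execute("SELECT SUM(revenue) FROM all_sales").fetchone()[0]
print(total)  # enterprise-wide revenue across both warehouses
```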

Investment Bank Federates Financial Trading Data Warehouses - To enable more flexible customer self-service reporting and meet SEC compliance reporting mandates, a prime brokerage uses data virtualization to federate equity, fixed income and other investment position and trade information from siloed trading data warehouses.  The net result is higher customer satisfaction and lower reporting costs.

3. Data Warehouse Hub and Virtual Data Mart Spoke
A typical data warehouse pattern is a central data warehouse hub with satellite data marts as spokes around the hub.  These marts use a subset of the warehouse data and are used by a subset of the data warehouse users.   Sometimes these marts are created because the analytic tools require data in a different form than the warehouse.  On the other hand, they may be created to work around the controls provided by the warehouse, and thus act as "rogue" data marts.  Regardless of the reason, every additional mart adds cost and compromises data quality.

Data virtualization provides virtual data marts that eliminate, or at least significantly reduce, the need for physical data marts around the data warehouse hubs.  This approach abstracts the warehouse data to meet specific consuming tool and user query requirements, while still preserving the quality and controls inherent in the data warehouse.
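
A virtual data mart is, in essence, a governed view served straight from the warehouse.  The sketch below illustrates the idea with SQLite; the portfolio-flavored names are hypothetical, loosely echoing the mutual fund example that follows, and are not from any actual system.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE warehouse_positions (analyst TEXT, ticker TEXT, qty INTEGER, price REAL);
    INSERT INTO warehouse_positions VALUES
        ('ann', 'XYZ', 100, 10.0),
        ('ann', 'ABC',  50, 20.0),
        ('bob', 'XYZ', 200, 10.0);

    -- Virtual mart: a reusable, tool-friendly shape computed from the
    -- warehouse on demand -- no extracted copy to load, refresh or govern,
    -- and the warehouse's quality controls still apply underneath.
    CREATE VIEW mart_portfolio_value AS
        SELECT analyst, SUM(qty * price) AS market_value
        FROM warehouse_positions
        GROUP BY analyst;
""")
marts = con.execute("SELECT * FROM mart_portfolio_value ORDER BY analyst").fetchall()
print(marts)  # [('ann', 2000.0), ('bob', 2000.0)]
```

Each new analyst requirement becomes another view definition rather than another physical mart to build and maintain.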

Mutual Fund Manager Eliminates "Rogue" Financial Data Marts - A mutual fund company uses data virtualization to enable more than 150 financial analysts to build portfolio analysis models with MATLAB® and other analysis tools leveraging a wide range of equity financial data from a 10 terabyte financial research data warehouse.  Prior to introducing data virtualization, analysts frequently spawned new satellite data marts with useful data subsets for every new project.  To accelerate and simplify data access and to stop the proliferation of costly, unnecessary physical marts, the firm instead used data virtualization to create virtual data marts formed from a set of robust, reusable views that directly accessed the financial warehouse on demand.  This enables analysts to spend more time on analysis and less on access, thereby improving portfolio returns.  The IT team has also eliminated extra, unneeded marts and all the costs that go with maintaining them.

4. Complementing the ETL Process
Extract, Transform, and Load (ETL) middleware is the tool of choice for loading data warehouses.  However, there are some cases where ETL tools are not the most effective approach.  Some examples include:

  • ETL tools lack interfaces to easily access source data, for example data from packaged applications such as SAP or new technologies such as web services;
  • Readily available, existing virtual views or data services can be reused rather than building new ETL scripts from scratch; and
  • Tight batch windows require access, abstraction and federation activities to be pre-processed and virtually staged in advance of ETL processes.

ETL tools can leverage data virtualization views and data services as inputs to their batch processes; to the ETL tool, they appear as just another data source.  This integration pattern brings in data source types that ETL tools cannot easily access and reuses existing views and services, saving time and costs.  Further, these abstractions do not require ETL developers to understand the structure of, or interact directly with, the actual data sources, significantly simplifying their work and reducing time to solution.
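
The division of labor can be sketched as follows.  In this illustrative SQLite example (all names invented), the virtualization layer exposes a hard-to-reach source as a clean, pre-transformed view, and the ETL batch then reads that view exactly as it would any relational table.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- A source the ETL tool cannot read directly; in practice this would sit
    -- behind the virtualization layer (e.g. a packaged app or web service).
    CREATE TABLE raw_feed (id INTEGER, amount_cents INTEGER);
    INSERT INTO raw_feed VALUES (1, 1250), (2, 399);

    -- The virtualization layer abstracts it as a pre-transformed view...
    CREATE VIEW staged_feed AS
        SELECT id, amount_cents / 100.0 AS amount FROM raw_feed;

    CREATE TABLE warehouse_fact (id INTEGER, amount REAL);
""")
# ...and the ETL batch treats that view as just another relational source.
con.execute("INSERT INTO warehouse_fact SELECT id, amount FROM staged_feed")
loaded = con.execute("SELECT * FROM warehouse_fact").fetchall()
print(loaded)  # [(1, 12.5), (2, 3.99)]
```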

Energy Company Preprocesses SAP Data - To provide the SAP financial data required for their financial data warehouse, an energy company uses data virtualization to access and abstract SAP R/3 FICO data.  This replaces an error-prone, SAP data-expert-intensive, flat-file-extraction process that would not scale across a complex SAP landscape.  The results include more complete and timely data in the financial data warehouse enabling better performance management.

5. Data Warehouse Prototyping
Building a new data warehouse from scratch is a large undertaking that requires significant design, development and deployment efforts.  One of the biggest issues is schema change, a frequent activity early in a warehouse's lifecycle.   This change process requires modification of both the ETL scripts and physical data in the warehouse and thus becomes a bottleneck that slows new warehouse deployments.  This problem does not go away later in the lifecycle; it just lessens as the pace of change slows.

Data virtualization middleware can serve as the prototype development environment for a new data warehouse.  In this prototype stage, a virtual data warehouse is built rather than a physical one, saving the time required to build the physical warehouse.  This virtual warehouse includes a full schema that is easy to iterate, as well as a complete functional testing environment.  Performance testing is somewhat constrained at this stage, however.
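
What "easy to iterate" means in practice: a schema change is a view redefinition, not an ETL rewrite plus a physical reload.  The sketch below illustrates this with SQLite; the order-flavored names are hypothetical.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE src_orders (id INTEGER, cust TEXT, total REAL, ship_date TEXT)")
con.execute("INSERT INTO src_orders VALUES (1, 'acme', 99.0, '2010-03-01')")

# Iteration 1 of the prototype warehouse schema: just a view over the source.
con.execute("CREATE VIEW fact_orders AS SELECT id, cust, total FROM src_orders")

# Stakeholders ask for ship_date -- a schema change.  With a virtual
# prototype this is a quick view redefinition; no physical tables to alter,
# no ETL scripts to modify, no data to reload.
con.execute("DROP VIEW fact_orders")
con.execute("""CREATE VIEW fact_orders AS
               SELECT id, cust, total, ship_date FROM src_orders""")
facts = con.execute("SELECT * FROM fact_orders").fetchall()
print(facts)  # [(1, 'acme', 99.0, '2010-03-01')]
```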

Once the actual warehouse is deployed, the views and data services built during the prototype stage still have value.  These are useful for prototyping and testing subsequent warehouse schema changes that arise as business needs or underlying data sources change.

Government Agency Prototypes New Data Warehouses - To reduce time-to-solution for new data warehouse projects and changes to existing ones, a government agency uses data virtualization.  Getting the data right this way has proven four times faster than building the ETL and warehouse directly, even when the subsequent translation of the working views into ETL scripts and physical warehouse schemas is factored in.

Key Takeaways
As data sources proliferate, including many web-based and cloud computing sources outside the traditional enterprise data warehouse, enterprises and government agencies are deploying solutions that combine enterprise data warehouses and data virtualization to deliver the most comprehensive information to decision-makers.  The results are extended life to existing information system investments, greater agility for adding new BI and other analytic technologies, and less disruption from corporate activities such as mergers and acquisitions.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
