What's So Great About Data Virtualization?

Value add from an additional data integration approach

Business Change Guides Data Integration Strategy
Traditional data integration approaches such as consolidation and replication alone cannot keep pace with today's dynamic business environment and ever-changing information requirements. Seeking greater agility, lower costs, and less risk, organizations are increasingly adopting data virtualization.

Gain Agility and Cut Costs with Data Virtualization
Data virtualization is high-performance query middleware that integrates data from multiple, disparate sources - anywhere across the extended enterprise - in a unified, logically virtualized manner for on-demand consumption by a wide range of business solutions.
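To make this concrete, here is a minimal Python sketch of the federation idea: two independent sources are queried on demand and combined in memory into one unified result. The tables, schemas, and connection approach are hypothetical stand-ins for illustration, not any particular product's API.

```python
import sqlite3

# Two independent "sources" standing in for disparate systems
# (hypothetical schemas; a real deployment would connect to the
# actual CRM, warehouse, web service, etc.).
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])

erp = sqlite3.connect(":memory:")
erp.execute("CREATE TABLE orders (customer_id INTEGER, total REAL)")
erp.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, 250.0), (1, 75.5), (2, 900.0)])

def customer_order_totals():
    """A 'virtual view': federate both sources on demand, in memory,
    without copying data into a consolidated store."""
    totals = {}
    for customer_id, total in erp.execute(
            "SELECT customer_id, SUM(total) FROM orders GROUP BY customer_id"):
        totals[customer_id] = total
    # Join against the CRM source and present one unified result set.
    return [
        {"customer": name, "order_total": totals.get(cid, 0.0)}
        for cid, name in crm.execute("SELECT id, name FROM customers")
    ]

print(customer_order_totals())
# [{'customer': 'Acme', 'order_total': 325.5},
#  {'customer': 'Globex', 'order_total': 900.0}]
```

The consuming application sees one result set; neither source's data is physically moved or duplicated.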

Data virtualization has a simple value proposition: it provides a more agile, lower-cost data integration approach that overcomes data complexity and disparate silos, giving the business the timely data it needs to meet ever-changing requirements. Data virtualization delivers the following benefits:

  • Improve Business Performance – Provide the information required to increase revenue and improve productivity;
  • Increase Agility – Beat competitors by responding faster to new and rapidly changing information demands;
  • Reduce Costs – Save staff and infrastructure resources from the start and then compound these savings over time;
  • Decrease Risk – Increase IT project success via rapid development and quick iterations;
  • Ensure Compliance – Meet regulatory compliance data requirements faster and at lower cost.

Overcome Difficult Data Integration Challenges
Data virtualization addresses enterprise-scale data integration requirements while avoiding the higher costs and longer lead times associated with data consolidation and replication.

These challenges include:

  • Business Need for Timely Insight – Up-to-the-minute data is a key business requirement. Data virtualization's query optimization algorithms and techniques deliver the timely information required, whenever needed, without impacting source system performance.
  • Business Need for Complete Picture – Data frequently needs to be combined with other data to give the business the full picture. Data virtualization's data federation virtually integrates internal and external data in memory, without the cost and overhead of physical data consolidation.
  • Data Proliferation – Identifying and understanding data assets distributed across a range of fit-for-purpose repositories and locations requires significant manual effort. Data virtualization's data discovery saves time and money by automating entity and relationship identification and accelerating data modeling.
  • Data Complexity – Incredible complexity challenges IT's ability to leverage existing data for new business uses. Data virtualization's powerful data abstraction tools simplify complex data, transforming it from native structures and syntax into easy-to-understand, reusable views and data services with common semantics (a sketch of this mapping follows the list).
  • Data Availability – With so many technologies, formats and standards, successfully surfacing data consumes significant IT resources. Data virtualization's numerous standards-based data access, caching and delivery options flexibly publish all the information business users require.
  • Limited Control – Data is a critical asset that must be governed. Data virtualization's data governance centralizes metadata management, ensures data security, improves data quality and runs reliably 24x7 across scalable clustered servers to maximize control.
  • Environment of Non-Stop Change – New business requirements, new applications and new data sources make frequent change inevitable. Data virtualization's loosely coupled virtualization layer, rapid development tools, automated impact analysis and extensible architecture provide the information agility required to keep pace.
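
As a rough illustration of the data abstraction point above, the following hypothetical Python sketch maps two differently shaped native records (a legacy fixed-column row and a nested SaaS document, both invented for this example) into one reusable canonical view with common semantics:

```python
from dataclasses import dataclass

# Canonical, business-friendly shape shared by all consumers
# (hypothetical fields chosen for illustration).
@dataclass
class Customer:
    customer_id: str
    name: str
    country: str

# Native records as they might arrive from two sources with
# different structures and naming conventions.
legacy_row = {"CUST_NO": "0042", "CUST_NM": "ACME CORP", "CTRY_CD": "US"}
saas_json = {"id": "8f3a", "profile": {"displayName": "Globex", "region": "DE"}}

def from_legacy(row: dict) -> Customer:
    """Abstract the mainframe-style column names into common semantics."""
    return Customer(row["CUST_NO"].lstrip("0"), row["CUST_NM"].title(), row["CTRY_CD"])

def from_saas(doc: dict) -> Customer:
    """Flatten the nested SaaS document into the same canonical view."""
    profile = doc["profile"]
    return Customer(doc["id"], profile["displayName"], profile["region"])

# Every consumer now works with one reusable view, regardless of origin.
for customer in [from_legacy(legacy_row), from_saas(saas_json)]:
    print(customer)
```

Consumers code against the Customer view alone; when a source's native format changes, only the corresponding mapping function needs to be updated.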

Mix and Match Data Virtualization with other Data Integration Tools
Organizations can flexibly deploy data virtualization to meet a range of integration needs for BI and analytics at both project and enterprise scale. Further, data virtualization is often used in combination with data consolidation and data replication, giving architects the widest set of data integration techniques to address each new information challenge.
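
One common way to mix these techniques is to put a lightweight replicated copy (a cache) in front of a live federated query, trading a little freshness for speed. The Python sketch below shows a simple cache-aside pattern under assumed requirements (a hypothetical five-minute staleness tolerance); it is illustrative, not a prescription:

```python
import time

def live_federated_query():
    """Hypothetical stand-in for an expensive federated query
    against live sources (see the earlier federation sketch)."""
    return [{"customer": "Acme", "order_total": 325.5}]

_cache = {"data": None, "fetched_at": 0.0}
TTL_SECONDS = 300  # assumed: this consumer tolerates five-minute-old data

def cached_view():
    """Serve a lightweight replicated copy when freshness allows;
    fall through to live federation when it does not."""
    now = time.time()
    if _cache["data"] is None or now - _cache["fetched_at"] > TTL_SECONDS:
        _cache["data"] = live_federated_query()  # refresh the replica
        _cache["fetched_at"] = now
    return _cache["data"]

print(cached_view())  # first call hits the sources
print(cached_view())  # second call is served from the cache
```

Consumers that need up-to-the-minute data bypass the cache and federate live; consumers that can tolerate some staleness get replicated-copy performance. The architect chooses per use case.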

See For Yourself
If your organization has yet to adopt data virtualization, do not worry. Even though it is the fastest growing segment of the data integration market, actual penetration is less than 25% today. If you start now, you won't be too far behind the leaders. But don't wait too long to see for yourself "What's So Great About Data Virtualization?"

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
