Is Now the Right Time for Data Virtualization?

Leading BI analyst says “Data virtualization: the time has come”

A Tectonic Shift - Think Data Virtualization First!
Wayne Eckerson, Director of Research at TechTarget and former head of research for TDWI, "used to think that data virtualization tools were great for niche applications, such as creating a quick and dirty prototype or augmenting the data warehouse with real-time data in an operational system or accessing data outside the corporate firewall."

But in his October 17, 2011 article entitled Data Virtualization: The Time Has Come, he now believes "that data virtualization is the key to creating an agile, cost-effective data management infrastructure.  In fact, data architects should first design and deploy a data virtualization layer prior to building any data management or delivery artifacts."

This tectonic shift in thinking demonstrates how dynamic business model changes, including innovative new offerings, changing competitive landscapes and M&A activity, along with new information technologies such as big data, cloud computing and data virtualization, are breaking traditional IT architectures and approaches.

Data Virtualization Seen Through a Second Lens
As the marketing leader for a data virtualization software company, Composite Software, I find it gratifying when a well-regarded IT analyst such as Wayne takes a firm position in favor of our category and, at the same time, communicates this message so articulately to the business and IT community.

Further, such an article provides a great opportunity to contrast Wayne's description of data virtualization's business benefits and technology with my own.  Let's do this comparison together to see what we can learn from his new insights.

Business Benefits a la Composite Software
In my recent book, "Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility," I defined data virtualization's business benefits in the context of the universal challenge of business agility.  From our customer case studies, we at Composite Software have seen data virtualization used by innovative organizations to achieve business agility in three ways:

  • Business decision agility - Data virtualization delivers the complete high-quality, actionable information required for agile business decision making.
  • Time-to-solution agility - Data virtualization uses a streamlined approach, an iterative development process and ease of change to significantly accelerate IT time to solution.
  • Resource agility - Data virtualization directly enables greater resource agility through superior developer productivity, lower infrastructure costs and better optimization of data integration solutions.

Another Take on Data Virtualization's Business Benefits
In Wayne's article, he too addresses these business decision, time-to-solution and resource agility points using slightly different terminology.  "With data virtualization, organizations can integrate data without physically consolidating it. In other words, they don't have to build a data warehouse or data mart to deliver an integrated view of data, which saves considerable time and money. In addition, data virtualization lets administrators swap out or redesign back-end databases and systems without affecting downstream applications.

The upshot is that IT project teams can significantly reduce the time they spend sourcing, accessing, and integrating data, which is the lion's share of work in any data warehousing project.  In other words, data virtualization speeds project delivery, increases business agility, reduces costs, and improves customer satisfaction. What's not to like?"

My Data Virtualization Technology Description
In the book, I described data virtualization technology as "a form of middleware that leverages high-performance software and an advanced computing architecture to integrate data from multiple, disparate sources and deliver it to both internal and external consumers in a loosely coupled, logically federated manner.

By implementing a virtual data integration layer between data consumers and existing data sources, the organization avoids the need for physical data consolidation and replicated data storage. Thus, data virtualization enables the organization to accelerate delivery of new and revised business solutions while also reducing both initial and ongoing solution costs.

Most front-end business applications, including BI, analytics and transaction systems, can access data through the data virtualization layer.  Consumption is on demand from the original data sources, including transaction systems, operational data stores, data warehouses and marts, big data, external data sources and more.

High performance query algorithms and other optimization techniques ensure timely, up-to-the-minute data delivery.  Logical data models, in the form of tabular or hierarchical schemas, ensure data quality and completeness.  Standard APIs and an open architecture simplify the consumer-to-middleware-to-data source connections."
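The on-demand federation described above can be illustrated with a toy sketch. To be clear, this is not Composite Software's implementation or any vendor's API; every class and field name below is hypothetical. The example only shows the core idea: a logical view joins two sources at query time rather than copying them into a consolidated store.

```python
# Illustrative sketch only: a toy "virtual data integration layer" that
# federates two disparate sources on demand. All names are hypothetical.

class DictSource:
    """Adapter over an in-memory source (e.g., a transaction system)."""
    def __init__(self, rows):
        self._rows = rows

    def fetch(self):
        # Rows are read on demand; nothing is replicated elsewhere.
        return list(self._rows)

class VirtualView:
    """A logical view that joins two sources on a shared key at query time."""
    def __init__(self, left, right, key):
        self.left, self.right, self.key = left, right, key

    def query(self):
        # Build a lookup over the right-hand source, then inner-join.
        index = {row[self.key]: row for row in self.right.fetch()}
        for row in self.left.fetch():
            match = index.get(row[self.key])
            if match:
                yield {**match, **row}

# One source plays the "transaction system," the other the "warehouse."
orders = DictSource([{"cust_id": 1, "total": 250.0}])
customers = DictSource([{"cust_id": 1, "name": "Acme Corp"}])
view = VirtualView(orders, customers, key="cust_id")
print(list(view.query()))
# [{'cust_id': 1, 'name': 'Acme Corp', 'total': 250.0}]
```

A real engine adds query optimization, pushdown to the sources, caching and standard APIs on top of this basic pattern, but the consumer-facing contract is the same: one logical schema, data fetched from the original systems only when queried.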

Data Virtualization Technology Described Another Way
In Wayne's article, he describes data virtualization technology in a similar fashion.

"Data virtualization software makes data spread across physically distinct systems appear as a set of tables in a local database.  Business users, developers, and applications query this virtualized view and the software automatically generates an optimized set of queries that fetch data from remote systems, merge the disparate data on the fly, and deliver the result to users.

Data virtualization software consumes virtually any type of data, including SQL, MDX, XML, Web services, and flat files and publishes the data as SQL tables or Web services. Essentially, data virtualization software turns data into a service, hiding the complexity of back-end data structures behind a standardized information interface."
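Wayne's picture, in which remote data "appears as a set of tables in a local database," can be approximated with nothing but Python's standard library: two physically separate SQLite files attached to one connection, with a view presenting the joined result as if it were a single local table. This is an analogy only; real data virtualization engines federate heterogeneous remote systems and optimize the distributed queries.

```python
# Analogy only: two separate database files stand in for "physically
# distinct systems"; a view makes their joined data look local.
import os
import sqlite3
import tempfile

tmp = tempfile.mkdtemp()
crm_path = os.path.join(tmp, "crm.db")
erp_path = os.path.join(tmp, "erp.db")

# Two "remote" systems, each in its own database file.
with sqlite3.connect(crm_path) as crm:
    crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    crm.execute("INSERT INTO customers VALUES (1, 'Acme Corp')")
with sqlite3.connect(erp_path) as erp:
    erp.execute("CREATE TABLE orders (cust_id INTEGER, total REAL)")
    erp.execute("INSERT INTO orders VALUES (1, 250.0)")

# One local connection attaches both and defines a virtual, joined view.
conn = sqlite3.connect(crm_path)
conn.execute(f"ATTACH DATABASE '{erp_path}' AS erp")
conn.execute("""
    CREATE TEMP VIEW customer_orders AS
    SELECT c.name, o.total
    FROM customers AS c JOIN erp.orders AS o ON c.id = o.cust_id
""")
print(conn.execute("SELECT * FROM customer_orders").fetchall())
# [('Acme Corp', 250.0)]
```

The consumer queries `customer_orders` like any local table while the data stays in its source files, which is the essence of the "standardized information interface" Wayne describes.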

Data Virtualization's Time Has Come!
Critical business issues such as new product innovation, competition, M&A and more put increasing stress on IT to respond faster with new information solutions.  Old approaches and technologies won't keep pace.

The decision-making, time-to-solution and resource agility benefits of data virtualization in addressing these business needs are undeniable.  And as you can see, while not always described 100% consistently, data virtualization's value and solution are being increasingly recognized by IT mavens such as Wayne Eckerson.

Wayne said it best. "Data Virtualization: The Time Has Come!"

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
