Gigamon Announces Vision of NFV for Tools

Analytic Data Arbitration Concept to Enable Elastic Big Data Analytics Compute Model

Gigamon on Thursday announced a concept for the future of network monitoring in highly virtualized service provider environments. Gigamon calls this approach NFV for Tools (NFVfT), and the vision is to standardize the APIs and the demarcation point through which network traffic and Big Data are brokered into an elastic compute architecture for analysis. Gigamon proposes to offer visibility arbitration so that Big Data analytics tools can virtualize their functions and process data on demand, enabling a per-unit-of-information-processed pricing model. This vision would create a new paradigm for more effective processing of Big Data by Customer Experience Monitoring (CEM), troubleshooting and Quality of Service (QoS) tools, as well as by OSS/BSS functions and other monitoring and analysis solutions.
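
To make the proposed pricing model concrete, the following is a minimal Python sketch of per-unit-of-information-processed billing, assuming a flat per-gigabyte rate. Every name here (UsageMeter, rate_per_gb, the rates and volumes) is hypothetical; Gigamon has not published an API for this concept.

    # Hypothetical sketch of the per-unit-of-information-processed pricing
    # model described above. Class and rate names are illustrative only;
    # no Gigamon API is implied.
    from dataclasses import dataclass

    @dataclass
    class UsageMeter:
        """Meters the volume of data an analytic tool processes on demand."""
        rate_per_gb: float        # assumed price per gigabyte analyzed
        gb_processed: float = 0.0

        def record(self, gigabytes: float) -> None:
            # Accumulate the amount of brokered traffic sent to the tool.
            self.gb_processed += gigabytes

        def invoice(self) -> float:
            # Charge only for data actually analyzed, not for idle capacity.
            return self.gb_processed * self.rate_per_gb

    # Example: a CEM tool analyzes three bursts of brokered traffic.
    meter = UsageMeter(rate_per_gb=0.05)
    for burst_gb in (120.0, 75.5, 310.2):
        meter.record(burst_gb)
    print(f"Billable: {meter.gb_processed:.1f} GB -> ${meter.invoice():.2f}")

The point of the sketch is that the tool vendor's revenue scales with data actually processed rather than with appliances sold, which is what would make an elastic, on-demand compute model commercially viable.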

"Big Data is changing the status quo for mobile carriers. The current business model for service providers could become problematic as they fund the rising cost of transporting this data," said Andy Huckridge, Director of Service Provider Solutions at Gigamon. "Many service providers have now realized the value of the Big Data in their pipes and are now in the process of enabling the monetization of that data. However, this model begins to break down with the legacy analytic tool vendors. The NFV for Tools concept empowers the tools of the future to analyze the bandwidth of the future with the end goal of enabling the monetization of Big Data."

The NFV for Tools concept revolves around the ability of Gigamon's Visibility Fabric™ to normalize, filter and forward data into a storage medium through a virtual demarcation point. The vision is to use the Orchestration Layer of the Unified Visibility Fabric architecture, together with APIs developed for it, to discover which tools are available to the Visibility Fabric and what their capabilities are. This enables what Gigamon calls the Analytic Data Arbitration Function (ADAF), which brokers analytic tools against the supply of data that needs to be processed or analyzed. The ADAF is in many ways similar to a portal, but one with a provisioning and arbitration function that matches vendors of analytic tools with those who supply the data.
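
The arbitration step itself can be pictured as a small matchmaking loop. Below is a minimal Python sketch of an ADAF, assuming tools advertise a capability set and remaining capacity through the discovery APIs; every class, field and value shown (AnalyticTool, DataSupply, vProbe-A) is hypothetical and stands in for whatever the standardized APIs would eventually expose.

    # Hypothetical sketch of the Analytic Data Arbitration Function (ADAF)
    # described above: tools register themselves (discovery), then incoming
    # data supplies are brokered to a tool with a matching capability and
    # spare capacity. All names are illustrative; no Gigamon API is implied.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class AnalyticTool:
        name: str
        capabilities: set        # e.g. {"cem", "qos"}, tool-advertised
        capacity_gbps: float     # remaining elastic compute headroom

    @dataclass
    class DataSupply:
        source: str
        kind: str                # type of analysis the data requires
        rate_gbps: float

    class ADAF:
        def __init__(self) -> None:
            self.tools = []

        def register(self, tool: AnalyticTool) -> None:
            # Discovery step: a tool advertises itself via the fabric's APIs.
            self.tools.append(tool)

        def arbitrate(self, supply: DataSupply) -> Optional[AnalyticTool]:
            # Broker to the first tool with the right capability and headroom.
            for tool in self.tools:
                if supply.kind in tool.capabilities and \
                        tool.capacity_gbps >= supply.rate_gbps:
                    tool.capacity_gbps -= supply.rate_gbps
                    return tool
            return None  # no tool available; the supply waits or is filtered

    # Example: broker a QoS feed to an available virtualized tool.
    adaf = ADAF()
    adaf.register(AnalyticTool("vProbe-A", {"cem", "qos"}, capacity_gbps=10.0))
    match = adaf.arbitrate(DataSupply("cell-site-42", "qos", rate_gbps=2.5))
    print("Brokered to:", match.name if match else "no tool available")

A real implementation would sit in the Orchestration Layer and also handle provisioning and billing hooks, but the essential brokering decision reduces to this capability-and-capacity match.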

"Vistapointe was founded on this exact NFV vision of decoupling the analytic tools from the underlying custom hardware probes and enabling this functionality via software on x86 compute platforms. This architecture eliminates the current method of deploying multiple custom probe-appliances, and leverages the existing data-center compute infrastructure, therefore drastically reducing the total cost of ownership," said Ravi Medikonda, CEO, Vistapointe Inc. "With an integrated solution of Vistapointe software analytic tools and Gigamon's innovative NFVfT, the future of service provider monitoring will create a cost-effective and scalable platform for big-data applications."

"This concept of NFV for tools is a great idea, but could be disruptive for the incumbent vendors if they don't innovate to keep up. It should help carriers generate revenue through Big Data analysis while reducing CAPEX and OPEX and accelerating service deployment," said Ray Mota, Managing Partner at ACG Research. "The dynamic brokering of Big Data analytic services will enable carriers to perform analytics on demand and could certainly create an invaluable new model for them to analyze the vast amounts of data on their networks."

 

About the Visibility Fabric architecture
At Gigamon, we realized that delivering the visibility essential to manage, analyze and secure the complex system that is the IT infrastructure requires a new approach. With millions of traffic flows across thousands of endpoints, visibility needs to be pervasive, intelligent and dynamic. Using our patented technology, we created the Visibility Fabric architecture, an intelligent and versatile approach to enabling visibility into the network. For more information, visit http://www.gigamon.com/visibility-fabric-architecture.

About Gigamon
Gigamon provides an intelligent Visibility Fabric™ architecture for enterprises, data centers and service providers around the globe. Our technology empowers infrastructure architects, managers and operators with pervasive, dynamic and intelligent visibility into traffic across both physical and virtual environments without affecting the performance or stability of the production network. Through patented technologies and centralized management, the Gigamon GigaVUE portfolio of high-availability, high-density products intelligently delivers the appropriate network traffic to management, analysis, compliance and security tools. Gigamon has over eight years' experience designing and building traffic visibility products in the US, and its solutions are deployed globally across vertical markets, including over half of the Fortune 100 and many government and federal agencies. www.gigamon.com
