VERITAS Celebrates 15 Years - Next Big Focus: Grid Computing

VERITAS Software reached a milestone today as it celebrated its 15th anniversary as a consistent innovator and leader in providing computing solutions to customers.

Rather than focusing solely on its achievements on such a day, VERITAS is taking the time to reaffirm a business philosophy that puts serving customers and their needs ahead of everything else. Moreover, in keeping with its history as an innovator, the company announced that it is positioned to bring the next computing paradigm - utility computing - to its customers.

Co-founder Dale Shipley formed Tolerant Software in 1988 with the noble-minded intention of fostering engineering excellence through software development aimed at ensuring the availability of computing systems and mission-critical applications. VERITAS Software emerged in 1989 from Tolerant Software, which itself had emerged from Tolerant Transaction Systems. Fast-forward fifteen years and one finds a leading independent provider of software enabling utility computing, with over 7,100 employees and business operations in more than 40 countries around the world.

Reflecting upon the modest roots of VERITAS, Shipley said:

"From the beginning in 1989 when we formed VERITAS Software the cornerstone of our success centered upon convincing and securing the support of AT&T UNIX System Laboratories, Inc. to trust us to extract the valuable code from the Tolerant Systems high availability operating system and develop and deliver the industry's first robust UNIX System V transaction-based volume manager and file system thereby contributing the early success of AT&T and Sun Microsystems in the promising UNIX marketplace."

In describing how intelligent risk-taking enabled VERITAS to flourish, Shipley added, "It was this calculated risk that opened the door to the broad horizontal adoption of VERITAS software by the myriad of UNIX systems vendors and paved the path toward the company's leadership in storage management and data protection. The company's initial value proposition still holds true today: a company's most valuable IT assets are not the hardware systems, but rather the data that resides on and runs across them, and the protection and availability of that data is paramount."

When VERITAS began developing its business plan, the company kept things remarkably focused and simple, a hallmark of many great business ventures. Just starting to make its way in the world, the company settled on three priorities as its focal point: eliminating system downtime; simplifying overly complex tasks, including the development of a Graphical User Interface (GUI) to visualize storage management and administrative tasks; and maintaining a maniacal focus on enhancing system performance.

At a time when customers' file systems were "naturally" serviced by the manufacturer of their operating system, VERITAS' proposition, calling for the independent servicing of enterprise file systems, seemed absurd. However, the company not only convinced customers that a start-up could provide better support for their file systems, it backed the claim up with objective benchmarks so customers could see what they were gaining. VERITAS Volume Manager was introduced in 1990 and VERITAS File System in 1991; the two remain flagship products and can be found at the heart of nearly 70 percent of the world's data centers today.

Highlighting the company's customer-centric philosophy, John Colgrove, VERITAS Fellow and 15-year company veteran, said, "If you listen to customers' problems, what they're trying to do, and their goals, you get ideas for how you can create solutions to solve them. This remains our long-term approach, and we're working on some very interesting things for customers that will really deliver huge value when they choose our software."

VERITAS' customer base encompasses both small businesses and large corporations running highly complex enterprise systems, and both segments have helped make VERITAS the industry standard for data backup and protection. VERITAS merged with OpenVision in 1997 and delivered its first release of VERITAS NetBackup software to customers. The company's engineers designed the software to provide maximum backup capability with the least disruption to end users. It is not surprising, then, that NetBackup is the No. 1 backup and recovery software, allowing companies to reduce costs by taking advantage of advanced enterprise features such as synthetic backups, disk-based protection, automated disaster recovery, and desktop and laptop protection.

As the company advances, it brings to its customers a history steeped in providing solutions that are not only cutting-edge but also highly effective at reducing costs and preparing for the changes ahead. With grid computing, IT managers are being encouraged to shift from reactive problem resolution to proactive availability and failover testing of servers, storage systems, and applications before problems affect the business. By combining VERITAS Cluster Server with VERITAS i3, VERITAS OpForce, and VERITAS UpScale software (to be released in 2005), IT managers will see how much automating the IT infrastructure can save their company.
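
The proactive model described here can be illustrated with a minimal Python sketch of a health-check-and-failover decision. This is a conceptual example only; the hosts, port, and function names are hypothetical and are not part of VERITAS Cluster Server or any other VERITAS product.

import socket

def is_healthy(host, port, timeout=2.0):
    """Hypothetical health check: can we open a TCP connection to the service?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def choose_active(primary, standby, port=8080):
    """Prefer the primary node; fail over to the standby if the primary is down."""
    if is_healthy(primary, port):
        return primary
    if is_healthy(standby, port):
        return standby
    raise RuntimeError("neither node is available")

# Example (hypothetical hosts):
# active = choose_active("app-primary.example.com", "app-standby.example.com")
# print("routing traffic to", active)

Checking availability on a schedule, rather than waiting for users to report an outage, is the essence of the shift from reactive to proactive operations.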

Oleg Kiselev, technical director and 14 year veteran, VERITAS, speaking of utility computing said, "Utility computing is much like the box filled with wooden building blocks at your child's school; one day the children share the blocks to build a castle and a space station and the next day they use the same blocks to build whatever their imaginations come up with. Now apply that analogy to a box full of IT building blocks-- storage, servers, switches, and software. With VERITAS software, IT can provision those resources when they're needed, where they're needed, as they're needed and when they are no longer needed they can place them back in a resource pool or put them back in the box. Add to this the ability to automatically specify and maintain agreed upon application service and availability levels and provide fine-grain usage accounting and you have a powerful set of building blocks. In a nutshell our approach is to deliver the software building blocks that enable utility computing."
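
To make Kiselev's building-blocks analogy concrete, here is a short, purely illustrative Python sketch of a shared resource pool with provisioning, release, and simple usage accounting. The class and names are hypothetical and do not correspond to any VERITAS API.

import time
from collections import defaultdict

class ResourcePool:
    """Hypothetical pool of IT 'building blocks' (servers, storage, switches)."""

    def __init__(self, resources):
        self.available = list(resources)          # blocks waiting in the box
        self.in_use = {}                          # resource -> (consumer, start time)
        self.usage_seconds = defaultdict(float)   # fine-grained usage accounting

    def provision(self, consumer):
        """Hand a free block to a consumer when, where, and as it is needed."""
        if not self.available:
            raise RuntimeError("no free resources in the pool")
        resource = self.available.pop()
        self.in_use[resource] = (consumer, time.monotonic())
        return resource

    def release(self, resource):
        """Put the block back in the box and record how long it was used."""
        consumer, started = self.in_use.pop(resource)
        self.usage_seconds[consumer] += time.monotonic() - started
        self.available.append(resource)

# Example: provision a server for a workload, then return it to the pool.
pool = ResourcePool(["server-1", "server-2", "storage-1"])
server = pool.provision("payroll-app")
# ... run the workload ...
pool.release(server)
print(dict(pool.usage_seconds))   # per-consumer usage, for chargeback-style accounting

The same pattern, applied to real servers, storage, and network gear and driven by service-level policies, is what is meant here by software building blocks for utility computing.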

VERITAS today is one of the 10 largest software companies in the world. Using a utility computing model, VERITAS aligns IT resources with business needs. Ninety-nine percent of Fortune 500 companies use VERITAS products for data protection, storage and server management, and application performance management.
