Be Thankful for Service Virtualization & Simulated Test Environments

Test earlier, faster, and more completely

To reduce the risk of business interruption in today's interconnected systems, organizations need to test across a complex set of applications: SAP, mainframes, third-party services, and so on. However, such systems are extraordinarily difficult to access for the purpose of testing. Service virtualization provides simulated test environments that eliminate these constraints, enabling organizations to test earlier, faster, and more completely. Here are 10 specific reasons to be thankful for service virtualization...

Thankful For Service Virtualization

10. Dev/QA control over the test environment, inclusive of dependencies
Development and QA often need to jump through hoops to get access to the test environments required to complete their development and testing tasks. Even worse, when a test environment finally becomes available, it typically lacks the applications that lie beyond the organization's control. Service virtualization, with its test environment simulation technology, gives development and QA access to all the relevant application dependencies, including third-party applications, so they can create complete test environments on demand.
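
An on-demand simulated dependency can be as simple as an in-process stand-in for a service the team cannot reach. The sketch below uses a hypothetical third-party "credit check" API; the class name, response shape, and canned data are all invented for illustration, not taken from any specific service virtualization product.

```python
# Minimal sketch of a simulated third-party dependency: a stand-in for a
# credit-check API that dev/QA can create on demand instead of waiting
# for access to the real system. All names and data are illustrative.

class SimulatedCreditService:
    """Stands in for an external credit-check API during testing."""

    def __init__(self, canned_scores):
        # canned_scores: customer_id -> score, defined by the test
        # itself rather than by the real (inaccessible) system.
        self.canned_scores = canned_scores

    def check(self, customer_id):
        # Return a deterministic, test-controlled response.
        score = self.canned_scores.get(customer_id)
        if score is None:
            return {"status": "UNKNOWN_CUSTOMER"}
        return {"status": "OK", "score": score}

# The application under test calls this stub exactly as it would the
# real service, so testing can start before the dependency is available.
service = SimulatedCreditService({"c-42": 710})
response = service.check("c-42")
```

Because the test owns the canned data, the "environment" is available the moment the test runs, with no provisioning request to another team.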

9. Scenario-based testing from the outside in
With today's highly-distributed systems, developers and testers need to invest a significant amount of effort to properly manipulate the environment that the application under test interacts with. As crunch time hits, the amount of work required often becomes prohibitive, resulting in incomplete testing. With service virtualization, it's fast and easy to immediately alter dependent system behavior so that tests can address a broad array of scenarios.
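
The ability to "immediately alter dependent system behavior" can be sketched as a scenario switch on a virtual endpoint. The scenario names and response shapes below are assumptions for illustration; real service virtualization tools expose the same idea through their own configuration.

```python
# Sketch: flipping a simulated dependency between scenarios (normal,
# outage, invalid input) without touching the real system.

SCENARIOS = {
    "normal":  {"status": 200, "body": {"result": "approved"}},
    "outage":  {"status": 503, "body": {"error": "service unavailable"}},
    "invalid": {"status": 400, "body": {"error": "malformed request"}},
}

class VirtualEndpoint:
    """A dependent system whose behavior the test selects directly."""

    def __init__(self, scenario="normal"):
        self.scenario = scenario

    def respond(self):
        # One-line scenario switch: edge cases on demand.
        return SCENARIOS[self.scenario]

endpoint = VirtualEndpoint()
assert endpoint.respond()["status"] == 200

endpoint.scenario = "outage"   # instantly simulate a downstream outage
assert endpoint.respond()["status"] == 503
```

Exercising the outage and invalid-input paths against a real dependency would require coordinating a failure in someone else's system; here it is a one-line change.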

8. Reduce the risks of project failure
It's well known that delaying quality efforts until the end of a project places the entire project at risk, not only of missed deadlines and go-to-market dates but also of significant business consequences. Using simulated test environments allows for continuous testing much earlier in the SDLC, which significantly reduces the organization's exposure to risk.

7. Relief from large, complex test data management scenarios
Managing and resetting data from the database perspective requires considerable setup and teardown time. Service virtualization gives you granular control of test data at the component level. This allows the team to start testing earlier, and frees up resources previously required for test data management.
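
"Granular control of test data at the component level" can be pictured as each test supplying exactly the records it needs to a simulated data layer, so nothing has to be loaded into, or reset in, a shared database. The store and the order records below are invented for illustration.

```python
# Sketch: component-level test data owned by the test itself, so no
# database setup/teardown is needed between runs. Data is illustrative.

class VirtualOrderStore:
    """Simulates the data layer a component reads from."""

    def __init__(self, rows):
        # Each test supplies exactly the data it needs, nothing more.
        self.rows = list(rows)

    def orders_for(self, customer):
        return [r for r in self.rows if r["customer"] == customer]

def test_total_spend():
    store = VirtualOrderStore([
        {"customer": "alice", "amount": 30},
        {"customer": "alice", "amount": 12},
        {"customer": "bob",   "amount": 99},
    ])
    total = sum(o["amount"] for o in store.orders_for("alice"))
    assert total == 42   # no shared database to reset afterwards

test_total_spend()
```

Because the data lives with the test, runs are independent and repeatable, and the team that previously babysat test data is freed up.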

6. Performance testing under variable load from dependent systems
There's no doubt that server virtualization technology has enabled broader access to environments for performance testing. However, the instability of these environments does not allow for consistent testing, and server virtualization is not applicable to applications that lie beyond the organization's control. Service virtualization's simulated test environments not only allow discrete, independent control over each endpoint, but also enable any permutation of endpoints to be orchestrated to mimic realistic variable load from dependent systems.
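
One way to get "variable yet consistent" behavior from a simulated endpoint is to draw its latency from a seeded random generator: the load looks realistic, but every run sees the same profile. The latency figures are assumptions chosen for illustration.

```python
# Sketch: a simulated dependency with controlled, repeatable latency,
# so performance tests see identical behavior run after run.

import random
import time

class SimulatedDependency:
    def __init__(self, base_latency_s=0.05, jitter_s=0.02, seed=7):
        self.base = base_latency_s
        self.jitter = jitter_s
        self.rng = random.Random(seed)   # seeded: same profile each run

    def call(self):
        # Variable latency, deterministically reproducible via the seed.
        delay = self.base + self.rng.uniform(0, self.jitter)
        time.sleep(delay)
        return {"status": "OK", "latency_s": round(delay, 4)}

resp = SimulatedDependency().call()
```

A shared staging environment rarely offers this reproducibility; here, rerunning the performance test replays exactly the same latency sequence, so regressions stand out instead of being lost in environmental noise.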

5. Freedom to test early, getting the big showstoppers out of the way
When the team has early dev/test access to a simulated test environment, critical security, performance, and reliability issues surface earlier, when they are dramatically faster, easier, and cheaper to fix. This early identification and resolution of defects allows for more complete testing later in the lifecycle and increases the prospects of meeting schedule and budget targets.

4. Simulate the performance of mobile applications
The biggest concern around mobile applications is their variable performance across different provider networks. Service virtualization can simulate network conditions (e.g., latency, error conditions, sporadic connections), allowing apps to be tested across a realistic spectrum of real-world conditions.
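
Simulating a sporadic mobile connection can be sketched as a wrapper that randomly drops requests at a configurable rate. The drop rate and request shape are invented for illustration; the seed makes the "flakiness" reproducible so the app's retry logic can be tested deterministically.

```python
# Sketch: injecting mobile-network conditions (dropped requests) in
# front of a stubbed backend. The 30% drop rate is an assumption.

import random

class FlakyNetwork:
    """Wraps a call and simulates a sporadic mobile connection."""

    def __init__(self, drop_rate=0.3, seed=1):
        self.drop_rate = drop_rate
        self.rng = random.Random(seed)   # seeded: reproducible flakiness

    def send(self, request):
        if self.rng.random() < self.drop_rate:
            raise ConnectionError("simulated dropped connection")
        return {"echo": request}

net = FlakyNetwork()
ok, dropped = 0, 0
for i in range(100):
    try:
        net.send({"ping": i})
        ok += 1
    except ConnectionError:
        dropped += 1
# Roughly a third of requests fail, exercising the app's error handling
# under conditions that are hard to reproduce on a real carrier network.
```

The same pattern extends to injected latency or truncated responses, covering conditions a lab Wi-Fi connection would never produce.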

3. Make Agile teams truly agile
It's widely accepted that testing has become a casualty of iterative development processes. Incomplete and evolving systems limit the depth and breadth of tests that dev and QA are able to execute. Additionally, the challenge of accessing a realistic test environment typically delays testing until late in each iteration. Service virtualization's test environment simulation eliminates these barriers by providing a realistic, complete test environment on demand, allowing Agile (or Agile-ish) teams to get to "done."

2. Test from the perspective of an environment, not just the app
The migration to cloud/SaaS applications, as well as SOA/composite applications, has distributed dependencies to a previously unfathomable extent. Service virtualization technologies give developers and testers visibility into, and control over, these "dependencies gone wild." They 1) paint a complete picture of the many dependencies associated with a test environment; 2) provide flexible access to a complete test environment (including the behavior of dependencies such as APIs and third-party applications); and 3) help the team identify evolving environment conditions that impact their test and service virtualization assets, and automatically refactor those assets for fast, intelligent updating.

1. Significantly reduce CapEx and OpEx associated with test infrastructure
Although server virtualization can help reduce the CapEx associated with test environments, it applies only to applications that are under your organization's control. Extending staged environments is extraordinarily costly, and the OpEx associated with them is a significant deterrent given the total cost of ownership. Service virtualization and its test environment simulation technologies put control in the hands of the end users (dev/QA) and eliminate the need for superfluous hardware.

More Stories By Cynthia Dunlop

Cynthia Dunlop, Lead Content Strategist/Writer at Tricentis, writes about software testing and the SDLC, specializing in continuous testing, functional/API testing, DevOps, Agile, and service virtualization. She has written articles for publications including SD Times, Stickyminds, InfoQ, ComputerWorld, IEEE Computer, and Dr. Dobb's Journal. She has also co-authored and ghostwritten several books on software development and testing for Wiley and Wiley-IEEE Press. Dunlop holds a BA from UCLA and an MA from Washington State University.
