[Case Study] API Testing and Service Virtualization Reduce Testing Time 20x

Accelerating Testing in Parallel and Agile Development Environments

Ignis Asset Management is a global asset management company, headquartered in London, with over $100 billion (USD) in assets under management. Ignis recently embarked on a large project aimed at outsourcing the back office as well as implementing the architecture and applications required to support the outsourcing model.

"To meet the business's needs, a number of projects have to be developed and delivered in parallel," explained Aaron Martin, Programme Test Manager at Ignis. "However, we didn't have the resources, budget, and management capacity required to create and maintain multiple test environments internally. This limited test environment access impeded our ability to validate each application under test's (AUT) integration with third-party architectures. Moreover, our third-party providers also had limited test environment access, which restricted the time and scope of their joint integration testing."

At the same time, the company was transitioning to an agile development methodology. To support this initiative, they needed to adopt an automated testing solution to provide faster feedback after each build.

It soon became apparent that the existing testing process had to be optimized to meet these new demands. Executing the core test plan required 10 man-days. This process involved manually entering transactions in the originating application, which was not the primary AUT. The team was also manually building simple stubs to simulate interactions with third-party components that were not yet integrated. To enable complete testing in a more agile, parallel development model, without requiring additional test environments to be built and maintained, they needed ways to:

  • Enable applications (or parts of the target architecture) to be tested against the Ignis architecture before integration into the complete Ignis system.
  • More efficiently simulate the AUT's interactions with third-party systems not yet integrated into the Ignis system.

Parasoft API Testing and Service Virtualization Enable Ignis to Begin Extensive Automated Testing Before Integration

Ignis implemented Parasoft's API Testing and Service Virtualization solutions to establish a test automation framework that not only addressed the challenges outlined above, but also helped extend test automation across the SDLC.

Ignis's initial implementation of the API Testing solution focused on automating the generation of order management traffic at the API level. The AUT was the message architecture, which interfaces with third-party components: both existing services provided by business partners and services being implemented in parallel by outsourcing providers. Live trade scenarios captured from the application initiating the order were used to form the basic test transactions. Using SOAtest (Parasoft's API Testing tool), Ignis was able to run the full transaction test plan, generating new instances of each message from a data source. This data-driven message building took advantage of features such as SOAtest's ability to update attributes to create unique IDs, set dates, and perform calculations.
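
To make the data-driven pattern concrete, here is a minimal sketch in Python of the general idea described above. It is not SOAtest itself (whose configuration is tool-specific), and the endpoint, field names, and data file are hypothetical: each recorded trade scenario is read from a data source, stamped with a unique ID and fresh dates, and submitted to the message architecture's API.

    import csv
    import uuid
    from datetime import date, timedelta

    import requests  # assumes the message architecture exposes an HTTP API

    BASE_URL = "https://test.example.internal/orders"  # hypothetical test endpoint

    def build_message(row):
        """Turn one recorded trade scenario into a unique test transaction."""
        return {
            "orderId": str(uuid.uuid4()),                       # unique ID per run
            "tradeDate": date.today().isoformat(),              # set dates dynamically
            "settlementDate": (date.today() + timedelta(days=2)).isoformat(),
            "instrument": row["instrument"],
            "quantity": int(row["quantity"]),
            "grossAmount": int(row["quantity"]) * float(row["price"]),  # simple calculation
        }

    def run_test_plan(path="trade_scenarios.csv"):
        """Data-driven loop: one API call and a basic validation per scenario."""
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                message = build_message(row)
                response = requests.post(BASE_URL, json=message, timeout=30)
                assert response.status_code == 200, f"Unexpected status for {message['orderId']}"

    if __name__ == "__main__":
        run_test_plan()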

In parallel with the functional test automation, Parasoft Virtualize (Parasoft's Service Virtualization tool) was implemented to simulate the expected transaction response messages from third-party components. "First, we rapidly implemented a simple virtual asset that provided a positive response to all generated transactions, enabling us to simulate third-party responses without manually developing and managing stubs," Martin explained. "The virtual assets were then extended to handle more complex response scenarios."
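
As a generic stand-in for the "simple virtual asset" idea (not Parasoft Virtualize itself), a minimal responder might accept any incoming transaction and return a canned positive acknowledgement, with more complex response scenarios later keyed off fields in the request. The message fields shown are hypothetical.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class VirtualAssetHandler(BaseHTTPRequestHandler):
        """Simulates a third-party component: positive response for every transaction."""

        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            request = json.loads(self.rfile.read(length) or b"{}")

            # Start simple: acknowledge everything. Extend later by branching on
            # request fields (e.g. instrument or amount) to return richer scenarios.
            reply = {
                "orderId": request.get("orderId"),
                "status": "ACCEPTED",
            }

            body = json.dumps(reply).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 9080), VirtualAssetHandler).serve_forever()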

Ignis also implemented automated tests and virtual assets to test outsourced components fully decoupled from the Ignis environment. They used this to establish a "quality gate" that had to be passed before progressing to the integration phase. Martin remarked, "This was quite useful, since their code quality was poor and repeated testing in our integrated environment would have impacted other deliverables."
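
One way such a quality gate could be wired up, as a hedged sketch rather than a description of Ignis's actual pipeline: run the decoupled test plan against the virtual assets and block promotion to integration if anything fails. The command name and flag below are hypothetical.

    import subprocess
    import sys

    def quality_gate():
        """Run the decoupled test plan against virtual assets; block integration on failure."""
        # Hypothetical command: execute the component-level regression suite
        # with third-party dependencies pointed at the virtual assets.
        result = subprocess.run(
            ["python", "run_test_plan.py", "--env", "virtualized"],
            capture_output=True,
            text=True,
        )
        if result.returncode != 0:
            print("Quality gate FAILED: component not promoted to integration")
            print(result.stdout)
            sys.exit(1)
        print("Quality gate passed: component may proceed to integration testing")

    if __name__ == "__main__":
        quality_gate()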

Leveraging Supero to Transform a Manual Testing Process into an Automated One

Because Ignis's test resources were not experienced in test automation or service virtualization, the company enlisted an automation developer to build out its test requirements in the Parasoft ecosystem. Ignis engaged Supero Solutions to manage the implementation and ongoing test requirements because Supero had extensive experience implementing and using Parasoft. Ignis has now replaced all the manual test resources in one location with Supero resources.

Supero's expertise has been critical for building automated tests within the scrum teams, which is a key factor in the success of the Ignis agile initiative. "Using Supero allows us to flex our resources to meet project requirements while still maintaining a consistent approach," Martin said.

As the implementation proceeded, the value of having a Parasoft expert lay the proper foundation became clear. From this starting point, any resource can now run test plans via Parasoft and enable virtual assets in the test environment with a minimal learning curve.

Results: A 20x Reduction in Testing Time

"With Parasoft's integrated functional test automation and service virtualization, we were able to reduce the execution and verification time for our transaction regression test plan from 10 days to a half day," shared Martin. This testing is not only automated, but also quite extensive. For example, to test the Ignis system's integration with one business partner's trading system, Ignis's fully automated regression testing now covers 300 test scenarios in a near UAT-level approach-with 12,600 validation checkpoints per test run.

"Previous automation implementations focused on automating testing at the UI level-with varying levels of success," Martin continued. "We determined that we really needed to generate transaction scenarios and traffic at the API level instead. With Parasoft, we can focus on the core test requirements and get more value from our investment in automation."

Beyond addressing the original challenges posed by the project, the solution has also enabled automated testing to occur all the way from the component/unit level to system integration. To achieve this level of automation, testers fostered close relationships with the development team. The testers' role within the organization has been elevated, and collaboration between development and testing has reached an all-time high.

Cynthia Dunlop is the lead technical writer for Parasoft.
