Benefits of Load Testing in the Cloud (Part 1)

How to choose the right approach

Many companies have moved applications to the cloud as a way to reduce capital expenditure while improving IT focus and effectiveness. End users see the cloud as a way to access their documents and applications remotely from anywhere and from any device. IT managers see the cloud as a means of rapidly adapting their infrastructures as needed via virtualization and a pay-per-use model. But what about load testing engineers? Can they seize the opportunities afforded by the cloud to better test the performance of web applications?

As with past overhyped trends in IT, it is important to look beyond the talk and find concrete ways to take advantage of this technology's flexibility and scalability to save time, reduce costs, and improve the way your organization works.

This article describes how the cloud is revolutionizing load testing and the advantages it provides in many situations for ensuring your web applications perform well in production. It also covers key capabilities to look for in a load testing solution. Without the right tools in place, simply moving your testing activities to the cloud will likely not deliver the results necessary to justify the move. Understanding how to apply the right tools and practices to make the most of the cloud is fundamental to cloud-based testing and vital to ultimately going live with total peace of mind.

Benefits of Load Testing in the Cloud
Load testing with the cloud enables testing teams to take a big step forward in conducting more efficient and more realistic large-scale tests, while realizing significant savings in cost and time.

Perform Large-Scale Tests
More and more, today's web applications are experiencing sporadic surges in traffic. These traffic spikes can have many causes, including a new advertising campaign, an online article, a seasonal sale, and buzz on Twitter or other social media. If your application is unable to handle the increased load, you run the risk of lost business opportunities and potential damage to your brand.

Generating the load for large-scale tests to mimic these unanticipated spikes in production traffic, however, typically requires tens or even hundreds of machines. Purchasing and configuring these systems requires a significant investment of time and money. Once acquired and used for the immediate load testing need, the machines may sit unused for long stretches until they are needed for the next large-scale load testing project. With the cloud, you can rapidly set up as many load-generating machines as you need, on demand.
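
To make on-demand provisioning concrete, here is a minimal sketch, assuming AWS EC2 via the boto3 library and a preconfigured load-generator machine image; the AMI ID, instance type, and region are placeholders. Cloud-integrated load testing tools typically handle this step for you, but the principle is the same: request the machines only when the test needs them.

# Sketch: provisioning a fleet of load-generator instances on demand.
# Assumes AWS EC2 via boto3 and a prebuilt load-generator AMI; the AMI ID,
# instance type, and region below are placeholders, not real resources.
import boto3

def launch_load_generators(count, region="us-east-1",
                           ami_id="ami-0123456789abcdef0",
                           instance_type="c5.large"):
    """Start `count` preconfigured load-generator machines in one region."""
    ec2 = boto3.client("ec2", region_name=region)
    response = ec2.run_instances(
        ImageId=ami_id,
        InstanceType=instance_type,
        MinCount=count,
        MaxCount=count,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": "load-generator"}],
        }],
    )
    return [i["InstanceId"] for i in response["Instances"]]

# Spin up ten generators for a large-scale test, then terminate them afterward
# (termination omitted here) so you only pay for the test window.
if __name__ == "__main__":
    print(launch_load_generators(10))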

Perform More Realistic Tests
When testing a web application using machines inside your firewall, you're not testing the entire delivery chain. Unless all of your end users will also be within your firewall, such tests are inherently limited and may fail to reveal all performance issues.

With the cloud, you can execute load tests that access your web application as your users will - from outside of your firewall - and validate all components of the delivery chain, including the firewall, DNS, network equipment, and ISP. These tests are more realistic, and they enable you to evaluate the real-world effects of third-party components, such as content delivery networks, analytics servers, and ad servers.

Your users won't all be accessing your app from the same fixed location across the same network, so a realistic load test cannot be completed from a single location. That's why it's important to test your application and its components from different locations and geographic regions and assess its performance as network bandwidth and latency change.

Save Time and Reduce Costs with Pay-as-you-go
When load testing with the cloud, there is no need to spend weeks setting up and configuring dozens of real machines. You can create and configure the machine image you need once and then replicate it in the cloud as many times as needed. Often, the cloud testing provider will automate this process as well, saving you even more time.

Further, the substantial up-front costs of purchasing and maintaining machines that may be used only infrequently are eliminated with the cloud. Using the pay-as-you-go model, you can rapidly set up the testing infrastructure you need, when you need it, and only for as long as you need it. From a business standpoint, the cloud lowers total cost of ownership, while increasing flexibility.

How to Choose a Cloud Testing Solution
While all cloud load testing solutions will enable you to make use of the cloud in some way, comparatively few enable you to follow all of the best practices outlined here and capitalize on the opportunities that load testing with the cloud offers. A highway lets you travel faster than a side street, but the vehicle you use makes a big difference in how quickly and how reliably you arrive at your destination. In much the same way, load testing with the cloud offers clear advantages over traditional load testing, but the tools you use are even more important to the quality of your tests.

When considering a cloud testing solution, ask the following questions:

  1. To what extent does the solution integrate with the cloud?
  2. Will the solution enable us to conduct realistic tests?
  3. Does the solution support unified tests inside and outside the firewall?
  4. Is the solution easy to use, or will we spend weeks learning and configuring it?
  5. Does the solution include full-featured reporting and decision-making modules to help our team make the most of the results?
  6. Does the solution support the technologies we used to build the application?

Integration with the Cloud Platform
If you opt for a solution that is not integrated with one or more cloud platforms, you'll need to handle several time-consuming tasks on your own. First, you'll need to learn how each platform you'll be using works, including its limitations and constraints. Second, you'll need to build, test, and maintain your own virtual machine images.

Load testing solutions that offer integration with the cloud simplify and accelerate the steps needed to use the cloud infrastructure. These solutions offer one or more of the following advantages over non-integrated alternatives:

  • Fast provisioning using preconfigured images. You can set up the infrastructure you need in minutes.
  • Simplified security. All required protections are set up by default, including firewall, certificates, and encryption.
  • Improved scalability. Leading load testing solution providers have negotiated with cloud providers to allow users of their software to employ more virtual machines (for the purpose of load testing) than are allowed by default.
  • A unified interface for multiple cloud providers. Load testing solutions can hide provisioning and billing details, so you can take maximum advantage of the cloud in a minimum of time.
  • Advanced test launching. You can save time and effort by defining and launching load generators in the cloud directly from the load testing interface.
  • Advanced results reporting. Distinct results from each geographic region involved in the test are available for analysis.

Of course, few solutions include every one of these integration capabilities. Most solutions fall somewhere on the spectrum between little or no integration and full-featured integration with multiple cloud platforms.

Realistic Tests
Although testing from the cloud is, in many cases, more realistic than testing in the lab, simply moving to the cloud is not enough to ensure the most realistic tests. Real users often have access to less bandwidth than a load generator in a cloud data center. With a slower connection, the real user will have to wait longer than the load generator to download all the data needed for a web page or application. This has two major implications:

  • Response times measured from the cloud with virtually unlimited bandwidth are better than what real users will experience. This can lead test engineers to draw the wrong conclusions, thinking that users will see an acceptable response time when in reality they will not.
  • The total number of connections established with the server will increase, because on average, connections for real users will be open longer than connections for the load generator. This can lead to a situation in which the server unexpectedly refuses additional connections under load.

When choosing a load testing solution, look for one that provides a bandwidth simulation feature that limits bandwidth to ensure that the virtual users download the content of the web application at a realistic rate. This capability is particularly important when testing mobile applications, because mobile devices typically operate with less bandwidth than laptops and desktops.
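
The following is a minimal sketch of what bandwidth simulation does under the hood: it paces the download of a response so the effective throughput stays near a target rate. It uses only the Python standard library; the URL and the target rate are illustrative.

# Sketch: simulating limited bandwidth for a virtual user by pacing the
# download of a response body. Standard library only; the URL and target
# rate are illustrative.
import time
import urllib.request

def throttled_download(url, max_kbps=1000, chunk_size=16384):
    """Download `url`, pausing between chunks so the effective rate
    stays near `max_kbps` kilobits per second."""
    bytes_per_sec = max_kbps * 1000 / 8
    start = time.monotonic()
    received = 0
    with urllib.request.urlopen(url) as resp:
        while True:
            chunk = resp.read(chunk_size)
            if not chunk:
                break
            received += len(chunk)
            # If we are ahead of the allowed rate, sleep until we are back on pace.
            expected_elapsed = received / bytes_per_sec
            actual_elapsed = time.monotonic() - start
            if expected_elapsed > actual_elapsed:
                time.sleep(expected_elapsed - actual_elapsed)
    return received, time.monotonic() - start

# A 3G-class mobile user might be simulated with max_kbps=1000 (about 1 Mbps),
# yielding response times much closer to what that user will actually see.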

Similarly, look for a solution that can parallelize requests. Modern browsers have the ability to parallelize HTTP requests as they retrieve a web page's static resources. These parallel requests require more connections with the server and can lengthen response times. Load testing solutions that do not parallelize requests are incapable of producing truly realistic performance tests for web applications.
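
As an illustration, the sketch below fetches a page's static resources in parallel the way a browser would, using a thread pool. The resource URLs and the limit of six parallel requests per host (a typical browser default) are assumptions.

# Sketch: fetching a page's static resources in parallel, the way a browser
# does, to generate a realistic number of concurrent connections.
# The URLs and the limit of six parallel requests are illustrative.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

STATIC_RESOURCES = [
    "https://example.com/css/site.css",
    "https://example.com/js/app.js",
    "https://example.com/img/logo.png",
    # ... the rest of the page's scripts, stylesheets, and images
]

def fetch(url):
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        body = resp.read()
    return url, len(body), time.monotonic() - start

def load_page_resources(urls, parallelism=6):
    """Issue up to `parallelism` requests at once, as a modern browser would."""
    with ThreadPoolExecutor(max_workers=parallelism) as pool:
        return list(pool.map(fetch, urls))

# Compared with fetching the same resources one by one, the parallel version
# opens more simultaneous connections to the server - exactly the behavior a
# realistic load test needs to reproduce.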

Unified Lab Testing and Cloud Testing
Organizations that use only lab testing or only cloud testing are at a disadvantage. So are companies that use different tools for these activities.

A solution that supports lab testing enables test engineers to begin verifying the performance of an application internally, before it's ready to be made available via the Internet. This makes it possible to find and fix performance problems earlier in the application lifecycle. Such a solution also lowers cloud costs by enabling teams to conduct internal performance tests on existing hardware when available.

More important, a single solution that supports lab testing and cloud testing enables test engineers to reuse scripts for both kinds of tests, which saves time and effort. Reusing scripts also helps pinpoint performance problems that show up in cloud testing but not in internal tests. Last, a unified solution lowers licensing and training costs, and enables test engineers to use their existing skill set for both types of load testing.

Ease of Use
Testing, with its natural position toward the end of the application lifecycle, is almost always performed under tight time constraints. Delays in the requirements or implementation phases of a project usually result in less time for the test engineers to do their jobs. The pressure is on to deliver results as quickly as possible. This environment is no place for a tool that is difficult to use and configure.

In developing and executing performance tests (either internally or via the cloud), several key features go a long way toward improving test engineer productivity, including support for:

  • Easily launching the recording of a virtual user profile (preferably in one click).
  • Defining advanced behaviors (with structures such as conditions and loops) via a graphical interface, complemented by the ability to use a scripting language (JavaScript, for example) for more complex cases.
  • Automatic handling of dynamic parameters. This includes a set of correlation rules for well-known server frameworks. Ideally, the solution will dynamically detect and handle custom parameters specific to your application (a minimal correlation sketch appears below).
  • Sharing common script parts, such as login or logout transactions, between multiple virtual user profiles.
  • Comparing results. Sifting through results to determine the effect of a particular application or infrastructure change can be a time-consuming and arduous task without a dedicated comparison tool.

This is not an exhaustive list of usability features that can help test engineers work more efficiently; rather, it should be considered a baseline of minimum capabilities for an efficient load testing solution.
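
To make the point about dynamic parameters concrete, here is a minimal correlation sketch. It assumes the application embeds a per-session token in a hidden form field named csrf_token; the field name, URLs, and regular expression are illustrative, and commercial tools apply libraries of such rules automatically for well-known frameworks.

# Sketch: correlating a dynamic parameter between requests. The server is
# assumed to embed a per-session token in a hidden form field named
# "csrf_token"; the field name, URLs, and pattern are illustrative.
import re
import urllib.parse
import urllib.request

TOKEN_PATTERN = re.compile(r'name="csrf_token"\s+value="([^"]+)"')

def login(base_url, username, password):
    # Step 1: fetch the login page and extract the dynamic token from the HTML.
    with urllib.request.urlopen(base_url + "/login") as resp:
        html = resp.read().decode("utf-8", errors="replace")
    match = TOKEN_PATTERN.search(html)
    if match is None:
        raise RuntimeError("csrf_token not found - correlation rule needs updating")
    token = match.group(1)

    # Step 2: replay the extracted token in the login POST instead of a recorded,
    # now-stale value - this is what "handling dynamic parameters" means.
    data = urllib.parse.urlencode({
        "username": username,
        "password": password,
        "csrf_token": token,
    }).encode("utf-8")
    request = urllib.request.Request(base_url + "/login", data=data)
    with urllib.request.urlopen(request) as resp:
        return resp.status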

Analysis, Monitoring, Scheduling, and Reporting
Recording a virtual user profile and playing it back to get raw results is only the beginning of an effective performance test. You need tools to help you analyze the results (in real time when possible), find the root cause of problems, and produce actionable results.

Real-time analysis enables you to detect and understand issues while the test is running. With real-time analysis, you don't have to wait for the test to finish before detecting an issue, correcting it, and restarting the test. When testing in production, real-time analysis enables you to abort a test if it threatens to affect the performance experienced by real users.

A comprehensive monitoring system is essential when you need to find the root cause of a problem. Predefined performance counters and threshold alerts based on industry best practices make it easy to set up monitoring and interpret the results. For a nonintrusive solution that is easier to set up, look for a tool that supports agentless remote monitoring.
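
As a simple illustration of threshold alerting, the sketch below polls a metrics endpoint and flags counters that exceed predefined limits. The endpoint, counter names, and thresholds are hypothetical; a full monitoring module would gather these values agentlessly from the servers under test and correlate them with the load being applied.

# Sketch: checking monitored counters against predefined thresholds during a
# test. The metrics endpoint, counter names, and limits are hypothetical.
import json
import urllib.request

THRESHOLDS = {
    "cpu_percent": 85.0,        # sustained CPU above this is a warning sign
    "avg_response_ms": 2000.0,  # responses slower than 2 s are unacceptable
    "error_rate": 0.05,         # more than 5% errors usually means saturation
}

def check_thresholds(metrics_url="http://monitor.example.com/metrics"):
    """Return the list of counters that currently exceed their thresholds."""
    with urllib.request.urlopen(metrics_url) as resp:
        metrics = json.load(resp)
    return [
        (name, metrics.get(name), limit)
        for name, limit in THRESHOLDS.items()
        if metrics.get(name, 0) > limit
    ]

# During a test run, a violation list that is no longer empty is the cue to
# start drilling into the root cause - or, in production, to abort the test.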

If your organization performs regular regression tests - and even if it doesn't - you may want to schedule performance tests and execute them automatically via the command line to complement functional testing. Regularly scheduled load tests with automatically generated reports can help organizations detect performance regression as soon as it starts to occur, which makes it easier to pinpoint and correct.
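
To sketch what scheduled, command-line execution can look like: the snippet below runs a hypothetical load-test CLI, compares a headline metric against a stored baseline, and fails the run on regression. The command name, flags, results format, and tolerance are all assumptions to be replaced with your tool's actual interface.

# Sketch: running a load test non-interactively and failing on regression.
# The "loadtest" command, its flags, the results file format, and the 10%
# tolerance are hypothetical - substitute your tool's actual CLI.
import json
import subprocess
import sys

BASELINE_AVG_MS = 850.0   # average response time from the last accepted run
TOLERANCE = 0.10          # allow up to 10% degradation before failing

def run_scheduled_test():
    subprocess.run(
        ["loadtest", "--scenario", "checkout.yml", "--output", "results.json"],
        check=True,
    )
    with open("results.json") as f:
        avg_ms = json.load(f)["avg_response_ms"]
    if avg_ms > BASELINE_AVG_MS * (1 + TOLERANCE):
        print(f"Performance regression: {avg_ms:.0f} ms vs baseline {BASELINE_AVG_MS:.0f} ms")
        sys.exit(1)
    print(f"OK: {avg_ms:.0f} ms")

if __name__ == "__main__":
    run_scheduled_test()

# Invoked nightly from cron or a CI job, this turns load testing into an
# automated regression gate alongside functional tests.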

Last, reporting is a key capability and essential for communicating test results to others on the team, including management. Because reporting needs change, it is a good idea to keep your options open with a tool that supports multiple formats, including PDF, Word, HTML, and XML for integration with other systems.

Support for Web Technologies
To test Siebel applications or applications built with Adobe Flex, Microsoft Silverlight, Real-Time Messaging Protocol (RTMP), Oracle Forms, or AJAX push technologies, you need a load testing tool with built-in support for the technologies you're using. Without this specialized support, it can be very difficult, if not impossible, to effectively test the performance of your applications.

Similarly, the load testing solution you choose should provide support for the authentication mechanism employed by your applications, whether it is Basic, Digest, NTLM, or Kerberos. Otherwise, you won't be able to set up a virtual user profile that tests the application as a real person would use it.
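
As an example of why this matters, here is a minimal sketch of a virtual-user request against an application protected by HTTP Basic authentication, using only the Python standard library; the URL and credentials are placeholders. Digest can be handled similarly, while NTLM and Kerberos generally require dedicated support in the testing tool.

# Sketch: a virtual-user request against an application protected by HTTP
# Basic authentication. The URL and credentials are placeholders.
import urllib.request

def build_authenticated_opener(base_url, username, password):
    password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    password_mgr.add_password(None, base_url, username, password)
    handler = urllib.request.HTTPBasicAuthHandler(password_mgr)
    return urllib.request.build_opener(handler)

opener = build_authenticated_opener(
    "https://app.example.com", "vuser01", "secret")
with opener.open("https://app.example.com/dashboard") as resp:
    print(resp.status, len(resp.read()))

# Without this kind of support in the load testing tool, every virtual-user
# request would be rejected with 401 before ever exercising the application.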

Summing It Up
The cloud is opening new opportunities to improve the scale and realism of load testing while saving time and lowering costs. When selecting a cloud testing solution, keep in mind that the primary factor in your success will not be simply the move to the cloud, but rather the tool you use and how well it uses cloud technology.

More Stories By Steve Weisfeldt

Steve Weisfeldt is a Senior Performance Engineer at Neotys, a provider of load testing software for Web applications. Previously, he worked as President of Engine 1 Consulting, a services firm specializing in all facets of test automation. Prior to his involvement at Engine 1 Consulting, he was a Senior Systems Engineer at Aternity. Before that, Steve spent seven years at automated testing vendor Segue Software (acquired by Borland). While spending most of his time at Segue delivering professional services and training, he was also involved in pre-sales and product marketing efforts.

Having worked in the load and performance testing space since 1999, Steve has been involved in load and performance testing projects of all sizes, in industries that span the retail, financial services, insurance, and manufacturing sectors. His expertise lies in enabling organizations to optimize their ability to develop, test, and launch high-quality applications efficiently, on time, and on budget. Steve graduated from the University of Massachusetts-Lowell with a BS in Electrical Engineering and an MS in Computer Engineering.
