Employing Virtual Environments for Manual Testers

Overcoming lab constraints

Any die-hard software geek tends to be fascinated with solving the biggest, most complex IT problems enterprises face. For testers, this means moving away from manual testing and getting teams to "go agile" with early development testing, or looking "between the boxes" to validate complex middle-tier technologies. In today's enterprise, most of the business logic happens deeper within the applications, where services communicate transactions through an ESB and other messaging and data layers, both within the company's IT environment and with third-party systems such as Cloud- and SaaS-based services.

In these environments, manual testing can often get left behind, as it is hard to validate a multi-tier SOA or BPMS-based app by testing only at the user interface, and root causes of software issues are difficult to diagnose at the UI layer of modern apps. Still, we must certainly test systems from the user's point of view to ensure reliable outcomes in production - and that includes real people manually testing a UI. How can we eliminate constraints to make the manual testing process more productive as well? Virtualizing software behaviors provides a way to lift the productivity of manual testers as well as back-end integration developers.

Challenges of Constrained Manual Testing
We've all been there. We've got a team ready to start testing, our scripts are cued up, the lab time has been scheduled, and all of our test data has been loaded onto the system and verified. We start our run - and just a few minutes into banging on the system, testers start seeing hundreds of errors show up.

What Happened Here?
In today's complex system environments, it's not uncommon for the application we're working on to have dozens, if not hundreds, of back-end dependencies. These dependencies run the gamut from simple, straightforward things like databases, LDAP servers, and web servers to complex and arcane items like composite application services, mediation platforms, web services, and external data feeds. More often than not, the majority of these dependencies are out of our control and have a lifecycle that is independent of the front-end web app being manually tested.

In essence, because of modern application complexity and service interconnectivity, manual testers can get stymied by the same factors that keep more technical integration and performance testers up at night:

  • Lack of availability or limited access to needed back-end resources behind the UI
  • Volatile, changing data that invalidates the testing efforts
  • High infrastructure cost and effort of setting up and provisioning ready test environments

VSE and Automation for Manual Testers
Service Virtualization (SV) of external dependencies is an efficient way to mitigate the challenges of testing applications that have many dependencies. To date, this practice has normally been associated with automated regression, integration, load, and performance testing teams, but it is also applicable to the manual tester. Service Virtualization is the next step beyond hardware and OS virtualization, as it improves productivity by virtualizing the behavior and responsiveness of middle-tier components that are not suited for replication on a VM.
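
To make the concept concrete, here is a minimal sketch, built around a hypothetical account-lookup dependency, of what a virtual service boils down to: the application under test talks to a stand-in that returns canned, predictable responses instead of calling the real back end. This illustrates the idea only; it is not how any particular SV product, including iTKO LISA, is implemented.

```python
# A minimal sketch of a virtual service: canned, predictable responses
# stand in for a real back-end account service during testing.
# The operation names, account IDs, and payloads are illustrative assumptions.

CANNED_RESPONSES = {
    ("getBalance", "1001"): {"accountId": "1001", "balance": 2500.00, "currency": "USD"},
    ("getBalance", "1002"): {"accountId": "1002", "balance": 75.10, "currency": "USD"},
}

def virtual_account_service(operation, account_id):
    """Simulate the behavior of a back-end account service."""
    try:
        return CANNED_RESPONSES[(operation, account_id)]
    except KeyError:
        # Mimic the real service's error contract for unknown accounts.
        return {"error": "ACCOUNT_NOT_FOUND", "accountId": account_id}

if __name__ == "__main__":
    print(virtual_account_service("getBalance", "1001"))
    print(virtual_account_service("getBalance", "9999"))
```

Because the responses are fixed and always available, a manual tester is no longer blocked when the real account service is down or its data has drifted.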

For instance, you might effectively virtualize a given Windows OS desktop or app server configuration on a VM. However, some items like a mainframe, a service under development, or a hosted cloud-based SaaS application are either unavailable or too large and distributed to replicate on a VM. This is where Service Virtualization can be brought into play to simulate the underlying system dependencies for testers.
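
As a rough illustration of that last point, the sketch below stands in for an unavailable hosted dependency - say, a SaaS pricing API that cannot be copied onto a VM - using only Python's standard library. The endpoint path, payload, and the 200 ms delay are illustrative assumptions rather than any real service's contract; the delay simply mimics the responsiveness a tester would see from the live system.

```python
# A rough sketch of simulating an unavailable SaaS dependency with a
# lightweight HTTP stand-in. Paths, payloads, and latency are assumptions.
import json
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class FakePricingService(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/api/price"):
            time.sleep(0.2)  # mimic the live service's typical response time
            body = json.dumps({"sku": "ABC-123", "price": 19.99}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # During manual test runs, point the application under test at
    # http://localhost:8080 instead of the live SaaS endpoint.
    HTTPServer(("localhost", 8080), FakePricingService).serve_forever()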

Everyone tends to think in terms of tools for automated testers, but at the end of the day virtualization is about enhancing productivity across the entire team, whether execution is manual or automated. Organizations are starting to recognize that there's also tremendous uplift for an even larger audience of manual testers through SV.

Compare the roles in automated versus manual testing. If test automation is in play, a small team of highly skilled automation engineers is doing that work. When systems are not available they are inconvenienced, but more often than not their time is constrained and they have a backlog of other projects to turn to. These skilled resources lose productivity to the context switch, but there are further inefficiencies in acceptance testing.

Manual testing is a very different game from automated testing in that it usually involves two different teams or audiences. There's the group of dedicated testers who focus on day-to-day validation of the application. But for larger test events such as Workflow Integration Testing (WIT), User Acceptance Testing (UAT), or End-to-End (E2E) testing, a large number of business users are also brought into play. When a system dependency is unavailable, it's not just a few people who are idle but sometimes literally dozens or hundreds.

Service Virtualization enables manual testers to move forward with an efficient and repeatable test flow, so they can expect to find consistent test data and full-time availability of application dependencies that exist under the UI they are testing. Furthermore, by using virtualization, they can reset their test data "on the fly." When they find they have made a mistake in their test workflow, they don't have to clear out or rebuild the test data in order to try to repeat a test.
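
The sketch below shows one simple way that "on the fly" reset could work, assuming a hypothetical virtual order service: the simulator keeps a baseline snapshot of its test data and restores it with a single call, so a wrong turn in a test workflow costs seconds rather than a full data rebuild. The service and data names are illustrative only.

```python
# A hedged sketch of resetting simulated test data "on the fly":
# the virtual service restores a baseline snapshot instead of requiring
# testers to rebuild data in a real system.
import copy

BASELINE_ORDERS = {
    "O-100": {"status": "NEW", "items": ["widget"]},
    "O-101": {"status": "SHIPPED", "items": ["gadget", "cable"]},
}

class VirtualOrderService:
    def __init__(self):
        self._orders = copy.deepcopy(BASELINE_ORDERS)

    def update_status(self, order_id, status):
        self._orders[order_id]["status"] = status

    def get_order(self, order_id):
        return self._orders[order_id]

    def reset(self):
        """Restore the baseline snapshot without touching any real system."""
        self._orders = copy.deepcopy(BASELINE_ORDERS)

if __name__ == "__main__":
    svc = VirtualOrderService()
    svc.update_status("O-100", "CANCELLED")  # a tester's wrong turn
    svc.reset()                              # back to a clean run in seconds
    print(svc.get_order("O-100")["status"])  # NEW
```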

Think about the number of resources involved in manual or "acceptance" testing within your enterprise, and the amount of cost and schedule coordination (and possible travel) required to get them ready to test software. While automation is a great goal to strive for, we know that there will always be a need for thorough testing from a user perspective. In these environments, Service Virtualization can be just the ticket for maximizing availability and minimizing downtime.

More Stories By Jason English

Jason joined iTKO in 2004, bringing more than 15 years of experience in executing marketing plans, re-engineering business processes and meeting customer requirements for companies such as IBM, EDS, Delphi, TaylorMade, Sun, Motorola and Sprint. As Director of eMarketing and Executive Producer, in2action Consulting at i2 Technologies, he was responsible for i2's outbound messaging during a period of extreme growth, as well as marketing services and working directly with clients to build easy-to-learn front ends to B2B systems. Prior to that, he managed customer experience as an Information Architect at Agency.com. Jason scored and designed several internationally released computer games in addition to conventional print advertising and television commercials.

More Stories By Andy Nguyen

Andy Nguyen is a Senior Solution Architect at iTKO LISA, a Virtualization, Test and Validation software provider (www.itko.com). iTKO provides virtualization and validation solutions optimized for distributed, modern applications that leverage cloud computing, SOA, BPM, integration suites, and ESBs.
