Performance Testing in the Real World
By Jeff Gainer
(Author's note: This essay originally appeared in the 17 May 2000 issue of the Cutter IT Journal E-Mail Advisor.)
Several years ago, while working as a software development lead, I was introduced to the concept of automated software testing. Just point, click, and record the test cases, the vendor promised. Their tool would, we were told, quickly accumulate a huge collection of automated test cases that would dutifully execute unattended. I was busy with my development team and didn't have time to inquire in detail while the testing consultants merrily clicked, pointed, and recorded. They planned to execute a suite of test cases over the weekend. Along with everyone else, I confidently left for a long weekend, sat on the balcony at my lakeside hotel suite, lazily watching the sun set while I sipped exotic cocktails, serene in the knowledge that 2,000 miles away, the computers were busily executing test cases.
I returned to the client office on Monday morning - late, like everyone else who had to fly in - and found that something had gone wrong, that there weren't any test results, and that automated software testing wasn't all it had been promised to be. Still, I was intrigued by the concept, and over the next few years I gravitated toward automated software testing, discovering that it was considerably more complex than the marketing folks had promised.
Today, with the advent of e-commerce, automated performance testing is a relatively new technology, and many e-business companies are discovering that buying a tool is just the beginning of a successful performance testing effort.
Regardless of what the slick marketing brochures promise, there is more to automated performance testing than pointing, clicking, and recording. Performance testing tools are complex, robust programming tools, so no matter how busy your people might be, at least send them to a training class. Better yet, bring a trainer to your organization to conduct classes and then spend some time with your staff, answering their specific concerns and mentoring them in their new tools.
It's a given that planning is the most important step in software testing, and nowhere is this more true than in performance testing. Work with your marketing people to create real-life user scenarios, then tap the system administration and security staff to build scenarios to test possible areas of vulnerability. Then take this a step further: for an online retailer, double or triple the holiday sales projections and see how the system reacts. An online stock brokerage might develop a scenario for a market panic (replicating, say, five billion shares traded on the New York Stock Exchange) to see how the system fares under that load.
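The scenario-building step above can be sketched in code. The following is a minimal, hypothetical illustration (the figures, scenario names, and the `scaled_scenarios` helper are all invented for this example, not taken from any particular tool): start from a baseline load projection, then multiply it to model the doubled or tripled holiday traffic the paragraph describes.

```python
def scaled_scenarios(baseline_users_per_hour, multipliers=(1, 2, 3)):
    """Build (label, load) pairs by scaling a baseline load projection.

    baseline_users_per_hour: projected peak load from marketing's estimates.
    multipliers: stress factors to apply (e.g., 2x and 3x holiday surges).
    """
    return [(f"{m}x baseline", baseline_users_per_hour * m)
            for m in multipliers]

# Example: an online retailer projecting 10,000 users/hour at holiday peak.
for label, load in scaled_scenarios(10_000):
    print(f"{label}: {load} users/hour")
```

Each resulting scenario would then be fed to whatever performance testing tool the team has adopted; the point is simply that the stress targets are derived systematically from business projections rather than guessed at test time.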
Unless you're willing to bring down a live site, don't even think about testing against the production environment. Better to build a testing laboratory - but make sure it reflects, at least in some measure, the production environment. If you are testing a site before going live, test early enough so that the systems staff can revive it in time if the site caves in under your tests.
Performance Testing Is an Ongoing Project
Performance testing is not just a one-time endeavor to verify that your current system capacity is adequate or that a new site will be robust before going live. E-commerce is only in its early stages; even the experts are still learning lessons about increased traffic from new markets, buying surges, and hacker, virus, and worm crises.
(c)2000 Cutter Information Corp. All rights reserved. This article has been reprinted with the permission of the publisher, Cutter Information Corp., a provider of information resources for IT professionals worldwide.
This article originally appeared in the Cutter IT E-Mail Advisor, a supplement to Cutter IT Journal. www.cutter.com/itjournal