Smarter Tools, Dumber Developers?

I see a train wreck happening in slow motion. Faster development techniques and technologies are crashing into traditional manual testing approaches. No one seems to understand that just because you can develop faster doesn't mean you can deliver faster. The friction factor is testing.
Look at it this way. If you make any change at all to an application, you run the risk of accidental -- and therefore unanticipated -- breakage. In order to protect operations from potential failure, testing needs to be performed not only on the changes but also on all previously existing functionality. So even a 5% development change requires 100% testing.
Do the math: if a 5% change demands a 100% retest, testing is 20 times the work of development. Do you have 20 testers for each developer, and 20 times longer to test than to develop? I didn't think so.
So you have a choice. Deliver dangerous code fast, or safe code slow? Now, you would think that the obvious answer to this conundrum would be test automation, and you would be right. Only automation can replicate all previous tests while testers add new ones within anything approaching a reasonable time frame. But the glaring reality is that most testing is still done manually because automation is too hard.
And it is probably all your fault. Applications are becoming less and less testable because developers are using less and less discipline. I have seen major, mission-critical Web applications where the pages and objects seem to change names every time you log in -- not that the names were any good to begin with. Button1, text3 and similar names aren't much help, but when they spontaneously become button2 or text2 they are worse than useless -- they are disastrous. When did this swashbuckling approach to programming become OK?
It's as though, because new tools use wizards, drag-and-drop functionality and other shortcuts to generate code quickly, the code itself is somehow seen as disposable. I've actually had developers argue that it is easier to just throw code away than to fix it. After all, it only took a few hours, days, weeks or whatever to develop, so who cares?
Testers, that's who. Disposable code creates disposable tests. To automate a test, I have to be able to tell my script how to interact with the application in a way that is repeatable and maintainable. If the application has all the predictability of a dice game, there is simply no way to ever rerun the same test. Imagine trying to interface with an object whose interfaces, methods and properties all changed names often, without warning or meaning. Wouldn't work, would it? And if you can't rerun tests, you can't solve the original dilemma: how to retest previous functionality with the miserly time and resources allocated.
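To make this concrete, here is a minimal sketch of the kind of automated check a tester needs to be able to write -- in Python with Selenium, against a hypothetical page whose URL and element IDs are my own invention, not any real application's. As long as the developer keeps the element IDs stable and meaningful, this test can be rerun against every build; the moment an ID silently changes, the script dies on the first lookup, and so does every test like it.

    # A minimal regression check (a sketch, not anyone's actual suite).
    # Assumes Python with Selenium; the URL and element IDs are hypothetical.
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import Select

    def test_insurance_selection():
        driver = webdriver.Chrome()
        try:
            driver.get("https://example.com/rental/options")  # hypothetical page

            # Stable, meaningful IDs are the contract between the application
            # and the test script. Auto-generated names like "button1" that
            # spontaneously become "button2" break these lookups -- and with
            # them, the entire automated regression suite.
            Select(driver.find_element(By.ID, "insurance")).select_by_value("Y")
            driver.find_element(By.ID, "submit").click()

            # Verify the application accepted the choice.
            assert "error" not in driver.page_source.lower()
        finally:
            driver.quit()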
But where I really go off the deep end is when I raise this issue with developers and they say something like "Well, that's just the GUI, so it's not that important," or "The HTML code is dynamic anyway, so it doesn't matter what things are called" -- as though I am focusing on some trivial, cosmetic aspect that doesn't merit the waste of their valuable time and talent.
Excuse me? The middle letter in GUI stands for User, as in the poor person who paid for the software and has to use it. If the interface to the end user is not important, then what is?
Let me give you a real-life example. A major rental car company redesigned their Web GUI for corporate car rentals to adopt a new look and feel. Following the logic that they were just GUI changes and the back end -- where all the real work happened -- was still intact, testing was cursory at best.
But when they rolled it into production, their corporate customers (think 80% of their revenue) could not rent cars at all. They lost revenue in excess of $25 million, received thousands of hate e-mails, and sustained serious if not permanent damage to key customer relations. Why? Because the GUI redesign changed how the renter specified whether insurance was wanted: the old Y/N option was too cryptic for the new look, so the drop-down was changed to Accept or Decline. Too bad the database was still expecting Y or N. So the hapless user went through the entire effort of entering all the required information, only to get to the very last page and receive an error.
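The painful part is how cheaply this could have been caught. Here is a minimal sketch, in Python, of the kind of mapping-and-validation layer -- plus a two-line regression test -- that would have flagged the mismatch before production. The names and values are my reconstruction of the story, not the company's actual code.

    # A sketch of a GUI-to-database translation layer. The point: the GUI's
    # vocabulary ("Accept"/"Decline") must be explicitly mapped to what the
    # database expects ("Y"/"N"), and anything unmapped should fail loudly
    # in testing, not silently in production.

    GUI_TO_DB_INSURANCE = {
        "Accept": "Y",
        "Decline": "N",
    }

    def insurance_code(gui_value: str) -> str:
        """Map the GUI's insurance choice to the database's Y/N code."""
        try:
            return GUI_TO_DB_INSURANCE[gui_value]
        except KeyError:
            raise ValueError(f"Unmapped insurance value: {gui_value!r}")

    # Two lines of regression test pin the contract down:
    assert insurance_code("Accept") == "Y"
    assert insurance_code("Decline") == "N"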
There are a million other examples I could regale you with, but suffice it to say that unless developers start building applications with some semblance of discipline, we are all going to pay the price. Testing already takes too long and costs too much, and the only way to deliver increasing functionality within collapsing cycle times is to use automation.
The ultimate irony of all this is that it looks to management as though it is the testers who are the bottlenecks, roadblocks or worse. What's wrong with them, anyway? Why does it take them so long when development gets it done so fast? Why can't they get their tests automated? Why are they slowing everything down -- are they just lazy, or actually stupid? And all the while they are working overtime, tearing their hair out, and puzzling over why a test that ran yesterday won't run today.
So how about it? Can anyone explain to me how tools can get so much smarter and developers get so much dumber? What do we need, training wheels for wayward developers to keep their code stable enough to be tested? Or am I missing something?