Perfect automation...one of the many holy grails of the QA world. Before I post the link to the article, here's my history with automation.
During my many years in the QA world of video game development, we had automation goals every year I was there. For the first three to five years, the goal was quite simply to 'find a solution.' In the early part of those years, we were developing for the Windows PC platform. We eventually got some basic scripts to build, install and launch our product. Our tools repository grew as our QA group matured...we began using virtual images to allow for fresh installs and more effective config testing...we had tools to throttle bandwidth and manipulate packet drops for multiplayer games...but we still had very little in terms of true "automation testing."
Then we moved to the console testing world and spent the next year or two getting our heads around a new batch of tools and procedures. The Xbox team and Microsoft provided numerous sample snippets of code showing that game automation was possible, even within the big black box (or, at that time, the silver towers).
We had some testers on the team with dev backgrounds, and we began developing our own tools. One of our testers developed a great tool that helped us test the Xbox certification requirements with confidence. Our automated build system evolved into a custom build tool that also deployed builds to specified boxes in the building. Another tester built a framework that ran JavaScript commands against the game code to navigate the UI and start AI-only games. Our first internal gameplay automation was underway.
Over the next couple of years, with growing support from the dev team to open up code and hooks for us, we eventually had a test automation framework within our game code that was actually accessible. We had testers on our team whose job was to write and maintain automation code. We had automated BVTs (build verification tests) run against our automated builds. We had suites of automated test cases run against banks of consoles in a bay, as well as against "checked in" idle boxes throughout the building. We were truly underway.
And then, that phase of products shipped. Our SDETs were either promoted to developers (without resources to continue automation support) or fired. Shortly thereafter, our test team was completely dissolved, with only two survivors who were transferred out of QA. And then a month later, the studio was closed completely.
Despite potential opportunities in the game world, I moved into the web applications world (work/life balance is much easier outside of games). I've worked for two different companies since then, and in neither has there been a robust automation toolset. The primary/initial desire for automated scripts in the web world seems to be stress testing. I've become familiar with JMeter and have used it to write very simple scripts that navigate a website, which I then scale up to a massive load. I've also glanced at a couple of other free/open-source tools.
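For a sense of what those scripts do, here's a rough sketch of the pattern in Python rather than an actual JMeter test plan: a handful of simulated users walking a fixed sequence of pages while response times get recorded. The URL, page list and user counts below are made-up placeholders, not anything from a real test.

    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    # Placeholder target; a real test would point at the site under load.
    BASE_URL = "http://example.com"
    PAGES = ["/", "/login", "/search?q=test"]  # the "navigation" path
    CONCURRENT_USERS = 50                      # scale this up for a real load test
    REQUESTS_PER_USER = 20

    def simulate_user(user_id: int) -> list[float]:
        """Walk the page list repeatedly, recording each response time."""
        timings = []
        for _ in range(REQUESTS_PER_USER):
            for page in PAGES:
                start = time.monotonic()
                try:
                    urllib.request.urlopen(BASE_URL + page, timeout=10).read()
                except Exception:
                    timings.append(float("inf"))  # count failures as worst-case
                    continue
                timings.append(time.monotonic() - start)
        return timings

    if __name__ == "__main__":
        # One thread per simulated user, all hammering the site at once.
        with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
            results = list(pool.map(simulate_user, range(CONCURRENT_USERS)))
        flat = [t for user in results for t in user]
        ok = [t for t in flat if t != float("inf")]
        print(f"{len(ok)}/{len(flat)} requests succeeded")
        if ok:
            print(f"avg response: {sum(ok)/len(ok):.3f}s, worst: {max(ok):.3f}s")

JMeter itself supplies the pieces this sketch lacks...ramp-up schedules, assertions on responses, and listeners that aggregate the timing data into reports.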
And yet, the scripts I've written and run are not things I would call "automation testing." These stress tests are indeed run by software and they are vital, but my experience with these tools and scripts does not put me in a position to expand the stress tests into an automation suite. Not only that, but the reporting I get from these runs gives me no sense of the stability of the product except in terms of how the load was handled. I don't know if a particular link is failing because there's bad code behind it or because the load caused the database to hang up (which is a bug just the same, but of a different kind).
So, for the past few months, I've been back to where I was near the beginning of my QA life...searching for the right tool. I'm not a "developer in QA"...my dev/coding knowledge is minimal. I can read debug output...I can parse code...I can even write some code...but I'm not the guy who's going to write an automation tool or framework in my spare time and 'wow' the team. Which means I turn to friends, colleagues and articles on the web.
Which is what led me to this article. I now have a few new products/tools/links to check out. I'm intrigued by the idea of 'automatically automating.' That concept seems to be the holy grail of holy grails...but the article suggests that it's not only possible, it's already happening.
With two products on the horizon, each expected to have thousands of concurrent users, load testing is critical. Beyond that, these systems are huge in scope and largely different from anything we've done before, which means a suite of well-defined test cases is vital. Having tools that help automatically generate those test cases, and then automate them, would make for a much less stressful summer.
I'm off to find the holy grail.