listeningTo: South Park, Season 1 Episode 4
inRealLife: It’s been a rough patch here. Luckily, I’m pretty easy to amuse, and putting on my “When it rains, it Poes” shirt, featuring a very sad Edgar Allan Poe under an umbrella, was enough to put a smile on my face this morning. If there’s one thing to know about me, it’s that I love a good pun.
whatIReadThisWeek: We’ve had a difficult release cycle. I feel like I’m saying that a lot here; it’s not the norm, just the recent norm, if that makes sense. My time is pretty much spoken for and my TBR list is growing. I use the OneTab Chrome extension to keep track of articles that show up in my inbox that I want to read and not lose, and I use Feedly to manage the many RSS feeds I’m subscribed to. Despite the busyness, I’m still trying to go through headlines and save some for later.
I did get a few reads in this week, which I’ll be talking about in this post. All three are about test automation: What’s Next in Test Automation and 5 Barriers to Automated Testing and How to Overcome Them both by SauceLabs, and 10 Best Practices in Test Automation #3: Build Maintainable Tests by Ranorex.
whatILearnedThisWeek: I built our automation project from scratch by myself earlier this year using a “codeless” solution called Sahi Pro – in fact, it’s one of the main reasons I got this promotion and we hired a new assistant. I don’t have as much time to focus on it as I’d like, though, so although it is effective and running, I also have a list 20+ items long of things I want to do.
This past release cycle, I ran through 37 scripts. I had a few more written, but I added them at the very last minute, they didn’t work as I hoped during the full test run, and I ended up pulling them and testing those steps manually. This kind of last-minute work is explicitly listed as one of the 5 barriers to automated testing in the SauceLabs article, and I’m happy to see I’m not alone here. The recommendation in the article is to “make ‘has automated testing’ a part of the acceptance criteria” when writing case requirements. This is something I’m going to experiment with this sprint. I will occasionally add a note to QA Plans to write technical documentation on a feature or add a manual test plan for regression testing, but I have not yet tried working automated scripts into each new feature (where applicable), and I think this is a great idea.
Another issue in the 5 barriers to automated testing article is “data dependency problems,” and I feel that at a molecular level. We don’t have updated test data. The CTO will refresh the staging databases occasionally, though there is no set schedule, and it usually only happens when I cry loudly enough that my tests are virtually ineffective because we no longer have production-like data. There is a long-outstanding case in the backlog to automate the data refresh, or to update the process so it’s a less onerous task, or at least one that someone other than the extremely busy CTO can do. I push it up every now and then only to have it de-prioritized. “We’ve lived with it this long, what’s a few more months?” they say, over and over and over. I have to choose my battles.
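A refresh like this doesn’t need to be sophisticated to take the CTO out of the loop. A minimal sketch of the idea – every hostname, database name, and the choice of PostgreSQL tooling below is a hypothetical placeholder, since I don’t actually know what our stack uses – is just “script the dump and restore so anyone can run it”:

```python
# Hypothetical sketch of a staging-data refresh helper. The hosts,
# database name, and pg_dump/pg_restore tooling are all assumptions;
# the point is only that the steps can be scripted and scheduled.
import subprocess

PROD_HOST = "prod-db.example.com"        # placeholder
STAGING_HOST = "staging-db.example.com"  # placeholder
DB_NAME = "app"                          # placeholder


def build_refresh_commands(dump_file="staging_refresh.dump"):
    """Return the two shell commands for a dump-and-restore refresh."""
    dump = ["pg_dump", "-h", PROD_HOST, "-Fc", "-f", dump_file, DB_NAME]
    restore = ["pg_restore", "-h", STAGING_HOST, "--clean",
               "-d", DB_NAME, dump_file]
    return dump, restore


def run_refresh(dry_run=True):
    """Print the commands (dry run) or actually execute them."""
    for cmd in build_refresh_commands():
        if dry_run:
            print(" ".join(cmd))
        else:
            subprocess.run(cmd, check=True)


if __name__ == "__main__":
    run_refresh(dry_run=True)  # show the commands without touching anything
```

Even something this small could be handed to anyone on the team, or dropped into a cron job, which is the whole point of the backlog case.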
whatIAmThinkingAbout: Reading the 3rd installment in Ranorex’s 10 Best Practices in Test Automation series sparked a lot of ideas, and I highly recommend this article in particular, as well as the first two. One of their tips is to “use a modular structure”. Um, yeah, that is ridiculously simple and something I am not doing at all in my current scripts. This sprint, I’m committed to spending more time refactoring my scripts, and the first step in that process was to audit what I currently have and identify what I can pull out as reusable functions and what set up / clean up tasks are necessary for each chunk, rather than one big set up at the beginning and one big (and partially manual) clean up at the end.
So far, I’ve settled on a Google Sheet that lists all my scripts by name in the first column, with a few columns to check off whether the script already includes set up or clean up, and whether I can identify a part or parts of the script to pull out into a reusable function (log in, impersonate a member, etc.). Once this list is complete, I’ll work on writing the actual functions, which will be a whole process of its own, since we all know I am not a coder, and what would take an actual developer 5 minutes to write will probably take me the better part of 5 hours.
recommendationsAndTakeAways: There is never a good, calm time to refactor. Refactoring projects are unwieldy; you start with a plan and quickly get sidetracked, thinking, “well, since I’m already working on this, let me just fix this, this, and this.” I’m going to attempt to put a cap on the refactoring and focus on the set up, clean up, and reusable functions, and not the 20 other things on my “I want to do this” list. The SauceLabs suggestion to work automated testing into the normal sprint/Agile process is very tempting. I’ll be experimenting with not only adding “write automated test” to some test plans for new features, but also writing my own cases so I can put an actual estimate on this work and set time aside for it. All other code gets tracked through Jira – why shouldn’t mine?