OK so since I was out last week on vacation, I thought I would post twice this week to make up for it :).
Over the years I've worked with several QA engineers, and after observing the amount of work that went into maintaining the test infrastructure I really wanted to see what was so difficult about it.
What I had observed was that tests failed quite a bit, took forever to run, and lacked any real methodology or atomicity.
Since most of the web automation industry uses Java, I also noticed a general lack of common software engineering practices that would really improve the efficiency and maintainability of these platforms.
In short, most of the automation code I've seen suffers from the same common problems.
As a result, I've decided to begin an undertaking to understand how automated testing with Java and Selenium works.
I did some research on how the majority of the software engineering world takes on automated testing using Selenium and I was not impressed.
One of the biggest pains I found was just trying to configure a modern Java Selenium project. What I wanted was something I could easily run from Eclipse, with all of my dependencies managed for me so I wouldn't have to worry about the classpath, downloading JARs, etc.
I absolutely, positively did not want to be confined to compiling and running Java on the command line. Dear God, kill me now if that's the experience I'm in for.
The obvious mechanism for all of this was to use Maven along with TestNG. Luckily there's a great website that tells you all about how to do it:
https://testng.org/doc/download.html
Yea, that's right - it's from the actual maintainer of the framework. And it wasn't the first or second or third result. It was basically at the end of the first page of results.
Google does a terrible job of recognizing when someone is looking for documentation and surfacing the actual source of the information. I think this is partly a side effect of just how many bloggers and "authority sites" are out there with dubious experience and information.
There were a ton of pages that only confused the reader and clearly hadn't even bothered to look at the official site, since all the advice they gave was way over the top and convoluted.
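For what it's worth, once you get to the official docs, the setup really does boil down to a couple of Maven dependencies. Here's roughly what the relevant part of a pom.xml looks like; the group and artifact IDs are the real ones, but treat the version numbers as placeholders to check against Maven Central:

```xml
<dependencies>
  <!-- Selenium's Java bindings: WebDriver, the browser driver classes, PageFactory support, etc. -->
  <dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>selenium-java</artifactId>
    <version>4.1.0</version>
  </dependency>
  <!-- TestNG as the test runner; test scope keeps it out of any packaged artifact -->
  <dependency>
    <groupId>org.testng</groupId>
    <artifactId>testng</artifactId>
    <version>7.4.0</version>
    <scope>test</scope>
  </dependency>
</dependencies>
```

With that in place, Eclipse's Maven integration pulls the JARs down and builds the classpath for you, which is exactly the experience I was after.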
With that behind me, I also began to discover that, like with any piece of software or framework out there, for every one good developer who knows how to document Selenium code, there are a hundred who absolutely do not.
Take for example any content from these domains:
And a bunch of sites just seemed to have duplicate content. I can't tell who copied whom, but whenever I found one I just threw the domain in timeout and excluded it from future search results.
Also, no one seems to be able to format their code, or explain succinctly why a certain pattern or programming decision was made.
I also found several good sources of information, one of them, funnily enough, being Stackoverflow.com.
Like I mentioned before, test suites are sometimes so complex that they end up with a good number of bugs of their own, on top of the bugs they're trying to detect in websites and web applications. Unsurprisingly, this was creating unplanned test failures and adding extra cycle time.
I also discovered good and bad design patterns and programming styles, ones that either cut down on code or severely bloated the test suite.
One big area I could tell was going to create massive headaches was browser driver management. Obviously I want to be able to test against multiple browsers, and the only way to do that is to instantiate a browser-specific implementation of Selenium's WebDriver interface.
Now, I saw a lot of terrible, error-prone patterns where drivers are managed directly in test classes, or aren't really made easily accessible to the tests at all.
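To make that concrete, here's a rough sketch of the kind of anti-pattern I mean. The class name, the hardcoded binary path, and the URL are all made up, but the shape is typical: every test class spins up and tears down its own driver, so switching browsers means editing every single test class.

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

// Hypothetical example of driver management baked directly into a test class.
public class CheckoutTest {

    private WebDriver driver;

    @BeforeMethod
    public void setUp() {
        // Hardcoded binary path and browser choice, duplicated in every test class.
        System.setProperty("webdriver.chrome.driver", "C:\\tools\\chromedriver.exe");
        driver = new ChromeDriver();
    }

    @Test
    public void cartTotalIsCorrect() {
        driver.get("https://example.com/cart");
        // ... assertions against the page ...
    }

    @AfterMethod
    public void tearDown() {
        driver.quit();
    }
}
```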
Google has this framework called Guice, which looks like it should be great for taking care of the dependency injection (DI) side of things for me.
I think I should be able to let it manage my WebDriver and browser driver configuration, and potentially even test case data.
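As a first pass, here's a minimal sketch of what I have in mind, assuming a Guice module that picks the browser from a system property. The module name, the property name, and the browser choices are my own placeholders, not anything Guice or Selenium prescribes:

```java
import com.google.inject.AbstractModule;
import com.google.inject.Provides;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

// Hypothetical Guice module that owns WebDriver creation so tests never do it themselves.
public class DriverModule extends AbstractModule {

    @Override
    protected void configure() {
        // Other suite-wide bindings (base URLs, test data sources, etc.) could go here.
    }

    @Provides
    WebDriver provideDriver() {
        // Pick the browser via -Dbrowser=...; assumes the driver binary is resolvable on PATH.
        String browser = System.getProperty("browser", "chrome");
        if ("firefox".equalsIgnoreCase(browser)) {
            return new FirefoxDriver();
        }
        return new ChromeDriver();
    }
}
```

TestNG also ships with a @Guice annotation for wiring modules into test classes, so in theory a test can just @Inject the WebDriver it needs; I'll verify that when I actually build this out.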
The Page Object Model is the best pattern out there for modeling webpages, and you don't have any excuse for not using it. It's not like this is something new either; it's been around since at least ~2013, and I can't understand why so many sites still use some weird pattern where WebDrivers and element selectors are all hardcoded into the tests. I definitely don't want to fall victim to that, so I'll be sure to follow the patterns suggested by the Selenium devs.
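To show what I mean, this is roughly how I want a test to read once pages are modeled as objects: the test talks to intent-level methods and never touches a locator. LoginPage is a hypothetical page object (I sketch one a bit further down), and the URLs are placeholders:

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.Test;

// Hypothetical test written against a page object; no selectors in sight.
public class LoginTest {

    @Test
    public void userCanLogIn() {
        // Created inline just to keep the sketch self-contained; in the real suite
        // the driver would come from the Guice module above.
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://example.com/login");
            new LoginPage(driver).loginAs("demo-user", "demo-pass");
            Assert.assertTrue(driver.getCurrentUrl().contains("/dashboard"));
        } finally {
            driver.quit();
        }
    }
}
```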
But to be fair, even the Selenium devs don't really take full advantage of all of the tooling that ships with the Selenium code.
For one, most of the selector boilerplate can be annotated away. I didn't want to always have to grab the driver, manage a field tied to some XPath or whatever, and then find the element before I could do any work with it.
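Concretely, this is the style of page object I wanted to avoid. It's a hypothetical example with made-up locators, but it shows the ceremony: every element means a locator field plus a findElement() call at the point of use.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// Hypothetical page object written the long way: locator fields plus explicit lookups.
public class LoginPageVerbose {

    private final WebDriver driver;
    private final By usernameField = By.id("username");
    private final By passwordField = By.id("password");
    private final By submitButton = By.cssSelector("button[type='submit']");

    public LoginPageVerbose(WebDriver driver) {
        this.driver = driver;
    }

    public void loginAs(String user, String pass) {
        driver.findElement(usernameField).sendKeys(user);
        driver.findElement(passwordField).sendKeys(pass);
        driver.findElement(submitButton).click();
    }
}
```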
After looking through the documentation, I found there are annotations available to make this work, and it looks like there's a good amount of community support for them.
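Here's the same hypothetical page rewritten with Selenium's @FindBy annotations and PageFactory. The locators are still made up, but @FindBy and PageFactory.initElements() are the real Selenium support classes:

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.FindBy;
import org.openqa.selenium.support.PageFactory;

// Same page object as above, with the locator/lookup boilerplate annotated away.
public class LoginPage {

    @FindBy(id = "username")
    private WebElement usernameField;

    @FindBy(id = "password")
    private WebElement passwordField;

    @FindBy(css = "button[type='submit']")
    private WebElement submitButton;

    public LoginPage(WebDriver driver) {
        // Wires each annotated field to a lazy element lookup against this driver.
        PageFactory.initElements(driver, this);
    }

    public void loginAs(String user, String pass) {
        usernameField.sendKeys(user);
        passwordField.sendKeys(pass);
        submitButton.click();
    }
}
```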
I also found some comparisons of other patterns automation developers seem to use, which might come in handy as far as lessons learned go.
I also wanted to make sure I could derive components that force the suite to wait for them to be visible, or else time out. This could of course be done with custom code in every implementation, but I don't want a bunch of boilerplate muddling up my test codebase. Heaven forbid I ever have to upgrade it for improvements; ugh, that would be terrible.
Luckily, after some thorough searching I found a great resource that I could tell would take my automation suite to the next level and get me away from writing a bunch of code just to interpret a page the way a human does.
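My own sketch of that idea, assuming Selenium 4's WebDriverWait and explicit-wait conditions, is a small base class that every page or widget extends so the waiting logic lives in exactly one place. The class name, the abstract readyMarker() hook, and the 10-second timeout are arbitrary choices for illustration:

```java
import java.time.Duration;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

// Hypothetical base class: subclasses name one element that proves they have rendered,
// and callers block on waitUntilVisible() before interacting with the component.
public abstract class BaseComponent {

    protected final WebDriver driver;
    private final WebDriverWait wait;

    protected BaseComponent(WebDriver driver) {
        this.driver = driver;
        // Duration-based constructor is Selenium 4; older 3.x versions take a long (seconds).
        this.wait = new WebDriverWait(driver, Duration.ofSeconds(10));
    }

    /** Each component declares the locator that signals it is ready for use. */
    protected abstract By readyMarker();

    /** Waits until the marker is visible, or throws a TimeoutException. */
    public WebElement waitUntilVisible() {
        return wait.until(ExpectedConditions.visibilityOfElementLocated(readyMarker()));
    }
}
```

A page object or widget would extend this, return its own marker locator, and call waitUntilVisible() at the start of any interaction, instead of sprinkling ad hoc sleeps and waits through every class.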
OK, so I've gone through a lot of concepts, and I can tell this will not be a simple undertaking. However, after doing this bit of research I feel much more prepared than if I'd just gone in half-cocked, which is what most developers writing Selenium tests appear to do.
I plan on writing some additional articles to showcase good patterns, heavily based on the content I've highlighted above.