As a software development company, what is your goal? What is the one thing you feel you need to do to ensure you have a job at the beginning of each wonderful work week? The answer is actually quite simple: you need to deliver a quality product. Like how I used the word simple? Although the answer I gave was simple, the process that makes that answer possible can sometimes be a frightening, unclear, and gloomy reality. But then one day you wake up and ask the question: what can I do to make this better? Well, for those who have woken up and asked themselves that question, this should help clarify at least one step in the process toward a smoother and more collaborative product delivery.
At the core of every great product is a great team with the ability to ensure a product delivery that puts a smile on everyone’s face. We all want to push a product to production and make sure it happens with as few flaws as possible. We also want to do that as quickly and efficiently as possible. That is where continuous integration becomes a key part of your software development mindset. When you look at the grand scheme of things, what scares you about continuous integration? It usually involves a lot of testing, and if you don’t have the proper test automation in place, how will you receive that feedback in a timely manner?
Well, continuous integration requires continuous feedback. What better way to get continuous feedback than a list of test results that confirm the stability of an application? With DevOps at the forefront of every innovative software application these days, the need for solid test automation is critical. It’s so critical that I would go as far as to say that you cannot call yourself a true DevOps shop without some form of test automation built around the application. You have unit testing, integration testing, UI regression testing, and the list goes on and on. So you ask yourself: what kind of test automation is out there? How much do I really need to automate, and where is my value? Well, as many within our company know, I like to break it down into a simplistic format.
Traditionally, you had three levels of test automation, tiered in a fashion that represented a triangle: a whole bunch of unit tests, a good chunk of integration tests, and a few UI smoke tests. Kind of like such…
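To make those three tiers a little more concrete, here is a rough sketch of what one test at each level might look like. This isn’t our actual code; the endpoint, page, and function names are made up, and I’m using pytest, requests, and Selenium simply as familiar stand-ins.

```python
# A rough illustration of the three tiers of the traditional triangle.
# All names and URLs here are hypothetical.
import requests
from selenium import webdriver


# --- Unit tier: lots of small, fast checks on isolated logic ---
def apply_discount(total, percent):
    """Hypothetical business function under test."""
    return round(total * (1 - percent / 100), 2)


def test_apply_discount():
    assert apply_discount(100.00, 15) == 85.00


# --- Integration tier: exercise the real service, skip the UI ---
def test_order_quote_service():
    response = requests.post(
        "https://test.example.com/orders/quote",   # hypothetical test endpoint
        json={"sku": "ABC-123", "quantity": 2},
    )
    assert response.status_code == 200
    assert response.json()["total"] > 0


# --- UI smoke tier: a few end-to-end checks that the site looks right ---
def test_home_page_loads():
    driver = webdriver.Chrome()
    try:
        driver.get("https://test.example.com")
        assert "Example" in driver.title
    finally:
        driver.quit()
```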
Many software development shops use this test automation triangle and it serves them quite well. Tests are still automated and the end result is pretty successful. Why argue with something that works, right? Well, I began to ponder the efficiency of this model after having numerous discussions with some of my very knowledgeable co-workers. Why not tweak that triangle a little bit and, in the spirit of DevOps, get the whole team involved in the process? Wouldn’t it be neat to live in a world where a developer and a QA actually worked together to build an automated test? An “integrated”, integration test. Sounds pretty cool, doesn’t it? In a nutshell, that is exactly what we did at my company. Now, before I get into explaining the process, let me quickly explain what our test automation triangle looks like now, and why…
Funny looking triangle, right? As our team began to develop and improve our testing processes, we found that the traditional triangle quickly shifted into a diamond. We began to write fewer, but higher quality, unit tests around our code. Integration testing became the bulk of our testing because we were able to test our business logic, separate from the actual UI, in a much timelier fashion. And of course, we still had our UI smoke tests that confirmed our site was looking the way it should and functioning appropriately. In the past, a developer would write the unit tests and the integration tests, which left a large portion of the testing effort on them and less time to actually develop.
The trickle-down effect starts to show its wear and tear on the unit tests. They become stale, unmaintainable, and irrelevant after a few iterations of work simply because the developer just doesn’t have the time to do it all. Not to mention, our QA were completely hands-off from the code and left in the dark with many questions when an ugly bug arose. In reality, our QA also had a lot of good scenario-based input that was valuable to a developer, but oftentimes that got lost in translation. So what did we do? We brainstormed, we kicked it around, and we finally found an awesome solution for getting the team integrated from a testing perspective.
That is where the wonderful tool of TestArchitect came into play. We were able to use TestArchitect as a source of data storage when writing automated integration tests with the help of a developer. The way it works is pretty slick. First off, a QA creates a test module within the TestArchitect tool; this is where the test data, or test scenarios, live. Next, a developer codes up a test harness that connects the data in the test module to the data being returned from a service or API. The cool thing is, when you execute the test, you are left with a combined effort that made the test successful. The developer didn’t do it all, and the QA provided valuable input on the data scenarios that needed to be tested. We NEEDED both the developer and the QA to work hand in hand, which created a natural collaboration between the two of them.
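To give a feel for that division of labor, here is a minimal sketch of the same idea using pytest rather than TestArchitect’s own tooling. The QA-owned piece is a plain file of data scenarios; the developer-owned piece is the harness that pushes each scenario through the service. The file name, endpoint, and fields are all hypothetical.

```python
# Sketch of a data-driven integration test: QA owns the scenarios,
# the developer owns the harness. Names and URLs are made up.
import csv

import pytest
import requests

ORDER_API_URL = "https://test.example.com/orders/quote"  # hypothetical endpoint


def load_scenarios(path="order_scenarios.csv"):
    """QA-owned data: each row is one business scenario (inputs + expected result)."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


@pytest.mark.parametrize("scenario", load_scenarios(), ids=lambda s: s["name"])
def test_order_quote(scenario):
    """Developer-owned harness: run one QA scenario against the real service."""
    response = requests.post(
        ORDER_API_URL,
        json={"sku": scenario["sku"], "quantity": int(scenario["quantity"])},
    )
    assert response.status_code == 200
    assert response.json()["total"] == float(scenario["expected_total"])
```

When a QA adds a new row to the scenario file, a new test case shows up on the next run without the developer touching the harness at all, which is exactly the kind of shared ownership we were after.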
With the help of TFS, you can then schedule all of those tests to run daily, hourly, at check-in, or whenever you feel they are needed to verify that the software being built is being thoroughly tested. I like to explain it with a simple process flow: a QA creates a test module that houses scenarios of data. The developer creates the harness that links the test data to the service or API being tested. As a result, the test executes and shows the quality of your application under test. It turned out to be a really collaborative effort and something that benefited our team tremendously.
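For anyone curious what that scheduling might look like, here is a hedged sketch in modern Azure Pipelines YAML (the successor to classic TFS build definitions, which were usually configured through the web UI rather than YAML). The branch name, cron schedule, and test command are assumptions for illustration.

```yaml
# Sketch of a pipeline that runs the integration suite on check-in and daily.
# Branch names, schedule, and paths are hypothetical.
trigger:
  - main                        # run the suite on every check-in to main

schedules:
  - cron: "0 6 * * *"           # ...and once a day at 06:00 UTC
    displayName: Daily integration run
    branches:
      include:
        - main
    always: true                # run even if nothing changed since the last build

pool:
  vmImage: ubuntu-latest

steps:
  - script: python -m pytest tests/integration --junitxml=results.xml
    displayName: Run integrated integration tests

  - task: PublishTestResults@2  # surface the results in the build summary
    inputs:
      testResultsFiles: results.xml
```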
So when I titled this article “Integrated, Integration Testing”, it was intentional. It truly was an effort that integrated our team. With that integrated effort came challenges and experiences that provided a great deal of value to the team in the long run. For the first time, our QA were able to understand the logic within the services of our site, and the developers were able to understand testing from a QA perspective thanks to constant exposure to the test scenarios. The more we sat back and watched all of this collaboration happen before our eyes, the more we realized we had something that really benefited the team. It was really neat to see a team work together to deliver a product that everyone truly cares about. At the end of the day, we all want to build software that pleases our customers. Doing it as an integrated team just made it that much more enjoyable.