In my previous post on Writing Acceptance Tests for ASP.NET MVC, I described the setup we eventually got working to execute automated acceptance tests for our application Marinas.info. This post provides some more background on the topic: which problems we encountered along the way and how we solved them.
From MvcIntegrationTestFramework to Browser, back to MvcIntegrationTestFramework, and back to Browser
MvcIntegrationTestFramework I
When we started writing our first acceptance tests, we of course took a look at the acceptance tests the Orchard team had already written inside the Orchard.Specs project of the Orchard source solution. They basically use the exact same approach as the MvcIntegrationTestFramework, hosting the AUT (application under test) in a manually created, ASP.NET-enabled AppDomain. Unfortunately, quite early on I ran into cookie-related problems that I wasn't able to solve, so we decided to just do browser-based tests.
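For readers who haven't seen that approach before, the core trick is the ApplicationHost.CreateApplicationHost API from System.Web.Hosting. The following is only a minimal sketch of it; the proxy type and the physical path are hypothetical, and both Orchard.Specs and the MvcIntegrationTestFramework add considerably more plumbing on top.

```csharp
using System;
using System.IO;
using System.Web;
using System.Web.Hosting;

// A MarshalByRefObject so the test AppDomain can call into the
// ASP.NET-enabled AppDomain that hosts the application under test.
public class AppHostProxy : MarshalByRefObject
{
    public string ProcessRequest(string page, string query)
    {
        var writer = new StringWriter();
        // SimpleWorkerRequest pushes the request through the full ASP.NET
        // pipeline of the hosted application and writes the response out.
        var workerRequest = new SimpleWorkerRequest(page, query, writer);
        HttpRuntime.ProcessRequest(workerRequest);
        return writer.ToString();
    }
}

public static class AppHostFactory
{
    public static AppHostProxy Create()
    {
        // CreateApplicationHost spins up a second, ASP.NET-enabled AppDomain
        // for the web application and returns a remoting proxy to the given
        // type. The assembly containing AppHostProxy has to be resolvable
        // from that AppDomain (e.g. copied to the AUT's bin folder).
        // The path below is a placeholder, not our actual layout.
        return (AppHostProxy)ApplicationHost.CreateApplicationHost(
            typeof(AppHostProxy),
            "/",
            @"C:\Projects\Marinas.info\src\Orchard.Web");
    }
}
```

The important point is that the proxy is a MarshalByRefObject, so the test code stays in the test runner's AppDomain while every request executes inside the ASP.NET-enabled one.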
Browser Tests I
Automatically executing browser-based tests has proven not to be as simple as it may look, at least for us. You have a host of components that need to play well together: your test code, a web server, a browser, the AUT, and, in a continuous integration scenario, also the test execution environment, TeamCity in our case. Here’s an overview of some of the problems we ran into:
- on TeamCity, the IIS Express process would not terminate during test teardown
- on TeamCity, the browser instances would not close when an exception happened during a test (see the teardown sketch after this list)
- a Firefox Portable instance would work on one dev’s machine but not on another one’s
- Internet Explorer seemed to work for a long time (at least locally) until we ran into problems with missing cookies when executing the tests on our CIS (Continuous Integration Server)
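In case it helps anyone, here is roughly the kind of defensive teardown the first two problems forced on us. Treat it as a sketch only: the Selenium WebDriver usage, the IIS Express command line, and all paths are assumptions for illustration, not our actual wrapper code.

```csharp
using System;
using System.Diagnostics;
using OpenQA.Selenium;
using OpenQA.Selenium.Firefox;

// Owns the web server process and the browser for one test run and
// guarantees both are torn down, even when a test throws.
public sealed class TestHost : IDisposable
{
    private readonly Process _iisExpress;
    public IWebDriver Browser { get; private set; }

    public TestHost()
    {
        // The arguments are placeholders; point them at the AUT's site and port.
        _iisExpress = Process.Start(new ProcessStartInfo(
            @"C:\Program Files\IIS Express\iisexpress.exe",
            @"/path:C:\Projects\Marinas.info\src\Orchard.Web /port:30320")
        {
            UseShellExecute = false
        });

        Browser = new FirefoxDriver();
    }

    public void Dispose()
    {
        // Quit() closes all browser windows and ends the WebDriver session.
        try { Browser.Quit(); } catch { /* best effort */ }

        // Kill the server explicitly instead of relying on it to exit on its own.
        if (_iisExpress != null && !_iisExpress.HasExited)
        {
            _iisExpress.Kill();
            _iisExpress.WaitForExit();
        }
    }
}
```

Wrapping both processes in a single IDisposable means a using block (or a try/finally in the test teardown) is enough to guarantee cleanup even when an assertion throws.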
MvcIntegrationTestFramework II
These problems and the fact that each test would take 2-3 min to run (mostly due to the Orchard site setup running as part of each scenario’s setup, and the use of the browser) pushed us to try the MvcIntegrationTestFramework again (if you count copying Orchard.Specs as the first time). We found a slightly evolved version at https://github.com/cvrajeesh/MvcIntegrationTestFramework which was ready to use in an MVC3 project and had been enhanced along the way. It worked quite well with a modified version of Orchard’s default Global.asax.cs which we found in the Orchard.Specs project. Unfortunately, trying to run a few more complicated tests against our AUT revealed one problem after another, and most of them seemed to be related to the modifications we had made to the application, which made it behave differently in a few ways from, e.g., our stage version.

After spending quite a few hours on fixing those problems, we decided that it would be best to use the exact same version of our Marinas.info application that would be running on our stage and live systems and write tests against that. This, though, had the effect that none of the magic hooks of the MvcIntegrationTestFramework worked anymore. They simply wouldn’t fire when using the Global.asax.cs that ships with the Orchard.Web project. Sadly, I was unable to find the root cause of this (mis-)behavior.
Browser Tests II
In the end, we went back to browser testing, this time using Firefox 10.0.6 ESR (Extended Support Release) and some improved wrapper code for managing the IIS Express process. But we kept the approach the MvcIntegrationTestFramework gave us, namely running code inside an independent AppDomain that hosts our AUT, to execute Orchard commands faster than by spinning up a Windows command processor and remote-controlling it. We also stopped running the Orchard setup procedure for every scenario – instead, we set up the application instance once at the beginning of all tests and reset the database to a predefined, clean state before each scenario runs. This cut test execution down from the previous 2-3 min to 30-45 sec, which is quite an improvement.
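To make the per-scenario reset a bit more concrete, here is a minimal sketch of one way such a reset can be implemented, assuming SpecFlow hooks and a SQL Server database snapshot. The hook class, the connection string, and the database and snapshot names are illustrative assumptions, not our actual code.

```csharp
using System.Data.SqlClient;
using TechTalk.SpecFlow;

[Binding]
public class DatabaseResetHooks
{
    // Connection string, database and snapshot names are placeholders.
    private const string MasterConnection =
        @"Server=.\SQLEXPRESS;Database=master;Integrated Security=true";

    [BeforeScenario]
    public void ResetDatabaseToCleanState()
    {
        using (var connection = new SqlConnection(MasterConnection))
        {
            connection.Open();
            var command = connection.CreateCommand();
            // Revert to the snapshot that was taken right after the one-time
            // Orchard setup. In practice, other open connections to the
            // database have to be closed first (e.g. by switching it to
            // SINGLE_USER mode).
            command.CommandText =
                "RESTORE DATABASE MarinasTest FROM DATABASE_SNAPSHOT = 'MarinasTest_Clean'";
            command.ExecuteNonQuery();
        }
    }
}
```

Taking the snapshot once, right after the one-time setup, is what makes the per-scenario reset cheap compared to re-running the whole Orchard setup every time.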
Conclusion
If you develop a fairly complex application, you have to roll your own environment for testing, automation, and continuous integration. The phases our code has gone through have taught me a lot, although some “mysteries” remain unsolved. For now, we’re happy to have reliable automated acceptance tests. That’s what this was all about in the first place.
Happy Coding!