Continuing the series on recommendations for testing with BizTalk, this post focuses on how to effectively test custom adapters developed by your team for use within a BizTalk solution.
As most of you will be aware, adapter development is probably one of the most difficult development tasks in the BizTalk arena. It is often easy to get the basic functionality of an adapter working, but once you put it into a production-like situation you will often discover a whole new set of problems.
Based on my experience across different projects, the most common approaches to testing adapters are as follows:
1. No Specific Testing
Unfortunately this is one of the most common approaches to testing adapters. In this approach the adapters are only tested when one of the BizTalk applications is deployed and that application's tests are run. The limitations of this type of testing are:
- The adapters are only tested within the context of a specific BizTalk application and the tests for that application may only test a subset of the code within the adapters
- The tests usually will not look at the performance of the adapters and any issues in this area will not be picked up until much later in the project
The main consequence of this approach is that bugs in the adapters tend to be picked up later in testing, when defects are higher profile and more expensive to handle. This often leads to a lack of confidence in the adapter.
2. Unit Testing
This approach is most commonly an extension of approach 1. The development team will perform some MsTest/NUnit style unit testing on the adapter code in addition to running tests for an application. While unit testing is generally good practice, the problem with unit testing an adapter is that a lot of the code depends heavily on the BizTalk framework. These tests are difficult to write, and you will probably end up writing a lot of tests which have limited value.
3. Ignoring Adapter Performance
The worst practice I often see in adapter development is when a project focuses on the functional development of an adapter and ignores the performance aspect. In this case some tests may be done and the adapter may be considered to work, but often later in the development life cycle, when some load is being pushed through the system, you find a problem.
The worst-case scenario here is that you end up doing significant bug fixing or refactoring of the approach, which costs the project a lot of time and money and potentially impacts other development teams.
4. The Proof of Concept becomes the Production Code
I once worked on a project where I produced a POC for a custom adapter to help out the development team. The POC did its job and then I moved on to another project. The development team then went on and developed their BizTalk solution using the POC, and later in the project realised they had not planned the task of productionising the POC. They were actually considering just using the POC in production - until I found out :-)
There are different views on POCs: whether you should just keep working on them until they are production ready, or start from scratch for the production code. I think either approach has its merits, but what is important is to remember that either way a POC is just a POC, and it almost always needs a lot more work before you can consider it to be of production quality.
Again, as with the previous posts in this series, the aims of the testing approach are as follows:
- We want to catch bugs early in the development process
- We want to have tests which are automated as much as possible
- We want to integrate our tests into a continuous integration process
- We want to reduce the risk of bugs in our adapters impacting the development and testing effort of a BizTalk application development
- We want to have an effective testing approach; we do not want to write hundreds of tests which have little value and increase the costs of development and maintenance
Unfortunately I haven't really got time to produce a sample like I have for some of the other posts, but I will list the things we typically do in an adapter development which help to deliver an effectively tested adapter. (If feedback about this article indicates a sample would be helpful then I may revisit this in the future.)
The things we usually do are as follows:
1. Proof of concept
Adapter development is quite a big task, so the first recommendation I would make is that before you get into doing a "proper" development of an adapter you should do a simple proof of concept. The POC should also focus on performance (see the details below) as this is where the biggest problems are likely to come from.
The importance of doing a POC is that it allows you to mitigate risks with the development of an adapter by testing the concept without having to go through the full development cycle.
The key point, once your POC is complete and your approach is validated, is that the POC is not the completed adapter deliverable! You should now either begin from scratch or spend time refactoring your POC into a production-quality solution, by which I mean one that:
- Is documented
- Is in source control
- Is tested (as below)
- Is built by a continuous integration solution
- Is easily deployable and manageable
2. Treat the adapter as a deliverable in its own right
As mentioned earlier, adapter developments are often done within the solution for a BizTalk application. This means they are only intended to be used by this application, and if the application works then the adapter is assumed to be fine. What we do instead is to separate the adapter into a solution in its own right. I also like to create a sample BizTalk application within this solution which is solely used for testing and demonstrating the functionality of the adapter.
3. Use the Adapter Wizard
Hopefully a lot of you will already do this, but if not I would recommend that when you develop adapters you look at using the adapter wizard on CodePlex to help you. When you use the wizard, most of the code within your adapter is generated and there are only a couple of places where you need to hand-code some bits.
The benefit of this is that you can make a decision not to test the "generated" code (or most of it) and trust that the wizard will produce code that works. This means you can significantly reduce the number of tests you need to produce while mitigating the risk of doing so.
4. Using the Facade Pattern
When I write the custom code that will do the communications for my adapter, I use a similar approach to the one I discussed in a previous article on developing pipeline components. I use the Facade pattern for my custom code, which allows me to simplify the interaction between the generated adapter code and my custom code as much as possible.
This allows me to focus my unit tests around my custom code, ensure it is well tested, and know that when I plug it into the generated adapter code I can be quite confident it will work as expected.
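As a rough illustration of the shape this takes, here is a minimal sketch. All the type and member names here are invented for the example (a real adapter would be passing BizTalk message parts and endpoint configuration rather than plain strings and byte arrays):

```csharp
using System;

// Hypothetical interface wrapping the real communications API
// (socket, FTP client, proprietary SDK, etc.)
public interface ITransmitClient
{
    void Send(string destination, byte[] payload);
}

// The facade: the single, simple entry point the wizard-generated
// adapter code will call. All hand-written logic lives behind it.
public class TransmitFacade
{
    private readonly ITransmitClient client;

    public TransmitFacade(ITransmitClient client)
    {
        this.client = client;
    }

    public void Transmit(string destination, byte[] payload)
    {
        // Validation and any mapping/retry logic is concentrated here,
        // where it can be unit tested without BizTalk being involved.
        if (string.IsNullOrEmpty(destination))
            throw new ArgumentException("A destination must be configured");
        if (payload == null || payload.Length == 0)
            throw new ArgumentException("Cannot transmit an empty message");

        client.Send(destination, payload);
    }
}
```

Because the facade depends only on `ITransmitClient`, the unit tests can pass in a stub implementation and verify the validation and interaction logic in isolation, which is exactly the part of the adapter that is otherwise hard to test.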
5. BizUnit Tests
So far I have only really unit tested my custom code within the adapter. Next I will extend the solution with a BizTalk application, which will allow me to produce a set of messaging or orchestration processes that test the way the adapter will work within BizTalk.
Within this application I can now focus on how the adapter will work and produce processes which will help test that.
Some examples of the kinds of processes I will produce are:
- A messaging process which will pick up a file and then via subscription send to a port which will use my custom adapter to send to the external system (or a stub of it)
- An orchestration process which subscribes to a receive port which uses my custom adapter. The port receives a message and the orchestration does something simple with it.
From these couple of samples you can see we are just exercising the core functionality of the adapter, and the scenarios would be simple to test using BizUnit.
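As a sketch of the first scenario, a BizUnit test case might look something like the following. The step type names come from the standard BizUnit 2.x step library, but the directories, file names and timeout are purely illustrative and would need to match your own test bindings:

```xml
<TestCase testName="Adapter_SendPort_DeliversMessageToStub">
  <TestSetup>
    <!-- Clean the stub's output directory before the run -->
    <TestStep assemblyPath="" typeName="BizUnit.FileDeleteMultipleStep">
      <Directory>C:\AdapterTests\Out</Directory>
      <SearchPattern>*.xml</SearchPattern>
    </TestStep>
  </TestSetup>
  <TestExecution>
    <!-- Drop a sample message where the FILE receive location picks it up -->
    <TestStep assemblyPath="" typeName="BizUnit.FileCreateStep">
      <SourcePath>C:\AdapterTests\Data\SampleOrder.xml</SourcePath>
      <CreationPath>C:\AdapterTests\In\SampleOrder.xml</CreationPath>
    </TestStep>
    <!-- The send port using the custom adapter should deliver it to the stub -->
    <TestStep assemblyPath="" typeName="BizUnit.FileValidateStep">
      <Timeout>10000</Timeout>
      <Directory>C:\AdapterTests\Out</Directory>
      <SearchPattern>*.xml</SearchPattern>
      <DeleteFile>true</DeleteFile>
    </TestStep>
  </TestExecution>
</TestCase>
```

The second, orchestration-based scenario would follow the same pattern, with the validation step checking whatever output the simple orchestration produces.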
Also note that, because we would be deploying to BizTalk before running these tests, we would also be testing the design-time aspects of the adapter, since we would be configuring and deploying bindings.
6. Basic LoadGen Tests
After we have used BizUnit to test the main functions of the adapter, it is also a good idea to perform some performance tests on it. I tend to prefer an approach with 2 levels of performance tests for the adapter. At the first level I will look to include some basic load tests within the set of tests for the adapters. To do this I will identify some of the BizUnit tests I have already developed, and then use LoadGen to execute these tests in a load scenario.
To do this I use a technique I have mentioned in a previous post, where I created a custom BizUnit Transport Type for LoadGen. For more information about this refer to the link below:
This approach allows me to reuse some of the tests I have already developed when performance testing, rather than having to develop a whole new set in LoadGen's XML format.
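For context, a plain FILE-transport LoadGen configuration section has roughly the following shape. The element names are recalled from the sample configuration files that ship with LoadGen, so check them against your own install; the paths, thread counts and durations are purely illustrative. The custom-transport technique swaps the standard transport for one that drives the existing BizUnit tests under load:

```xml
<LoadGenFramework>
  <CommonSection>
    <LoadGenVersion>2</LoadGenVersion>
    <NumThreadsPerSection>2</NumThreadsPerSection>
    <SleepInterval>200</SleepInterval>
    <LotSizePerInterval>25</LotSizePerInterval>
    <RetryInterval>10000</RetryInterval>
    <TestDuration>300</TestDuration>
    <Transport Name="FILE">
      <Assembly>FileTransport.dll/FileTransport.FileTransport</Assembly>
    </Transport>
  </CommonSection>
  <Section Name="FileSection">
    <!-- Repeatedly drops copies of the sample message into the
         receive location used by the BizUnit tests -->
    <SrcFilePath>C:\AdapterTests\Data\SampleOrder.xml</SrcFilePath>
    <DstLocation>
      <Parameters>
        <DstFilePath>C:\AdapterTests\In</DstFilePath>
      </Parameters>
    </DstLocation>
  </Section>
</LoadGenFramework>
```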
7. More extensive Performance Testing with LoadGen
While the basic tests mentioned above can be integrated into your continuous integration process, their aim is not to fully stress the adapter, just to put some load through it, see how it performs, and see if any obvious defects are thrown up. This again allows you to catch bugs early, but I would not consider it to be a proper performance test.
To do an extensive performance test you would want to create some proper LoadGen tests and have a production-like BizTalk setup, rather than the developer machine I used in the basic tests.
This testing takes a bit of setting up and in some cases tends to get done a little later in the development life cycle. However, it is still very important, and I would recommend deploying the sample/test BizTalk application you have developed alongside your adapters to allow you to run a performance session on the adapters in isolation. Although this means you will be doing additional performance sessions, it has a few benefits, listed below:
- The adapters' development may be completed well before the overall BizTalk solution, so you can test the adapters in isolation and get them right
- You can take this opportunity to work on your performance testing approach in a smaller scale and get it working efficiently in time for the bigger effort required to performance test your overall solution
OK, so again this was quite a big article, but hopefully by using some of the things I have discussed here you will have a pragmatic approach to testing custom BizTalk adapters. With this approach you will be able to deliver a high-quality component while minimising the risks and costs of the development.