Adding JUnit support for webMethods Integration Server


Greenbird has created a simple JUnit-based framework to enable effortless and comprehensive unit and functional testing of webMethods Integration Server Flow elements.


Greenbird is working with Europe’s leading credit management services company, Intrum Justitia, to create an SOA-enabled integration solution targeting the energy industry in Norway. The solution exposes a set of generic APIs plus a number of client-specific adapters, and integrates with a back-end debt collection system by exchanging flat files. A large number of integration points and data transformations are involved.

The solution is implemented on webMethods Integration Server from Software AG, a best-of-breed integration platform with a “point and click” based IDE called webMethods Developer.

Integration Server has great support for SOA enablement and application integration. It also has great support for manual testing, including a debugger with stepping and breakpoint capabilities.
It does not, however, come with built-in support for automated testing.

We are following an agile process and must be able to continuously handle new and changing requirements. Due to the “point and click” nature of the Integration Server IDE, a lot of manual work is involved in refactoring existing features and adding new ones. It was therefore even more important than usual for us to write robust unit and integration tests to discover bugs and regressions as early as possible.

We wanted to use a testing framework to make the process of writing and maintaining high-quality tests as effortless as possible. There are a few commercial frameworks that enable writing automated tests directly in Integration Server, but we wanted to leverage existing infrastructure and skill sets based on Java and the JUnit framework. We found an open source JUnit-based framework called wmUnit, but the project has been inactive since 2006 and lacked some of the features we wanted. In the end we decided to write a simple framework ourselves.

These were the main requirements for our framework:

  1. The tests must be able to run in a Continuous Integration setup.
  2. We must be able to write the tests as regular JUnit tests.
  3. The plumbing involved when doing remote calls to the Integration Server must be invisible.
  4. It must be very easy to produce input data.
  5. It must be very easy to perform assertions on the data returned by the Integration Server.

The requirement regarding Continuous Integration support was easily attained by fulfilling the requirement that the tests be regular JUnit tests: a plethora of CI products support JUnit-based projects.

The rest of this article gives an overview of how we implemented the other requirements.

JUnit-based tests

We created a generic JUnit based superclass that makes the utilities described below easily available for test subclasses. We based the framework on the latest version of JUnit to ensure that the tests could be written using modern paradigms such as test annotations and Hamcrest matchers.

Transparent login and session management

The superclass takes care of logging into the correct Integration Server and maintaining an authenticated Context for the tests.

Transparent invocation and service selection

By using a convention-over-configuration approach, the superclass is able to determine which Integration Server service is under test simply by analyzing the fully qualified name of the test class.
The convention is that tests are placed in a package reflecting the folder structure of the service and named according to the following pattern: <service name>Test.
For example, to test the service my.folder.structure:myService you create a test with the fully qualified name my.folder.structure.MyServiceTest.
This enables tests to invoke the service under test with no configuration whatsoever.
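The name mapping behind the convention can be sketched in plain Java. The helper class and method names below are hypothetical illustrations, not the framework's actual code:

```java
// Hypothetical helper illustrating the naming convention: the test class
// my.folder.structure.MyServiceTest maps to the Integration Server
// service my.folder.structure:myService.
public class ServiceNameResolver {

    public static String serviceNameFor(String testClassName) {
        int lastDot = testClassName.lastIndexOf('.');
        String packageName = testClassName.substring(0, lastDot); // my.folder.structure
        String simpleName = testClassName.substring(lastDot + 1); // MyServiceTest
        // Strip the "Test" suffix and lower-case the first letter
        String base = simpleName.substring(0, simpleName.length() - "Test".length());
        String serviceName = Character.toLowerCase(base.charAt(0)) + base.substring(1);
        return packageName + ":" + serviceName; // my.folder.structure:myService
    }
}
```

The superclass can apply this kind of mapping to `getClass().getName()` at runtime, so individual tests never mention the service name at all.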


Cursor management

Integration Server returns data in IData structures. IData requires that you access data by opening a cursor, and the cursor needs to be destroyed after use. The test superclass manages the top-level cursor so that simple return values can be looked up without thinking about resource management:

IDataCursor responseCursor = invoke(inData);
String result = getString(responseCursor, "result");

IData builder DSL

The IData programming model is very verbose and requires manual resource management. Building even the simplest multi-level input document is time consuming and error prone.
As an example, we are going to build an IData structure reflecting the following document model:

<rootElement>
  <subElement>string value</subElement>
  <subDocument>
    <name1>value 1</name1>
    <name2>value 2</name2>
  </subDocument>
</rootElement>

This would require the following IData code:

IDataCursor cursor = null;

IData subDocument = IDataFactory.create();
cursor = subDocument.getCursor();
IDataUtil.put(cursor, "name1", "value 1");
IDataUtil.put(cursor, "name2", "value 2");
cursor.destroy();

IData rootElement = IDataFactory.create();
cursor = rootElement.getCursor();
IDataUtil.put(cursor, "subElement", "string value");
IDataUtil.put(cursor, "subDocument", subDocument);
cursor.destroy();

IData document = IDataFactory.create();
cursor = document.getCursor();
IDataUtil.put(cursor, "rootElement", rootElement);
cursor.destroy();

We wanted to be able to easily define smaller documents directly in the test code so we created a simple Java based Domain Specific Language (DSL) that hides the complexities of the IData API. Here’s our example document model implemented using the DSL:

IData document = doc(
    el("rootElement", doc(
        el("subElement", "string value")
        .el("subDocument", doc(
            el("name1", "value 1")
            .el("name2", "value 2"))))));

The DSL enabled us to quickly define documents, elements and lists without worrying about the IData programming API or resource management.
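As a rough illustration of how such a builder DSL can be put together, here is a minimal sketch that uses a plain java.util.Map where the real framework builds an IData. Only the doc/el names mirror the DSL above; the implementation is our assumption, not the framework's code:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch of the builder DSL idea. A plain Map stands in for
// IData; the real framework would create an IData, open a cursor,
// put the entries and destroy the cursor inside doc().
public class Dsl {

    public static class ElementChain {
        final Map<String, Object> entries = new LinkedHashMap<>();

        // Chained form used to append further elements: el(...).el(...)
        public ElementChain el(String name, Object value) {
            entries.put(name, value);
            return this;
        }
    }

    // Static entry point that starts a chain of elements
    public static ElementChain el(String name, Object value) {
        return new ElementChain().el(name, value);
    }

    // Materializes the chain into a document structure
    public static Map<String, Object> doc(ElementChain chain) {
        return chain.entries;
    }
}
```

With static imports of doc and el, nested documents can then be written exactly in the shape of the DSL snippet above.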

Converting XML to IData

The DSL is suitable for smaller input documents, but when documents get bigger it is usually sensible to maintain the test data separately from the test code. Integration Server provides a couple of services for converting XML to IData (pub.xml:xmlStringToXMLNode and pub.xml:xmlNodeToDocument). We leveraged these services to enable tests to store larger input documents in an intuitive XML format. Our example document can simply be saved in the format described above:

<?xml version="1.0" encoding="UTF-8"?>
<rootElement>
  <subElement>string value</subElement>
  <subDocument>
    <name1>value 1</name1>
    <name2>value 2</name2>
  </subDocument>
</rootElement>

The input data structure can then be loaded into the test using the following one-liner:

IData inputData = toIData("my_test.xml");

Converting IData to XML for easy querying

Querying an IData document using the Integration Server Java API is just as verbose and error prone as building one. We decided we would rather work with XML, since there are many Java tools for querying XML documents.
Again we leveraged an existing Integration Server service (pub.xml:documentToXMLString) to convert incoming IData structures to XML documents.

We chose the XPath utility from the Java SDK as our query tool. We also created a little wrapper for it (XPathRoot) to make it simpler to use.

This made it easy for tests to query even the most complex IData documents using XPath:

String xml = toXML(myData);
XPathRoot docRoot = xPath(xml);
String value2 = docRoot.value("/rootElement/subDocument/name2");

…or, if you prefer a one-liner:

String value2 = xPath(toXML(myData)).value("/rootElement/subDocument/name2");
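For illustration, a minimal version of such a wrapper can be built directly on the JDK's javax.xml machinery. The class below matches the XPathRoot usage shown above, but its implementation is our own sketch, not the framework's actual code:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;

import org.w3c.dom.Document;

// Sketch of a small XPath convenience wrapper in the spirit of the
// XPathRoot described above. Checked exceptions are wrapped so that
// test code stays free of try/catch boilerplate.
public class XPathRoot {

    private final Document document;
    private final XPath xpath = XPathFactory.newInstance().newXPath();

    private XPathRoot(Document document) {
        this.document = document;
    }

    // Parses the XML string and returns a queryable root
    public static XPathRoot xPath(String xml) {
        try {
            Document parsed = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            return new XPathRoot(parsed);
        } catch (Exception e) {
            throw new IllegalStateException("Could not parse XML", e);
        }
    }

    // Evaluates the expression and returns the matching text content
    public String value(String expression) {
        try {
            return xpath.evaluate(expression, document);
        } catch (Exception e) {
            throw new IllegalStateException("Bad XPath expression: " + expression, e);
        }
    }
}
```

Wrapping the checked exceptions is what makes the one-liner style shown above possible inside a JUnit test method.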

FTP support

It is very easy to set up an FTP endpoint on Integration Server so that services can be invoked with the content of files received from clients. Integration Server can also return the result of the service invocation as a file named according to the pattern <input filename>.out. This makes FTP a viable transport for synchronous request/response style service invocations.

We created a simple way to test our synchronous FTP endpoints by integrating the Apache Commons Net FTP client into our test superclass. Files can now be transmitted and the response received with a one-liner:

String resultXml = ftpStore("file_name_to_put.txt", inputStream);