Unit Testing Servlets with Weblogic and Cactus

This document outlines our experiences in using Cactus on a Weblogic project for a major French financial institution.  Not all of our findings will be applicable to your project but hopefully this document will give you some pointers to writing your own unit tests with Cactus.

Introduction to Unit Testing and Test Driven Development

The basic principles of Test Driven Development (TDD) are:

  • A complete suite of unit tests must be created and maintained
  • No code goes into production unless the associated tests have been created
  • Code is not checked into source control unless it compiles
  • Code is not moved to the test environment unless it passes the test suites
  • Tests are written before code
  • The tests determine the code to be written

TDD is a core component of the eXtreme Programming movement, but much of what it advocates was standard practice before the Internet craze of rapid application development took hold. Back in the old days of structured top-down development, test plans and scripts were developed in conjunction with a QA team from the specifications produced by the analysts.

Creating tests before coding starts is faster than creating tests afterwards. This is because the programmer already has a framework in which to test the code he is creating, and so gets immediate feedback on errors or incomplete functionality. It also helps the programmer focus on what needs to be done: requirements are defined by the tests, and there can be no misunderstanding of a specification written in the form of executable code. In brief, it keeps us honest; there is no more '90% complete' or 'almost works', no more forgetting those inconvenient corner cases.
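The workflow can be sketched in plain Java. Everything here is illustrative rather than project code: AccountFormatter and its pad() method are hypothetical, and the checks in main() stand in for a real test framework. The assertions were written first; pad() is then the simplest implementation that satisfies them.

```java
// Test-first in miniature. The checks in main() are the "specification
// in executable form"; pad() was written afterwards to make them pass.
// AccountFormatter is a made-up example, not part of the project.
public class AccountFormatter {

    // Pads an account number with leading zeros to a fixed width of 10
    // (a typical mainframe-interface requirement).
    static String pad(String accountNumber) {
        StringBuffer sb = new StringBuffer(accountNumber);
        while (sb.length() < 10) {
            sb.insert(0, '0');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // these checks existed before pad() did
        if (!pad("1234").equals("0000001234"))
            throw new RuntimeException("padding failed");
        if (!pad("0123456789").equals("0123456789"))
            throw new RuntimeException("full-width value must be unchanged");
        System.out.println("all tests pass");
    }
}
```

Because the tests are written first, '90% complete' becomes a measurable statement: it is simply the proportion of such checks that pass.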

However, we found that management is not so easily convinced of the merits of automated testing. First of all, it appears that a great block of time is being allocated to testing before the project even starts. Most of today's junior managers are only used to chaotic Internet Time projects, and they have also been burned once too often by geeks with their own agenda claiming that the latest bit of open source software they have downloaded for free from the net will solve their problems.

We found that using TDD on a small, contained section of the overall project was the best way to convince management of the merits of this approach. Once they can see, and demonstrate to their superiors, the benefits, they are usually very open to adoption.

Enter JUnit and Cactus

It is possible to unit test Java code by developing a main method with each class to exercise the various APIs; indeed this was once the favoured methodology. However, it doesn't present programmers with a standard method of building, launching and analysing tests. In the J2EE world it also fails to run tests in the same environment as the production server. JUnit solves the first problem and has been widely adopted by the Java community. Cactus is an extension of the JUnit package that enables testing in a J2EE environment.
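The 'main method per class' style looks like the following sketch. The ShareHolding class is a made-up example, not part of the project; the point is that each class carries its own ad-hoc harness and its own reporting conventions, which is exactly the lack of standardisation that JUnit addresses.

```java
// The old self-testing style: each class ships a main() that exercises
// its own API. It works, but every class invents its own pass/fail
// reporting and nothing aggregates results across classes.
public class ShareHolding {
    private final int quantity;

    public ShareHolding(int quantity) {
        this.quantity = quantity;
    }

    public int valueAt(int pricePerShare) {
        return quantity * pricePerShare;
    }

    // ad-hoc test harness, one per class
    public static void main(String[] args) {
        ShareHolding h = new ShareHolding(100);
        if (h.valueAt(25) != 2500) {
            System.err.println("valueAt FAILED");
            System.exit(1);
        }
        System.out.println("ShareHolding: all tests pass");
    }
}
```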

Configuring Weblogic 5.1 with Cactus

It is worth remembering that if you are still running Weblogic 5.1 with an early service pack, it is only certified to work with JDK version 1.2.*. There are some API changes in JDK 1.3+ that can cause problems with EJB lookups, amongst other things. Fortunately there are Cactus releases for both 1.2 and 1.3 JVMs if you find yourself in this situation. It is therefore important to download and install the version with the 12 tag.


The next question is where to place the various libraries. This is complicated by the wonderful and varied class-loading mechanisms present in Weblogic, and by how you construct your application. Assuming you have servlets, JSPs and EJBs, you will probably build a WAR to hold the servlets and JSPs, and build the EJBs into one or more JAR archives. If you are only planning on unit testing your servlets, and in many environments this will also exercise your EJBs, you can place all third party JARs into the WAR under the WEB-INF/lib directory. This keeps your classpath from getting too polluted. If you are testing both EJBs and Servlets, load the Cactus JARs in the Weblogic classpath specified in your startup script.

With the introduction of Weblogic 6.1 and support for EAR files you can also place the Cactus libraries under a top level WEB-INF/lib directory for access by EJBs. This effective duplication of the Cactus libraries may appear strange, but the EJB and Servlet containers should be thought of as separate sandboxes. If you decide to distribute your servlets and EJBs onto different servers there should be no further changes in configuration.

As a core you will need the following libraries:

  • cactus.jar
  • junit.jar
  • commons-logging.jar
  • aspectjrt.jar
  • commons-httpclient.jar
  • log4j.jar

You may also want to add httpunit.jar if you want to parse the returned HTML output into a DOM tree for testing.

Writing Unit Tests and Developing a Test Suite

This section will reference our application to show how we build a single test case and then go on to develop a complete test suite. The application is a Voice Mail system that integrates with an existing financial Web site. Customers call a number to consult and update their share holdings and other account information. The Voice Mail server was provided by Netcentrex and is customizable via .DLLs (shared libraries) written in C++. This limited interactions with the J2EE application, and in the end HTTP was used with returned data encapsulated in XML. XML-RPC was considered as an alternative, as there are libraries for both C++ and Java, but the external consultants had limited experience with this technology.

Readers should be aware that no state is stored on the server and the architecture uses a single Front Controller and Command pattern.
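That arrangement can be sketched as follows. This is an illustration only: the Command interface, IDTCommand class, FrontController class and "command" parameter name are all hypothetical stand-ins; the real application dispatches inside a servlet's doPost, as shown later.

```java
import java.util.HashMap;
import java.util.Map;

// A stripped-down sketch of the Front Controller / Command arrangement.
// The controller looks up a Command by the request's "command" parameter
// and delegates to it; no state is kept between requests.
interface Command {
    String execute(Map params);
}

class IDTCommand implements Command {
    public String execute(Map params) {
        // the real command would authenticate against the EJB layer and
        // build an XML response like the one shown later in this article
        return "<response request=\"IDT\" errorCode=\"okay\"/>";
    }
}

public class FrontController {
    private final Map commands = new HashMap();

    public FrontController() {
        commands.put("IDT", new IDTCommand());
        // ... one entry per command the Voice Mail server can send
    }

    // the equivalent of the servlet's doPost: pick the command and run it
    public String handle(Map params) {
        Command cmd = (Command) commands.get(params.get("command"));
        if (cmd == null) {
            return "<response errorCode=\"unknownCommand\"/>";
        }
        return cmd.execute(params);
    }
}
```

Because each request is self-contained and stateless, every command can be exercised in isolation by a test that simply supplies the right parameters.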

We needed to test each individual command, the returned XML data and a sequence of interactions. An ideal candidate for Cactus.

Parsing Data

Normally you would use httpunit for verifying the returned HTML or XML code. Httpunit can parse the returned data stream into a Document Object Model (DOM), which is an easy-to-navigate Java representation of the HTML tree. However, the application already uses JDOM, the Java alternative to DOM, to read an XML format configuration file and exchange messages with a mainframe system. In order to keep the number of third party libraries to a minimum we decided to stick with JDOM.
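For readers more familiar with the standard DOM API, the root-element check we build below looks like this when written against the JDK's built-in javax.xml.parsers classes instead of JDOM. This is purely an illustration that runs without any third-party jars; the project code itself uses JDOM's SAXBuilder, Document and Element classes, which are close analogues.

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

// Parses a response stream and returns the document root, failing
// loudly if there is no root element -- the same job our JDOM helper
// does, but using the standard DOM API for illustration.
public class ResponseParser {

    static Element getDocRoot(InputStream is) throws Exception {
        DocumentBuilder builder =
            DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(is);
        Element root = doc.getDocumentElement();
        if (root == null)
            throw new RuntimeException("XML Response has no root element");
        return root;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<response request=\"IDT\" errorCode=\"okay\"/>";
        Element root = getDocRoot(new ByteArrayInputStream(xml.getBytes()));
        System.out.println(root.getAttribute("errorCode")); // prints "okay"
    }
}
```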

As described above, our architecture consists of a single Front Controller servlet that dispatches requests to Command objects based on a request parameter. In total there are about a dozen Command objects, representing the various requests the Voice Mail server can generate. In order to better structure our test suite we created a test case class per command. Normally these classes would be placed in a package under the classes to be tested. So if our commands were in esweb.servlet.svi, all the test classes would be in the package esweb.servlet.svi.tests.

Again, the location and packaging of your tests and Servlets is another place where the Weblogic classloaders can come back to bite you. Your test cases must be accessible to the Servlet classloader and vice-versa. As we were only testing servlets, we packaged all the test classes into the same WAR as the Servlet classes, that is under the WEB-INF/classes directory. Our WAR has the following structure:

  • WEB-INF/classes/
    Servlet and test classes
  • WEB-INF/lib/
    Cactus and other third party JARs
  • WEB-INF/web.xml
  • WEB-INF/weblogic.xml
  • JSPs

The first class is a wrapper for the test suite. Package declarations, imports and other code have been omitted in all the examples to make things clearer.

public class TestControllerServlet extends ServletTestCase {

    public TestControllerServlet(String theName) {
        super(theName);
    }

    public static Test suite() {
        TestSuite suite = new TestSuite("SVI integration tests");
        suite.addTest(new TestSuite(TestIDT.class));
        // etc

        return suite;
    }
}
Each test suite must be added to the suite() method. The order is important: the IDT command connects and authenticates our users with the system, so it must be executed before the other commands. We also added a utility function to this class to help us parse XML documents with JDOM:

static Element getDocRoot(InputStream is) throws JDOMException {
    SAXBuilder builder = new SAXBuilder();
    Document doc = builder.build(is);
    Element root = doc.getRootElement();
    assertNotNull("XML Response", root);

    return root;
}
This also asserts that the XML response has been parsed and that the root element is not null.

The Test Cases

Our first test case makes sure that a user can connect to the system. We would also test all the error paths, such as an incorrect username or password and connection problems to the server hosting the EJB logic. Here is the first wrinkle: the tests are performed in alphabetical order. In the completed case I also test for an incorrect password; this test must be performed before the test which correctly authenticates the user, as it has the effect of breaking the session. I therefore add a number to the test method name to show the order in which it should be executed; in this case we are looking at the second test. The somewhat purer alternative is to put each test in its own class [indeed we later adopted this approach, your mileage may vary as they say]. For structural reasons these could all be in the same file, perhaps using inner classes, with a test suite class as shown above.
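The effect of the numeric prefix can be demonstrated with a little reflection. The method names here are examples only, not the full project test case; sorting the collected names alphabetically reproduces the execution order described above, putting test1BadPassword ahead of test2IDTCommand.

```java
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Collects the test* method names from a class and sorts them
// alphabetically, showing why a numeric prefix controls ordering.
public class TestOrdering {
    // declared out of order on purpose
    public void test2IDTCommand() {}
    public void test1BadPassword() {}

    public static void main(String[] args) {
        Method[] methods = TestOrdering.class.getDeclaredMethods();
        List names = new ArrayList();
        for (int i = 0; i < methods.length; i++) {
            if (methods[i].getName().startsWith("test")) {
                names.add(methods[i].getName());
            }
        }
        Collections.sort(names); // alphabetical order
        System.out.println(names); // prints [test1BadPassword, test2IDTCommand]
    }
}
```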

public class TestIDT extends ServletTestCase {

    // these are shared by all the test cases
    static String employeeId;
    static String profile;

    public void begin2IDTCommand(WebRequest request) {
        request.addParameter(ControllerServlet.COMMAND_KEY, "IDT");
        request.addParameter(IDTCommand.LOGIN, "23908789");
        request.addParameter(IDTCommand.PASSWORD, "324890");
    }

First we have to prepare the request parameters; the begin* and end* methods are executed on the client side. Control is then transferred to the Cactus ServletTestRunner class running within the J2EE framework. This calls our suite and each test method:

public void test2IDTCommand() throws ServletException, IOException {
    ControllerServlet servlet = new ControllerServlet();
    servlet.init(config);
    servlet.doPost(request, response);
}

We create a new ControllerServlet, initialize it and run the doPost method. As the doGet method is not supported by our application we may also want to add a test to show that calling this causes some kind of error. Our Front Controller will select the Command to perform the IDT operation and will pass it the parameters LOGIN and PASSWORD (defined as constant strings in the Class itself). This is exactly the same as if the Servlet had been invoked by the client, in our case a Voice Mail server.

I'm going to talk some more about this approach in the next paper on integrating Cactus with The Grinder, a white box performance testing tool.  It is important to remember that Cactus is aimed at Unit testing, not at testing a complete conversation.

So far, so simple. We now want to check the returned XML and extract any data useful for subsequent tests. Cactus passes a WebResponse object to our end method; from this we can get a response stream and parse it into a JDOM structure. Don't worry too much about the JDOM specifics, they are very similar to the DOM structure generated by httpunit.

Given an XML response with the following format:

<?xml version="1.0" encoding="iso-8859-1" standalone="yes" ?>

<response request="IDT" errorCode="okay">
    <user employeeID="0053412341" employeeProfile="0020200"/>
</response>

The following method gets our document root, checks that the request and errorCode values are as expected, then extracts the employeeID and checks that it is ten characters long.

public void end2IDTCommand(WebResponse response)
        throws IOException, JDOMException {

    Element root = TestControllerServlet.getDocRoot(response.getInputStream());

    // perform tests on returned xml
    Attribute request = root.getAttribute("request");
    assertEquals("request", "IDT", request.getValue());

    Attribute errorCode = root.getAttribute("errorCode");
    assertEquals("errorCode", "okay", errorCode.getValue());

    // check the user element
    Element user = root.getChild("user");
    employeeId = user.getAttribute("employeeID").getValue();
    assertEquals("employeeId", 10, employeeId.length());
}

We now want to run our LIS command, which depends on the user having been correctly authenticated with the system. The beginLISCommand simply picks up the values stored by the TestIDT case. There are a couple of things to remember: the begin and end methods are run on the client side, and the test case object lives only as long as the test is being run. We therefore have to store returned data in class level variables and make these static so we can access them later.

This also means that the data is shared by all threads running the test case, a point to remember if you are running Cactus tests in parallel, perhaps for stress testing.

public void beginLISCommand(WebRequest request) {
    request.addParameter(ControllerServlet.COMMAND_KEY, "LIS");
    request.addParameter("employeeID", TestIDT.employeeId);
    request.addParameter("employeeProfile", TestIDT.profile);
}
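The need for static fields can be shown in isolation. SharedState is a made-up class, not project code: each test method runs on a fresh instance of the test case, so instance fields set by one test are lost when that instance is discarded, while static fields live at the class level and survive.

```java
// Demonstrates static versus instance state across test-case instances.
public class SharedState {
    static String employeeId;   // class level: survives across instances
    String localCopy;           // instance level: lost with the instance

    public static void main(String[] args) {
        SharedState first = new SharedState();     // the "IDT test" instance
        SharedState.employeeId = "0053412341";     // stored by the IDT test
        first.localCopy = "0053412341";

        SharedState second = new SharedState();    // a later test's instance
        System.out.println(SharedState.employeeId); // prints 0053412341
        System.out.println(second.localCopy);       // prints null
    }
}
```

The flip side, as noted above, is that these statics are shared by every thread running the tests, so parallel runs can interfere with each other.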

Configuring Weblogic to Run Cactus

One final thing: you will need to tell the Weblogic server about the Cactus ServletTestRunner. This is done with entries in the WEB-INF/web.xml file along the following lines (the standard declarations from the Cactus documentation):

<!-- CACTUS BEGIN -->
<servlet>
    <servlet-name>ServletRedirector</servlet-name>
    <servlet-class>org.apache.cactus.server.ServletTestRedirector</servlet-class>
</servlet>
<servlet>
    <servlet-name>ServletTestRunner</servlet-name>
    <servlet-class>org.apache.cactus.server.runner.ServletTestRunner</servlet-class>
</servlet>
<servlet-mapping>
    <servlet-name>ServletRedirector</servlet-name>
    <url-pattern>/ServletRedirector</url-pattern>
</servlet-mapping>
<servlet-mapping>
    <servlet-name>ServletTestRunner</servlet-name>
    <url-pattern>/ServletTestRunner</url-pattern>
</servlet-mapping>
<!-- CACTUS END -->

We can run our test suite by typing a URL like the following in our browser (adjust the host and port to your server configuration):

http://localhost:7001/svi/ServletTestRunner?suite=esweb.servlet.svi.tests.TestControllerServlet

This calls the ServletTestRunner (our web application is called svi in the weblogic.properties file) with the test to be run as the parameter. The output is an XML file giving the results of our test run.


It is very easy to test a complete Web application with Cactus. Sceptics might think that giving the application a quick run with a browser is sufficient, or that a load test tool can be used to perform the same granularity of testing.

However, it is extremely time consuming and error prone to test a whole J2EE application that way and produce a sensible error report that developers can use to pin-point problems. Unit tests are an essential part of regression testing, that is, the tests that should be run every time a change is made to the system.

Unit tests also provide guidance to programmers: there is no 'it almost works'; what works and what is yet to be completed are obvious to both programmers and management alike.

About The Author

David George is an independent software engineer located in Paris, France.  He specialises in Java Performance Tuning and Consulting on J2EE project management.