Overview of JUnit

JUnit is a framework for writing unit tests. This section helps you focus your development efforts around proving that the code you wrote works and that later, when you refactor to add more features, it still works. Let's clarify the following concepts: test case, test fixture, and test suite. A test case defines a fixture to run a related set of tests. Typically, every class that you write should have a test case. A test fixture provides resources: primitive variables and objects that tests need to run. A test suite is a collection of related test cases. For example, if we write a HashMap class, we write a test case for it. The test case has a test fixture object for each test so that we can put objects into the map, pull them out of the map, and perhaps compare them as part of the test. However, if we write a library of collection objects, we write a test suite for it that contains the test case we wrote for the HashMap class; think of a test suite as a collection of test cases.

Let's put these concepts into practice. In order to write a test case, we do the following:

  1. Subclass junit.framework.TestCase.

  2. If we need fixture objects, override the setUp() method.
  3. Define a number of tests that return void and whose method name begins with test, such as testAdd(), testPut(), and testIterator().
  4. If we need to release resources that were part of the fixture, override the tearDown() method.
  5. If we need to group a related set of test cases, define a suite of tests.

The next section discusses writing your own test case based on java.util.HashMap.

Writing a Test Case

An excellent example of creating a test case is provided in the samples that ship with JUnit. The example is called VectorTest, and it shows how you would go about writing a test case for java.util.Vector. The good thing about this example is that most people are familiar with the Vector class. In the same spirit, we created a simple example based on java.util.HashMap (we will go through this example step by step):

/*
 * HashMapTest.java
 *
 * Created on February 8, 2004, 2:29 PM
 */
package xptoolkit.junit.example;
import junit.framework.*;
import java.util.Map;
import java.util.HashMap;
import junit.extensions.*;
/**
 *
 * @author Rick Hightower
 * @version 1.0
 */
public class HashMapTest extends TestCase {
 private Map testMap;
 private Map testMap2;
 public static Test suite() {
 return new TestSuite(HashMapTest.class);
 }
 public static void main (String[] args) {
 junit.textui.TestRunner.run (suite());
 }
 private static final String APPLE_KEY = "AppleCEO";
 private static final String APPLE_VALUE = "Steve Jobs";
 protected void setUp() {
 testMap = new HashMap();
 testMap.put(APPLE_KEY, APPLE_VALUE);
 testMap.put("OracleCEO","Larry Ellison");
 testMap2 = new HashMap();
 testMap2.put("1", "1");
 testMap2.put("2", "2");
 }
 public void testPut(){
 String key = "Employee";
 String value = "Rick Hightower"; //put the value in
 testMap.put(key, value);
 //read the value back out
 String value2 = (String)testMap.get(key);
 assertEquals("The value back from the map ", value, value2);
 }
 public void testSize(){
 assertEquals (2, testMap.size());
 }
 public void testGet(){
 assertEquals(APPLE_VALUE, testMap.get(APPLE_KEY));
 assertNull(testMap.get("JUNK_KEY"));
 }
 public void testPutAll(){
 testMap.putAll(testMap2);
 assertEquals (4, testMap.size());
 assertEquals("1", testMap.get("1"));
 testGet();
 }
 public void testContainsKey(){
 assertTrue("It should contain the apple key", testMap.containsKey(APPLE_KEY));
 }
 public void testContainsValue(){
 assertTrue(testMap.containsValue(APPLE_VALUE));
 }
 public void testRemove(){
 String key = "Employee";
 String value = "Rick Hightower"; //put the value in
 testMap.put(key, value);
 //remove it
 testMap.remove(key);
 //try to read the value back out
 assertNull(testMap.get(key));
 }
 }


Let's break down the example based on the steps we defined in the last section for writing a test case. Step 1 is to define a class that derives junit.framework.TestCase, as follows:

import junit.framework.*;
...
public class HashMapTest extends TestCase {


Next, if our test case needs a fixture, we override the setUp() method (Step 2), which the HashMapTest does as follows:

 protected void setUp() {
 testMap = new HashMap();
 testMap.put(APPLE_KEY, APPLE_VALUE);
 testMap.put("OracleCEO","Larry Ellison");
 testMap2 = new HashMap();
 testMap2.put("1", "1");
 testMap2.put("2", "2");
 }


Here we see that the fixture the test case sets up consists of instances of the class under test: the HashMap class. In addition, the test fixture puts a few String keys and values into each HashMap instance. Because the objects that the setUp() method creates will be garbage-collected when we are done with them, we don't have to write a tearDown() method (Step 4). If the setUp() method allocated resources like network connections or database connections, then we would override the tearDown() method to release those resources. Next, the HashMapTest class defines several tests to test the HashMap class, as follows (Step 3):

 public void testPut(){
 String key = "Employee";
 String value = "Rick Hightower"; //put the value in
 testMap.put(key, value);
 //read the value back out
 String value2 = (String)testMap.get(key);
 assertEquals("The value back from the map ", value, value2);
 }
 public void testSize(){
 assertEquals (2, testMap.size());
 }
 public void testGet(){
 assertEquals(APPLE_VALUE, testMap.get(APPLE_KEY));
 assertNull(testMap.get("JUNK_KEY"));
 }
 public void testPutAll(){
 testMap.putAll(testMap2);
 assertEquals (4, testMap.size());
 assertEquals("1", testMap.get("1"));
 testGet();
 }
 public void testContainsKey(){
 assertTrue("It should contain the apple key", testMap.containsKey(APPLE_KEY));
 }
 public void testContainsValue(){
 assertTrue(testMap.containsValue(APPLE_VALUE));
 }
 public void testRemove(){
 String key = "Employee";
 String value = "Rick Hightower"; //put the value in
 testMap.put(key, value);
 //remove it
 testMap.remove(key);
 //try to read the value back out
 assertNull(testMap.get(key));
 } 


Note that each test method becomes a test. The JUnit framework uses reflection to look for methods whose names begin with test and uses them as test cases. It does this when we invoke the TestSuite constructor in the static suite() method, as follows:

 public static Test suite() {
 return new TestSuite(HashMapTest.class);
 }
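
JUnit's reflective test discovery can be imitated in a few lines of plain Java. The following sketch (hypothetical TestFinder and SampleTest classes, not JUnit's actual implementation) scans a class for public void no-argument methods whose names begin with test, which is essentially what the TestSuite constructor does:

```java
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

public class TestFinder {

    // A stand-in class with two "tests" and one helper method.
    public static class SampleTest {
        public void testAdd() {}
        public void testRemove() {}
        public void helper() {}
    }

    // Collect the names of public void no-argument methods starting with "test".
    static List<String> findTests(Class<?> c) {
        List<String> names = new ArrayList<String>();
        for (Method m : c.getMethods()) {
            if (m.getName().startsWith("test")
                    && m.getParameterTypes().length == 0
                    && m.getReturnType() == void.class) {
                names.add(m.getName());
            }
        }
        return names;
    }

    public static void main(String[] args) {
        // Prints testAdd and testRemove (order is unspecified); helper is skipped.
        System.out.println(findTests(SampleTest.class));
    }
}
```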


A test suite (TestSuite) is a collection of test cases. The test cases themselves can be other test suites. Thus the test suite is a composite of tests using the composite design pattern. Notice that each test performs an operation on one or both of the HashMaps and then asserts that some condition is true, as follows:

 public void testPutAll(){
 testMap.putAll(testMap2);
 assertEquals (4, testMap.size());
 assertEquals("1", testMap.get("1"));
 testGet();
 }


The assertTrue() method asserts that a condition is true; if you are an old C/C++ coding dog, this assert works much like the one in assert.h. If the condition is not true, the assert method throws an AssertionFailedError, which is an unchecked exception that causes the test to fail. The JUnit API includes various forms of assert methods; for example, we can pass a description of the assertion, as in the testContainsKey() method:

 public void testContainsKey(){
 assertTrue("It should contain the apple key", testMap.containsKey(APPLE_KEY));
 }


Or we can opt to leave it out, as follows:

 public void testContainsValue(){
 assertTrue(testMap.containsKey(APPLE_VALUE));
 }
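
Under the hood, an assert method simply checks a condition and throws an unchecked error when it fails. The following stand-in (hypothetical AssertDemo class; real JUnit throws junit.framework.AssertionFailedError rather than RuntimeException) illustrates the idea:

```java
public class AssertDemo {

    // Minimal stand-ins for JUnit's assertTrue overloads.
    static void assertTrue(String message, boolean condition) {
        if (!condition) throw new RuntimeException(message); // JUnit throws AssertionFailedError here
    }

    static void assertTrue(boolean condition) {
        assertTrue("assertion failed", condition);
    }

    public static void main(String[] args) {
        assertTrue("map should contain key", true); // passes silently
        try {
            assertTrue(false); // fails: the error carries the default message
        } catch (RuntimeException e) {
            System.out.println("caught: " + e.getMessage()); // prints: caught: assertion failed
        }
    }
}
```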


Note that the setUp() and tearDown() methods are called before and after every testX() method that is run. Because the setUp() method does not allocate any resources that need to be released, the HashMapTest does not need to override the tearDown() method. If it did, the code would look something like this:

 protected void setUp() throws Exception {
 //get a db connection (dbUrl is a hypothetical JDBC URL)
 connection = DriverManager.getConnection(dbUrl);
 statement = connection.createStatement();
 results = statement.executeQuery("select count(*) from Pet");
 }
 protected void tearDown() throws Exception {
 //release the db resources
 results.close();
 statement.close();
 connection.close();
 }
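
The per-test lifecycle described above (setUp() before each test, tearDown() after it, even when the test fails) can be mimicked in plain Java. This is an illustrative sketch (hypothetical LifecycleDemo class), not JUnit's internals:

```java
public class LifecycleDemo {
    static StringBuilder log = new StringBuilder();

    static void setUp()    { log.append("setUp "); }
    static void tearDown() { log.append("tearDown "); }

    // Run one test body, wrapped in the setUp/tearDown lifecycle.
    static void runTest(Runnable test) {
        setUp();
        try {
            test.run();      // the test body
        } finally {
            tearDown();      // always runs, even if the test throws
        }
    }

    public static void main(String[] args) {
        runTest(new Runnable() { public void run() { log.append("testA "); } });
        runTest(new Runnable() { public void run() { log.append("testB "); } });
        System.out.println(log); // prints: setUp testA tearDown setUp testB tearDown
    }
}
```

Each test gets a fresh pass through setUp(), which is why the HashMapTest methods can mutate testMap freely without affecting one another.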


So far, we've created a test case and fixture objects and tested with assert, but how do we group these test cases into a suite of related test cases? The authors of JUnit have also provided an example of how to do this. They define two tests in the JUnit samples directory: VectorTest and MoneyTest. Then, they define a test suite to run the test cases in the class AllTests, defined in the next listing.

package junit.samples;
import junit.framework.*;
import junit.samples.money.MoneyTest;
/**
 * TestSuite that runs all the sample tests
 *
 */
public class AllTests {
 public static void main (String[] args) {
 junit.textui.TestRunner.run (suite());
 }
 public static Test suite() {
 TestSuite suite = new TestSuite("All JUnit Tests");
 suite.addTest(VectorTest.suite());
 suite.addTest(new TestSuite(MoneyTest.class));
 suite.addTest(junit.tests.AllTests.suite());
 return suite;
 }
}


The code in this listing compiles several suites of tests into one suite. Notice that the main method calls junit.textui.TestRunner.run, passing it the returned value from the static suite() method. The suite() method creates an instance of TestSuite and then adds suites of tests from VectorTest, MoneyTest, and junit.tests.AllTests. Notice that when the AllTests suite() method adds the VectorTest, it calls VectorTest's suite() method, which is defined as follows:

 public static Test suite() {
 return new TestSuite(VectorTest.class);
 }


As you can see, the VectorTest's static suite() method creates a new TestSuite instance by passing itself as a class. TestSuite uses reflection to extract the test methods that make up the suite. The end effect is that you can group related tests into larger and larger test suites. Thus, you could have suites of tests nested in a larger suite. The alternative would be to run every TestCase independently, which would take a long time and would be tedious. Nesting suites of tests enables you to test large portions of code quickly.
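
The composite structure described above can be sketched without JUnit at all. The following self-contained example (hypothetical CompositeDemo class, modeled loosely on JUnit's Test/TestSuite types) shows how a suite can hold both individual test cases and other suites, and treat them uniformly:

```java
import java.util.ArrayList;
import java.util.List;

public class CompositeDemo {

    // The common interface: both leaves and composites are Tests.
    interface Test { int countTestCases(); }

    // Leaf: a single test case counts as one test.
    static class TestCase implements Test {
        public int countTestCases() { return 1; }
    }

    // Composite: a suite holds Tests, which may themselves be suites.
    static class TestSuite implements Test {
        private final List<Test> tests = new ArrayList<Test>();
        void addTest(Test t) { tests.add(t); }
        public int countTestCases() {
            int n = 0;
            for (Test t : tests) n += t.countTestCases();
            return n;
        }
    }

    public static void main(String[] args) {
        TestSuite inner = new TestSuite();
        inner.addTest(new TestCase());
        inner.addTest(new TestCase());
        TestSuite all = new TestSuite();
        all.addTest(inner);             // a suite nested inside a suite
        all.addTest(new TestCase());
        System.out.println(all.countTestCases()); // prints: 3
    }
}
```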

We have now covered the basics of JUnit. In the next section, we integrate the test suite with Ant.

Integrating JUnit with Ant

JUnit and Ant go together like a horse and carriage. Ant automates the build-and-deploy process. JUnit automates testing. Put them together, and Ant can automate the build, deploy, and test process. Ant has several tags to support JUnit. For the integration to work, we need the Extensible Stylesheet Language Transformations (XSLT) transform engine JAR file installed; refer to the Ant User Manual documentation for more information. We also need to put the JAR file for JUnit on the Ant classpath, and we must download the optional.jar file from the Apache site (go to http://jakarta.apache.org/ant/index.html and select the "download" option). The easiest way to put these JAR files on the Ant classpath is to copy them to the lib directory in the ANT_HOME directory (ANT_HOME/lib). Once we have the required JAR files, we can build and test the last example with the following Ant buildfile, which we put in the ANT_HOME directory:

<project name="junitSample" default="test" basedir=".">
 <target name="init">
 <property name="outdir" value="/tmp/junitSample" />
 </target>
 <target name="prepare" depends="init">
 <mkdir dir="${outdir}" />
 </target>
 <target name="compile" depends="prepare">
 <javac srcdir="." destdir="${outdir}" classpath="junit.jar"/>
 </target>
 <target name="test" depends="compile">
 <junit printsummary="true" >
 <test name="junit.samples.AllTests" />
 <classpath>
 <pathelement location="${outdir}" />
 </classpath>
 </junit>
 </target>
</project>


Let's quickly break down this buildfile. The name of the project is junitSample, and it has the typical targets, as follows: init, prepare, compile, and test. The test target is the default target of the junitSample project's buildfile. The init target creates an outdir property that holds the location of the output directory. The prepare target creates the output directory (outdir). The compile target builds the JUnit sample source code (discussed in the last section) to the output directory (outdir). The interesting target is the test target, as follows:

 <target name="test" depends="compile">
 <junit printsummary="true" >
 <test name="junit.samples.AllTests" />
 <classpath>
 <pathelement location="${outdir}" />
 </classpath>
 </junit>
 </target>


The test target depends on the compile target. The test target uses the junit task defined in the optional.jar file—note that you must have junit.jar on the classpath in order for this task to work. The junit task can run a test created with the JUnit framework, such as junit.samples.AllTests, described in the last section. The junit task has a sub-element called test. We use the sub-element test to set the classname of the test case we are going to run. In addition, we set up the classpath for JUnit so that it can find the sample classes we compiled in the compile target. Running the code yields these results:

C:\tools\junit> ant
Buildfile: build.xml
init:
prepare:
compile:
test:
 [junit] Running junit.samples.AllTests
 [junit] Tests run: 86, Failures: 0, Errors: 1, Time elapsed: 0.911 sec
 [junit] TEST junit.samples.AllTests FAILED
BUILD SUCCESSFUL
Total time: 2 seconds


The sample test for our JUnit distribution failed! This event is a nice segue to our next point. As you can see, the summary report for running the test is not very verbose—in fact, it's terse. It is hard to tell which test failed. This result may not be what you want. In fact, we are sure that in the real world, you probably want to know which test failed. All we have to do is add a formatter sub-element that directs JUnit to print out a more detailed report. To do so, we add the following to the test target under the junit task (<formatter type="plain" usefile="false"/>):

 <target name="test" depends="compile">
 <junit printsummary="true" >
 <formatter type="plain" usefile="false"/>
 <test name="junit.samples.AllTests" />
 <classpath> 
 <pathelement location="${outdir}" />
 </classpath>
 </junit>
 </target>


Now we get much more detailed information, as follows:

Buildfile: build.xml
init:
prepare:
compile:
test:
 [junit] Running junit.samples.AllTests
 [junit] Tests run: 86, Failures: 0, Errors: 1, Time elapsed: 0.941 sec
 [junit] Testsuite: junit.samples.AllTests
 [junit] Tests run: 86, Failures: 0, Errors: 1, Time elapsed: 0.941 sec
 [junit]
 [junit] Testcase: testCapacity took 0 sec
 [junit] Testcase: testClone took 0 sec
 [junit] Testcase: testContains took 0 sec
 . . .
 . . .
 [junit] Testcase: testFailAssertNotNull took 0 sec
 [junit] Testcase: testSucceedAssertNotNull took 0 sec
 [junit] Testcase: testFilter took 0 sec
 [junit] Caused an ERROR
 [junit] null
 [junit] java.lang.NullPointerException
 . . .
 . . .
 [junit] Testcase: testJarClassLoading took 0.01 sec
 [junit] TEST junit.samples.AllTests FAILED
BUILD SUCCESSFUL


We can clearly see that the testFilter failed. What a bummer! But let's not leave this section on a bad note. We'll change the Ant buildfile to build and test the VectorTest described in the previous section so we can show a test that passes. The test target changes as follows:

 <target name="test" depends="compile">
 <junit printsummary="true" >
 <formatter type="plain" usefile="false"/>
 <test name="junit.samples.VectorTest" />
 <classpath> 
 <pathelement location="${outdir}" />
 </classpath>
 </junit>
 </target>


Then we run it as follows:

Buildfile: build.xml
init:
prepare:
compile:
test:
 [junit] Running junit.samples.VectorTest
 [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 0.01 sec
 [junit] Testsuite: junit.samples.VectorTest
 [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 0.01 sec
 [junit]
 [junit] Testcase: testCapacity took 0 sec
 [junit] Testcase: testClone took 0 sec
 [junit] Testcase: testContains took 0 sec
 [junit] Testcase: testElementAt took 0 sec
 [junit] Testcase: testRemoveAll took 0 sec
 [junit] Testcase: testRemoveElement took 0 sec
BUILD SUCCESSFUL
Total time: 1 second


Perhaps you were hoping for a little more from your reporting. It would be nice if you could display the results in a Web page. Then you could have an automated build that would run every night, send out a status email, and post the results on your department's intranet Web site. You can do that with the junitreport task. First we must change the formatter sub-element's "type" attribute to "xml"; it was set to "plain". This setting outputs the test information in XML format. We also need to set the "usefile" attribute to "true"; for the last example, it was "false". The default "usefile" attribute value is "true", so we will remove it altogether. Here is the updated test target:

 <target name="test" depends="compile">
 <junit printsummary="true" >
 <formatter type="xml" />
 <test name="junit.samples.VectorTest" />
 <classpath>
 <pathelement location="${outdir}" />
 </classpath>
 </junit>
 </target>


Now, when we run the buildfile, it creates an XML file named TEST-junit.samples.VectorTest.xml. The contents of the XML file are as follows:

<?xml version="1.0"?>
<testsuite errors="0" failures="0" name="junit.samples.VectorTest" tests="6" time="0.201">
 <testcase name="testCapacity" time="0"></testcase>
 <testcase name="testClone" time="0"></testcase>
 <testcase name="testContains" time="0"></testcase>
 <testcase name="testElementAt" time="0"></testcase>
 <testcase name="testRemoveAll" time="0"></testcase>
 <testcase name="testRemoveElement" time="0"></testcase>
</testsuite>


Because we now have the output in XML, we can use the junitreport task, which takes the XML and transforms it to HTML using XSLT. You don't have to know XSLT to use the junitreport task. There are two types of reports: those with frames and those without. We add the junitreport task tag after the junit task tag, as follows:

 <junitreport todir="./reports">
 <fileset dir=".">
 <include name="TEST-*.xml"/>
 </fileset>
 <report format="frames" todir="./report/html"/>
 </junitreport>


When we run this buildfile it generates the report shown on the next page.

[Figure: the HTML test report generated by the junitreport task]

As you can see from this figure, the report that is generated allows you to navigate the tests that were run. Therefore, instead of building large suites, you may want to use Ant and the junit task and just specify the tests you want to run as file sets; you will be able to generate really nice reports.

