ITKv3 Procedure for Adding a Test

From KitwarePublic

Appropriate testing is one of the most important aspects of writing software.

How to Add a Test in ITK

The general procedure involves the following steps:

  • Write the test itself
  • Add the Test to the Test Driver
  • Add the Test to the CMakeLists.txt file
  • Run the Test locally
  • Submit an Experimental build
  • Commit the Test to CVS

Write the test

Naming the file

Unit tests should be named with the name of the class they are testing and the word Test at the end.

For example:

  • class itkBMPImageIO will have a test called itkBMPImageIOTest.cxx
  • class itkIndex will have a test called itkIndexTest.cxx

Sometimes it may be necessary to have multiple tests, in which case a number will be added after the "Test" string of the name.

For example:

  • itkBMPImageIOTest.cxx
  • itkBMPImageIOTest2.cxx
  • itkBMPImageIOTest3.cxx

Note that we skip the "1" entry. It is implicitly the first test.

Writing the content

The test file must contain a function that has the EXACT same name as the file (without the extension).

For example, the file itkBMPImageIOTest.cxx must contain a function

    int itkBMPImageIOTest( int argc, char * argv [] )

The function must have a return value, which must be one of the following two options:

  • EXIT_SUCCESS, indicating that the test passed
  • EXIT_FAILURE, indicating that the test failed
NOTE: Input for tests should reside in Testing/Data/Input or Examples/Data. Tests should not depend on the output of another test. All tests are not necessarily run in the same session. So each test should be able to run with a clean Testing/Temporary directory.
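Putting these conventions together, a minimal test function looks like the sketch below. The body is hypothetical (a real test would exercise the class under test, here itk::BMPImageIO); only the function name and the two return values are required by the conventions above.

```cpp
#include <cstdlib>
#include <iostream>

// Sketch of a minimal ITK-style unit test. The function name matches the
// filename without its extension (itkBMPImageIOTest.cxx). The body shown
// here is hypothetical; a real test would exercise the class under test.
int itkBMPImageIOTest( int argc, char * argv [] )
{
  if( argc < 2 )
    {
    std::cerr << "Usage: " << argv[0] << " inputImageFileName" << std::endl;
    return EXIT_FAILURE;  // test fails
    }
  // ... construct the class under test and verify its behavior here ...
  return EXIT_SUCCESS;    // test passes
}
```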

Adding The Test to the Test Driver

What are Test Drivers

ITK uses test drivers in order to manage its very large number of unit tests. A test driver aggregates many tests into a single executable by registering all of them as functions.

As a general guideline, there is a Test driver per major directory. The test driver is named after the directory that it is testing.

For example, the classes in the directory

    Insight/Code/Algorithms

are tested by files in the directory

    Insight/Testing/Code/Algorithms

and will use a test driver file called

    itkAlgorithmsTests.cxx
Note that in some directories, there are so many classes that multiple test drivers are needed. In such cases they are named numerically as:

  • itkAlgorithmsTests.cxx
  • itkAlgorithmsTests2.cxx
  • itkAlgorithmsTests3.cxx
  • itkAlgorithmsTests4.cxx

As a general guideline, a test driver should not contain more than 50 tests. Note however that, in some cases, the reason for having multiple drivers is not necessarily the large number of tests that they aggregate, but their own code size, which tends to give trouble to some linkers (e.g. Borland).
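Conceptually, a test driver is just a dispatch table from test names to test functions. The following simplified sketch illustrates the idea; it is not ITK's actual driver code (which is generated from itkTestMain.h), and the registered test function is a stand-in:

```cpp
#include <cstdlib>
#include <map>
#include <string>

// Every registered test is a function with this signature.
typedef int (*TestFunction)(int, char *[]);

// Stand-in for a real test function such as itkIndexTest.
int itkIndexTest( int, char * [] ) { return EXIT_SUCCESS; }

// The driver keeps a name -> function table; registering a test
// effectively adds one entry like the one below.
int RunRegisteredTest( const std::string & name, int argc, char * argv [] )
{
  std::map< std::string, TestFunction > registry;
  registry[ "itkIndexTest" ] = itkIndexTest;

  std::map< std::string, TestFunction >::const_iterator it =
    registry.find( name );
  if( it == registry.end() )
    {
    return EXIT_FAILURE;  // unknown test name
    }
  return it->second( argc, argv );  // dispatch to the selected test
}
```

When ctest invokes the driver executable, the first command line argument selects the test function, and the remaining arguments are forwarded to it.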

How to add the tests

Check if there are multiple test drivers in the directory, and if so, select the last one in numerical order. For the example above, a new test should be added in itkAlgorithmsTests4.cxx. First check whether this test driver has fewer than 50 tests inside. If it already has more than that, you may have to create a new driver (e.g. itkAlgorithmsTests5.cxx in this case).

You can add the test by simply inserting the following line in the RegisterTests() function of the test driver:

    REGISTER_TEST( itkBMPImageIOTest2 );
In order to make life easier for maintainers, it is nice to keep the tests sorted alphabetically, but this is not a software requirement.

Add the Test to the CMakeLists.txt file

The tests should be registered as well in the CMakeLists.txt file of the Testing directory where the test file and its driver reside.

Add filename to list of sources to compile

Typically this can be done with the following procedure:

  • Find the variable that contains the list of filenames for tests (e.g. SET( IOTest_SRCS ... )).
  • Add the test filename to this list (itkBMPImageIOTest2.cxx, for example).


Minimum Case

Add a line like the following:

    ADD_TEST( itkBMPImageIOTest5 ${IO_TESTS} itkBMPImageIOTest2 )

  • itkBMPImageIOTest5 is a symbolic name for the test. This is the name that ctest will use, and the one that will appear on a Dashboard.
  • ${IO_TESTS} is the CMake variable containing the test driver executable for that directory. Note that it must match the test driver where you registered the test.
  • itkBMPImageIOTest2 is the name of the test itself, and it must match the test filename and the name of the function inside the test file.

Passing arguments

Some tests may require command line arguments. In that case, these arguments can be added after the name of the test function. For example,

    ADD_TEST( itkBMPImageIOTest ${IO_TESTS} itkBMPImageIOTest
              ${ITK_DATA_ROOT}/Input/image_color.bmp
              ${TEMP}/image_color.bmp )

will pass two BMP images (here, an input image and an output filename; the exact paths are illustrative) as arguments to the test.

Adding regression Testing

Some tests may produce images as output, in which case we should add regression testing instructions that make it possible to compare the test output against a baseline image.

A typical case will look like:

    --compare ${ITK_DATA_ROOT}/Baseline/IO/image_color.bmp
              ${TEMP}/image_color.bmp

Note that the "--compare" string goes just after the name of the executable, and it is followed by the filename of the baseline image and the filename of the test output (the output path shown is illustrative).

Since different platforms may produce slightly different, but still acceptable, results, the regression testing system allows users to define a tolerance for the comparison. Three different tolerances are available:

  • --compareIntensityTolerance
  • --compareRadiusTolerance
  • --compareNumberOfPixelsTolerance

compareIntensityTolerance sets the intensity difference after which two pixels are considered to be different. For example, if this tolerance is set to 5, then a pixel with value 123 is considered to be the same as pixels with values in the range [118,128].

compareRadiusTolerance defines a Manhattan-type neighborhood around a pixel. When comparing pixel A from one image to pixel B in another image, all pixels in the neighborhood of pixel B are compared against pixel A. The most similar one is selected, and then their intensity difference is tested against compareIntensityTolerance. If the difference is larger, then the pixel "FAILS" the comparison.

compareNumberOfPixelsTolerance defines how many pixels are allowed to fail while the two images are still considered the same. If more than compareNumberOfPixelsTolerance pixels fail the comparison, then the test will FAIL, and a difference image between the test output and the baseline will be produced and posted on the Dashboard.
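The interaction of the three tolerances can be sketched in code. The following is an illustrative 1-D re-implementation of the comparison logic described above, not ITK's actual regression-testing code:

```cpp
#include <cmath>
#include <cstdlib>
#include <vector>

// Illustrative sketch (not ITK's implementation) of the three comparison
// tolerances, on a 1-D "image" for simplicity. A test-output pixel passes
// if some baseline pixel within radiusTolerance of its position differs by
// at most intensityTolerance; the images match if at most
// numberOfPixelsTolerance pixels fail.
bool ImagesMatch( const std::vector<int> & baseline,
                  const std::vector<int> & test,
                  int intensityTolerance,
                  int radiusTolerance,
                  unsigned int numberOfPixelsTolerance )
{
  unsigned int failures = 0;
  for( int i = 0; i < static_cast<int>( test.size() ); ++i )
    {
    int bestDifference = intensityTolerance + 1;
    for( int offset = -radiusTolerance; offset <= radiusTolerance; ++offset )
      {
      const int j = i + offset;
      if( j < 0 || j >= static_cast<int>( baseline.size() ) )
        {
        continue;  // neighborhood falls outside the baseline image
        }
      const int difference = std::abs( test[i] - baseline[j] );
      if( difference < bestDifference )
        {
        bestDifference = difference;  // most similar neighborhood pixel
        }
      }
    if( bestDifference > intensityTolerance )
      {
      ++failures;  // this pixel "FAILS" the comparison
      }
    }
  return failures <= numberOfPixelsTolerance;
}
```

For instance, with intensityTolerance 5 and the other tolerances at 0, a test pixel of value 103 against a baseline pixel of value 100 passes (difference 3), while a value of 110 fails (difference 10) unless numberOfPixelsTolerance absorbs it.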

WARNING: Tolerances must be used sparingly. Before adding a tolerance, you must exhaust the options for making sure that there is not a bug in the test, or a bug in the class, that is being revealed by the test failure in that platform. Once you demonstrate that a tolerance is needed, add just enough of a tolerance to make the test pass. Do not over-relax the test, because then it will lose its capacity for detecting real failures in the future, and it will end up providing a false sense of security.

A test with regression verification and tolerances will look like

    --compare ${ITK_DATA_ROOT}/Baseline/IO/image_color.bmp
              ${TEMP}/image_color.bmp
    --compareIntensityTolerance 5
    --compareRadiusTolerance 1
    --compareNumberOfPixelsTolerance 25

This test will consider two pixels to be different only if their intensity values differ by more than 5 units. It will compare a pixel with the 3x3 neighborhood of the corresponding pixel in the other image (in 2D), and it will tolerate 25 pixel failures. Pixel failures above that value will result in the test failing, and a difference image being posted to the Dashboard.

Run the Test locally

Rerun CMake on your project in order to include the new test in the project configuration of your native build system.

Run the test by calling

  ctest -V -R TestName

Where "TestName" is a regular expression that identifies the test or tests that you want to run.

Submit an Experimental build

You can do this by calling

   ctest -D Experimental

or

   make Experimental

or, on Windows with Visual Studio, by selecting the Experimental project and invoking "Build" on it.