Proposals:Increasing ITK Code Coverage

From KitwarePublic
Revision as of 13:20, 11 July 2010 by Ibanez (talk | contribs) (→‎Infrastructure)


ITK currently (Dec 20th 2008) has 80.5% code coverage.

A sloccount report on the number of lines of code in the Insight/Code directory returns:

 158,928 lines of code

This means that about 31,000 lines of code (19.5% of 158,928) are not tested.

We could significantly increase the code coverage of the toolkit, and in the process reduce the number of hidden bugs, by asking volunteers to adopt particular classes and write additional tests to increase their code coverage. This could be modeled after the "Adopt a Bug" program.


It has been pointed out that the current testing infrastructure of ITK imposes a high threshold of effort on contributors of new tests.


There are some existing unit test harnesses that might decrease the effort and provide additional functionality. A unit testing package for ITK must meet the following requirements:

  1. It must have an itk-compatible license.
  2. We must be able to distribute it with itk.
  3. It must support all itk platforms.
  4. It must fit within the itk test harness facility. Recall that we try to minimize the number of executables by combining large numbers of tests into FooTests.cxx files.
  5. It must be compatible with cmake/ctest/cdash. For example, a test must be able to "return EXIT_SUCCESS" and "return EXIT_FAILURE".
  6. It must not add complexity to an already complex testing process.
  7. It must be compatible with itk's strict attention to compile warnings and dynamic memory analysis. In other words, it must not produce warnings or purify defects.
  8. It should have a minimal source footprint.
  9. It should allow testing the Examples, but without inserting instrumented code in them (so they still look like normal Examples).
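For reference, the harness style that requirements 4 and 5 describe can be sketched as follows: each test is a plain function returning EXIT_SUCCESS or EXIT_FAILURE, and many such functions are combined into a single FooTests.cxx driver executable. The names below are illustrative, not actual ITK code:

 #include <cstdlib>
 #include <iostream>
 
 // Illustrative sketch of a current-style ITK test: a plain function that
 // signals pass/fail through its return value, as ctest expects.
 int itkSampleTest( int, char *[] )
 {
   const int expected = 4;
   const int computed = 2 + 2;   // stand-in for real toolkit work
   if ( computed != expected )
     {
     std::cerr << "computed " << computed
               << ", expected " << expected << std::endl;
     return EXIT_FAILURE;
     }
   std::cout << "itkSampleTest passed" << std::endl;
   return EXIT_SUCCESS;
 }
 
 // In ITK proper, a generated FooTests.cxx driver dispatches to many such
 // functions by name; here a trivial main stands in for that driver.
 int main( int argc, char * argv[] )
 {
   return itkSampleTest( argc, argv );
 }

Any replacement harness has to coexist with this convention, since CDash only sees the process exit status.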

Suggestions for improving the testing system to make it easier for contributors to introduce new tests include:

Boost Test

Suggested by Steve Robbins

How it could work

--------------------- itkImageRegionTest.cxx ---------------------------------

#include <boost/test/auto_unit_test.hpp>

#include "itkImageRegion.h"

template< unsigned int VImageDimension >
struct Fixture
{
    typedef itk::ImageRegion<VImageDimension>            RegionType;
    typedef typename RegionType::IndexType               IndexType;
    typedef typename RegionType::SizeType                SizeType;

    RegionType mRegion;
};

struct Fixture1 : public Fixture<1>
{
    Fixture1( int start0,
	      int size0 )
    {
	IndexType start = {{ start0 }};
	SizeType  size  = {{ size0 }};
	mRegion = RegionType( start, size );
    }
};

struct Fixture2 : public Fixture<2>
{
    Fixture2( int start0, int start1,
	      int size0,  int size1 )
    {
	IndexType start = {{ start0, start1 }};
	SizeType  size  = {{ size0,  size1 }};
	mRegion = RegionType( start, size );
    }
};

struct Fixture3 : public Fixture<3>
{
    Fixture3( int start0, int start1, int start2,
	      int size0,  int size1,  int size2 )
    {
	IndexType start = {{ start0, start1, start2 }};
	SizeType  size  = {{ size0,  size1,  size2 }};
	mRegion = RegionType( start, size );
    }
};

BOOST_AUTO_TEST_CASE( testSlice )
{
    Fixture3 volume( 12, 12, 12, 10, 20, 30 );
    Fixture2 slice0( 12, 12, 20, 30 );
    Fixture2 slice1( 12, 12, 10, 30 );
    Fixture2 slice2( 12, 12, 10, 20 );

    BOOST_CHECK_EQUAL( slice0.mRegion, volume.mRegion.Slice( 0 ) );
    BOOST_CHECK_EQUAL( slice1.mRegion, volume.mRegion.Slice( 1 ) );
    BOOST_CHECK_EQUAL( slice2.mRegion, volume.mRegion.Slice( 2 ) );
}

BOOST_AUTO_TEST_CASE( testSliceOutOfBounds )
{
    Fixture3 volume( 12, 12, 12, 10, 20, 30 );

    BOOST_CHECK_THROW( volume.mRegion.Slice( -1 ), std::exception );
    BOOST_CHECK_THROW( volume.mRegion.Slice( 3 ), std::exception );
}

BOOST_AUTO_TEST_CASE( testVolumeIsInside )
{
    Fixture3 volumeA( 12, 12, 12, 10, 20, 30 );
    Fixture3 volumeB( 14, 14, 14,  5, 10, 15 );

    BOOST_CHECK(   volumeA.mRegion.IsInside( volumeB.mRegion ) );
    BOOST_CHECK( ! volumeB.mRegion.IsInside( volumeA.mRegion ) );
}
--------------------- itkImageRegionTest.cxx ---------------------------------

Google Test

The Google Test framework is very similar to the Boost test harness. GTest is essentially a set of macros that assist the developer in writing concise tests. The framework does not make use of exceptions or templates, and is supported on all major platforms and some minor ones (e.g., Cygwin, Windows CE, and Symbian). The code is available under the BSD license. The framework supports many of the features already in place through ctest, e.g., running every Nth test, running all matching tests, etc. The source code and includes are ~600K and compile trivially using CMake without configuration or modification.

# Build Google Testing
set ( GTestSource
  # ... list of gtest source files ...
  )
include_directories ( ${MI3CLib_SOURCE_DIR}/Testing/Utilities/gtest-1.2.1/include )
add_library ( gtest ${BUILD_SHARED_LIBS} ${GTestSource} )

Test Driver

The test driver is a very simple main function:

#include <gtest/gtest.h>

int main(int argc, char* argv[])
{
  testing::InitGoogleTest ( &argc, argv );
  return RUN_ALL_TESTS();
}

Types of Test

There are two basic types of tests: simple tests using the TEST macro, and test fixtures using the TEST_F macro. Many different assertion macros are available, ranging from string comparisons to expected exceptions and floating-point comparisons. The basic framework is well documented, with advanced guidance for those who dig deeper. Below is example code from an internal project that demonstrates how to write a test. Test fixtures run as methods in a subclass of the fixture and have access to all public and protected ivars of the fixture. All test macros function as stream operators, with any text directed into them appearing in the output. NB: in this example an MD5 hash is used to verify correct output rather than comparison to a known good image.

TEST(IO, LoadCT) {
  mi3c::ImageLoader loader;
  mi3c::Image::Pointer image = loader.setFilename ( dataFinder.getFile ( "CT.hdr" ) ).execute();
  ASSERT_EQ ( "c1d43aaa5b991431a9daa1dc4b55dbb1", image->getMD5() ) << " failed to load the expected image data";
}

class ImageDataTest : public testing::Test {
protected:
  ImageDataTest () {
    image = NULL;
    floatImage = NULL;
  }
  virtual void SetUp() {
    mi3c::ImageLoader loader;
    try {
      image = loader.setFilename ( dataFinder.getFile ( "MRA.hdr" ) ).execute();
      mi3c::ConvertDataType convert ( mi3c::mi3cFLOAT );
      floatImage = convert.execute ( image );
    } catch ( itk::ImageFileReaderException e ) {
      FAIL(); // Couldn't load, so fail this test before we go any further with bad data.
    }
  }
  virtual void TearDown() {
    image = NULL;
    floatImage = NULL;
  }

  mi3c::Image::Pointer image;
  mi3c::Image::Pointer floatImage;
};

TEST_F(ImageDataTest, DiscreteGaussianFilter) {
  mi3c::DiscreteGaussianFilter filter;
  mi3c::Image::Pointer o = filter.execute ( image );
  EXPECT_EQ ( "6adeb490bda64b47e9c1bd6c547e570e", o->getMD5() ) << " Filtered with a gaussian";
  EXPECT_EQ ( "300c7ee796d1b3c2b49a7649789bbf55", filter.execute ( floatImage )->getMD5() ) << " Filtered with a gaussian";
}

TEST_F(ImageDataTest, MeanFilter) {
  mi3c::MeanFilter filter;
  filter.setRadius ( 1 );
  mi3c::Image::Pointer o = filter.execute ( image );
  EXPECT_EQ ( "8b7235e1f8497b0a7fb84eb5c94af00b", o->getMD5() ) << " Mean filtered";
  EXPECT_EQ ( "069a6670309db5c03a79af11a9c6e526", filter.execute ( floatImage )->getMD5() ) << " Mean filtered";
}

Running the tests

Test status is reported (in color) when running the tests and final status is reported as the exit status, much like current ITK testing.

[blezek@mi3bld04 MI3CLib-linux86-gcc]$ bin/NoOp  /mi3c/projects/Source/MI3CTestData
[==========] Running 9 tests from 3 test cases.
[----------] Global test environment set-up.
[----------] 4 tests from ImageDataTest
[ RUN      ] ImageDataTest.MD5
[       OK ] ImageDataTest.MD5
[ RUN      ] ImageDataTest.Threshold
[       OK ] ImageDataTest.Threshold
[ RUN      ] ImageDataTest.DiscreteGaussianFilter
[       OK ] ImageDataTest.DiscreteGaussianFilter
[ RUN      ] ImageDataTest.MeanFilter
[       OK ] ImageDataTest.MeanFilter
[----------] 2 tests from IO
[ RUN      ] IO.LoadCT
[       OK ] IO.LoadCT
[ RUN      ] IO.LoadInvalidFile
[       OK ] IO.LoadInvalidFile
[----------] 3 tests from Image
[ RUN      ] Image.InstantiateImage
[       OK ] Image.InstantiateImage
[ RUN      ] Image.InstantiateImage2
[       OK ] Image.InstantiateImage2
[ RUN      ] Image.TestHash
[       OK ] Image.TestHash
[----------] Global test environment tear-down
[==========] 9 tests from 3 test cases ran.
[  PASSED  ] 9 tests.
[blezek@mi3bld04 MI3CLib-linux86-gcc]$ echo $?
0

CMake Integration

With a slightly clever CMake macro, and a regular expression or two, Google tests are trivially integrated into CMake projects. Here, all TEST and TEST_F macros found in the source code are added as tests to the project. Each test is run as:

NoOp --gtest_filter=TestGroup.TestName

where TestGroup is the first argument to the TEST macro, and TestName is the second.

# C++ tests
set ( mi3cTestSource
  # ... list of test source files ...
  )

add_executable(NoOp ${mi3cTestSource})

macro(ADD_GOOGLE_TESTS executable)
  foreach ( source ${ARGN} )
    file(READ "${source}" contents)
    string(REGEX MATCHALL "TEST_?F?\\(([A-Za-z_0-9 ,]+)\\)" found_tests ${contents})
    foreach(hit ${found_tests})
      string(REGEX REPLACE ".*\\(([A-Za-z_0-9]+)[, ]*([A-Za-z_0-9]+)\\).*" "\\1.\\2" test_name ${hit})
      add_test(${test_name} ${executable} --gtest_filter=${test_name} ${MI3CTestingDir})
    endforeach(hit)
  endforeach(source)
endmacro(ADD_GOOGLE_TESTS)

# Add all tests found in the source code, calling the executable to run them
add_google_tests ( ${EXECUTABLE_OUTPUT_PATH}/NoOp ${mi3cTestSource})


Suggested by Mathieu Malaterre

This package is distributed under an MIT License:

Custom XML Testing Framework for CTest CMake CDash Integration

This proposal has been submitted as an Insight Journal Article:

If we look at the XML for CDash (CDash-XML), specifically the file which describes the valid XML for a test (ValidationSchemata/Test.xsd), we see the NamedMeasurement tag. This is the field that is displayed for the test. It can hold a variety of types via the "type" attribute. Previously Dart defined the following types: "numeric/integer", "numeric/float", "numeric/double", "numeric/boolean", "text/string", "text/plain", "image/png", "image/jpeg".

The way this testing system works is that each test produces XML output of "NamedMeasurements". This XML is then compared against an XML baseline; any differences are reported and sent to CDash via ctest. To produce a baseline, the output of the test should be carefully verified by hand, then attributes added to the XML tags to describe how to do the comparisons.
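The comparison step can be sketched as follows. This is a hypothetical illustration of the idea, not the framework's actual code: each measurement from a test run is looked up in the baseline by name, and numeric values are compared within a tolerance.

 #include <cmath>
 #include <iostream>
 #include <map>
 #include <string>
 
 // Hypothetical sketch of the baseline comparison: measurements from a test
 // run are matched by name against baseline values, numerics within a
 // tolerance. Mismatches would be reported to CDash as NamedMeasurements.
 bool CompareToBaseline( const std::map<std::string, double> & run,
                         const std::map<std::string, double> & baseline,
                         double tolerance )
 {
   bool match = true;
   for ( std::map<std::string, double>::const_iterator it = baseline.begin();
         it != baseline.end(); ++it )
     {
     std::map<std::string, double>::const_iterator found = run.find( it->first );
     if ( found == run.end() ||
          std::fabs( found->second - it->second ) > tolerance )
       {
       std::cerr << "Measurement mismatch: " << it->first << std::endl;
       match = false;
       }
     }
   return match;
 }
 
 int main()
 {
   std::map<std::string, double> baseline;
   baseline["ImageSize_0"]    = 34;
   baseline["ImageSpacing_0"] = 62;
 
   std::map<std::string, double> run( baseline );
   run["ImageSpacing_0"] = 62.0000001;  // well within tolerance
 
   std::cout << ( CompareToBaseline( run, baseline, 1e-3 ) ? "match" : "mismatch" )
             << std::endl;
   return 0;
 }

The per-measurement tolerance attributes mentioned above would feed into the `tolerance` argument of such a comparison.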

Sample test:

class ImageFileReaderInfoTest:
    public itk::Regression
{
public:
  virtual int Test(int argc, char* argv[] )
  {
    if( argc < 2 )
      {
      std::cerr << "Usage: " << argv[0] << " inputFile" << std::endl;
      return EXIT_FAILURE;
      }

    typedef unsigned char             PixelType;
    typedef itk::Image<PixelType,3>   ImageType;

    typedef itk::ImageFileReader<ImageType>         ReaderType;
    ReaderType::Pointer reader = ReaderType::New();
    reader->SetFileName( argv[1] );
    reader->Update(); // trigger the read before querying the image

    ImageType::Pointer image = reader->GetOutput();
    ImageType::RegionType region = image->GetLargestPossibleRegion();
    itk::ImageIOBase::Pointer imageIO = reader->GetImageIO();

    this->MeasurementInsightSize( region.GetSize(), "ImageSize" );
    this->MeasurementInsightVector( image->GetSpacing(), "ImageSpacing" );
    this->MeasurementInsightPoint( image->GetOrigin(), "ImageOrigin" );
    this->MeasurementInsightMatrix( image->GetDirection(), "ImageDirection" );
    this->MeasurementTextString( imageIO->GetFileTypeAsString( imageIO->GetFileType()), "FileTypeAsString" );
    this->MeasurementTextString( imageIO->GetByteOrderAsString( imageIO->GetByteOrder() ), "ByteOrderAsString" );
    this->MeasurementTextString( imageIO->GetPixelTypeAsString( imageIO->GetPixelType() ), "PixelTypeAsString" );
    this->MeasurementTextString( imageIO->GetComponentTypeAsString( imageIO->GetComponentType() ), "ComponentTypeAsString" );
    this->MeasurementNumericInteger( imageIO->GetNumberOfComponents( ), "NumberOfComponents" );

    return EXIT_SUCCESS;
  }
};

The baseline XML:

<?xml version="1.0" encoding="US-ASCII"?>
<!-- created on victoria at Fri Jul 24 17:22:33 2009 -->
<DartMeasurement name="ImageSize_0" type="numeric/integer">34</DartMeasurement>
<DartMeasurement name="ImageSize_1" type="numeric/integer">34</DartMeasurement>
<DartMeasurement name="ImageSize_2" type="numeric/integer">141</DartMeasurement>
<DartMeasurement name="ImageSpacing_0" type="numeric/double">62</DartMeasurement>
<DartMeasurement name="ImageSpacing_1" type="numeric/double">62</DartMeasurement>
<DartMeasurement name="ImageSpacing_2" type="numeric/double">1</DartMeasurement>
<DartMeasurement name="ImageOrigin_0" type="numeric/double">30</DartMeasurement>
<DartMeasurement name="ImageOrigin_1" type="numeric/double">30</DartMeasurement>
<DartMeasurement name="ImageOrigin_2" type="numeric/double">0</DartMeasurement>
<DartMeasurement name="ImageDirection_0_0" type="numeric/double">1</DartMeasurement>
<DartMeasurement name="ImageDirection_0_1" type="numeric/double">0</DartMeasurement>
<DartMeasurement name="ImageDirection_0_2" type="numeric/double">0</DartMeasurement>
<DartMeasurement name="ImageDirection_1_0" type="numeric/double">0</DartMeasurement>
<DartMeasurement name="ImageDirection_1_1" type="numeric/double">1</DartMeasurement>
<DartMeasurement name="ImageDirection_1_2" type="numeric/double">0</DartMeasurement>
<DartMeasurement name="ImageDirection_2_0" type="numeric/double">0</DartMeasurement>
<DartMeasurement name="ImageDirection_2_1" type="numeric/double">0</DartMeasurement>
<DartMeasurement name="ImageDirection_2_2" type="numeric/double">1</DartMeasurement>
<DartMeasurement name="FileTypeAsString" type="text/string">Binary</DartMeasurement>
<DartMeasurement name="ByteOrderAsString" type="text/string">LittleEndian</DartMeasurement>
<DartMeasurement name="PixelTypeAsString" type="text/string">scalar</DartMeasurement>
<DartMeasurement name="ComponentTypeAsString" type="text/string">unsigned_char</DartMeasurement>
<DartMeasurement name="NumberOfComponents" type="numeric/integer">1</DartMeasurement>

The strength of this approach is that it separates execution from validation (or is it verification?), making the test code itself smaller. It easily integrates with the CDash/CMake infrastructure (as it was designed to). Many existing tests which print text could easily be migrated to this approach, so that the output of the program is also validated and we will know when it changes. It could easily be extended to compare new types. A single executable could be run with multiple arguments for multiple tests, and each test could have a different baseline. On the downside, this may require the most work to get working.

Downloading and Documentation

The latest version can be downloaded via SVN:

svn co

The following little CMakeLists.txt file is needed to make a working project out of the 3 utility libraries:




OPTION(BUILD_SHARED_LIBS "Build with shared libraries." OFF)


SET (CMAKE_RUNTIME_OUTPUT_DIRECTORY ${PROJECT_BINARY_DIR}/bin CACHE INTERNAL "Single output directory for building all executables.")

ADD_SUBDIRECTORY ( FileUtilities )
ADD_SUBDIRECTORY ( TestingFramework )

Documentation can be found here:

Other C++ Unit Test Frameworks