Test cases organization (JUnit, Ant...)

Post by Stephane Bailliez » Sun, 03 Dec 2000 04:00:00



Hello,

I would like some advice on testcase organization, directory structure,
dependency builds, smoke tests, and so on. Warning: this mail might be quite
long to read and requires some Java background to understand my examples.
Some JUnit and Ant knowledge is also required.

I feel that there is a need for such information and examples of automated
integration in the community. Everybody needs the common structure of a build
file that can perform such an automated build, run the tests, and process the
test reports so that they are pleasant to read.

Martin Fowler describes such a system in "Continuous Integration":
http://www.martinfowler.com/articles/continuousIntegration.html

I would like to reach such level of automated integration and automated
build for developers.

In the following paragraphs, I will ask some questions related to the source
code organization for a given system, since I have trouble laying out
something that does not have too many drawbacks.

Background:
The software has about a dozen modules, each in its own directory.
Modules depend on some thirdparty products (jaxp, crimson, log4j, jsse,
httpclient, ...).
Module tests depend on misc products (ant, javacc, hypersonic, httpunit,
junit...).
Some modules have interdependencies: for example, mod6 depends on mod4 and
mod3.
Everything is built using Ant.

##########
Q1: How to organize the third party products and misc products ?
Third party products are updated (bugfixes, features, etc.) quite often.
All modules need to use the same version of a third party package.
What directory structure do you recommend so that each module knows
exactly where to get the jars, and so that the customization of each
module's Ant build file is minimized?

What about:
main
 + misc
    + lib { ant.jar, javacc.zip, ... }
 + thirdparty
    + lib { jaxp.jar, log4j.jar, .... }
 + mod1
 + mod2
 + mod3

Or what about:
main
 + misc
    + log4j
      + src
      + dist
      + doc
...
 + mod1
 + mod2

If not using solution 2, where do you put the source code and API
documentation of these products for debugging and coding purposes? (If
developers rely on a local download from the internet, it is dangerous, as
the API docs and source code will not match the binaries 99% of the time.)
I haven't found a good solution to this problem so far :-(

##########
Q2: How do you organize the module dependencies to always use the latest
build ?
If mod6 depends on mod4 and mod3, mod6 needs to always use the latest
daily integration build. Where should these binaries be put?
What about:
 main
  ...
  + mod4
    + dist - latest integration build always here as mod4.jar
  + mod6 - mod6 build file points to ../mod4/dist/mod4.jar

dist contains the binary (jar), the source code (jar), and the javadocs
corresponding to the latest build (and more things that will be discussed
later).
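A build file fragment along these lines could express the dependency (the property and target names here are mine, just for illustration, not the actual build files):

```xml
<!-- mod6/build.xml fragment - a sketch, not the real build file -->
<property name="mod4.jar" value="../mod4/dist/mod4.jar"/>
<property name="mod3.jar" value="../mod3/dist/mod3.jar"/>

<target name="compile">
  <javac srcdir="src/main" destdir="bin/classes">
    <classpath>
      <pathelement location="${mod4.jar}"/>
      <pathelement location="${mod3.jar}"/>
    </classpath>
  </javac>
</target>
```

Since Ant properties can be overridden from the command line, a module could also be pointed at a specific older build when needed.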

##########
Q3: How do you organize the module structure ?
The module must compile and be able to run easily using this structure.
Testcases often need various kinds of data to perform tests (.xml, .xsl,
.properties, .java, etc.). Where should this data be put so that it is not
affected by a build?
Modules typically run using a product.home property. Where should this
product.home point so that, when running the testcases, the module sees the
same directory structure as the one of the product?

What about:
mod1
 + bin
    + classes
    + dist
    + docs
    + testcases
 + src
    + main
    + testcases { match the main package structure }
 + dist
 + conf
 + etc { match the main package structure and the product structure}

##########
Q4: How do you organize the testcases ?
The testcase source structure must match the main package structure exactly.
This avoids many testing problems, especially with testing package-private
classes, imports and so on.
It might be useful to run all the tests of a package alone, as well as all
the tests of a package and its subpackages. What is the best way to do this?

It is common practice to have an AllTests class with a suite method that
returns all the TestCases of the package. However this duplicates work:
each time you add a testcase you must also update the AllTests class. :(

Assuming each AllTests also adds the AllTests classes of its immediate
subpackages, you may additionally have a PackageTests class that only runs
the tests of the package itself.
If you want this kind of flexibility you must do the work 3 times when
adding a test case. :(((
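One way to avoid maintaining these suite classes by hand would be to collect the test class names automatically. Here is a sketch; the directory layout and the "*Test" naming convention are assumptions of mine, and the resulting names would still have to be fed to a junit.framework.TestSuite via Class.forName():

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

/**
 * Sketch: walk the compiled-classes tree and collect every class whose
 * name ends in "Test". Running all tests of a package and its
 * subpackages is then just a matter of where you start the walk.
 */
public class TestFinder {

    /** Recursively collect dotted class names ending in "Test" under root. */
    public static List<String> findTestClasses(File root, String pkg) {
        List<String> result = new ArrayList<String>();
        File[] entries = root.listFiles();
        if (entries == null) {
            return result; // not a directory
        }
        for (File f : entries) {
            if (f.isDirectory()) {
                String sub = pkg.isEmpty() ? f.getName()
                                           : pkg + "." + f.getName();
                result.addAll(findTestClasses(f, sub));
            } else if (f.getName().endsWith("Test.class")) {
                String cls = f.getName().substring(0,
                        f.getName().length() - ".class".length());
                result.add(pkg.isEmpty() ? cls : pkg + "." + cls);
            }
        }
        return result;
    }
}
```

The same walk restricted to a single directory level would give the PackageTests behaviour, without any of the triple bookkeeping.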

The worst thing is that, AFAIK and despite some attempts, the JUnit task in
Ant 1.2 does not generate a full report of a batchtest but a set of
individual reports, one for each testcase found.
It is really a batchtest and not a compositetest.

As well, the report of a batchtest for a PackageTests or an AllTests is not
that readable, since all you have is the name of each test that was run as
the method name, not the testcase name. So when reading the report you have
trouble figuring out in WHICH testcase an error or failure occurred.

These individual test files, one per testcase, require quite some work:
processing every XML file via an XSL stylesheet (my limited abilities in XSL
programming are a real pain here, but that's another story), producing a
global report that can be sent by mail as the result of a smoke test, and
generating the HTML files that can be browsed easily for more details.
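For what it's worth, the transformation itself can be driven from Java through the JAXP transformation API (the same jaxp already listed in the background). The report and stylesheet below are toy stand-ins of mine, not the real Ant report format:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

/**
 * Minimal sketch of post-processing a JUnit XML report with an XSL
 * stylesheet through JAXP.
 */
public class ReportSummary {

    /** Apply the given XSL stylesheet to the given XML document. */
    public static String transform(String xml, String xsl) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(xsl)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)),
                    new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        String report =
            "<testsuite><testcase name='a'/><testcase name='b'/></testsuite>";
        String style =
            "<xsl:stylesheet version='1.0'"
            + " xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
            + "<xsl:output method='text'/>"
            + "<xsl:template match='/'>"
            + "tests run: <xsl:value-of select='count(//testcase)'/>"
            + "</xsl:template></xsl:stylesheet>";
        System.out.println(transform(report, style)); // prints "tests run: 2"
    }
}
```

Once the per-testcase XML files are merged into one document, a stylesheet like this can produce either the mail summary or the browsable HTML pages.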

In addition to the files mentioned in Q2, the dist directory of each module
will also contain the browsable testcase report of the build.

As Martin Fowler mentions in his article "Continuous Integration", setting up
this kind of thing is not trivial and requires quite a lot of work. But you
can feel the power of such automated integration, it is damn motivating!

Thanks for any comments and advices you may have.

Note: Some people might remember that I asked a question a few months ago
about performing some tests on checked-in classes to see if the classes were
commented correctly and conformed to the company coding standards... To
avoid redeveloping a parser with ANTLR (whose syntax I'm not used to), I
quickly set up a JavaDoc doclet with a simple rule design. When you want to
add a new code-checking rule, you just code the corresponding class and
declare it in the rules list. A server is running, and a perl trigger on the
SCC product connects to the server and sends the content of the class as
well as some context information (checkin comment, user, ...). The server
then checks the code against all the rules and sends back a report by mail
to the developer who checked in (as well as to his project leader, for
information). After a couple of weeks with this system, the quality of the
javadoc (and its correctness as well) increased dramatically.
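The rule design can be sketched like this. The names are mine, not the actual implementation, and a real doclet-based checker would work on the doclet's ClassDoc objects rather than raw source text:

```java
import java.util.ArrayList;
import java.util.List;

/** One coding-standard check; each rule is one class. */
interface Rule {
    /** Return an error message, or null if the source passes the rule. */
    String check(String source);
}

/** Example rule: the class source must contain a javadoc comment. */
class HasJavadocRule implements Rule {
    public String check(String source) {
        return source.contains("/**") ? null : "missing javadoc comment";
    }
}

public class CodeChecker {
    private final List<Rule> rules = new ArrayList<Rule>();

    /** Adding a new rule to the list is one line here. */
    public void register(Rule r) { rules.add(r); }

    /** Run every registered rule and collect the failure messages. */
    public List<String> check(String source) {
        List<String> errors = new ArrayList<String>();
        for (Rule r : rules) {
            String msg = r.check(source);
            if (msg != null) {
                errors.add(msg);
            }
        }
        return errors;
    }
}
```

The collected messages are what the server would mail back to the developer after a checkin.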

As a next step we are thinking about performing audits and metrics (with
commercial products, of course). I have not yet seen commercial products
available that can check both javadoc correctness and source code.

Metamata Audit Enterprise does not allow checking the javadoc directly, and
as they honestly mentioned in a recent email, it would require quite a lot
of effort to do it yourself (they are taking it as a suggestion for
enhancement, though).

Parasoft JTest can perform such checking (even though the rules capabilities
are limited in some cases), but it is way too expensive - about $10,000 for
the project version (multiclass) and $3,500 for the developer version
(single class... and useless to me) - and has quite a steep learning curve.
It also definitely does not replace the ease of use of JUnit (and its
acceptance in the community).

--
Stephane Bailliez

 
 
 


Post by Jari Mäke » Tue, 05 Dec 2000 15:30:12


Just a couple of notes on how we have done some of the organizing here at
Attoparsek.

For thirdparty stuff we have this kind of structure: each package or
tool has a directory of its own, where the different versions of that
package lie in subdirectories like this:

somepackage
           /somepackage-1.2.3
           /somepackage-1.2.4
           /somepackage-1.3

In the somepackage directory we then have a link "latest" that points
to the current production version of the package, for instance:

somepackage
           /latest -> somepackage-1.3

Then in path settings, links to binaries, or whatever, we can have a
quite static [...]/somepackage/latest/lib/somejar.jar, and if we need
to use an older version for some project we can override the default
settings and point to the older version.

This structure is actually a much simpler version of the system used
at Helsinki University of Technology, where they had multiple versions
of software for multiple hardware architectures and operating systems
on the same file servers.

For java source we have a basic directory structure like this:

project
       /src
           /mod1/com/attoparsek/package/name
                                            /test
           /mod2/com/attoparsek/package/name
                                            /test
           /test/test

And then we have a makefile that has the following rules among others:

<default>
        Compiles all *.java files found under src
test
        do <default> and run the unit tests by a runner class in
        test/test
jar
        build jars named after the subdirs of src containing all the
        classes in them (mod1.jar and mod2.jar)

So it is dead easy and fast to compile everything and run the tests just
by writing make test. And because it is simple and fast, it is done
very often.

Unit tests for a package are always located in a sub-package called
test. In the module test, package test, we have a test runner class
where we just add the test class names to an array, and it then runs
all the tests in the classes listed there.

import java.util.Enumeration;

import junit.framework.Test;
import junit.framework.TestSuite;

public class AllTests {

    public static void main(String[] args) {
        junit.textui.TestRunner.run(suite());
    }

    public static Test suite() {
        TestSuite suite = new TestSuite("All JUnit Tests");

        // add new tests by adding the test class to the array
        Class[] testClasses = {
            com.attoparsek.package.test.SomeTestClass.class,
            com.attoparsek.package.test.SomeOtherTestClass.class,
            [...]
        };

        // wrap each class in a suite and add its tests one by one
        for (int i = 0; i < testClasses.length; i++) {
            TestSuite ts = new TestSuite(testClasses[i]);
            Enumeration e = ts.tests();

            while (e.hasMoreElements()) {
                suite.addTest((Test) e.nextElement());
            }
        }
        return suite;
    }
}


 
 
 


Post by Stefan Bodewi » Tue, 05 Dec 2000 04:00:00



> Q1: How to organize the third party products and misc products ?

I usually don't keep them together with the sources at all. The jars
live somewhere below /usr/local/java/lib in versioned directories
jaxp-1.0.1, jaxp-1.1 and so on.

> Q2: How do you organize the module dependencies to always use the
> latest build ?

Actually I prefer the "one big buildfile with a single compile target"
approach over a hierarchical build, but judging from the feedback on
ant-user, I'm rather alone in that. Having only a single compile target
ensures that the latest sources of all modules work together.

> Q3: How do you organize the module structure ?
> Q4: How do you organize the testcases ?

Close to Ant's own structure (i.e. have a main and a testcases branch
with matching package structures). I tend to keep the data needed for
tests with the test sources instead of a separate etc branch, though.

> The worst thing is that AFAIK and despite some attempts, the JUnit task
> in Ant 1.2 does not generate a full report of a batchtest but a set
> of individual report for each testcase found.  It is really a
> batchtest and not a compositetest.

True, any contributions are welcome 8-).

Stefan

 
 
 


Post by Stephane Bailliez » Tue, 05 Dec 2000 04:00:00




> > Q1: How to organize the third party products and misc products ?

> I usually don't keep them together with the sources at all. The jars
> live somewhere below /usr/local/java/lib in versioned directories
> jaxp-1.0.1, jaxp-1.1 and so on.

I think I will try to use the following:
thirdparty
  + module
       version_x.y.z    <--- filename indicating the current version
       + latest (here the latest binary + source + docs)

The version_x.y.z file is useless on Unix, but NT lacks the link facility,
and I want to see the version directly without digging into the docs or the
jar manifest, or even decompiling the jar (why the hell doesn't Sun even use
the manifest facilities for its own products???). I haven't found anything
better yet, and I don't want to put the version in the directory name, so I
always provide the entry point as 'latest' in order not to break every build
file when the version is updated.
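For the jars that do fill in their manifest, the version can at least be read with the standard java.util.jar API. A sketch:

```java
import java.io.File;
import java.util.jar.Attributes;
import java.util.jar.JarFile;
import java.util.jar.Manifest;

/**
 * Sketch: read the version out of a jar manifest. This only helps with
 * jars that actually set Implementation-Version, which (as noted above)
 * many do not - hence the version_x.y.z marker file.
 */
public class JarVersion {

    /** Return the Implementation-Version of the jar, or null if absent. */
    public static String versionOf(File jar) throws Exception {
        JarFile jf = new JarFile(jar);
        try {
            Manifest mf = jf.getManifest();
            if (mf == null) {
                return null;
            }
            return mf.getMainAttributes()
                     .getValue(Attributes.Name.IMPLEMENTATION_VERSION);
        } finally {
            jf.close();
        }
    }
}
```

A small Ant target calling something like this could print the versions of all jars under thirdparty at build time.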

> > Q2: How do you organize the module dependencies to always use the
> > latest build ?

> Actually I prefer the "one big buildfile with a single compile target"
> approach over an hierarchical build, but judging from the feedback on
> ant-user I'm rather alone with it. Having only a single compile target
> ensures that the latest sources of all modules work together.

I follow you on this. Each module has its own life: integration takes care
of generating the latest version, and the build file for each module always
points to the latest integration binary.

> > Q3: How do you organize the module structure ?
> > Q4: How do you organize the testcases ?

> Close to Ant's own structure (i.e. have a main and a testcases branch
> with matching package structures). I tend to keep the data needed for
> tests with the test sources instead of a separate etc branch, though.

I'm in the process of using the Ant structure for testcases and data.

> > The worst thing is that AFAIK and despite some attempts, the JUnit task
> > in Ant 1.2 does not generate a full report of a batchtest but a set
> > of individual report for each testcase found.  It is really a
> > batchtest and not a compositetest.

> True, any contributions are welcome 8-).

Thanks Stefan :-)
 
 
 


Post by Stephane Bailliez » Tue, 05 Dec 2000 04:00:00




> Just a couple of notes how we have done some of the organizing here in
> Attoparsek.

[...]

Thanks for the feedback. This gave me a couple of ideas.

--
 Stéphane Bailliez
 Software Engineer, Paris - France
 iMediation - http://www.imediation.com
 Disclaimer: All the opinions expressed above are mine, not those of my
company.