I've been working a bit with Nat Pryce on his 'Protest' project recently. It's a Python unit test framework that generates documentation from the tests. E.g. you write test cases like:


class myFunctionTests:
    ''' myfunction is a function belonging to me '''

    def does_something_cool(self):
        # ... code which asserts that something cool is done ...
        pass

    def does_something_else_as_well(self):
        assert something_else

and the test framework will generate web documentation along the lines of:

myFunction

myfunction is a function belonging to me

Features of function myFunction:
  • does something cool
  • does something else as well

It then goes on to show the tests that confirm these statements, all nicely marked up, and it also does some nifty graphviz diagramming stuff - all pretty rinky dinky. I'm hoping to get round to using it to document bicyclerepairman before my motivation runs out.
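
To give a rough feel for the mechanics - this is just a sketch of my own, not Protest's actual code - the bullet text seems to come straight from the test method names, with underscores swapped for spaces:

def method_name_to_sentence(name):
    # Turn a test method name like 'does_something_cool' into
    # readable prose: 'does something cool'.
    return name.replace('_', ' ')

print(method_name_to_sentence('does_something_cool'))           # does something cool
print(method_name_to_sentence('does_something_else_as_well'))   # does something else as well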

Anyway, the really interesting thing is seeing how the documentation informs which tests I write. In general I'm testing stuff that I wouldn't have bothered with before, just so that I get some doc for it. It also ensures that the documentation doesn't rot, since each piece of documentation is tested against the codebase. Sweet!

There's no proper 'release' as such yet, but if you're interested in partially working software, Nat's got the Subversion repository in his xspecs SourceForge project - just do a

svn checkout https://svn.sourceforge.net/svnroot/xspecs/protest-python/trunk

(hope that's ok Nat!)