Measure the code coverage of the ooniprobe unit tests
At 2013-07-04 13:12:14 Arturo Filastò wrote: As we agreed in #107 (moved), we should assess how much code coverage our unit tests achieve (using a tool like coverage) and possibly integrate it with coveralls.io.
Here are some suggestions from @nathan-at-least.
Some handy automated tools are:
- API documentation generator from Python docstrings, so that anyone can browse the names and intent of particular tests.
- Coverage analysis: see coverage, which can generate HTML reports showing which lines of application code are exercised by unit tests. This is a quick way to notice untested portions of code.
- Test bots: setting up a bot to run the unit tests and then generate an HTML report for various revisions and platforms can quickly show regressions.
This issue was automatically migrated from github issue https://github.com/TheTorProject/ooni-probe/issues/128