Opened 5 years ago

Closed 3 years ago

Last modified 3 years ago

#13080 closed enhancement (duplicate)

Add Ant tasks for measuring coverage, dependencies, etc.

Reported by: karsten
Owned by:
Priority: Medium
Milestone: Onionoo 3.1-1.0.0
Component: Metrics/Onionoo
Version:
Severity: Normal
Keywords:
Cc: iwakeh
Actual Points:
Parent ID:
Points:
Reviewer:
Sponsor:

Description

I'd like to use SonarQube to improve Onionoo's code quality. In the long term, I want Tor to run a SonarQube server (which could mean that I run it myself) and make it available to all projects in the Tor ecosystem, ideally integrated into our continuous integration workflow. But let's start small and try out SonarQube for a single project, which would be Onionoo.

I think step 1 is to install the SonarQube server and SonarQube Runner into the Vagrant machine. (The server would later be installed on a separate host, but for the purpose of this evaluation it should be fine to install it on the Vagrant box.) Maybe this installation shouldn't happen automatically via Vagrant's bootstrap script, but rather be described in a short howto document for those who want to try it. This howto would mark the end of step 1.
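
If we end up preferring to trigger the analysis from Onionoo's existing Ant build rather than from the standalone Runner, SonarQube also ships an Ant task. The following is only a rough sketch, assuming the Ant task jar sits in lib/ and the server runs on the Vagrant box at http://localhost:9000; the project key, name, and paths are placeholders, not Onionoo's real settings:

{{{
<!-- Sketch only: SonarQube Ant Task instead of the standalone Runner.
     Assumes lib/sonarqube-ant-task.jar and a local server on port 9000;
     key/name/paths below are placeholders. -->
<project name="onionoo-sonar" default="sonar" xmlns:sonar="antlib:org.sonar.ant">

  <property name="sonar.host.url" value="http://localhost:9000"/>
  <property name="sonar.projectKey" value="org.torproject:onionoo"/>
  <property name="sonar.projectName" value="Onionoo"/>
  <property name="sonar.projectVersion" value="dev"/>
  <property name="sonar.sources" value="src"/>
  <property name="sonar.java.binaries" value="classes"/>

  <target name="sonar">
    <!-- Load the <sonar:sonar> task from the Ant task jar. -->
    <taskdef uri="antlib:org.sonar.ant" resource="org/sonar/ant/antlib.xml">
      <classpath path="lib/sonarqube-ant-task.jar"/>
    </taskdef>
    <!-- Run the analysis and push the results to the server configured above. -->
    <sonar:sonar/>
  </target>
</project>
}}}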

Step 2 would be to look through the issues found by SonarQube (of which there are plenty) and either start fixing them or tweak the rules to better match our requirements. This step may also include searching for other plugins that might be useful. At the end of this step we should have a better idea of whether we want to deploy a SonarQube machine for other Tor projects. The result could be a SonarQube configuration that works fine for Onionoo and that comes with documentation explaining why it differs from the default configuration.

But before I hide in my cave for too long: any general thoughts on this idea? Cc'ing iwakeh who I think might have an opinion about this.

Child Tickets

Ticket   Type         Status  Owner   Summary
#13616   enhancement  closed  iwakeh  define jmeter testcase(s) and ant task(s)

Change History (6)

comment:1 Changed 5 years ago by iwakeh

Some thoughts (sort of leading in the opposite direction to reach the same goal):

What and Why

The decision about which quality metrics are really important, both for the Tor project in general and for Onionoo in particular, should come first.

In my opinion, the most important metric is test coverage, because tests really document what the code is supposed to do and verify it once a test is in place. (Onionoo's tests are quite helpful to me while designing the client-api-protocol, but I also noticed that there are some parts not yet covered by tests.) Design metrics (like cycle detection, code complexity, etc.) should be a top-level concern, too.

After that I would put metrics along the lines of FindBugs, with a focus on security vulnerabilities.
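
To make this concrete, here is a rough sketch of what an Ant target for FindBugs could look like, assuming a FindBugs installation at ${findbugs.home} and compiled classes in classes/ (both placeholders, not Onionoo's actual layout):

{{{
<!-- Sketch only: FindBugs via its Ant task; ${findbugs.home}, src/ and
     classes/ are placeholder paths. -->
<taskdef name="findbugs"
         classname="edu.umd.cs.findbugs.anttask.FindBugsTask"
         classpath="${findbugs.home}/lib/findbugs-ant.jar"/>

<target name="findbugs" depends="compile">
  <findbugs home="${findbugs.home}"
            output="html" outputFile="reports/findbugs.html"
            effort="max" reportLevel="low">
    <sourcePath path="src"/>
    <class location="classes"/>
  </findbugs>
</target>
}}}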

Metrics about style, 'comments per code line', or javadoc coverage might not be as useful. Who wants to read javadoc like 'this is the string input to function xYz'? Unfortunately, such comments are 'rewarded' by these metrics. The Onionoo code base hardly contains any comments; in my opinion that's fine, and the comments I encountered were really worth reading.

How (and Why)

If coding metrics are supposed to be enforced, it should be possible to measure them during coding itself, i.e. having one or more Ant tasks for measuring coverage, dependencies, etc. would be convenient.
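
For the coverage part, one hedged sketch of such a target using the JaCoCo Ant tasks, assuming jacocoant.jar in lib/, compiled classes in classes/, test classes in test-classes/, and sources in src/ (all placeholder paths, not Onionoo's real build layout):

{{{
<!-- Sketch only: coverage via the JaCoCo Ant tasks; all paths are placeholders. -->
<project name="onionoo-coverage" xmlns:jacoco="antlib:org.jacoco.ant">

  <taskdef uri="antlib:org.jacoco.ant" resource="org/jacoco/ant/antlib.xml">
    <classpath path="lib/jacocoant.jar"/>
  </taskdef>

  <target name="coverage" depends="test-compile">
    <!-- Run the existing JUnit tests under the JaCoCo agent. -->
    <jacoco:coverage destfile="reports/jacoco.exec">
      <junit fork="true" forkmode="once" printsummary="on">
        <classpath>
          <pathelement location="classes"/>
          <pathelement location="test-classes"/>
          <fileset dir="lib" includes="*.jar"/>
        </classpath>
        <batchtest>
          <fileset dir="test-classes" includes="**/*Test.class"/>
        </batchtest>
      </junit>
    </jacoco:coverage>

    <!-- Turn the execution data into a browsable HTML report. -->
    <jacoco:report>
      <executiondata>
        <file file="reports/jacoco.exec"/>
      </executiondata>
      <structure name="Onionoo">
        <classfiles>
          <fileset dir="classes"/>
        </classfiles>
        <sourcefiles>
          <fileset dir="src"/>
        </sourcefiles>
      </structure>
      <html destdir="reports/coverage"/>
    </jacoco:report>
  </target>
</project>
}}}

Running `ant coverage` would then leave a browsable report under reports/coverage/ that could also be fed into Sonar or a CI job.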

Such tasks can easily be integrated into many IDEs and are also useful when only a basic editor and the command line are available. In addition, their results could be integrated and visualized in Sonar and/or some CI system (does Tor use any CI?).
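
For the 'dependencies' part, Ant's optional JDepend task could produce the package-dependency and cycle metrics that such a setup could pick up. A minimal sketch, assuming jdepend.jar is available on Ant's library path and compiled classes live in classes/ (both placeholders):

{{{
<!-- Sketch only: package dependency/cycle metrics via Ant's optional
     <jdepend> task; requires jdepend.jar on Ant's library path. -->
<target name="depend-metrics" depends="compile">
  <jdepend outputfile="reports/jdepend.xml" format="xml">
    <exclude name="java.*"/>
    <exclude name="javax.*"/>
    <classespath>
      <pathelement location="classes"/>
    </classespath>
  </jdepend>
</target>
}}}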

Integration, Visualization

Once there are meaningful metrics, they should be published using Sonar, Jenkins CI, or the like.


Is there an overview of what other Tor projects use, in terms of metrics and visualization tools?


comment:2 Changed 5 years ago by karsten

Summary: Evaluate using SonarQube to improve Onionoo's code quality → Add Ant tasks for measuring coverage, dependencies, etc.

Sorry for the late reply. I agree with you that metrics on test coverage, dependencies, etc. are more important than code-style metrics. I also agree that having one or more Ant tasks for this would be a fine start. I just changed the subject to reflect that. Would you want to write these Ant tasks?

To quickly answer your questions: other Tor projects use CI, but I don't know much about that. I just asked our CI person to either give me an overview of Tor's CI or to simply post on this ticket. Once we have a better idea, we should move this part of the discussion to its own ticket.

comment:3 Changed 4 years ago by karsten

Type: task → enhancement

Sounds like an enhancement to me, and what's a "task" anyway in this context?

comment:4 Changed 3 years ago by iwakeh

Milestone: Onionoo 3.1.1

comment:5 Changed 3 years ago by iwakeh

Resolution: duplicate
Severity: Normal
Status: new → closed

Actually, we now have the Coding Guidelines and are working on implementing them in all Java projects. So this can be closed, I think. Work is tracked in #19613.

Feel free to reopen if I missed anything.

comment:6 Changed 3 years ago by karsten

Milestone: Onionoo 3.1.1 → Onionoo 3.1-1.0.0

Milestone renamed
