Track down Will Scott's torperf-like scripts, make them public if needed, and do a trial deployment somewhere
Will Scott says he is using the code to actively measure the performance
impact of proxies on web page load time. Essentially, it is a wrapper around
http://phantomjs.org/ with some aggregation and reporting added.
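The timing half of such a wrapper can be sketched roughly as below. This is not Will's code: `measure_load` and the injected `fetch` callable are hypothetical names, standing in for whatever actually drives PhantomJS (e.g. a subprocess call) so the timing logic stays testable on its own.

```python
import time


def measure_load(fetch, url):
    """Time a single page fetch in seconds.

    `fetch` is whatever performs the page load -- in the real tool this
    would invoke PhantomJS against `url`; here it is any callable, which
    keeps the timing wrapper independent of the browser driver.
    """
    start = time.monotonic()  # monotonic clock: immune to wall-clock jumps
    fetch(url)
    return time.monotonic() - start
```

Keeping the fetch step injectable also makes it easy to swap in a fetch that builds a fresh Tor circuit first, which matters for the aggregation question below.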
He adds that there are two design questions that ought to get figured out:
1. Where the monitoring should live. I have servers I can use to get a
system working at UW. But I'll graduate at some point in the next few
years, and my experience is that things left behind decay pretty fast, so
I'm somewhat hesitant to go that route.
2. How to get stable, meaningful measurements. We need enough aggregation
across both the circuit and the destination domain to dampen individual
server issues and be able to say something about Tor as a whole. Are there
other factors I'm missing that aggregation, plus setting up a new circuit
before each measurement, won't be able to overcome?
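One plausible shape for that aggregation is a two-level median: a per-domain median first (dampening a single slow or flaky server), then a median across domains (so no one destination dominates the headline number). This is a sketch of one possible scheme, not the tool's actual reporting code; the `aggregate` function and its input format are assumptions.

```python
from collections import defaultdict
from statistics import median


def aggregate(measurements):
    """Summarize load times as (per-domain medians, overall median).

    `measurements` is a list of (domain, circuit_id, load_seconds)
    tuples, one per fetch over a fresh circuit.  Medians are used at
    both levels so that a single outlier server or circuit cannot
    drag the summary around the way a mean would.
    """
    by_domain = defaultdict(list)
    for domain, _circuit, seconds in measurements:
        by_domain[domain].append(seconds)
    per_domain = {d: median(times) for d, times in by_domain.items()}
    overall = median(per_domain.values())
    return per_domain, overall
```

Grouping by domain before taking the cross-domain median is what keeps a heavily measured (or badly behaved) destination from skewing the whole-network figure.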