Opened 5 years ago

Closed 5 years ago

#12909 closed defect (worksforme)

stem.util.str_tools.get_size_label() insufficiently precise

Reported by: mmcc Owned by: atagar
Priority: Medium Milestone:
Component: Core Tor/Stem Version:
Severity: Keywords:
Cc: Actual Points:
Parent ID: Points:
Reviewer: Sponsor:


There is a known issue in Arm that causes AccountingMax sizes to be displayed with insufficient precision. For example, a real AccountingMax of 1850 GB will be displayed as 1 TB.

This is confusing, especially for operators worried about large bandwidth overage charges.

It seems that this is an issue with stem.util.str_tools.get_size_label(), which is itself mostly a wrapper for stem.util.str_tools._get_label(). These functions format and return the string that is then displayed:

(the call in arm)

(the called Stem function)

Here's the above example reproduced with the master branch of Stem, cloned on 20-08-2014. 1850000000000 is 1850 GB converted to bytes:

>>> from stem.util.str_tools import get_size_label
>>> get_size_label(1850000000000)
'1 TB'

Child Tickets

Change History (1)

comment:1 Changed 5 years ago by atagar

Resolution: worksforme
Status: new → closed

Hi mmcc. Thanks for pointing this out, though this isn't a Stem bug. Stem lets you specify the precision of its results...

>>> from stem.util.str_tools import get_size_label
>>> get_size_label(1850000000000, decimal = 2)
'1.68 TB'
>>> get_size_label(1850000000000, decimal = 40)
'1.6825651982799172401428222656250000000000 TB'
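As an aside, the result above may look surprising at first: why does 1850 GB come out as only about 1.68 TB? Stem's size labels use binary units, so a terabyte here is 2**40 bytes rather than 10**12. A quick sketch of the arithmetic (plain Python, no Stem required):

```python
# 1850 GB expressed in decimal bytes, as in the original report.
size = 1850 * 10**9

# get_size_label() uses binary units: 1 TB = 2**40 bytes.
# 1850 * 10**9 / 2**40 is roughly 1.6826, matching the
# '1.68 TB' that decimal = 2 produces above.
print(size / 2**40)
```

With the default of zero decimal places the fractional part is simply dropped, which is why the value collapses to '1 TB'.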

The arm ticket is mostly about striking a balance between space constraints and precision. Showing '1.682565198279917240142822265625 TB' in the above example would be unacceptable since it's far too wide, but you're right that '1 TB' isn't great either. The compromise will probably be a couple of decimal places or so.

Arm is undergoing a major, multi-month overhaul and I plan to take care of issues of that sort at the end.
