Opened 8 years ago

Closed 7 years ago

#5273 closed defect (fixed)

Update TBB design doc for 2.3.x

Reported by: mikeperry
Owned by: mikeperry
Priority: High
Milestone: TorBrowserBundle 2.3.x-stable
Component: Firefox Patch Issues
Version:
Severity:
Keywords: MikePerry201302d
Cc: trallala, proper@…, gk, pde
Actual Points: 20
Parent ID:
Points:
Reviewer:
Sponsor:

Description

There are a few XXXs in the design doc I should clean up.

Additionally, we need to describe our resolution defenses, provide an entropy count estimate for fingerprinting defenses, and document the environment variables and settings used to provide a non-grey "New Identity" button.

There may also be some development changes we'll want to roll into this update.

Child Tickets

Change History (48)

comment:1 Changed 8 years ago by mikeperry

From https://lists.torproject.org/pipermail/tor-talk/2012-January/022899.html:

The design doc isn't crystal clear here. It is clear that bing.com
searches will not leak to igoogle page, but not clear if
encrypted.google.com searches leak to www.google.com/ig and vice versa.

You're right, on a more technical level we need to tighten some
definitions. Unfortunately, the underlying implementation for each
identifier storage is not always uniform between FQDNs versus
subdomains. But, this could just mean we take the loosest definition.
I.e., in most cases mail.google.com can track you on
encrypted.google.com, but mail.google.com can't track you on
www.twitter.com.
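To make the "loosest definition" above concrete, here is a purely illustrative sketch (not Tor Browser code; all names hypothetical) of identifier storage double-keyed by the url bar's second-level domain, so that subdomains of one site share state while distinct first parties do not:

```python
def url_bar_key(fqdn):
    """Reduce an FQDN to its second-level domain (naive; ignores public-suffix rules)."""
    return ".".join(fqdn.split(".")[-2:])

class DoubleKeyedJar:
    """Identifier store keyed by (url-bar domain, third party, name)."""
    def __init__(self):
        self._jar = {}

    def set(self, first_party, third_party, name, value):
        self._jar[(url_bar_key(first_party), third_party, name)] = value

    def get(self, first_party, third_party, name):
        return self._jar.get((url_bar_key(first_party), third_party, name))

jar = DoubleKeyedJar()
# mail.google.com sets an identifier while the url bar shows encrypted.google.com...
jar.set("encrypted.google.com", "mail.google.com", "uid", "x123")
# ...and can read it back under www.google.com (same second-level domain):
print(jar.get("www.google.com", "mail.google.com", "uid"))   # x123
# ...but not when embedded under www.twitter.com:
print(jar.get("www.twitter.com", "mail.google.com", "uid"))  # None
```

Under this model the tracking boundary is the second-level domain, matching the "mail.google.com can track you on encrypted.google.com, but not on www.twitter.com" behavior described above.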

comment:2 Changed 7 years ago by mikeperry

We've also made changes to prefs.js, added new patches, and deployed an update notification system we should describe.

comment:3 Changed 7 years ago by proper

Cc: proper@… added

Under https://www.torproject.org/projects/torbrowser/design/ the single-state testing section is out of date; some of the test sites no longer exist.

A nice new site for testing may be http://www.stayinvisible.com.

comment:4 Changed 7 years ago by proper

Is it okay if I add comments here where documentation is incomplete, either in The Design and Implementation of the Tor Browser [DRAFT] or in the Torbutton Design Documentation? (You may prefer one ticket for Tor Browser and one for Torbutton documentation issues.)

For example, Torbutton Design Documentation lacks documentation about what "Transparent Torification (Requires custom transproxy or Tor router)" in Tor Button settings does.

comment:5 Changed 7 years ago by mikeperry

Yeah, here is fine for comments. The Torbutton design document needs to be retired. Anything in it that we still want to keep, we should make a note to add to the Tor Browser document.

Also, note to self: As the QA testing pages, sandboxes, and procedures solidify, we should record them in the Testing section of the Tor Browser doc.

comment:6 Changed 7 years ago by mikeperry

I should also provide direct links to known violations of our design requirements in our bugtracker. These links are https://trac.torproject.org/projects/tor/query?keywords=~tbb-linkability for the cross-site identifier linkability violations, and https://trac.torproject.org/projects/tor/query?keywords=~tbb-fingerprinting for the fingerprinting linkability violations.

comment:7 Changed 7 years ago by mikeperry

Keywords: MikePerry201206 added

comment:8 Changed 7 years ago by mikeperry

Cc: gk added

Georg Koppen notes the following:

  1. We specify browser.cache.memory.enable under disk avoidance. That's wrong. We don't even set it at all. Torbutton relic?
  1. We should only preserve window.name if the url bar domain remains the same. I could be convinced of this, but it's going to be trickier to implement and I think it's not really possible to remove linkability for user clicks in general.
  1. He reminded me about documenting disabling IndexedDB, but that is just one of the many prefs.js changes we need to document.
  1. We need to link to the evercookie test page, and perhaps also http://jeremiahgrossman.blogspot.de/2007/04/tracking-users-without-cookies.html
  1. We should perhaps be more vocal about the fingerprinting issues with some or all of http://www.w3.org/TR/navigation-timing/. I think I agree. If any of it shows up in Firefox, I think we should probably just disable it.
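On point 2 above: the reason window.name is a tracking channel is that, unlike cookies, the property survives cross-site navigation within a tab. A toy model (illustrative only, not browser code) of that behavior and of the proposed url-bar-domain restriction:

```python
class Tab:
    """Toy model of a browser tab: cookies are reset per first party,
    but window.name deliberately survives navigation, mirroring browsers."""
    def __init__(self):
        self.url = None
        self.window_name = ""  # persists across navigations
        self.cookies = {}      # isolated per first party

    def navigate(self, url):
        self.url = url
        self.cookies = {}      # new first party -> fresh cookie jar
        # window_name is NOT cleared here; the fix discussed above would
        # clear it whenever the url bar domain changes.

tab = Tab()
tab.navigate("https://a.example")
tab.window_name = "tracking-id-42"   # set by a.example's scripts
tab.navigate("https://b.example")    # user clicks a link to another site
print(tab.window_name)               # tracking-id-42 — readable by b.example
```

Restricting preservation to a same-url-bar-domain navigation would close this channel for cross-site clicks while keeping same-site uses working.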

comment:9 Changed 7 years ago by gk

  1. Don't forget the SPDY issues.

comment:10 in reply to:  8 Changed 7 years ago by gk

Replying to mikeperry:

Georg Koppen notes the following:

  1. We should perhaps be more vocal about the fingerprinting issues with some or all of http://www.w3.org/TR/navigation-timing/. I think I agree. If any of it shows up in Firefox, I think we should probably just disable it.

It has been implemented since Firefox 7. I've created #6204 to keep track of this issue.

comment:11 Changed 7 years ago by mikeperry

Georg also points out that 3.5.8 is not clear that what we're trying to limit is non-click driven/non-interactive linkability rather than linkability in all cases. Other sections may have this problem, too.

This is a subtlety that arises both from the impossibility of satisfying unlinkability due to covert channels in GET/POST, and from the desire to avoid breaking things like consensual federated login.

comment:12 Changed 7 years ago by mikeperry

Georg also really hates window.name. I mean like really really hates it with a blinding, almost irrational level of fury, even if its existence is limited only to click-driven linkability. ;)

He's talked me into this addition to the Philosophy section, as well as creating a Deprecation List section for things like window.name:

7. Linkability transparency

Our long term goal is to reduce all linkability to mechanisms that are
detectable by experts, so they can alert the general public about places
where it occurs. To this end, we will create a Deprecation List of archaic
web technologies that are currently (ab)used to facilitate federated login
and other legitimate click-driven cross-domain activity but that should
be replaced with more privacy friendly, auditable alternatives.

comment:13 in reply to:  12 Changed 7 years ago by gk

Replying to mikeperry:

Georg also really hates window.name. I mean like really really hates it with a blinding, almost irrational level of fury, even if its existence is limited only to click-driven linkability. ;)

*blush*

He's talked me into this addition to the Philosophy section, as well as creating a Deprecation List section for things like window.name:

7. Linkability transparency

Our long term goal is to reduce all linkability to mechanisms that are
detectable by experts, so they can alert the general public about places
where it occurs. To this end, we will create a Deprecation List of archaic
web technologies that are currently (ab)used to facilitate federated login
and other legitimate click-driven cross-domain activity but that should
be replaced with more privacy friendly, auditable alternatives.

Thanks!

comment:14 Changed 7 years ago by mikeperry

Keywords: MikePerry201207 added; MikePerry201206 removed

Default-on referrers should also go in the Deprecation List section. In an ideal world, they would be default-off but optionally enabled by the first-party HTML.

comment:15 Changed 7 years ago by mikeperry

Cc: pde added

I realized the adversary model doesn't mention "Correlate activity across multiple site visits" as one of the adversary goals. This is the primary goal of the ad networks, though. We need to explicitly mention it in the Adversary Goals section for completeness.

Additionally, it occurs to me that I should probably sit down and actually make an example context menu for my mockup privacy UI. It would contain choices such as:

  • Remove all site history and data [mapped to delete key]
  • Clear Tracking Data
  • Protect site data during New Identity
  • ------------
  • Block advertising from site
  • Beg site for privacy (aka "Do Not Track")
  • Allow Plugins and other Media Content

The reason to put the beggar's header, the adblocker, and the plugin control on a per site basis is to avoid the fingerprinting due to global prefs.

I still hate the beggar's header and dislike the adblocker ideas, but siloing them per url bar at least mitigates the damage they can do. The per-site adblocker might also drive per-site incentive for ads to not suck more than a global adblocker would.

comment:16 in reply to:  15 ; Changed 7 years ago by gk

Replying to mikeperry:

Additionally, it occurs to me that I should probably sit down and actually make an example context menu for my mockup privacy UI. It would contain choices such as:

  • Remove all site history and data [mapped to delete key]
  • Clear Tracking Data
  • Protect site data during New Identity
  • ------------
  • Block advertising from site
  • Beg site for privacy (aka "Do Not Track")
  • Allow Plugins and other Media Content

The reason to put the beggar's header, the adblocker, and the plugin control on a per site basis is to avoid the fingerprinting due to global prefs.

Maybe I am a bit slow here, but could you explain the fingerprinting risks you see for TBB users a bit? Offering these options seems rather to introduce fingerprinting issues, as users choosing them are no longer in the default set. Let alone the option for bad exits to test whether users are deploying the same filterlists and, if not, separating them, and so on...

I still hate the beggar's header and dislike the adblocker ideas, but siloing them per url bar at least mitigates the damage they can do. The per-site adblocker might also drive per-site incentive for ads to not suck more than a global adblocker would.

I am lost here as well. But maybe your ideas are due to the "Correlate activity across multiple site visits" adversary goal you thought about adding for completeness' sake? If so, I do not see how options buried in a context menu which are off by default could defend against it.

comment:17 in reply to:  16 ; Changed 7 years ago by mikeperry

Replying to gk:

Replying to mikeperry:

The reason to put the beggar's header, the adblocker, and the plugin control on a per site basis is to avoid the fingerprinting due to global prefs.

Maybe I am a bit slow here, but could you explain the fingerprinting risks you see for TBB users a bit? Offering these options seems rather to introduce fingerprinting issues, as users choosing them are no longer in the default set. Let alone the option for bad exits to test whether users are deploying the same filterlists and, if not, separating them, and so on...

Yes, the key thing in my mind is that users are able to define a relationship with a specific site under this model. If they decide to end this relationship, they hit the delete key and everything is wiped. Moreover, their decisions wrt one site do not affect browser behavior on other sites (which is the important component for 3rd party linkability/correlation through fingerprinting, IMO).

I still hate the beggar's header and dislike the adblocker ideas, but siloing them per url bar at least mitigates the damage they can do. The per-site adblocker might also drive per-site incentive for ads to not suck more than a global adblocker would.

I am lost here as well. But maybe your ideas are due to the "Correlate activity across multiple site visits" adversary goal you thought about adding for completeness' sake? If so, I do not see how options buried in a context menu which are off by default could defend against it.

The core idea here is rooted in the assumption that the crazies who think they know better (but really do not) will enable this stuff globally right now by way of installing Adblock or clicking the Beggar Checkbox... That behavior (which we probably can't expect to stop) is worse for the total population's anonymity set than per-site options. At least, I think so. Are there reasons to the contrary?

I also expect that certain sites will have homogeneous requirements wrt ad blockers and plugins/media, because people will naturally decide that those sites suck in similar ways... But perhaps that is a poor assumption? If so, please explain how/why?

As a general matter, I prefer allowing user choice if possible, but it also seems clear that user choice for global behaviors is really, really bad... Allowing easy access to per-site choices would be way better by comparison...

comment:18 in reply to:  17 ; Changed 7 years ago by gk

Replying to mikeperry:

Replying to gk:

Replying to mikeperry:

I still hate the beggar's header and dislike the adblocker ideas, but siloing them per url bar at least mitigates the damage they can do. The per-site adblocker might also drive per-site incentive for ads to not suck more than a global adblocker would.

I am lost here as well. But maybe your ideas are due to the "Correlate activity across multiple site visits" adversary goal you thought about adding for completeness' sake? If so, I do not see how options buried in a context menu which are off by default could defend against it.

The core idea here is rooted in the assumption that the crazies who think they know better (but really do not) will enable this stuff globally right now by way of installing Adblock or clicking the Beggar Checkbox... That behavior (which we probably can't expect to stop) is worse for the total population's anonymity set than per-site options. At least, I think so. Are there reasons to the contrary?

We'll see. What makes you confident that people will not install a global adblocker anymore, or do the four clicks to activate DNT globally? To make that point more clear: imagine the user who does these things globally because she is not happy with the current TBB in this regard. I would strongly argue that she is not fiddling with filterlists but just does not want to get tracked (globally!). I mean, if I fear tracking and take the effort to install an add-on that defends against it and try to get the DNT option activated in the pref menu, I think that tracking is bad in general and not just on google.com, right? Now, let's suppose you implement that option, and this user decides to use your context menu to express her beliefs. What is she going to do? Does she really go into the context menu on every site she visits to make sure that the options for this site are checked (and stay checked! she might not get the per-site logic behind your idea)? Or is she going to do the thing she a) is used to doing, b) that is much, much less error-prone, c) that already protects before the site is loaded the first time, and d) that is much, much more convenient: setting DNT to true and installing an add-on globally? I think you'll lose the per-site battle wrt DNT and Adblock, and it is just wasted effort.

There are other nasty side-effects of this design decision: it makes explaining the privacy-by-design idea even harder (if one really ships a privacy-by-design browser but still needs some exceptions, there seems to be something wrong with the design, right?), and it suggests you are okay with DNT and adblocker add-ons generally (yes, you are not, but shipping those options by default, even if only in a context menu, suggests that to laymen. They may even say: "Wow, finally, Mike got it, and I am right in using DNT and Adblock globally as I have always done!"). You also have to take new attack vectors into account (e.g. bad exits trying to mess with filterlists etc.), which in turn makes it even more difficult to understand and evaluate the security implications of your design...

All in all, I really doubt whether that idea is worth the effort, especially as you have to educate/explain the design decisions to the users anyway. If so, let's not distract from getting the double-keying idea to them.

I also expect that certain sites will have homogeneous requirements wrt ad blockers and plugins/media, because people will naturally decide that those sites suck in similar ways... But perhaps that is a poor assumption? If so, please explain how/why?

Don't know yet, as I am not sure whether I understand you correctly. What do you mean by "homogeneous requirements wrt ad blockers and plugins/media", and how does this fit the question of whether site-based options for adblockers etc., or no options at all, should be implemented in the default TBB?

As a general matter, I prefer allowing user choice if possible, but it also seems clear that user choice for global behaviors is really, really bad... Allowing easy access to per-site choices would be way better by comparison...

Choice is good but do users really want to have per site choices regarding ad-trackers and DNT? What makes you believe so? And why does that not already exist as an add-on and all anti-tracker add-ons are working globally (at least all I can currently come up with)? If there is even a tiny demand for a specific wish there usually seems to exist an add-on for it on AMO.

comment:19 in reply to:  18 ; Changed 7 years ago by mikeperry

Replying to gk:

I also expect that certain sites will have homogeneous requirements wrt ad blockers and plugins/media, because people will naturally decide that those sites suck in similar ways... But perhaps that is a poor assumption? If so, please explain how/why?

Don't know yet, as I am not sure whether I understand you correctly. What do you mean by "homogeneous requirements wrt ad blockers and plugins/media", and how does this fit the question of whether site-based options for adblockers etc., or no options at all, should be implemented in the default TBB?

Well, I think certain sites are going to cause similar choices by the userbase. Consider YouTube. Clicking through the barriers for every video there is annoying (and sometimes breaks random videos), but disabling the clickthroughs globally is a fingerprinting vector and also a large security risk. Similar situations exist for adblockers... Some sites have obnoxious advertising and rather than let this drive users to install Adblock Plus for *all* sites, we can provide them with a permission for the specific site that annoys them...

However, both of those points assume that the majority of the userbase will successfully navigate the Privacy UI...

Wrt the Beggar's Header, I am very tempted to remove it from the Tor Browser UI entirely. As a compromise, I was toying with the idea of removing it from the usual location but providing the per-site context menu... But yeah, perhaps we don't give a damn about providing even that, and should just go out of our way to kill the stupid thing.

As a general matter, I prefer allowing user choice if possible, but it also seems clear that user choice for global behaviors is really, really bad... Allowing easy access to per-site choices would be way better by comparison...

Choice is good but do users really want to have per site choices regarding ad-trackers and DNT? What makes you believe so? And why does that not already exist as an add-on and all anti-tracker add-ons are working globally (at least all I can currently come up with)? If there is even a tiny demand for a specific wish there usually seems to exist an add-on for it on AMO.

Yeah, that's exactly the scenario I want to avoid. We should add warnings/text in the addon store UI that tells the user that addons can harm privacy and anonymity... Since Adblock Plus is the most popular addon in the world, supporting some form of user choice per-site might be a decent compromise? It might prevent the userbase from fragmenting itself globally.. It's a tricky tradeoff, though.

At any rate, the key thing I want to communicate with the context menu is that all of the global privacy options that alter browser behavior need to be in a per-site menu (if they exist at all), rather than global choices.

comment:20 Changed 7 years ago by mikeperry

I thought of a cool thing to add to this context menu UI: "Block All Tracking Data From Site". This would disable cookies, cache, HTTP Auth, SSL state, HSTS, keep-alive, etc etc etc for the site.

We could then render such site icons translucent or something in the UI.

It would probably only be most useful if the user opted in to disk storage in #3100, because we'd need to store that bit on disk to allow the browser to remember it...
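A minimal, purely illustrative sketch of that "Block All Tracking Data From Site" bit (names hypothetical, not an implementation): a single per-site flag that vetoes every identifier store at once.

```python
# The identifier stores this comment proposes to disable for a blocked site.
TRACKING_STORES = ("cookies", "cache", "http_auth", "ssl_state", "hsts", "keep_alive")

blocked_sites = set()  # would need disk persistence (see #3100) to survive restarts

def allow_store(site, store):
    """Return True if `site` may use `store`; tracking stores are vetoed
    wholesale for sites the user has blocked."""
    return store not in TRACKING_STORES or site not in blocked_sites

blocked_sites.add("tracker.example")
print(allow_store("tracker.example", "cookies"))  # False: blocked site, tracking store
print(allow_store("news.example", "cookies"))     # True: site not blocked
```

The UI could then render the icons of sites in `blocked_sites` translucent, as suggested above.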

comment:21 in reply to:  19 Changed 7 years ago by gk

Replying to mikeperry:

Replying to gk:

Choice is good but do users really want to have per site choices regarding ad-trackers and DNT? What makes you believe so? And why does that not already exist as an add-on and all anti-tracker add-ons are working globally (at least all I can currently come up with)? If there is even a tiny demand for a specific wish there usually seems to exist an add-on for it on AMO.

Yeah, that's exactly the scenario I want to avoid. We should add warnings/text in the addon store UI that tells the user that addons can harm privacy and anonymity... Since Adblock Plus is the most popular addon in the world, supporting some form of user choice per-site might be a decent compromise? It might prevent the userbase from fragmenting itself globally.. It's a tricky tradeoff, though.

Okay, I've been thinking about that idea for a while now. I guess per-site user choice might indeed be a decent compromise. But only if there is really a substantial chance of getting users who would deploy a global adblocker etc. to use the per-site model (but that's just a matter of proper "incentives", right?). I don't know yet what means would be appropriate to reach that goal. A warning/text in the add-on UI seems not bad to me as a starting point. I would also be happy to have some code that binds the Adblock/Ghostery or whatever global logic to the URL bar domain by default if the per-site checkbox is checked, and ignores it otherwise. And, of course, it would be nice to have some usability study evaluating this approach.

At any rate, the key thing I want to communicate with the context menu is that all of the global privacy options that alter browser behavior need to be in a per-site menu (if they exist at all), rather than global choices.

Yes, that's a reasonable fallback if users want to alter the default behavior. Agreed.

comment:22 Changed 7 years ago by mikeperry

Keywords: MikePerry201209 added; MikePerry201207 removed

I'm going to push this out a bit longer, I think. I suspect I'm going to be pretty distracted with interviewing and other things this month.

comment:23 Changed 7 years ago by mikeperry

Keywords: MikePerry201209d added; MikePerry201209 removed

comment:24 Changed 7 years ago by mikeperry

Note to self: rransom wants us to document the env vars too. (#6821).

comment:25 Changed 7 years ago by proper

Parent ID: #5811

comment:26 Changed 7 years ago by mikeperry

There's also a bunch of broken links in #6633.

comment:27 Changed 7 years ago by mikeperry

Keywords: MikePerry201211d added; MikePerry201209d removed
Parent ID: #5811
Summary: Update TBB design doc for 2.3.x-alpha → Update TBB design doc for 2.3.x

By the time I get to this, it won't be alpha anymore :/.

comment:28 Changed 7 years ago by mikeperry

Hrmm. One of the major reasons people might not grok this whole doc is that the security and privacy requirements should perhaps go first, and the adversary model afterwards. That way, I might be able to describe how each adversary goal, capability, and attack lines up against the security and privacy properties we aim to provide.

Also, the adversary model could then be its own top-level section, instead of buried in the intro...

comment:29 in reply to:  28 Changed 7 years ago by gk

Replying to mikeperry:

Hrmm. One of the major reasons people might not grok this whole doc is that the security and privacy requirements should perhaps go first, and the adversary model afterwards. That way, I might be able to describe how each adversary goal, capability, and attack lines up against the security and privacy properties we aim to provide.

But both the security and the privacy properties follow from the adversary model. Thus, it seems a bit odd to me to describe the former first and then deliver the model that fits them. Nevertheless, I think it is a good idea to describe in more depth how the attacker model relates to the security and privacy properties. Maybe in an additional, informal/informational section in "2. Design Requirements and Philosophy", just after the requirements are introduced? That probably depends on the stuff you'd like to write...

Also, the adversary model could then be its own top-level section, instead of buried in the intro...

That seems orthogonal to me but is a good idea anyway.

comment:30 in reply to:  24 Changed 7 years ago by mikeperry

Replying to mikeperry:

Note to self: rransom wants us to document the env vars too. (#6821).

Err #6820 is the env var doc bug.

The Tails people also want better documentation on the resolution spoofing, and related to that we should probably devote an entire subsection to our seemingly bizarre Panopticlick results and the need for a user-agent-specific Panopticlick test.

comment:31 Changed 7 years ago by mikeperry

Note to self: When I finally get around to this, I should clean up/remove the old Torbutton docs, too. See #6567.

comment:32 Changed 7 years ago by mikeperry

Keywords: MikePerry201212d added; MikePerry201211d removed

comment:33 Changed 7 years ago by mikeperry

Keywords: MikePerry201301d added; MikePerry201212d removed

comment:34 Changed 7 years ago by mikeperry

Keywords: MikePerry201302d added; MikePerry201301d removed

comment:35 Changed 7 years ago by mikeperry

Actual Points: 16
Status: new → needs_review

Ok, the above changes should be reflected in https://www.torproject.org/projects/torbrowser/design/. gk's transparency idea lives in Appendix A: https://www.torproject.org/projects/torbrowser/design/#Transparency.

Please let me know if anything should be expanded or clarified.

comment:36 Changed 7 years ago by proper

I am not a qualified reviewer, but I am going to read it out of curiosity.

Mike, did you consider updating the design doc as you push changes? I think this could heavily reduce the further time you need to spend on it. It's how the Tails devs do it. If you update the design every time you push changes, while it's fresh in your brain, you have a lot less guesswork later figuring out what you did and why.

comment:37 Changed 7 years ago by proper

Bonus points for managing that thing in git, so people can follow your changes as they happen (as a notification mechanism).

comment:38 Changed 7 years ago by gk

Okay here come some comments to the first 2 sections in chronological order:

1) 2.1.2 State Separation: I think some material criterion for "other browsing modes" is missing. I mean, the doc should give kind of a blueprint of a Private Browsing Mode, right? Now, if one tries to design such a mode for, say, Chrome with it, when is it allowed to share state with the content window? "If you are not in Private Browsing Mode" does not help here. I think a minimum criterion could be: "You are in another mode if you don't have Tor enabled." But maybe there is more to it...

2) 2.2.3 Long-Term Unlinkability: Having a requirement with a "SHOULD" does not seem to fit, IMO. So, my first thought was to omit 2.2.3 entirely, re-label 2.2.1 to "Identifier Unlinkability", and put the content of the old 2.2.3 there. But after a while I came to the conclusion that it should maybe be its own point, with the "SHOULD" upgraded to a "MUST". I think the long-term unlinkability requirement is important, as some dangerous tracking falls through the cracks if one "only" provides isolation to the URL bar domain. I have in mind first-party tracking (via cookies or whatever) done e.g. by some powerful search engine provider which gets used by 80% of the people and which is the only one they use. Against correlating all the search entries of one person via cookies or another identifier, only a fresh-identity function seems to help in your current design. Therefore, it seems like a MUST to me.

3) 2.3 Philosophy: That is kind of an informational section, and all occurrences of "MUST" and "SHOULD" therefore seem wrong to me. They belong in the technical sections or should be lower-case. Thus, I would e.g. omit the second paragraph of 2.3.3, as its first sentence is already in 4.6.1 and the second one would fit there better, too.
I am not sure where to put the important points about disabling the (system-wide) add-ons/plugins, but they don't belong in 2.3.4. Discussing local history storage there does not seem appropriate either, as that is a specific technical issue too, while the section is more about broader, underlying, non-technical questions. I'd just delete it.

Last sentence of 2.3.1: "tor-state". Not sure if that is a typo, but I was wondering why it is "tor" and not "Tor".

comment:39 Changed 7 years ago by proper

It's difficult to determine how complete it is. I think that would require reading all past Trac tickets, the Firefox patches, and the Torbutton source code.

I missed information about how the update check currently works and what the plans are. Perhaps you concentrated on keeping it small: the web fingerprint, what the current state is, what's planned, and what needs help, which is also good.

However, I missed information about website fingerprinting (at entry guard), quite an important topic.

What about https://www.torproject.org/torbutton/en/design/ - time to take it offline or mark outdated?

What about #8032 "add "no dependency on Tor controller connection" to the design"?

comment:40 Changed 7 years ago by mikeperry

proper: I added the update check and our website traffic fingerprinting defenses + plans to https://www.torproject.org/projects/torbrowser/design/#other. I also added some background on website traffic fingerprinting to https://www.torproject.org/projects/torbrowser/design/#website-traffic-fingerprinting, because if you took current academic literature at face value, you'd have to come to the conclusion that it's crazy to try to encrypt anything on the web, because it can all be fingerprinted no matter what. Obviously I disagree.

gk: Ok, most of your comments should be reflected in the design doc. I did not remove the paragraphs you suggested, but I did change the wording a bit and remove the use of SHOULD and MUST.

Btw, the design doc source lives in my torbrowser.git remote mikeperry/design-doc. I'm still trying to decide how to merge that or what to do with it. I will be making an effort to keep the design doc more current in the future in general, too.

comment:41 Changed 7 years ago by mikeperry

I pushed an update that may have broken that first link to the other defenses section. In the updated version, it is now at https://www.torproject.org/projects/torbrowser/design/#other-security.

I also tried to clarify the website traffic fingerprinting attack material a bit more in that same update.

comment:42 Changed 7 years ago by gk

Here comes the second bunch of comments to section 3 - the end to the document in chronological order:

3.1.4) I think you can delete that point, as it is better viewed as one of the myriad means to reach 3.1.6 and not a separate goal. (That is reflected quite well by "zero in", which you used in both sections; btw, is it "zero in" or "zero-in"?)

3.2.4) I am wondering if the adversary described there is still the one you assume when you are talking about a passive forensic local adversary. If I as an adversary have intermittent or constant physical access, well, then I have more options than just passively monitoring something... Maybe a comment or a hint, like the one you gave for 3.3.3, would help here.

4.5.2) It was a bit confusing to me that the cache domain attribute is using the FQDN and not the url bar origin (i.e. the second-level DNS name) especially as the Tor Browser is following that url bar paradigm in almost all the other cases (and is only mentioning that one MAY use the FQDN as url bar origin). A sentence or two explaining this "discrepancy" might be good here.

4.6.4) While reading that remote fonts are excluded from the defense now I remembered:

When Gecko displays a page that uses web fonts, it initially displays text using the best CSS fallback font available on the user's computer while it waits for the web font to finish downloading.  As each web font finishes downloading, Gecko updates the text that uses that font.  This allows the user to read the text on the page more quickly.

on https://developer.mozilla.org/en-US/docs/CSS/@font-face. Have you tested that a remote server cannot fool you here? The question is: Does the font defense get triggered even if the local fonts are just placeholders for remote fonts? I am not sure whether there are different code paths for genuinely loading a local font and having it just as a placeholder and whether your defense works in both cases. Just a thought...

4.7) The GeoIP wiki token URL is probably a GeoIP wifi token URL, right? (at least "geo.wifi.access_token" makes that plausible)

4.8.7) It seems to me that closing the Tor Browser is not early enough, as there are numerous add-ons that start network activity well before the browser.js code is running. But I am not sure whether that justifies its own Design Goal section here, stating that one tries to patch the Tor Browser in a way that guarantees it only starts network activity if Tor is up, running, and used.

A) While reading "[...] based on the assumption that link-click navigation indicates user consent to tracking [...]" I suddenly thought of element.dispatchEvent() and friends. I wonder whether it is really possible to distinguish a user clicking on a link from JavaScript "clicking" on one, and therefore whether the link-click criterion is really useful here. Have you looked into that?

A.1.1) Either "Referer" or "referer" should be used, but not both (I tend toward the former).

The idea with making the Referer explicit (and how to do it) is a really good one! And for the record: it was not mine as your comment 35 indicates :)

comment:43 in reply to:  42 ; Changed 7 years ago by mikeperry

Replying to gk:

Here comes the second bunch of comments to section 3 - the end to the document in chronological order:

Ok, I think I've fixed most of these. You can double check when the website gets updated. The new design doc will claim to be current as of Torbutton 1.5.1 (rather than 1.5.0).

3.1.4) I think you can delete that point, as it is better viewed as one of the myriad means to reach 3.1.6 rather than a separate goal (which is reflected quite well by "zero in", which you used in both sections; btw, is it "zero in" or "zero-in"?)

Yep. Deleted. Also added some phrasing to 3.1.6 (now 3.1.5) to capture this idea too.

3.2.4) I am wondering if that adversary described there is still the one you assume when you are talking about a passive forensic local adversary. If I as an adversary have intermittent or constant physical access, well, then I have more options than just passively monitoring something... Maybe a comment or a hint like you did with 3.3.3 would help here.

Well, 3.2.4 is just about positioning. See the corresponding attack vector in 3.3.4, which I updated to mention both complete code-exec compromise and passive forensics.

4.5.2) It was a bit confusing to me that the cache domain attribute is using the FQDN and not the url bar origin (i.e. the second-level DNS name) especially as the Tor Browser is following that url bar paradigm in almost all the other cases (and is only mentioning that one MAY use the FQDN as url bar origin). A sentence or two explaining this "discrepancy" might be good here.

I added a phrase. I pretty much chose FQDN because it was the simplest option.
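To make the FQDN choice concrete, here is a toy sketch (plain JS, not the actual Gecko channel code; the names are purely illustrative) of what keying cache entries on the full url bar FQDN means in practice:

```javascript
// Toy model of cache isolation keyed on the url bar FQDN.
// Not Gecko code; the real implementation tags each channel with a
// cache domain attribute inside the browser.
function cacheKey(urlBarHost, resourceUrl) {
  // Keying on the FQDN (rather than the second-level domain) means
  // mail.google.com and www.google.com each get their own copy of
  // the same third-party resource.
  return urlBarHost + "|" + resourceUrl;
}

// Same resource, different url bar FQDNs => distinct cache entries:
cacheKey("mail.google.com", "https://cdn.example/lib.js");
cacheKey("www.google.com", "https://cdn.example/lib.js");
```

Keying on the second-level domain instead would collapse those two entries into one, which is the looser definition discussed earlier in this ticket.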

4.6.4) While reading that remote fonts are excluded from the defense now I remembered:

When Gecko displays a page that uses web fonts, it initially displays text using the best CSS fallback font available on the user's computer while it waits for the web font to finish downloading.  As each web font finishes downloading, Gecko updates the text that uses that font.  This allows the user to read the text on the page more quickly.

on https://developer.mozilla.org/en-US/docs/CSS/@font-face. Have you tested that a remote server cannot fool you here? The question is: Does the font defense get triggered even if the local fonts are just placeholders for remote fonts? I am not sure whether there are different code paths for genuinely loading a local font and having it just as a placeholder and whether your defense works in both cases. Just a thought...

I have actually observed this behavior happen visually, but I have not tested it with code. We do update the actual font rule itself (as the nsStyleRule member of nsRuleNode) to /only/ list the font-face font before font rendering, so I'd be surprised if you could still probe specific fonts this way. My guess is that what happens is that you get /the/ system generic font, the same one you get with browser.display.use_document_fonts set to false. But it would be a good test to do, especially if we also verify that inherited font rules can't somehow alter this fallback behavior. Both are probably something content window JS can inspect with getComputedStyle(), if it manages to win the race to inspect the element before the font is downloaded.
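For what it's worth, the behavior I expect can be modeled in a few lines (a toy model of the lookup order only, not Gecko's nsRuleNode code; names are illustrative):

```javascript
// Toy model of font resolution after the defense rewrites the rule to
// list only the @font-face font: while the web font is still
// downloading, a probe should observe the generic system default,
// never a locally installed candidate font.
function resolveFont(ruleFonts, installedFonts, webFontReady) {
  for (const f of ruleFonts) {
    if (f.webFont) {
      if (webFontReady) return f.name;
      continue; // still downloading: fall through to later entries
    }
    if (installedFonts.has(f.name)) return f.name;
  }
  return "system-default"; // generic fallback font
}
```

If Gecko instead consulted a local() placeholder before the rewrite takes effect, the probe would see the installed font, which is exactly the race the test should check for.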

4.7) The GeoIP wiki token URL is probably a GeoIP wifi token URL, right? (at least "geo.wifi.access_token" makes that plausible)

Yeah. Typo.

4.8.7) It seems to me that closing the Tor Browser is not early enough, as there are numerous add-ons that start network activity well before the browser.js code is running. But I am not sure whether that justifies its own Design Goal section here, stating that one tries to patch the Tor Browser in a way that guarantees it only starts network activity if Tor is up, running, and used.

I am confused what you mean here. We don't actually kill the Firefox process, and we consider addon network activity out of scope... I tried to clarify a couple things in this section, though.

A) While reading "[...] based on the assumption that link-click navigation indicates user consent to tracking [...]" I suddenly thought of element.dispatchEvent() and friends. I wonder whether it is really possible to distinguish a user clicking on a link from JavaScript "clicking" on one, and therefore whether the link-click criterion is really useful here. Have you looked into that?

No. I would consider this as part of #3600, as an automated cross-origin redirect vector. We should probably work on enumerating those, as that will probably help us decide what to do there. I updated the ticket with a (probably partial) list of vectors.
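One possibly relevant DOM detail: events synthesized via element.dispatchEvent() carry isTrusted === false, while genuine user input events have isTrusted === true. A sketch of a consent check along those lines (whether isTrusted is honored on every Gecko navigation code path is exactly what would need testing):

```javascript
// Sketch: treat only trusted (user-generated) click events as consent.
// Synthetic events from element.dispatchEvent() have isTrusted === false.
function isUserInitiated(evt) {
  return evt.isTrusted === true;
}

// In the browser this would hang off a click listener, e.g.:
//   document.addEventListener("click", e => {
//     if (!isUserInitiated(e)) { /* don't count as user consent */ }
//   });
```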

A.1.1) Either "Referer" or "referer" should be used, but not both (I tend toward the former).

Fixed.

The idea with making the Referer explicit (and how to do it) is a really good one! And for the record: it was not mine as your comment 35 indicates :)

Heh, yeah, but you kept hounding me about it long enough to find some kind of solution. ;)

comment:44 in reply to:  40 Changed 7 years ago by gk

Replying to mikeperry:

gk: Ok, most of your comments should be reflected in the design doc. I did not remove the paragraphs you suggested, but I did change the wording a bit and remove the use of SHOULD and MUST.

That's fine IMO. The only thing I am not happy with here is that disabling extensions is only mentioned in 2.3.4, an informational section. I mean, extensions are basically as powerful as plugins, and third-party extensions in particular (i.e. extensions installed by some crappy software as a byproduct) caused Mozilla a lot of trouble, as they were quite often malicious with respect to the privacy/security of users. Why not add a point in section 4.1 explaining that all system-wide/third-party extensions MUST be disabled as long as the user has not allowed them, since they can easily bypass proxy settings by creating e.g. UDP sockets? Depending on how they are programmed (see the contentaccessible flag, for instance), extensions might also contribute to cross-origin linkability...

comment:45 in reply to:  43 Changed 7 years ago by gk

Replying to mikeperry:

Replying to gk:

3.2.4) I am wondering if that adversary described there is still the one you assume when you are talking about a passive forensic local adversary. If I as an adversary have intermittent or constant physical access, well, then I have more options than just passively monitoring something... Maybe a comment or a hint like you did with 3.3.3 would help here.

Well, 3.2.4 is just about positioning.

Ah, yes. I missed that somehow while I was reading that section. Makes sense.

4.8.7) It seems to me that closing the Tor Browser is not early enough, as there are numerous add-ons that start network activity well before the browser.js code is running. But I am not sure whether that justifies its own Design Goal section here, stating that one tries to patch the Tor Browser in a way that guarantees it only starts network activity if Tor is up, running, and used.

I am confused what you mean here. We don't actually kill the Firefox process, and we consider addon network activity out of scope...

Okay, I was just referring to

appStartup.quit(3);

which forces all windows to close and quits Firefox thereafter. But if addon network activity is out of scope, just having patch 0007 is fine.
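For reference, the quit modes behind that call, from the nsIAppStartup idl of that era (shown here as plain constants):

```javascript
// nsIAppStartup quit modes (Gecko): appStartup.quit(3) == eForceQuit,
// which closes all windows unconditionally and then exits the process.
const eConsiderQuit = 0x01; // quit only if this is the last window
const eAttemptQuit  = 0x02; // windows may veto the quit
const eForceQuit    = 0x03; // close everything, no veto allowed
```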

3.3.1) Just a typo: "realtively"

comment:46 Changed 7 years ago by mikeperry

Actual Points: 16 → 20
Resolution: fixed
Status: needs_review → closed

Ok, I added a paragraph to the proxy obedience and state separation sections describing how we disable system extensions and eliminate the xpi whitelist (which doesn't seem to affect the addons pane, so I filed #8493 for that). I also fixed the typo and improved the exploit/physical access attack section a bit more.
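For readers looking for the knobs involved, the extension-scope prefs look roughly like this (the exact values in the shipped prefs.js may differ; treat this as a sketch of the mechanism, not the deployed configuration):

```javascript
// Sketch of prefs.js entries restricting where extensions load from.
// SCOPE_PROFILE == 1; the other bits cover user/app/system scopes.
pref("extensions.enabledScopes", 1);      // load only profile extensions
pref("extensions.autoDisableScopes", 15); // auto-disable anything found elsewhere
```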

I didn't do anything about the 0007 patch because it's a wart on top of a wart and I want it to die as soon as we get rid of Vidalia as our launcher (which is hopefully soon). It's also currently broken, so if we do end up needing to keep it, it needs to be rewritten anyway (#8350).

The design doc date with these changes will be March 15.

comment:47 Changed 7 years ago by proper

Resolution: fixed
Status: closed → reopened

There is now the updated https://www.torproject.org/projects/torbrowser/design/ and the older https://www.torproject.org/torbutton/en/design/.

Isn't there redundancy? Shouldn't the older one be merged into the newer one and/or retired?

And if there is need to keep https://www.torproject.org/torbutton/en/design/, is there a ticket to update it?

comment:48 Changed 7 years ago by mikeperry

Resolution: fixed
Status: reopened → closed

Don't reopen random unrelated tickets. Thanks!

Note: See TracTickets for help on using tickets.