Opened 17 months ago

Last modified 8 months ago

#25197 new defect

Design document isn't precise about "Security" and "Privacy".

Reported by: arthuredelstein
Owned by: tbb-team
Priority: Medium
Milestone:
Component: Applications/Tor Browser
Version:
Severity: Normal
Keywords: tbb-spec
Cc: arma, hiro
Actual Points:
Parent ID: #25021
Points:
Reviewer:
Sponsor:


In Tor Browser, we have a "Security" Slider and various "Privacy" features. But these words are not so easily distinguished. Maybe we could think of better words?

In any case, we should define the two concepts very clearly in the Design document, and we should make sure we don't mix them up. For example, section 2.1 is entitled "Security Requirements" but goes on to list what I would consider privacy properties, and does not include the sort of security intended to be provided by the Slider.

Child Tickets

Change History (6)

comment:1 Changed 17 months ago by arma

This ticket started when I saw tor browser devs saying things like "that's security, not privacy", which is a recipe for confusion in our modern "you have to choose between security and privacy" world.

I think we have been using two notions:

  • Code security, or implementation security, which is about whether the browser can be exploited, which of course then could lead to deanonymization, identification, etc.
  • Privacy, which includes fingerprinting defense, but also proxy bypass defense, so in a sense it's all of the ways that things can go wrong for the user without implementation bugs.

Our name "security slider" is strictly supposed to mean the first one. That is, all settings of the security slider are intended to provide all of our privacy protections. So if a Tor Browser dev ever says "well, you set your security slider to low, so I figured I shouldn't enable that expensive tracking protection", then that is a mistake.

(Arthur correctly points out that reducing surface area, which primarily aims to reduce exposure to implementation bugs aka exploits, can also improve things against fingerprinting and tracking and so on. That blurry line certainly confuses the issue, but it doesn't by itself mean we aren't talking about two different topics.)

The suggestion in this ticket is to (a) have a section towards the top of the design doc explaining this distinction between the two goals, and then (b) make sure that the rest of the design doc uses these two goals correctly, i.e. doesn't confusingly switch between one word and the other.

It's also worth brainstorming more intuitive terms for each of these goals. I think "code security" or "implementation security" is a pretty good one for the first, but the privacy one is broad enough that it's not obvious which term would be best. Let's not let a lack of the best term slow us down too much though. :)

comment:2 Changed 17 months ago by arthuredelstein

(Moving this comment to the other ticket where it belongs.)

Last edited 17 months ago by arthuredelstein

comment:3 Changed 17 months ago by arthuredelstein

Parent ID: #25021

comment:4 Changed 8 months ago by gk

Before the advent of the security slider "code security" was not on the radar of the design document. Its aim was (and still to a large extent is) to describe what we think a *Private Browsing Mode*, not a whole browser, should look like. In that context "security requirements" and "privacy requirements" had/have some particular meaning.

So, in that regard I think this bug is not really valid, especially as it is quite clear in the document what is meant by those concepts. Sure, it gets tricky once one does not have the PBM scope of the document in mind, but that's not unexpected. :)

Now, I am fine if we want to refocus slightly and try to get the bigger picture into the document, a shift which started with the security slider (and mentioning it in our design doc) and is intensifying with our planned sandboxing efforts.

I don't want to give up on the distinction between security and privacy requirements as made in the document at the moment, as that one seems useful. But I think we can re-label those. I've been thinking about:

"security requirements" -> "safety requirements"
"privacy requirements" -> "unlinkability requirements"

both under the umbrella of what we would commonly call Private Browsing Mode; thus, they are indeed both privacy requirements.

We can call the other one "code security" and put into it the slider, as well as the updater we deploy and the update notifications via Tor(button). Later on, all our sandboxing efforts can go into that part, too.

Last edited 8 months ago by gk

comment:5 in reply to:  4 Changed 8 months ago by arthuredelstein

Replying to gk:

Just to give what I think is an important anecdote: in my experience, *many* people already think the higher settings of the "Security Slider" are designed to provide more "privacy protections" such as fingerprinting resistance.

If we want to keep the distinction clear that the slider is not about the things in 2.1, I think it will be helpful to use consistent and specific terminology, regardless of the context, especially as not everyone is going to read the whole document.

comment:6 Changed 8 months ago by traumschule

comment:5:ticket:25021 suggests explaining better why extensions can be harmful, both for security (preventing exploits) and for privacy (fingerprintability being one danger, alongside leaks of personal information). Maybe this is a good example for differentiating these concepts.

Trackers are also a tricky topic that should be covered, because "deactivate JavaScript" is the common advice for browsing more safely (higher security), but it leads to the wrong assumption that it also helps against tracking (privacy). IIUC it does not help against web bugs like hidden images or other resources on third-party domains, etc. (only against those loaded by scripts).
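To make that distinction concrete, here is a minimal sketch (domain names are hypothetical): a third-party web bug embedded as a plain image is fetched by the browser even with JavaScript disabled, while a script-injected tracker is only blocked by disabling JavaScript:

```html
<!-- Loads regardless of JavaScript settings: the browser fetches the
     image directly, revealing the visit to tracker.example -->
<img src="https://tracker.example/pixel.gif?page=article" width="1" height="1" alt="">

<!-- Only runs when JavaScript is enabled, so "deactivate JavaScript"
     stops this tracker but not the image above -->
<script>
  new Image().src = "https://tracker.example/js-pixel.gif?page=article";
</script>
```

So disabling JavaScript narrows the attack surface for exploits (code security) but does not, by itself, stop script-free tracking (privacy).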

This is especially important because many users (including myself) are trained to install "essential extensions" for privacy:
