Currently, we tell users that the GPG signatures linked to from the download page 'allow you to verify the file you've downloaded is exactly the one that we intended you to get. For example, tor-browser-1.3.15_en-US.exe is accompanied by tor-browser-1.3.15_en-US.exe.asc.' This is false.
The GPG signatures only prove that a particular person associated with The Tor Project has signed a particular file; they do not authenticate the filename, thus they do not authenticate the package name or the package version, and they do not prove that a particular package file is the final build of a package version which we want to distribute to users. This leaves our users vulnerable to version-rollback attacks and package-substitution attacks if they download packages from mirrors or over non-HTTPS connections.
We should:
switch to signing the output of sha256sum on a package file, which includes the filename and a hash of the file, rather than signing the package file directly, and
explain on the verifying-signatures page how to verify downloaded packages using the signed SHA256SUM files, including explaining that unless there is a blank line after the 'Hash: ' line and before the hash-and-filename lines, the SHA256SUM file has been tampered with.
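A rough sketch of how this could look on both ends, assuming a hypothetical sha256sums.txt file name and that the user already has the signing key (illustrative only, not the final procedure):

# Release side: hash the packages, then clearsign the digest list so that
# the filenames (and therefore package names and versions) are covered by
# the signature.
sha256sum tor-browser-*.exe > sha256sums.txt
gpg --clearsign sha256sums.txt          # writes sha256sums.txt.asc

# User side: verify the signature first, then check the downloaded file
# against the signed digest list. sha256sum warns about the PGP header and
# footer lines it cannot parse; files listed but not downloaded are reported
# as missing (newer coreutils can skip them with --ignore-missing).
gpg --verify sha256sums.txt.asc
sha256sum --check sha256sums.txt.asc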
I'm assigning this bug to myself because solving this will clearly require writing new software, including a signature-verification tool for Windows with a small binary.
Trac: Status: new to assigned; Owner: erinn to rransom
I just uploaded a shell script that takes a package file as its argument and generates a clearsigned document giving the package name, the package version, the correct SHA256 digest, and some basic verification instructions.
I've tested it with Tor source distributions; it might need more smarts to properly grab package names and version numbers from other things. It surely needs more human-friendly instructions.
If we like it, we can stick it into contrib/ and use it to sign all our stuff.
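For reference, here is a minimal sketch of roughly what such a script can do (the attached make-signature.sh may differ; the package-name and version parsing below is naive and assumes name-version.tar.gz style filenames):

#!/bin/sh
# make-signature.sh (sketch): emit a clearsigned description of a package.
# Usage: ./make-signature.sh tor-0.2.2.23-alpha.tar.gz > tor-0.2.2.23-alpha.tar.gz.txt
set -e

pkg="$1"
file=$(basename "$pkg")
# Naive split of "name-version.tar.gz"; other package types need more smarts.
name=${file%%-[0-9]*}
version=$(echo "$file" | sed 's/^.*-\([0-9][^/]*\)\.tar\.gz$/\1/')
digest=$(sha256sum "$pkg" | cut -d' ' -f1)

{
  echo "Package: $name"
  echo "Version: $version"
  echo "SHA256:  $digest"
  echo
  echo "To verify this package, run 'sha256sum $file' and compare the"
  echo "output against the SHA256 line above."
} | gpg --clearsign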
Note from IRC discussion: if we make a header or a footer of the signed thing into something with a fixed format, it will be easier to machine-verify in the future if we want to. Something based on (part of?) or incorporated into Thandy is an easier long-term solution, but this script will IMO do in the short term.
The GPG signatures only prove that a particular person associated with The Tor Project has signed a particular file; they do not authenticate the filename, thus they do not authenticate the package name or the package version, and they do not prove that a particular package file is the final build of a package version which we want to distribute to users. This leaves our users vulnerable to version-rollback attacks and package-substitution attacks if they download packages from mirrors or over non-HTTPS connections.
Isn't this still true if they download the proposed new file format over non-HTTPS connections? as an attacker in this scenario, i can just point them to the set of different files, including the old .asc.
Doesn't the tor installer package contain its version number internally? You mention an .exe, and i haven't worked on that platform in years, but i seem to recall that Windows executables could embed a version number that is visible in one of the tabs of the File Properties dialog, which would presumably not change even if the file name changed.
If you know you have a regular release schedule, say every 6 months, then you could also add a signature expiration subpacket to your OpenPGP signature that expires roughly in the time frame when you expect the next release to come around (see the --ask-sig-expire and --default-sig-expire options for gpg), to limit rollback attacks beyond that range. Users who actually verify the download with OpenPGP-compliant software will be notified that the signature has expired.
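For instance, the release signature could be made with an expiration prompt, or with a default expiration set for the signing key (the filename below is taken from the download-page example above):

# Prompt for a signature expiration while making the detached signature,
# e.g. answer "7m" to cover a 6-month release cycle with some slack.
gpg --ask-sig-expire --armor --detach-sign tor-browser-1.3.15_en-US.exe

# Or set it once in the signing key's gpg.conf:
# default-sig-expire 7m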
Another approach entirely could use the OS-native mechanism for signing distributed software:
windows appears to use signtool.exe -- i don't know much about it, whether embedded version numbers are themselves signed, and/or whether the signatures can be made to expire.
debian and debian-derived operating systems use signed apt repositories. Tools like reprepro can create signed archives, and those signatures can themselves have expiration dates.
macOS -- it seems that packagemaker is capable of signing the files. i also don't know about expiration or embedded version numbers here.
i notice that you are already doing the right thing with respect to debian and ubuntu, but you aren't protected against version-rollback attacks there either. the Release.gpg file doesn't have a signature expiration subpacket in it, and the Release file doesn't have a Valid-Until header (i only checked your sid repo -- maybe this isn't true for the other ones).
If you are using reprepro to maintain that repo, and you want to add the Valid-Until: header, add a line to the relevant stanzas in conf/distributions (replace 14d with your preferred expiration time):
ValidFor: 14d
If you want to make the signatures themselves expire, you'll need to tweak gpg.conf for whoever handles the archive-signing key (or make a new $GNUPGHOME and tweak the gpg.conf there) by adding a line like:
default-sig-expire 14d
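Putting both suggestions together, the relevant pieces might look like this (the key ID placeholder and suite details are only illustrative):

# conf/distributions stanza for reprepro
Codename: sid
Components: main
Architectures: i386 amd64 source
SignWith: <archive-signing-key-id>
ValidFor: 14d

# gpg.conf in the archive signer's $GNUPGHOME
default-sig-expire 14d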
sorry i can't help you more with the proprietary operating systems.
I think if we change the way we do signatures, we will just confuse the users who are already confused about signatures even more, without actually offering much better protection. For the careful gpg user, the date of the signature should be a good indication that something is wrong.
That said, if we want to improve the situation, the script should probably add a date field, so that people can get suspicious when the date is off (note that they could already do that with the plain gpg signatures, but having to look in many different places just makes things more complicated).
I agree with Sebastian that simplifying and integrating into existing systems is the right way forward, not to make the verification process even more complex.
At its core, it sounds like the problem you're facing here is that old packages have no expiration mechanism that would let users realize they should look for a newer version.
It seems to me that this is best achieved through a combination of system-specific cryptographic signatures with embedded expirations (for dealing with package installation time), and run-time version-checking against some authoritative server that can declare (in a cryptographically-secure way) "this version should no longer be run". I don't much like this kind of "phone home" approach, but as i understand it, tor already needs to check in with some authoritative servers to find its way into the network anyhow. If that's the case, maybe those servers can be re-used for this purpose?
The GPG signatures only prove that a particular person associated with The Tor Project has signed a particular file; they do not authenticate the filename, thus they do not authenticate the package name or the package version, and they do not prove that a particular package file is the final build of a package version which we want to distribute to users. This leaves our users vulnerable to version-rollback attacks and package-substitution attacks if they download packages from mirrors or over non-HTTPS connections.
Isn't this still true if they download the proposed new file format over non-HTTPS connections? as an attacker in this scenario, i can just point them to the set of different files, including the old .asc.
You wouldn't be able to label an old package like TBB-Windows 1.3.13 as a shiny new 1.3.18, and thereby persuade users of an up-to-date version to 'upgrade' to a buggy older version, with the new format.
Doesn't the tor installer package contain its version number internally? You mention an .exe, and i haven't worked on that platform in years, but i seem to recall that Windows executables could embed a version number that is visible in one of the tabs of the File Properties dialog, which would presumably not change even if the file name changed.
The Vidalia Bundle for Windows installer has the version numbers of Tor and Vidalia in its 'File Description' field. The Tor Browser Bundle for Windows self-extracting archive does not have any useful version information on the archive itself, although a README file inside the archive can give a lower bound on the version.
Another approach entirely could use the OS-native mechanism for signing distributed software:
windows appears to use signtool.exe -- i don't know much about it, whether embedded version numbers are themselves signed, and/or whether the signatures can be made to expire.
The major advantage of this signing method is that Windows will verify the signature for users under some circumstances. The major drawback is that it requires paying off the 'SSL mafia' for a code-signing certificate.
I agree with Sebastian that simplifying and integrating into existing systems is the right way forward, not to make the verification process even more complex.
At its core, it sounds like the problem you're facing here is that old packages have no expiration mechanism that would let users realize they should look for a newer version.
This bug report is about the fact that a user cannot verify that the file he has downloaded is the package the download page described it as.
See [http://www.freehaven.net/~arma/tuf-ccs2010.pdf] for the design of an automatic package updater which protects a user from 'repository-state freezing' and other attacks without requiring the user to manually verify that a package file is the one he intended to download.
The way to make the verification process simpler is to incorporate the package-verification part of TUF as a standalone program that verifies a package's identity using a downloaded single-file bundle of certificates.
It seems to me that this is best achieved through a combination of system-specific cryptographic signatures with embedded expirations (for dealing with package installation time), and run-time version-checking against some authoritative server that can declare (in a cryptographically-secure way) "this version should no longer be run". I don't much like this kind of "phone home" approach, but as i understand it, tor already needs to check in with some authoritative servers to find its way into the network anyhow. If that's the case, maybe those servers can be re-used for this purpose?
The Tor network consensus already includes a list of 'recommended versions' of Tor (see the Tor Metrics consensus health page). We don't actually seem to use this to deprecate old versions -- the list of recommended versions currently includes many versions of Tor vulnerable to CVE-2011-0427.
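For illustration, the recommended-versions list is visible in the consensus document itself (the path below assumes a locally cached full consensus, which may vary by configuration):

# The "client-versions" line in the consensus lists the versions that the
# directory authorities currently recommend.
grep '^client-versions' /var/lib/tor/cached-consensus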
You wouldn't be able to label an old package like TBB-Windows 1.3.13 as a shiny new 1.3.18, and thereby persuade users of an up-to-date version to 'upgrade' to a buggy older version, with the new format.
isn't that a job of the installer itself? I'd imagine that the tor installer packages will say "sorry, you already have a more recent version" and decline to upgrade.
The Vidalia Bundle for Windows installer has the version numbers of Tor and Vidalia in its 'File Description' field. The Tor Browser Bundle for Windows self-extracting archive does not have any useful version information on the archive itself, although a README file inside the archive can give a lower bound on the version.
can you add that info to the self-extracting archive?
The major advantage of this signing method is that Windows will verify the signature for users under some circumstances. The major drawback is that it requires paying off the 'SSL mafia' for a code-signing certificate.
i prefer to call them the "CA Cartel", but i get your point :) What happens when a user tries to install a program that is signed by a code-signing cert that was not issued by a member of the cartel? Does it just say "unknown issuer"? if so, you can achieve the same approach you have now by distributing your own code-signing certificate separately, and encouraging users who want to verify the package to load that certificate into their system's certificate store. (i don't know how to do this exactly, but something like certmgr.exe is probably heading in the right direction)
I'm only advocating x.509 from a tactical perspective, you understand. i agree that the system is flawed, and i prefer OpenPGP's multi-issuer certification model. But asking people to install an entirely new (to them) certificate checking tool on a system just to check some other tool seems like you've given them two problems instead of one, so they're probably not going to check it anyway.
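To illustrate what that could look like in practice (signtool ships with the Windows SDK and certutil with Windows itself; the certificate file names and flags here are only a sketch, not a recommended setup):

:: Sign the installer with a code-signing certificate held in a PFX file.
signtool sign /f tor-codesigning.pfx /p <password> tor-browser-1.3.15_en-US.exe

:: Verify the Authenticode signature against the machine's certificate stores.
signtool verify /pa tor-browser-1.3.15_en-US.exe

:: A user who obtained the Tor Project's own certificate out of band could
:: import it into their per-user trusted root store first:
certutil -addstore -user Root tor-codesigning.cer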
You wouldn't be able to label an old package like TBB-Windows 1.3.13 as a shiny new 1.3.18, and thereby persuade users of an up-to-date version to 'upgrade' to a buggy older version, with the new format.
isn't that a job of the installer itself? I'd imagine that the tor installer packages will say "sorry, you already have a more recent version" and decline to upgrade.
The Tor Browser Bundle is not installed, it is unpacked, and often onto a removable storage device in order to reduce the traces left on the computer(s) on which it is used.
The Vidalia Bundle for Windows installer has the version numbers of Tor and Vidalia in its 'File Description' field. The Tor Browser Bundle for Windows self-extracting archive does not have any useful version information on the archive itself, although a README file inside the archive can give a lower bound on the version.
can you add that info to the self-extracting archive?
That information could be added, but it is too difficult to be worthwhile.
The major advantage of this signing method is that Windows will verify the signature for users under some circumstances. The major drawback is that it requires paying off the 'SSL mafia' for a code-signing certificate.
i prefer to call them the "CA Cartel", but i get your point :) What happens when a user tries to install a program that is signed by a code-signing cert that was not issued by a member of the cartel? Does it just say "unknown issuer"? if so, you can achieve the same approach you have now by distributing your own code-signing certificate separately, and encouraging users who want to verify the package to load that certificate into their system's certificate store. (i don't know how to do this exactly, but something like certmgr.exe is probably heading in the right direction)
I'm only advocating x.509 from a tactical perspective, you understand. i agree that the system is flawed, and i prefer OpenPGP's multi-issuer certification model. But asking people to install an entirely new (to them) certificate checking tool on a system just to check some other tool seems like you've given them two problems instead of one, so they're probably not going to check it anyway.
Once we have an easy-to-use package verification tool, we should release binaries of it signed using each OS's code-signing system. That is not a substitute for implementing TUF and releasing a package verifier based on TUF.
The Go standard library now contains an openpgp package which may be sufficient to write a verification tool for signature files produced by make-signature.sh. We would still need to work out how the verification tool would manage keys and make trust decisions.
Trac: Points: N/A to N/A; Status: needs_review to assigned; Actualpoints: N/A to N/A
Once we have an easy-to-use package verification tool, we should release binaries of it signed using each OS's code-signing system. That is not a substitute for implementing TUF and releasing a package verifier based on TUF.
Even if you had this, it would have horrible usability (doing-it-right(r)) and wouldn't be of help to many users. Users would still have to blindly trust the easy-to-use package verification tool on first download, or use the ordinary complicated verification to get it in the first place.
The problem for anyone who wants to ship software, such as The Tor Project, is that operating systems don't have an easy and secure way to obtain software from third parties (none that includes end-to-end verification, defense against rollback attacks, deterministic builds, quorum signatures, etc.). Ideas...
Windows:
Windows Appstore, maybe? Well, Microsoft would know who downloaded Tor Browser, but anonymously obtaining Tor Browser is difficult anyway, and Microsoft has a say in what goes into their store and what does not. Tor Browser may not be able to get in for some obscure reason. That's something open for research/communication. You won't find great solutions for the Microsoft operating system, since they're not interested in supporting your case.
Debian:
Since #3994 (moved) "Get TorBrowser in Debian" is unrealistic, maybe at least #5236 (moved) "Make a deb of the Torbrowser and add to repository" is realistic?
But even if you had #5236 (moved), how do you get people the repository signing key and apt line? Debian lacks something like "sudo apt-get install torproject-repository" or "sudo apt-get install torproject/tbb" to install tbb from a third party (torproject).
And even if you had this, you still wouldn't have quorum signatures.
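For contrast, the manual steps a user has to follow today to add the existing torproject repository look roughly like this (a sketch; the keyserver and file names are illustrative, the fingerprint is the deb.torproject.org archive key quoted later in this ticket):

# Add the apt line, fetch and trust the archive signing key, then install.
echo 'deb http://deb.torproject.org/torproject.org sid main' | sudo tee /etc/apt/sources.list.d/torproject.list
gpg --keyserver hkp://pool.sks-keyservers.net --recv-keys A3C4F0F979CAA22CDBA8F512EE8CBC9E886DDD89
gpg --export A3C4F0F979CAA22CDBA8F512EE8CBC9E886DDD89 | sudo apt-key add -
sudo apt-get update && sudo apt-get install tor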
There are no graphical package managers for Debian with good usability. (Well, there is Software Center, but it has bugs which make it unusable.)
Conclusion:
All in all, getting a highly secure and usable software update tool where third parties can easily and reliably ship their stuff isn't trivial. In the short term, I suggest creating a draft: what today's built-in software update tools (apt-get etc.) can do, which features are lacking, and a sketch of the software update tool of your third-party dreams. And then hope someone creates it or gets it funded.
In fact, a SHA-256 hash of a file must only be used to provide integrity against accidental errors, not maliciously crafted files.
You must NEVER use it as a protection measure. You must use a secure MAC/signature instead. So, I think that the author of this ticket needs to learn a little crypto.
For example, a replay attack: a malicious actor performing a MitM against your machine has saved the metadata that lists the vulnerable version. The malicious actor replays that metadata to your system, preventing your system from seeing the newly patched libEXAMPLE. This gives the attacker up until the Valid-Until date to attempt to launch an attack against you.
What I learned:
we know downloading executable files from a website is unsafe unless authenticity is checked (by verifying the issuer of the TLS certificate), assuming the encryption used is not vulnerable to a rollback attack and the server has not been compromised in some other way
to protect against this, files need to be signed with the release key which is kept offline (not anywhere near the production environment), trusting the signer's opsec
it is better to go "the debian way" (or Fedora's signing architecture) by pooling all files in a trusted infrastructure (though this is not failproof, see link above)
package repositories should provide a sufficiently low expiration time (implemented for sid, good!), to protect against distribution of vulnerable older versions (Fedora uses 3 days)
$ curl http://deb.torproject.org/torproject.org/dists/sid/Release > Release
$ curl http://deb.torproject.org/torproject.org/dists/sid/Release.gpg > Release.sig
$ gpg --verify Release.sig
gpg: assuming signed data in 'Release'
gpg: Signature made Fri 10 Aug 2018 01:28:01 PM CEST
gpg:                using RSA key 2265EB4CB2BF88D900AE8D1B74A941BA219EC810
gpg: Good signature from "deb.torproject.org archive signing key" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: A3C4 F0F9 79CA A22C DBA8 F512 EE8C BC9E 886D DD89
     Subkey fingerprint: 2265 EB4C B2BF 88D9 00AE 8D1B 74A9 41BA 219E C810

$ gpg --list-packets Release.sig   # just for reference
# off=0 ctb=89 tag=2 hlen=3 plen=307
:signature packet: algo 1, keyid 74A941BA219EC810
        version 4, created 1533900481, md5len 0, sigclass 0x00
        digest algo 8, begin of digest a8 9f
        hashed subpkt 33 len 21 (issuer fpr v4 2265EB4CB2BF88D900AE8D1B74A941BA219EC810)
        hashed subpkt 2 len 4 (sig created 2018-08-10)
        subpkt 16 len 8 (issuer key ID 74A941BA219EC810)
        data: [2048 bits]
When a signature has an expiration date however it is shown at the end:
gpg: Signature expires Wed 14 Aug 2019 03:37:29 AM CEST

$ en gpg --list-packets vanguards/TODO.txt.sig
# off=0 ctb=89 tag=2 hlen=3 plen=441
:signature packet: algo 1, keyid AA84FDED4E218633
        version 4, created 1534210649, md5len 0, sigclass 0x00
        digest algo 10, begin of digest 54 c3
        hashed subpkt 33 len 21 (issuer fpr v4 D32C227073F822651EAD8F5DAA84FDED4E218633)
        hashed subpkt 2 len 4 (sig created 2018-08-14)
        critical hashed subpkt 3 len 4 (sig expires after 1y0d0h0m)
        subpkt 16 len 8 (issuer key ID AA84FDED4E218633)
        data: [3070 bits]
The question is what a user is supposed to do when the signature has expired.
Trac: Summary: GPG signatures do not authenticate filenames to protect users against freeze, replay and version-rollback attacks; Parent: N/A to #3893 (moved)
Tor Browser uses an automatic update system now, so the only concern is an initial download. The Windows Expert Bundle is the only package a user must manually update. While I doubt many Windows users are verifying the openpgp signature, we should probably provide a mechanism for verifying the signature was recently created.
We can add during final gpg signing:
--default-sig-expire The default expiration time to use for signature expiration. Valid values are "0" for no expiration, a number followed by the letter d (for days), w (for weeks), m (for months), or y (for years) (for example "2m" for two months, or "5y" for five years), or an absolute date in the form YYYY-MM-DD. Defaults to "0".
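Concretely, the final detached-signature step could become something like this (the 6m value and the filename are only examples):

# Make the Windows Expert Bundle signature expire six months after signing,
# so gpg --verify flags a stale download once the window has passed.
gpg --default-sig-expire 6m --armor --detach-sign tor-win32-expert-bundle.zip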
As for "what should a user do if the signature is expired", the best we can say is "try again from a different website" or "contact with us and tell us where you got the package".