Bobnomnom or some troll who is excellent at impersonating him seems to be clamoring for blocking all plugins from the Firefox address space, including flash.
In https://gitweb.torproject.org/tor-browser.git/commitdiff/efbc82de0af0c6db05804777777b7177e593f73d, we block everything but flash from entering the address space because it has been shown that arbitrary non-malicious browser plugins can be invasive to privacy. Culprits include AV plugins that report your browsing history to the AV vendor for inspection, and bank authentication plugins that send additional identifiable info to sites under certain circumstances.
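(For a concrete picture of the whitelisting idea, here is a hedged chrome-JS sketch. It is not the patch itself (the real block happens in C++ before the plugin library is ever mapped), and the name regex is an assumption about how Flash identifies itself.)

```js
// Illustrative sketch only -- not the C++ patch above, which keeps non-Flash
// plugin libraries from being mapped into the process at all. This just shows
// the same "whitelist by name" idea at the chrome-JS level via the
// AddonManager plugin provider (a soft-disable, after the scan has happened).
Components.utils.import("resource://gre/modules/AddonManager.jsm");

AddonManager.getAddonsByTypes(["plugin"], function(aPlugins) {
  for (let plugin of aPlugins) {
    // Keep Flash; soft-disable everything else. The name test is an
    // assumption about how the Flash plugin identifies itself.
    plugin.userDisabled = !/Shockwave Flash/.test(plugin.name);
  }
});
```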
Note that neither that patch nor the 'plugin.disable' pref is a comprehensive defense for keeping malicious code out of Firefox's address space. It really only helps if code is generally well-behaved, but has some functionality we simply don't want in the browser at all. In the case of AV plugins, they can seriously manipulate the process address space during initialization in a way that simply disabling them from the Firefox UI won't undo. Moreover, in some cases their hooks and binary patches are so custom-tailored to official Firefox binaries that they have caused crashes when loaded under TBB. As far as I know, this is not the case for flash, which follows the NPAPI interface and doesn't do any other binary patching or hooking.
Truly malicious code has lots of ways to hoist itself into Firefox, including but not limited to: writing extensions, XPCOM components, or DLLs into the Firefox app or profile directories, injecting DLLs via CreateRemoteThread, debugger attachment, or the AppInit_DLLs registry key, modifying system DLLs, and watching for desktop keypress and drawing events.
I don't understand what threat model bob is using to argue for the additional exclusion of flash. If flash was malicious and you had it installed on your system, it could do all of these things if you ever ran your normal Firefox browser and it got loaded there. It would then have no problems using your user privileges to write the malicious portions of itself into your TBB directory using the above or other vectors.
Perhaps bob can explain the specific issue with flash in this ticket.
As I said in the description, truly malicious code can inject itself into TBB in many ways.
If we're seriously going to consider impacting people's ability to enable flash (by requiring a restart), we need justification as to what actual protections are gained by asking flash nicely not to infect Firefox. Real malware wouldn't be stopped by such a measure.
Hi Mike,
Before I get to my main points, I'd first like to explain the circumstances in which I use Torbrowser, in order that my remarks may be understood in the context of my mindset and my point of view regarding security. Basically, it's important to understand why users who are worried about security would be worried about the implications of autoloading Flash. So I'd like to start by giving a brief overview of my security precautions when using Tor:
I use Whonix. It's an operating system designed to be run via VirtualBox. It's based on Debian, and implements "security by isolation". Whonix consists of two parts. The first part is a VirtualBox VM image that solely runs Tor and acts as a gateway. The other part, called the "Workstation" VM, is where the user runs his programs (like Torbrowser, IRC, messenger, etc). By design, all network activity in the Workstation VM is transparently routed through the gateway VM (which routes it through Tor). Therefore it's impossible for any program running in the Workstation to discover the user's IP address. And so it's clear that one of the major benefits of running Whonix is that it prevents exploits like the one the FBI recently deployed against Freedom Hosting users, because even if an exploit infects the Workstation VM, it's still impossible for that exploit to discover the user's IP address, by design.
So, the types of people who run Whonix are those of us who are especially concerned about guarding our anonymity even against unlikely threats such as "the FBI develops and deploys a zero-day attack which is designed to exploit Torbrowser to deanonymize us."
Admittedly, the Whonix userbase is currently a tiny fraction of the total Torbrowser userbase. So maybe the Torbrowser userbase largely doesn't care that much about having airtight security guarantees. And as a developer, I appreciate how crucial it is for Torbrowser to offer users the most convenient experience possible, and how important it is to strike the right balance between security and convenience.
For the sake of argument, let's assume security is the primary concern. How might this "autoload system Flash binary at startup" behavior cause security problems? Well, in truth, you're correct that it's very difficult to think of any practical scenario in which an adversary could take advantage of this behavior. But, as someone who cares deeply about having reliable anonymity tools, I'm extremely uneasy, on principle, that Torbrowser can be influenced at all by any files outside of the Torbrowser folder.
This may be paranoia, but it seems like healthy paranoia: Philosophically, if you copy a Torbrowser folder from computer A to computer B, then it's desirable for Torbrowser to behave "the same way" on both computers, to the extent which is possible/reasonable. But if by default you try to autoload a system flash plugin binary, then that's no longer the case. The flash version may be different, or the flash binary may not exist at all. The only reason this is a concern is because the user was never consulted, so the user may not be aware that Torbrowser is searching for and loading unsigned binaries by default (in this case, the system Flash plugin).
It seems like a question of ethics/morals. If preserving anonymity is the most important goal of the Tor project, then it seems impossible to be morally okay with the idea that Torbrowser's runtime behavior can be influenced by files outside the Torbrowser directory, by default, unless the user has been made aware of that.
Don't get me wrong, it's very valuable that Torbrowser supports loading the system Flash plugin. There are certainly many users who will want to do that. However it seems unfair to force that feature onto all users by default, without explicitly bringing it to their attention.
Ok, enough philosophizing. At this point I'd like to point out a practical scenario in which the user's security may be jeopardized by this auto-loading behavior. Note that the scenario isn't merely a theoretical concern; users often exhibit the pattern of behavior I'm about to describe. Here's the scenario:
It's possible that sometime in the future, a Flash-based remote code execution vulnerability will be discovered. Now, what if the user's system Flash plugin is out of date the next time they launch Torbrowser? Then they won't have the Flash security update, and therefore they'll be vulnerable to the newly-discovered Flash exploit until they update their system Flash plugin.
For example, imagine a user installs Firefox, along with a system Flash plugin, but chooses not to allow the system flash plugin to automatically update itself. Then the user downloads Torbrowser, shuts down his computer, and goes on vacation to Amsterdam for a week. While he's on vacation, a vulnerability is discovered in the latest system flash plugin which allows a specially-crafted SWF file to overflow a memory buffer and thereby enable an attacker to execute arbitrary malicious code.
When our user returns from vacation, he starts up Torbrowser without checking whether his Flash is currently up to date. His Torbrowser will now load the old, vulnerable system Flash plugin. At this point, the user starts browsing around various onion sites. If the user is unfortunate enough to visit a malicious site that serves the exploit to him, then his Torbrowser will be immediately pwned. Then his home IP address (and therefore his real identity) can easily be revealed to the adversary (unless the user happens to be using an isolated environment like Whonix), potentially landing him in jail or in trouble with his government.
Now, the only reason this scenario is disturbing in the slightest is because it was facilitated by Torbrowser's default behavior. If Torbrowser defaults to "can be influenced by files outside of Torbrowser directory" without explaining that to the user, then from the user's point of view, it's extraordinarily surprising that something else on the system that seems completely unrelated to Torbrowser (in this case, installing Firefox+Flash but choosing to disallow Flash autoupdates) could possibly be the cause of a catastrophic security breach in Torbrowser under any circumstances. Whereas if Torbrowser had asked the user to opt-in to the autoloading behavior, then the user himself is rightfully to blame: in that case the user would be fully aware Torbrowser is autoloading the system Flash, yet he failed to ensure the Flash binary was up to date, which is clearly his own fault.
At this point, it's possible you may have spotted some error in my reasoning; if so, you may feel like my overall concerns are totally invalid just because that one particular scenario turned out to be invalid. But my main point is simply this: if Torbrowser is searching for and executing arbitrary unsigned binary files (e.g. system Flash plugin or anything else) outside of Torbrowser's folder, then we should be extremely careful about the implications and the risks, and at a minimum it seems like we shouldn't make it the default behavior unless the user has been consulted first, or unless we explain the implications to him.
Therefore, it really seems like Torbrowser should never, by default, allow itself to be influenced by any file outside of the Torbrowser folder, unless the user has explicitly allowed it, or is at least aware that Torbrowser does that.
Is everyone comfortable with Torbrowser being affected by files outside of its directory by default, without consulting the user? If not, then it might be good to consider changing the default behavior to "must ask the user whether he's OK with this."
Lastly, I realize there's a chance that perhaps I'm simply being unreasonably concerned about this whole thing, realistically. So if that's the case, then I sincerely apologize, and please feel free to ignore this writeup. But the reason I wrote this is because I can't think of any logical reason why it's unreasonable to expect Torbrowser's default behavior to be: "Torbrowser's operation shall never be affected by any file outside the Torbrowser directory under any circumstances, unless the user has explicitly been asked and has given approval."
First, remember that TBB pulls in a lot of code from all over your system. It is dependent on a ton of libraries, display manager code, and interacts with other apps on your desktop all the time through X11 event monitoring and other mechanisms.
Further, at the end of the day, I want the default experience to be maximally usable, but of course not at the expense of any known proxy bypass or deanonymization issues. If there was a solid, known security reason not to load Flash, I would be more convinced that it was worth impeding UX. But the Firefox plugin blocker has shown no signs of being incomplete, nor has flash shown any signs of being malicious in its interaction with the Firefox address space.
However, it does sound like we're getting closer to a situation where we can have both decent UX and satisfy this request. If we can touch up this patch a bit to also add a button in the Addons->Plugins UI such that users can enable plugins by clicking on that button (in addition to via the Torbutton settings), this does seem like a reasonable user experience, especially since it would appear to no longer require restarting the browser to load+enable Flash (which was a key aspect of my initial opposition).
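Roughly, such a button would only need to flip the pref that the patch already watches. A minimal sketch, assuming the button id and its placement in about:addons (neither is part of the attached patch):

```js
// Hedged sketch of an "enable plugins" button handler for the
// Addons->Plugins view. The button id below is hypothetical; the actual
// plugin (un)loading is driven by the patch reacting to the pref change.
const Cu = Components.utils;
Cu.import("resource://gre/modules/Services.jsm");

function onEnablePluginsCommand() {
  // Flipping the pref is what should trigger the plugin host to scan and
  // load plugins, without a browser restart.
  Services.prefs.setBoolPref("plugin.disable", false);
}

document.getElementById("enable-plugins-button")  // hypothetical id
        .addEventListener("command", onEnablePluginsCommand);
```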
The other thing we can (perhaps also?) do is make this part of one of the positions on the security slider from #9837 (closed).
It is dependent on a ton of libraries, display manager code, and interacts with other apps on your desktop all the time through X11 event monitoring and other mechanisms.
Those are system libs; all of that is about the operating system and UI, which has to exist to use TBB. Plugins (Flash) are not part of the OS and don't need to exist to use TBB.
If we can touch up this patch a bit to also add a button in the Addons->Plugins UI such that users can enable plugins by clicking on that button (in addition to via the Torbutton settings)
Attached a complete version. As an enhancement it doesn't require any changes to the Torbutton code. If this ticket isn't counted as a bug, there's no need to show a warning or ask the user anything when plugin loading is enabled. But TBB should be shipped with the plugin.disable preference defined as true.
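Concretely, shipping that default is a single line in the browser's default preferences (a sketch; the exact defaults file TBB uses is build-specific):

```
// In a defaults/preferences *.js file shipped with TBB (the file name is
// an assumption; only the pref and value come from this ticket):
pref("plugin.disable", true);
```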
About the patch: it's still fragile in several places. First, it observes changes to the preference in PluginProvider, but this can't guarantee that plugins have already been loaded or unloaded. The code of nsPluginHost::Observe in nsPluginHost.cpp needs to be changed instead, to emit a new notification for PluginProvider after the operations have completed. Second, it uses an observer via gListView. Installing AddonListeners for every plugin to catch onUninstalled would be the more correct approach instead. (During testing it failed to catch this correctly at the first show of gListView and required a lot of code, so I'm unsure about that approach.)
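To make the first problem concrete, the fragile pattern looks roughly like this (a simplified sketch, not the literal patch code; refreshPluginListView is a placeholder):

```js
// PluginProvider-style code watching the pref. The observer fires as soon
// as the pref flips, but nsPluginHost may not have finished loading or
// unloading plugins yet; a dedicated notification emitted from
// nsPluginHost::Observe after the work completes would avoid the race.
Components.utils.import("resource://gre/modules/Services.jsm");

const prefObserver = {
  observe: function(aSubject, aTopic, aData) {
    if (aTopic == "nsPref:changed" && aData == "plugin.disable") {
      // RACE: the plugin list may still be stale at this point.
      refreshPluginListView();  // hypothetical UI refresh
    }
  }
};
Services.prefs.addObserver("plugin.disable", prefObserver, false);
```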
As far as localization goes, the easiest thing is probably to use DTD/properties elements from either Torbutton or Tor Launcher (we can create new strings).
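For example, a new string added to Torbutton's properties file could be pulled the usual way. A sketch, assuming the bundle path and the string key (neither exists yet):

```js
// Sketch: fetching a new localized string for the button label.
// The chrome URL and key below are placeholders for strings we'd add.
Components.utils.import("resource://gre/modules/Services.jsm");

let bundle = Services.strings.createBundle(
    "chrome://torbutton/locale/torbutton.properties");  // assumed path
let label = bundle.GetStringFromName("torbutton.plugins.enableButton.label");
```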
I would also like to still pop up the current confirmation dialog as well (which uses the xpcom-category-entry-added observer notification). Is that still emitted, and can we still cancel the load from there? If not, we should find some other way to provide the same notification and confirmation behavior.
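For reference, the existing hook follows this general pattern (a sketch, not the exact Torbutton code; promptUserAboutPlugin is a placeholder for the confirmation dialog):

```js
// Listen for category registrations so the confirmation dialog can run
// before a newly discovered plugin is used. Whether this notification is
// still emitted under the new patch is exactly the open question above.
Components.utils.import("resource://gre/modules/Services.jsm");

const catObserver = {
  observe: function(aSubject, aTopic, aData) {
    if (aTopic != "xpcom-category-entry-added")
      return;
    // aSubject is the entry name (nsISupportsCString); aData is the category.
    let entry = aSubject.QueryInterface(
        Components.interfaces.nsISupportsCString).data;
    promptUserAboutPlugin(entry, aData);  // hypothetical confirmation dialog
  }
};
Services.obs.addObserver(catObserver, "xpcom-category-entry-added", false);
```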
If the user clicks to enable one of the plugins from the list that was loaded via this new button, then the popup should still occur. This patch does not change those parts; it adds a layer before the existing confirmations. Do you want to change Torbutton so that it asks twice, or so that it enables flash after the user clicks to find all plugins by changing plugin.disable? (This plugin disabling seems to cause confusion with the per-plugin attributes from getPluginTags, but they are different and about different things.)
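To be explicit about the two different "disabled" notions (a hedged sketch; it only illustrates the distinction, nothing here is from the patch):

```js
// The two switches that are easy to confuse:
Components.utils.import("resource://gre/modules/Services.jsm");
Components.utils.import("resource://gre/modules/AddonManager.jsm");

// 1) The global switch: keeps the plugin host from loading any plugin
//    library into the process in the first place.
let allBlocked = Services.prefs.getBoolPref("plugin.disable");

// 2) The per-plugin flag exposed by the plugin provider (backed by the
//    plugin tags from getPluginTags): only meaningful once the libraries
//    have already been scanned.
AddonManager.getAddonsByTypes(["plugin"], function(aPlugins) {
  for (let p of aPlugins)
    dump(p.name + " individually disabled: " + p.userDisabled + "\n");
});
```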