Opened 4 years ago

Closed 20 months ago

#10280 closed enhancement (fixed)

Torbrowser shouldn't load flash into the process space by default

Reported by: mikeperry Owned by:
Priority: Medium Milestone:
Component: Applications/Tor Browser Version:
Severity: Normal Keywords: tbb-testcase, TorBrowserTeam201503R, tbb-firefox-patch, tbb-4.5-alpha, MikePerry201503R
Cc: mcs, brade, erinn, gk, arthuredelstein Actual Points:
Parent ID: Points:
Reviewer: Sponsor:

Description

Bobnomnom or some troll who is excellent at impersonating him seems to be clamoring for blocking all plugins from the Firefox address space, including flash.

In https://gitweb.torproject.org/tor-browser.git/commitdiff/efbc82de0af0c6db05804777777b7177e593f73d, we block everything but flash from entering the address space because it *has* been shown that arbitrary non-malicious browser plugins *can* be invasive to privacy. Culprits include AV plugins that report your browsing history to the AV vendor for inspection, and bank authentication plugins that send additional identifiable info to sites under certain circumstances.

Note that neither that patch nor the 'plugin.disable' pref are a comprehensive defense for keeping malicious code out of Firefox's address space. It really only helps if code is generally well-behaved, but has some functionality we simply don't want in the browser at all. In the case of AV plugins, they can seriously manipulate the process address space during initialization in a way that simply disabling them from the Firefox UI won't undo. Moreover, in some cases their hooks and binary patches are so custom-tailored to official Firefox binaries that they have caused crashes when loaded under TBB. As far as I know, this is not the case for flash, which follows the NPAPI interface and doesn't do any other binary patching or hooking.

Truly malicious code has lots of ways to hoist itself into Firefox, including but not limited to: writing extensions, XPCOM components, or DLLs into the Firefox app or profile directories; injecting DLLs via CreateRemoteThread, debugger attachment, or the AppInit_DLLs registry key; modifying system DLLs; and watching for desktop keypress and drawing events.

I don't understand what threat model bob is using to argue for the additional exclusion of flash. If flash *were* malicious and you had it installed on your system, it could do all of these things if you ever ran your normal Firefox browser and it got loaded there. It would then have no problem using your user privileges to write the malicious portions of itself into your TBB directory using the above or other vectors.

Perhaps bob can explain the specific issue with flash in this ticket.

Child Tickets

Attachments (11)

disable_plugin_button.diff (3 bytes) - added by cypherpunks 4 years ago.
Complete fix of UI with button
notify_for_pluginprovider.diff (3 bytes) - added by cypherpunks 4 years ago.
Another Notify for PluginProvider
prerelease_with_racings_without_cpp_fix.diff (3 bytes) - added by cypherpunks 4 years ago.
Complete version for test purpose without rebuilding browser
proper_intercat_by_ui.diff (3 bytes) - added by cypherpunks 4 years ago.
Proper interface for extensions.js
disable_plugin_button.2.diff (6.5 KB) - added by erinn 4 years ago.
notify_for_pluginprovider.2.diff (2.1 KB) - added by erinn 4 years ago.
prerelease_with_racings_without_cpp_fix.2.diff (7.9 KB) - added by erinn 4 years ago.
proper_intercat_by_ui.2.diff (5.3 KB) - added by erinn 4 years ago.
prerelease_with_cpp_fix_nontested.2.diff (3 bytes) - added by cypherpunks 3 years ago.
Complete version, untested
prerelease_with_cpp_fix_nontested.diff (3 bytes) - added by cypherpunks 3 years ago.
Complete version, untested.
10280.patch (12.3 KB) - added by disgleirio 3 years ago.
Proper UI for hidden pref, with support to TorBrowser security


Change History (62)

comment:1 Changed 4 years ago by brade

Cc: mcs brade added

comment:2 Changed 4 years ago by arma

Some irc person adds:
"why did you said plugin.disable can't keep plugins out of firefox?"
"https://mxr.mozilla.org/mozilla-central/source/dom/plugins/base/nsPluginHost.cpp#1919"
"how plugins can be loaded with such pref?"
"#10280 can't be enhancement. it's defect as is."

comment:3 Changed 4 years ago by mikeperry

Cc: erinn added

As I said in the description, truly malicious code can inject itself into TBB in many ways.

If we're seriously going to consider impacting people's ability to enable flash (by requiring a restart), we need justification as to what actual protections are gained by asking flash nicely not to infect Firefox. Real malware wouldn't be stopped by such a measure.

comment:4 Changed 4 years ago by gk

Cc: gk added

comment:5 Changed 4 years ago by cypherpunks

Last edited 4 years ago by cypherpunks

comment:6 in reply to:  3 Changed 4 years ago by onezero

Replying to mikeperry:

As I said in the description, truly malicious code can inject itself into TBB in many ways.

If we're seriously going to consider impacting people's ability to enable flash (by requiring a restart), we need justification as to what actual protections are gained by asking flash nicely not to infect Firefox. Real malware wouldn't be stopped by such a measure.

Hi Mike,

Before I get to my main points, I'd first like to explain the circumstances in which I use Torbrowser, in order that my remarks may be understood in the context of my mindset and my point of view regarding security. Basically, it's important to understand why users who are worried about security would be worried about the implications of autoloading Flash. So I'd like to start by giving a brief overview of my security precautions when using Tor:

I use Whonix. It's an operating system designed to be run via VirtualBox. It's based on Debian, and implements "security by isolation". Whonix consists of two parts. The first part is a VirtualBox VM image that solely runs Tor and acts as a gateway. The other part, called the "Workstation" VM, is where the user runs his programs (like Torbrowser, IRC, messenger, etc). By design, all network activity in the Workstation VM is transparently routed through the gateway VM (which routes it through Tor). Therefore it's impossible for any program running in the Workstation to discover the user's IP address. And so it's clear that one of the major benefits of running Whonix is that it prevents exploits like the one the FBI recently deployed against Freedom Hosting users, because even if an exploit infects the Workstation VM, it's still impossible for that exploit to discover the user's IP address, by design.

So, the types of people who run Whonix are those of us who are especially concerned about guarding our anonymity even against unlikely threats such as "the FBI develops and deploys a zero-day attack which is designed to exploit Torbrowser to deanonymize us."

Admittedly, the Whonix userbase is currently a tiny fraction of the total Torbrowser userbase. So maybe the Torbrowser userbase largely doesn't care that much about having airtight security guarantees. And as a developer, I appreciate how crucial it is for Torbrowser to offer users the most convenient experience possible, and how important it is to strike the right balance between security concerns vs convenience.

For the sake of argument, let's assume security is the primary concern. How might this "autoload system Flash binary at startup" behavior cause security problems? Well, in truth, you're correct that it's very difficult to think of any practical scenario in which an adversary could take advantage of this behavior. But, as someone who cares deeply about having reliable anonymity tools, it makes me extremely uneasy that Torbrowser can be influenced at all by any files outside of the Torbrowser folder, on principle.

This may be paranoia, but it seems like healthy paranoia: Philosophically, if you copy a Torbrowser folder from computer A to computer B, then it's desirable for Torbrowser to behave "the same way" on both computers, to the extent which is possible/reasonable. But if by default you try to autoload a system flash plugin binary, then that's no longer the case. The flash version may be different, or the flash binary may not exist at all. The only reason this is a concern is because the user was never consulted, so the user may not be aware that Torbrowser is searching for and loading unsigned binaries by default (in this case, the system Flash plugin).

It seems like a question of ethics/morals. If preserving anonymity is the most important goal of the Tor project, then it seems impossible to be morally okay with the idea that Torbrowser's runtime behavior can be influenced by files outside the Torbrowser directory, by default, unless the user has been made aware of that.

Don't get me wrong, it's very valuable that Torbrowser supports loading the system Flash plugin. There are certainly many users who will want to do that. However it seems unfair to force that feature onto all users by default, without explicitly bringing it to their attention.

Ok, enough philosophizing. At this point I'd like to point out a practical scenario in which the user's security may be jeopardized by this auto-loading behavior. Note that the scenario isn't merely a theoretical concern; users often exhibit the pattern of behavior I'm about to describe. Here's the scenario:

It's possible that sometime in the future, a Flash-based remote code execution vulnerability will be discovered. Now, what if the user's system Flash plugin is out of date the next time they launch Torbrowser? Then they won't have the Flash security update, and therefore they'll be vulnerable to the newly-discovered Flash exploit until they update their system Flash plugin.

For example, imagine a user installs Firefox, along with a system Flash plugin, but chooses *not* to allow the system flash plugin to automatically update itself. Then the user downloads Torbrowser, shuts down his computer, and goes on vacation to Amsterdam for a week. While he's on vacation, a vulnerability is discovered in the latest system flash plugin which allows a specially-crafted SWF file to overflow a memory buffer and thereby enable an attacker to execute arbitrary malicious code.

When our user returns from vacation, he starts up Torbrowser without checking whether his Flash is currently up to date. His Torbrowser will now load the old, vulnerable system Flash plugin. At this point, the user starts browsing around various onion sites. If the user is unfortunate enough to visit a malicious site that serves the exploit to him, then his Torbrowser will be immediately pwned. Then his home IP address (and therefore his real identity) can easily be revealed to the adversary (unless the user happens to be using an isolated environment like Whonix), potentially landing him in jail or in trouble with his government.

Now, the only reason this scenario is disturbing in the slightest is because it was facilitated by Torbrowser's default behavior. If Torbrowser defaults to "can be influenced by files outside of Torbrowser directory" without explaining that to the user, then from the user's point of view, it's extraordinarily surprising that something else on the system that seems completely unrelated to Torbrowser (in this case, installing Firefox+Flash but choosing to disallow Flash autoupdates) could possibly be the cause of a catastrophic security breach in Torbrowser under any circumstances. Whereas if Torbrowser had asked the user to opt-in to the autoloading behavior, then the user himself is rightfully to blame: in that case the user would be fully aware Torbrowser is autoloading the system Flash, yet he failed to ensure the Flash binary was up to date, which is clearly his own fault.

At this point, it's possible you may have spotted some error in my reasoning; if so, you may feel like my overall concerns are totally invalid just because that one particular scenario turned out to be invalid. But my main point is simply this: if Torbrowser is searching for and executing arbitrary unsigned binary files (e.g. system Flash plugin or anything else) outside of Torbrowser's folder, then we should be extremely careful about the implications and the risks, and at a minimum it seems like we shouldn't make it the default behavior unless the user has been consulted first, or unless we explain the implications to him.

Therefore, it really seems like Torbrowser should never, by default, allow itself to be influenced by any file outside of the Torbrowser folder, unless the user has explicitly allowed it, or is at least aware that Torbrowser does that.

Is everyone comfortable with Torbrowser being affected by files outside of its directory by default, without consulting the user? If not, then it might be good to consider changing the default behavior to "must ask the user whether he's OK with this."

Lastly, I realize there's a chance that perhaps I'm simply being unreasonably concerned about this whole thing, realistically. So if that's the case, then I sincerely apologize, and please feel free to ignore this writeup. But the reason I wrote this is because I can't think of any logical reason why it's unreasonable to expect Torbrowser's default behavior to be: "Torbrowser's operation shall never be affected by any file outside the Torbrowser directory under any circumstances, unless the user has explicitly been asked and given approval."

What do you think?

comment:7 Changed 4 years ago by mikeperry

First, remember that TBB pulls in a *lot* of code from all over your system. It is dependent on a ton of libraries, display manager code, and interacts with other apps on your desktop all the time through X11 event monitoring and other mechanisms.

Further, at the end of the day, I want the default experience to be maximally usable, but of course not at the expense of any known proxy bypass or deanonymization issues. If there were a solid, known security reason not to load Flash, I would be more convinced that it was worth impeding UX. But the Firefox plugin blocker has shown no signs of being incomplete, nor has Flash shown any signs of being malicious in its interaction with the Firefox address space.

*However*, it does sound like we're getting closer to a situation where we can have both decent UX and satisfy this request. If we can touch up this patch a bit to also add a button in the Addons->Plugins UI such that users can enable plugins by clicking on that button (in addition to via the Torbutton settings), this does seem like a reasonable user experience, especially since it would appear to no longer require restarting the browser to load+enable Flash (which was a key aspect of my initial opposition).

The other thing we can (perhaps also?) do is make this part of one of the positions on the security slider from #9837.

comment:8 Changed 4 years ago by cypherpunks

It is dependent on a ton of libraries, display manager code, and interacts with other apps on your desktop all the time through X11 event monitoring and other mechanisms.

Those are system libs; all of that is about the operating system and UI, which need to exist to use TBB. Plugins (Flash) are not part of the OS; they don't need to exist to use TBB.

comment:9 Changed 4 years ago by cypherpunks

Status: new → needs_revision

If we can touch up this patch a bit to also add a button in the Addons->Plugins UI such that users can enable plugins by clicking on that button (in addition to via the Torbutton settings)

Attached complete version. As an enhancement, it doesn't require any changes to the Torbutton code. There's no need to show a warning and ask the user anything when enabling plugin loading, if this ticket isn't counted as a bug. But TBB should ship with the plugin.disable preference set to true.

About the patch: it's still fragile in several places. First, it observes preference changes via PluginProvider, but this can't guarantee the plugins have already been loaded or unloaded. Instead, the code of nsPluginHost::Observe in nsPluginHost.cpp needs to be changed to generate a new notification for PluginProvider after the operations are complete. Second, it uses an observer on gListView. Installing AddonListeners for every plugin to catch onUninstalled would be a more correct approach. (During testing it failed to catch it correctly on the first show of gListView and required a lot of code, so I'm unsure about this approach.)
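The race described above can be sketched with a toy observer service; the topic names and the observer API here are simulated stand-ins, not the real nsIObserverService. A pref-change observer fires before the host has finished (un)loading, so it sees stale state; a notification emitted by the host after it finishes is a reliable signal for PluginProvider.

```javascript
// Toy observer service (a stand-in, not the real nsIObserverService) used
// to illustrate the ordering problem: a pref-change observer fires before
// the host finishes (un)loading plugins, while a notification emitted by
// the host *after* the work completes is a reliable signal.
const observers = new Map();
function observe(topic, cb) {
  const list = observers.get(topic) || [];
  list.push(cb);
  observers.set(topic, list);
}
function notify(topic, data) {
  (observers.get(topic) || []).forEach((cb) => cb(data));
}

const host = {
  pluginsLoaded: false,
  setDisabled(disabled) {
    notify("pref-changed", "plugin.disable");      // too early: state not settled yet
    this.pluginsLoaded = !disabled;                // the actual (un)load work
    notify("plugins-updated", this.pluginsLoaded); // fired after it completes
  },
};

const log = [];
observe("pref-changed", () => log.push(["pref-changed", host.pluginsLoaded]));
observe("plugins-updated", (loaded) => log.push(["plugins-updated", loaded]));
host.setDisabled(false); // the pref observer records stale state; the host's own notification does not
```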

comment:10 Changed 4 years ago by cypherpunks

It also still needs to support translated strings. Right now all the button text is hardcoded.

comment:11 Changed 4 years ago by cypherpunks

The first issue is addressed with another patch that adds a notification for PluginProvider (with a patch to the core code).

comment:12 Changed 4 years ago by mikeperry

Thanks for making forward progress on this!

As far as localization, the easiest thing is probably to use DTD/properties elements from either Torbutton or Tor Launcher (we can create new strings).

I would also like to keep popping up the current confirmation dialog (which uses the xpcom-category-entry-added observer notification). Is that still emitted, and can we still cancel the load from there? If not, we should find some other way to provide the same notification and confirmation behavior.
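The confirm-and-cancel flow being asked about can be sketched with a simulated cancelable notification; the real flow uses the "xpcom-category-entry-added" XPCOM observer topic, and the function and observer names here are made up for illustration. Each observer gets a chance to veto the plugin entry before it is loaded.

```javascript
// Simulated cancelable notification (a stand-in for the real
// "xpcom-category-entry-added" XPCOM machinery): every registered observer
// may veto the plugin entry before it is loaded.
function addPluginEntry(name, observers) {
  const event = {
    name,
    cancelled: false,
    cancel() { this.cancelled = true; },
  };
  observers.forEach((cb) => cb(event)); // give each observer a chance to veto
  return event.cancelled ? null : { name, loaded: true };
}

// Stand-in for the confirmation dialog: cancel anything the user
// has not approved.
const approved = new Set(["Flash"]);
const confirmObserver = (event) => {
  if (!approved.has(event.name)) event.cancel();
};

const flash = addPluginEntry("Flash", [confirmObserver]);      // loads
const other = addPluginEntry("EvilPlugin", [confirmObserver]); // vetoed, null
```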

comment:13 Changed 4 years ago by cypherpunks

I would also like to keep popping up the current confirmation dialog (which uses the xpcom-category-entry-added observer notification). Is that still emitted, and can we still cancel the load from there? If not, we should find some other way to provide the same notification and confirmation behavior.

If the user clicks to enable one of the plugins from the list that was loaded after this new button, the pop-up should still occur. This patch doesn't change those parts; it adds a layer before the existing confirmations. Do you want to change Torbutton so it asks twice, or so that it enables Flash after the user clicks to find all plugins by changing plugin.disable? (This plugin disabling seems to cause confusion with the per-plugin attributes from getPluginTags, but they are different and about different things.)

comment:14 Changed 4 years ago by mikeperry

Ok, I think this may be acceptable without an additional pop-up then. I will have to try it out though when I get the chance.

comment:15 Changed 4 years ago by cypherpunks

Changed the patch for extensions.js to address the second issue. Now it uses only the AddonManager to interact with other code.

comment:16 Changed 4 years ago by cypherpunks

Except it can't handle the case of non-existent plugins.

comment:17 Changed 4 years ago by gk

Keywords: tbb-testcase added

comment:18 Changed 4 years ago by cypherpunks

This actually needs a patch for Torbutton too. torbutton_toggle_plugins() should toggle the plugin.disable pref.
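A minimal sketch of what that toggle would look like; the prefs map here is a stand-in for the real preferences service, and the actual torbutton_toggle_plugins lives in Torbutton's chrome code. Enabling plugins clears plugin.disable, and disabling sets it.

```javascript
// Simulated prefs map (stand-in for the real preferences service).
// Enabling plugins clears plugin.disable; disabling sets it.
// Returns the new value of the pref.
const prefs = new Map([["plugin.disable", true]]);

function torbutton_toggle_plugins(enable) {
  prefs.set("plugin.disable", !enable);
  return prefs.get("plugin.disable");
}

const nowDisabled = torbutton_toggle_plugins(true); // false: plugins enabled
```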

comment:19 Changed 4 years ago by cypherpunks

Nobody cares.

Changed 4 years ago by cypherpunks

Attachment: disable_plugin_button.diff added

Complete fix of UI with button

Changed 4 years ago by cypherpunks

Another Notify for PluginProvider

Changed 4 years ago by cypherpunks

Complete version for test purpose without rebuilding browser

Changed 4 years ago by cypherpunks

Attachment: proper_intercat_by_ui.diff added

Proper interface for extensions.js

comment:20 Changed 4 years ago by cypherpunks

Nobody helped me; they just ignore me and keep using OFTC. Good luck.

comment:21 Changed 4 years ago by mikeperry

Keywords: MikePerry201401R added
Status: needs_revision → needs_review

Well, we had patches here a minute ago. Seems they disappeared. Somehow...

I guess I'll just set this ticket to needs_review and see if they happen to turn up before I merge stuff before the next release. Otherwise, this seems like a good browser or extdev interview question.

Might need a new ticket though. This one is kind of filled with junk.

comment:22 Changed 4 years ago by erinn

skruffy was keeping them safe with bobnomnom's win98 license. I've attached them after he graciously gave me the copies.

Changed 4 years ago by erinn

Changed 4 years ago by erinn

Changed 4 years ago by erinn

comment:23 Changed 4 years ago by erinn

Updated again with all the correct patches.

comment:24 Changed 4 years ago by guranna2

Please also see new tickets from another "troll" #10885, #10772, #10773

comment:25 Changed 4 years ago by cypherpunks

I believe many of these problems are ultimately the result of bad software engineering. There is a reason you have a Requirements Document and a Design Document; it's bad news when you combine the two into one. The Requirements Document specifies WHAT; a Design Document specifies HOW.

By putting them together you lock your implementation, and make it difficult to determine what is a functional defect. For instance, if you had a Requirements document, this discussion would not be necessary.

comment:26 Changed 4 years ago by mikeperry

Keywords: MikePerry201402R added; MikePerry201401R removed

comment:27 in reply to:  25 ; Changed 4 years ago by mikeperry

Replying to cypherpunks:

I believe many of these problems are ultimately the result of bad software engineering. There is a reason you have a Requirements Document, and a Design Document. Bad News when combining the two into one. The Requirements document specifies WHAT, a Design Document specifies HOW.

By putting them together you lock your implementation, and make it difficult to determine what is a functional defect. For instance, if you had a Requirements document, this discussion would not be necessary.

Oh yeah, here's a good idea: let's clutter this bug up some more by going meta and arguing about the design document and how many different pieces of paper we should have filed with the appropriate standards bodies in triplicate...

"You have just leveled up. You are now a Level 23 Bureaucrat. Congratulations on your studious attention to rigid processes!" ;).

In all seriousness though, I actually don't believe your claims to be the case at all. If these documents were separate (really they already nearly are - the requirements and implementation already are in separate sections), we *still* would not have had a requirement that plugins not have been loaded into the address space, only that plugins must be audited and restricted such that they can meet our privacy and security requirements, or that they be prevented from running by default if they cannot. Just as we specify now.

So no, having two pieces of paper instead of just one would not have saved us from this argument. This is an argument about a highly nuanced implementation detail, and if that detail mattered enough for us to write a patch for it.

Thankfully, bobnomnom saved us from endless argument by just writing a patch. Let's not drag it meta-meta now, if we can avoid that, please.

Bob - I will review your patch as soon as possible. At a glance, it looks like we have to localize it still though. I might be able to do that myself, but it will mean it may be delayed a little more. Please be patient.

comment:28 in reply to:  27 ; Changed 4 years ago by cypherpunks

Type: enhancement → defect

Replying to mikeperry:

Replying to cypherpunks:

In all seriousness though, I actually don't believe your claims to be the case at all. If these documents were separate (really they already nearly are - the requirements and implementation already are in separate sections), we *still* would not have had a requirement that plugins not have been loaded into the address space, ...

Yes, this is correct. A discussion would still have happened. But the discussion at that point is clearer: should we add this requirement, and how shall we word it? Once that is done, this automatically becomes a functional defect--no further discussion necessary.

So technically there are potentially three defects here--a defect in the requirements, once fixed, triggers a defect in the design, and then in the code. Potentially three defects. How does your design cover the new requirement/vulnerability class? Does this trigger a re-design of something?

And then you can track these things. You can track that you had either:

  1. A defect in the requirements.
  2. A defect in the design.
  3. A defect in the code.

Lower number means harder to fix. Higher number, easier.

This is a defect in the requirements, because you did not have a requirement that restricted this class of vulnerabilities, yet these are realistic vulnerabilities.

So no, having two pieces of paper instead of just one would not have saved us from this argument.

By putting them in the same document, you make tracking changes in that document more complicated. The documents should be revisioned and versioned--and in source control. It should be clear who made what change. This gets complicated if you have diagrams, formatting, and so on in the documents. People use special software just for this--but you could get by with a text file, or simply multiple versions in the file name, with text annotations of changes in the documents and in the commit summaries.

Once that is done, a lot of things get easier. You can read a slew of papers on this stuff. For the most part, you will benefit in your projects from this approach because it is possible to completely spec everything out beforehand--you are not like a startup needing "Agile" crap, because you have 10 clients who are all marketing people that don't know what they want, and keep calling you each day with requirements changes. We know exactly what we want Tor Browser to do, if we think hard enough. How it does it is design. Requirements changes should only be necessary in cases like this, where someone finds need for a new requirement based on an increased understanding of vulnerability surface, or because some new plugin or change to Firefox comes out, or HTML5 comes out, or stuff like that. Finally, new functionality. Then you know where these requirements are coming from, and can document that. You don't have to use some expensive software to do this, although people do. More advanced commercial bug tracking software would let you enter this bug in a fine-grained way, give you daily graphs of bugs/requirements and bugs/components. Some of them will even generate code for you! You don't need that for this project. You just need your own way of keeping track of this stuff, that is relatively easy and transparent.

With each ticket, you ask: is this an enhancement or a defect? You check the requirements. If you cannot determine it by checking the requirements, and it does not add new functionality, then there is a defect in the requirements. This does not add new functionality, and the requirements are silent on this, so it is a defect. In particular, the defect is that the security requirements are overly vague, and hence the designer did not catch this. "Is this a bug in requirements, design, or code/implementation?"

You may have other projects that are more "Agile" like atlas, but Tor Browser is not one of them. This is all a bit of an art form, and some people are great coders but suck at this. Yet when you look at the figures, this stuff saves exponential time==$$. Know your limitations and hire someone that knows what they are doing.

You then get the benefit of determining which designs cover which requirements, which components, and which code covers which requirements, and eventually you can track bugs more easily and determine responsible developers.

Finally, you can get community feedback more easily on requirements than you can on a patch.

It was calculated in the 1970's and 80's that every hour spent on requirements equals 100 hours less coding, and every hour on design, 10 hours less coding. (or something like that). The factors are usually the same for time fixing defects--so are you sure this fixes it?

So in one sentence (or so), please explain what this new requirement is, and add it to the requirements document (once you fix that up). Thanks.

[This puts me +100000XP which bumps me up to level 25 Bureaucrat btw. I am thinking of multi-classing.]

Last edited 4 years ago by cypherpunks

comment:29 in reply to:  28 ; Changed 4 years ago by mikeperry

Replying to cypherpunks:

Replying to mikeperry:

Replying to cypherpunks:

In all seriousness though, I actually don't believe your claims to be the case at all. If these documents were separate (really they already nearly are - the requirements and implementation already are in separate sections), we *still* would not have had a requirement that plugins not have been loaded into the address space, ...

Yes, this is correct. A discussion would still have happened. But the discussion at that point is clearer: should we add this requirement, and how shall we word it? Once that is done, this automatically becomes a functional defect--no further discussion necessary.

Uh, that's exactly why I was so reluctant to do anything about this. Because this issue does *not* violate any of our design requirements.

So technically there are potentially three defects here--a defect in the requirements, once fixed, triggers a defect in the design, and then in the code. Potentially three defects. How does your design cover the new requirement/vulnerability class? Does this trigger a re-design of something?

And then you can track these things. You can track that you had either:

  1. A defect in the requirements.
  2. A defect in the design.
  3. A defect in the code.

Lower number means harder to fix. Higher number, easier.

This is a defect in the requirements, because you did not have a requirement that restricted this class of vulnerabilities, yet these are realistic vulnerabilities.

Wow, you are indeed a rigid process wonk. I suppose RUP is your bible? Waterfall is the One True and Only Way? Do you like to sing the IBM company song before you get into bed? ;)

I maintain this is not a design defect. Indeed, the requirements section is deliberately generalized, because it is meant to specify the minimal set of requirements a browser must adhere to in Private Browsing Mode such that we would consider that browser's Private Browsing Mode suitable for Tor. In fact, the requirements state exactly this.

So no, having two pieces of paper instead of just one would not have saved us from this argument.

With each ticket, you ask: is this an enhancement or a defect? You check the requirements. If you cannot determine it by checking the requirements, and it does not add new functionality, then there is a defect in the requirements. This does not add new functionality, and the requirements are silent on this, so it is a defect. In particular, the defect is that the security requirements are overly vague, and hence the designer did not catch this. "Is this a bug in requirements, design, or code/implementation?"

This is not a bug. This is an enhancement in the implementation, if that. I have not yet heard a solid argument from *anyone* why this is a violation of our design requirements, let alone a bug.

You may have other projects that are more "Agile" like atlas, but Tor Browser is not one of them. This is all a bit of an art form, and some people are great coders but suck at this. Yet when you look at the figures, this stuff saves exponential time==$$. Know your limitations and hire someone that knows what they are doing.

You then get the benefit of determining which designs cover which requirements, which components, and which code covers which requirements, and eventually you can track bugs more easily and determine responsible developers.

Finally, you can get community feedback more easily on requirements than you can on a patch.

It was calculated in the 1970's and 80's that every hour spent on requirements equals 100 hours less coding, and every hour on design, 10 hours less coding. (or something like that). The factors are usually the same for time fixing defects--so are you sure this fixes it?

It was also realized in the late 90s and early 2000s that designing too much up front and expecting your foresight to be perfect can cause costly delays and lost release cycles. But again, we're not arguing about a design defect here. The requirements we specified are still fully valid.

So in one sentence (or so), please explain what this new requirement is, and add it to the requirements document (once you fix that up). Thanks.

I am not fixing the design document. Please see https://www.torproject.org/projects/torbrowser/design/#DesignRequirements. This issue violates none of those. In fact, it doesn't even violate our philosophical guidelines.

[This puts me +100000XP which bumps me up to level 25 Bureaucrat btw. I am thinking of multi-classing.]

You mean you're not already a full time bureaucrat? ;)

I think we're actually on the same page here, more or less. You just want two pages, and you think this is a design defect. I think it is not a design defect, and therefore did not warrant Tor Project development time.

Again, always happy to accept patches, though.

comment:30 Changed 4 years ago by mikeperry

Type: defect → enhancement

comment:31 in reply to:  29 Changed 4 years ago by cypherpunks

Replying to mikeperry:


Hostility, defensiveness, and anger. Great. The proof is in the pudding. And Tor Browser is a POS. Thanks! Please submit your resignation on behalf of the Tor community. Seriously.

comment:32 Changed 4 years ago by mikeperry

I won't apologize for doing my job, and doing it properly. Nor will I tolerate veiled insults on my engineering experience or our development practices, especially those designed to generate pointless busywork for the whole team.

comment:33 Changed 4 years ago by erinn

Keywords: tbb-firefox-patch added

comment:34 Changed 4 years ago by erinn

Component: Firefox Patch Issues → Tor Browser

comment:35 Changed 3 years ago by mikeperry

Keywords: TorBrowserTeam201502R added; MikePerry201402R removed

Damn, a troll managed to derail this bug for a year. Hopefully these patches still apply.

comment:36 Changed 3 years ago by mikeperry

Keywords: MikePerry201502R added

comment:37 Changed 3 years ago by mikeperry

Resolution: fixed
Status: needs_review → closed

Ok, I merged https://trac.torproject.org/projects/tor/raw-attachment/ticket/10280/disable_plugin_button.2.diff, https://trac.torproject.org/projects/tor/raw-attachment/ticket/10280/notify_for_pluginprovider.2.diff and https://trac.torproject.org/projects/tor/raw-attachment/ticket/10280/proper_intercat_by_ui.2.diff into a frankenpatch. I put the DTD strings in torbutton.

There was a lot of overlap between those diffs, but I think I managed to sort out the latest version of each change, and it seems to work OK. Thanks bobnombom. This will appear in 4.5a4.

comment:38 Changed 3 years ago by cypherpunks

Last edited 3 years ago by cypherpunks

comment:39 Changed 3 years ago by cypherpunks

Last edited 3 years ago by cypherpunks

comment:40 Changed 3 years ago by cypherpunks

Resolution: fixed
Status: closed → reopened
Last edited 3 years ago by cypherpunks

comment:41 Changed 3 years ago by cypherpunks

Last edited 3 years ago by cypherpunks

comment:42 Changed 3 years ago by cypherpunks

Last edited 3 years ago by cypherpunks

comment:43 Changed 3 years ago by cypherpunks

Last edited 3 years ago by cypherpunks

Changed 3 years ago by cypherpunks

Complete version, not tested.

Changed 3 years ago by cypherpunks

Complete version, not tested.

comment:44 Changed 3 years ago by cypherpunks

Resolution: invalid
Status: reopened → closed

comment:45 Changed 3 years ago by mikeperry

Bob/skruffy - I am not sure what you mean by this being broken. It seems to work just fine for me for Linux+Gnash. Can you describe the problem? Or provide new patches?

Changed 3 years ago by disgleirio

Attachment: 10280.patch added

Proper UI for hidden pref, with support for Tor Browser security

comment:46 in reply to:  45 Changed 3 years ago by disgleirio

Bob/skruffy - I am not sure what you mean by this being broken. It seems to work just fine for me for Linux+Gnash. Can you describe the problem? Or provide new patches?

I'm not Bob/skruffy as you can guess, attached proper patches anyway.
Review and tests needed.

comment:47 Changed 3 years ago by mikeperry

Keywords: TorBrowserTeam201503R tbb-4.5-alpha MikePerry201503R added; TorBrowserTeam201502R MikePerry201502R removed
Resolution: invalid
Status: closed → reopened

Ok great, thanks. I'll see if I have time to test this, but it may not make it in time for 4.5a5.

comment:48 Changed 3 years ago by mikeperry

Resolution: fixed
Status: reopened → closed

I tested this in a new build on Linux, and it seems to work. Gnash is absent from my /proc/pid/maps for Firefox before I click the button, and then appears in /proc/pid/maps afterwards.

This is now merged for 4.5a5. (Btw, I reset the commit info because the email address you gave is not valid).
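The manual /proc/pid/maps check described above can be scripted. Here is a small Python sketch of that verification; it is Linux-only, and the helper name `library_loaded` is illustrative, not something from the ticket or the patches:

```python
# Check whether a shared library (e.g. Gnash or Flash) is mapped into a
# process's address space by scanning /proc/<pid>/maps. This mirrors the
# manual check in the comment above: the library should be absent before
# clicking the enable button and present afterwards.

def library_loaded(pid, name):
    """Return True if `name` appears in any line of `pid`'s memory maps.

    `pid` may be a numeric PID or the string "self" for the current
    process. `name` is matched case-insensitively as a substring, e.g.
    "gnash" or "flash".
    """
    try:
        with open("/proc/{}/maps".format(pid)) as maps:
            return any(name.lower() in line.lower() for line in maps)
    except OSError:
        # Process is gone, or there is no Linux-style /proc filesystem.
        return False

if __name__ == "__main__":
    # For the ticket's scenario you would pass a Firefox PID, e.g.:
    #   library_loaded(firefox_pid, "gnash")
    print(library_loaded("self", "gnash"))
```

Nothing here depends on Firefox itself; any PID can be inspected the same way, which makes the before/after comparison easy to automate.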

comment:49 Changed 20 months ago by arthuredelstein

Cc: arthuredelstein added
Resolution: fixed
Severity: Normal
Status: closed → reopened

I was examining this patch to see if we need to upstream any of it. But now I wonder if this patch is no longer needed in Tor Browser.

Flash and all other plugins (with a couple of exceptions) appear (in my manual experiments) to be correctly excluded from Firefox if the pref "plugin.disable" is true. (I admit the code in dom/plugins/base/* is complex enough that it is hard to be absolutely sure there is no path where a plugin might be loaded.) The plugins are not listed in the Plugins section of about:addons and are not loaded. The only exceptions I see seem to be "built-in" add-ons such as "OpenH264 Video Codec provided by Cisco Systems, Inc." and "Widevine Content Decryption Module provided by Google Inc." These plugins are excluded from Tor Browser by disabling GMP and EME, respectively.

I might be missing something here -- but is there any advantage in including this patch any more? Perhaps the safest thing would be simply to hide the "Plugins" section in about:addons altogether, rather than giving users the option to "Enable" them.
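For reference, the preferences discussed above can be expressed as a user.js fragment. This is a sketch based on the comment, not a copy of Tor Browser's actual configuration; the GMP/EME pref names have varied between Firefox versions, so verify them against the release in question:

```js
// user.js sketch (illustrative, not from the ticket): keep NPAPI plugins
// and the built-in binary components out of the browser process.

// Prevent NPAPI plugins (Flash, Gnash, ...) from being loaded at all.
user_pref("plugin.disable", true);

// Disable EME, which would otherwise pull in the Widevine CDM.
user_pref("media.eme.enabled", false);

// Disable the Gecko Media Plugin provider (OpenH264, etc.).
user_pref("media.gmp-provider.enabled", false);
```

With all three set, the Plugins section of about:addons should be effectively empty, which is what makes hiding that section plausible.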

comment:50 Changed 20 months ago by arthuredelstein

Status: reopened → needs_information

comment:51 in reply to:  49 Changed 20 months ago by arthuredelstein

Resolution: fixed
Status: needs_information → closed

Replying to arthuredelstein:

I was examining this patch to see if we need to upstream any of it. But now I wonder if this patch is no longer needed in Tor Browser.

We can discuss this question further in #19508. Setting this ticket back to closed (fixed).
