Opened 9 years ago

Closed 7 years ago

#2199 closed defect (fixed)

rules with [^/@:] don't catch all traffic

Reported by: dkg
Owned by: dtauerbach
Priority: High
Milestone:
Component: HTTPS Everywhere/EFF-HTTPS Everywhere
Version:
Severity:
Keywords: chromium
Cc: mikeperry, aaronsw
Actual Points:
Parent ID:
Points:
Reviewer:
Sponsor:

Description

Torproject.xml currently has the following:

 <rule from="^http://([^/:@]*)\.torproject\.org/" to="https://$1.torproject.org/"/>

but an attacker trying to get you to send (for example) cookies in the clear can include a username part in an img src to coax the browser into making a cleartext connection:

<html>
<head>
<title>a test</title>
</head>
<body>
<!-- this first one gets loaded in the clear -->
<img src="http://www@www.torproject.org/images/icon-default.jpg" />
<!-- https-everywhere intercepts this one and sends it out over https -->
<img src="http://www.torproject.org/images/icon-default.jpg" />
</body>
</html>

This seems especially bad for sites with cookies to protect which don't have the Secure flag set properly.
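
As a minimal illustration (a standalone TypeScript sketch, not code from the extension), the anchored pattern simply fails to match once a userinfo component is present, so no rewrite happens:

// The character class [^/:@] can never match the '@' in the userinfo
// part, and the pattern is anchored at the start of the URL, so the
// rule does not fire and the request goes out as cleartext HTTP.
const rule = /^http:\/\/([^/:@]*)\.torproject\.org\//;

console.log(rule.test("http://www.torproject.org/images/icon-default.jpg"));
// -> true: the rule matches and the request is rewritten to HTTPS
console.log(rule.test("http://www@www.torproject.org/images/icon-default.jpg"));
// -> false: the rule misses and the request stays on HTTP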

Child Tickets

Change History (17)

comment:1 Changed 9 years ago by dkg

Bah. "cookies to project" should have been "cookies to protect", of course.

comment:2 Changed 9 years ago by pde

Are you sure this works in practice? We thought about it early on but Firefox seems to ask the user for confirmation before it follows URIs that contain username/password portions.

comment:3 Changed 9 years ago by dkg

Yes, I'm sure. Visiting the URLs directly will trigger Firefox's confirmation prompt, but I'm more concerned about the embedded img srcs, which don't seem to be prompted for.

I've placed the following code online:

<html>
<head>
<title>a test</title>
</head>
<body>
<!-- this first one gets loaded in the clear -->
<img src="http://www@duckduckgo.com/nduck.v104.png" />
<!-- https-everywhere intercepts this one and sends it out over https -->
<img src="http://duckduckgo.com/nduck.v104.png" />
</body>
</html>

If you have Firebug installed, open up the Net console and visit the example. (The Net console might close when you switch domains; just re-open it and refresh the page with Ctrl-Shift-R.)

You should see one request to the duckduckgo servers in the clear (HTTP) and another one encrypted (HTTPS).

tcpdump + Wireshark confirm this behavior for me on a Debian squeeze system with HTTPS Everywhere 0.9.0 installed and the DuckDuckGo rule enabled.

comment:4 Changed 9 years ago by pde

Priority: normal → minor
Status: new → accepted

comment:5 Changed 9 years ago by pde

Priority: minor → normal

Ok, that's not minor.

comment:6 Changed 9 years ago by pde

Priority: normal → major

So what can we do about this? Here are some ideas:

  1. Ask Mozilla to raise the warning prompt for images and other subsidiary requests.
  2. Take the replace [/:@] with [/]. I think that defeats dkg's attack. Ironically it would leave all the rules that DON'T start with a pattern vulnerable. We would need to add a pattern to the front of every (www\.)? rule to catch a username/password :(.
  3. Use Mozilla's built-in URI parsing to strip out username/password fields before we do URI rewriting (then add them back in, if we think they're ever legit?). A sketch of this approach follows the list.
  4. Per rransom's suggestion, move to something like agl's proposed chromium syntax: https://mail1.eff.org/pipermail/https-everywhere/2010-November/000545.html. There are several downsides to that.
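
To make option 3 concrete, here is a minimal TypeScript sketch. It assumes the standard WHATWG URL parser stands in for Mozilla's internal URI parsing, and the torRule object is a hypothetical in-memory stand-in for a parsed ruleset entry:

// Parse the URL up front and strip any username/password so that
// userinfo can no longer masquerade as part of the hostname when
// the ruleset regexes run against the canonical serialization.
function canonicalize(raw: string): string {
  const u = new URL(raw);
  u.username = "";
  u.password = "";
  return u.toString();
}

// Hypothetical in-memory form of the torproject.org ruleset entry.
const torRule = {
  from: /^http:\/\/([^/:@]*)\.torproject\.org\//,
  to: "https://$1.torproject.org/",
};

function rewrite(raw: string): string {
  return canonicalize(raw).replace(torRule.from, torRule.to);
}

console.log(rewrite("http://www@www.torproject.org/images/icon-default.jpg"));
// -> "https://www.torproject.org/images/icon-default.jpg"

Note that this sketch simply drops the credentials rather than re-attaching them after the rewrite; whether they should ever be added back is the open question in the parenthetical above.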

comment:7 Changed 9 years ago by pde

Ouch. I meant: replace the exclude /:@ pattern with a pattern that only excludes /.

comment:8 Changed 9 years ago by dkg

In Trac wiki syntax, you can get literal strings to display properly by wrapping them in backtick characters. So:

replace `[^/:@]` with `[^/]`.

becomes:

replace [^/:@] with [^/].

I really hope we don't go with pde's option 1. I don't think that Mozilla's warning prompt in these cases is particularly useful or intelligible.

comment:9 Changed 9 years ago by rransom

Replying to pde:

  4. Per rransom's suggestion, move to something like agl's proposed chromium syntax: https://mail1.eff.org/pipermail/https-everywhere/2010-November/000545.html. There are several downsides to that.

The only downside is that you will need to convert all of the existing rulesets to the new format. This time, add an XML namespace URI and/or some other version indicator.

But the real reason this is necessary is (quoting agl's message):

  Serialising and re-parsing URLs is very scary from a security point of view. It would be greatly preferable to handle URLs in their processed form.

If we don't start operating on parsed URLs, we can only expect more exploitable bugs like this one in the future.

Since Firefox extensions can use arbitrary JavaScript code to munge URL requests before they are acted on, HTTPS Everywhere rules can easily continue to support matching URL components against regular expressions and inserting captured strings into any component of the new URL.
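
As a rough sketch of that component-based approach (the HostRule shape below is hypothetical, not agl's actual proposed format, and the WHATWG URL parser is used for illustration): the rule matches the already-parsed hostname, so there is no serialized string for userinfo to hide in:

// Hypothetical component-level rule: the pattern applies to the
// parsed hostname only, never to a re-serialized URL string.
interface HostRule {
  hostPattern: RegExp;
  toHost: string; // may reference capture groups from hostPattern
}

function applyRule(raw: string, rule: HostRule): string | null {
  const u = new URL(raw);
  if (u.username !== "" || u.password !== "") {
    return null; // refuse URLs carrying userinfo outright
  }
  if (!rule.hostPattern.test(u.hostname)) {
    return null;
  }
  u.protocol = "https:";
  u.hostname = u.hostname.replace(rule.hostPattern, rule.toHost);
  return u.toString();
}

console.log(applyRule("http://www.torproject.org/x.png", {
  hostPattern: /^([^.]+)\.torproject\.org$/,
  toHost: "$1.torproject.org",
}));
// -> "https://www.torproject.org/x.png"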

comment:10 in reply to:  9 Changed 9 years ago by pde

Replying to rransom:

  The only downside is that you will need to convert all of the existing rulesets to the new format. This time, add an XML namespace URI and/or some other version indicator.

Yes, and the fact that the Wikipedia and Google Search rulesets cannot be represented in agl's format without thousands of entries. Yes, we could do interfield regexps of some sort, but only at the expense of significant added complexity.

  But the real reason this is necessary is (quoting agl's message):

    Serialising and re-parsing URLs is very scary from a security point of view. It would be greatly preferable to handle URLs in their processed form.

  If we don't start operating on parsed URLs, we can only expect more exploitable bugs like this one in the future.

So my proposal #3 is a hybrid between these approaches; it relies on Mozilla to do some but not all of the URI parsing. Question: can we think of any other categories of parsing trouble that we might run into if we do #3?

comment:11 Changed 9 years ago by pde

I think we should implement solution 3. This is a blocker for 1.0, I think.

comment:12 Changed 8 years ago by pde

Status: accepted → needs_review

comment:13 Changed 8 years ago by pde

Resolution: fixed
Status: needs_review → closed

One further bug fixed in the fix. Calling this done.

comment:14 Changed 7 years ago by pde

Keywords: chromium added
Resolution: fixed deleted
Status: closed → reopened

I just realised that this bug is present in the Chrome/Chromium port :(

comment:15 Changed 7 years ago by pde

Cc: mikeperry aaronsw added

comment:16 Changed 7 years ago by pde

Owner: changed from pde to dtauerbach
Status: reopened → assigned

comment:17 Changed 7 years ago by pde

Resolution: fixed
Status: assigned → closed

This will be fixed in chrome-2012.5.1.
