Security idea: rate-limiting requests

Thrawn

Security idea: rate-limiting requests

Post by Thrawn » 2016-08-15, 00:47

Many of the vulnerabilities in SSL/TLS require the attacker to send a large number of requests to the target site. BEAST, Lucky 13, and attacks breaking RC4, for example, require many millions of requests. Even far more efficient attacks, like CRIME and POODLE, still require hundreds of requests per byte, and if simple mitigations are applied, CRIME could easily take thousands.

So, as a proactive measure, why not apply some rate-limiting when a page is sending an unreasonable number of requests to a site? Even busy AJAX shouldn't need more than, say, one request per second. If traffic from a page exceeds a threshold - say, 100 requests to the same third-party site within the same minute, or more than 10000 first-party requests within the same minute - then the browser could start applying a 10-second delay between each request (exact parameters subject to discussion, testing, etc., of course).
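To make this concrete, here is a minimal sketch (in C++, since that's what the browser is written in) of the kind of bookkeeping I have in mind: one sliding window per (page, target site) pair, with a delay applied once the threshold is crossed. The class and method names are purely illustrative, not existing Pale Moon code, and the thresholds are just the example numbers above:

```cpp
#include <chrono>
#include <cstddef>
#include <deque>
#include <map>
#include <string>
#include <utility>

// Illustrative sketch only -- RequestRateTracker and DelayForRequest
// are hypothetical names, not actual Pale Moon code.
class RequestRateTracker {
public:
  // Returns the delay (in seconds) to impose before sending a request
  // from pageOrigin to targetHost; 0 means send immediately.
  int DelayForRequest(const std::string& pageOrigin,
                      const std::string& targetHost,
                      bool isThirdParty) {
    const auto now = std::chrono::steady_clock::now();
    auto& window = mWindows[{pageOrigin, targetHost}];

    // Slide the window: forget requests older than one minute.
    while (!window.empty() &&
           now - window.front() > std::chrono::minutes(1)) {
      window.pop_front();
    }
    window.push_back(now);

    // Example thresholds from above: 100 third-party or 10000
    // first-party requests per minute before throttling starts.
    const std::size_t limit = isThirdParty ? 100 : 10000;
    return window.size() > limit ? 10 : 0;  // 10-second delay once exceeded
  }

private:
  std::map<std::pair<std::string, std::string>,
           std::deque<std::chrono::steady_clock::time_point>> mWindows;
};
```

A real implementation would also need to evict idle windows and decide what counts as "the same site" (full host vs. base domain), but the bookkeeping itself is cheap.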

As an alternative to silent rate-limiting, it might also be reasonable to give the user a warning to the effect that the site is sending a large number of requests and may be slowing things down. This could also be relevant to users on low-bandwidth connections, who would get a way to leave sites that are too busy.

New Tobin Paradigm

Re: Security idea: rate-limiting requests

Post by New Tobin Paradigm » 2016-08-15, 00:53

It is not the job of a web client to protect a web server.

Thrawn

Re: Security idea: rate-limiting requests

Post by Thrawn » 2016-08-15, 01:37

Er... in the listed scenarios (BEAST, CRIME, POODLE, or even just the low-bandwidth case), it's not the web server being targeted; it's the end user. Rate-limiting in the browser would simply be protecting the browser owner.

At the moment, whenever there's a TLS zero-day, everyone pretty much has to twiddle their thumbs and wait for someone else to protect them. It shouldn't have to be that way. Many of these attacks share a common signature: a large number of requests sent to the target site. By imposing a small amount of throttling on requests - not blocking anything, but delaying requests that come in unreasonable numbers - we could stop these exploits in their tracks and give Pale Moon safer encryption than any other browser.

Moonchild
Pale Moon guru

Re: Security idea: rate-limiting requests

Post by Moonchild » 2016-08-15, 10:16

BEAST, CRIME and POODLE are vulnerabilities present on insecure servers. Although the end-user is indirectly affected, it is up to the server operator (and, in turn, the service provider where this applies) to have their security in order and respond to known attacks. Rate limiting should occur on the server side; that is simply proper server administration to begin with (which is why iptables has connection tracking modules, etc.).
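As a rough illustration of what I mean by server-side rate limiting (the thresholds here are examples only, not a recommendation):

```
# Per-client: reject any source address holding more than 50
# concurrent connections to HTTPS.
iptables -A INPUT -p tcp --syn --dport 443 \
  -m connlimit --connlimit-above 50 -j REJECT

# Global budget for new HTTPS connections: 60 per minute with a
# burst of 120; anything beyond that is dropped.
iptables -A INPUT -p tcp --syn --dport 443 \
  -m limit --limit 60/minute --limit-burst 120 -j ACCEPT
iptables -A INPUT -p tcp --syn --dport 443 -j DROP
```

Note that the limit match above is a global budget; per-client request windows would typically use the recent or hashlimit modules instead.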

It would be a nice idea if it were the task of the web browser to provide this kind of network security (it is not), but rate limiting goes directly against the premise of a web client, which should always focus on delivering web content with minimal delay. There is no easy way to determine whether requests are legitimate, and I don't see a "zero day TLS vulnerability" being a major threat warranting the implementation of such a complex mechanism.
"Sometimes, the best way to get what you want is to be a good person." -- Louis Rossmann
"Seek wisdom, not knowledge. Knowledge is of the past; wisdom is of the future." -- Native American proverb
"Linux makes everything difficult." -- Lyceus Anubite

Thrawn

Re: Security idea: rate-limiting requests

Post by Thrawn » 2016-08-15, 23:22

Moonchild wrote:although the end-user is indirectly affected, it is up to the server operator
This is the part that concerns me. If someone directs an attack like this at my bank, for example, it's my account and my money on the line. I can try to say, "Well, it's the bank's responsibility to fix it", but if hackers successfully impersonate me and empty my account this way, I don't expect my bank to helpfully refund the money. They aren't actually going to accept ultimate accountability.
Moonchild wrote:Rate limiting should occur on the server side; that is simply proper server administration to begin with (which is why iptables has connection tracking modules, etc.).
For levels that would actually constitute a DoS, yes, but that's a different order of magnitude from what we're talking about.

Also, server-side rate limiting would do nothing against CRIME, for example, which examines the request payload. Even if the server rejects the requests due to rate-limiting, the damage is already done.
Moonchild wrote:It would be a nice idea if it were the task of the web browser to provide this kind of network security (it is not)
But all the major browsers do attempt to mitigate known threats to TLS: record-splitting to mitigate BEAST, TLS_FALLBACK_SCSV to prevent downgrade attacks, disabling TLS compression to avoid CRIME. Those were mitigations for specific threats, so why not a general mitigation that could counter as-yet-unknown threats?
Moonchild wrote:rate limiting goes directly against the premise of a web client, which should always focus on delivering web content with minimal delay.
But not all the power over what gets rendered has to go to the website operator. Surely the browser's owner should get first priority in making decisions. And when something appears to be operating outside normal parameters - e.g. when a script is taking too long to execute - it's normal and acceptable to give a warning and the option to stop it. I'm just proposing that massive numbers of similar requests are likewise potentially "outside normal parameters".
Moonchild wrote:There is no easy way to determine whether requests are legitimate
I've proposed a criterion: unreasonably high numbers of requests being sent to the same site within a short time window. The value of "unreasonable" is subject to discussion and fine-tuning; I would suggest that it be conservative (i.e. allow quite a lot of requests before stepping in).
Moonchild wrote:I don't see a "zero day TLS vulnerability" being a major threat warranting the implementation of such a complex mechanism.
You're serious? Speaking as a browser developer, you're not very concerned about TLS zero-days? HEIST showed up just last week...

Moonchild
Pale Moon guru

Re: Security idea: rate-limiting requests

Post by Moonchild » 2016-08-17, 09:15

I don't think you understand what you're asking, or where the responsibility for these things lies.
"Sometimes, the best way to get what you want is to be a good person." -- Louis Rossmann
"Seek wisdom, not knowledge. Knowledge is of the past; wisdom is of the future." -- Native American proverb
"Linux makes everything difficult." -- Lyceus Anubite

Thrawn

Re: Security idea: rate-limiting requests

Post by Thrawn » 2016-08-17, 23:56

Moonchild wrote:I don't think you understand what you're asking
Sliding windows tracking the number of requests from each domain to each other domain within a time frame, with throttling if the count gets too high.
Moonchild wrote:or where the responsibility for these things lies.
It's an opportunity (to be safer against future threats than all other browsers), rather than a responsibility.

Thrawn

Re: Security idea: rate-limiting requests

Post by Thrawn » 2016-08-31, 05:13

And another threat pops up that involves malicious third-party JavaScript sending huge amounts of traffic to the sensitive site in order to break its encryption...

viewtopic.php?f=13&t=12955

New Tobin Paradigm

Re: Security idea: rate-limiting requests

Post by New Tobin Paradigm » 2016-08-31, 05:56

RESOLVED INVALID

Moonchild
Pale Moon guru

Re: Security idea: rate-limiting requests

Post by Moonchild » 2016-08-31, 09:02

Thrawn wrote:
Moonchild wrote:I don't think you understand what you're asking
Sliding windows tracking the number of requests from each domain to each other domain within a time frame, with throttling if the count gets too high.
No, you obviously don't understand what you're asking. Square peg, round hole.
Thrawn wrote:
Moonchild wrote:or where the responsibility for these things lies.
It's an opportunity (to be safer against future threats than all other browsers), rather than a responsibility.
It is not an opportunity if it is completely outside the design scope of a web browser; it is not the responsibility of a web browser to do this kind of traffic shaping.
"Sometimes, the best way to get what you want is to be a good person." -- Louis Rossmann
"Seek wisdom, not knowledge. Knowledge is of the past; wisdom is of the future." -- Native American proverb
"Linux makes everything difficult." -- Lyceus Anubite
