This is a best-effort attempt at ensuring that the adverse impact of
reformatting the entire tree on comments would be minimal. I've used a
combination of strategies, including disabling formatting, some manual
formatting, and some changes to formatting to work around clang-format
limitations.
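For example, clang-format can be disabled around comments whose manual layout
should survive the reformat (the comment below is a made-up illustration, not
from the patch):

    // clang-format off
    /* Manually aligned table that clang-format would otherwise rewrap:
     *   state   | event | next
     *   --------+-------+--------
     *   Idle    | Open  | Loading
     */
    // clang-format on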
Differential Revision: https://phabricator.services.mozilla.com/D13073
--HG--
extra : moz-landing-system : lando
Before this change, the trusted URI schemes, based on a string whitelist, were:
https, file, resource, app, moz-extension, and wss.
This change removes "app" from the list (since we don't implement it)
and adds "about" (because we control how that content is delivered).
Adds a new TYPE_SPECULATIVE to nsIContentPolicy and uses it as the type for
speculative connection channels from the IO service. I believe I've added it to
all the in-tree content policies to make sure it behaves the same as TYPE_OTHER
used to.
The webextension test shows that the webextension proxy API sees speculative
lookups requested through the IO service.
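A hedged sketch of how an in-tree policy might treat the new type (the class
name and the elided parameters are illustrative, not a specific policy from
the patch):

    // Illustrative only: speculative connections get the same decision
    // that TYPE_OTHER loads used to get.
    NS_IMETHODIMP
    MyContentPolicy::ShouldLoad(uint32_t aContentType,
                                nsIURI* aContentLocation,
                                /* ...other arguments elided... */
                                int16_t* aDecision) {
      if (aContentType == nsIContentPolicy::TYPE_SPECULATIVE ||
          aContentType == nsIContentPolicy::TYPE_OTHER) {
        *aDecision = nsIContentPolicy::ACCEPT;
        return NS_OK;
      }
      // ...type-specific policy checks...
      *aDecision = nsIContentPolicy::ACCEPT;
      return NS_OK;
    }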
MozReview-Commit-ID: DQ4Kq0xdUOD
--HG--
extra : rebase_source : d9460fdac118bc68f0db79749a16f181b580f2e7
Websites that collect passwords but don't use HTTPS start showing scary
warnings from Firefox 51 onwards, and mixed content blocking has been
available even longer.
.onion sites without HTTPS support are affected as well, although their
traffic is encrypted and authenticated. This patch addresses this
shortcoming by making sure .onion sites are treated as potentially
trustworthy origins.
The secure context specification
(https://w3c.github.io/webappsec-secure-contexts/) is pretty much focused
on tying security and trustworthiness to the protocol over which domains
are accessed. However, it is not obvious why .onion sites should not be
treated as potentially trustworthy given:
"A potentially trustworthy origin is one which a user agent can
generally trust as delivering data securely.
This algorithms [sic] considers certain hosts, scheme, and origins as
potentially trustworthy, even though they might not be authenticated and
encrypted in the traditional sense."
(https://w3c.github.io/webappsec-secure-contexts/#is-origin-trustworthy)
We use step 8 in the algorithm to establish trustworthiness of .onion
sites by whitelisting them given the encrypted and authenticated nature
of their traffic.
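The whitelisting itself amounts to a host-suffix test along these lines (the
function name is hypothetical):

    // Hypothetical sketch: .onion hosts count as potentially trustworthy,
    // per step 8 of the secure-contexts algorithm.
    static bool IsOnionHost(const nsACString& aHost) {
      return StringEndsWith(aHost, NS_LITERAL_CSTRING(".onion"));
    }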
Most of the names passed to nsIStringBundle::{Get,Format}StringFromName
have one of the following two forms:
- a 16-bit C string literal, which is then converted to an 8-bit string in
order for the lookup to occur;
- an 8-bit C string literal converted to a 16-bit string, which is then
converted back to an 8-bit string in order for the lookup to occur.
This patch introduces and uses alternative methods that can take an 8-bit C
string literal, which requires changing some signatures in other methods and
functions. It replaces all C++ uses of the old methods.
The patch also changes the existing {Get,Format}StringFromName() methods so
they take an AUTF8String argument for the name instead of a wstring, because
that's nicer for JS code.
Even though there is a method for C++ code and a different one for JS code,
|binaryname| is used so that the existing method names can be used for the
common case in both languages.
The change reduces the number of NS_ConvertUTF8toUTF16 and
NS_ConvertUTF16toUTF8 conversions while running Speedometer v2 from ~270,000 to
~160,000. (Most of these conversions involved the string
"deprecatedReferrerDirective" in nsCSPParser.cpp.)
--HG--
extra : rebase_source : 3bee57a501035f76a81230d95186f8c3f460ff8e
Add a field to the HSTS cache which indicates the source of the HSTS
entry if known: the preload list, an organically seen header, or HSTS
priming; otherwise it is marked unknown. Also add telemetry to collect
the source when upgrading in NS_ShouldSecureUpgrade.
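A sketch of such a field (names illustrative):

    // Illustrative enum recording where an HSTS entry came from.
    enum SecurityPropertySource {
      SourceUnknown = 0,  // entry predates source tracking
      SourcePreload,      // shipped preload list
      SourceOrganic,      // Strict-Transport-Security header on a response
      SourceHSTSPriming,  // learned via an HSTS priming request
    };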
MozReview-Commit-ID: 3IwyYe3Cn73
--HG--
extra : rebase_source : 9b8daac3aa02bd7a1b4285fb1e5731a817a76b7f
Collect telemetry for all requests to get an exact percentage of
requests that are subject to HSTS priming, and how many of them result
in an HSTS priming request being sent. Clean up telemetry to remove
instances of double-counting requests when a priming request was sent.
HSTSPrimingListener::ReportTiming was using mCallback to calculate
timing telemetry, but we had already called swap() on that nsCOMPtr, so
it no longer held the original callback. Give the method an explicit
argument for the callback.
Add tests for telemetry values to all of the HSTS priming tests. These
tests check for minimum counts, since telemetry may also be gathered on
background or other requests.
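The fix, in sketch form (the parameter type and body are approximate):

    // Illustrative: take the callback explicitly rather than reading
    // mCallback, which may already have been swap()ed away.
    void HSTSPrimingListener::ReportTiming(nsIHstsPrimingCallback* aCallback,
                                           nsresult aResult) {
      // ...derive the timing channel from aCallback and accumulate
      // telemetry for aResult...
    }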
MozReview-Commit-ID: 5V2Nf0Ugc3r
--HG--
extra : rebase_source : daa357219a77d912a78b95a703430f39d884c6ab
According to the spec, content from loopback addresses should no longer
be treated as mixed content even in secure origins. See:
- 349501cdaa
- https://w3c.github.io/webappsec-secure-contexts/#is-origin-trustworthy
Note that we only whitelist '127.0.0.1' and '::1' to match Chrome 53 and
later. See:
- 130ee686fa
It is unclear if HTTPS origins should be able to use workers and WebSocket
connections through a loopback HTTP address. They are not supported in Chrome
(whether this is intentional or not is uncertain), so let's just ignore them
for now.
See also: https://github.com/w3c/web-platform-tests/pull/5304
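A sketch of the loopback whitelist (the helper name is hypothetical):

    // Hypothetical sketch: only the canonical loopback literals are exempt
    // from mixed content blocking, matching Chrome 53 and later.
    static bool IsWhitelistedLoopbackHost(const nsACString& aHost) {
      return aHost.EqualsLiteral("127.0.0.1") || aHost.EqualsLiteral("::1");
    }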