Various places in dom/ use the pattern:
already_AddRefed<NodeInfo> ni = ...;
which is supposed to be disallowed by our static analysis code, but
isn't, for whatever reason. To fix our static analysis code, we need to
eliminate instances of the above pattern.
Unfortunately, eliminating this pattern requires restructuring how Nodes
are created. Most Node subclasses take `already_AddRefed<NodeInfo>&` in
their constructors, and a few accept `already_AddRefed<NodeInfo>&&`. We
need to enforce the latter pattern consistently, which requires changing
dozens of source files.
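For illustration, here is a minimal sketch of the disallowed pattern and of
the constructor change; SomeElement and Example are made-up names, not part
of the actual patch.

  // Sketch only: SomeElement stands in for a Node subclass.
  #include <utility>
  #include "mozilla/AlreadyAddRefed.h"
  #include "mozilla/RefPtr.h"
  #include "mozilla/dom/NodeInfo.h"

  using mozilla::dom::NodeInfo;

  // Constructor signatures, roughly:
  //   before: explicit SomeElement(already_AddRefed<NodeInfo>& aNodeInfo);
  //   after:  explicit SomeElement(already_AddRefed<NodeInfo>&& aNodeInfo);

  void Example(NodeInfo* aNodeInfo) {
    // The pattern the static analysis is supposed to reject: a named
    // already_AddRefed local instead of an immediately-consumed temporary.
    already_AddRefed<NodeInfo> ni = do_AddRef(aNodeInfo);
    RefPtr<NodeInfo> holder = std::move(ni);  // consume it so it isn't leaked

    // With the && constructors, call sites hand over a temporary directly:
    //   RefPtr<SomeElement> el = new SomeElement(do_AddRef(aNodeInfo));
  }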
This makes things saner for all consumers, which makes it worth doing on its
own. But it also gives me an easy nsTArray to work with, which I can use to
dispatch DOM events with array properties.
Differential Revision: https://phabricator.services.mozilla.com/D3466
--HG--
extra : rebase_source : e8bcc75e84845e42c932886502f99ca3154df48d
Mostly automatic via sed. The only parts I touched manually (apart from a
couple of places where I fixed indentation or which had misspelled arguments)
are the callers. I may also have removed a couple of redundant `virtual`
keywords when I started doing it manually; I can revert those if wanted.
Most of them just remove the argument, but in Element.cpp I also added an
assertion for GetBindingParent when binding the ShadowRoot's kids (the binding
parent is set from the ShadowRoot constructor, and I don't think we ever bind
a shadow tree during unlink or the like, which could otherwise cause a
behavior difference).
Differential Revision: https://phabricator.services.mozilla.com/D2574
MozReview-Commit-ID: 2oIgatty2HU
For number controls, nsContentUtils::IsFocusedContent doesn't do the right
thing, because what the focus manager considers focused is the anonymous text
element inside the number control. As a result, we weren't properly updating
the state of the currently-focused number control when hitting enter in it to
submit the form.
The HTMLFormElement change is enough on its own to fix the bug; the constraint
validation change is just a precaution.
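A hypothetical sketch of a focus check that handles this case by also
accepting focus on anonymous content inside the control; the helper name and
its use are assumptions, not the actual patch.

  // Sketch only: treat a number control as focused if the focused content is
  // the control itself or anonymous content inside it (the inner text editor
  // is what really holds focus).
  #include "mozilla/dom/Element.h"
  #include "nsContentUtils.h"
  #include "nsFocusManager.h"
  #include "nsIContent.h"

  static bool IsFocusedOrContainsFocus(nsIContent* aControl) {
    nsFocusManager* fm = nsFocusManager::GetFocusManager();
    if (!fm) {
      return false;
    }
    mozilla::dom::Element* focused = fm->GetFocusedElement();
    return focused &&
           (focused == aControl ||
            nsContentUtils::ContentIsDescendantOf(focused, aControl));
  }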
I haven't figured out a sane way to write a reftest for this, unfortunately:
the enter key press needs to look like a real user event to trigger the
submission behavior.
This patch is an automatic replacement of s/NS_NOTREACHED/MOZ_ASSERT_UNREACHABLE/. Reindenting long lines and whitespace fixups follow in patch 6b.
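On a made-up example (NodeType and HandleElement are illustrative), the
rewrite is just a macro rename; both macros take the same message string.

  switch (aNodeType) {
    case NodeType::Element:
      HandleElement();
      break;
    default:
      // Before: NS_NOTREACHED("unexpected node type");
      MOZ_ASSERT_UNREACHABLE("unexpected node type");
      break;
  }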
MozReview-Commit-ID: 5UQVHElSpCr
--HG--
extra : rebase_source : 4c1b2fc32b269342f07639266b64941e2270e9c4
extra : source : 907543f6eae716f23a6de52b1ffb1c82908d158a
This was done automatically by replacing:
s/mozilla::Move/std::move/
s/ Move(/ std::move(/
s/(Move(/(std::move(/
Removing the 'using mozilla::Move;' lines.
And then a few manual fixups; see the bug for the split series.
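On a made-up call site (Container and Item are illustrative), the net effect
of the patterns above looks like this:

  #include <utility>  // std::move; previously "mozilla/Move.h" / mozilla::Move

  // Before:
  //   using mozilla::Move;
  //   mItems = Move(aItems);
  // After:
  void Container::SetItems(nsTArray<Item>&& aItems) {
    mItems = std::move(aItems);
  }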
MozReview-Commit-ID: Jxze3adipUh
This patch goes through and changes a bunch of places in our tree that mention
this bug to use the new feature, making the methods more strongly typed. There
are probably more places in the tree that could be changed, but I didn't try
to find them.
Websites which collect passwords but don't use HTTPS have shown scary warnings
since Firefox 51, and mixed content blocking has been available even longer.
.onion sites without HTTPS support are affected as well, although their
traffic is encrypted and authenticated. This patch addresses this
shortcoming by making sure .onion sites are treated as potentially
trustworthy origins.
The secure context specification
(https://w3c.github.io/webappsec-secure-contexts/) is pretty much focused
on tying security and trustworthiness to the protocol over which domains
are accessed. However, it is not obvious why .onion sites should not be
treated as potentially trustworthy given:
"A potentially trustworthy origin is one which a user agent can
generally trust as delivering data securely.
This algorithms [sic] considers certain hosts, scheme, and origins as
potentially trustworthy, even though they might not be authenticated and
encrypted in the traditional sense."
(https://w3c.github.io/webappsec-secure-contexts/#is-origin-trustworthy)
We use step 8 in the algorithm to establish trustworthiness of .onion
sites by whitelisting them given the encrypted and authenticated nature
of their traffic.
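A rough sketch of what the whitelist check amounts to; the function below is
illustrative only and not the actual Gecko implementation.

  // Sketch only: extend the "is this origin potentially trustworthy?" logic
  // (step 8 of the spec's algorithm lets the UA vouch for extra origins).
  #include "nsIURI.h"
  #include "nsReadableUtils.h"
  #include "nsString.h"

  static bool IsOnionOrigin(nsIURI* aURI) {
    nsAutoCString host;
    if (NS_FAILED(aURI->GetAsciiHost(host))) {
      return false;
    }
    // .onion traffic is encrypted and authenticated by Tor even without
    // HTTPS, so treat such hosts as potentially trustworthy.
    return StringEndsWith(host, NS_LITERAL_CSTRING(".onion"));
  }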
This is necessary in order to parse style attributes using the subject
principal of the caller, rather than defaulting to the page principal.
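Conceptually, this means threading the caller's principal down to the parse
path along these lines; the signature below is illustrative, not the exact
one in the tree.

  // Illustrative only: fall back to the page (document) principal only when
  // no subject principal, i.e. no scripted caller, was provided.
  void ParseStyleAttributeSketch(mozilla::dom::Element* aElement,
                                 const nsAString& aValue,
                                 nsIPrincipal* aSubjectPrincipal,
                                 nsAttrValue& aResult) {
    nsIPrincipal* principal = aSubjectPrincipal
                                  ? aSubjectPrincipal
                                  : aElement->OwnerDoc()->NodePrincipal();
    // ... hand |principal| and |aValue| to the style attribute parser ...
  }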
MozReview-Commit-ID: GIshajQ28la
--HG--
extra : rebase_source : 5dba46f61d70ec647cae16383b62961ac72d2f47