3 /content-security-policy/securitypolicyviolation/ tests are expected TIMEOUT
Categories: Core :: DOM: Security, defect, P3
People: Reporter: jmaher, Unassigned
Whiteboard: [domsecurity-backlog1]
These 3 tests are marked expected timeout:
/content-security-policy/securitypolicyviolation/script-sample-no-opt-in.html
/content-security-policy/securitypolicyviolation/script-sample.html
/content-security-policy/securitypolicyviolation/targeting.html
Running these locally I see:
/content-security-policy/securitypolicyviolation/script-sample-no-opt-in.html subtest:
Timeout JavaScript URLs in iframes should not have a sample. Test timed out
/content-security-policy/securitypolicyviolation/script-sample.html subtest:
Timeout JavaScript URLs in iframes should have a sample. Test timed out
/content-security-policy/securitypolicyviolation/targeting.html subtest:
Timeout Elements created in this document, but pushed into a same-origin frame trigger on that frame's document, not on this frame's document. Test timed out
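For context on the two script-sample tests: per the CSP spec, a violation report only carries a script sample when the matched directive includes the 'report-sample' keyword, which is the "opt-in" the test names refer to. A minimal sketch of the difference (the exact directive values used by the tests are illustrative, not copied from them):

```
# script-sample.html: the policy opts in, so the violation's sample
# attribute should be populated with the blocked script's source.
Content-Security-Policy: script-src 'none' 'report-sample'

# script-sample-no-opt-in.html: no 'report-sample' keyword, so the
# violation's sample attribute should be empty.
Content-Security-Policy: script-src 'none'
```

The timing-out subtests exercise this for javascript: URLs in iframes specifically.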
:ckerschb, can you help triage this?
Updated•4 years ago
Comment 1•4 years ago
(In reply to Joel Maher ( :jmaher ) (UTC-4) from comment #0)
These 3 tests are marked expected timeout:
/content-security-policy/securitypolicyviolation/script-sample-no-opt-in.html
We block the javascript: URI, but we don't produce violations for javascript: URIs. That's only one subtest within that test, though, and the other 3 subtests pass. I guess there is no option to disable just that one subtest, is there? If so, I would be fine with disabling it to speed up round-trip time on Treeherder.
/content-security-policy/securitypolicyviolation/script-sample.html
Same reason as above.
/content-security-policy/securitypolicyviolation/targeting.html
The 2 subtests for the shadow tree are already disabled within this test. I am fine with also disabling the "Elements created in this document" subtest, because that's really a corner case and I don't think we will fix it any time soon.
Comment 2•4 years ago
You can't disable subtests in a way that helps here; you can just choose to ignore the result. We could split the one subtest into a different file?
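What "ignore the result" looks like in practice: Gecko's wpt expectation metadata lives in per-test .ini files, and an expectation can be recorded at subtest granularity so the harness treats the timeout as expected rather than as a failure. A sketch of the metadata format, assuming subtest names exactly as reported above (the file would sit under the testing/web-platform/meta/ mirror of the test path):

```ini
[script-sample.html]
  [JavaScript URLs in iframes should have a sample.]
    expected: TIMEOUT
```

This only stops the subtest's timeout from being reported as unexpected; the subtest still runs and still spends its full timeout, which is why splitting it into its own file is the only way to actually recover the wall-clock time.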