Open Bug 410700 Opened 17 years ago Updated 2 years ago

Security checks for file:// URIs are horribly slow.

Categories

(Core :: Security: CAPS, defect, P3)

Tracking

People

(Reporter: jst, Assigned: dveditz)

References

Details

(Keywords: perf)

Attachments

(1 obsolete file)

Running the testcase in bug 375470 from the local file system shows that out of all the time spent in js_Invoke(), we spend ~70% doing security checks. That's compared to ~15% when running the same testcase loaded through http! Here's a rough breakdown of where time is spent (scoped at js_Invoke()):

 100.00 js_Invoke (203935 calls)
  92.06 XPC_WN_CallMethod (201116 calls)
  91.63 XPCWrappedNative::CallMethod (203131 calls)
  71.52 nsScriptSecurityManager::CanAccess (203131 calls)
  71.50 nsScriptSecurityManager::CheckPropertyAccessImpl (203590 calls)
  70.34 nsScriptSecurityManager::CheckSameOriginDOMProp (200007 calls)
  70.32 nsScriptSecurityManager::CheckSameOriginPrincipalInternal (200007 calls)
  70.16 nsScriptSecurityManager::SecurityCompareURIs (170005 calls)
  68.62 nsScriptSecurityManager::SecurityCompareFileURIs (170003 calls)
  52.06 nsLocalFile::ResolveAndStat (510117 calls)
  50.11 GetFileInfo (340086 calls)
  49.27 GetFileAttributesExW (340086 calls)
  27.79 nsLocalFile::Contains (170003 calls)
  27.32 nsLocalFile::IsDirectory (170014 calls)
  24.86 nsLocalFile::GetTarget (340009 calls)

Seems like we'd need to stat less (GetFileInfo etc.) and remember more of that in the file objects or something.
Flags: blocking1.9?
A stat cache? Eh... bug 307815 has arguments against it. But bug 285157 might help to speed up PR_GetFileInfo on Windows.
Keywords: perf
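For illustration only, here is a minimal sketch (in Python, not the actual Necko/NSPR code) of what a stat cache would look like, and why bug 307815 argues against it: the cache trades disk hits for the risk of serving stale metadata when a file changes underneath it. The class and method names are hypothetical.

```python
import os

class StatCache:
    """Hypothetical sketch: remember os.stat() results per path so that
    repeated security checks on the same file don't hit the disk.
    The known downside is stale entries if the file changes."""

    def __init__(self):
        self._cache = {}

    def stat(self, path):
        # Return the cached result if we have one; otherwise stat once
        # and remember it.
        if path not in self._cache:
            self._cache[path] = os.stat(path)
        return self._cache[path]

    def invalidate(self, path=None):
        # Drop one entry, or the whole cache, when staleness matters.
        if path is None:
            self._cache.clear()
        else:
            self._cache.pop(path, None)
```

A real implementation would also need an invalidation policy (timeouts, file-change notifications), which is exactly the complexity the earlier bug argues isn't worth it.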
Dan, is there any way we could avoid stat()ing files when we're doing security checks? It looks pretty bad that 52% of the time executing JavaScript is spent stat()ing files.
Flags: blocking1.9? → blocking1.9+
Priority: -- → P3
Version: unspecified → Trunk
I'd argue for fixing bug 402983 before worrying about performance here, since that fix is likely to impact performance characteristics no matter how it goes...
Blocks: 230606
I swear I already attached this today :-(

This patch changes the algorithm to use only file: URI string manipulation to the extent possible. It assumes (danger, Will Robinson) that the file URIs are already normalized by the time someone's trying to security-check them. My reading of nsStandardURL says that's a guarantee, but it needs to be tested or we could have a directory traversal problem. It also assumes nsStandardURL will canonicalize Windows '\\' slashes into '/' ('\\' isn't technically valid in a file: URI, but browsers traditionally accept them on Windows).

With that out of the way... The code supports 4 modes: a same-directory mode (stricter than the default), the default setting of samedir plus a guessed savedir, an anyfile mode excluding directories (preserving the fix for bug 209234), and a "traditional" setting that does what FF2 and earlier did (all file: URIs are same-origin).

The "anyfile" setting might be slow because it does do isDir checks. In fact it does twice as many as the current code because of bz's insistence on symmetry. But it does try to do a few faster string checks first since they're the likely cases. This isn't the default setting anyway; it's for developers who might need it but nonetheless want a safer-than-traditional mode: downloaded attack pages could load known files but won't be able to look around for them.

The current default "subdir" mode is replaced with "savedir". It's now symmetric, but is limited to a single directory below the parent file, and the name of that subdir must contain the base filename somewhere. Given the variety of localizations, that's the best I could do: http://mxr.mozilla.org/l10n/search?string=filesFolder.*%3D&regexp=on&find=contentAreaCommands%5C.properties%24

If you allow unlimited subdirectories, and at the same time demand that the checks be symmetric, an attack page simply opens a frame to a known file at the root of the drive and then injects code there to reach down wherever it likes.
Attachment #298663 - Flags: superreview?(bzbarsky)
Attachment #298663 - Flags: review?
Attachment #298663 - Flags: review? → review?(jonas)
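To make the "samedir" and "savedir" modes concrete, here is a rough Python sketch of the string-only comparison the patch describes. This is not the patch's C++ code; the function names are made up, it assumes the URIs are already normalized (as the patch assumes nsStandardURL guarantees), and it checks only one direction rather than the symmetric check the patch actually performs.

```python
from urllib.parse import urlparse
import posixpath

def file_uri_dir(uri):
    # Directory portion of a file: URI path, up to and including the
    # last '/'. Assumes the URI is already normalized.
    path = urlparse(uri).path
    return path[:path.rfind('/') + 1]

def same_origin_samedir(source, target):
    # "samedir" mode: both files must live in the same directory.
    return file_uri_dir(source) == file_uri_dir(target)

def same_origin_savedir(source, target):
    # "savedir" mode sketch: same directory, or target exactly one
    # level below it in a subdirectory whose name contains the
    # source's base filename (the "save webpage complete" heuristic).
    sdir, tdir = file_uri_dir(source), file_uri_dir(target)
    if sdir == tdir:
        return True
    if not tdir.startswith(sdir):
        return False
    sub = tdir[len(sdir):].rstrip('/')
    if '/' in sub:
        return False  # more than one directory level below the parent
    stem = posixpath.splitext(posixpath.basename(urlparse(source).path))[0]
    return stem in sub
```

For example, a page saved as page.html would be allowed to reach into a sibling page_files/ folder, but not into an unrelated subdirectory or anything deeper.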
> an attack page simply opens a frame to a known file at the root of the drive Could we prevent that via CheckLoadURI? I guess we can hash that out in bug 402983...
Comment on attachment 298663 [details] [diff] [review]
remove file access from file URI security checks

>Index: caps/include/nsScriptSecurityManager.h
>+// "save webpage complete" save folder (symetric)

"symmetric"

>Index: caps/src/nsScriptSecurityManager.cpp
>+    PRInt32 sourceDirLen = sourceSpec.RFindChar('/')+1;
>+    PRInt32 targetDirLen = targetSpec.RFindChar('/')+1;

Wouldn't it make sense to use the "directory" and "filename" properties on nsIURL here? That would still have the issue that file:///tmp would not get treated as a directory name (since the filename is "tmp" in that case), but file:///tmp/#foo has an empty filename, as does file:///tmp/?foo, and those would be caught as directories, while the code in this patch doesn't catch those. That way you don't have to rely on the '/' vs '\\' thing, etc, just on the way the parsing works in nsStandardURL. Seems more robust to me.

I'm not reviewing the string manipulation details for now, since they'll change a good bit with this approach, I suspect.

>+    if (ext != kNotFound)
>+        parent.SetCharAt('\0', ext);

parent.Truncate(ext); Then you don't need to .get() when doing the Find(), just Find() the string.
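The directory/filename distinction the review raises can be demonstrated with any URL parser that splits the path at the last '/'. This Python sketch (hypothetical helper, not nsIURL) shows why file:///tmp parses as a filename while file:///tmp/#foo and file:///tmp/?foo parse as directories with an empty filename, since the fragment and query are not part of the path.

```python
from urllib.parse import urlparse

def split_dir_filename(uri):
    # Roughly mimic nsIURL's "directory"/"filename" split: everything
    # after the last '/' in the path is the filename; the query and
    # fragment are parsed off separately and don't count.
    path = urlparse(uri).path
    i = path.rfind('/') + 1
    return path[:i], path[i:]
```

So split_dir_filename("file:///tmp") yields ("/", "tmp") — a filename — while both "#foo" and "?foo" variants yield ("/tmp/", "") — a directory with an empty filename, which is exactly the case raw string scanning for the last '/' would misclassify.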
Comment on attachment 298663 [details] [diff] [review] remove file access from file URI security checks Word on the street is that there's a new patch on the way?
Attachment #298663 - Flags: superreview?(bzbarsky)
Attachment #298663 - Flags: review?(jonas)
Flags: wanted1.9.0.x+
Flags: tracking1.9+
Flags: blocking1.9-
Added a patch to bug 402983 that also addresses this speed issue.
Attachment #298663 - Attachment is obsolete: true
The initial patch in bug 402983 did fix this, by restricting the expensive potentially-stat()ing calls to DoLoadURI() and having caps use quick Equals() checks. The final version moved the checks back into caps (nsPrincipal::CheckMayLoad) and unfortunately it got slow again.
Not only did it not get better, it got another 2-3x slower!
Upping priority on this so that it stands a chance to get fixed in a dot-release.
Priority: P3 → P1
Priority: P1 → P2
Flags: blocking1.9.1?
Not going to block 1.9.1 on this. Dan, are you planning on working on this? It would be good to get some traction here...
Flags: blocking1.9.1? → blocking1.9.1-
Flags: wanted1.9.2?
Flags: wanted1.9.2? → wanted1.9.2+
Moving to p3 because no activity for at least 1 year(s). See https://github.com/mozilla/bug-handling/blob/master/policy/triage-bugzilla.md#how-do-you-triage for more information
Priority: P2 → P3
Severity: normal → S3