Closed Bug 391699 Opened 17 years ago Closed 6 years ago

warning users against possible malicious code at websites

Categories

(Firefox :: Security, defect)

x86
Windows XP
defect
Not set
normal

Tracking


RESOLVED DUPLICATE of bug 432687

People

(Reporter: chofmann, Unassigned)

Details

(Keywords: sec-want, Whiteboard: [sg:want?])

There is a lot of malicious code on the web, and the trend seems to be pushing the numbers higher...

On Apr 27, 2007, at 9:53 PM, Chris Hofmann wrote:

>
> "SiteAdvisor reports 0.13 percent of all links on major search engines
> results contain browser exploits"
>
> http://blogs.pcworld.com/staffblog/archives/004248.html 

Brendan Eich wrote:
> IE, or others as well? Method, reproducible by others?
>
> /be 

I wasn't able to track down much detail about the claims made by the Exploit Security Labs study and the high number of exploits reportedly injected into Google ad content, but there is another semi-related and interesting study, published by some Google folks, on lessons learned from malware-detecting bots:

 http://www.usenix.org/events/hotbots07/tech/full_papers/provos/provos.pdf

Has anyone done any thinking about adding some of these content-analysis techniques, or rules abstracted from them, directly into the browser to inform and guide users away from potentially dangerous situations?

The research suggests that enough simple and straightforward patterns may already exist to support detection and warnings like:

 -"This site/page appears to be probing your system for possible vulnerabilities" (patterns that check for old, vulnerable versions of Java, plugins, and unpatched browser versions)

 -"This site appears to be obfuscating a redirect of your browser to a possibly dangerous site" (an overly complex eval() that turns into a location.replace() call)
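As a concrete illustration of the second pattern, here is a minimal, hypothetical sketch of an obfuscated redirect. The percent-encoded payload and the example.invalid URL are invented for this sketch and are not taken from any real exploit:

```javascript
// Hypothetical illustration of the "obfuscated eval -> location.replace"
// pattern described above. The encoded payload and destination URL are
// made up; real exploits use much heavier obfuscation than unescape().
var encoded = "%6C%6F%63%61%74%69%6F%6E%2E%72%65%70%6C%61%63%65" + // location.replace
              "%28%22%68%74%74%70%3A%2F%2F%65%78%61%6D%70%6C%65" + // ("http://example
              "%2E%69%6E%76%61%6C%69%64%2F%22%29";                 // .invalid/")
var decoded = unescape(encoded);
// decoded is now: location.replace("http://example.invalid/")
// In a page, eval(decoded) would silently send the browser to the
// attacker-controlled site, with nothing readable in the page source.
```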

Think something like this could be done?  The first might be hard since version checking is pervasive, but could a set of these rules be kept up to date enough to be useful?
The "obfuscated eval()/location.replace()" combo seems to have been used widely enough in the past that it should be considered a dangerous pattern.  I can't think of any reason it might be used legitimately.  bclary/others, can you think of any general web content with a legitimate use for this combo that would break if we warned on or blocked the redirect?  Would something like this make sense to add to the other clamp-down work happening on eval()?
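To make the idea concrete, here is a minimal sketch of the kind of static heuristic being proposed. The function name and regexes are invented for illustration, not an actual Firefox API; a real implementation would more likely hook the JS engine than pattern-match source text:

```javascript
// Hypothetical heuristic sketch: flag script text that pairs an
// obfuscation primitive (eval/unescape/String.fromCharCode) with a
// redirect primitive (location.replace or a location.href assignment).
// These names and patterns are illustrative only.
function looksLikeObfuscatedRedirect(scriptText) {
  var obfuscation = /\b(eval|unescape|String\.fromCharCode)\s*\(/;
  var redirect = /\blocation\.(replace\s*\(|href\s*=)/;
  return obfuscation.test(scriptText) && redirect.test(scriptText);
}
```

A version-check detector for the first warning would be similar in spirit but, as noted above, far noisier, since legitimate sites probe plugin and browser versions constantly.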
In a presentation at Black Hat summer 2007, Feinstein and Peck showed analysis indicating there is quite a bit of obfuscated JS code on the web, so making sure the warnings are confined to just dangerous code might be tricky.  More here:

http://people.mozilla.com/~chofmann/security/bh-usa-07-feinstien_and_peck-WP.pdf

Using CaffeineMonkey to find and analyze dangerous JS code might be interesting research.
Whiteboard: [sg:investigate]
Whiteboard: [sg:investigate] → [sg:want?]
Maybe this is the tracking bug for doing something like this, per the talk we heard today from the Bing folks.
Status: NEW → RESOLVED
Closed: 6 years ago
Resolution: --- → DUPLICATE