I’ve just finished writing up some docs on the new Cross-Site XMLHttpRequest feature in Firefox 3. I was a little worried at first, but it definitely appears to be both easy to implement and easy to use. Specifically, it’s an implementation of the W3C Access Control working draft (which Firefox’s XMLHttpRequest now respects).
If you’re interested in giving it a try you should fire up your copy of Firefox 3 and get ready to take it for a spin.
In a nutshell, there are two techniques you can use to make your content available to cross-site requests: specifying a special Access-Control header for your content, or including an access-control processing instruction in your XML.
More information can be found in the documentation but here’s a quick peek at what your code might look like:
An HTML document (served via PHP) that specifies an Access-Control header: (Demo – FF3 Only)
<?php
// Change this to allow <yourdomain.com> to make it accessible to your site, or allow <*> for ANYONE to be able to access it.
header('Access-Control: allow <ejohn.org>');
?>
<b>John Resig</b>
An XML document that specifies an access-control processing instruction: (Demo – FF3 Only)
<?xml version="1.0" encoding="UTF-8"?>
<!-- Change this to allow="yourdomain.com" to make it accessible to your site, or allow="*" for ANYONE to be able to access it. -->
<?access-control allow="ejohn.org"?>
<simple><name>John Resig</name></simple>
Now what’s especially nice about all this is that you don’t have to change a single line of your client-side code to make this work! Take, for example, this page which requests an HTML file from a remote domain – and, specifically, the JavaScript within it:
var xhr = new XMLHttpRequest();
xhr.open("GET", "http://dev.jquery.com/~john/xdomain/test.php", true);
xhr.onreadystatechange = function(){
  if ( xhr.readyState == 4 ) {
    if ( xhr.status == 200 ) {
      document.body.innerHTML = "My Name is: " + xhr.responseText;
    } else {
      document.body.innerHTML = "ERROR";
    }
  }
};
xhr.send(null);
This is the same-old, pure-blood JavaScript/DOM/XMLHttpRequest that we’re used to. For some limited applications, I think this functionality is already going to be terribly useful – and once wider adoption starts to trickle in, we’ll certainly see a whole range of applications, especially in the area of client-side applications and mashups.
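And the XML case looks virtually identical on the client side – here’s a minimal sketch (assuming the XML demo document lives at test.xml alongside test.php) that simply reads responseXML instead of responseText:

var xhr = new XMLHttpRequest();
xhr.open("GET", "http://dev.jquery.com/~john/xdomain/test.xml", true);
xhr.onreadystatechange = function(){
  if ( xhr.readyState == 4 ) {
    if ( xhr.status == 200 ) {
      // Pull the <name> element out of the returned XML document.
      var name = xhr.responseXML.getElementsByTagName("name")[0];
      document.body.innerHTML = "My Name is: " + name.firstChild.nodeValue;
    } else {
      document.body.innerHTML = "ERROR";
    }
  }
};
xhr.send(null);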
Kris Zyp (January 9, 2008 at 6:27 pm)
Very cool, I am looking forward to playing with it. Also, I tried to do a little writeup comparing JSONRequest to the XHR cross-site capabilities.
Mike Malone (January 9, 2008 at 9:16 pm)
If the browser doesn’t know whether it should allow a cross-domain request until after the request is made, won’t this introduce vulnerabilities for non-idempotent requests?
I haven’t read all the details yet, so forgive me if this has been answered already…
Andrew Dupont (January 9, 2008 at 11:19 pm)
@Mike: According to the spec, all non-GET cross-domain requests send out a “test balloon” request to check whether the origin domain is allowed access. If so, the real request happens.
There’s an authorization request cache to minimize the number of authorization checks.
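Roughly, the flow looks like this – a hedged sketch with a made-up URL; the page just issues a normal POST, and the preliminary authorization check is something the browser performs on its own behind the scenes:

// From the page's point of view this is an ordinary POST; the browser
// is the one that first sends the authorization "test balloon" request
// (and caches the result) before letting the real request through.
var xhr = new XMLHttpRequest();
xhr.open("POST", "http://other-domain.example/update", true); // made-up URL
xhr.onreadystatechange = function(){
  if ( xhr.readyState == 4 && xhr.status == 200 ) {
    // The cross-site POST was authorized and completed.
    document.body.innerHTML = xhr.responseText;
  }
};
xhr.send("name=John");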
Mike Malone (January 9, 2008 at 11:45 pm)
Ah, it pays to read the spec I guess. Thanks.
John Silvestri (January 10, 2008 at 1:16 am)
Maybe I’m a real killjoy, but I worry about careless web authors implementing things like allow as above, and then letting data leak via ‘legitimate’ XSS.
Any programmer who /understands/ these concepts should set their code to carefully allow only certain sites access, and/or have generic levels of access to public sites…but there are a /lot/ of PHP-‘users’ who don’t know half of what they entered into an editor.
I would certainly /hope/ a bank wouldn’t do something stupid like implement this carelessly, but if they did*, or some up-and-coming Facebook-like site did it, some people could have a very bad day. I’m sure these factors were considered already, but I still find it troubling to be breaking down the walls of security present in current browsers, for the sake of Web 2.0.
/me returns to chasing kids off his lawn…
*Oh…like this: http://fergdawg.blogspot.com/2008/01/italian-banks-xss-opportunity-seized-by.html
Filip (January 10, 2008 at 1:51 am)
John,
This is amazing and I can’t wait to see cross-site-xmlhttprequest implemented in all browsers!
Sebastian Werner (January 10, 2008 at 2:32 am)
What exactly is the reason we need this? Has anybody here really understood why XMLHttp is currently limited to one host and cannot communicate cross-domain? I really do not understand that. If XMLHttp cannot do this by default, why is it still possible to load scripts and images from other servers? Why can I do exactly the same type of cross-domain communication using Flash, and maybe using Silverlight in the future? What is the original reason for this limitation? Is this documented anywhere?
If, as mentioned in the spec, HTTP DELETE is problematic because it may delete data, why can’t we filter such actions when detecting cross-domain communication? GET and POST are already possible in the same way by submitting a simple form. It is even possible to generate these form elements dynamically, and this also works cross-domain. At least these two HTTP methods should be enabled by default to allow cross-domain communication. The open web, as often mentioned by Alex Russell, really needs features comparable with closed-source software such as Flash or Silverlight.
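For instance, a sketch of the form trick mentioned above – ordinary DOM code with an invented target URL that already performs a cross-domain POST today, no Access-Control required:

// Plain DOM code that already sends a cross-domain POST by building and
// submitting a form (invented target URL). The response simply isn't
// readable by the submitting page.
var form = document.createElement("form");
form.method = "POST";
form.action = "http://other-domain.example/endpoint";

var field = document.createElement("input");
field.type = "hidden";
field.name = "name";
field.value = "John";
form.appendChild(field);

document.body.appendChild(form);
form.submit();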
Thomas (January 10, 2008 at 5:48 am)
I’m still under the impression – and correct me if I’m wrong – that all these means are tailored to protect the server and its documents. But I thought the issue was to protect the client! Server protection has been around for ages. If I don’t want a certain domain to retrieve my document, I simply add an access restriction to my Apache config, and I’m done.
The real challenge is to protect the client, and should be solved on the client, IMHO. Like with any other piece of application software, the user should be able to decide what he wants to allow *per web application*. The browser should prompt the user “This web page wants to connect to … Do you allow this?” as soon as it hits a cross-site request. And the user decides whether he trusts the web app or not. Put the control back into the hands of the users. Any malicious content lies on some server, and intruders will make all of it freely available.
Andrew (January 10, 2008 at 9:08 am)
I agree with Thomas. I never understood the NEED to modify the client security model to allow for this. If this is something the software needs to do, then the developer can implement a proxy on the server side. At least in this way the developer has sole discretion on the connections. Just more to go wrong if you ask me.
past (January 10, 2008 at 9:24 am)
Won’t this functionality open the road to DDOS attacks or other vulnerabilities?
Laurens Holst (January 10, 2008 at 11:01 am)
Can anyone point me to an explanation of why this particular mechanism is ‘secure’? If you can get a script to load from an arbitrary URL, then surely you can make it point to a site that you control?
The only thing this would seem to achieve is that web site owners will get tons of requests from people to add this header or processing instruction, because they don’t want to set up a proxy. I can see it happening that eventually this will be set de facto on all documents/servers.
Seems to me that you might as well just let people do cross-site requests without any strange requirements like this for the remote content. That way things like W3C Tabulator can work without having to allow special privileges to the page in your browser.
~Grauw
Laurens Holst (January 10, 2008 at 11:13 am)
Ok, never mind, I should read the spec before commenting :). Still, I fear that web site owners will be bothered by a lot of requests for this from users, and that they will set this without properly considering the security implications.
Is it really desirable that every web service has to include the processing instruction (or the header) on their pages?
I also hope that non-private HTTP headers such as Content-Type, Content-Encoding and Content-Language will still be accessible in cross-site requests.
~Grauw
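If those headers do stay readable – which is only an assumption at this point – the client side would presumably just keep using getResponseHeader as usual, something like:

// A sketch only: this assumes non-private response headers remain
// readable on cross-site requests, which is exactly the open question.
var xhr = new XMLHttpRequest();
xhr.open("GET", "http://dev.jquery.com/~john/xdomain/test.xml", true);
xhr.onreadystatechange = function(){
  if ( xhr.readyState == 4 && xhr.status == 200 ) {
    var type = xhr.getResponseHeader("Content-Type");
    var lang = xhr.getResponseHeader("Content-Language");
    document.body.innerHTML = "Type: " + type + ", Language: " + lang;
  }
};
xhr.send(null);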
Tom (January 10, 2008 at 11:26 am)
I agree with those saying that this spec is misguided. But bothering users too much is also not good. How are they to know in every case what things mean? Further, even communication with the current remote server is already dangerous. We complain when desktop apps report on our behavior but use web sites all the time that do the same. Without a much better security model, I think it’s just a matter of being careful where/how you surf. Not completely unlike being cautious in real life.
The web just plain isn’t secure, and it doesn’t seem to be getting better.
Justin (January 10, 2008 at 12:34 pm)
Sebastian,
The following post on the same-origin policy should help clear up some of your misconceptions: http://taossa.com/index.php/2007/02/08/same-origin-policy/
Thomas,
User dialogs tend to be a less than ideal solution. For example, it was one of the major failings in the ActiveX security model. A much more consistent strategy is for the site to provide a policy. This is necessary because the site developers have the best vantage point to determine how their site can be accessed safely. However, the enforcement needs to occur at the client because only the client has complete context of the relationships between multiple sites.
The W3C approach accounts for these considerations, so it’s a step in the right direction. However, I find their implementation excessively complex while being too narrow in scope. I’ve posted my thoughts on it here: http://taossa.com/index.php/2008/01/10/w3c-cross-site-request/
Nathan de Vries (January 11, 2008 at 1:30 am)
This is just as ridiculous as Flash’s security sandboxing via server-side crossdomain.xml files.
pwb (January 11, 2008 at 4:00 pm)
The risks are straightforward and huge. Biggest being access behind firewalls. A malicious server could serve up cross-domain script that essentially crawls an entire intranet.
DerFichtl (January 13, 2008 at 3:47 am)
ok … and now … waiting for the other browsers. so, i think i can forget this feature till microsoft announces it again in two or three years.
Ashish (January 13, 2008 at 5:37 pm)
This feature is inevitable. There is a need for it and it’s going to be implemented. Most people kid themselves about the “new” security issues it’s going to introduce. Most attacks can be performed today. You want cross-site internet/intranet DDOS? Use the image tag. There is a business demand for cross-domain stuff, there is money to be made. Get over it people. Just start training your programmers on how to use it.
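To make the image-tag point concrete, here’s a minimal sketch with an invented intranet URL – nothing in it depends on the new Access-Control support:

// Image loads have always been cross-site: any page can already fire a
// request at an arbitrary host (invented URL here), and onload/onerror
// even reveal whether something answered.
var img = new Image();
img.onload = function(){ /* the host responded with an image */ };
img.onerror = function(){ /* no response, or not an image */ };
img.src = "http://intranet.example/status.png";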
Nathan de Vries (January 13, 2008 at 10:46 pm)
@Ashish: I would prefer you protect your business interests outside the realms of browser technology like Javascript. Those who subvert will continue to do so, which means that this kind of opt-out server-side change will only hinder everyone else.
John Resig (January 13, 2008 at 11:09 pm)
@Nathan: You do realize that this is opt-in, right? You don’t have to change a single thing in order to maintain the current security model. If you want your documents to be accessible in a cross-domain manner, then you opt-in to the Access-Control scheme – and even then, only for the domains that you specify.
Ben Hicks (January 14, 2008 at 5:14 am)
Not a big issue, I’d expect anybody using the example to find out by themselves, but I prefer a working copy’n’paste solution.
The HTML comment in front of the PHP might force Apache to send out the HTTP header plus the comment as content, making it impossible for PHP to add an HTTP header. You should wrap the comment in PHP like this:
<?php
// Change this to allow <yourdomain.com> to make it accessible to your site, or allow <*> for ANYONE to be able to access it.
header('Access-Control: allow <ejohn.org>');
?>
<b>John Resig</b>
I would like to see Adobe allow Flash to also adopt this scheme, so we don’t have to keep synchronising our access configuration between plain HTTP and HTTP accessed via Flash.
Ben Tremblay (January 15, 2008 at 2:03 pm)
When Mosaic and IE were surging ahead a few of us (Well, lots of us really, but a slight portion of all hands.) pulled out the stops to make sure that folk surfing the very new WWW using Lynx didn’t get marginalized and shut out.
This very lovely development leaves Win98SE users sucking fumes. (I can hear the snickers … and they’re none of them intelligent or rational.) FF3 raises the bar to Win2K … how many Win98 boxes are out there right now?
/In effect/ … all rationalizations and prevarication aside, in effect FF3 will cut some users loose.
We aren’t just talking about choice of browsers here: we’re telling folk “Since you can’t upgrade your OS, you can’t use these services”.
Jonas Raoni (January 16, 2008 at 7:14 pm)
Finally, that’s a good thing I wanted to see. But since most browsers won’t work with this, it’s merely an interesting but still useless feature; “proxies” will stay in the picture for quite a long time :]
Paul (March 19, 2008 at 2:14 pm)
Were you ever able to get this working with POST requests? I get a
[Exception... "Component returned failure code: 0x80004005 (NS_ERROR_FAILURE) [nsIXMLHttpRequest.setRequestHeader]" nsresult: "0x80004005 (NS_ERROR_FAILURE)" ..]
exception without FF3b4 even checking the site for permission.
Ricky (March 26, 2008 at 9:05 pm)
Um, neither of these examples works in the latest Firefox beta (Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9b5pre)).
Is the feature currently broken or what? I get this message:
Permission denied to call method XMLHttpRequest.open
[Break on this error] xhr.open("GET", "http://dev.jquery.com/~john/xdomain/test.xml", true);
I’d love to get this working.
Dante (April 3, 2008 at 2:54 am)
@Ricky Use Beta 4; the Beta 5 release comes with this note:
“Support for Cross-Site XmlHttpRequest has been removed until the specification becomes more stable and the security model is improved (bug 424923)”
@Denise (April 15, 2008 at 4:27 pm)
I don’t suppose there is any way to get my hands on a beta copy that has this feature enabled? Just working on some internal testing (access VLAN from DMZ) and having the ability to use XmlHttpRequest would certainly make my life easier.
John Weeddig (May 24, 2008 at 6:47 am)
The XML demo just says “Loading…”
Peter Kehl (June 22, 2008 at 8:03 am)
John,
thank you for this. However, please consider updating it to reflect the following note from http://developer.mozilla.org/en/docs/Cross-Site_XMLHttpRequest:
Cross-Site XMLHttpRequest
This feature is available in Firefox 3, but only to extensions and other privileged code; it is not currently available for web content.