Friday, April 8, 2011

Dynamic caching of webpages

I read an interesting article on Oracle's site about how to scale PHP pages for high-traffic web applications.
I can see how this would be powerful. I have made PHP pages before that are very complex, and if thousands of people were coming every minute it would kill the server unless it only had to build the page once every hour or so. The tradeoff is that any updates are only seen by visitors once an hour instead of the instant they appear.

Instead of having to run the PHP all the time, you run it once an hour and write the output to a text file. The file is named after the URL of the page plus any GET parameters that make the page unique or look different. All requests for the next hour just read back the text file with the finished webpage in it.
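For example, the caching code below hashes $_SERVER['REQUEST_URI'] to build the file name. Since the query string is part of REQUEST_URI, each combination of GET parameters ends up with its own cache file. Just a quick sketch of the naming idea (the URIs here are made-up examples):

<?php
// Two different query strings produce two different cache file names,
// hashed the same way the caching code below hashes $_SERVER['REQUEST_URI'].
echo sha1('/news.php?id=1'), "\n"; // cache file for one page
echo sha1('/news.php?id=2'), "\n"; // a different cache file for the other
?>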
Here is the full example I tried in PHP:

<?php
$timeout = 60; // cache lifetime in seconds (3600 would be one hour)
// Name the cache file after the requested URL, including any GET parameters.
$file = 'C:/Users/mike/Desktop/webhome/cachingtest/tempcache/' . sha1($_SERVER['REQUEST_URI']);

if (file_exists($file) && (filemtime($file) + $timeout) > time()) {
    // A fresh enough cached copy exists: send it and stop.
    readfile($file);
    exit();
} else {
    // No usable cache: buffer the output, let the page run normally,
    // and save whatever was generated when the script shuts down.
    ob_start();
    register_shutdown_function(function () use ($file) {
        $content = ob_get_flush(); // grab the buffered page and send it to the browser
        file_put_contents($file, $content); // save it for the next visitors
    });
}
?>
<h2>Testing serving the current page from cache and only generating a fresh copy after a certain time limit has passed.</h2>
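If an update needs to show up right away instead of waiting for the hour to pass, one option (my own addition, not from the article) is to just delete the files in the cache folder whenever the content changes, so the next request rebuilds the page. A quick sketch, assuming the same tempcache folder as above:

<?php
// Hypothetical helper: wipe every cached page so the next request regenerates it.
function clear_page_cache() {
    $dir = 'C:/Users/mike/Desktop/webhome/cachingtest/tempcache/';
    foreach (glob($dir . '*') as $cached) {
        if (is_file($cached)) {
            unlink($cached); // remove the stale cached copy
        }
    }
}
?>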
