It can sometimes be useful to have a script that captures all incoming HTTP requests and logs them to a file. The PHP script below works for me. I use it in combination with an .htaccess file to catch all web page requests and redirect them to a single file.
<?php
// Append a timestamped header for this request, then dump the superglobals.
$logfile = "/tmp/webserver.log";
$fh = fopen($logfile, 'a');
fwrite($fh, "\n=== " . date("Y-m-d H:i") . " ====== " . $_SERVER["REQUEST_URI"] . " ===========");
printarray($fh, $_SERVER, "_SERVER");
printarray($fh, $_ENV, "_ENV");
printarray($fh, $_REQUEST, "_REQUEST");
fwrite($fh, "\n");
fclose($fh);

// Write each key/value pair of the array on its own line, prefixed with a label.
function printarray($fh, $arr, $arrlabel) {
    fwrite($fh, "\n");
    foreach ($arr as $key => $value) {
        fwrite($fh, "\n " . $arrlabel . "[\"" . $key . "\"] = " . $value);
    }
}
?>
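One thing to keep in mind: when several requests arrive at the same time, the separate fwrite() calls can interleave in the log, and array values in $_REQUEST would only print as "Array". A possible variant (just a sketch of the same idea, using the same /tmp/webserver.log path) builds the entry as one string, renders values with var_export(), and appends it with a single locked write:

<?php
// Sketch: same log format as above, written as one locked append.
$logfile = "/tmp/webserver.log";

$entry = "\n=== " . date("Y-m-d H:i") . " ====== " . $_SERVER["REQUEST_URI"] . " ===========";
foreach (array("_SERVER" => $_SERVER, "_ENV" => $_ENV, "_REQUEST" => $_REQUEST) as $label => $arr) {
    $entry .= "\n";
    foreach ($arr as $key => $value) {
        // var_export() renders arrays and strings readably, unlike plain concatenation
        $entry .= "\n " . $label . "[\"" . $key . "\"] = " . var_export($value, true);
    }
}
$entry .= "\n";

// FILE_APPEND adds to the end of the file, LOCK_EX keeps concurrent entries from interleaving
file_put_contents($logfile, $entry, FILE_APPEND | LOCK_EX);
?>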
The .htaccess file:
RewriteEngine On
# Serve the request directly if it maps to an existing file (-s), symlink (-l) or directory (-d)
RewriteCond %{REQUEST_FILENAME} -s [OR]
RewriteCond %{REQUEST_FILENAME} -l [OR]
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^.*$ - [NC,L]
# Everything else is handled (and logged) by index.php
RewriteRule ^.*$ index.php [NC,L]
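Note that this .htaccess file only does anything if mod_rewrite is loaded (on Debian/Ubuntu, a2enmod rewrite enables it) and the directory allows overrides. If the rules seem to be ignored, adding something like the following inside the <Directory> block of the virtual host should help; on Apache 2.4, where AllowOverride defaults to None, it is required:

# Only needed if .htaccess files are being ignored
AllowOverride All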
The Apache config for the virtual host is straightforward.
<VirtualHost *:80>
    DocumentRoot /var/www/
    DirectoryIndex index.php
    <Directory /var/www/>
        Order allow,deny
        allow from all
    </Directory>
</VirtualHost>
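The "Order allow,deny" / "allow from all" pair is Apache 2.2 syntax. On Apache 2.4 the equivalent access control is the single Require directive, so the <Directory> block would look roughly like this:

<Directory /var/www/>
    # Apache 2.4 replacement for "Order allow,deny" + "allow from all"
    Require all granted
</Directory>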