Hi everybody,
regarding this issue:
http://wiki.squid-cache.org/WikiSandBox/Discussion/YoutubeCaching
I came up with a workaround: a rewriter script in PHP (sorry, I'm not
good at Perl, but maybe someone would be kind enough to share a ported
version later... hehe)
NOTE 1: Use this script for testing purposes only; it may not work as
expected. I've only tested it with a very few URLs... If you can
improve it, please share.
NOTE 2: To use this script you need the PHP command-line interface. On
Ubuntu you can install it with this command:
sudo apt-get install php5-cli
NOTE 3: Make sure the log file is writable by the script.
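For example, assuming Squid runs the helper as the "proxy" user (the
default on Ubuntu; adjust to whatever your cache_effective_user is),
something like this should do:
sudo mkdir -p /var/squid/logs
sudo touch /var/squid/logs/rewriter.log
sudo chown proxy:proxy /var/squid/logs/rewriter.log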
And now the script:
#!/usr/bin/php -q
<?php
#
# 2008-11-03 : v1.3 : Horacio H.
#

## Open log file ##
$log = fopen('/var/squid/logs/rewriter.log', 'a+');

## Main loop: Squid feeds us one request per line on STDIN ##
while ( $X = fgets(STDIN) ) {
    $X = trim($X);
    ## The URL is the first space-separated field of the line ##
    $lin = explode(' ', $X);
    $url = $lin[0];
    $rep = array('');

    ## This section is for rewriting the store-URL of YT & GG videos ##
    if ( preg_match('@^http://[^/]+/(get_video|videodownload|videoplayback)\?@', $url) ) {
        ## Get reply headers ##
        $rep = get_headers($url);
        if ( $rep === false ) {
            $rep = array('');
        }
        ## If the reply is a redirect, make its store-URL unique to ##
        ## avoid matching the store-URL of a real video ##
        $rnd = "";
        if ( preg_match('/ 30[123] /', $rep[0]) ) {
            $rnd = "&REDIR=" . rand(10000, 99999);
        }
        $url = preg_replace('@.*id=([^&]*)&?.*$@', "http://videos.SQUIDINTERNAL/ID=$1$rnd", $url);
    }

    ## Return rewritten URL to Squid ##
    print $url . "\n";

    ## Record what we did in the log ##
    fwrite($log, "$url $rep[0]\n");

    ## May do some good, but I'm not sure ##
    flush();
}

fclose($log);
?>
## END OF SCRIPT ##
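For completeness, the helper would be wired into Squid 2.7's store-URL
rewrite feature roughly like this (see the wiki page above for the full
setup); the script path and the ACL name here are just examples, adjust
them for your install:
acl store_rewrite_list urlpath_regex \/(get_video\?|videodownload\?|videoplayback\?)
storeurl_access allow store_rewrite_list
storeurl_access deny all
storeurl_rewrite_program /etc/squid/rewriter.php
storeurl_rewrite_children 5
(Remember to make the script executable, e.g. chmod +x /etc/squid/rewriter.php.)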
The trick here is using the get_headers function to find out whether
the URL is a redirect (301, 302 or 303). It would be nice if the Squid
process passed the HTTP status to the script, maybe as a key=value
pair, but I'm not even a programmer so that is way beyond my knowledge...
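To illustrate with a made-up video id: a request such as
http://v3.cache.googlevideo.com/videoplayback?id=abc123&itag=34
gets stored under
http://videos.SQUIDINTERNAL/ID=abc123
so the same video is a cache hit no matter which cache host the site
redirects you to, while an actual 30x redirect reply gets a random
&REDIR= suffix so it can't pollute that store-URL.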
Regards,
Horacio H.