Hi

I am writing a script, run as a cron job on a server, to check a list of links and notify me when any of them are updated. I have just started developing it on my small server machine running Apache.

Basically, I want the script to have a 30-second timeout for each URL it checks so it won't get stuck waiting forever on a missing URL. However, there appears to be no timeout at all at the moment: I can even stop my web server and the script just moves straight on to the next URL to check.
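
To illustrate, my understanding is that even a stripped-down case like this should give up after 30 seconds (the address is just an unreachable placeholder, not one of my real URLs):

#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# 192.0.2.1 is a TEST-NET address, so nothing should ever answer here
my $browser = LWP::UserAgent->new(timeout => 30);
my $webdoc  = $browser->get("http://192.0.2.1/");
print $webdoc->is_success ? "fetched OK\n"
                          : "gave up: " . $webdoc->status_line . "\n";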

Here is some of the code:

while (my $data = $sortqh->fetchrow_hashref()) {
    my $browser = LWP::UserAgent->new();
    $browser->agent("NEWbot/0.0.1");
    $browser->timeout(60);    # timeout in seconds
    if ($data->{'method'} eq "hash") {
        my $webdoc = $browser->request(HTTP::Request->new(GET => $data->{'hashurl'}));
        if ($webdoc->is_success) {
            # page changed: store the new hash and flag the row for notification
            if (md5_hex($webdoc->content) ne $data->{'hash'}) {
                my $updateqh = $dbh->prepare("update data set hash='" . md5_hex($webdoc->content) . "' where id='" . $data->{'id'} . "'");
                $updateqh->execute();
                $to_update{$data->{'id'}} = 1;
            } else {
                $to_update{$data->{'id'}} = 0;
            }
        } else {
            print LOGFILE "$0: Couldn't fetch $data->{'realurl'}\n";
        }
    }
    # [snip: the other check methods]
}

Commenting out the if ($webdoc->is_success) ... else statements doesn't help either.
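
One workaround I'm considering, in case LWP's timeout() only applies to individual socket reads rather than the whole request, is a hard wall-clock cap with alarm(). This is an untested sketch; the 30 seconds and the log message are placeholders:

my $webdoc = eval {
    local $SIG{ALRM} = sub { die "timed out\n" };
    alarm(30);    # hard 30 second cap on the whole fetch
    my $response = $browser->request(HTTP::Request->new(GET => $data->{'hashurl'}));
    alarm(0);     # cancel the pending alarm on success
    $response;
};
alarm(0);         # make sure no alarm is left armed
if (!defined $webdoc) {
    print LOGFILE "$0: fetch aborted: $@";
}

But I'd rather understand why the built-in timeout isn't doing anything.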

Many thanks for any replies.

Mark Drayton