Hey Guys:
I could use some help with the Perl library WWW::Mechanize and a script that scrapes website data. Here's my sample code. The issue is that I do not get the same content as when I 'view source' in my browser... Any ideas?

Code:

use WWW::Mechanize;

# $id, $date and $fileName are set earlier in the real script
my $agent = WWW::Mechanize->new();
$agent->timeout(240);

my $url = "http://www.myStupidPage.net/";
$url .= 'domino.aspx?id=' . $id . '&lang=en&p_date=' . $date;
print "\nGetting home page--->URL: $url \n";

# new() enables autocheck by default, so get() will die on HTTP errors;
# the is_success test below is kept anyway
my $response = $agent->get($url);
sleep(15);

if ($response->is_success)
{
    my $htmlPage = $agent->content;    # documented accessor instead of $agent->{content}
    open my $out, '>:utf8', $fileName or die "Cannot open $fileName for write: $!";
    print $out $htmlPage;
    close $out;
}
Please understand that this is merely a sample snippet of the code; the actual site I am reaching is not important.
I have visited the site in my browser and the data I'm after is there; however, it is missing when this code runs.
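For what it's worth, here is a minimal check I can run on my end to compare what Mechanize actually receives with what the browser shows. It is only a sketch: the agent_alias value is just a guess that the site serves different content to unknown user agents, and $id / $date are the same placeholders as in the snippet above.

use WWW::Mechanize;

# Pretend to be an ordinary browser and dump a few facts about the response.
# 'Windows Mozilla' is one of WWW::Mechanize's built-in agent_alias values.
my $mech = WWW::Mechanize->new( autocheck => 0 );
$mech->agent_alias('Windows Mozilla');

my $resp = $mech->get("http://www.myStupidPage.net/domino.aspx?id=$id&lang=en&p_date=$date");

print "Final URI:     ", $mech->uri, "\n";    # shows whether I got redirected
print "Status:        ", $mech->status, "\n";
print "Content-Type:  ", ($mech->response->header('Content-Type') // ''), "\n";
print "Content bytes: ", length($mech->content), "\n";

If the status and size already match what the browser reports but the data I want still isn't in the saved file, then I suppose the page is filling that part in with JavaScript after it loads, which Mechanize would never see.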