August 9th, 2000, 11:05 PM
Originally posted by chinnavi:
I want to download all the files that are open for download from a site.
Suppose I have a site called www.name.com.
Is there software that takes a site name as input, follows the links, and downloads those files, including images and other files?
Some sites have excellent resources, but they don't offer the files bundled for download as a .tar.gz or .zip.
Is there any open source software for this?
Try using a program called wget.
You can get it via FTP or HTTP from sunsite.unc.edu/pub/gnu/wget.
Although I haven't used it in a while, if I remember correctly the -r switch makes it recursively fetch all pages, images, etc. Take a look at the docs anyway.
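As a rough sketch of the kind of command you'd run (assuming a reasonably recent wget; older releases may lack some of these switches):

    wget -r -l 5 -np http://www.name.com/

Here -r turns on recursive retrieval, -l 5 caps the recursion depth at five levels, and -np (--no-parent) keeps it from climbing above the starting directory. If you only want certain file types, look at the accept list, e.g. -A zip,tar.gz to grab just archives; the docs cover the full option set.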