#1
    chinnavi
    Guest
    Devshed Newbie (0 - 499 posts)
    Hi,

    I want to download all the files that a site makes available for download.
    That is, suppose I have a site called www.name.com. Is there a program that takes a site name as input, follows the links it finds, and downloads those files, including images and other files?

    Some sites I know have excellent resources, but they don't offer their downloadable files in a single bundle such as a .tar.gz or .zip.
    Is there any open source software for this?

    vijay
#2
    Junior Member
    Devshed Newbie (0 - 499 posts)

    Join Date: Aug 2000
    Location: Charleston, SC, USA
    Posts: 10
    Rep Power: 0
#3
    Registered User
    Devshed Newbie (0 - 499 posts)

    Join Date: Jul 2000
    Posts: 10
    Rep Power: 0
    Originally posted by chinnavi:
    Is there a program that takes a site name as input, follows the links it finds, and downloads those files, including images and other files?

    Try using a program called wget.
    You can get it from sunsite.unc.edu/pub/gnu/wget (via FTP or HTTP).

    Although I haven't used it in a while, if I remember correctly the -r switch makes it recursively fetch all the pages, images, and so on. Take a look at the docs anyway.
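
    For example, a typical invocation might look something like this (just a sketch, using the www.name.com site from the question; check the wget docs for the exact flags your version supports):

        wget -r -np -p -k http://www.name.com/

    Here -r fetches pages recursively, -np keeps wget from climbing into parent directories, -p pulls in page requisites such as images, and -k rewrites links so the downloaded copy can be browsed locally.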

    regards


