Automated Downloader

Gerard Beekmans gerard at linuxfromscratch.org
Fri Sep 1 11:10:57 PDT 2000


> The only problem with wget is that it sometimes misses packages (because
> it cannot connect to the site - too many users, or no route) - and it's
> hard to spot that in the long wget logfile (wget -o logfile). I tried to
> get around that by adding "-t 0" (unlimited retries), which seemed to
> work better.
>
> But we should find a cleaner way sooner or later.
>
> Btw, you can start building right away once wget has fetched the first
> bash file, since compiling usually takes longer (okay, downloading is
> quite fast here on my T1 ;).
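
(For reference, the wget approach above boils down to something like
this - just a rough sketch, assuming the URLs are kept in a hypothetical
file called wget-list:

    # retry forever, resume partial downloads, log everything,
    # and read the list of URLs from wget-list
    wget -t 0 -c -o wget.log -i wget-list

You'd then grep wget.log afterwards to see what, if anything, failed.)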

The perfect solution for that would be to use rsync. You don't have to rsync
the entire FTP archive; I can set up a module that serves just the packages
needed by the book. If something goes wrong, just run rsync again later and
it will only download the files that are missing or have changed since the
last sync.
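
On the client side that would look roughly like this (the module name
lfs-packages and the host path are just placeholders - whatever I end up
setting up would go there):

    # archive mode, verbose; safe to re-run any time, since rsync only
    # transfers files that are missing or have changed locally
    rsync -av rsync://linuxfromscratch.org/lfs-packages/ lfs-packages/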

Rsync is also what the mirrors use right now to sync the HTTP site daily
(and the FTP site again in a day or two).

My personal vote would go for rsync.

-- 
Gerard Beekmans
www.linuxfromscratch.org

-*- If Linux doesn't have the solution, you have the wrong problem -*-




