In today’s article we will see how to download an entire website with the “wget” command.
Wget is an open source download manager included in all major Linux distributions. It is a command-line tool, for which there is a GUI for GNOME (“Gwget”) and a GUI for Windows (“wgetGUI”). Today we will see some simple instructions, but for those who want to dig deeper, the manual is always available from the terminal by typing “man wget”.
Using wget to download a file, even a large one:
$ wget -c http://dominio.ext/file.iso
The “-c” option is not essential, but especially for large files it makes it possible to interrupt the download at any point: running the same command later from the same directory will resume the download from the point where we left it.
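As a self-contained sketch of the command above (the local port, directory names and file size are made up for the demo, and it assumes python3 and wget are installed), a small file can be served locally and fetched with “-c”:

```shell
# Create a 1 KiB dummy "ISO" and serve it from a local web server
# (python3's built-in http.server stands in for a real remote host).
mkdir -p pub && head -c 1024 /dev/zero > pub/file.iso
python3 -m http.server 8000 --directory pub >/dev/null 2>&1 &
echo $! > http.pid
sleep 1
# Fetch it with -c; against a partial file.iso and a server that
# supports ranges, the same command would resume instead of restarting.
wget -q -c http://127.0.0.1:8000/file.iso
kill "$(cat http.pid)"
wc -c < file.iso
```

Note that resuming only works if the server supports HTTP range requests; otherwise wget falls back to downloading the file from the beginning.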
Using wget to download an entire website with all its internal links:
$ wget -r http://dominio.ext/
Without the “-r” option, this would only download the index.html file located in the root of the site. With “-r”, instead, all internal links and images are downloaded recursively as well. Be careful with this command, though: pointed at a very large site, the download is likely to never end.
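A miniature version of this recursive behavior can be tried locally, without hammering a real site (the two-page “site”, port and the python3 test server are assumptions of the demo; “-nH” is wget’s real --no-host-directories flag, used here just to keep the output flat):

```shell
# Build a tiny two-page site: index.html links to page2.html.
mkdir -p site
printf '<a href="page2.html">next</a>\n' > site/index.html
printf 'second page\n' > site/page2.html
python3 -m http.server 8001 --directory site >/dev/null 2>&1 &
echo $! > srv.pid
sleep 1
# -r follows the link found in index.html and fetches page2.html too;
# -nH avoids creating a 127.0.0.1:8001/ host directory.
wget -q -r -nH http://127.0.0.1:8001/
kill "$(cat srv.pid)"
ls index.html page2.html
```

The same mechanism, applied to a site with thousands of interlinked pages, is what makes an unbounded “wget -r” so dangerous.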
Using wget to download an entire site, but limiting the depth to 3 levels:
$ wget -r -c -l 3 -o wget.log http://dominio.ext/
This other command downloads the site and its internal links; the download can be resumed if interrupted, links are followed only down to the third level (hierarchically), and a log file called “wget.log” is kept in the current directory.
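The log written by “-o wget.log” is plain text, so ordinary shell tools can summarize it after a long run. A minimal sketch (the sample log lines below are invented so the snippet stands alone):

```shell
# Create a tiny sample log in the format wget writes with -o,
# then count how many requests failed.
printf 'Saved to: index.html\nERROR 404: Not Found\n' > wget.log
grep -c 'ERROR' wget.log   # number of lines recording a failed request
```

On a real download, the same grep over wget.log quickly shows whether any page of the site was missed.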
Unfortunately, the GUI does not expose all of these options and turns out to be a simple download manager; the actual command-line program, on the other hand, is chock full of options for every need.
I have kept this blog as a hobby since 2009. I am passionate about graphics, technology and open source software. Among my articles you will easily find music and some personal thoughts, but I mostly keep the blog focused on technology. For more information, contact me.