Network

Download an entire web site using wget

From http://www.linuxjournal.com/content/downloading-entire-web-site-wget

wget \
--recursive \
--no-clobber \
--page-requisites \
--html-extension \
--convert-links \
--restrict-file-names=windows \
--domains website.org \
--no-parent \
www.website.org/tutorials/html/
  • --recursive: download the entire web site.
  • --no-clobber: don’t overwrite any existing files (useful when an interrupted download is resumed).
  • --page-requisites: get all the elements that compose each page (images, CSS and so on).
  • --html-extension: save files with the .html extension.
  • --convert-links: convert links so that they work locally, off-line.
  • --restrict-file-names=windows: modify filenames so that they will also work on Windows.
  • --domains website.org: don’t follow links outside website.org.
  • --no-parent: don’t follow links outside the directory tutorials/html/.
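
For reference, the same command using wget’s short options (--restrict-file-names has no short form; in newer wget releases --html-extension is named --adjust-extension, with the old name kept as an alias):

wget -r -nc -p -E -k --restrict-file-names=windows -D website.org -np www.website.org/tutorials/html/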

See which applications are listening on which ports

sudo netstat -ntpl
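
What the flags do:
  • -n: show numeric addresses and ports instead of resolving names.
  • -t: list TCP sockets only.
  • -p: show the PID and name of the program that owns each socket (requires root, hence sudo).
  • -l: show only listening sockets.

On newer distributions that ship ss instead of netstat, the same flags apply:

sudo ss -ntpl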

Using rsync to download files from a remote server

rsync -chavzP --stats user@remote.host:/path/to/copy /path/to/local/storage
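
What the flags do:
  • -c: skip files based on checksum rather than modification time and size.
  • -h: output numbers in a human-readable format.
  • -a: archive mode; recurse into directories and preserve permissions, times, symlinks and so on.
  • -v: verbose output.
  • -z: compress file data during transfer.
  • -P: keep partially transferred files and show progress (equivalent to --partial --progress).
  • --stats: print a summary of the transfer at the end.

Because -P keeps partial files, re-running the same command resumes an interrupted transfer where it left off.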