wget

Download files from the Web. Supports HTTP, HTTPS, and FTP. Homepage: https://www.gnu.org/software/wget.

  • Download the contents of a URL to a file (named "foo" in this case):

wget {{https://example.com/foo}}

  • Download the contents of a URL to a file (named "bar" in this case):

wget -O {{bar}} {{https://example.com/foo}}
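
A related idiom, if the response should go to standard output instead of a file, is to pass - as the output document and silence progress output; a minimal sketch using the same example URL:

wget -qO- {{https://example.com/foo}}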

  • Download a single web page and all its resources (scripts, stylesheets, images, etc.), waiting 3 seconds between requests:

wget --page-requisites --convert-links --wait=3 {{https://example.com/somepage.html}}
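
If a fixed delay is too predictable, wget can also randomize the pause around the --wait value; a sketch of the same command with that variation:

wget --page-requisites --convert-links --wait=3 --random-wait {{https://example.com/somepage.html}}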

  • Download all listed files within a directory and its sub-directories (does not download embedded page elements):

wget --mirror --no-parent {{https://example.com/somepath/}}
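
To make the mirrored copy usable for offline browsing, the flags from the previous example can be combined with --mirror; a sketch, assuming the same URL:

wget --mirror --no-parent --convert-links --page-requisites {{https://example.com/somepath/}}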

  • Limit the download speed and the number of connection retries:

wget --limit-rate={{300k}} --tries={{100}} {{https://example.com/somepath/}}
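
The rate accepts a k suffix for kilobytes per second and an m suffix for megabytes per second; for instance, a sketch limiting the transfer to 1 MB/s with 5 retries (values are illustrative):

wget --limit-rate=1m --tries=5 {{https://example.com/somepath/}}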

  • Download a file from an HTTP server using Basic Auth (also works for FTP):

wget --user={{username}} --password={{password}} {{https://example.com}}
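
To avoid leaving the password in shell history, wget can prompt for it interactively instead; a sketch of the same request:

wget --user={{username}} --ask-password {{https://example.com}}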

  • Continue an incomplete download:

wget -c {{https://example.com}}
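
Resuming only works when the server supports partial transfers; -c is the short form of --continue, so an equivalent sketch is:

wget --continue {{https://example.com}}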

  • Download all URLs stored in a text file to a specific directory:

wget -P {{path/to/directory}} -i {{URLs.txt}}
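
The input file is simply a list of URLs, one per line; a minimal sketch of what URLs.txt (the hypothetical file name from the example above) might contain:

https://example.com/files/archive1.zip
https://example.com/files/archive2.zip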