Download files from HTTP URLs instantly in the Windows CMD.EXE command line with the free Swiss File Knife.

sfk wget [options] url [outfile|outdir] [options]

download content from a given http:// URL.
an output filename or directory can be specified.
existing output files are overwritten without asking.
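
   for example, with example.com and the file names used
   here as placeholders,

      sfk wget http://example.com/index.html out.html

   downloads index.html and stores it as out.html in
   the current directory.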

   -user=u     and -pw=p set http basic authentication.
               you may also use the global options -webuser
               and -webpw. note that passwords are not
               encrypted on transfer, except when using
               SFK Plus with HTTPS connections. see also
               the example below this list.
   -proxy      hostname:port of a proxy server. from within a company
               network, it is often required to connect through proxies.
                alternatively, set the environment variable SFK_PROXY:
                 set SFK_PROXY=myproxyhost:8000
               to find out what proxy your browser is using, see
               - Firefox: tools/options/advanced/network/settings
               - IE: tools/internet options/connections/lan settings
   -path2name  include web path in generated output name,
               to create unique names on multiple downloads.
                this option is the default during chained processing.
   -fullpath   recreate the whole web path within output dir.
   -nodom      do not include domain name in output name.
   -nopath     do not include any path or domain information
               in the output names. this will not work if the
               URL does not contain a relative filename.
   -quiet      or -noprog: show no download progress indicator.
   -quiet=2    show no "done" info line.
   -addext     always add a filename extension like .txt, .html
               or .dat even if the URL has no such extension.
   -timeout=n  wait up to n msec for data.
   -verbose    print the current proxy settings, if any.
   -noclose    do not send "Connection: close" header.
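
   example: with myuser, mypw and myhost as placeholder
   names, a download using basic authentication looks like

      sfk wget -user=myuser -pw=mypw http://myhost/report.html report.html

   remember that the password is sent unencrypted unless
   HTTPS with SFK Plus is used.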

automatic name expansions
   http:// is added automatically. short IPs like .100
   are extended depending on your subnet.
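
   for example, on a 192.168.1.x subnet (status.txt being
   a placeholder filename), the input

      sfk wget .100/status.txt

   is treated like

      sfk wget http://192.168.1.100/status.txt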

quoted multi-line parameters are supported in scripts
   using full trim. type "sfk script" for details.

   although sfk wget can download a list of URLs, it is not
   a full webpage downloader/archiver, as that would require
   converting html pages to adapt the links they contain.

HTTPS support
   SSL/TLS downloads are supported with SFK Plus.
   read more in the SFK Plus documentation.
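
   with SFK Plus installed, an https:// URL can be given
   directly, for example (example.com as a placeholder):

      sfk wget https://example.com/file.txt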

chaining support
   output filename chaining is supported: the names of
   downloaded files are passed on to the next sfk command.
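
   for example, with server and the filenames here as
   placeholders,

      sfk wget -quiet server/info.txt tmp.txt +list

   downloads info.txt into tmp.txt, then passes the output
   filename tmp.txt on to the list command.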

see also
   sfk web      send a simple web request with instant
                result output to terminal
   curl         powerful web request and download tool

examples

   sfk wget -proxy myproxy:8000 http://example.com/index.html out.html
      download index.html, writing the content into the
      file out.html, connecting through a proxy server
      myproxy on port 8000.

   sfk filt urls.txt +wget mydir
      if urls.txt contains a list of http:// URLs, load it
      and download all contents into mydir. the output names
      will include path information found in the source URL.

   sfk filt urls.txt +wget -fullpath mydir +list -big
      the same as above, but create the whole dir structure,
      and then list the biggest of the downloaded files.

   sfk wget -quiet=2 server/info.xml tmp.txt +ffilter -nofile
      download info.xml from server, write it as file tmp.txt
      and instantly print the tmp.txt content to terminal
      without any status messages or filename info.