I'd like to know how to use wget and/or curl to download a file from the internet, split it into equal parts, and then fetch it over 8 or 16 direct connections, greatly improving download performance.
PS: What about aria2c? Can it use concurrent download connections?
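For context, the closest thing I've found with curl alone is its -r/--range option. A rough sketch of what I have in mind (the URL, the 100MiB size, and the 4-part split are all made up; the real size would have to come from a HEAD request such as curl -sI "$url" first):

url="https://example.com/big.iso"   # placeholder URL
# assume a 104857600-byte (100MiB) file, fetched as 4 ranged parts in parallel
curl -r 0-26214399        -o part1 "$url" &
curl -r 26214400-52428799 -o part2 "$url" &
curl -r 52428800-78643199 -o part3 "$url" &
curl -r 78643200-         -o part4 "$url" &
wait                                # wait for all four transfers to finish
cat part1 part2 part3 part4 > big.iso   # reassemble the pieces in order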
Answer 1
By installing aria2 with the command apt install aria2 (the package is aria2; the binary it provides is aria2c) and using the parameters below, I was able to download my file at full bandwidth over multiple links pointing to the same file:
aria2c -c -V -j 1 -k 5M -s 16 "link"
-c, --continue[=true|false] Continue downloading a partially downloaded
file. Use this option to resume a download
started by a web browser or another program
which downloads files sequentially from the
beginning. Currently this option is only
applicable to http(s)/ftp downloads.
Possible Values: true, false
Default: false
Tags: #basic, #http, #ftp
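For example, -c lets aria2c pick up a partial file left behind by another tool instead of starting over (the file name and URL here are placeholders):

aria2c -c -o bigfile.iso "https://example.com/bigfile.iso"   # continues ./bigfile.iso if it already exists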
-V, --check-integrity[=true|false] Check file integrity by validating piece
hashes or a hash of the entire file. This option
only has an effect on BitTorrent or Metalink
downloads with checksums, or on HTTP(S)/FTP
downloads with the --checksum option. If piece
hashes are provided, this option can detect
damaged portions of a file and re-download them.
If a hash of the entire file is provided, the hash
check is only done once the file has already been
downloaded; this is determined by the file length.
If the hash check fails, the file is re-downloaded
from scratch. If both piece hashes and a hash of
the entire file are provided, only the piece
hashes are used.
Possible Values: true, false
Default: false
Tags: #basic, #metalink, #bittorrent, #file, #checksum
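As the excerpt says, for a plain HTTP(S)/FTP download -V only does something when combined with --checksum. A sketch, where <hex-digest> is a placeholder for the file's known digest:

aria2c -V --checksum=sha-256=<hex-digest> "link"   # verify the finished file against a known SHA-256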
-j, --max-concurrent-downloads=N Set the maximum number of parallel downloads
for every static (HTTP/FTP) URL, torrent, and
metalink. See also the --split and
--optimize-concurrent-downloads options.
Possible Values: 1-*
Default: 5
Tags: #basic
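Worth noting: -j caps how many separate downloads run at once, not connections per file (that is what -s does). A sketch, where urls.txt is a made-up input file with one URI per line:

aria2c -j 4 -i urls.txt   # run at most 4 of the listed downloads in parallel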
-k, --min-split-size=SIZE aria2 does not split a byte range smaller than
2*SIZE. For example, consider downloading a 20MiB
file. If SIZE is 10M, aria2 can split the file
into 2 ranges, [0-10MiB) and [10MiB-20MiB), and
download it using 2 sources (if --split >= 2, of
course). If SIZE is 15M, since 2*15M > 20MiB,
aria2 does not split the file and downloads it
using 1 source. You can append K or M
(1K = 1024, 1M = 1024K).
Possible Values: 1048576-1073741824
Default: 20M
Tags: #basic, #http, #ftp
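Applying that rule to the command above (my arithmetic, not from the docs): with -k 5M a file must be at least 10MiB to be split at all, and each piece ends up at least 5MiB, so all 16 connections requested by -s 16 are only used for files of roughly 80MiB or more:

aria2c -c -V -j 1 -k 5M -s 16 "link"   # a 40MiB file yields at most 8 pieces; 80MiB+ can use all 16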
-s, --split=N Download a file using N connections. If more
than N URIs are given, the first N URIs are used
and the remaining URIs are kept as backup. If
fewer than N URIs are given, those URIs are used
more than once so that N connections in total are
made simultaneously. The number of connections to
the same host is restricted by the
--max-connection-per-server option. See also the
--min-split-size option.
Possible Values: 1-*
Default: 5
Tags: #basic, #http, #ftp
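One caveat: as the -s excerpt notes, connections to a single host are capped by --max-connection-per-server, which defaults to 1. So if your 16 links all point at the same server, you likely also need -x (my addition, not part of the original command):

aria2c -c -V -j 1 -k 5M -x 16 -s 16 "link"   # -x raises the per-server connection cap so -s 16 can take effect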