Parallel download processes

Using Bash, how can a script keep roughly 5 (± 2) concurrent cURL processes downloading at any time, until there are no more links left?

Pseudocode:

Links = {a.txt, b.txt, c.txt, d.txt ... x.txt, y.txt, z.txt}

Loop URL in Links
    if (fewer than 5 cURL processes running)
        cURL URL
    else
        wait until 1 cURL process has finished
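The pseudocode above maps fairly directly onto bash job control. A minimal sketch, assuming bash ≥ 4.3 (for `wait -n`) and using a hypothetical `LINKS` variable in place of the real list:

```shell
#!/bin/bash
# LINKS and the example.com URLs are placeholders for your own link list.
LINKS="http://example.com/a.txt http://example.com/b.txt http://example.com/c.txt"
max_jobs=5

for url in $LINKS
do
    # If max_jobs downloads are already running, block until any one finishes
    while [ "$(jobs -rp | wc -l)" -ge "$max_jobs" ]
    do
        wait -n
    done
    curl -O "$url" &
done
wait   # let the last few downloads finish before the script exits
```

Counting this shell's own background jobs (`jobs -rp`) is more robust than grepping the system process table, since it cannot be confused by unrelated curl processes started elsewhere.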

Answer 1

Try this as a starting point (untested):

#! /bin/bash
# LINKS is a space-separated list of links; to read them from a file
# instead, replace the first uncommented line below with
# for each in `cat $LINKS`

for each in $LINKS
do
    # Block while 5 or more curl processes are already running.
    # pgrep -x matches the process name exactly, so the count is not
    # inflated by the grep process itself.
    count=`pgrep -x curl | wc -l`
    while [ $count -gt 4 ]
    do
        sleep 1
        count=`pgrep -x curl | wc -l`
    done

    curl -O "$each" &
done
wait   # wait for the remaining downloads to finish
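If polling the process table feels fragile, `xargs` can manage the worker pool itself. A short alternative sketch, assuming an `xargs` with `-P` support (GNU or BSD) and a hypothetical `links.txt` holding one URL per line:

```shell
#!/bin/bash
# -n 1 : pass one URL per curl invocation
# -P 5 : keep at most 5 curl processes running in parallel
# links.txt is a placeholder filename with one URL per line.
xargs -n 1 -P 5 curl -O < links.txt
```

Here `xargs` starts a new curl as soon as one exits, so exactly 5 downloads run until the list is nearly exhausted, with no sleep/poll loop needed.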
