
Parallelize curl requests

This is my shell script:

for i in $(seq 1 "$1"); do
    filename=randomfile$(uuidgen).txt
    head -c "$2" </dev/urandom > "$filename"
    checksum256=$(sha256sum "$filename" | awk '{ print $1 }')
    attrs=$(jq -n --arg cf "$checksum256" '{ "confidential":"N", "transactionId": "sdf", "codiDirCorp": "CorpCode", "expiration": "10/10/2025", "description": "desc", "locked": "N", "title": "titol", "docHash": $cf }')

    curl -X POST \
        'http://localhost:8083/documents?application=RDOCUI&user=nif' \
        -F docFile=@"$filename" \
        -F docAttributes="$attrs"
done

As you can see, I'm generating several files with randomized content.

After each file is generated, I perform a curl request to upload it.

To run the script, I type:

$ ./stress.sh 1000 200K

Here, I generate 1000 files of 200 KB each and perform 1000 requests uploading them.

I would like to speed this up by running those requests in parallel.

Any ideas?
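One common approach (a sketch, not a drop-in replacement) is to launch each iteration as a background job with `&` and cap the number of jobs in flight. The `do_request` function below is a stand-in using `sleep`; in the real script its body would be the `head`/`sha256sum`/`jq`/`curl` sequence from above. Note that `wait -n` requires bash 4.3 or newer.

```shell
#!/usr/bin/env bash
# Sketch: run up to MAXJOBS requests concurrently using background jobs.
# do_request is a placeholder standing in for the curl upload from the question.

do_request() {
    # Stand-in for: head -c ... >"$filename"; sha256sum; jq; curl -X POST ...
    sleep 0.2
    echo "request $1 done"
}

MAXJOBS=8                       # concurrency cap; tune to taste
for i in $(seq 1 20); do        # in the real script: seq 1 "$1"
    do_request "$i" &
    # If MAXJOBS jobs are already running, wait for any one to finish
    # before launching the next (wait -n needs bash >= 4.3).
    while [ "$(jobs -rp | wc -l)" -ge "$MAXJOBS" ]; do
        wait -n
    done
done
wait                            # wait for the remaining jobs
echo "all done"
```

An alternative with the same effect is to export the function and drive it with `xargs -P 8` (or GNU `parallel`), which handles the worker pool for you.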
