Saturday, January 21, 2017

using xargs to parallelize command-line work

Did you know that xargs can spawn multiple processes to execute the commands it gets? It's the poor man's GNU parallel (very poor).
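For example, a quick sketch of the idea (the URLs here are placeholders): -P caps how many processes run at once, and -n says how many arguments each invocation gets.

```bash
# run at most 4 curls at a time, one URL per invocation
printf '%s\n' \
  'http://localhost/stream/a' \
  'http://localhost/stream/b' \
  'http://localhost/stream/c' \
  'http://localhost/stream/d' |
  xargs -P 4 -n 1 curl -s -o /dev/null
```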

Well, I've been dabbling quite a bit with shell scripts lately, and I've settled into a balance of using them for simple (or not so simple) tasks. It's a mix of suckless and Taco Bell programming...

Well, the problem at hand was to create many connections to some API endpoints. The endpoints would be serving data in a streaming fashion (think the Twitter firehose). My idea was to spawn several curls against a "circular" list of URLs.

I thought about wrk (because Lua), vegeta, or siege, but I wasn't sure how they would cope with 'persistent' connections, so I tried my luck with plain curl and bash.

It's funny how you can solve some problems in bash with so few lines of code. I don't have stats or anything, but the only thing I wanted, which was to generate traffic, can be done in plain, simple shell scripting.

Things I learned:

- ${variable:-"default-value"} is great
- doing arithmetic in bash is painful: $(( )) only does integers, and anything fancier means reaching for bc or awk.
- xargs can spawn multiple processes, and you can control how many, so that number acts as an upper limit on concurrency.
- while true + timeout is a very nice combo when you want to generate any kind of 'ticking' or 'movement' in the loop (see the snippet after this list).
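A small demo of those idioms; the variable name and the 10-second default are just placeholders:

```bash
#!/usr/bin/env bash
# default via parameter expansion: use $DURATION if it's set, otherwise 10 seconds
duration="${DURATION:-10}"

# timeout kills the whole loop after $duration seconds;
# the loop itself just 'ticks' once per second
timeout "$duration" bash -c '
    count=0
    while true; do
        count=$(( count + 1 ))   # integer arithmetic with $(( ))
        echo "tick $count"
        sleep 1
    done
'
```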

Here's the script:
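What follows is a minimal sketch along the lines described above (not necessarily the original), assuming the endpoints live in a urls.txt file, one per line; the concurrency and duration values are placeholders:

```bash
#!/usr/bin/env bash
# generate streaming traffic against a list of endpoints with xargs -P
#
#   URLS_FILE   : file with one URL per line (assumed: urls.txt)
#   CONCURRENCY : maximum number of curls running at once
#   DURATION    : seconds each streaming connection is allowed to live

urls_file="${URLS_FILE:-urls.txt}"
concurrency="${CONCURRENCY:-10}"
duration="${DURATION:-60}"

# Re-emit the URL list forever (the "circular" list). xargs keeps at most
# $concurrency curls alive at a time, and timeout cuts each streaming
# connection after $duration seconds so a fresh one takes its place.
while true; do
    cat "$urls_file"
done |
xargs -P "$concurrency" -n 1 \
    timeout "$duration" curl -s -N -o /dev/null
```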
