
GNU Parallel vs. xargs vs. Python tools - Sending 10000 reqs/second [closed]


I have 1,000,000 different requests (by "different" I mean each has a different query param; they are plain GET requests with no payload, and request and response sizes are on the order of KBs only, no images or anything complicated) in a text file, where each line is curl followed by the actual URL. Alternatively, each line could be just the HTTP URL, and I can pipe the response to jq; if a condition is met, I write to a log file (this is what I'm trying to achieve).
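For concreteness, here is a minimal sketch of what I do for a single request; the jq filter and the log file name below are only placeholders for my real condition and log file:

    #!/usr/bin/env bash
    # check.sh - fetch one URL, test a jq condition, log the URL if it matches.
    # The filter '.ok == true' and the file matched.log are placeholders.
    url="$1"
    if curl -s --max-time 5 "$url" | jq -e '.ok == true' > /dev/null; then
        printf '%s\n' "$url" >> matched.log
    fi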

I plan to start with 1000, then 5000, and work up to 10000 reqs/second. I would prefer to sustain approximately 10000 reqs/sec over a longer period of time (say 48-72 hours).

Which is the best approach?

  1. Use GNU parallel on the text files, where each text file has 10000 prepared HTTP URLs (see the sketch after this list)? (In the text file, is curl better than just the plain HTTP URL?)
  2. While doing the above, does each curl request run in its own shell? How do I change this to a single shell that can send requests and receive responses at 10000 requests/sec?
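For option 1, this is roughly the GNU parallel invocation I have in mind, assuming check.sh is the per-request script sketched above and urls.txt has one URL per line; the job count is only illustrative:

    # -j sets the number of concurrent jobs; 500 is just a guess, and reaching
    # 10000 reqs/sec will need tuning (and probably more than one machine).
    parallel -j 500 --joblog jobs.log ./check.sh :::: urls.txt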

I spent 2.5 days going back and forth between xargs and GNU parallel, but some answers on Stack Overflow recommend xargs over GNU parallel and vice versa. Also, is there a better tool (say a Python tool, or ab) to send 10000 reqs/sec and, for each response, write to a log file if a condition is met? Please help. Thanks.
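The closest xargs equivalent I have tried looks like this (again, the process count is just a guess):

    # -P runs up to 500 check.sh processes at a time, -n 1 passes one URL each.
    xargs -P 500 -n 1 ./check.sh < urls.txt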

PS: Can Go parallelize this work better than GNU parallel?
