How to quickly stress test a web server

09 February 12

The curl syntax allows you to specify sequences and sets of URLs. Say, for example, we want to run a quick stress test against Google; we can run:

curl -s "http://google.com?[1-1000]"

This will make 1,000 calls to Google, i.e.

http://google.com?1  
http://google.com?2  
http://google.com?3 
...
http://google.com?1000
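As well as numeric ranges in square brackets, curl also accepts comma-separated sets in braces. For instance, this (with made-up query values) expands to http://google.com?foo and http://google.com?bar:

curl -s "http://google.com?{foo,bar}"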

So, say you want to stress test your own web application and it won't complain if it's fed an extra query parameter; 10,000 calls can be made like this:

curl -s "http://yourappp.com/your_page_to_test.php?[1-10000]"

Multiple Pages

Easy: just add each page to the command line.

curl -s "http://yourapp.com/page1.php?[1-1000]" "http://yourappp.com/page2.php?[1-1000]"

Or even...

curl -s "http://yourapp.com/page{1, 2}.php?[1-1000]"

Timing

Using the time command, we can get a view of our performance:

time curl -s "http://yourapp.com/page{1,2}.php?[1-1000]"

real 0m0.606s 
user 0m0.009s 
sys 0m0.008s 
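If you'd rather have per-request timings than one figure for the whole run, curl's -w write-out variables can report them directly; a minimal sketch using the standard url_effective and time_total variables (times are in seconds):

curl -s -o /dev/null -w "%{url_effective} %{time_total}\n" "http://yourapp.com/page{1,2}.php?[1-1000]"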

Simulating simultaneous users

OK, this is great for sending a whole bunch of calls one after the other, but what about simultaneous calls? For this, we can put the curl calls in a script, e.g. my_stress_test.sh, and set them running in the background:

#!/bin/bash
FAIL=0

# Launch each curl job in the background and record its PID
curl -s "http://yourapp.com/page{1,2}.php?[1-1000]" &
pidlist="$pidlist $!"
curl -s "http://yourapp.com/page{1,2}.php?[1-1000]" &
pidlist="$pidlist $!"
curl -s "http://yourapp.com/page{1,2}.php?[1-1000]" &
pidlist="$pidlist $!"
curl -s "http://yourapp.com/page{1,2}.php?[1-1000]" &
pidlist="$pidlist $!"
curl -s "http://yourapp.com/page{1,2}.php?[1-1000]" &
pidlist="$pidlist $!"
curl -s "http://yourapp.com/page{1,2}.php?[1-1000]" &
pidlist="$pidlist $!"
curl -s "http://yourapp.com/page{1,2}.php?[1-1000]" &
pidlist="$pidlist $!"

# Wait for every job, counting any that exit non-zero
for job in $pidlist; do
  echo $job
  wait $job || let "FAIL+=1"
done

if [ "$FAIL" == "0" ]; then 
  echo "YAY!" 
else 
  echo "FAIL! ($FAIL)" 
fi

Make the script executable with chmod +x my_stress_test.sh, then run time ./my_stress_test.sh to see how long the whole batch takes.
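If the copy-and-paste repetition bothers you, an equivalent sketch launches the same seven background jobs from a loop; the behaviour is the same as the script above:

#!/bin/bash
FAIL=0

# Start seven identical curl jobs in the background
for i in 1 2 3 4 5 6 7; do
  curl -s "http://yourapp.com/page{1,2}.php?[1-1000]" &
  pidlist="$pidlist $!"
done

# Wait for them all, counting failures
for job in $pidlist; do
  wait $job || let "FAIL+=1"
done

echo "Jobs failed: $FAIL"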

Caveats

This does not simulate user behaviour exactly, as a real browser downloads not only the page but also all of its attached images, JavaScript files, stylesheets, etc. You could simulate this too by adding those URLs to the curl command.
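For example, if the page pulls in a stylesheet and a logo (these asset paths are invented for illustration), the same command can fetch them too:

curl -s "http://yourapp.com/page1.php?[1-1000]" "http://yourapp.com/css/style.css" "http://yourapp.com/images/logo.png"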
