Creating the scenario for the tests
This is perhaps the most difficult part, because you need to be sure you are being fair to the software you are testing. From the article I've read:
Looking closer I realized that the real bottleneck was in fact httperf. With Varnish, it was able to keep more connections open and busy at the same time, and thus hit the upper limit of concurrency.
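That finding is worth checking on your own setup: the benchmarking client can hit its own concurrency ceiling (for example, the open-file-descriptor limit, since each concurrent connection consumes one descriptor) before the server under test does. A minimal sketch, assuming httperf is installed and `example.com` stands in for your test server:

```shell
# Check the client's open-file-descriptor limit; a low limit caps
# how many concurrent connections httperf can keep open.
ulimit -n
# Raise it for this shell session before benchmarking (may need privileges).
ulimit -n 65535

# Drive a fixed request rate. If httperf's error summary shows
# "fd-unavail" errors, the client ran out of descriptors and is
# the bottleneck, not the server being tested.
httperf --server example.com --port 80 --uri /index.html \
        --num-conns 10000 --rate 500 --timeout 5
```

If the reported reply rate stays flat while you raise `--rate`, suspect the client (or the network) before crediting or blaming the server.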
As you can see, the author found a flaw that the user testing G-Wan vs Varnish had not taken into account, and I have probably made the same mistake.
As I said in my conclusion on the other article:
The best option you have is to test every candidate configuration "live" with your application and see which of them makes the most of your hardware. Also consider that if you need a lot of control over what is cached and what is not, Varnish is far more flexible than NGINX.
All the posts you may read about NGINX vs Varnish vs Memcache+Apache vs Squid, etc. will only give you an idea of which candidates to test; you should then always test the ones you pick in a live scenario.
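One way to make such a live comparison fair is to run the exact same load, from the same client, against each candidate in turn. A rough sketch, assuming (hypothetically) NGINX on port 8080 and Varnish on port 8081 in front of the same application on the same machine:

```shell
# Hypothetical setup: two caching frontends for the same backend app,
# benchmarked back-to-back with identical httperf parameters.
for port in 8080 8081; do
  echo "=== frontend on port $port ==="
  httperf --server 127.0.0.1 --port "$port" --uri / \
          --num-conns 5000 --rate 250 --num-calls 10
done
```

Compare the reply rates and error counts across several repeated runs rather than a single one, since a one-off run can be skewed by warm-up effects or background load.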