Sponsored by LoadStorm, the easy and cost-effective load testing tool for web and mobile applications.
This post focuses only on the results of the testing in the $101-200/month price bracket for WordPress Hosting.
Other Price Tier Results
$101-200/Month WordPress Hosting Products
$101-200/Month WordPress Hosting Performance Benchmarks Results
1. LoadStorm
Test 500-4000 Concurrent Users over 30 Minutes, 10 Minutes at Peak
| Company | Total Requests | Total Errors | Peak RPS | Average RPS | Peak Response Time (ms) | Average Response Time (ms) | Total Data Transferred (GB) | Peak Throughput (MB/s) | Average Throughput (MB/s) |
|---|---|---|---|---|---|---|---|---|---|
Discussion of LoadStorm Test Results
A2 Hosting [Reviews], BlueHost [Reviews], Conetix, and Pressjitsu struggled with this test. BlueHost struggled right off the bat. A2 and Conetix struggled a couple of minutes in. Pressjitsu made it about 12 minutes before it started erroring, though its load times began climbing around the 6-minute mark. They all lasted varying amounts of time, but none was ready to handle this sort of load.
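For context on how the quick-results columns relate to one another, here is a minimal sketch of deriving error rate and average RPS from the raw counts. The numbers are made up for illustration and are not actual test results:

```python
# Sketch: deriving aggregate columns in a LoadStorm-style quick-results
# table from raw counts. The figures below are illustrative, not real results.

def load_test_summary(total_requests, total_errors, duration_seconds):
    """Return error rate (%) and average requests per second."""
    error_rate = 100.0 * total_errors / total_requests
    average_rps = total_requests / duration_seconds
    return error_rate, average_rps

# A 30-minute test window with hypothetical counts:
rate, rps = load_test_summary(total_requests=900_000,
                              total_errors=4_500,
                              duration_seconds=30 * 60)
print(f"error rate: {rate:.2f}%")   # 0.50%
print(f"average RPS: {rps:.1f}")    # 500.0
```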
2. Blitz
Test 1-3000 Concurrent Users over 60 Seconds
Blitz Test Quick Results Table
| Company | Hits | Errors | Timeouts | Average Hits/Second | Average Response Time | Fastest Response | Slowest Response |
|---|---|---|---|---|---|---|---|
Discussion of Blitz Test 1 Results
This test simply checks whether the company is caching the front page and how well whatever caching system it has set up performs (generally this hits something like Varnish or Nginx).
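A rough way to tell whether a host is serving the front page from a cache like Varnish or Nginx is to look at the response headers. The sketch below uses common header conventions as a heuristic; the header names and values are illustrative assumptions, since hosts vary:

```python
# Sketch: guessing whether a response came from a front-end cache by
# inspecting common cache-related headers. Header conventions vary by
# host; these checks are illustrative assumptions, not a definitive list.

def looks_cached(headers):
    """Heuristic: True if headers suggest a Varnish/Nginx cache hit."""
    h = {k.lower(): v for k, v in headers.items()}
    if "hit" in h.get("x-cache", "").lower():
        return True
    if "x-varnish" in h:                   # Varnish adds this header
        return True
    if int(h.get("age", "0") or 0) > 0:    # nonzero Age implies a shared cache
        return True
    return False

print(looks_cached({"X-Cache": "HIT", "Server": "nginx"}))   # True
print(looks_cached({"Server": "Apache", "Age": "0"}))        # False
```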
Who performed without any major issues?
Who had some minor issues?
Pressjitsu kept a flat response time, but errors started to accumulate as the test scaled up. It may have been a security measure blocking the test traffic.
Who had some major issues?
3. Uptime Monitoring
Both uptime monitoring solutions were third-party providers that offer free services; UptimeRobot's paid tier was used to monitor on a 1-minute interval. All the companies were monitored over approximately two months (May-June 2016).
Conetix had some uptime issues, recording 99.52% and 99.7% on StatusCake and UptimeRobot, respectively.
A2 had a strange discrepancy: UptimeRobot showed 100% while StatusCake recorded 99.64%.
Everyone else maintained above 99.9% on both monitors.
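To put those percentages in perspective, here is a small sketch converting an uptime percentage over a roughly two-month window into minutes of downtime:

```python
# Sketch: translating an uptime percentage into downtime minutes over a
# roughly two-month (61-day) monitoring window, as in the May-June 2016 period.

def downtime_minutes(uptime_percent, days=61):
    """Minutes of downtime implied by an uptime percentage over `days` days."""
    total_minutes = days * 24 * 60
    return total_minutes * (1 - uptime_percent / 100.0)

for pct in (99.52, 99.9, 100.0):
    print(f"{pct}% uptime -> {downtime_minutes(pct):.0f} minutes down")
# 99.52% works out to roughly 7 hours of downtime over the period,
# while 99.9% is under an hour and a half.
```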
4. WebPageTest
Every test was run with the settings: Chrome browser, 9 runs, native connection (no traffic shaping), first view only.
| Company | WPT Dulles | WPT Denver | WPT LA | WPT London | WPT Frankfurt | WPT South Africa |
|---|---|---|---|---|---|---|
| Company | WPT Singapore | WPT Shanghai | WPT Japan | WPT Sydney | WPT Brazil |
|---|---|---|---|---|---|
Everyone was pretty fast around the world without huge red flags anywhere.
Conetix had slow scores in a lot of locations, but thankfully they were the fastest in Sydney (because they are focused on the Australian market and based there).
| Company | PHP Bench [Seconds] (lower = faster) | WP Bench [Queries Per Second] (higher = faster) |
|---|---|---|
At this mid-range tier we see pretty much only VPS/dedicated and cloud/clustered solutions. LiquidWeb's VPS again got one of the fastest PHP Bench scores I've seen recorded. The VPS/dedicated servers also generally put up much faster WP Bench scores, with A2 leading the way with their dedicated server. The cloud/clustered solutions (Kinsta, Pressable, Pressidium) were around 500 and below. The only exception was Conetix, which was a VPS.
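The two benchmarks report in opposite directions: PHP Bench times a fixed workload (seconds, lower is faster), while WP Bench counts throughput (queries per second, higher is faster). A quick sketch of both measurement styles, with a stand-in workload rather than the actual benchmark code:

```python
import time

# Sketch of the two benchmark styles: fixed-work timing (like PHP Bench,
# seconds, lower = faster) and fixed-time throughput (like WP Bench,
# queries/second, higher = faster). The workload here is a stand-in,
# not the actual benchmark code.

def fixed_work_seconds(iterations=100_000):
    """Time a fixed amount of work; report elapsed seconds (lower = faster)."""
    start = time.perf_counter()
    total = 0
    for i in range(iterations):
        total += i * i
    return time.perf_counter() - start

def fixed_time_ops(window_seconds=0.1):
    """Count operations completed in a fixed window (higher = faster)."""
    deadline = time.perf_counter() + window_seconds
    ops = 0
    while time.perf_counter() < deadline:
        ops += 1
    return ops / window_seconds   # operations per second

print(f"fixed-work: {fixed_work_seconds():.4f} s")
print(f"fixed-time: {fixed_time_ops():.0f} ops/s")
```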
Top Tier WordPress Hosting Performance
Individual Host Analysis
The bright spot in this testing was the WP Bench, where this dedicated server was far faster than the competition. The raw power of a dedicated machine is nice, but without the extra caching software that the top-tier hosts were running, it unfortunately fell flat in the load tests.
Another disappointing performance in the load tests. The uptime and other tests were fine.
Overall, they didn't perform that well. Uptime wasn't on par and the load test results were disappointing. The only bright spot was they were the fastest in Australia.
(9/19/2019 Update) Conetix has issued their own statement regarding Review Signal's test, explaining why they believe this methodology doesn't accurately represent their performance and why a unique Australian perspective is required when evaluating them. I recommend reading the full details.
Kinsta continues to earn top tier status. I can't find anything to say beyond the fact they performed near perfectly, again.
LiquidWeb's lower-priced plan performed spectacularly, and their higher-end product only continued that trend. They had a bracket-leading PHP Bench, perfect uptime, and aced the load tests. LiquidWeb has easily gone from brand new to top tier WordPress hosting status this year.
Pressable continues its trend of excellent load tests, and at this tier it put everything together and earned top tier status.
Another test, another top tier performance. Not much to say beyond, excellent.
Pressjitsu did better than the other companies that didn't earn top tier status, but found themselves clearly below the top tier with some struggles in the load tests. It seems like security measures may have skewed the Blitz test, but the same can't be said about the LoadStorm test, which showed real signs of stress. It seems like a good foundation, but they need to get everything running a bit better to earn top tier recognition; hopefully next year.