
$500+/Month Enterprise WordPress Hosting Performance Benchmarks (2016)


Sponsored by LoadStorm, the easy and cost-effective load testing tool for web and mobile applications.

The full company list, product list, methodology, and notes can be found here

This post focuses only on the results of the testing in the $500+/month price bracket for WordPress Hosting.

Enterprise WordPress Hosting Introduction

This is super exciting for me: testing the ultra high end of the market. For the past three years I've focused entirely on entry-level plans, but the market has changed tremendously since I started, and there is a very real demand for Enterprise WordPress hosting. I think this is the first time that a lot of these companies have been benchmarked, especially at this scale and level. So I hope this provides new and incredibly valuable data for the minority of sites out there that really need to handle massive amounts of users.

The Enterprise testing this year had some fundamental differences from all the other testing that need to be discussed upfront. These are huge and expensive systems that are normally customized on a per-customer basis by these companies. They all offer a much more hands-on experience than hosting plans at the other end of the spectrum and charge accordingly. For that reason, I felt it was only responsible to change slightly how they were tested.

The first change is that there is no default setup, which is what I test in every other price tier. The companies were given explicit permission to customize their platform and knew what tests were coming their way. Some even ran their own load tests to make sure they were going to perform as advertised and made changes accordingly. This is what I would expect from plans charging hundreds, if not thousands, of dollars per month for large sites. So I wanted to let them perform their normal services for this tier.

Uptime monitoring was reduced for many companies in this tier. Since these plans are very expensive and consume huge amounts of resources, I didn't want my test sites eating up lots of money and resources. For companies that had other plans entered into the system, I created a composite score based on the average uptime of those plans.
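The composite score described above is just an average across a company's other monitored plans. A minimal sketch, with made-up plan figures for illustration:

```python
# Sketch of the composite uptime described above: for a company whose
# enterprise plan wasn't monitored directly, average the uptime recorded
# for its other plans in this year's testing. The figures below are
# made up for illustration.

def composite_uptime(plan_uptimes):
    """Average uptime percentage across a company's monitored plans."""
    if not plan_uptimes:
        raise ValueError("need at least one monitored plan")
    return round(sum(plan_uptimes) / len(plan_uptimes), 2)

# A hypothetical company with three other plans in the system:
print(composite_uptime([99.98, 99.99, 99.97]))  # 99.98
```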


$500+/Month Enterprise WordPress Hosting Products

[Image: $500+/month Enterprise WordPress Hosting products table]

$500+/Month Enterprise WordPress Hosting Performance Benchmarks Results

1. LoadStorm

Test 500-10,000 Concurrent Users over 30 Minutes, 10 Minutes at Peak

Company Total Requests Total Errors Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred (GB) Peak Throughput (MB/s) Average Throughput (MB/s)
Kinsta 1314178 274 1041.28 730.1 15014 340 75.7 60.75 42.06
Pagely 1388436 18 1108.3 775.24 9086 259 76.97 61.75 42.76
Pantheon 1295178 9964 1014.58 719.54 15101 786 30.86 24.18 17.15
Pressable 1538237 7255 1162.63 854.58 15099 733 29.18 21.95 16.21
Pressidium 1349118 3792 1076.52 749.51 11798 324 73.63 60.18 40.91
WordPress.com VIP 4660190 8151 3726.38 2588.99 8186 101 197.82 158.29 109.9
WPEngine 1515128 247976 1211.18 841.74 19797 281 52.1 40.34 28.94

Discussion of LoadStorm Test Results

First off, these are the biggest load tests I've run to date. I had limited resources and wanted to test a high enough number to really put some stress on these systems. 10,000 concurrent users seemed like a reasonable ceiling given those resources, and high enough to be meaningful for sites that truly get a lot of traffic.
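For reference, the load profile above works out roughly as follows, assuming a linear ramp (LoadStorm's actual ramp shape may differ): 500 users growing to 10,000 over the first 20 minutes, then 10 minutes held at peak.

```python
# Approximate concurrency at a given minute of the 30-minute test,
# assuming a linear ramp from 500 to 10,000 users over the first
# 20 minutes, then a 10-minute hold at peak. The linear shape is an
# assumption, not something LoadStorm documents for this run.
def concurrent_users(minute, start=500, peak=10_000, ramp_minutes=20):
    if minute >= ramp_minutes:
        return peak
    return start + (peak - start) * minute / ramp_minutes

print(concurrent_users(0))   # 500.0
print(concurrent_users(10))  # 5250.0
print(concurrent_users(25))  # 10000
```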

Kinsta and Pagely [Reviews] had basically flawless performances. Flat average response times, minimal errors and no spikes.

WordPress.com VIP had a nearly perfect-looking run except for a minor issue with wp-login that might be security related; it persisted for the entire test at a tiny level (0.17%). The average response time was impressively flat and, at 101ms, the fastest of any company by a good bit. They also maintained the lowest peak response time. WP VIP also loaded a lot of extra scripts that nobody else did, which made their data transfer multiple times higher than anyone else's.

Pantheon [Reviews], Pressable and Pressidium each had minor spikes but put on nearly perfect performances otherwise.

WPEngine [Reviews] ran into what looks to be a similar issue to the other tests: wp-login/admin security measures, which caused a lot of errors and make the test look bad. Their average response time was flat, but it's really hard to draw conclusions with such a high error rate (16.37%).
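The error rates quoted in this discussion are simply total errors divided by total requests, using the figures from the LoadStorm table above:

```python
# Error rate as a percentage of total requests, computed from the
# LoadStorm table above.
def error_rate(errors, requests):
    return round(100 * errors / requests, 2)

print(error_rate(247976, 1515128))  # WPEngine: 16.37
print(error_rate(8151, 4660190))    # WordPress.com VIP: 0.17
```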


2. Blitz.io

Test 1-5000 Concurrent Users over 60 seconds

Blitz Test Quick Results Table

Company Hits Errors Timeouts Average Hits/Second Average Response Time Fastest Response Slowest Response
Kinsta 135485 7 0 2258 85 83 87
Pagely 146339 0 0 2439 4 3 14
Pantheon 138607 4 27 2310 62 60 80
Pressable 129866 13 2 2164 132 131 139
Pressidium 143452 0 2 2391 26 24 35
WordPress.com VIP 146200 0 73 2437 6 3 21
WPEngine 108168 12939 1061 1803 158 6 346

Discussion of Blitz Test 1 Results

This test is just testing whether the company is caching the front page and how well whatever caching system they have set up is performing (generally this hits something like Varnish or Nginx).
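One rough way to see this kind of caching from the outside is to inspect response headers on the front page. This is only a sketch: the header names checked here (X-Cache, Age, etc.) are common for Varnish/Nginx-style full-page caches but are not exposed by every host.

```python
# Inspect a site's front-page response for cache-related headers.
# X-Cache/Age are typical of Varnish-style caches, but which headers
# (if any) appear depends entirely on the host's stack.
from urllib.request import urlopen

CACHE_HEADERS = ("X-Cache", "Age", "Cache-Control", "Server")

def pick_cache_headers(headers):
    """Filter a dict of response headers down to cache-related ones."""
    return {k: v for k, v in headers.items() if k in CACHE_HEADERS}

def front_page_cache_headers(url):
    with urlopen(url) as resp:
        return pick_cache_headers(dict(resp.headers))

# e.g. front_page_cache_headers("https://example.com/") might return
# something like {'Age': '12', 'Cache-Control': 'max-age=600', ...}
```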

Who performed without any major issues?

Kinsta, Pagely [Reviews], Pantheon, Pressable, Pressidium and WordPress.com VIP all handled 5000 concurrent hits to the front page without any issue. The largest spread in response times among all of them was a minuscule 20ms. Pagely even managed a perfect run with no errors or timeouts.

Who had some major issues?

WPEngine [Reviews] struggled with this test. Around 20 seconds into the test, there was a substantial increase in response time which continued to slowly increase for the rest of the test. The errors and timeouts started to kick in 5 seconds later at the 25 second mark and also gradually increased until the test ended.

3. Uptime Monitoring

Both uptime monitoring solutions were third-party providers that offer free services. UptimeRobot's paid tier was used, monitoring on a 1-minute interval. All the companies were monitored over approximately two months (May-June 2016).

For Enterprise testing, many of the plans were only set up for a short period of time because of the enormous cost involved. Only WordPress.com VIP and WPEngine were monitored directly. The rest are composite scores, denoted with an asterisk (*), based on the average uptime of the other plans each company entered.

Uptime Robot & StatusCake

Company StatusCake UptimeRobot
Kinsta* 99.98 100
Pagely* 99.98 99.98
Pantheon* 99.99 99.99
Pressable* 99.92 99.90
Pressidium* 99.97 99.99
WordPress.com VIP 100 100
WPEngine 100 100

* Composite uptime based on all the plans entered in 2016 testing from a company.

Every company in the enterprise tier seems capable of keeping their servers online, thankfully.

4. WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only.
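For anyone wanting to reproduce this, the same settings map onto WebPageTest's public runtest.php API. A sketch; the location label and API key below are placeholders, not the ones used in this testing.

```python
# Build a WebPageTest API request with the settings used here:
# 9 runs, first view only, native connection (the default when no
# traffic-shaping parameter is sent). Location label and API key
# are placeholders.
from urllib.parse import urlencode

def build_wpt_request(site, location, api_key):
    params = {
        "url": site,
        "runs": 9,             # 9 runs
        "fvonly": 1,           # first view only
        "location": location,  # e.g. "Dulles:Chrome"
        "f": "json",
        "k": api_key,
    }
    return "https://www.webpagetest.org/runtest.php?" + urlencode(params)
```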

Company WPT Dulles WPT Denver WPT LA WPT London WPT Frankfurt WPT South Africa
Kinsta 0.718 0.588 0.958 1.256 1.741 5.844
Pagely 0.752 0.758 0.953 1.243 2.029 9.885
Pantheon 0.809 0.563 1.02 1.284 1.826 4.882
Pressable 1.056 0.894 1.207 1.691 2.126 7.244
Pressidium 0.848 0.661 1.165 1.279 1.634 5.819
WordPress.com VIP 1.02 0.786 0.918 1.471 1.755 3.045
WPEngine 0.813 0.592 1.07 1.223 1.743 3.814
Company WPT Singapore WPT Shanghai WPT Japan WPT Sydney WPT Brazil
Kinsta 2.084 22.391 2.055 1.643 1.891
Pagely 2.455 23.148 2.203 2.117 2.153
Pantheon 2.336 22.723 1.95 1.852 2.032
Pressable 2.707 22.521 2.227 2.807 2.205
Pressidium 2.202 22.477 2.265 1.662 1.797
WordPress.com VIP 1.809 24.098 1.83 1.386 1.916
WPEngine 2.255 22.971 2.115 1.722 1.846

It's not surprising that these companies deliver content pretty quickly all around the world. What is interesting is that WordPress.com VIP was the fastest to Sydney, Japan, Singapore, South Africa, and LA. Kinsta was the fastest to Dulles and Shanghai. Pantheon was fastest to Denver. WPEngine was the fastest to London. Pressidium was the fastest to Brazil. I'm not sure how meaningful it is, but it's interesting to see the most expensive product posting the fastest load times in locations all across the world.

5. WPPerformanceTester

Company PHP Bench [Seconds] (lower=faster) WP Bench [Queries Per Second](higher=faster)
Kinsta 11.37 320.82
Pagely 9.136 249.81
Pantheon 11.322 216.31
Pressable 10.834 491.64
Pressidium 10.958 367.24
WordPress.com VIP 2.244 500.25
WPEngine 13.178 533.9

I'm not sure what WordPress.com VIP is running, but it put up the absolute fastest PHP bench score I've seen by a wide margin: roughly triple the speed of the fastest previous score, which was around 6.5 seconds. Every other company here looked to be in the normal range of 9-13 seconds.

Another interesting part of the results here is that nobody was going much faster than 500 queries per second in the WP Bench. I don't think a single one is running a local database, which put up some blazing fast speeds in the lower tiers. If you're looking to host enterprise WordPress sites, you lose that zero-network-latency performance, but you certainly gain in reliability and scalability.
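For a sense of what a queries-per-second benchmark measures, here is a minimal sketch in the same spirit. SQLite stands in for WordPress's MySQL database, and WPPerformanceTester's actual query mix differs; the point is only that each query to a remote/clustered database pays network latency, which caps throughput.

```python
# Time a fixed number of simple queries and report queries per second,
# in the spirit of the WP Bench. An in-memory SQLite database stands in
# for MySQL here; a networked database adds latency to every query,
# which is why the hosts above cluster near or below ~500 qps.
import sqlite3
import time

def queries_per_second(n=1000):
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE wp_options (name TEXT, value TEXT)")
    db.execute("INSERT INTO wp_options VALUES ('siteurl', 'https://example.com')")
    start = time.perf_counter()
    for _ in range(n):
        db.execute("SELECT value FROM wp_options WHERE name = 'siteurl'").fetchone()
    elapsed = time.perf_counter() - start
    return n / elapsed
```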

Conclusion

White glove service and hefty price tags make for some spectacular performance. It's nice to see that if you really have a site getting millions of visitors per day, there are a lot of really solid choices out there who can handle the mega WordPress sites that need Enterprise level hosting.

Top Tier WordPress Hosting Performance

[Image: Review Signal 2016 Top Tier Enterprise trophy]

Kinsta, Pagely [Reviews], Pantheon, Pressable, Pressidium and WordPress.com VIP all offer Top Tier Enterprise WordPress Hosting. None of them had any real struggles keeping their servers up or handling the 10,000 and 5,000 user load tests. If you can afford them, they all seem worthy and capable.

Individual Host Analysis

Kinsta

Kinsta had essentially perfect LoadStorm and Blitz tests. They also had no flaws in any other tests. I'm at a loss for words to praise their performance.

Pagely [Reviews]

Pagely aced it: the fewest errors on LoadStorm and no errors on Blitz. I can't find any faults with Pagely's Enterprise offering.

Pantheon [Reviews]

Pantheon really stepped it up for the Enterprise testing, going through the Blitz test effortlessly. They had some minor spikes in the LoadStorm test and their average response time started to creep upwards, but nothing worth being concerned over. Overall, a top tier performance.

Pressable

Pressable performed nearly identically to Pantheon: an excellent Blitz test, with some minor spikes and increased response times in the LoadStorm test. Their uptime was the lowest of everyone, with UptimeRobot averaging 99.90%, which has been my border for pass/fail. I gave them top tier, but they were about as close to the edge as you can get.

Pressidium

Pressidium had a nearly perfect Blitz test with only 2 timeouts, and did excellently on the LoadStorm test, with 2 very minor spikes but an otherwise nearly flat average response time. Easily a top tier performance.

WordPress.com VIP 

WordPress.com VIP was by far the most expensive plan tested, and it put on a fantastic performance. It had a near perfect Blitz test. Despite what appeared to be a security issue on the LoadStorm test, it had the fastest average response time at 101ms and moved more data than any other company by a wide margin because of the custom scripts. But that didn't seem to negatively impact their performance at all. I'm also not sure what sort of hardware they are running, but they blew my WPPerformanceTester PHP bench out of the water. Despite the highest price tag, they put on an amazing show and easily earned Top Tier Enterprise WordPress Hosting status.

WPEngine [Reviews]

Unfortunately, WPEngine was the only company in this tier not to do well. They struggled in both load tests: LoadStorm looked like it may have been security related, but in Blitz they looked like they really had trouble with the load. I believe the plan I tested cost $600/month, but the sales team wasn't willing to give me specific pricing for their enterprise tier.

$201-500/Month WordPress Hosting Performance Benchmarks (2016)


Sponsored by LoadStorm, the easy and cost-effective load testing tool for web and mobile applications.

The full company list, product list, methodology, and notes can be found here

This post focuses only on the results of the testing in the $201-500/month price bracket for WordPress Hosting.

$201-500/Month WordPress Hosting Products

[Image: $201-500/month WordPress Hosting products table]

$201-500/Month WordPress Hosting Performance Benchmarks Results

1. LoadStorm

Test 500-5000 Concurrent Users over 30 Minutes, 10 Minutes at Peak

Company Total Requests Total Errors Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred (GB) Peak Throughput (MB/s) Average Throughput (MB/s)
Kinsta 671665 47 528.38 373.15 9991 285 38.68 31.73 21.49
MediaTemple 775277 34144 616.05 430.71 15334 761 39.71 33.5 22.06
Pagely 553754 133181 456.03 307.64 16132 3333 19.32 13.94 10.73
Pantheon 629578 49212 510.78 349.77 15091 1353 33.88 28.9 18.82
Pressable 896616 12256 740.88 498.12 6362 450 37.87 33.8 21.04
Pressidium 697020 0 547.88 387.23 4894 266 38.16 31.05 21.2
PressLabs 692581 21180 547.72 384.77 15493 2109 23.02 18.45 12.79
SiteGround 640337 48537 507.98 355.74 15564 1549 30.64 24.25 17.02

Discussion of LoadStorm Test Results

Kinsta and Pressidium were clearly the two best performers in this test.

Pressable had some minor issues that looked like they may have been security measures related to wp-login.

MediaTemple [Reviews] had a spike of errors at the very end and some minor errors throughout the test that might have been security related since they didn't impact response times at all.

PressLabs had some spikes and wp-login related problems, and the server's response times slowed as the test progressed.

Pantheon [Reviews] had similar issues to PressLabs with slowing down and the largest chunk being wp-login related.

SiteGround [Reviews] started to have trouble around 12 minutes in and saw spikes, also mostly related to wp-login/admin. They also had increased and unstable response times associated with the spikes.

Pagely [Reviews] struggled the most with this test with spikes and increased response times. wp-login again was the worst offender.

What is amazing is none of these companies completely failed with 5000 real users logging in and bursting caches.

2. Blitz.io

Test 1-3000 Concurrent Users over 60 seconds

Blitz Test Quick Results Table

Company Hits Errors Timeouts Average Hits/Second Average Response Time Fastest Response Slowest Response
Kinsta 81386 3 0 1356 84 84 86
MediaTemple 44310 33581 450 739 249 189 676
Pagely 79095 1554 1153 1318 23 2 195
Pantheon 83211 2 0 1387 61 61 68
Pressable 77850 11 1 1298 132 131 135
Pressidium 85439 11 14 1424 31 25 82
PressLabs 87432 0 0 1457 8 3 13
SiteGround 82396 1 0 1373 71 71 72

Discussion of Blitz Test 1 Results

This test is just testing whether the company is caching the front page and how well whatever caching system they have set up is performing (generally this hits something like Varnish or Nginx).

Who performed without any major issues?

Kinsta, Pantheon, Pressable, Pressidium, PressLabs, and SiteGround [Reviews] all had close to no errors (and exactly none in PressLabs's case).

Who had some minor issues?

Pagely [Reviews] had a couple spikes which increased response times and errors.

Who had some major issues?

MediaTemple [Reviews] had an early spike and a big spike later. The big spike later looks like it may have partially been a security measure. But it did eventually increase response times as well.

3. Uptime Monitoring

Both uptime monitoring solutions were third-party providers that offer free services. UptimeRobot's paid tier was used, monitoring on a 1-minute interval. All the companies were monitored over approximately two months (May-June 2016).

Uptime Robot & StatusCake

Company StatusCake UptimeRobot
Kinsta 99.98 100
MediaTemple 99.96 99.97
Pagely 99.95 99.95
Pantheon 99.98 99.98
Pressable 99.88 99.9
Pressidium 99.95 99.99
PressLabs 99.99 99.98
SiteGround 100 99.99

I hate having to penalize a company for uptime, but Pressable recorded 99.88% and 99.90% uptime scores, falling short of the 99.9% I expect from every company.

Every other company did well.

4. WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only.

Company WPT Dulles WPT Denver WPT LA WPT London WPT Frankfurt WPT South Africa
Kinsta 0.77 0.545 0.947 1.151 1.707 4.466
MediaTemple 1.064 0.608 0.901 1.341 1.925 6.576
Pagely 0.658 0.651 0.947 1.144 1.691 3.868
Pantheon 0.762 0.623 1.054 1.104 1.672 4.493
Pressable 0.973 0.781 1.084 1.514 1.967 7.708
Pressidium 0.687 0.641 1.181 1.17 1.68 4.516
PressLabs 0.762 0.754 1.082 1.148 1.624 5.357
SiteGround 0.801 0.725 1.25 1.214 1.757 4.514
Company WPT Singapore WPT Shanghai WPT Japan WPT Sydney WPT Brazil
Kinsta 2.165 22.777 2.114 1.785 1.848
MediaTemple 2.164 22.061 1.811 2.071 2.118
Pagely 2.215 22.811 1.798 2.193 1.794
Pantheon 2.166 22.427 1.797 1.769 1.872
Pressable 2.426 22.233 2.124 2.945 2.135
Pressidium 2.105 22.355 2.038 1.672 1.745
PressLabs 1.643 22.048 1.581 2.358 2.092
SiteGround 2.496 22.431 2.051 3.27 2.034

Fast. Not much to really say about these results. Nobody had issues, nothing was particularly interesting here other than nobody can get into China at any price level.

5. WPPerformanceTester

Company PHP Bench [Seconds] (lower=faster) WP Bench [Queries Per Second](higher=faster)
Kinsta 11.297 321.34
MediaTemple 12.331 107.49
Pagely 9.841 194.36
Pantheon 13.836 184.81
Pressable 11.016 384.32
Pressidium 11.902 304.79
PressLabs 8.055 841.04
SiteGround 17.082 738

I'm not sure why SiteGround's PHP bench was so slow. The average WP Bench scores are also lower than in every previous tier, with PressLabs leading the way at 841. These more expensive solutions are generally trending towards cloud/clustered setups, which trade slower database throughput for scale.

Conclusion

The high end WordPress hosting market is growing and has a lot of good options. No company in this tier completely faltered during the load tests, despite the huge strain of 3000 concurrent hits to the front page and 5000 logged-in users browsing the site.

Top Tier WordPress Hosting Performance

[Image: Review Signal 2016 Top Tier $201-500/month trophy]

Kinsta and Pressidium clearly led the pack in terms of performance. They were the only two companies that handled LoadStorm without issue. They also didn't have any other issues across the other tests.

Honorable Mentions

PressLabs earned itself an honorable mention. It had some issues with the LoadStorm test but it managed to stay up and did well on all the other tests.

Individual Host Analysis

Kinsta

Overall, a splendid performance that earned them top tier WordPress hosting in the $201-500/month range. No faults in their performance at any point.

MediaTemple [Reviews]

It's nice to see Media Temple playing with the big boys and doing a respectable job. They had a little bit of trouble with the LoadStorm test and some possibly security-related issues during the Blitz test, which kept them out of the top tier. But they weren't out of place in this bracket and were by far the cheapest at $240/month.

Pagely [Reviews]

Pagely had some minor problems with the Blitz test, but the LoadStorm test really seemed to be the big problem; the 5000 users clearly taxed the server too much. Pagely reviewed the results and issued a full explanation. Their tl;dr was: "Wrong plan/instance size for this test. We price the value of our human Support and DevOps resources into the plan cost, which puts the ideal Pagely plan for this test outside the $500 cap. If the customer does not utilize the full range of services we provide they are essentially overpaying for AWS instances that in this case were undersized and not tuned for the test."

Pantheon [Reviews]

Pantheon did well everywhere but LoadStorm which was a common theme for this bracket. They didn't fail, but they certainly were being taxed with increased load times and error rates.

Pressable

Pressable could have earned an honorable mention if it weren't for some uptime issues; they found themselves just below my 99.9% expectation. They handled Blitz without issue, and LoadStorm looked pretty good except for what I imagine were security-related issues around wp-login.

Pressidium

I'm running out of positive adjectives to say how well Pressidium has done this year. A perfect LoadStorm test with zero errors, the lowest peak response time and lowest average response time. Followed up by a near perfect Blitz test. Top tier for sure.

PressLabs

PressLabs was the only company to earn an honorable mention. They had a few issues in the LoadStorm test, related to wp-login of course, but otherwise put on an excellent performance.

SiteGround [Reviews]

In an odd twist of fate, I accidentally ran the same Blitz test on their lower priced cloud platform, and it did better than the dedicated server. Shared infrastructure can often have far more powerful hardware backing it than a dedicated machine, and that's one of the interesting results: for large bursts, it may work better. Overall, this plan did pretty well, but LoadStorm clearly overloaded the server a bit too much to earn any special recognition.

$101-200/Month WordPress Hosting Performance Benchmarks (2016)


Sponsored by LoadStorm, the easy and cost-effective load testing tool for web and mobile applications.

The full company list, product list, methodology, and notes can be found here

This post focuses only on the results of the testing in the $101-200/month price bracket for WordPress Hosting.

$101-200/Month WordPress Hosting Products

[Image: $101-200/month WordPress Hosting products table]

$101-200/Month WordPress Hosting Performance Benchmarks Results

1. LoadStorm

Test 500-4000 Concurrent Users over 30 Minutes, 10 Minutes at Peak

Company Total Requests Total Errors Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred (GB) Peak Throughput (MB/s) Average Throughput (MB/s)
A2 363070 163790 264.15 201.71 15443 6857 11.75 13.88 6.528
BlueHost 322139 166336 267.9 178.97 20999 9268 9.42 7.09 5.24
Conetix 341733 145110 243.3 189.85 16202 7347 11.74 13.87 6.52
Kinsta 546252 0 425.67 303.47 9078 286 31.47 24.95 17.48
LiquidWeb 635893 76 490.78 353.27 15097 360 31.3 25.19 17.39
Pressable 724499 1090 562.12 402.5 15024 447 30.91 26.07 17.17
Pressidium 563624 0 435.43 313.12 3561 272 30.82 24.44 17.12
Pressjitsu 434368 41339 339.37 241.32 15605 3173 22.5 18.67 12.5

Discussion of LoadStorm Test Results

Kinsta, LiquidWeb [Reviews], Pressable, and Pressidium had no problems with this test.

A2 Hosting [Reviews], BlueHost [Reviews], Conetix, and Pressjitsu struggled with this test. BlueHost struggled right off the bat. A2 and Conetix struggled a couple minutes in. Pressjitsu made it about 12 minutes before it started erroring, but its load times had been increasing from around the 6-minute mark. They all lasted varying amounts of time, but none were ready to handle this sort of load.

2. Blitz.io

Test 1-3000 Concurrent Users over 60 seconds

Blitz Test Quick Results Table

Company Hits Errors Timeouts Average Hits/Second Average Response Time Fastest Response Slowest Response
A2 120 43508 21784 2 518 304 733
BlueHost 28568 11753 7945 476 929 192 1889
Conetix 155 16827 13990 3 1470 872 2184
Kinsta 81397 3 0 1357 84 83 85
LiquidWeb 81393 47 10 1357 80 76 118
Pressable 77652 0 4 1294 134 133 141
Pressidium 85916 6 0 1432 27 25 31
Pressjitsu 67297 5833 0 1122 208 205 236

Discussion of Blitz Test 1 Results

This test is just testing whether the company is caching the front page and how well whatever caching system they have set up is performing (generally this hits something like Varnish or Nginx).

Who performed without any major issues?

Kinsta, LiquidWeb [Reviews], Pressable, and Pressidium all handled this test without issue, again.

Who had some minor issues?

Pressjitsu kept a flat response time but had a lot of errors start to build as the test scaled up. Might have been a security measure blocking it.

Who had some major issues?

BlueHost [Reviews] managed to last about 22 seconds before it started to be impacted by the load.

A2 Hosting and Conetix were overloaded almost immediately.

3. Uptime Monitoring

Both uptime monitoring solutions were third-party providers that offer free services. UptimeRobot's paid tier was used, monitoring on a 1-minute interval. All the companies were monitored over approximately two months (May-June 2016).

Uptime Robot & StatusCake

Company StatusCake UptimeRobot
A2 99.64 100
BlueHost 100 99.99
Conetix 99.52 99.7
Kinsta 99.98 99.99
LiquidWeb 100 100
Pressable 99.96 99.94
Pressidium 99.97 99.99
Pressjitsu 99.99 99.99

Conetix had some uptime issues, recording 99.52% and 99.7% on StatusCake and UptimeRobot respectively.

A2 had a very strange split, with UptimeRobot showing 100% and StatusCake recording only 99.64%.

Everyone else maintained above 99.9% on both monitors.

4. WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only.

Company WPT Dulles WPT Denver WPT LA WPT London WPT Frankfurt WPT South Africa
A2 0.924 0.654 1.199 1.554 1.989 5.118
BlueHost 0.969 0.588 0.988 1.684 2.006 6.23
Conetix 2.703 2.026 2.194 3.372 3.339 6.964
Kinsta 0.817 0.577 0.982 1.15 1.721 5.081
LiquidWeb 0.887 0.578 1.059 1.179 1.748 4.227
Pressable 0.969 0.738 1.135 1.493 1.95 7.669
Pressidium 0.639 0.627 1.174 1.187 1.705 5.303
Pressjitsu 0.915 0.677 0.87 1.302 1.786 6.433
Company WPT Singapore WPT Shanghai WPT Japan WPT Sydney WPT Brazil
A2 2.618 22.224 2.114 2.592 2.162
BlueHost 2.247 22.406 1.937 1.755 2.22
Conetix 3.092 22.465 2.818 1.493 3.448
Kinsta 2.054 22.743 2.064 1.704 2.345
LiquidWeb 2.215 22.378 1.983 1.977 1.823
Pressable 2.476 22.395 2.146 2.879 2.479
Pressidium 2.08 22.461 2.053 1.893 1.803
Pressjitsu 2.172 22.317 1.701 1.871 2.19

Everyone was pretty fast around the world without huge red flags anywhere.

Conetix had slow scores to a lot of locations, but thankfully they were the fastest in Sydney (because they are focused on the Australian market and based there).

5. WPPerformanceTester

Company PHP Bench [Seconds] (lower=faster) WP Bench [Queries Per Second](higher=faster)
A2 9.336 1440.92
BlueHost 12.276 956.94
Conetix 12.019 418.76
Kinsta 11.458 330.58
LiquidWeb 7.122 1102.54
Pressable 10.788 514.13
Pressidium 10.739 281.14
Pressjitsu 12.3 574.38

At this mid-range tier we see pretty much only VPS/dedicated and cloud/clustered solutions. LiquidWeb's VPS again got one of the fastest PHP Bench scores I've seen recorded. The VPS/dedicated plans also generally put up much faster WP Bench scores, with A2 leading the way on their dedicated server. The cloud/clustered solutions (Kinsta, Pressable, Pressidium) were around 500 and below. The only exception was Conetix, which is a VPS.

Conclusion

Top Tier WordPress Hosting Performance

[Image: Review Signal 2016 Top Tier $101-200/month trophy]

Kinsta, LiquidWeb [Reviews], Pressable, and Pressidium were the top tier in the $101-200/month price range.

Individual Host Analysis

A2 Hosting [Reviews]

The bright spot was the WP Bench, where this dedicated server was way faster than the competition. The raw power of a dedicated machine is nice, but without the extra caching software the top tier hosts were running, it unfortunately fell flat in the load tests.

BlueHost [Reviews]

Another disappointing performance in the load tests. The uptime and other tests were fine.

Conetix

Overall, they didn't perform that well. Uptime wasn't on par and the load test results were disappointing. The only bright spot was they were the fastest in Australia.

(9/19/2019 Update) Conetix have issued their own statement regarding Review Signal's test, explaining why they believe this methodology doesn't accurately represent their performance and why a unique Australian perspective is required when evaluating them. I recommend reading the full details.

Kinsta

Kinsta continues to earn top tier status. I can't find anything to say beyond the fact they performed near perfectly, again.

LiquidWeb [Reviews]

LiquidWeb's lower priced plan performed spectacularly, and their higher end product only continued that trend. They had a bracket-leading PHP bench, perfect uptime, and aced the load tests. LiquidWeb has easily gone from brand new to top tier WordPress hosting status this year.

Pressable

Pressable continues its trend of excellent load tests, but at this tier they put everything together and earned themselves top tier status.

Pressidium

Another test, another top tier performance. Not much to say beyond, excellent.

Pressjitsu

Pressjitsu did better than the other companies that missed top tier status, but found themselves clearly below the top tier with some struggles in the load tests. Security measures may have skewed the Blitz test, but the same can't be said about LoadStorm, which showed real signs of stress. It seems like a good foundation; they just need to get everything running a bit better to earn top tier recognition. Hopefully next year.

$51-100/Month WordPress Hosting Performance Benchmarks (2016)


Sponsored by LoadStorm, the easy and cost-effective load testing tool for web and mobile applications.

The full company list, product list, methodology, and notes can be found here

This post focuses only on the results of the testing in the $51-100/month price bracket for WordPress Hosting.

$51-100/Month WordPress Hosting Products

[Image: $51-100/month WordPress Hosting products table]


$51-100/Month WordPress Hosting Performance Benchmarks Results

1. LoadStorm

Test 500-3000 Concurrent Users over 30 Minutes, 10 Minutes at Peak

Company Total Requests Total Errors Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred (GB) Peak Throughput (MB/s) Average Throughput (MB/s)
BlueHost 322139 166336 267.9 178.97 20999 9268 9.425 7.086 5.236
CloudWays Amazon 306701 73421 214.07 170.39 15256 4810 13.9 10.05 7.723
CloudWays Google 267495 128912 199.23 148.61 15392 7341 8.35 6.595 4.639
Kinsta 416335 544 324.57 231.3 15059 317 24.01 19.91 13.34
LightningBase 456430 0 356.3 253.57 3909 261 23.65 19.41 13.14
LiquidWeb 520072 2745 408.3 288.93 15322 525 24.04 19.69 13.35
Media Temple 486702 8588 397.55 270.39 16001 582 25.43 23.08 14.13
Pagely 392898 1952 298.8 218.28 15178 1593 21.38 16.85 11.88
Pantheon 409962 57051 325.53 227.76 11682 762 20.74 17.97 11.52
Pressable 569095 0 441.43 316.16 3152 239 24.35 20.19 13.53
Pressidium 429538 0 335.78 238.63 3030 306 16.11 13.26 8.951
SiteGround 449038 742 352.05 249.47 11247 383 22.93 19.26 12.74

Discussion of LoadStorm Test Results

Kinsta, LightningBase, LiquidWeb [Reviews], Pressable, Pressidium and SiteGround [Reviews] all handled this test without any serious issues.

MediaTemple [Reviews] had some minor issues with spikes and increasing average response times.

Pagely [Reviews] had some spikes, but more concerning were the increased response times, which averaged around 3000ms during the 10-minute peak of the test. They kept the website up and the error rate low enough (0.5%), but they were definitely struggling to keep up.

BlueHost [Reviews], CloudWays [Reviews] (Amazon + Google) and Pantheon [Reviews] all struggled with this load test. BlueHost crashed (85% error rate). CloudWays Google had 48% errors; Amazon fared better with only 24%. Pantheon had the lowest error rate at 14%, but all of them were unacceptably high, along with increased response times.

2. Blitz.io

Test 1-2000 Concurrent Users over 60 seconds

Blitz Test Quick Results Table

Company Hits Errors Timeouts Average Hits/Second Average Response Time Fastest Response Slowest Response
BlueHost 28901 714 2710 482 654 185 1562
CloudWays Amazon 55678 906 0 928 24 3 106
CloudWays Google 38278 16248 158 638 102 83 226
Kinsta 54273 7 0 905 84 83 86
LightningBase 54946 0 0 916 71 71 73
LiquidWeb 54574 0 4 910 78 77 82
Media Temple 44598 442 85 743 261 195 614
Pagely 57828 1 0 964 13 2 81
Pantheon 55499 0 0 925 61 60 64
Pressable 51781 0 0 863 135 134 136
Pressidium 57348 1 0 956 27 25 30
SiteGround 83437 0 0 1391 58 58 60

Discussion of Blitz Test 1 Results

This test simply checks whether the company is caching the front page and how well whatever caching system they have set up performs (generally this hits something like Varnish or Nginx).
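You can often spot full-page caching yourself from response headers. A rough sketch, with the caveat that header names vary by host and CDN (the ones below are common examples, not a definitive list):

```python
# Rough sketch of guessing whether a host serves the front page from a
# full-page cache, based on response headers. Header names vary by host
# and CDN; these are common examples, not a definitive list.

CACHE_HIT_HINTS = {
    "x-cache": ("hit",),
    "x-proxy-cache": ("hit",),
    "cf-cache-status": ("hit",),
}

def looks_cached(headers):
    """Guess from response headers whether a page came from a cache."""
    headers = {k.lower(): v.lower() for k, v in headers.items()}
    for name, values in CACHE_HIT_HINTS.items():
        if any(v in headers.get(name, "") for v in values):
            return True
    # A nonzero Age header usually means a shared cache served the response.
    age = headers.get("age", "0").strip()
    return age.isdigit() and int(age) > 0

print(looks_cached({"X-Cache": "HIT from varnish"}))  # True
print(looks_cached({"Content-Type": "text/html"}))    # False
```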

I also mistakenly ran an extra thousand users against SiteGround (1-3000), but since they performed perfectly, I figured why not just leave it. The chance for random network timeouts is always there; they got a perfect score, so I let them keep it. That's why their numbers look higher than everyone else's.

Who performed without any major issues?

Kinsta, LightningBase, LiquidWeb [Reviews], Pagely [Reviews], Pantheon, Pressable, Pressidium and SiteGround [Reviews] all handled this test without any serious issues.

Who had some minor issues?

MediaTemple [Reviews] had some minor issues with load starting to impact response times and some errors/timeouts at the end of the test.

CloudWays (Amazon) managed to keep the server up but started to lag around 35 seconds in with some errors at the very end.

Who had some major issues?

BlueHost [Reviews] and CloudWays (Google) both failed this test.

3. Uptime Monitoring

Both uptime monitoring solutions are third-party providers that offer free services; the UptimeRobot plan was paid for and monitored on a 1 minute interval. All the companies were monitored over approximately two months (May-June 2016).

Uptime Robot & StatusCake

Company StatusCake UptimeRobot
BlueHost 99.98 99.98
CloudWays Amazon 100 100
CloudWays Google 99.99 99.99
Kinsta 99.99 100
LightningBase 100 100
LiquidWeb 100 100
Media Temple 99.94 99.97
Pagely 100 100
Pantheon 100 100
Pressable 99.93 99.95
Pressidium 100 99.99
SiteGround 100 100

I can happily say every single company kept their servers up.

4. WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only.

Company WPT Dulles WPT Denver WPT LA WPT London WPT Frankfurt WPT South Africa
BlueHost 0.94 0.813 0.995 1.525 1.861 5.923
CloudWays Amazon 0.774 0.975 1.066 0.988 1.625 3.597
CloudWays Google 0.706 0.644 0.929 1.107 1.706 3.37
Kinsta 0.834 0.62 0.958 1.12 1.688 3.637
LightningBase 0.542 0.465 0.955 1.013 1.569 4.541
LiquidWeb 0.616 0.55 1.003 1.076 1.624 5.634
Media Temple 0.904 0.537 0.855 1.318 1.932 2.809
Pagely 0.808 0.542 1.04 1.137 1.675 5.583
Pantheon 0.856 0.508 0.955 1.051 1.704 5.628
Pressable 1.032 0.757 1.08 1.449 1.948 5.793
Pressidium 0.738 0.727 1.171 1.292 1.67 5.747
SiteGround 0.867 0.678 1.114 1.176 1.671 4.56
Company WPT Singapore WPT Shanghai WPT Japan WPT Sydney WPT Brazil
BlueHost 2.652 22.102 1.863 1.937 2.255
CloudWays Amazon 2.236 23.404 1.781 1.75 1.752
CloudWays Google 2.031 22.418 2.026 1.609 1.793
Kinsta 2.235 24.017 2.109 1.602 1.851
LightningBase 2.227 22.437 1.683 1.968 1.612
LiquidWeb 2.335 23.238 1.885 1.96 1.635
Media Temple 2.19 22.265 1.814 2.101 2.091
Pagely 2.415 23.124 1.914 2.103 1.943
Pantheon 2.093 25.209 1.781 1.975 1.804
Pressable 2.382 23.897 2.234 2.821 2.132
Pressidium 2.245 23.303 2.061 1.785 1.747
SiteGround 2.309 22.746 2.017 2.935 1.907

LightningBase put up the fastest individual score of any bracket this year in this test, with a blazingly fast 0.465 second average response in Denver. Beyond that, nothing stands out except that all these companies seemed capable of delivering content quickly pretty much everywhere in the world except Shanghai.
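For anyone wanting to summarize these rows, here is a sketch of a per-host global average. The values are copied from LightningBase's row above (in seconds), with Shanghai excluded since it was an outlier for nearly every host:

```python
# Averaging one host's WebPageTest load times across locations.
# Values are LightningBase's row from the tables above (seconds); Shanghai
# is excluded as an outlier since nearly every host was slow there.

lightningbase = {
    "Dulles": 0.542, "Denver": 0.465, "LA": 0.955, "London": 1.013,
    "Frankfurt": 1.569, "South Africa": 4.541, "Singapore": 2.227,
    "Shanghai": 22.437, "Japan": 1.683, "Sydney": 1.968, "Brazil": 1.612,
}

times = [t for loc, t in lightningbase.items() if loc != "Shanghai"]
average = sum(times) / len(times)
print(f"{average:.2f}s")  # global average excluding Shanghai
```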

5. WPPerformanceTester

Company PHP Bench [Seconds] (lower=faster) WP Bench [Queries Per Second](higher=faster)
BlueHost 11.655 713.78
CloudWays Amazon 10.993 324.99
CloudWays Google 11.192 327.33
Kinsta 11.333 318.47
LightningBase 10.537 1067.24
LiquidWeb 7.177 1084.6
Media Temple 13.9 98.85
Pagely 10.102 165.86
Pantheon 11.687 202.92
Pressable 10.952 492.61
Pressidium 10.749 240.67
SiteGround 11.522 1030.93

LiquidWeb put up one of the fastest scores on the PHP Bench at 7.177. Everyone else fell into the 10-14 range we generally see.

The WP Bench saw some slow scores from MediaTemple and Pagely, and a handful of companies breaking the 1000 barrier: LightningBase, LiquidWeb, and SiteGround. Interestingly, scores seem to trend slower as you go up in price, as more expensive plans tend to use non-local databases.
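WP Bench reports database queries per second. As an illustration of how such a figure is produced (the actual plugin runs PHP queries against the site's MySQL database; this sketch substitutes an in-memory SQLite database just to show the shape of the measurement):

```python
# Illustrative analog of a queries-per-second benchmark like WP Bench.
# The real plugin runs PHP insert/select/update/delete queries against the
# site's MySQL database; this sketch uses an in-memory SQLite database
# purely to show how such a number is produced.

import sqlite3
import time

def wp_bench_style_qps(n_queries=1000):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE bench (id INTEGER PRIMARY KEY, val TEXT)")
    start = time.perf_counter()
    for i in range(n_queries):
        conn.execute("INSERT INTO bench (val) VALUES (?)", (f"row-{i}",))
        conn.execute("SELECT val FROM bench WHERE id = ?", (i + 1,))
    elapsed = time.perf_counter() - start
    conn.close()
    # Two queries per loop iteration
    return (2 * n_queries) / elapsed

print(f"{wp_bench_style_qps():.0f} queries/second")
```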

Conclusion

This is the last really crowded bracket as we go up in price, sitting right at the border between entry level plans and the more serious stuff. This is also the first tier where plans were tested more heavily than any plan last year. The results were very encouraging.

Top Tier WordPress Hosting Performance

review_signal_2016_trophy_100

Kinsta, LightningBase, LiquidWeb [Reviews], Pressable, Pressidium and SiteGround [Reviews] all earned top tier WordPress Hosting status for the $51-100/month tier.

Honorable Mentions

MediaTemple [Reviews] and Pagely [Reviews] earn honorable mentions. Both had some minor issues in the LoadStorm test, and MediaTemple also had minor issues in the Blitz test.

Individual Host Analysis

BlueHost [Reviews]

BlueHost fell short again in the load tests.

CloudWays [Reviews] (Amazon + Google)

CloudWays is always interesting because you can compare head to head performance on different cloud platforms. I would pretty confidently say that Amazon outperformed Google in this instance with similar specs (although Amazon charges more).

Kinsta

Kinsta's entry level plan put on a fantastic performance. The higher end providers are starting to appear in this price tier and demonstrating why they charge premium prices. Kinsta easily earned top tier status.

LightningBase

This is LightningBase's most expensive plan that we tested this year (although they offer higher ones), and for the third consecutive price tier (and year), they handled the tests flawlessly. A literally perfect score for LightningBase: 100% uptime on both monitors and 0 errors on all load tests. Simply perfection. Undoubtedly a top tier WordPress Host.

LiquidWeb [Reviews]

LiquidWeb is a newcomer to this testing, and this is their entry level plan. Boy did they make a positive splash. 100% uptime across the board and excellent load testing scores. They also had the fastest PHP Bench in this bracket (and third fastest of any company this year). They have a fantastic reputation here at Review Signal in our reviews section, and I can confidently say they also have a top tier WordPress Hosting product to boot.

MediaTemple [Reviews]

Media Temple earned an honorable mention, which is a step in the right direction. They had some minor problems with the load tests. No major concerns; they just need to sort out the security issues and minor performance details to be top tier again.

Pagely [Reviews]

Pagely was a bit of a disappointment. They've been in the top tier in past years but fell to an honorable mention this year. The increased LoadStorm test seemed to put some strain on the server and caused spikes and increased load times. Everything else looked very good, like previous years.

Pantheon [Reviews]

Pantheon, like Pagely, struggled with the LoadStorm test, but to a larger degree this year. It knocked them out of the top tier and didn't even earn an honorable mention in this price bracket. Everything else looked very good.

Pressable

Pressable showed up in a big way. No problems in any of the tests. Zero errors on both load tests. Easily in the top tier for this price bracket.

Pressidium

One error, nearly perfect uptime. Hard to really expect a better performance. Pressidium's entry level plan remains in the top tier for another year.

SiteGround [Reviews]

I screwed up the Blitz load test, and they got a perfect score with an extra thousand users, which is impressive. They had a small spike at the start of the LoadStorm test but otherwise put on a flawless performance, with 100% uptime on both monitors as well. SiteGround is in the top tier.

$25-50/Month WordPress Hosting Performance Benchmarks (2016)

LoadStormLogo

Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

The full company list, product list, methodology, and notes can be found here

This post focuses only on the results of the testing in the $25-50/month price bracket for WordPress Hosting.

$25-50/Month WordPress Hosting Products

review_signal_table_50

$25-50/Month WordPress Hosting Performance Benchmarks Results

1. Load Storm

Test 500-2000 Concurrent Users over 30 Minutes, 10 Minutes at Peak

Company Total Requests Total Errors Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred (GB) Peak Throughput (MB/s) Average Throughput (MB/s)
A2 193764 68658 148.28 107.65 17563 6541 7.647 11.37 4.248
CloudWays DO 196963 54589 148.48 109.42 15809 5841 8.474 7.384 4.708
CloudWays Vultr 207994 50049 144.13 115.55 16187 5339 9.439 8.398 5.244
Conetix 169625 116960 134.43 94.24 18510 8578 2.635 3.898 1.464
LightningBase 315348 1 238.4 175.19 3567 272 16.34 13.47 9.077
Pantheon 268164 866 205.5 148.98 14422 315 6.466 4.927 3.592
Pressable 394405 26 294.6 219.11 15101 226 16.4 13.32 9.111
Pressjitsu 300931 3913 228.47 167.18 11121 502 16.86 14.29 9.365
SiteGround 300999 0 232.75 167.22 10926 462 15.83 14.35 8.972
WP Land 294459 14976 235.63 163.59 15422 864 15.15 14.04 8.417
WPEngine 348796 26572 270.23 193.78 15091 311 14.95 11.38 8.307
WPOven 288369 0 217.85 160.21 5815 283 16.64 13.63 9.245

 

Discussion of Load Storm Test Results

Many companies handled this test without any sort of struggle: LightningBase, Pantheon [Reviews], Pressable, SiteGround [Reviews], and WPOven.com. In fact, SiteGround and WPOven managed to have zero errors, while LightningBase had 1. Truly impressive performances put on by these companies.

Pressjitsu struggled a little bit. There were some errors and increased response times at the start of the test. It managed to stabilize for the last 22 minutes as load increased though.

WPEngine [Reviews] and WP.land struggled a bit more than Pressjitsu, but didn't completely fall apart. Both seemed to be having issues with the wp-login page, possibly security related.

A2 Hosting [Reviews], CloudWays [Reviews] (Digital Ocean & Vultr), and Conetix did not do well during this test. High error rates and slow response times show they were not equipped to handle this type of load.

 

2. Blitz.io

Test 1-1000 Concurrent Users over 60 seconds

Blitz Test Quick Results Table

Company Hits Errors Timeouts Average Hits/Second Average Response Time Fastest Response Slowest Response
A2 51 14265 7339 1 800 411 1047
CloudWays DO 28328 13 16 472 32 25 91
CloudWays Vultr 28763 3 0 479 24 24 25
Conetix 2359 1097 6070 39 1412 763 2410
LightningBase 27460 0 0 458 72 71 72
Pantheon 27755 0 0 463 61 60 67
Pressable 25914 0 2 432 134 134 136
Pressjitsu 23902 481 0 398 205 205 206
SiteGround 26623 1 26 444 86 71 255
WP Land 28352 0 1 473 39 38 40
WPEngine 26281 69 0 438 117 114 127
WPOven 26687 0 0 445 103 101 104

Discussion of Blitz Test 1 Results

This test simply checks whether the company is caching the front page and how well whatever caching system they have set up performs (generally this hits something like Varnish or Nginx).

Who performed without any major issues?

CloudWays (Digital Ocean & Vultr), LightningBase, Pantheon, Pressable, SiteGround [Reviews], WPEngine [Reviews], WP.land, and WPOven.com all handled the blitz test without any significant issues.

Who had some minor issues?

Pressjitsu again had what seems to be security related issues. A perfect flat response time but some timeouts at the end of the test.

Who had some major issues?

A2 Hosting and Conetix both failed the Blitz test.

3. Uptime Monitoring

Both uptime monitoring solutions are third-party providers that offer free services; the UptimeRobot plan was paid for and monitored on a 1 minute interval. All the companies were monitored over approximately two months (May-June 2016).

Uptime Robot & StatusCake

Company StatusCake UptimeRobot
A2 97.91 99.35
CloudWays DO 100 100
CloudWays Vultr 99.95 99.87
Conetix 99.92 99.93
LightningBase 100 100
Pantheon 100 100
Pressable 99.91 99.92
Pressjitsu 99.78 99.65
SiteGround 99.99 100
WP Land 99.92 100
WPEngine 100 99.99
WPOven 100 100

A2 had significant downtime issues with StatusCake recording 97.91% and UptimeRobot recording 99.35% uptime. The CloudWays Vultr server had some issues with UptimeRobot recording 99.87%. Pressjitsu also had some uptime problems with StatusCake recording 99.78% and UptimeRobot 99.65%.

Everyone else was above 99.9% on both monitors, with CloudWays Digital Ocean, LightningBase, Pantheon, and WPOven all recording perfect 100%/100% scores.
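To put these percentages in absolute terms, here is a small sketch converting an uptime percentage over the monitoring window (assumed to be 61 days here) into hours of downtime:

```python
# What an uptime percentage means in absolute downtime over the roughly
# two-month monitoring window (assumed 61 days here).

def downtime_hours(uptime_percent, days=61):
    """Hours of downtime implied by an uptime percentage over `days` days."""
    return (1 - uptime_percent / 100) * days * 24

print(round(downtime_hours(99.90), 1))  # ~1.5 hours at the 99.9% cutoff
print(round(downtime_hours(97.91), 1))  # ~30.6 hours at A2's StatusCake figure
```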

4. WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only.

Company WPT Dulles WPT Denver WPT LA WPT London WPT Frankfurt WPT South Africa
A2 0.879 0.747 1.237 1.61 2.029 5.657
CloudWays DO 0.836 0.58 1.031 1.221 1.668 7.08
CloudWays Vultr 0.713 0.676 1.087 1.109 1.636 7.643
Conetix 2.328 2.078 2.242 3.845 3.497 8.69
LightningBase 0.567 0.563 1.054 1.067 1.511 4.199
Pantheon 0.86 0.583 1.024 1.259 1.649 7.625
Pressable 0.945 0.715 1.162 1.533 2.013 9.377
Pressjitsu 0.94 0.549 0.93 1.33 1.912 6.288
SiteGround 0.838 0.655 1.043 1.063 1.693 6.927
WP Land 0.816 0.622 1.002 1.189 1.693 3.307
WPEngine 0.872 0.523 0.939 1.199 1.796 4.434
WPOven 0.85 0.534 1.093 1.452 1.79 4.844
Company WPT Singapore WPT Shanghai WPT Japan WPT Sydney WPT Brazil
A2 2.956 22.788 2.231 2.609 2.342
CloudWays DO 2.123 22.42 1.898 1.713 1.755
CloudWays Vultr 2.223 22.573 1.832 1.7 1.797
Conetix 2.027 23.425 2.63 1.308 3.56
LightningBase 2.041 23.977 1.717 1.848 1.667
Pantheon 2.194 22.605 1.769 1.661 1.784
Pressable 2.451 22.258 2.194 3.079 2.049
Pressjitsu 2.046 22.352 1.73 1.416 2.055
SiteGround 2.245 23.087 1.806 2.27 1.855
WP Land 2.157 22.428 1.872 1.658 1.784
WPEngine 2.121 24.584 1.87 2.051 1.863
WPOven 2.089 2.82 1.796 1.712 1.859

What I learned was getting traffic into China is terrible. Nobody really did well on the Shanghai location except WPOven which somehow didn't get the delay that every other company experienced. South Africa is also really slow. Most servers were US based but were delivering content to most corners of the world in about 2 seconds or less which is impressive.

Conetix is an Australian-focused company, and they delivered to Sydney faster than anyone, confirming that the geographic advantage is real. Beyond the Australian market, their connectivity seemed slower to just about every other location. Australia has notoriously bad connectivity, though, so I can see the advantage of having a company specializing in the local market.

I wish I could compare averages against last year, but one of the testing locations (Miami) was removed and I ran a global test instead, because that was something people wanted to see.

The US connectivity is very fast though, with everyone delivering to Dulles(VA) and Denver (CO) in under a second (minus the Australian server) with LA at about one second exactly for everyone.

5. WPPerformanceTester

Company PHP Bench [Seconds] (lower=faster) WP Bench [Queries Per Second](higher=faster)
A2 18.456 592.77
CloudWays DO 15.028 353.98
CloudWays Vultr 13.145 392.62
Conetix 12.833 410.51
LightningBase 10.795 1353.18
Pantheon 12.292 189.54
Pressable 11.062 525.21
Pressjitsu 12.771 648.09
SiteGround 11.414 1109.88
WP Land 13.491 1094.09
WPEngine 13.494 406.17
WPOven 9.412 690.61

In this tier, there was a much more normalized spread on the PHP Bench, with most companies falling within the 10-14 second range we saw last year. WPOven led the pack at 9.4. A2 was the slowest at 18.456.

The WP Bench scores varied a lot, again. LightningBase had another blazingly fast score of 1353.18. SiteGround and WP.land also broke the 1000 barrier, whereas last year's fastest was 889. At the bottom of the pack was Pantheon with 189.54, which I am sure they would attribute largely to infrastructure. Anyone with a distributed/non-local SQL database will be slower by a lot. They would probably argue that's one of the trade-offs of scalability, and based on their load testing performance, it would be hard to argue against.

Conclusion

A very crowded bracket with lots of competition. This range is still pretty entry level, not the cheapest stuff like the <$25/month plans I compared, but with increased price came better performances, although two of the top tier companies in this bracket are also two of the three top tier performers in the cheapest bracket. It is nice to see some loose price-to-performance correlation in the market. Many of these plans are the entry level offerings for their respective companies.

One of the interesting things to watch was the VPSs in this range (A2, CloudWays, Pressjitsu). They were outperformed by the Shared/Cloud providers, who can presumably burst more shared resources for any given site. So for spiky sites that expect a Reddit/Slashdot effect, there may be some advantage to that sort of environment (if you can't easily scale the VPS, which some providers make quite easy). But since these are dummy sites not really stressed over the two months, there is also the potential for bad neighbors negatively impacting you during such a spike, and then you might want your own isolated VPS. I can see arguments for both sides.

Without further ado, I will tell you who had the best performance, who deserved an honorable mention and then analyze each host individually. I still don't believe in ranking in any particular order, only grouping companies by how well they performed.

Top Tier WordPress Hosting Performance

review_signal_2016_trophy_50

LightningBase, Pantheon, Pressable, SiteGround [Reviews] and WPOven.com.

Honorable Mentions

Unfortunately, no company deserves honorable mention status, which I give to companies that came close but weren't quite perfect, or that looked good but had something external (generally security measures) interfere with the tests while all other indications suggested they were doing well.

The closest in this bracket would have been Pressjitsu, except they had uptime issues, and I give no leeway for uptime under 99.9%.

Individual Host Analysis

A2 Hosting [Reviews]

I try to find bright spots in a company's performance, but A2 really didn't have one in this test. If you buy a VPS, there is also no default way to install WordPress beyond old-fashioned DIY; you have to pay extra for the Softaculous installer in the admin panel.

CloudWays [Reviews] Digital Ocean / Vultr

The most interesting part of CloudWays is being able to see the same stack tested on multiple providers. It's a small sample, but it looks like Vultr marginally outperforms Digital Ocean, although Digital Ocean was more stable (again, a small sample size to compare head to head). It was nice to see CloudWays do well in the Blitz tests and keep very good uptime, especially the Digital Ocean machine, which was perfect.

Conetix

Conetix had good uptime and connectivity to Australia, their target market. They strongly recommend using W3TC, but it didn't come fully installed, and I don't test anything beyond the default configuration because it gets into too much minutiae and conflict with hosts about what could be done to improve scores. I also believe most people just stick with the defaults, based on all the user testing I've seen across various fields. Unfortunately, the load test results didn't look very good for them.

(9/19/2019 Update) Conetix have issued their own statement regarding Review Signal's test and why they believe this methodology doesn't accurately represent their performance and why a unique Australian perspective is required when evaluating them. I recommend reading the full details.

LightningBase

LightningBase put on basically a perfect performance. 100% uptime on both monitors. 0 errors on blitz, 1 error on loadstorm. Unequivocally, a top tier performance.

Pantheon [Reviews]

Pantheon showed up again, in a good way. They earned themselves a top tier performance accolade. They had a few errors at the start of the LoadStorm test, but beyond that aced everything.

Pressable

Pressable is back for the first time since my first testing in 2013, with new ownership (WordPress.com). They had some good tech back then, but it wasn't perfect and had some minor issues. I can happily say that has changed, as they delivered a top tier performance this year with no issues in any test.

Pressjitsu

Pressjitsu felt like 2013 Pressable: the foundations of a really good company that just didn't get it all put together. The biggest problem was the sub-99.9% uptime. What appeared to be security measures marred the Blitz test, and they had some errors at the start of the LoadStorm test but managed to stabilize for the duration and put on a good showing.

SiteGround [Reviews]

SiteGround got even better this year. They jumped up from honorable mention to top tier status. Their Blitz and LoadStorm tests both improved while everything else remained at a high level. An all around fantastic performance which deserved top tier status.

WPEngine [Reviews]

WPEngine fell slightly this year. It could have been a security issue with wp-login during the LoadStorm test, but there were too many errors to give this plan the honorable mention status it earned last year. Everything else looked good though.

WP.land

WP.land, like WPEngine, had too many problems during the LoadStorm test to earn honorable mention status. Everything else looked very good for them, and it's great to see a strong new entrant.

WPOven.com

The knock on WPOven last year was their LoadStorm test. Everything else was perfect. I'm glad they maintained everything else, but this time they managed a perfect LoadStorm test to boot. A huge improvement and a very well deserved entry in the top tier of WordPress Hosts in the $25-50 range.

Under $25/Month WordPress Hosting Performance Benchmarks (2016)

LoadStormLogo

Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

The full company list, product list, methodology, and notes can be found here

This post focuses only on the results of the testing in the <$25/month price bracket for WordPress Hosting.

 

<$25/Month WordPress Hosting Products

review_signal_table_25_updated

 

<$25/Month WordPress Hosting Performance Benchmarks Results

1. Load Storm

Test 500-2000 Concurrent Users over 30 Minutes, 10 Minutes at Peak

Company Total Requests Total Errors Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred (GB) Peak Throughput (MB/s) Average Throughput (MB/s)
A2 310069 203981 249.08 172.26 15138 549 4.639 8.853 2.577
BlueHost 181995 153234 147.47 101.11 16000 7634 1.066 3.677 0.592
DreamHost 295685 43 224.1 164.27 15063 339 16.06 13.5 8.922
FlyWheel 265618 81491 205.22 147.57 15101 1154 11.5 9.361 6.391
GoDaddy 311172 1363 238.68 172.87 10100 340 16.07 13.31 8.927
Hosting Agency (DE) 182424 117939 132.65 101.35 15991 6743 3.823 10.53 2.124
IWW 272657 84 217.92 151.48 10096 266 14.93 13.77 8.293
LightningBase 314439 5 238.68 174.69 8989 255 16.24 13.24 9.023
Media Temple 327662 1466 258.45 182.03 10628 381 12.55 10.54 6.972
Pressed 289318 61 214.05 160.73 15029 266 16.25 13.01 9.03
SiteGround 301722 1 230.45 167.62 9374 447 15.9 13.76 8.833
TrafficPlanetHosting 289335 476 217.63 160.74 15216 570 16.15 14.08 8.974
WP Land 293166 11596 228.4 162.87 15608 644 15.47 13.3 8.594

Discussion of Load Storm Test Results

The companies that clearly didn't struggle at all with LoadStorm were DreamHost [Reviews], Incendia Web Works (IWW), LightningBase, Pressed, and SiteGround [Reviews]. GoDaddy [Reviews], MediaTemple [Reviews] and Traffic Planet Hosting had minor spikes at the start, but they seem nearly inconsequential in the grand scheme of the test.

WP.land seemed to have some security measures which struggled with wp-login being hit so frequently.

A2 Hosting [Reviews], BlueHost [Reviews], FlyWheel [Reviews] and Hosting Agency did not do well on this test. FlyWheel explicitly stated this was too much load for that size plan and recommended upgrading if this was the expected load.

2. Blitz.io

Test 1-1000 Concurrent Users over 60 seconds

Blitz Test Quick Results Table

Company Hits Errors Timeouts Average Hits/Second Average Response Time Fastest Response Slowest Response
A2 590 27255 390 10 92 55 167
BlueHost 23340 71 274 389 214 155 604
DreamHost 29337 0 1 489 4 3 7
FlyWheel 28530 0 0 476 28 21 146
GoDaddy 15222 11093 28 254 196 190 229
Hosting Agency (DE) 662 20862 3649 11 630 400 1556
IWW 28786 9 0 480 23 21 24
LightningBase 27488 0 0 458 71 71 72
Media Temple 15255 11260 5 254 200 188 318
Pressed 26228 0 0 437 80 5 389
SiteGround 26055 1 21 434 100 72 346
TrafficPlanetHosting 1018 8344 9718 17 266 102 843
WP Land 28344 0 0 472 39 38 39

Discussion of Blitz Test 1 Results

This test simply checks whether the company is caching the front page and how well whatever caching system they have set up performs (generally this hits something like Varnish or Nginx).

Who performed without any major issues?

DreamHost, IWW, LightningBase, SiteGround, WP Land all handled the test without any issues.

Who had some minor issues?

BlueHost had a couple spikes during the test which caused some errors and timeouts, but they weren't substantial.

FlyWheel had a spike at the very end of the test which caused a large increase in response times.

Pressed started to have a ramp up in response times but it never errored or timed out during the test.

Who had some major issues?

GoDaddy, MediaTemple and TrafficPlanetHosting seemed to pretty clearly hit security measures which couldn't be worked around. The response times were relatively stable, but errors shot up which is symptomatic of a security measure kicking in rather than the server being taxed. It's hard to know how they would have performed sans security measures.

A2 and Hosting Agency did not take kindly to the Blitz test and crashed almost immediately under load.
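The tell described above (errors shooting up while response times stay flat) can be expressed as a rough heuristic. The thresholds below are arbitrary illustrations, not values used in this testing:

```python
# Rough heuristic version of the diagnostic above: if response times stay
# flat while errors spike, a security/rate-limiting measure probably kicked
# in; if response times climb alongside errors, the server is likely just
# overloaded. Thresholds are arbitrary illustrations.

def diagnose(early_avg_ms, late_avg_ms, error_rate):
    if error_rate < 0.05:
        return "healthy"
    if late_avg_ms < early_avg_ms * 1.5:
        return "likely security/rate limiting"  # flat latency, high errors
    return "likely overload"                    # rising latency and errors

print(diagnose(200, 196, 0.42))   # flat latency, 42% errors
print(diagnose(500, 7600, 0.85))  # latency blew up too
print(diagnose(70, 72, 0.0))      # clean run
```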

3. Uptime Monitoring

Both uptime monitoring solutions are third-party providers that offer free services; the UptimeRobot plan was paid for and monitored on a 1 minute interval. All the companies were monitored over approximately two months (May-June 2016).

Uptime Robot & StatusCake

Company StatusCake UptimeRobot
A2 99.92 99.91
BlueHost 30.22 18.06
DreamHost 99.97 99.97
FlyWheel 99.96 99.98
GoDaddy 99.96 99.98
Hosting Agency (DE) - 100
IWW 99.73 99.88
LightningBase 99.99 100
Media Temple 99.96 99.95
Pressed 100 99.87
SiteGround 99.97 99.98
TrafficPlanetHosting 99.98 99.98
WP Land 99.92 100

BlueHost screwed up and cancelled this account mid-testing, causing the uptime to look horrific. Their other two plans, which were not cancelled, had measurements of 99.98, 99.98, 100 and 99.99 uptime. I'm upset that it happened, there was a struggle to restore the account, and I have to take credit away for this type of screw up. But they were able to keep the other servers up with near perfect uptime, which I think should be stated here as well.

Hosting Agency for some reason couldn't be monitored by StatusCake (an http/2 issue StatusCake still hasn't fixed after nearly 9 months; UptimeRobot fixed the same issue within 24 hours when I notified them). But they had 100% on UptimeRobot, so it looks good.

IWW had a bunch of short outages and one longer one (2hr 33m) which brought its uptime down.

Pressed had a 1hr 51m downtime (502 error) recorded by UptimeRobot that StatusCake never picked up. I'm not sure what to make of that; it might be an issue with UptimeRobot's servers connecting properly, since StatusCake never noticed anything over an interval that long.

Everyone else had above 99.9% uptime.

4. WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only.

Company WPT Dulles WPT Denver WPT LA WPT London WPT Frankfurt WPT South Africa
A2 0.819 0.638 1.109 1.181 1.687 5.054
BlueHost 0.902 0.521 0.878 1.532 1.874 3.483
DreamHost 0.769 0.777 1.444 1.107 1.64 4.33
FlyWheel 0.74 0.722 1.077 1.082 1.649 5.241
GoDaddy 0.939 0.728 0.834 1.376 1.992 6.909
Hosting Agency (DE) 1.299 1.258 2.17 0.985 1.55 4.905
IWW 0.544 0.658 0.864 0.929 1.416 4.105
LightningBase 0.62 0.598 1.078 0.95 1.471 5.764
Media Temple 0.86 0.667 0.811 1.313 1.945 4.645
Pressed 0.773 0.902 1.276 1.176 1.691 4.845
SiteGround 0.741 0.64 1.048 1.06 1.721 4.94
TrafficPlanetHosting 0.793 0.562 1.26 1.212 1.723 3.522
WP Land 0.719 0.689 1.154 1.099 1.709 4.8

 

Company WPT Singapore WPT Shanghai WPT Japan WPT Sydney WPT Brazil
A2 2.244 22.287 1.974 2.003 1.895
BlueHost 2.255 22.728 1.809 1.467 2.274
DreamHost 1.93 22.186 2.028 1.954 1.747
FlyWheel 1.765 12.549 1.845 1.816 1.758
GoDaddy 2.173 22.373 1.826 1.959 2.103
Hosting Agency (DE) 2.311 22.406 2.651 2.772 2.596
IWW 1.98 22.547 1.615 1.96 1.535
LightningBase 1.999 19.731 1.708 1.913 1.661
Media Temple 2.113 22.141 1.802 1.959 2.135
Pressed 2.233 23.691 1.997 2.037 1.894
SiteGround 2.131 22.718 1.843 2.079 1.788
TrafficPlanetHosting 2.081 22.74 1.872 1.595 1.816
WP Land 2.25 22.305 1.852 1.959 1.752

What I learned was getting traffic into China is terrible. Nobody really did well on the Shanghai location. South Africa is also really slow. Most servers were US based but were delivering content to most corners of the world in about 2 seconds or less, which is impressive. Hosting Agency, based in Germany, was a bit disappointing: relatively slow to the US, and not even the fastest to London or Frankfurt. LightningBase and IWW beat the German company by a large margin in the US, and even to Europe, which reinforces that geographic location isn't everything in terms of speed.

I wish I could compare averages against last year, but one of the testing locations (Miami) was removed and I ran a global test instead, because that was something people wanted to see.

5. WPPerformanceTester

Company PHP Bench [Seconds] (lower=faster) WP Bench [Queries Per Second](higher=faster)
A2 12.626 570.78
BlueHost 13.089 1083.42
DreamHost 17.104 446.23
FlyWheel 11.761 387.3
GoDaddy 13.804 278.47
Hosting Agency (DE) 6.501 45.28
IWW 7.637 1869.16
LightningBase 10 1315.79
Media Temple 12.241 339.79
Pressed 11.036 217.2
SiteGround 11.497 733.14
TrafficPlanetHosting 8.666 918.27
WP Land 14.485 684.93

What was enormously interesting about the WPPerformanceTester results this year was the much larger spread and faster results. Last year, almost everyone was around 10-14 seconds for the PHP Bench, with the outliers of PressLabs at 8.9 and DreamHost at 27. DreamHost again has the dubious honor of the slowest PHP Bench, but it improved by a whopping 10 seconds, down to 17. The fastest was Hosting Agency with 6.5, more than 2 full seconds faster than last year's fastest. IWW and TrafficPlanetHosting also managed sub-10 second times.

Last year's fastest WP Bench was 889 queries per second. That was blown away by this year's testing, with IWW leading the group at more than double that speed (1869). BlueHost, LightningBase and TrafficPlanetHosting all managed to beat last year's fastest benchmark as well. Unfortunately, Hosting Agency's incredibly fast PHP Bench is somewhat cancelled out by their WP Bench score, the slowest this year and slower than last year's slowest. It should be noted that transaction speed isn't always a great measure on distributed/clustered/cloud systems that may be running databases on different machines, but at the entry level that's less of an issue. Generally, the incredibly fast scores you see come from local databases with no network latency overhead.
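The point about non-local databases can be made concrete with back-of-the-envelope arithmetic: in a sequential benchmark, every query pays the network round trip on top of its execution time, which caps the achievable queries per second. The numbers below are hypothetical:

```python
# Back-of-the-envelope illustration of why non-local databases score lower
# on a sequential queries-per-second benchmark: each query pays the network
# round trip on top of its execution time. Numbers are hypothetical.

def sequential_qps(query_ms, network_rtt_ms=0.0):
    """Max queries/second when each query runs back-to-back."""
    return 1000.0 / (query_ms + network_rtt_ms)

print(round(sequential_qps(0.5)))        # local socket, ~no latency: 2000
print(round(sequential_qps(0.5, 1.0)))   # 1ms RTT to a remote DB: ~667
```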

Conclusion

It is nice to get back to a real entry level analysis with a much more level playing field. Having 13 different companies to choose from in the <$25/month range is fantastic. Despite the change in this year's format, the lower end plans still outperformed the fastest competitors from last year's tests, which had plans up to ~$300/month.

Despite the hard price cap in this bracket of testing, there were still some companies that handled all the tests without any serious issue. Many more did very well but ran into minor issues.

The number of companies jumping into the space is a fantastic win for consumers. In this tier we saw A2, Pressed, WP Land, Hosting Agency, IWW and Traffic Planet Hosting all enter for the first time. They target a variety of different niches within the space, and it's a win for us, the consumers, to have more good choices and options. From a performance standpoint, you can still get amazing performance value for the money even at the lowest tier.

Without further ado, I will tell you who had the best performance, who deserved an honorable mention and then analyze each host individually. I still don't believe in ranking in any particular order, only grouping companies by how well they performed.

Top Tier WordPress Hosting Performance

review_signal_2016_trophy_25

DreamHost [Reviews], LightningBase, and SiteGround [Reviews]

All three of these companies went through the full testing without any meaningful issues.

Honorable Mentions

Pressed had an odd uptime issue and also showed some signs of server stress during the Blitz test. For a brand new company they performed admirably, but comparing their results against the three top tier companies, I'm not quite comfortable awarding them top tier status yet. Still, they put on a very good showing.

WP.land did well in every test except LoadStorm, where it had a roughly 4% error rate. It looked like a security issue with wp-login, which isn't uncommon, but there were also some spikes and delays. It may just have been security measures acting up, but that minor issue kept it out of the top tier; it was still worthy of an honorable mention from yet another newcomer to this year's testing.

GoDaddy [Reviews]/MediaTemple [Reviews]: I combine these because they run on the same technology, their results look very similar, and they experienced the same security issues. You can pretty clearly see when the security measures kick in on Blitz, and I wasn't able to work with their tech team to come up with a way to responsibly bypass them. LoadStorm had a spike at the start with wp-login issues, but it resolved quickly and the response time graph stayed flat. Their tech may be just as good as the top tier hosts', but I wasn't able to measure it accurately because of the security measures; it looks very good and at least deserves the honorable mention.

Traffic Planet Hosting is another new entrant and had similar issues to GoDaddy/MediaTemple. Security issues caused some problems on the Blitz test, though it also began to show some genuine load. Not perfect, but it did well on LoadStorm.

Individual Host Analysis

A2 Hosting [Reviews]

A2 Hosting was a new entrant to this test, and as much as I love the competition in the space, A2 fell short. Other than their uptime monitoring, which was good, they struggled in all the load tests.

BlueHost [Reviews]

BlueHost messed up my account in this test, and the uptime was terrible because of it. That alone ruined the uptime test, although as I stated in that section, the other BlueHost servers, on different accounts, all maintained excellent uptime. They did OK in the Blitz test but not in the LoadStorm test. Surprisingly, they also managed the fastest individual WebPageTest score of any host in this price range. Compared to last year, I don't see any major signs of performance improvement.

DreamHost [Reviews]

Last year DreamHost's DreamPress product almost made the top tier except for some major downtime issues. This year, they had no such downtime issues and the performance remained top notch. DreamHost earned the top tier status for the <$25/month price bracket. It appears to be an excellent product priced very competitively.

FlyWheel [Reviews]

FlyWheel only entered one product this year and it was less powerful than last year's. It struggled a bit more on the LoadStorm test but the Blitz was perfect (although for this price tier, it was a weaker test than last year's test). They explicitly stated for LoadStorm that the plan was inappropriate for that level of traffic. They can probably handle bigger sites, but if we're comparing dollars to performance, they fell short in this price bracket on that metric. But they are still rated as the most well liked company that we track at Review Signal, so they are clearly doing something right in terms of product and customer service.

GoDaddy [Reviews]

GoDaddy had a stalwart performance marred by what appeared to be security measures. They very well could have a top notch product but we couldn't work out a responsible way to bypass the security measures for the Blitz load test. LoadStorm looked pretty good, one small spike to start and steady up to 2000 users. GoDaddy earned an honorable mention status because the product didn't seem to encounter any non-artificial problems.

Incendia Web Works

IWW did a great job in both load tests. The only concern was uptime, where IWW had 99.73% and 99.88% as recorded by each service. The performance component is definitely there, but a little more consistency and we have another serious competitor in the space. The only reason they didn't earn honorable mention while Pressed did is that there were conflicting uptime reports for Pressed where one showed 100% and the other recorded sub 99.9% uptime. Two independent services showed IWW below 99.9%, so there isn't much doubt about it in my mind. Like DreamHost last year, they put on a great performance showing and I hope next year the servers are a bit more stable and I can award top tier status.

LightningBase

LightningBase continues to impress. For the last two years they've put on consistently near-perfect tests. Their Blitz result was perfect, and their LoadStorm run had only 5 errors out of 314,439 requests. Combined with 100%/99.99% from the uptime monitors, LightningBase is unquestionably in the top tier for the <$25/month WordPress hosting bracket.

MediaTemple [Reviews]

MediaTemple's results basically mirrored GoDaddy's. It would even be hard to tell the graphs apart if you removed the names. The MediaTemple/GoDaddy platform appears to be very solid, but we couldn't responsibly get past some security measures, so I couldn't award it top tier status; MT earned an honorable mention.

Pressed

Pressed earned itself an honorable mention. It had a weird uptime issue but, more importantly, it started to show some signs of load during the Blitz test, where I would expect a flat response time from a static cache test. It's a very new product and I'm sure we'll continue to see tremendous improvements as time goes on. A very good performance from possibly the newest company in this year's testing.

Hosting Agency

Hosting Agency performed as expected: it appears to have no special WordPress optimizations. If you were to install a basic LAMP stack, this is the performance I would expect out of the box. They had perfect uptime and oddly found themselves on both ends of the spectrum in WPPerformanceTester. They weren't faster to England or Germany on WebPageTest, which I suspect is because there was no special caching technology to accelerate delivery of pages despite being geographically closer. And they simply collapsed during the load tests, especially Blitz, which is essentially a static cache test (and they have no static cache). Another important note is that their entire system is in German only.

SiteGround [Reviews]

SiteGround got even better this year. They jumped up from honorable mention to top tier status. Their Blitz and LoadStorm tests both improved while everything else remained at a high level. An all around fantastic performance which deserved top tier status.

Traffic Planet Hosting

Another newcomer to this year's testing. TPH put on a good show; there seemed to be some security measures that ruined the Blitz testing, but the LoadStorm test looked very solid. They earned an honorable mention because the only issue seemed artificial. I'm less confident about the quality of the product than I am about GoDaddy/MediaTemple, but it still seemed to warrant recognition.

WP.land

WP.land was the final new entrant and they put on a fantastic showing. Everything went nearly perfectly except the LoadStorm test, which seemed to have an issue with wp-login triggering some security measures. Even so, the response rate was fairly stable and quick despite the ramp up to 2000 users. They also had a perfect Blitz test with no errors and a 1ms spread between fastest and slowest response times. WP Land earned honorable mention status because overall it was a very good performance with one small issue that may be security related.

 

WordPress Hosting Performance Benchmarks (2016)

LoadStormLogo

Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

2018 WordPress Hosting Performance Benchmarks is now live.

This is the fourth round of managed WordPress web hosting performance testing. You can see the original, the 2014 version, and the 2015 version.

Companies Tested

A2 Hosting [Reviews]
BlueHost [Reviews]
CloudWays [Reviews]
Conetix
DreamHost [Reviews]
FlyWheel [Reviews]
GoDaddy [Reviews]
Incendia Web Works
Kinsta
LightningBase
LiquidWeb [Reviews]
MediaTemple [Reviews]
Pagely [Reviews]
Pantheon [Reviews]
Pressable (Formerly ZippyKid)
Pressed.net
Pressidium
Pressjitsu
PressLabs
Hosting Agency (German)
SiteGround [Reviews]
Traffic Planet Hosting
WordPress.com VIP
WPEngine [Reviews]
WP.land
WPOven.com

Companies that didn't participate this round but did in previous rounds: WebHostingBuzz, WPPronto, Nexcess, A Small Orange [Reviews], and WebSynthesis [Reviews].

Every plan was donated by the company for testing purposes with the strict stipulation that it would be the same as if any normal user signed up. There is a notes section at the bottom that details the minutiae of changes made to plans at the end of this post. Nearly every single company had security issues that I had to get around, so they worked to make sure my testing went through properly. Load testing often looks like an attack and it's the only way I can do these tests.

The Products

This year is a bit different than years past where every company and plan competed against one another. When I started the price gap was from $5/month to $29/month. Last year the gap was $5.95 to $299. I was only testing entry level plans but the market has dramatically changed since I first got started. Today, there is demand at many different price points and lots of companies have gone upscale with WordPress.com VIP at the top of the price bracket starting at $5,000/month. The only logical way to break things up was by price brackets. So below you will see the brackets and which companies participated. Specific details will be included on each bracket's write up.

 

<$25/m: A2 Hosting, Bluehost, DreamHost, Flywheel, GoDaddy, Incendia Web Works, Lightning Base, Media Temple, Pressed, Hosting Agency.de, SiteGround, Traffic Planet Hosting, WP.land

$25-50/m: A2 Hosting, Conetix, Lightning Base, Pantheon, Pressable, Pressjitsu, SiteGround, WP Engine, WP.land, Cloudways (DigitalOcean), Cloudways (Vultr), WPOven

$51-100/m: LiquidWeb, Bluehost, Cloudways (AWS), Cloudways (Google), Kinsta, Lightning Base, Media Temple, Pagely, Pantheon, Pressable, Pressidium, SiteGround

$101-200/m: A2 Hosting, Bluehost, Conetix, Kinsta, Liquid Web, Pressable, Pressidium, Pressjitsu

$201-500/m: Kinsta, Media Temple, Pagely, Pantheon, Pressable, Pressidium, Presslabs, SiteGround

$500+/m: Kinsta, Pagely, Pantheon, Pressable, Pressidium, WordPress.com VIP, WP Engine

 

Methodology

The question I tried to answer is how well do these WordPress hosting services perform? I tested each company on two distinct measures of performance: peak performance and consistency. I've also included a compute and database benchmark based on a WordPress plugin.

All tests were performed on an identical WordPress dummy website with the same plugins except in cases where hosts added extra plugins. Each site was monitored for approximately two months for consistency.

1. LoadStorm

LoadStorm was kind enough to give me resources to perform load testing on their platform and multiple staff members were involved in designing and testing these WordPress hosts. I created identical scripts for each host to load a site, login to the site and browse the site. Logged in users were designed to break some of the caching and better simulate real user load. The amount of users varies by cost.

2. Blitz.io

I used Blitz again to compare against previous results. This tests the static caching of the homepage. I increased the number of users based on monthly cost this time.

3. Uptime (UptimeRobot and StatusCake)

Consistency matters. I wanted to see how well these companies performed over a longer period of time. I used two separate uptime monitoring services over the course of a month to test consistency.
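To make the uptime percentages reported below concrete, it helps to translate them into the wall-clock downtime they allow over a month. This small helper is my own illustration, not part of the testing toolchain:

```python
def monthly_downtime_minutes(uptime_pct, days=30):
    """Minutes of downtime permitted by a given uptime % over `days` days."""
    return (1 - uptime_pct / 100) * days * 24 * 60

print(monthly_downtime_minutes(99.9))   # ~43.2 minutes per month
print(monthly_downtime_minutes(99.99))  # ~4.3 minutes per month
```

In other words, the gap between a 99.9% host and a 99.99% host is roughly 39 minutes of outage every month, which is why sub-99.9% readings get flagged in these writeups.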

4. WebPageTest.org

WebPageTest with 9 runs, first view only, native connection. I tested from Dulles, Denver, Los Angeles, London, Frankfurt, South Africa, Singapore, Shanghai, Japan, Sydney, Brazil.

5. WPPerformanceTester (free plugin on WordPress.org)

I created a WordPress plugin to benchmark CPU, MySql and WordPress DB performance. The CPU/MySql benchmarks are testing the compute power. The WordPress component tests actually calling $wpdb and executing insert, select, update and delete queries.

 

Notes - Changes made to Hosting Plans

A2 - VPS Servers can't install WordPress out of the box without extra payment for Softaculous. Disabled recaptcha.

Conetix - disabled WordFence and Stream plugins.

SiteGround - fully enabled the SuperCacher plugin

GoDaddy - the 24 database connection limit is increased if you notify them of heavy load

CloudWays - disabled WordFence

WPPerformanceTester – A WordPress Plugin to Benchmark Server Performance

Everyone who read our most popular blog post, WordPress Hosting Performance Benchmarks, may have noticed a new test this year (2015) called WPPerformanceTester. It was something I built during the tests to add a new benchmark measuring the underlying performance of the servers the test websites were hosted on. It wasn't hugely meaningful because I had no real basis for comparison except the benchmarks I had just generated, so it played no role in the actual rankings and outcomes of the testing.

But the vision for it and value has slowly become more apparent. In my testing, Pagely had an unusually slow WordPress benchmark (testing WordPress database functions). It was acknowledged by their team and they have since announced a migration to a newer Amazon technology called Aurora which gave Pagely a 3-4x performance increase.

So without further ado, I'd like to announce WPPerformanceTester is now live on GitHub and licensed under the GPLv3. All problems, errors and issues should be submitted on GitHub.

What Tests Does WPPerformanceTester Run?

  • Math - 100,000 math function tests
  • String Manipulation - 100,000 string manipulation tests
  • Loops - 1,000,000 loop iterations
  • Conditionals - 1,000,000 conditional logic checks
  • MySql (connect, select, version, encode) - basic mysql functions and 1,000,000 ENCODE() iterations
  • $wpdb - 250 insert, select, update and delete operations through $wpdb
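The CPU-bound portion of the list above can be sketched as follows. This is a Python illustration of the structure only; the real plugin is PHP, and the exact functions it times differ.

```python
import math
import time

def php_bench_style():
    """Run math, string, loop, and conditional workloads and return the
    total elapsed time in seconds (lower = faster), mirroring the shape
    of a WPPerformanceTester-style CPU benchmark."""
    start = time.perf_counter()
    for i in range(1, 100_000):          # 100,000 math function calls
        math.sqrt(i); abs(-i); math.log(i)
    for i in range(100_000):             # 100,000 string manipulations
        s = f"test{i}".upper().lower()[::-1]
    total = 0
    for i in range(1_000_000):           # 1,000,000 loop + conditional checks
        if i % 2 == 0:
            total += 1
    return time.perf_counter() - start

print(f"Bench time: {php_bench_style():.3f} s (lower = faster)")
```

Because the work is fixed and the result is elapsed time, faster CPUs and PHP runtimes directly produce lower scores, which is why the PHP Bench column in the tables above is read as "lower = faster".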

Industry Benchmarks

WPPerformanceTester also allows you to see how your server's performance stacks up against our industry benchmark. Our industry benchmark is the average of all submitted test results. After you run WPPerformanceTester, you will have the option to submit the benchmark with or without writing a review of your web host. Please consider submitting without a review so that our benchmark improves. If you feel inclined to write a review, please feel free. They will be published in an upcoming project that ties together many of the projects I've been working on here at Review Signal.

Please Note

WPPerformanceTester is a single node testing tool: if you're running a distributed/clustered system, it will not give a complete picture, only a view of the servers its execution touches.

Furthermore, WPPerformanceTester is not the be-all and end-all of performance testing or web host quality. Our WordPress Hosting Performance Benchmarks performs a variety of tests and that only gives insight into performance. It doesn't look at customer service quality, pricing, and other important dimensions of a good web hosting service.

WPPerformanceTester should be used as one tool in a performance toolbox. I hope it's valuable and helpful, but please keep in mind the larger picture as well. If you care about service quality, we also maintain the largest web hosting review database. My goal is to cover every aspect, and WPPerformanceTester marks a small step in that direction of being able to give consumers a complete picture of web hosting quality in the WordPress space.

WordPress Hosting Performance Benchmarks (2015)

LoadStormLogo

Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

This is the third round of managed WordPress web hosting performance testing. You can see the original here, and the November 2014 version here.

New (9/14/2016) The 2016 WordPress Hosting Performance Benchmarks are live.

New (8/20/2015) This post is also available as an Infographic.

Companies Tested

A Small Orange [Reviews]
BlueHost [Reviews]
CloudWays [Reviews]
DreamHost [Reviews]
FlyWheel [Reviews]
GoDaddy [Reviews]
Kinsta
LightningBase
MediaTemple [Reviews]
Nexcess
Pagely [Reviews]
Pantheon [Reviews]
Pressidium
PressLabs
SiteGround† [Reviews]
WebHostingBuzz
WPEngine* [Reviews]
WPOven.com
WPPronto

Note:  Pressable and WebSynthesis [Reviews] were not interested in being tested this round and were excluded. WordPress.com dropped out due to technical difficulties in testing their platform (a large multi-site install).

Every company donated an account to test on. All were the WordPress specific plans (e.g. GoDaddy's WordPress option). I checked to make sure I was on what appeared to be a normal server. The exception is WPEngine*. They wrote that I was "moved over to isolated hardware (so your tests don’t cause any issues for other customers) that is in-line with what other $29/month folks use." From my understanding, all testing was done in a shared plan environment with no actual users sharing the server. This is almost certainly the best case scenario performance-wise, and I suspect the results look better than what most users would actually get.

†Tests were performed with SiteGround's proprietary SuperCacher module turned on fully with memcached.

The Products (Click for Full-Size Image)

wordpress_hosting_2015_product_chart

Methodology

The question I tried to answer is how well do these WordPress hosting services perform? I tested each company on two distinct measures of performance: peak performance and consistency. I've also included a new and experimental compute and database benchmark. Since it is brand new, it has no bearing on the results but is included for posterity and in the hope that it will lead to another meaningful benchmark in the future.

All tests were performed on an identical WordPress dummy website with the same plugins except in cases where hosts added extra plugins. Each site was monitored for over a month for consistency.

1. LoadStorm

LoadStorm was kind enough to give me unlimited resources to perform load testing on their platform and multiple staff members were involved in designing and testing these WordPress hosts. I created identical scripts for each host to load a site, login to the site and browse the site.  I tested every company up to 2000 concurrent users. Logged in users were designed to break some of the caching and better simulate real user load.

2. Blitz.io

I used Blitz again to compare against previous results. Since the 1000 user test wasn't meaningful anymore, I did a single test for 60 seconds, scaling from 1-2000 users.

3. Uptime (UptimeRobot and StatusCake)

Consistency matters. I wanted to see how well these companies performed over a longer period of time. I used two separate uptime monitoring services over the course of a month to test consistency.

4. WebPageTest.org

"WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster." WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. I tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

5. WPPerformanceTester

I created a WordPress plugin to benchmark CPU, MySql and WordPress DB performance. It is based on a PHP benchmark script I forked (available on GitHub) and adapted to WordPress. The CPU/MySql benchmarks are testing the compute power. The WordPress component tests actually calling $wpdb and executing insert, select, update and delete queries. This plugin will be open sourced once I clean it up and make it usable for someone beyond myself.

Background Information

Before I go over the results I wanted to explain and discuss a few things. Every provider I tested had the latest version of WordPress installed. I had to ask a lot of companies to disable some security features to perform accurate load tests. Those companies were: DreamHost, Kinsta, LightningBase, Nexcess, Pagely, Pressidium, PressLabs, SiteGround, and WPEngine.

Every company that uses a VPS based platform was standardized around 2GB of memory for their plan (or equivalent) in an effort to make those results more comparable. The exception is DreamHost, whose VPS platform uses multiple scaling VPSs.

CloudWays's platform lets you deploy your WordPress stack to multiple providers: Digital Ocean, Amazon's EC2 (AWS), or Google Compute Engine. I was given a server on each platform of near comparable specs (EC2 Small 1.7GB vs Digital Ocean 2GB vs GCE 1.7GB g1 Small). So CloudWays is listed as CloudWays AWS, CloudWays DO, and CloudWays GCE to indicate which provider the stack was running on.

SiteGround contributed a shared and VPS account designated by the Shared or VPS after it.

Results

Load Storm

Since last round didn't have any real issues until 1000 users I skipped all the little tests and began with 100-1000 users. I also did the 500-2000 user test on every company instead of simply disqualifying companies. I ran these tests with an immense amount of help from Phillip Odom at LoadStorm. He spent hours with me, teaching me how to use LoadStorm more effectively, build tests and offering guidance/feedback on the tests themselves.

 Test 1. 100-1000 Concurrent Users over 30 minutes

Name of Test Total Requests Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred(GB) Peak Throughput(kB/s) Average Throughput(kB/s) Total Errors
A Small Orange 114997 90.27 61.83 1785 259 2.41 1878.14 1295.82 0
BlueHost 117569 93.62 63.21 15271 2522 5.41 4680.6 2909.16 23350
CloudWays AWS 138176 109.1 74.29 15086 397 7.15 6016.88 3844.49 44
CloudWays DO 139355 109.88 74.92 2666 321 7.21 5863.82 3876.3 0
CloudWays GCE 95114 76.22 52.84 15220 7138 3.63 3247.38 2014.92 23629
DreamHost 143259 113.57 77.02 15098 314 7.1 6136.75 3815.73 60
FlyWheel 128672 101.98 69.18 9782 571 7 6197.92 3764.6 333
GoDaddy 134827 104.6 72.49 15084 352 7.49 6368.32 4028.45 511
Kinsta 132011 102.98 70.97 3359 229 7.35 6078.95 3951.75 0
LightningBase 123522 100.73 68.62 4959 308 6.53 5883.15 3626.2 4
MediaTemple 134278 105.72 74.6 15096 363 7.45 6397.68 4140.7 640
Nexcess 131422 104.47 70.66 7430 307 7.17 6256.08 3854.27 0
Pagely 87669 70.8 47.13 7386 334 5.75 5090.11 3091.06 3
Pantheon 135560 106.42 72.88 7811 297 7.24 5908.27 3890.83 0
Pressidium 131234 103.03 70.56 7533 352 7.23 6092.36 3889.64 0
PressLabs 132931 107.43 71.47 10326 306 3.66 3264.02 1968.98 0
SiteGround Shared 137659 111.35 74.01 7480 843 6.85 5565.02 3683.04 111
SiteGround VPS 130993 103.45 70.43 15074 310 7.17 6061.82 3855.86 19
WebHostingBuzz
WPEngine 148744 117.15 79.97 15085 206 7.32 6224.06 3935.35 4
WPOven.com 112285 96.58 60.37 15199 2153 5.78 5680.23 3108.94 5594
WPPronto 120148 99.08 64.6 15098 681 5.61 4698.51 3018.33 19295

Discussion of Load Storm Test 1 Results

Most companies were ok with this test, but a few didn't do well: BlueHost, CloudWays GCE, WPOven and WPPronto. FlyWheel, GoDaddy and Media Temple had a couple spikes but nothing too concerning. I was actually able to work with someone at DreamHost this time and bypass their security features and their results look better than last time. I am also excited that we got PressLabs working this time around after the difficulties last round.

In general, the 1000 user test isn't terribly exciting, 7/21 companies got perfect scores with no errors. Another 6 didn't have more than 100 errors. Again, this test pointed out some weak candidates but really didn't do much for the upper end of the field.

Test 2. 500 - 2000 Concurrent Users over 30 Minutes

Note: Click the company name to see full test results.

Total Requests Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred(GB) Peak Throughput(kB/s) Average Throughput(kB/s) Total Errors
A Small Orange 242965 181.62 130.63 15078 411 5.09 3844.54 2737 1
BlueHost 201556 166.83 111.98 15438 8186 5.32 5229.07 2953.17 93781
CloudWays AWS 261050 195.23 145.03 15245 2076 13.13 9685.95 7296.4 11346
CloudWays DO 290470 218.17 161.37 15105 532 14.87 12003.3 8262.77 1189
CloudWays GCE 193024 147.22 107.24 15168 8291 4.72 4583.86 2622.85 93821
DreamHost 303536 232.27 163.19 15100 442 14.95 12619.67 8039.54 210
FlyWheel 253801 202.15 136.45 15218 1530 11.26 9939.17 6052.49 56387
GoDaddy 283904 221.12 152.64 15025 356 15.74 13731.97 8460.12 1432
Kinsta 276547 214.93 148.68 15025 573 15.16 13444.75 8151.37 1811
LightningBase 263967 211.12 141.92 7250 330 13.82 13061.01 7429.91 18
MediaTemple 286087 223.93 153.81 15093 355 15.83 14532.42 8512.11 1641
Nexcess 277111 207.73 148.98 15087 548 15 12313.29 8066.37 359
Pagely 181740 148.18 97.71 11824 791 11.82 10592.21 6355.09 1
Pantheon 287909 223.02 154.79 15039 276 15.28 13831.45 8217.49 3
Pressidium 278226 208.55 149.58 15044 439 15.28 12453.66 8213.63 12
PressLabs 280495 214.07 150.8 8042 328 7.66 6267.46 4118.34 0
SiteGround Shared 301291 231.93 161.98 15052 557 14.76 12799.09 7934.03 1837
SiteGround VPS 279109 209.67 150.06 12777 374 15.21 12506.79 8178.5 20
WebHostingBuzz
WPEngine 316924 241.67 170.39 7235 285 15.52 12989.23 8341.47 3
WPOven.com 213809 169.97 118.78 15268 4442 8.81 7153.5 4894.98 35292
WPPronto 258092 206.53 143.38 15246 539 10.85 9483.74 6026.26 76276

Discussion of Load Storm Test 2 Results 

The companies that previously struggled (BlueHost, CloudWays GCE, WPOven and WPPronto) didn't improve, which is to be expected. FlyWheel, which had a few spikes earlier, ran into more serious difficulties and wasn't able to withstand the load. CloudWays AWS ended up failing; their Digital Ocean machine spiked but was able to handle the load.

The signs of load were much more apparent this round, with many more spikes from many more companies. GoDaddy and Media Temple, which also had spikes in the first test, spiked again but seemed able to withstand the load. Kinsta spiked early but was stable for the duration of the test. SiteGround Shared had a steady set of small spikes but didn't fail.

Nobody matched last time's level of perfection, with its complete absence of response time spikes. Only one company managed an error-free run this time (PressLabs), but many came close: A Small Orange went from 0 errors to 1, Pantheon went from 0 to 3, and Pagely again had only 1 error.

The biggest change that occurred was WPEngine. It went from failing on the 1000 user test to having one of the better runs in the 2000 user test. I have to emphasize it was a shared plan on isolated hardware though with no competition for resources.

Blitz.io

 Test 1. 1-2000 Concurrent Users over 60 seconds

Blitz Test 1. Quick Results Table

Note: Click the company name to see full test results.

Company Hits Errors Timeouts Average Hits/Second Average Response Time Fastest Response Slowest Response
A Small Orange 51023 56 280 850 115 72 285
BlueHost 37373 475 2102 623 338 124 979
CloudWays AWS 56946 737 74 949 13 3 73
CloudWays DO 52124 1565 1499 869 35 23 87
CloudWays GCE 50463 1797 782 841 96 92 138
DreamHost 58584 1 0 978 4 4 4
FlyWheel 49960 3596 2022 833 30 24 140
GoDaddy 29611 26024 18 494 165 103 622
Kinsta 57723 1 0 962 20 20 21
LightningBase 54448 1 4 907 81 81 81
MediaTemple 29649 25356 126 494 162 104 1103
Nexcess 38616 4924 2200 644 221 70 414
Pagely 58722 1 0 979 3 2 5
Pantheon 55814 112 9 930 52 52 54
Pressidium 47567 1 9 793 233 233 234
PressLabs 58626 0 0 977 5 4 6
SiteGround Shared 49127 1123 1 819 172 171 178
SiteGround VPS 35721 75 4371 595 238 82 491
WebHostingBuzz
WPEngine 56277 827 1 938 27 21 70
WPOven.com 55027 10 2 917 69 68 71
WPPronto 54921 99 29 915 69 68 72

blitz_summary_graph

Discussion of Blitz Test 1 Results

This test is just testing whether the company is caching the front page and how well whatever caching system they have setup is performing (generally this hits something like Varnish or Nginx).

Who performed without any major issues?

DreamHost, Kinsta, LightningBase, Pagely, Pantheon, Pressidium, PressLabs, WPOven, and WPPronto all performed near perfectly. There's nothing more to say about these companies other than that they were excellent.

Who had some minor issues?

A Small Orange started showing signs of load towards the end. CloudWays AWS had a spike and started to show signs of load towards the end. SiteGround Shared had a spike at the end that ruined a very beautiful looking run otherwise. WPEngine started to show signs of load towards the end of the test.

Who had some major issues?

BlueHost, CloudWays DO, CloudWays GCE, FlyWheel, GoDaddy, MediaTemple, Nexcess, and SiteGround VPS had some major issues. The CloudWays platform pushed a ton of requests (the only companies over 50,000) but also had a lot of errors and timeouts. The rest were below 50,000 (although FlyWheel was only a hair behind) and also had a lot of errors and timeouts. SiteGround VPS might be an example of how shared resources can get better performance versus dedicated resources. GoDaddy and Media Temple have near identical performance (again, it's the same technology I believe). Both look perfect until near the end where they crash and start erroring out. Nexcess just shows load taking its toll.

Uptime Monitoring

Both uptime monitoring solutions were third party providers that offer free services. All the companies were monitored over an entire month+ (May-June 2015).

Uptime Robot

Uptime (30 Day)
A Small Orange 100
BlueHost 100
CloudWays AWS 100
CloudWays DO 100
CloudWays GCE 100
DreamHost 94.06
FlyWheel 100
GoDaddy 100
Kinsta 100
LightningBase 100
MediaTemple 100
Nexcess 100
Pagely 100
Pantheon 99.94
Pressidium 100
PressLabs 100
SiteGround Shared 100
SiteGround VPS 100
WebHostingBuzz 42.9
WPEngine 100
WPOven.com 100
WPPronto 100

At this point, I will finally address the odd elephant in the blog post. WebHostingBuzz has empty lines for all the previous tests. Why? Because their service went down and never came back online. I was told that I put an incorrect IP address for the DNS. However, that IP worked when I started and was the IP address I was originally given (hence the 42% uptime, it was online when I started testing). It took weeks to even get a response and once I corrected the IP, all it ever got was a configuration error page from the server. I've not received a response yet about this issue and have written them off as untestable.

The only other company that had any major issue was DreamHost. I'm not sure what happened, but they experienced some severe downtime while I was testing the system and returned an internal server error for 42 hours.

Every other company had 99.9% uptime or better.
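To put those percentages in perspective, an uptime figure over a fixed window converts directly into hours of downtime. A quick sketch (the 30-day window matches Uptime Robot's report; the helper name is mine, not part of either monitoring service):

```python
# Convert an uptime percentage over a monitoring window into hours of
# downtime. 94.06% over 30 days works out to roughly 42.8 hours, which
# matches the ~42 hours of internal server errors DreamHost returned.
def downtime_hours(uptime_percent, window_days=30):
    total_hours = window_days * 24
    return total_hours * (1 - uptime_percent / 100)

print(round(downtime_hours(94.06), 1))  # DreamHost: 42.8
print(round(downtime_hours(42.9), 1))   # WebHostingBuzz: 411.1
```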

StatusCake

StatusCake had a slightly longer window available from their reporting interface, so the percentages are a little bit different, which is noticeable on companies like DreamHost.

StatusCake Availability (%) Response Time (s)
A Small Orange 99.96 0.21
BlueHost 99.99 0.93
CloudWays AWS 100 0.76
CloudWays DO 100 0.47
CloudWays GCE 100 0.69
DreamHost 97.14 1.11
FlyWheel 100 1.25
GoDaddy 100 0.65
Kinsta 100 0.71
LightningBase 99.99 0.61
MediaTemple 100 1.38
Nexcess 100 0.61
Pagely 99.99 0.47
Pantheon 99.98 0.56
Pressidium 99.99 0.94
PressLabs 100 0.65
SiteGround Shared 100 0.54
SiteGround VPS 100 0.9
WebHostingBuzz 58.1 0.67
WPEngine 100 0.71
WPOven.com 100 0.73
WPPronto 100 1.19

The results mirror UptimeRobot pretty closely. WebHostingBuzz and DreamHost had issues. Everyone else is 99.9% or better.

StatusCake uses a real browser to track response time as well. Compared to last year, everything looks faster. Only two companies were sub one second average response time last year. This year, almost every company maintained sub one second response time, even the company that had servers in Europe (Pressidium).

WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only.

Company Dulles,VA Miami, FL Denver, CO Los Angeles, CA Average
A Small Orange 0.624 0.709 0.391 0.8 0.631
BlueHost 0.909 1.092 0.527 0.748 0.819
CloudWays AWS 0.627 0.748 0.694 1.031 0.775
CloudWays DO 0.605 0.751 0.635 1.075 0.7665
CloudWays GCE 0.787 0.858 0.588 1.019 0.813
DreamHost 0.415 0.648 0.522 0.919 0.626
FlyWheel 0.509 0.547 0.594 0.856 0.6265
GoDaddy 0.816 1.247 0.917 0.672 0.913
Kinsta 0.574 0.559 0.587 0.903 0.65575
LightningBase 0.544 0.656 0.5 0.616 0.579
MediaTemple 0.822 0.975 0.983 0.584 0.841
Nexcess 0.712 0.871 0.593 0.795 0.74275
Pagely 0.547 0.553 0.665 0.601 0.5915
Pantheon 0.627 0.567 0.474 0.67 0.5845
Pressidium 0.777 0.945 0.898 1.05 0.9175
PressLabs 0.542 1.257 0.723 0.732 0.8135
SiteGround Shared 0.721 0.85 0.478 0.808 0.71425
SiteGround VPS 0.667 0.651 0.515 0.657 0.6225
WebHostingBuzz (untestable)
WPEngine 0.648 0.554 0.588 0.816 0.6515
WPOven.com 0.624 0.574 0.556 0.595 0.58725
WPPronto 0.698 0.809 0.443 0.721 0.66775

In line with the StatusCake results, the WebPageTest results were shockingly fast. The first time I did this testing, only one company had a sub one second average response time. Last year about half the companies were over one second average response time. The fastest last year was LightningBase at 0.7455 seconds. This year that would be in the slower half of the results. The fastest this year was LightningBase again at 0.579 seconds. The good news for consumers appears to be that everyone is getting faster and your content will get to consumers faster than ever no matter who you choose.
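For reference, the Average column above is simply the arithmetic mean of the four location timings. A quick sketch recomputing it from the LightningBase row (the helper name is mine, not part of WebPageTest):

```python
# Recompute the table's Average column: the arithmetic mean of the four
# WebPageTest locations (Dulles, Miami, Denver, Los Angeles), in seconds.
def location_average(times):
    return sum(times) / len(times)

lightningbase = [0.544, 0.656, 0.5, 0.616]  # row from the table above
print(round(location_average(lightningbase), 3))  # 0.579
```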

WPPerformanceTester

Company PHP Version MySQL Version PHP Bench (s, lower is better) WP Bench (queries/sec, higher is better) MySQL Location
A Small Orange 5.5.24 5.5.42-MariaDB 13.441 406.67 LOCALHOST
BlueHost 5.4.28 5.5.42 12.217 738.01 LOCALHOST
CloudWays AWS 5.5.26 5.5.43 10.808 220.12 LOCALHOST
CloudWays DO 5.5.26 5.5.43 11.888 146.76 LOCALHOST
CloudWays GCE 5.5.26 5.5.43 10.617 192.2 LOCALHOST
DreamHost 5.5.26 5.1.39 27.144 298.6 REMOTE
FlyWheel 5.5.26 5.5.43 12.082 105.76 LOCALHOST
GoDaddy 5.4.16 5.5.40 11.846 365.76 REMOTE
Kinsta 5.6.7 10.0.17-MariaDB 11.198 619.58 LOCALHOST
LightningBase 5.5.24 5.5.42 12.369 768.64 LOCALHOST
MediaTemple 5.4.16 5.5.37 12.578 333.33 REMOTE
Nexcess 5.3.24 5.6.23 12.276 421.76 LOCALHOST
Pagely 5.5.22 5.6.19 10.791 79.79 REMOTE
Pantheon 5.5.24 5.5.337-MariaDB 12.669 194.86 REMOTE
Pressidium 5.5.23 5.6.22 11.551 327.76 LOCALHOST
PressLabs 5.6.1 5.5.43 8.918 527.7 REMOTE
SiteGround Shared 5.5.25 5.5.40 14.171 788.02 LOCALHOST
SiteGround VPS 5.6.99 5.5.31 11.156 350.51 LOCALHOST
WebHostingBuzz
WPEngine 5.5.9 5.6.24 10.97 597.37 LOCALHOST
WPOven.com 5.3.1 5.5.43 11.6 570.13 LOCALHOST
WPPronto 5.5.25 5.5.42 11.485 889.68 LOCALHOST

This test is of my own creation: a plugin designed to test a few aspects of performance and gather information about the system it runs on. The results here have no bearing on how I am evaluating these companies, because I don't have enough details to make them meaningful. My goal is to publish the plugin and get people to submit their own benchmarks, which would give a better picture of the real performance people experience from these companies and allow tracking changes over time. The server details it extracted may still interest many people. Most companies were running PHP 5.5 or later, but a few were not. Most were running stock MySQL, but ASO, Kinsta, and Pantheon were all running MariaDB, which many people believe performs better; considering where those three companies ended up in these tests, that's not hard to believe. Companies were also split between running MySQL on localhost (e.g. BlueHost, LightningBase, Nexcess, SiteGround, WPEngine, WPPronto) and on a remote server (DreamHost, GoDaddy, MediaTemple, Pagely, Pantheon, PressLabs).

The PHP Bench was fascinating because most companies were pretty close, with the exception of DreamHost, which took nearly twice as long to execute.

The WP Bench was all over the place. Pagely had by far the slowest result, yet on every load test and speed test it went through, it performed with near perfect scores. The test simulates 1000 $wpdb calls performing the primary MySQL operations (insert, select, update, delete). Other companies, like WPPronto and BlueHost, had outrageously fast scores but didn't perform anywhere near as well as Pagely on the more established tests.

For those reasons, I don't think this benchmark is usable yet. But I would love feedback and thoughts on it from the community and the hosting companies themselves.
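To illustrate the shape of the WP Bench described above: it times a fixed number of basic CRUD queries and reports queries per second. The plugin itself is PHP using $wpdb against MySQL; the sketch below is only an analogue in Python against an in-memory SQLite database, so its numbers are not comparable to the table:

```python
import sqlite3
import time

# Rough analogue of the WP Bench: time 1000 CRUD queries and report
# queries per second (higher is better). The real plugin runs PHP
# $wpdb calls against MySQL; in-memory SQLite here only illustrates
# the shape of the benchmark.
def wp_bench_analogue(n_queries=1000):
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE bench (id INTEGER PRIMARY KEY, val TEXT)")
    start = time.perf_counter()
    for i in range(n_queries // 4):  # 4 queries per loop iteration
        db.execute("INSERT INTO bench (id, val) VALUES (?, ?)", (i, "row"))
        db.execute("SELECT val FROM bench WHERE id = ?", (i,))
        db.execute("UPDATE bench SET val = ? WHERE id = ?", ("updated", i))
        db.execute("DELETE FROM bench WHERE id = ?", (i,))
    elapsed = time.perf_counter() - start
    return n_queries / elapsed

print(f"{wp_bench_analogue():.0f} queries/sec")
```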

Attempting VPS Parity in Testing

One substantial change to the testing methodology this round was that all VPS providers were tested with the same amount of memory (2 GB of RAM). Since the most interesting tests were the load tests, I have only graphed them below:

2gb-vps-loadstorm
2gb-vps-blitz

The LoadStorm test had a huge spread in performance. The Google Compute Engine server from CloudWays was by far the worst (an issue touched on before: it's not a true VPS with dedicated resources). FlyWheel and WPOven also struggled to keep up with the LoadStorm test. Others, like ASO, CloudWays DO, Kinsta, and SiteGround, handled it with minimal issues. On the other hand, it's very interesting to see how consistently most of the VPSs performed in the Blitz test, between 50,000 and roughly 55,000 hits. The error rates are interesting, though, because this hardware should be about as similar as possible.

The easier result to explain is the Blitz performance. Blitz tests the ability of these companies to spit back a single page from cache (most likely Varnish or Nginx), and that level of caching seems to be pretty close to parity.

The LoadStorm test shows a wide difference in performance. It is far more comprehensive, designed to bust through some of the caching and hit other parts of the stack. It really elucidates the differences in each company's ability to tune and optimize their servers from both software and hardware perspectives.

Conclusion

Every service seems to have its issues somewhere if you look hard enough. I try to avoid injecting my personal opinion and bias as much as possible. As I've added more companies to the testing, the line between which companies performed in the top tier and which did not has become blurrier. The closest test was the LoadStorm 2000 test, where multiple companies (CloudWays DO, GoDaddy, Kinsta, Media Temple, SiteGround Shared) were on the absolute edge of being top tier providers. Last time I picked an arbitrary 0.5% error rate, and these companies were all around the 0.5-0.7% mark; last year the difference after that point was quite large.

I openly admit to having personal connections with people at nearly all of these companies, so my ability to draw the line in this instance could be considered questionable. This year I therefore deferred the judgment to an independent party, Phillip Odom at LoadStorm, to evaluate the performances. Phillip is the Director of Performance Engineering at LoadStorm and has more experience with load testing and the LoadStorm product than almost anyone I know. His job was to determine whether a performance could be considered top tier. His view was that a couple of spikes early on, with stable performance otherwise, still looked top tier: a difference of 1/100th of a percent didn't seem like a big deal, especially over a 30 minute test where the issues occurred at the start, as it ramped up to 2000 concurrent users. So the companies on the edge that exhibited that behavior were considered top tier for the LoadStorm test.
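For clarity, the error-rate cutoff mentioned here is just errors as a percentage of total requests in a load test. A sketch using numbers that appear later in this post (A Small Orange's 41 errors out of 116,127 requests from the November 2014 LoadStorm test; the helper name is mine):

```python
# Error rate as a percentage of total requests, the measure behind the
# previous 0.5% top-tier cutoff.
def error_rate(errors, request_count):
    return 100.0 * errors / request_count

print(round(error_rate(41, 116127), 3))  # 0.035 -- comfortably under 0.5%
```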

I won't be ranking or outright saying any single company is the best. Some providers did exceptionally well and tended to clump together performance-wise; I will call those the top tier providers. This top tier designation relates to performance only and is based solely on the results of these tests. What each of these companies offers is different and may best suit different audiences depending on a variety of factors beyond performance, such as features, price, support, and scale (I tested mostly entry level plans, or 2 GB RAM plans for VPS providers). I will also provide a short summary and discussion of the results for each provider.

Top Tier WordPress Hosting Performance

IMG_24072015_210625

A Small Orange, Kinsta, LightningBase, Pagely, Pantheon, Pressidium, PressLabs

Each of these companies performed with little to no failures in all tests and exhibited best in class performance for WordPress hosting.

Honorable Mentions

CloudWays gets an honorable mention because its Digital Ocean (DO) instance performed quite well overall. It had some issues with the Blitz test at the end but still managed to push through over 52,000 successful hits. Its Amazon stack performed better on the Blitz test but not as well on LoadStorm. I'm not sure why the performance of identical stacks differs so much across tests between AWS and DO, but they improved dramatically since the last test and are on the cusp of becoming a top tier provider.

SiteGround's Shared hosting also gets an honorable mention. It was on the edge for both LoadStorm and Blitz. It had one spike at the end of the Blitz test which caused its error rate to jump, but the response times didn't move.

WPEngine gets an honorable mention because they performed well on most tests. They struggled and showed signs of load on the Blitz test, though, which kept them out of the top tier of providers.

Individual Host Analysis

A Small Orange [Reviews]

Another top tier performance from ASO. They didn't really struggle much with any of the tests. Although their performances were slightly below their results last time, it's hard to beat things like having zero errors during LoadStorm's test. It's become easier to launch the LEMP VPS stack which is also nice. All in all, the experience was in-line with what I would expect from a company that has one of the highest support ratings on our site.

BlueHost [Reviews]

Improved against their last results but well below par in the performance department. The pricing and performance just don't match yet.

CloudWays [Reviews]

CloudWays is always a fun company to test. They added another provider since their last test: Google Compute Engine (GCE). Their Digital Ocean and Amazon performances both went up substantially, which tells me they've made major improvements to their WordPress stack. We did run into some huge flaws on GCE, though, which aren't CloudWays's fault. We used the g1-small server on GCE and hit huge performance walls that were repeatable and inexplicable from a software standpoint. Google was contacted, and we learned that the g1 family has "fractional" CPU, meaning that a full virtual CPU is not assigned to a server. The CPU is shared with other VMs and "capped" if usage exceeds a certain amount, which is exactly what happened during the load test: the VM ran out of CPU cycles and had to wait for new ones to be assigned on the shared CPU before it could continue to serve requests. Essentially, it's not a real VPS with dedicated resources, and I was told the comparable instance would be n1-standard-1, which is 2-3x the price of the comparable AWS/DO servers. That doesn't make GCE a very attractive platform to host on if you're looking for performance and cost efficiency. CloudWays did show major improvements this round and earned themselves that honorable mention; they were by far the most improved provider between tests.

DreamHost [Reviews]

DreamPress improved their performance a lot over last round. In fact, they did fantastically well on every load test once I got the opportunity to work with their engineers to bypass the security measures. However, they failed pretty badly on the uptime metrics. I have no idea what happened, but I experienced a huge amount of downtime and ran into some very strange errors. If it weren't for the severe downtime issues, DreamPress could have been in the top tier.

FlyWheel [Reviews]

FlyWheel was excellent on every test except the final seconds of the Blitz test. Although they were just shy of the top tier, they have shown a lot of consistency in very good performance, earning an honorable mention the last two times. There are just some minor performance kinks to work out. Not bad at all for the company with the best reviews Review Signal has ever tracked. FlyWheel is definitely worth a look.

GoDaddy [Reviews]

GoDaddy's performance declined this round. It struggled with the Blitz test this time around. I'm not sure what changed, but it handled Blitz far worse than before and LoadStorm slightly worse. The performance between GoDaddy and Media Temple again looked near identical with the same failure points on Blitz. At the retail $6.99 price though, it's still a lot of bang for your buck compared to most providers who are in the $20-30/month range.

Kinsta

Kinsta had another top tier performance. There was a slight decline in performance but that could be explained by the fact we tested different products. Kinsta's test last year was a Shared plan they no longer offer. This year it was a 2GB VPS that we tested. Dedicated resources are great but sometimes shared gives you a little bit extra with good neighbors which could explain the difference. Either way, Kinsta handled all of the tests exceptionally well and earned itself top tier status.

LightningBase

LightningBase is another consistent performer on our list. Another test, another top tier rank earned. It showed ridiculous consistency in the Blitz test, where the fastest and slowest responses were both 81ms. A textbook performance at an incredible value of $9.95/month.

Media Temple [Reviews]

Media Temple and GoDaddy are still running the same platform by all indications. Media Temple offers a more premium set of features like Git, WP-CLI, Staging but the performance was identical. It declined from last time and had the same bottlenecks as GoDaddy.

Nexcess

I feel like copy and paste is the right move for Nexcess. Nexcess's performance was excellent in the Load Storm testing. However, it collapsed during the Blitz load testing. This was the same behavior as last year. It handled the Blitz test better this year, but still not well enough. Nexcess ends up looking like a middle of the pack web host instead of a top tier one because of the Blitz test, again.

Pagely [Reviews]

Is the extra money worth it? Only if you value perfection. Pagely came through again with an amazing set of results. It handled more hits than anyone in the Blitz test at a staggering 58,722 hits in 60 seconds (979 hits/second). We're approaching the theoretical maximum at this point of 1000 hits/second. And Pagely did it with 1 error and a 3ms performance difference from the fastest to slowest responses. The original managed WordPress company continues to put on dominant performance results.

Pantheon [Reviews]

Another test, another top tier performance. Just another day being one of the most respected web hosts in the space. Everyone I talk to wants to compare their company to these guys. It's obvious why, they've built a very developer/agency friendly platform that looks nothing like anything else on the market. It also performs fantastically well. They didn't perform the absolute best on any particular test but they were right in the top echelon with minimal errors on everything.

Pressidium

Pressidium was a new entrant and did exceptionally well. They are UK based and suffered slightly on some performance tests because of latency between the UK and the US testing locations used. For example, the Blitz testing showed fewer responses, but with only 10 errors and a one second spread from fastest to slowest response, that points to transatlantic throughput rather than their service struggling. Incredibly consistent performance. Despite their geographic disadvantage in this testing, they still managed sub-one second responses from all four US locations in the WebPageTest testing. Overall, a top tier performance from a competitor across the pond.

PressLabs

We finally got PressLabs working with the LoadStorm testing software, and it was worth the wait. They were the only company to handle the 2000 logged-in user test with zero errors. That, combined with the second fastest Blitz test (again without a single error), puts PressLabs firmly in the top tier, as you would expect from the most expensive offering tested this round.

SiteGround [Reviews]

It was nice that we finally worked out the security issues in testing SiteGround with LoadStorm. SiteGround's Shared hosting platform bounced back after last year's testing; its Blitz performance went up substantially and put it back into the honorable mention category. The VPS performance was slightly worse on the Blitz test but noticeably better on the much longer LoadStorm test. This could be a good example of when Shared hosting can outperform dedicated resources: Shared hosting generally has access to far more total resources than smaller VPS plans, so depending on how they are set up and managed, you can often get more burst performance from Shared than from a small VPS. Over the longer term, though, dedicated resources are generally more stable (and guaranteed). SiteGround's Shared hosting definitely helps keep lower priced options with excellent performance a reality for many.

WebHostingBuzz

WebHostingBuzz asked to be included in this testing and then completely disintegrated to the point where I couldn't even test them. Months later, I still haven't heard anything from them. I would like to know what happened, but until I actually get a response, this one will remain a bizarre mystery.

WPEngine [Reviews]

This is a difficult one to write about. There were definite performance improvements: they jumped up to an honorable mention, and their engineers actually worked to resolve some security issues that hindered previous testing. My biggest concern is the isolated shared environment I was on. A shared environment has a lot more resources than many dedicated environments, and I was isolated to prevent the testing from affecting any customers (a reasonable explanation). But that also means I was likely getting the absolute dream scenario in terms of resource allocation; a normal user would see these results only in the very best case. So WPEngine is certainly capable of delivering better performance than they did in the past, but I have concerns about whether a new user can reasonably expect the same results.

WPOven

WPOven was another new entrant to this testing, and they performed well in a couple of tests. They flew through the Blitz test without any issues. Their WebPageTest results were among the absolute fastest in an already fast pack. Their uptime was perfect. They did struggle with the LoadStorm tests, though, at both the 1000 and 2000 user levels. It's nice to see more competitors enter the space; WPOven put on a good first show, but there are still some serious improvements to make to catch up to the front of the field.

WPPronto

Another new entrant, WPPronto ran into a severe testing issue which caused me to redo all the tests. The server was given more resources than the plan specified while debugging some security issues. The results with the extra resources were on par with some of the top in the field, but not representative of what the actual plan could achieve. I didn't believe it was malicious (they were quite transparent about what happened), so I gave them the benefit of the doubt and redid all testing under closely monitored conditions. With the default resource allocation, WPPronto couldn't withstand LoadStorm's test; the results were easy to see in the 508 errors it started to throw on the properly resourced plan as it ran out of processes to handle new connections. As with all new entrants that don't leap to the forefront, I hope they continue to improve their service and do better next round.

 

Thank You

Thank you to all the companies for participating and helping make this testing a reality. Thanks to LoadStorm and specifically Phillip Odom for all his time and the tools to perform this testing. Thanks to Peter at Kinsta for offering his design support.

 

Updates

8/13/2015 : The wrong PDF was linked for DreamHost and its Blitz numbers were adjusted to reflect their actual performance. This change has no effect on how they were ranked since the issue was with downtime.

 

WordPress Hosting Performance Benchmarks (November 2014)

LoadStormLogo

Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

This is the second round of managed WordPress web hosting performance testing. You can see the original here. The latest (2015 Edition) can be found here.

Companies Tested

A Small Orange* [Reviews]
BlueHost [Reviews]
CloudWays* [Reviews]
DreamHost [Reviews]
FlyWheel* [Reviews]
GoDaddy* [Reviews]
Kinsta*
LightningBase*
MediaTemple* [Reviews]
Nexcess*
Pagely* [Reviews]
Pantheon* [Reviews]
PressLabs*
SiteGround*† [Reviews]
WebSynthesis* [Reviews]
WPEngine* [Reviews]

Note: Digital Ocean and Pressable were removed from testing.

*Company donated an account to test on. I checked to make sure I was on what appeared to be a normal server.

†Tests were performed with SiteGround's proprietary SuperCacher module turned on fully.

The Products (Click for Interactive Table)

 

wordpress hosting product chart screenshot

Methodology

The question I tried to answer is how well do these WordPress hosting services perform? I tested each company on two distinct measures of performance: peak performance and consistency.

All tests were performed on an identical WordPress dummy website with the same plugins except in cases where hosts added extra plugins. Each site was monitored for one month (July 2014) for consistency.

1. LoadStorm

LoadStorm was kind enough to give me unlimited resources to perform load testing on their platform and multiple staff members were involved in designing and testing these WordPress hosts. I created identical scripts for each host to load a site, login to the site and browse the site. Then I increased the user load until a web host started to fail. I stopped at 2000 concurrent users for the web hosts that were left unscathed by load testing. Logged in users were designed to break some of the caching and better simulate real user load which a lot of people (both readers and hosting companies) requested after the first round of testing.
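The ramping pattern described above can be sketched with nothing but the standard library: each step spawns more concurrent "users" and counts successful responses. This is purely illustrative, hitting a local throwaway server, and it skips the login and browse steps the real LoadStorm scripts performed:

```python
import http.server
import threading
import urllib.request

# Illustrative sketch of a ramped load test: each step spawns more
# concurrent "users" against a local throwaway server and counts
# successful responses. The real LoadStorm scripts also logged in and
# browsed as WordPress users; none of that is reproduced here.

def start_test_server():
    handler = http.server.SimpleHTTPRequestHandler
    srv = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
    threading.Thread(target=srv.serve_forever, daemon=True).start()
    return srv

def user_session(url, results, lock):
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            ok = resp.status == 200
    except OSError:
        ok = False
    with lock:
        results.append(ok)

def ramped_load(url, steps=(2, 4, 8)):
    results, lock = [], threading.Lock()
    for users in steps:  # ramp: 2, then 4, then 8 concurrent users
        threads = [threading.Thread(target=user_session,
                                    args=(url, results, lock))
                   for _ in range(users)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
    return sum(results), len(results)

srv = start_test_server()
port = srv.server_address[1]  # ephemeral port chosen by the OS
ok, total = ramped_load(f"http://127.0.0.1:{port}/")
srv.shutdown()
print(f"{ok}/{total} requests succeeded")
```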

2. Blitz.io

I used Blitz again to compare against previous results. First test was 60 seconds, scaling from 1-1000 users. The second test was 60 seconds, scaling from 1-2000.

3. Uptime (UptimeRobot and StatusCake)

Consistency matters. I wanted to see how well these companies performed over a longer period of time. I used two separate uptime monitoring services over the course of a month to test consistency.

4. WebPageTest.org

"WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster." WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. I tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Background Information

Before I go over the results I wanted to explain and discuss a few things. Every provider I tested had the latest version of WordPress installed. I had to ask a lot of companies to disable some security features to perform accurate load tests. Those companies were: GoDaddy, LightningBase, MediaTemple, SiteGround and WebSynthesis. I also asked DreamHost and WPEngine, but they refused my request.

Some companies were more cooperative than others. SiteGround spent hours with me customizing their security features to let the load testing tools bypass their security measures. PressLabs ran into an issue with LoadStorm that we were never able to resolve; we spent hours trying to get it working properly on their servers but couldn't find a solution. That's why they are missing some test data.

CloudWays is an interesting platform that lets you deploy your WordPress stack to either Digital Ocean or Amazon's EC2 servers. I was given a server on each platform with near comparable specs (EC2 Small 1.7GB vs Digital Ocean 2GB). So CloudWays is listed as CloudWays AWS and CloudWays DO to indicate which provider the stack was running on.

Pantheon was tested on their free development environment which I was told is identical to their production environment.

Results

Load Storm

I ran multiple LoadStorm tests to get a sense of where to start. The first was 1-100 users, which not a single company struggled with. The second was 50-500 users; again, nobody struggled. So the first meaningful test was 100-1000 users, and for the companies that didn't struggle there, I ran a 500-2000 user test. I ran these tests with an immense amount of help from Scott Price at LoadStorm, who spent hours teaching me how to use LoadStorm and build tests, and offering guidance and feedback on the tests themselves.

 Test 1. 100-1000 Concurrent Users over 30 minutes

 

Request Count Average RPS Peak Response Time (ms) Average Response Time (ms) Average Throughput (kB/s) Errors
A Small Orange 116127 64.52 2752 356 1318.55 41
BlueHost 107427 59.68 16727 1306 1159.55 13351
Cloudways DO 103359 55.57 16983 1807 1169.28 2255
Cloudways AWS 87447 47.01 16286 5436 821.75 18530
DreamHost 115634 62.17 15514 441 1244.31 4327
FlyWheel 116027 62.38 775 368 1287.86 0
GoDaddy 133133 71.58 1905 434 3883.42 0
Kinsta 116661 62.72 552 309 1294.77 0
LightningBase 117062 62.94 1319 256 1324.89 12
MediaTemple 116120 62.43 793 403 1304.27 0
Nexcess 116634 62.71 15085 294 1299.85 8
Pagely 119768 64.39 1548 461 1227.06 0
Pantheon 117333 63.08 528 264 1316.41 0
SiteGround 117961 63.42 939 165 180.09 0
WebSynthesis 116327 62.54 1101 332 1285.83 0
WPEngine 123901 68.83 10111 416 1302.44 2956

Discussion of Load Storm Test 1 Results

There was a pretty clear division of good and bad performance in this test. Most companies didn't struggle at all. A few collapsed: BlueHost, CloudWays AWS, CloudWays DO, and DreamHost. BlueHost started spewing 500 errors almost as soon as we started. CloudWays AWS started timing out immediately. CloudWays DO started having issues around 800 users and then started timing out. DreamHost started giving 503 Service Unavailable almost right away. It looks like our script triggered a security mechanism but they refused to work with me to test any further.

SiteGround ran into a security measure we weren't able to get around in time for publishing this article. The server seemed to just throttle the connection again.

PressLabs isn't listed because we couldn't get LoadStorm to work on their system. I am not sure what was different about their backend, but I tried to work with PressLabs and LoadStorm to get it working to no avail.

 

  • Load-Storm-A-Small-Orange-2000
  • Load-Storm-Fly-Wheel-2000
  • Load-Storm-GoDaddy-2000
  • Load-Storm-Kinsta-2000
  • Load-Storm-Lightning-Base-2000
  • Load-Storm-Nexcess-2000
  • Load-Storm-Pagely-2000
  • Load-Storm-Pantheon-2000
  • Load-Storm-SiteGround-2000
  • Load-Storm-WebSynthesis-2000

Test 2. 500 - 2000 Concurrent Users over 30 Minutes

I removed the hosts that failed and doubled the concurrent users for the second test.

Request Count Average RPS Peak Response Time (ms) Average Response Time (ms) Average Throughput (kB/s) Errors
A Small Orange 248249 133.47 5905 436 2639.68 0
FlyWheel 236474 127.14 3811 983 2499.11 16841
GoDaddy 285071 153.26 8896 371 8255.24 92
Kinsta 248765 133.74 942 316 2714.82 0
LightningBase 248679 133.7 3887 343 2763.92 23
MediaTemple 249125 133.94 1499 313 2748.32 9
Nexcess 243115 130.71 15097 388 2644.72 80
Pagely 256163 137.72 15078 446 2621.04 1
Pantheon 250063 134.44 1111 297 2754.67 0
WebSynthesis 240305 129.2 4389 743 2598.83 1173

Discussion of Load Storm Test 2 Results 

FlyWheel started to fail around 1500 users, throwing 502 errors, and remained constant at that level of failure. I'm not sure what the bottleneck was; the server itself didn't appear overloaded, so I suspect I/O somewhere bottlenecked and caused a portion of requests to fail. WebSynthesis had a few errors as well: five separate spikes, somewhat evenly spaced out. The server didn't show signs of failure; it looks like it might have been caches being refreshed, with some requests failing in the meantime. WebSynthesis' error rate was still under 0.5%, so I don't have any real issue with those errors. The slower average response time can also be attributed to those performance spikes.

Remarkably, some companies didn't even struggle. Kinsta kept sub one second response times for 30 minutes and nearly a quarter million requests. Most companies had a spike or two causing a higher peak response time, but Kinsta and Pantheon didn't (and Media Temple had a tiny one at 1.5 seconds). Simply amazing performance.

Another interesting note, GoDaddy pushed triple the amount of data through because their admin screen had a lot more resources being loaded. That's why the average throughput is so high. Despite that fact, it didn't seem to impact their performance at all, which is astounding.

Full Interactive Test Results

A Small Orange
FlyWheel
GoDaddy
Kinsta
LightningBase
MediaTemple
Nexcess
Pagely
Pantheon
SiteGround
WebSynthesis

Blitz.io

 Test 1. 1-1000 Concurrent Users over 60 seconds

Blitz Test 1. Quick Results Table

Success Errors Timeouts Avg Hits/Second Avg Response (ms)
A Small Orange 27595 14 0 460 67 ms
BlueHost 23794 1134 189 397 160 ms
CloudWays AWS 24070 162 148 401 138 ms
CloudWays DO 27132 118 127 452 49 ms
DreamHost 13073 45 7885 218 21 ms
FlyWheel 28669 20 10 478 27 ms
GoDaddy 26623 8 5 444 104 ms
Kinsta 27544 0 0 459 69 ms
LightningBase 27893 0 1 465 56 ms
MediaTemple 26691 8 9 445 102 ms
Nexcess 18890 2288 641 337 517 ms
Pagely 25358 9 0 423 156 ms
Pantheon 27676 21 0 461 64 ms
PressLabs 25903 143 0 432 89 ms
SiteGround 24939 0 0 416 152 ms
WebSynthesis 28913 0 0 482 19 ms
WPEngine 23074 121 4 385 247 ms

Discussion of Blitz Test 1 Results

I learned from the last round of testing that any hosting that isn't optimized for WordPress at all (a default install) will get destroyed by these tests, so I didn't include any of them this time. There weren't any failures as catastrophic this time.

Who performed without any major issues?

A Small Orange, FlyWheel, GoDaddy, Kinsta, LightningBase, MediaTemple, Pagely, Pantheon, SiteGround, and WebSynthesis all performed near perfectly. There's nothing more to say for these companies other than that they did excellently: all of their error/timeout rates were below 0.5%.

Who had some minor issues?

CloudWays AWS, CloudWays DO, PressLabs, and WPEngine. All four of these providers had over 100 errors/timeouts, with error/timeout rates between 0.5% and 2%. Not a huge deal, but definitely not perfect.

Who had some major issues?

BlueHost, DreamHost, and Nexcess. BlueHost started to show stress around 40 seconds in and started to buckle around 47 seconds. DreamHost had a couple of spikes in response time and errors. However, it looks like the load testing tool may have hit some type of security limit, because requests started timing out even though the server gave very fast responses and maintained roughly 250 hits/second constantly; it doesn't look like the server was failing. I couldn't get them to disable the security to really test it, so it's hard to say much more. Nexcess started to show stress around 20 seconds and buckled around 30 seconds.
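The tiering in this discussion boils down to a simple threshold on the combined error/timeout rate. Here's a quick sketch of that classification; the thresholds (0.5% and 2%) and the counts come from the discussion and table above, but the helper function itself is illustrative, not part of the original methodology:

```python
# Classify a host by its combined error + timeout rate, using the
# 0.5% / 2% cutoffs described in the discussion above.

def tier(success, errors, timeouts):
    """Return a performance tier from a load test's raw counts."""
    total = success + errors + timeouts
    rate = (errors + timeouts) / total
    if rate < 0.005:
        return "no major issues"
    elif rate <= 0.02:
        return "minor issues"
    return "major issues"

# Counts from the Blitz Test 1 table:
print(tier(28913, 0, 0))       # WebSynthesis -> no major issues
print(tier(24070, 162, 148))   # CloudWays AWS -> minor issues
print(tier(18890, 2288, 641))  # Nexcess -> major issues
```

Note that the rate is computed against total requests (successes plus failures), which is why a host with a couple dozen errors out of ~28,000 requests still lands comfortably in the top bucket.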

 Test 2. 1-2000 Concurrent Users over 60 seconds


Blitz Test 2. Quick Results Table

Company Success Errors Timeouts Avg Hits/Second Avg Response (ms)
A Small Orange 54152 26 1 903 77 ms
BlueHost 29394 14368 3408 490 234 ms
CloudWays AWS 25498 4780 8865 425 338 ms
CloudWays DO 53034 1477 49 884 58 ms
DreamHost 10237 5201 20396 171 201 ms
FlyWheel 56940 121 68 949 29 ms
GoDaddy 53262 29 64 888 105 ms
Kinsta 55011 32 0 917 69 ms
LightningBase 55648 0 0 927 58 ms
MediaTemple 53363 16 28 889 100 ms
Nexcess 25556 15509 4666 426 279 ms
Pagely 51235 41 2 854 147 ms
Pantheon 55187 91 0 920 65 ms
PressLabs 35547 4105 1569 592 326 ms
SiteGround 42645 490 220 711 276 ms
WebSynthesis 57776 1 0 963 20 ms
WPEngine 39890 304 333 665 364 ms

Discussion of Blitz Test 2 Results

Who performed without any major issues?

A Small Orange, FlyWheel, GoDaddy, Kinsta, LightningBase, MediaTemple, Pagely, Pantheon, and WebSynthesis all performed near perfectly. All of their error/timeout rates were around 0.5% or lower.

Who had some minor issues?

SiteGround and WPEngine. Both of these providers had over 100 errors/timeouts, with error/timeout rates between 0.5% and 2%. SiteGround started to show some stress around 30 seconds and didn't start to have real issues (errors) until after 50 seconds. WPEngine started to show stress around 20 seconds and performed slightly erratically until the end of the test.

Who had some major issues?

BlueHost, CloudWays AWS, CloudWays DO, DreamHost, Nexcess, and PressLabs. The hosts that had issues in the first test failed outright this time, with error/timeout rates climbing as high as 50% and beyond. DreamHost, which looked fine behind its security measures, buckled around 35 seconds into this test: it started returning errors, response times increased, and hits/second dropped. CloudWays DO definitely started to stress and show signs of buckling around 50 seconds, but its error rate was still under 3%. I don't think it would have lasted much longer had the test gone further, but it was the least severe failure. PressLabs was a surprise: it started to show stress around 25 seconds and began to buckle around 35 seconds into the test.

 Full Blitz Results (PDFs)

A Small Orange, BlueHost, CloudWays AWS, CloudWays DO, DreamHost, FlyWheel, GoDaddy, Kinsta, LightningBase, MediaTemple, Nexcess, Pagely, Pantheon, PressLabs, SiteGround, WebSynthesis, WPEngine.

Uptime Monitoring

Both uptime monitoring solutions were third party providers that offer free services. All the companies were monitored over an entire month (July 2014).

Uptime Robot

Company Uptime (%)
A Small Orange 100
BlueHost 99.71
CloudWays AWS 100
CloudWays DO 99.93
DreamHost 99.92
FlyWheel 99.97
GoDaddy 99.9
Kinsta 100
LightningBase 100
MediaTemple 99.81
Nexcess 100
Pagely 99.95
Pantheon 100
PressLabs 100
SiteGround 100
WebSynthesis 100
WPEngine 100

According to Uptime Robot, not a single company was below 99.5% uptime. In fact, with the exception of Media Temple and BlueHost, they were all at or above 99.9% uptime. For reference, 99.5% uptime is about 3.5 hours of downtime per month; 99.9% is under 45 minutes of downtime per month. Overall, nothing to really complain about according to Uptime Robot.
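To put those percentages in perspective, the downtime math works out like this. This is a quick sanity-check sketch (not from the original article), assuming a 30-day month:

```python
# Minutes of downtime implied by an uptime percentage,
# assuming a 30-day month (43,200 minutes).

MINUTES_PER_MONTH = 30 * 24 * 60  # 43200

def downtime_minutes(uptime_percent):
    """Return the minutes of downtime implied by an uptime percentage."""
    return MINUTES_PER_MONTH * (100 - uptime_percent) / 100

print(round(downtime_minutes(99.5), 1))   # 216.0 min, about 3.5 hours
print(round(downtime_minutes(99.9), 1))   # 43.2 min, under 45 minutes
print(round(downtime_minutes(99.71), 1))  # 125.3 min, BlueHost's month
```

Even the "worst" result in the table, BlueHost's 99.71%, works out to roughly two hours of downtime across an entire month.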

StatusCake

Company Availability (%) Response Time (s)
A Small Orange 100 0.23
BlueHost 99.69 2.45
CloudWays AWS 99.8 0.75
CloudWays DO 100 2.41
DreamHost 100 2.22
FlyWheel 99.9 1.99
GoDaddy 100 2.41
Kinsta 100 2.13
LightningBase 100 1.6
MediaTemple 100 1.18
Nexcess 100 2.33
Pagely 100 2.49
Pantheon 100 2.04
PressLabs 100 1.49
SiteGround 99.93 1.64
WebSynthesis 100 1.77
WPEngine 100 2.76

According to StatusCake, the results look even better. I used multiple services to monitor because there can be networking issues unrelated to a web host's performance. StatusCake only detected issues with four companies, fewer than Uptime Robot did. It's hard to say which is more accurate, but both agree that uptime didn't really seem to be an issue for any company.

StatusCake also provides an average response time metric. According to them, it's measured by a browser instance fully rendering the page, from many different geographical locations. I don't have any further insight into these tools beyond what I can read in their documentation. If they are to be believed, A Small Orange has astonishingly fast performance. WPEngine had the slowest average load time at 2.76 seconds, which isn't that bad.

 

WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only. This was tested against the default install from every company.

Company Dulles,VA Miami, FL Denver, CO Los Angeles, CA Average
A Small Orange 1.443 0.801 0.836 0.64 0.93
BlueHost 1.925 1.321 1.012 0.785 1.26075
CloudWays AWS 0.655 0.867 0.967 0.746 0.80875
CloudWays DO 0.493 0.851 1.036 0.811 0.79775
DreamHost 1.177 0.863 1.067 1.147 1.0635
FlyWheel 0.497 0.864 1.066 1.109 0.884
GoDaddy 1.607 1.355 0.934 0.855 1.18775
Kinsta 0.759 0.752 0.947 0.592 0.7625
LightningBase 0.584 0.787 0.936 0.675 0.7455
MediaTemple 1.516 0.983 0.955 0.555 1.00225
Nexcess 1.433 1.139 1.196 0.859 1.15675
Pagely 6.831 0.86 0.913 0.709 2.32825
Pantheon 0.654 0.828 0.923 0.954 0.83975
PressLabs 0.715 1.018 1.213 0.723 0.91725
SiteGround 1.392 1.239 1.01 1.212 1.21325
WebSynthesis 0.407 0.835 0.982 1.024 0.812
WPEngine 0.821 1.086 0.839 0.685 0.85775

There isn't much surprising here. The pack is really tight, with less than half a second separating the top and bottom hosts on average, if we exclude Pagely. I'm not sure what happened with their Dulles, VA test, but it seems like something was terribly wrong with the network when I tested it. Their average response times from every other location were incredibly fast (under 1 second). I'm going to chalk it up to a bad node somewhere causing that particular test to perform so poorly; it's almost certainly not a reflection of their hosting.
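To illustrate how much that single Dulles run skews things, here's a quick recomputation of Pagely's average using the numbers from the table above (an illustrative sketch, not part of the original testing):

```python
# Pagely's WebPageTest load times (seconds) by location, from the table.
pagely = {"Dulles": 6.831, "Miami": 0.86, "Denver": 0.913, "LA": 0.709}

# Average over all four locations, as reported in the table.
avg_all = sum(pagely.values()) / len(pagely)

# Average excluding the anomalous Dulles, VA run.
others = [v for k, v in pagely.items() if k != "Dulles"]
avg_no_dulles = sum(others) / len(others)

print(round(avg_all, 3))        # 2.328 -- matches the table's average
print(round(avg_no_dulles, 3))  # 0.827 -- in line with the fastest hosts
```

Dropping the one bad location moves Pagely from the slowest average in the table to one of the fastest, which is why I don't treat that number as meaningful.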

What is interesting, compared to last time, is that these companies are getting faster. There was only one company with a sub-1-second average last time; now there are 10 (11 if you count Pagely). Three of them were above one second last time, so they are showing signs of improvement (Pagely, WebSynthesis, WPEngine). It also means there is a lot of new competition keeping pace with the entrenched players in terms of performance.

Conclusion

Every service seems to have its issues somewhere if you look hard enough. I try to avoid injecting my personal opinion and bias as much as possible, so I won't be ranking or outright saying any single company is the best. Some providers did exceptionally well and tended to clump together performance-wise; I will call those the top tier providers. This top tier designation relates to performance only and is claimed only from the results of these tests. What each of these companies offers is different and may suit different audiences depending on factors beyond performance, such as features, price, support, and scale (I tested mostly entry level plans). But I will provide a short summary and discussion of the results for each provider.

Top Tier WordPress Hosting Performance

A Small Orange, GoDaddy, Kinsta, LightningBase, MediaTemple, Pagely, Pantheon, WebSynthesis

Each of these companies stayed below the 0.5% error rate on all load testing, all the way up to 2000 concurrent users, on both LoadStorm and Blitz.

Honorable Mention

FlyWheel gets an honorable mention. They performed really well on many of the tests. FlyWheel fell apart on the final LoadStorm test to 2000 logged in users. I'll explain more in their individual section as to why this is deserving of an honorable mention.

Amazon Web Services (AWS) vs Digital Ocean

One of the most interesting comparisons to me was CloudWays. They provide you with the ability to choose which VPS provider and type you want. It then sets up their WordPress configuration (in an identical manner from my understanding) on the VPS. I was granted access to one Amazon and one Digital Ocean VPS from them. The Amazon was a small (1.7GB ram) and the Digital Ocean was a 2GB ram instance.

aws_vs_digital_ocean_loadstorm

The head-to-head LoadStorm results (1000 user test) above pretty clearly show Digital Ocean performing better in every category (with the exception of Peak Response Time, which was a timeout). Digital Ocean sent more data, had fewer errors, and did it faster.

aws_vs_digital_ocean_blitz

The Blitz.io results show pretty clearly that Digital Ocean outperformed AWS by a wide margin as well. It delivered twice as many hits with fewer errors and timeouts.

Based on these tests, it's pretty easy to conclude that on low-end VPSs, Digital Ocean's hardware outperforms Amazon's.

Individual Host Analysis

A Small Orange

They've improved their LEMP stack since the last time I tested. They never buckled in any test and were definitely one of the best. Their staff was incredibly friendly (a special thank you to Ryan MacDonald), and they've stepped up their performance game. The one thing that isn't quite there yet is the documentation and user experience; there are a lot of improvements they could make to render their LEMP stack more accessible to the less tech savvy. All in all, the experience was in line with what I would expect from a company that has one of the highest support ratings on our site.

BlueHost

Their WordPress offering is brand new, and it struggled in every load test. Their price is on the middle-high end, but the performance was not. Ultimately, they fell short of where I would expect given their pricing and the competition.

CloudWays

CloudWays was certainly an interesting company to test given that they had two entries: one running on Amazon Web Services (EC2) and another on Digital Ocean. The Digital Ocean VPS outperformed AWS in every category, which was interesting. The AWS instance was near the bottom of the pack performance-wise, while the Digital Ocean one was in the middle. The platform they have built, which allows deployment and management across providers, is very interesting; however, their performance isn't quite there yet. Other companies are running on the same hardware and getting better results. CloudWays doesn't do just WordPress, so it's easy to understand why their performance might not be quite as good as competitors who focus solely on WordPress.

DreamHost

DreamPress was another disappointment. The security features hid some of the performance weakness on the first Blitz test, but it completely failed on the second. DreamPress is described as having automatic RAM scaling, with each site run on two VPS instances, but it's very unclear what resources you are really getting for your money. They are charging $50/month for a 1GB RAM VPS, so I get the feeling a lot of resources are shared and it may not be a true VPS.

FlyWheel

FlyWheel was excellent on every test except the final 2000 logged-in user test from LoadStorm. They are built on top of Digital Ocean, and I was using the smallest VPS; yet their performance beat VPSs on Digital Ocean with four times the resources (CloudWays DO). For cached content on the Blitz test, they had the second-highest hits/second and the second-lowest response time. I suspect the testing hit a hardware maximum: FlyWheel had the best performance with the lowest dedicated resources (512MB RAM). The companies that outperformed them had more resources dedicated to them, or shared resources that presumably allow access to far more than 512MB of RAM. It was an impressive performance given what they are selling, and combined with them having the best reviews of any company Review Signal has ever tracked, FlyWheel certainly merits serious consideration.

GoDaddy

GoDaddy continues to surprise me. They flew through all the tests, including a weird issue where they transferred 3X the data during the LoadStorm test yet didn't show any signs of stress. The only comparison I have to last time is the Blitz testing, where they eked out another 3000+ hits and raised their average hits/second from 829 to 888. GoDaddy also raised their max hit rate marginally, from 1750 to 1763. What's more impressive is that they reduced their errors+timeouts from 686 to 93: more hits with fewer errors. From a performance perspective, they did excellently, both in absolute terms and relative to their last benchmarks.

Kinsta

A newcomer that jumped straight to the top of the performance tiers. Kinsta's performance was amazing in the LoadStorm 2000 logged-in user test: they had the lowest peak response time and zero errors over a 30-minute test. They didn't struggle with any test whatsoever and showed zero downtime. Kinsta's performance was top tier.

LightningBase

Another newcomer that jumped straight to the top, and one of the cheapest too, starting at under $10/month. LightningBase aced the Blitz testing and did excellently on LoadStorm, with no downtime monitored. LightningBase belongs in the top tier and is delivering amazing value.

Media Temple

Media Temple is interesting because I was told it runs the same technology as GoDaddy (GoDaddy bought Media Temple a year ago). They have a few more premium features, like Git and a staging environment. Media Temple's performance was superb; it actually beat GoDaddy's by a marginal amount in just about every measure on both the LoadStorm and Blitz load testing. If GoDaddy has top tier performance, Media Temple certainly does as well.

Nexcess

Nexcess's performance was excellent in the LoadStorm testing. However, it completely collapsed during the Blitz load testing. I'm really not sure what to make of those results; perhaps the underlying shared hardware is very good but the static caching setup isn't quite up to snuff. It's probably not worth speculating. Suffice to say, Nexcess ended up looking like a middle-of-the-pack web host instead of a top tier one because of the Blitz test.

Pagely

Pagely put on another spectacular performance. They handled the LoadStorm test with one error. Blitz results stayed similar to the last run: they handled more hits but had a few more errors+timeouts (1 last time, 43 this time). There's really not much to add here other than that they continue to be in the top tier.

Pantheon

Pantheon specializes in Drupal hosting, so I was wondering how well its platform would translate to WordPress. The short answer: it converted over really well. They had a flawless run on the LoadStorm test, with 0 errors and not even any spikes in response time over 30 minutes. They are one of the most expensive options on this list (second only to PressLabs), but they definitely make a case for it. Perfect uptime and near flawless load testing sent them easily into the top tier.

PressLabs

It's hard to write much about PressLabs because we couldn't get LoadStorm to work properly against their hosting. Their Blitz results, however, were lackluster. For the most expensive plan we tested, it was a bit of a disappointment to see it not do stunningly well.

SiteGround

SiteGround sadly didn't do as well as they did last time: their Blitz load testing score went down slightly, and we couldn't bypass their security measures to properly test with LoadStorm. They obviously have good protection in place to stop malicious users from accessing too many things, but it also meant I couldn't get a deeper look this time around, which was a change from the last round of testing. It's slightly disappointing to see the performance dip, but I hope it was due to the extra security measures that made testing them difficult.

WebSynthesis

WebSynthesis teetered on the edge of having too many errors (0.5%) in the LoadStorm test, but they stayed under the threshold and handled the test quite well. They also had no weird security issues this time around, and WebSynthesis led the pack on the Blitz testing: they went from 871 hits/second to 963 hits/second, leading every provider with a whopping one error to boot. Sprinkle in some perfect uptime numbers and it's clear WebSynthesis is still a top tier provider and is continuing to get better.

WPEngine

I feel like I could copy and paste my last conclusion about WPEngine: "WPEngine had some issues. Uptime was not one of them, they were perfect or upwards of 99.9% in that department. However, their performance shortcomings became apparent during the load tests." They didn't even make it to the final round of LoadStorm testing, and they were middle of the pack on the Blitz testing. Compared to the last round of Blitz testing, the results were nearly identical, with slightly fewer errors+timeouts. I'm not sure whether I should be disappointed not to see improvement or relieved to see them maintain the exact same performance and consistency. Their vaunted rankings in Review Signal's reviews have slipped relative to a few of the other providers here (FlyWheel and WebSynthesis). While they were once leading the pack in technology, the rest of the pack is starting to catch up.

 

Thank Yous

A special thanks goes out to the sponsor of this post, and to an individual employee, Scott Price of LoadStorm, who worked countless hours with me to perform these tests.

I want to thank all the companies that participated in these tests. I tested the support staff a fair bit at some of them and I thank them for their time and patience.

A special thanks also goes to Chris Piepho from LightningBase, who provided a lot of feedback on the original article and helped improve the methodology for this round.

A huge thanks goes out to Mark Gavalda at Kinsta for his feedback and performance testing discussions. He's tested some further-out stuff than I have, like HHVM and php-ng performance. Also to their designer, Peter Sziraki, who designed the header image for this article.