WordPress Hosting Performance Benchmarks (November 2014)


Sponsored by LoadStorm, the easy and cost-effective load testing tool for web and mobile applications.

This is the second round of managed WordPress web hosting performance testing. You can see the original here. The latest (2015 Edition) can be found here.

Companies Tested

A Small Orange* [Reviews]
BlueHost [Reviews]
CloudWays* [Reviews]
DreamHost [Reviews]
FlyWheel* [Reviews]
GoDaddy* [Reviews]
Kinsta*
LightningBase*
MediaTemple* [Reviews]
Nexcess*
Pagely* [Reviews]
Pantheon* [Reviews]
PressLabs*
SiteGround*† [Reviews]
WebSynthesis* [Reviews]
WPEngine* [Reviews]

Note: Digital Ocean and Pressable were removed from testing.

*Company donated an account to test on. I checked to make sure I was on what appeared to be a normal server.

†Tests were performed with SiteGround's proprietary SuperCacher module turned on fully.

The Products (Click for Interactive Table)

 

(Screenshot: WordPress hosting product comparison chart)

Methodology

The question I tried to answer is simple: how well do these WordPress hosting services perform? I tested each company on two distinct measures: peak performance and consistency.

All tests were performed on an identical WordPress dummy website with the same plugins except in cases where hosts added extra plugins. Each site was monitored for one month (July 2014) for consistency.

1. LoadStorm

LoadStorm was kind enough to give me unlimited resources to perform load testing on their platform, and multiple staff members were involved in designing and testing these WordPress hosts. I created identical scripts for each host that load the site, log in to the site, and browse the site. Then I increased the user load until a web host started to fail, stopping at 2000 concurrent users for the web hosts that were left unscathed. Logged-in users were designed to break some of the caching and better simulate real user load, something a lot of people (both readers and hosting companies) requested after the first round of testing.
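
To make that scenario concrete, here is a minimal sketch of a comparable "load, log in, browse" test written for the open-source Locust load testing tool rather than LoadStorm (the tool actually used); the site URL, credentials, and page paths are placeholders, not the real test site.

```python
# Minimal sketch of a "load, log in, browse" scenario, similar in spirit to the
# LoadStorm scripts described above. Uses Locust, NOT the tool from the article;
# host, credentials, and paths are placeholders.
from locust import HttpUser, task, between

class LoggedInVisitor(HttpUser):
    host = "http://example-benchmark-site.com"  # hypothetical test site
    wait_time = between(2, 5)                   # think time between actions

    def on_start(self):
        # The standard WordPress login form posts to /wp-login.php
        # with 'log' and 'pwd' fields.
        self.client.post("/wp-login.php", data={
            "log": "testuser",          # placeholder account
            "pwd": "testpassword",
            "wp-submit": "Log In",
        })

    @task(3)
    def browse(self):
        # Browsing while logged in bypasses most full-page caches,
        # which is exactly what this scenario is meant to exercise.
        self.client.get("/")
        self.client.get("/sample-page/")

    @task(1)
    def read_post(self):
        self.client.get("/?p=1")
```

Ramping the user count (for example `locust -f loadtest.py --headless -u 1000 -r 10` with a recent Locust release) roughly approximates the 100-1000 ramps used in the tests below.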

2. Blitz.io

I used Blitz again to compare against previous results. The first test ran for 60 seconds, scaling from 1 to 1,000 concurrent users; the second ran for 60 seconds, scaling from 1 to 2,000.
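
As a rough illustration only, a Blitz-style "rush" is essentially a linear ramp of concurrent clients hammering a single URL. The sketch below (plain asyncio/aiohttp, not Blitz itself, with a placeholder URL) shows the shape of such a test; unlike the LoadStorm scenario, it hits one cacheable page over and over, so it mostly measures cached performance.

```python
# Rough, simplified illustration of a Blitz-style rush: ramp concurrency
# linearly from 1 to TARGET workers over DURATION seconds against one URL.
# This is NOT Blitz; the URL and numbers are placeholders.
import asyncio
import time
import aiohttp

URL = "http://example-benchmark-site.com/"  # hypothetical test site
TARGET = 1000     # peak concurrent clients
DURATION = 60     # seconds

async def worker(session, stop_at, stats):
    while time.monotonic() < stop_at:
        try:
            async with session.get(URL) as resp:
                await resp.read()
                stats["hits" if resp.status == 200 else "errors"] += 1
        except Exception:
            stats["errors"] += 1

async def rush():
    stats = {"hits": 0, "errors": 0}
    start = time.monotonic()
    stop_at = start + DURATION
    tasks = []
    async with aiohttp.ClientSession() as session:
        for i in range(TARGET):
            # Stagger worker start times so concurrency ramps up linearly.
            await asyncio.sleep(max(0.0, start + (i / TARGET) * DURATION - time.monotonic()))
            tasks.append(asyncio.create_task(worker(session, stop_at, stats)))
        await asyncio.gather(*tasks)
    print(stats)

asyncio.run(rush())
```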

3. Uptime (UptimeRobot and StatusCake)

Consistency matters. I wanted to see how well these companies performed over a longer period of time. I used two separate uptime monitoring services over the course of a month to test consistency.
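
The basic mechanism behind these figures is just a periodic HTTP check. A toy version (not how UptimeRobot or StatusCake actually work internally; the URL and interval are placeholders) looks like this:

```python
# Toy uptime monitor: poll a URL on a fixed interval and report availability.
# Real services check from many locations and handle retries; this is a sketch.
import time
import requests

URL = "http://example-benchmark-site.com/"  # hypothetical monitored site
INTERVAL = 300          # seconds between checks (5 minutes)
CHECKS = 12 * 24 * 30   # roughly one month of 5-minute checks

up = down = 0
for _ in range(CHECKS):
    try:
        r = requests.get(URL, timeout=30)
        if r.status_code == 200:
            up += 1
        else:
            down += 1
    except requests.RequestException:
        down += 1
    time.sleep(INTERVAL)

print(f"Uptime: {100 * up / (up + down):.2f}%")
```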

4. WebPageTest.org

"WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster." WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. I tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Background Information

Before I go over the results I wanted to explain and discuss a few things. Every provider I tested had the latest version of WordPress installed. I had to ask a lot of companies to disable some security features to perform accurate load tests. Those companies were: GoDaddy, LightningBase, MediaTemple, SiteGround and WebSynthesis. I also asked DreamHost and WPEngine, but they refused my request.

Some companies were more cooperative than others. SiteGround spent hours with me customizing their security features to let the load testing tools bypass their security measures. With PressLabs, we ran into an issue we were never able to resolve to get LoadStorm working properly on their servers; we spent hours trying to fix it but couldn't find a solution, which is why they are missing some test data.

CloudWays is an interesting platform that lets you deploy your WordPress stack to either Digital Ocean or Amazon's EC2 servers. I was given a server on each platform with nearly comparable specs (EC2 Small with 1.7GB RAM vs Digital Ocean with 2GB RAM), so CloudWays is listed as CloudWays AWS and CloudWays DO to indicate which provider the stack was running on.

Pantheon was tested on their free development environment which I was told is identical to their production environment.

Results

LoadStorm

I ran multiple LoadStorm tests to get a sense of where to start. The first was 1-100 users, which not a single company struggled with. The second was 50-500 users, which again nobody struggled with. So the first meaningful test was 100-1000 users; for the companies that didn't struggle there, I ran a 500-2000 user test. I ran these tests with an immense amount of help from Scott Price at LoadStorm, who spent hours with me teaching me how to use LoadStorm and build tests, and offering guidance and feedback on the tests themselves.

 Test 1. 100-1000 Concurrent Users over 30 minutes

 

| Company | Request Count | Average RPS | Peak Response Time (ms) | Average Response Time (ms) | Average Throughput (kB/s) | Errors |
|---|---|---|---|---|---|---|
| A Small Orange | 116127 | 64.52 | 2752 | 356 | 1318.55 | 41 |
| BlueHost | 107427 | 59.68 | 16727 | 1306 | 1159.55 | 13351 |
| CloudWays DO | 103359 | 55.57 | 16983 | 1807 | 1169.28 | 2255 |
| CloudWays AWS | 87447 | 47.01 | 16286 | 5436 | 821.75 | 18530 |
| DreamHost | 115634 | 62.17 | 15514 | 441 | 1244.31 | 4327 |
| FlyWheel | 116027 | 62.38 | 775 | 368 | 1287.86 | 0 |
| GoDaddy | 133133 | 71.58 | 1905 | 434 | 3883.42 | 0 |
| Kinsta | 116661 | 62.72 | 552 | 309 | 1294.77 | 0 |
| LightningBase | 117062 | 62.94 | 1319 | 256 | 1324.89 | 12 |
| MediaTemple | 116120 | 62.43 | 793 | 403 | 1304.27 | 0 |
| Nexcess | 116634 | 62.71 | 15085 | 294 | 1299.85 | 8 |
| Pagely | 119768 | 64.39 | 1548 | 461 | 1227.06 | 0 |
| Pantheon | 117333 | 63.08 | 528 | 264 | 1316.41 | 0 |
| SiteGround | 117961 | 63.42 | 939 | 165 | 180.09 | 0 |
| WebSynthesis | 116327 | 62.54 | 1101 | 332 | 1285.83 | 0 |
| WPEngine | 123901 | 68.83 | 10111 | 416 | 1302.44 | 2956 |

Discussion of LoadStorm Test 1 Results

There was a pretty clear division of good and bad performance in this test. Most companies didn't struggle at all. A few collapsed: BlueHost, CloudWays AWS, CloudWays DO, and DreamHost. BlueHost started spewing 500 errors almost as soon as we started. CloudWays AWS started timing out immediately. CloudWays DO started having issues around 800 users and then started timing out. DreamHost started giving 503 Service Unavailable almost right away. It looks like our script triggered a security mechanism but they refused to work with me to test any further.

SiteGround ran into a security measure we weren't able to get around in time for publishing this article; the server seemed to simply throttle the connections (note the much lower average throughput in the table above).

PressLabs isn't listed because we couldn't get LoadStorm to work on their system. I am not sure what was different about their backend, but I tried to work with PressLabs and LoadStorm to get it working to no avail.

 

(Image gallery: LoadStorm 2000-user test result charts for A Small Orange, FlyWheel, GoDaddy, Kinsta, LightningBase, Nexcess, Pagely, Pantheon, SiteGround, and WebSynthesis)

Test 2. 500 - 2000 Concurrent Users over 30 Minutes

I removed the hosts that failed and doubled the concurrent users for the second test.

| Company | Request Count | Average RPS | Peak Response Time (ms) | Average Response Time (ms) | Average Throughput (kB/s) | Errors |
|---|---|---|---|---|---|---|
| A Small Orange | 248249 | 133.47 | 5905 | 436 | 2639.68 | 0 |
| FlyWheel | 236474 | 127.14 | 3811 | 983 | 2499.11 | 16841 |
| GoDaddy | 285071 | 153.26 | 8896 | 371 | 8255.24 | 92 |
| Kinsta | 248765 | 133.74 | 942 | 316 | 2714.82 | 0 |
| LightningBase | 248679 | 133.7 | 3887 | 343 | 2763.92 | 23 |
| MediaTemple | 249125 | 133.94 | 1499 | 313 | 2748.32 | 9 |
| Nexcess | 243115 | 130.71 | 15097 | 388 | 2644.72 | 80 |
| Pagely | 256163 | 137.72 | 15078 | 446 | 2621.04 | 1 |
| Pantheon | 250063 | 134.44 | 1111 | 297 | 2754.67 | 0 |
| WebSynthesis | 240305 | 129.2 | 4389 | 743 | 2598.83 | 1173 |

Discussion of LoadStorm Test 2 Results

FlyWheel started to fail around 1500 users, causing 502 errors, and remained constant at that level of failure. I'm not sure what the bottleneck was; it didn't overload the server, but I suspect the I/O of some component bottlenecked, causing a certain number of requests to fail. WebSynthesis had a few errors as well: five separate spikes, somewhat evenly spaced out. The server didn't show signs of failure; it looks like it might have been an issue with caches being refreshed and some requests failing in the meantime. WebSynthesis' error rate was still under 0.5%, so I don't have any real issue with those errors. The slower average response time can also be attributed to those spikes.
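
For reference, the error rates quoted here and in the conclusion are just errors divided by total requests. Checking WebSynthesis against the table above:

```python
# WebSynthesis in LoadStorm Test 2 (numbers from the table above):
# 1173 errors out of 240305 requests.
errors = 1173
total_requests = 240305
print(f"{errors / total_requests * 100:.2f}%")  # ~0.49%, just under the 0.5% threshold
```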

Remarkably, some companies didn't even struggle. Kinsta kept sub-one-second response times for 30 minutes and nearly a quarter million requests. Most companies had a spike or two causing a higher peak response time, but Kinsta and Pantheon didn't (and Media Temple had a tiny one at 1.5 seconds). Simply amazing performance.

Another interesting note, GoDaddy pushed triple the amount of data through because their admin screen had a lot more resources being loaded. That's why the average throughput is so high. Despite that fact, it didn't seem to impact their performance at all, which is astounding.

Full Interactive Test Results

A Small Orange
FlyWheel
GoDaddy
Kinsta
LightningBase
MediaTemple
Nexcess
Pagely
Pantheon
SiteGround
WebSynthesis

Blitz.io

 Test 1. 1-1000 Concurrent Users over 60 seconds

Blitz Test 1. Quick Results Table

| Company | Success | Errors | Timeouts | Avg Hits/Second | Avg Response (ms) |
|---|---|---|---|---|---|
| A Small Orange | 27595 | 14 | 0 | 460 | 67 |
| BlueHost | 23794 | 1134 | 189 | 397 | 160 |
| CloudWays AWS | 24070 | 162 | 148 | 401 | 138 |
| CloudWays DO | 27132 | 118 | 127 | 452 | 49 |
| DreamHost | 13073 | 45 | 7885 | 218 | 21 |
| FlyWheel | 28669 | 20 | 10 | 478 | 27 |
| GoDaddy | 26623 | 8 | 5 | 444 | 104 |
| Kinsta | 27544 | 0 | 0 | 459 | 69 |
| LightningBase | 27893 | 0 | 1 | 465 | 56 |
| MediaTemple | 26691 | 8 | 9 | 445 | 102 |
| Nexcess | 18890 | 2288 | 641 | 337 | 517 |
| Pagely | 25358 | 9 | 0 | 423 | 156 |
| Pantheon | 27676 | 21 | 0 | 461 | 64 |
| PressLabs | 25903 | 143 | 0 | 432 | 89 |
| SiteGround | 24939 | 0 | 0 | 416 | 152 |
| WebSynthesis | 28913 | 0 | 0 | 482 | 19 |
| WPEngine | 23074 | 121 | 4 | 385 | 247 |

Discussion of Blitz Test 1 Results

I learned from the last round of testing that any host that isn't optimized for WordPress at all (a default install) will get destroyed by these tests, so I didn't include any of them this time. There weren't any failures as catastrophic this time around.

Who performed without any major issues?

A Small Orange, FlyWheel, GoDaddy, Kinsta, LightningBase, MediaTemple, Pagely, Pantheon, SiteGround, and WebSynthesis all performed nearly perfectly. There's nothing more to say about these companies other than that they were excellent; all of their error/timeout rates were below 0.5%.

Who had some minor issues?

CloudWays AWS, CloudWays DO, PressLabs and WPEngine. All four of these providers had over 100 errors/timeouts, with error/timeout rates between 0.5% and 2%. Not a huge deal, but definitely not perfect.

Who had some major issues?

BlueHost, DreamHost, and Nexcess. BlueHost started to show stress around 40 seconds in and started to buckle around 47 seconds. DreamHost had a couple of spikes in response time and errors; however, it looks like the load testing tool may have hit some type of security limit, because requests started timing out while the responses that did come back were very fast and it maintained roughly 250 hits/second constantly. It doesn't look like the server itself was failing, but I couldn't get them to disable the security to really test it, so it's hard to say much more. Nexcess started to show stress around 20 seconds and buckle around 30 seconds.

 Test 2. 1-2000 Concurrent Users over 60 seconds

(Image gallery: Blitz 2000-user test result charts for each host)

Blitz Test 2. Quick Results Table

| Company | Success | Errors | Timeouts | Avg Hits/Second | Avg Response (ms) |
|---|---|---|---|---|---|
| A Small Orange | 54152 | 26 | 1 | 903 | 77 |
| BlueHost | 29394 | 14368 | 3408 | 490 | 234 |
| CloudWays AWS | 25498 | 4780 | 8865 | 425 | 338 |
| CloudWays DO | 53034 | 1477 | 49 | 884 | 58 |
| DreamHost | 10237 | 5201 | 20396 | 171 | 201 |
| FlyWheel | 56940 | 121 | 68 | 949 | 29 |
| GoDaddy | 53262 | 29 | 64 | 888 | 105 |
| Kinsta | 55011 | 32 | 0 | 917 | 69 |
| LightningBase | 55648 | 0 | 0 | 927 | 58 |
| MediaTemple | 53363 | 16 | 28 | 889 | 100 |
| Nexcess | 25556 | 15509 | 4666 | 426 | 279 |
| Pagely | 51235 | 41 | 2 | 854 | 147 |
| Pantheon | 55187 | 91 | 0 | 920 | 65 |
| PressLabs | 35547 | 4105 | 1569 | 592 | 326 |
| SiteGround | 42645 | 490 | 220 | 711 | 276 |
| WebSynthesis | 57776 | 1 | 0 | 963 | 20 |
| WPEngine | 39890 | 304 | 333 | 665 | 364 |

Discussion of Blitz Test 2 Results

Who performed without any major issues?

A Small Orange, FlyWheel, GoDaddy, Kinsta, LightningBase, MediaTemple, Pagely, Pantheon, and WebSynthesis all performed nearly perfectly. All of their error/timeout rates were around 0.5% or lower.

Who had some minor issues?

SiteGround and WPEngine. Both of these providers had over 100 errors/timeouts, with error/timeout rates between 0.5% and 2%. SiteGround started to show some stress around 30 seconds and didn't start to have real issues (errors) until after 50 seconds. WPEngine started to show stress around 20 seconds and performed slightly erratically until the end of the test.

Who had some major issues?

BlueHost, CloudWays AWS, CloudWays DO, DreamHost, Nexcess, and PressLabs. The four worst (BlueHost, CloudWays AWS, DreamHost, and Nexcess) completely failed, with error/timeout rates ranging from roughly 35% to over 70%. DreamHost, which looked fine behind its security measures in the first test, buckled around 35 seconds into this one: it started returning errors, response times increased, and hits/second dropped. CloudWays DO definitely started to stress and show signs of buckling around 50 seconds, but its error rate was still under 3%. I don't think it would have lasted much longer had the test gone further, but it was the least severe failure. PressLabs was a surprise: it started to show stress around 25 seconds and began to buckle around 35 seconds into the test.

 Full Blitz Results (PDFs)

A Small Orange, BlueHost, CloudWays AWS, CloudWays DO, DreamHost, FlyWheel, GoDaddy, Kinsta, LightningBase, MediaTemple, Nexcess, Pagely, Pantheon, PressLabs, SiteGround, WebSynthesis, WPEngine.

Uptime Monitoring

Both uptime monitoring solutions were third party providers that offer free services. All the companies were monitored over an entire month (July 2014).

Uptime Robot

| Company | Uptime (%) |
|---|---|
| A Small Orange | 100 |
| BlueHost | 99.71 |
| CloudWays AWS | 100 |
| CloudWays DO | 99.93 |
| DreamHost | 99.92 |
| FlyWheel | 99.97 |
| GoDaddy | 99.9 |
| Kinsta | 100 |
| LightningBase | 100 |
| MediaTemple | 99.81 |
| Nexcess | 100 |
| Pagely | 99.95 |
| Pantheon | 100 |
| PressLabs | 100 |
| SiteGround | 100 |
| WebSynthesis | 100 |
| WPEngine | 100 |

According to UptimeRobot, not a single company was below 99.5% uptime. In fact, with the exception of Media Temple and BlueHost, they were all at or above 99.9% uptime. For reference, 99.5% uptime is about 3.6 hours of downtime per month, and 99.9% is under 45 minutes of downtime per month. Overall, nothing to really complain about according to UptimeRobot.
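
As a quick sanity check, converting an uptime percentage into downtime for a 30-day month is simple arithmetic:

```python
# Downtime allowed in a 30-day month for a given uptime percentage.
def downtime_minutes(uptime_pct, days=30):
    return (100 - uptime_pct) / 100 * days * 24 * 60

print(downtime_minutes(99.5))  # 216.0 minutes, about 3.6 hours
print(downtime_minutes(99.9))  # 43.2 minutes, under 45 minutes
```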

StatusCake

| Company | Availability (%) | Avg Response Time (s) |
|---|---|---|
| A Small Orange | 100 | 0.23 |
| BlueHost | 99.69 | 2.45 |
| CloudWays AWS | 99.8 | 0.75 |
| CloudWays DO | 100 | 2.41 |
| DreamHost | 100 | 2.22 |
| FlyWheel | 99.9 | 1.99 |
| GoDaddy | 100 | 2.41 |
| Kinsta | 100 | 2.13 |
| LightningBase | 100 | 1.6 |
| MediaTemple | 100 | 1.18 |
| Nexcess | 100 | 2.33 |
| Pagely | 100 | 2.49 |
| Pantheon | 100 | 2.04 |
| PressLabs | 100 | 1.49 |
| SiteGround | 99.93 | 1.64 |
| WebSynthesis | 100 | 1.77 |
| WPEngine | 100 | 2.76 |

According to StatusCake, the results look even better. I used multiple services to monitor because there can be networking issues unrelated to a web host's performance. StatusCake only detected issues with four companies, which is fewer than UptimeRobot detected. It's hard to say which is better or right. But they both say that uptime didn't really seem to be an issue for any company.

StatusCake also provides an average response time metric. According to them, it uses a browser instance and fully renders the page, and they test from many different geographic locations. I don't have any further insight into these tools beyond what I can read in their documentation. If they are to be believed, A Small Orange has astonishingly fast performance. WPEngine had the slowest average load time at 2.76 seconds, which isn't that bad.

 

WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only. This was tested against the default install from every company.
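
For anyone who wants to run the same kind of check, WebPageTest also exposes an HTTP API. The sketch below assumes the public runtest.php endpoint, a hypothetical API key, and a placeholder site URL; the location identifiers and response fields should be verified against the current WebPageTest documentation.

```python
# Sketch: kick off WebPageTest runs matching the settings above
# (Chrome, 9 runs, native connection, first view only).
# API key, site URL, and location names are placeholders/assumptions.
import requests

API_KEY = "YOUR_WPT_API_KEY"
SITE = "http://example-benchmark-site.com/"
LOCATIONS = ["Dulles:Chrome", "Miami:Chrome", "Denver:Chrome", "LosAngeles:Chrome"]

for loc in LOCATIONS:
    resp = requests.get("https://www.webpagetest.org/runtest.php", params={
        "url": SITE,
        "k": API_KEY,     # API key
        "location": loc,  # test agent location
        "runs": 9,        # 9 runs, as in the article
        "fvonly": 1,      # first view only
        "f": "json",      # JSON response containing links to the results
    })
    print(loc, resp.json())
```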

| Company | Dulles, VA (s) | Miami, FL (s) | Denver, CO (s) | Los Angeles, CA (s) | Average (s) |
|---|---|---|---|---|---|
| A Small Orange | 1.443 | 0.801 | 0.836 | 0.64 | 0.93 |
| BlueHost | 1.925 | 1.321 | 1.012 | 0.785 | 1.26075 |
| CloudWays AWS | 0.655 | 0.867 | 0.967 | 0.746 | 0.80875 |
| CloudWays DO | 0.493 | 0.851 | 1.036 | 0.811 | 0.79775 |
| DreamHost | 1.177 | 0.863 | 1.067 | 1.147 | 1.0635 |
| FlyWheel | 0.497 | 0.864 | 1.066 | 1.109 | 0.884 |
| GoDaddy | 1.607 | 1.355 | 0.934 | 0.855 | 1.18775 |
| Kinsta | 0.759 | 0.752 | 0.947 | 0.592 | 0.7625 |
| LightningBase | 0.584 | 0.787 | 0.936 | 0.675 | 0.7455 |
| MediaTemple | 1.516 | 0.983 | 0.955 | 0.555 | 1.00225 |
| Nexcess | 1.433 | 1.139 | 1.196 | 0.859 | 1.15675 |
| Pagely | 6.831 | 0.86 | 0.913 | 0.709 | 2.32825 |
| Pantheon | 0.654 | 0.828 | 0.923 | 0.954 | 0.83975 |
| PressLabs | 0.715 | 1.018 | 1.213 | 0.723 | 0.91725 |
| SiteGround | 1.392 | 1.239 | 1.01 | 1.212 | 1.21325 |
| WebSynthesis | 0.407 | 0.835 | 0.982 | 1.024 | 0.812 |
| WPEngine | 0.821 | 1.086 | 0.839 | 0.685 | 0.85775 |

There isn't much surprising here. The pack is really tight, with roughly half a second separating the average of the top and bottom hosts if we exclude Pagely. I'm not sure what happened with Pagely's Dulles, VA test, but it seems like there was something terribly wrong with the network when I tested it. The response times from every other location were incredibly fast (under 1 second), so I'm going to chalk it up to a bad node somewhere causing that particular test to perform so poorly; it's almost certainly not a reflection of their hosting.
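
To quantify how much that single Dulles run skews Pagely's number, compare the plain average with one that drops the outlier (values from the table above):

```python
# Pagely's WebPageTest load times (seconds) from the table above.
pagely = {"Dulles": 6.831, "Miami": 0.860, "Denver": 0.913, "Los Angeles": 0.709}

mean_all = sum(pagely.values()) / len(pagely)
trimmed = [v for k, v in pagely.items() if k != "Dulles"]
mean_trimmed = sum(trimmed) / len(trimmed)

print(f"All four locations: {mean_all:.3f}s")      # ~2.328s
print(f"Excluding Dulles:   {mean_trimmed:.3f}s")  # ~0.827s
```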

What is interesting, compared to last time, is that these companies are getting faster. There was only one company with a sub-one-second average last time; now there are 10 (11 if you count Pagely). Three of them were above one second last time, so they are showing signs of improvement (Pagely, WebSynthesis, WPEngine). It also means there is a lot of new competition that isn't behind the entrenched players in terms of performance.

Conclusion

Every service seems to have its issues somewhere if you look hard enough. I try to avoid injecting my personal opinion and bias as much as possible, so I won't be ranking or outright saying any single company is the best. Some providers did exceptionally well and tended to clump together performance-wise; I will call those the top tier providers. This top tier designation relates to performance only and is based solely on the results of these tests. What each of these companies offers is different and may best suit different audiences depending on a variety of factors beyond performance, such as features, price, support, and scale (I tested mostly entry-level plans). But I will provide a short summary and discussion of the results for each provider.

Top Tier WordPress Hosting Performance

A Small Orange, GoDaddy, Kinsta, LightningBase, MediaTemple, Pagely, Pantheon, WebSynthesis

Each of these companies stayed below the 0.5% error rate in all load testing, all the way up to 2000 concurrent users on both LoadStorm and Blitz.

Honorable Mention

FlyWheel gets an honorable mention. They performed really well on many of the tests but fell apart on the final LoadStorm test with 2000 logged-in users. I'll explain in their individual section why this still deserves an honorable mention.

Amazon Web Services (AWS) vs Digital Ocean

One of the most interesting comparisons to me was CloudWays. They let you choose which VPS provider and instance type you want, then set up their WordPress configuration (in an identical manner, from my understanding) on the VPS. I was granted access to one Amazon and one Digital Ocean VPS from them. The Amazon instance was an EC2 Small (1.7GB RAM) and the Digital Ocean instance had 2GB RAM.

(Chart: AWS vs Digital Ocean, LoadStorm 1000-user test)

The head-to-head LoadStorm results (1000-user test) above pretty clearly show Digital Ocean performing better in every category (with the exception of Peak Response Time, which is a timeout). Digital Ocean sent more data, had fewer errors, and did it faster.

(Chart: AWS vs Digital Ocean, Blitz test)

The Blitz.io results also show pretty clearly that Digital Ocean is outperforming AWS by a wide margin. It delivered twice as many hits with fewer errors and timeouts.

Based on these tests, it's pretty easy to conclude that on low-end VPSs, Digital Ocean's hardware outperforms Amazon's.

Individual Host Analysis

A Small Orange

They've improved their LEMP stack since the last time I tested. They never buckled in any test and were definitely one of the best. Their staff was incredibly friendly (a special thank you to Ryan MacDonald) and they've stepped up their performance game. The one thing that isn't quite there yet is the documentation/user experience; there are a lot of improvements they could make to make their LEMP stack more accessible to the less tech-savvy. All in all, the experience was in line with what I would expect from a company that has one of the highest support ratings on our site.

BlueHost

Their WordPress offering is brand new, and it struggled in every load test. Their price is on the middle-to-high end, but the performance was not. Ultimately, they fell short of where I would expect based on pricing and the competition.

CloudWays

CloudWays was certainly an interesting company to test given that they had two entries, one running on Amazon Web Services (EC2) and another on Digital Ocean. The Digital Ocean VPS outperformed AWS in every category, which was interesting. The AWS instance was near the bottom of the pack performance-wise, while the Digital Ocean one landed in the middle. They have built a very interesting platform that allows deployment and management across providers, but their performance isn't quite there yet. Other companies are running on the same hardware and getting better results. CloudWays doesn't do just WordPress, so it's easy to understand why their performance might not quite match some of their competitors who focus solely on WordPress.

DreamHost

DreamPress was another disappointment. The security features hid some of the performance weakness on the first Blitz test, but it completely failed on the second. DreamPress is advertised as having automatic RAM scaling, with each site run on two VPS instances, but it's very unclear what resources you are really getting for your money. They are charging $50/month for a 1GB RAM VPS, so I get the feeling a lot of resources are shared and it may not be a true VPS.

FlyWheel

FlyWheel was excellent on every test except the final 2000 logged-in user test from LoadStorm. They are built on top of Digital Ocean and I was using the smallest VPS, yet their performance beat VPSs on Digital Ocean that had four times the resources (CloudWays DO). For cached content on the Blitz test, they had the second-highest hits/second and the second-best response time. I suspect the testing hit a hardware maximum: FlyWheel had the best performance with the lowest dedicated resources (512MB RAM). The companies that outperformed it had more resources dedicated to them, or shared resources that presumably allow access to far more than 512MB of RAM. It was an impressive performance given what they are selling, and combined with the best reviews of any company Review Signal has ever tracked, FlyWheel certainly merits serious consideration.

GoDaddy

GoDaddy continues to surprise me. They flew through all the tests, including a weird issue where they transferred 3X the data during the LoadStorm test and didn't show any signs of stress. The only comparison I have to last time is the Blitz testing, where they eked out another 3000+ hits and raised their hits/second from 829 to 888. GoDaddy also raised their max hit rate marginally from 1750 to 1763. What's more impressive is they reduced their errors+timeouts from 686 to 93. More hits with less errors. From a performance perspective, they did excellent in absolute terms and relative to their last benchmarks.

Kinsta

A newcomer that jumped straight to the top of the performance tiers. Kinsta's performance was amazing in the LoadStorm 2000 logged-in user test: they had the lowest peak response time and zero errors over a 30-minute test. They didn't struggle with any test whatsoever and showed zero downtime. Kinsta's performance was top tier.

LightningBase

Another newcomer that jumped straight to the top, and one of the cheapest too, starting at under $10. LightningBase aced the Blitz testing and did excellently on LoadStorm. There was no downtime monitored. LightningBase belongs in the top tier and is delivering amazing value.

Media Temple

Media Temple is interesting because I was told it was running the same technology as GoDaddy (GoDaddy bought Media Temple a year ago). They have a few more premium features like Git and a staging environment. Media Temple's performance was superb. It actually beat GoDaddy's performance in just about every measure by a marginal amount on both Load Storm and Blitz's load testing. If GoDaddy has top tier performance, Media Temple certainly does as well.

Nexcess

Nexcess's performance was excellent in the LoadStorm testing. However, it completely collapsed during the Blitz load testing. I'm really not sure what to make of those results. Perhaps the underlying shared hardware is very good but the static caching setup isn't quite up to snuff? It's probably not worth speculating; suffice it to say, Nexcess ended up looking like a middle-of-the-pack web host instead of a top tier one because of the Blitz test.

Pagely

Pagely put on another spectacular performance. They handled the LoadStorm test with one error. Blitz results stayed similar to the last run: they handled more hits but had a few more errors+timeouts (1 last time, 43 this time). Really not much to add here other than that they continue to be in the top tier.

Pantheon

Pantheon specializes in Drupal hosting, so I was wondering how well that would translate to WordPress. The short answer is, it translated really well. They had a flawless run on the LoadStorm test: 0 errors and not even any spikes in response time over 30 minutes. They are one of the most expensive options on this list (second only to PressLabs), but they definitely make a case for it. Perfect uptime and near-flawless load testing sent them easily into the top tier.

PressLabs

It's hard to write much about PressLabs because we couldn't get LoadStorm to work properly to test out their hosting. However, their Blitz results were lackluster. For the most expensive plan we tested, it was a bit of a disappointment to see it not do stunningly well.

SiteGround

SiteGround sadly didn't do as well as they did last time. Their Blitz load testing score went down slightly, and we couldn't bypass their security measures to properly test LoadStorm. They obviously have some good protection against malicious users trying to access too many things, but it also meant I couldn't get a deeper look this time around, which was a change from the last round of testing. It's slightly disappointing to see the performance dip, but I hope it was due to the extra security measures they put in place that made testing them difficult.

WebSynthesis

WebSynthesis teetered on the edge of having too many errors (0.5%) in the LoadStorm test, but they stayed under it and handled the test quite well. They also had no weird security issues this time around, and WebSynthesis led the pack on Blitz testing. They went from 871 hits/second to 963 hits/second, leading every provider on the Blitz tests with a whopping one error to boot. Sprinkle in some perfect uptime numbers and it's clear WebSynthesis is still a top tier provider and is continuing to get better.

WPEngine

I feel like I could copy and paste my last conclusion about WPEngine: "WPEngine had some issues. Uptime was not one of them, they were perfect or upwards of 99.9% in that department. However, their performance shortcomings became apparent during the load tests." They didn't even make it to the final round of LoadStorm testing, and they were middle of the pack on the Blitz testing. Compared to the last round of Blitz testing, the results were nearly identical, with slightly fewer errors+timeouts. I'm not sure whether I should be disappointed not to see improvement or relieved to see them maintain exactly the same performance and consistency. Their vaunted rankings in Review Signal's reviews have slipped relative to a few of the other providers here (FlyWheel and WebSynthesis). While they were once leading the pack in technology, the rest of the pack is starting to catch up.

 

Thank Yous

A special thanks goes out to the sponsor of this post, and to an individual employee, Scott Price of LoadStorm, who worked countless hours with me in order to perform these tests.

I want to thank all the companies that participated in these tests. I tested the support staff a fair bit at some of them and I thank them for their time and patience.

A special thanks also goes to Chris Piepho from LightningBase, who provided a lot of feedback on the original article and helped improve the methodology for this round.

A huge thanks goes out to Mark Gavalda at Kinsta for his feedback and performance testing discussions. He's tested some things further out than I have, like HHVM and php-ng performance. Also to their designer, Peter Sziraki, who designed the header image for this article.

 

Kevin Ohashi is the geek-in-charge at Review Signal. He is passionate about making data meaningful for consumers. Kevin is based in Washington, DC.





98 thoughts on “WordPress Hosting Performance Benchmarks (November 2014)”

  1. Pingback: Managed WordPress Hosting Showdown – Performance Benchmarks Comparison | Review Signal Blog

  2. Phil

    Thanks for another brilliant post!
    I’m glad to see that my own testing results, while not as meticulous as yours by far, don’t differ at all from yours, with Kinsta, Flywheel and Pantheon as the fastest WP hosting options overall, and GoDaddy having the best price/performance overall (thanks to their currently insanely cheap pricing).

    Reply
    1. Kevin Ohashi (Post author)

      Do either of them actually specialize in WordPress? Just having good VPS servers isn't anywhere near good enough to be competing in this lineup, honestly. Without installing the proper software (CACHING!) they will just collapse (MySQL dies first, generally). It's not the hardware's fault. I tested Digital Ocean last time out of the box and it was terrible. It's not their fault at all either. FlyWheel and Kinsta are both running on Digital Ocean and perform great. So it's definitely not a hardware issue.

      Reply
  3. Milan Petrovic

    Great review! I recently switched to A Small Orange and I am still amazed how fast it is and how they can have such a great support. I am glad that your testing shows how great ASO is.

    Reply
  4. Clifford P

    I’d suggest adding “SSL” as a feature to compare, as it can significantly increase some pricing. And some hosts only allow SSL on one domain if on a multi-domain plan (e.g. SiteGround). Additionally, please add Lightning Base, my preferred dedicated WordPress host.

    Reply
  5. Pingback: Weekly Roundup: November 7, 2014 – Sell with WP

  6. Pingback: 78p.tv – Be At Ease With Your WordPress Security

    1. Kevin Ohashi (Post author)

      Thanks Shane.

      Couple thoughts, variation is to be expected. Without actually fully linking the results it’s hard to say much. Some routes get overloaded. A spike doesn’t necessarily mean it was anything to do with the host. Could be a bad route from the testing location. Also could be a bad shared node (I’m assuming you’re testing a shared host). Problem with benchmarking a lot of normal shared hosting is that the resources aren’t isolated well at all. One bad site can affect hundreds or thousands of others. It’s hard to isolate issues and attribute them.

      I actually don't put much weight in WebPageTest results; it's pretty much just a minor check to see if there are any glaring issues. I'm really most concerned about how they perform under sustained load, and about monitoring on a very long regular interval through uptime monitoring, which has some performance monitoring as well.

      Reply
  7. Affiliate Web Designers

    I’m interested to know the profile of pages you were hitting? I was running performance tests recently against rugbydata.com using blitz.io – I managed to hit > 2000 simultaneous users with < 100 errors or timeouts.

    Then I tried running the Screaming Frog spider against the site and it started timing out in a big way until I optimized the configuration. The difference of course, is that blitz.io is hitting only one page – e.g. the home page – so all you're really testing is the cache retrieval speed. Screaming frog on the other hand, while not a performance analysis tool, was at least retrieving each page only once so it was managing to avoid the cache and properly test the performance of the configuration.

    Reply
    1. Kevin Ohashi (Post author)

      You are correct about blitz testing caching. That’s why I used LoadStorm too. LoadStorm was logging in up to 2000 random users and browsing the website (loading all assets). Take a look at some of the links to the tests — you can see exactly what it’s doing. It’s testing the performance a bit more thoroughly.

      Reply
  8. Abu Zafor

    Hi, how did you do all these tests? It's really amazing and I'm very glad this is around. Right now I'm on SiteGround, but after reading this review/comparison I am a bit confused 🙁

    However, Thanks for publishing this great review 🙂

    Reply
  9. Tony

    Hi Kevin,

    Thanks for this great article.

    I was wondering about the AWS instance, was it a t1.micro or a t2.micro?

    T2 is the new low tier from Amazon and much faster than T1.

    Kind regards,
    Tony

    Reply
  10. Pingback: The Great WordPress Hosting Reviews of 2014 - SellwithWP

  11. Frank

    Hi Kevin

    Great read 🙂

    About the dummy WordPress installation you created to do the tests; which plugins were installed (except for the hosting company ones)? Also, how much content? Would you be willing to share your installation?

    Have you done any tests on websites using the WooCommerce plugin?

    Reply
    1. Kevin Ohashi (Post author)

      Frank,

      Sure. It was just the web hosting ones this time. I just realized I didn’t add any extras this time around. With LoadStorm logging in and testing un-cached performance, adding extra plugins to increase load didn’t seem necessary. So I didn’t test WooCommerce. One of the biggest challenges is coming up with what a ‘normal’ site looks like. And I really don’t think there is a one-size-fits-all set of plugins. So I kept it clean.

      Content wise – it was like 5-7 pages, you can take a look at what one of them looked like: http://kevinohashibenchmark.com

      Reply
  12. Adam

    Curious why you didn’t include the biggest “managed” WordPress host by a long shot: WordPress.com. 🙂

    Maybe in round 3? Email me if you need to work out details; I can put you in touch with the right folks.

    Reply
  13. Pingback: Independent Reviews of Kinsta

  14. Marcel Heemskerk

    How about for round 3 to take pricing into account as well? It’s interesting but maybe you can come up with a certain performance/$ number? Or divide the hosters into cheap-average-expensive?

    Would be nice to know where to get the best bang for the buck!

    Reply
    1. Kevin Ohashi (Post author)

      The cost is listed in the table. People are welcome to make that determination for themselves. There’s also different feature sets with each company. My goal is to present data and compare things apples to apples in as many cases as possible. I’m not convinced price per performance is an option because of all the other considerations that factor into cost.

      Reply
  15. John Peterson

    Thank you so much for doing the arduous work of objectively testing the managed hosting providers. I am very disappointed that WP Engine did not work with you guys. I have sent an email to my Account Executive with WP Engine, and hopefully others will do the same. Hopefully in Round 3, they will be represented.

    Reply
  16. Pingback: How We Got Our Stripe Account Back

  17. James

    Thanks for the detailed review. I'm currently hosting with Flywheel and the service/speed has been just awesome. But it would be nice to be able to host more than one website at the lower-tier pricing.

    I came across WPOven which doesn’t seem to have much limitation to number of websites and seems to be running on top of Linode servers. Kind of hesitant to sign up with them considering they are fairly new.

    I would really like to see how they match up against someone like Flywheel. Maybe you can include them in your future tests.

    Reply
  18. Thoms N Zickell

    If only managed WordPress hosting companies are allowed to enter, why is ASO in? I like them quite a bit, but honestly, why would they get an exception and not other hosts?

    I have accounts with WP Engine 250 plan moving to their dedicated, Flywheel a lot of DO from 15 to 75 and linode based 10 per a VPS, Pagely one base set up, one business set up and one full-fledged VPS-3.
    Web Synthesis one base set up one advanced and one Pro. Press Labs I will have another plan went soon no SSL as of I asked we talked. It’s been fixed will sign up but hydrogen and helium. Oh & Kinsta HHVM

    so my plan is introduce HHVM into my VPS on Pagely and so far so good. WP engine have lots of big plans for them including HHVM, flywheel excellent company hope they offer HHVM, synthesis has gotten better press labs has gotten better. Throw Pressable in the mix just to see what happens. Run get cloudier instead of a budget SiteGround set up. I have yet to build out Pantheon.

    what type of DNS are you using? Are any of them using content delivery networks? If so which ones What was the TTL? if you would like I will give you the resources to run Neustar to test the sites.

    My primary site is run on a FireHost 4 core 8 Gig extremely optimized and secure set up will you allow something like that in if you allow a small orange with somebody to help you as you did with a small orange?

    any machine can run faster than another for specific tests. If you’re clearing the cache every time you’re killing what gives managed WordPress hosting companies their average speed wise at least.

    Media Temple and Godaddy are pretty much the same thing have no interest.

    I like that you actually took the initiative to make this a real test but I’m asking you to consider other systems than the ones you’re using currently to Crane the performance of the websites are you willing to do that? I have the hosting I have what you need to do the tests if you need anything at all.

    Email me,
    Tom

    Reply
    1. Kevin Ohashi (Post author)

      Let me try to address things point by point.

      ASO has a LEMP stack they deploy on their VPSs; that's what I tested. I tried some other company before that offered generic WP installs and it's just not the same. I've had an open invitation for the 2nd round and double the companies participated. Nobody was excluded unless they either a) asked to be excluded (Pressable) or b) failed spectacularly last time because they really aren't in the same class. ASO did ok originally, they made improvements and wanted to keep participating.

      I’m not sure what listing all your plans has to do with anything.

      The hosts were aliased through my DNS. I used whatever the default setup was for each host as far as CDNs go. You would have to check each host on TTL. I tested out-of-the-box performance on these things. What Neustar resource are you referencing and what value would it add to my testing?

      Again, A Small Orange has a VPS template, they weren’t doing some custom setup for me.

      I test both cached and uncached performance. That was the idea. Logged in users break a lot of caching, but it’s a realistic scenario to have users that required dynamically generated content which hits PHP and MySql. Everyone was tested the same, it’s to create a baseline for comparison.

      As far as MT and GoDaddy, yes I believe they are running on the same platform and it showed. That’s ok. Interesting to see such near identical performance.

      Your final comment about other systems. What other systems are you referring to? The goal of this testing is to track the out of the box performance of managed WP providers and create a baseline for comparison. I’m glad you liked the real testing I did.

      Reply
  19. Pingback: GoDaddy WordPress Hosting Review | Review Signal Blog

  20. Pingback: A Small Orange WordPress Hosting Review | Review Signal Blog

  21. Thomas Zickell

    Hi Kevin,
    I apologize for my grammar in the last post. I'm using voice dictation software and German is my native language, so I apologize if that did not make any sense. What I meant to say was that I appreciate somebody doing tests that do not involve affiliate links and are not sponsored by a company; it is rare to find that type of testing.

    The reason I listed the companies I'm hosting with is simply to offer resources if you wanted to test something a little more apples-to-apples, like the three companies that can each do HHVM, all running smaller VPS setups. I think it would be interesting to see the different tiers, because with Flywheel, like you said, you must have picked the tiny plan which comes with very little RAM. Flywheel does, as you pointed out, allocate more resources for larger plans, so picking the next plan up or something similar would be very interesting.

    All I wanted to say was that I would lend you hosting, but it seems like you probably do not need it. I was not trying to just list a bunch of stuff.

    Great job, and I would love to see a customized test where companies are configured to go up against WordPress VIP.

    Respectfully,
    Tom

    Reply
    1. Kevin Ohashi (Post author)

      Thanks Thomas and I appreciate the offer. To keep things standardized I deal with each company directly. Apples to apples is really hard when there is so much variance in terms of hardware and configs. Only when it’s on top of the same infrastructure is it really easy to say X outperforms Y because both are on identical instances from the same provider. But even then, there’s the possibility of a bad node or something. Testing and benchmarking is really tough. I don’t think I’ve got it perfect, but I’m trying to improve it each round and I appreciate your comments and offer!

      Also we’ll see if I can get WordPress VIP in next round.

      Reply
  22. Thomas Zickell

    Sorry for the two replies, but Neustar offers what I believe is the best synthetic testing available. I've tried a lot; maybe SmartBear comes close, but Neustar's load testing with synthetic visitors and RUM is, I believe, the best tool to test real user measurement with synthetic bots. Of course you could use it for real users too.

    I believe it would be an asset in future tests.

    The hosts were aliased through your DNS? If you don't mind me asking, is it hosted DNS? The only reason I'm bringing it up is that low-quality DNS can cause issues if everything's going through it, but I'm sure you checked for this.

    I would be interested in seeing the method by which you choose the offering; for instance, Flywheel's base plan and PressLabs' top server seem lopsided. Maybe you had access to all the plans, I don't know.

    My final comment was gibberish, I apologize; it was speech recognition. However, if you did want uptime history for the hosting companies, which I have collected for over one year, I can give you that, or things like that. I don't know, I'm trying to be of service.

    I agree. Pressable actually was in good company when it was ZippyKid; now it is not.

    Sorry for two posts in a row.
    All the best,
    Thomas

    Reply
    1. Kevin Ohashi (Post author)

      I honestly didn’t pay too much attention to the DNS issue. Other than the initial lookup, I think all my testing was within the TTL records of the DNS, so it should be getting cached really close to the server.

      I don't know anything about Neustar or SmartBear's software for testing. What makes them better than LoadStorm and Blitz? RUM? With some of the testing, price also becomes a factor; I certainly can't afford the resources I'm using without the cooperation of the load testing companies. So unless they're willing to sponsor, it's generally out of the question, sadly.

      I picked smallest plans at FlyWheel and Press Labs. PressLabs just charges a lot more to get started, but having those sort of plans in the mix makes it interesting I think.

      Sorry for the delayed response, I think I meant to answer your post but got busy and forgot until I noticed it again today. My apologies.

      Reply
  23. Brian MacKinney

    Great review!
    I work at Pantheon, and personally expected even better performance. I clicked through to our Loadstorm results, and noticed you tested against your site’s dev environment. From our docs:

    What are the differences between the environments?

    Dev has lower TTL on Varnish caching and shows errors to site users.
    Test has the same caching configuration as Live and does not show errors to users, but only one application server.
    Live has optimal caching and does not show errors to users, and (depending on the plan) can have multiple application servers for high availability and high performance.

    Great article, I’m looking forward to round three.

    Reply
    1. Kevin Ohashi (Post author)

      Brian,

      Thanks for the exact docs. When I was doing the testing I talked with Cal about it. I was told ‘Yes, the sandbox is the same configuration as our professional plan. (cc:ing Josh in case I’m wrong and he can correct.) All three environments (Dev, Test, Live) are configured exactly the same. If it works in dev, it works in prod.’ Josh mentioned live would have better edge caching for some JS/CSS I am noticing in the email thread. It may have just got lost in the details, at this point I don’t remember why I used dev (maybe I had trouble accessing live? maybe it was a mistake?) Either way, Pantheon didn’t seem to have any issues.

      Reply
  24. Pingback: WordPress Hosting Providers Study: Web Performance & Scalability - LoadStorm

    1. Kevin Ohashi (Post author)

      Because they weren’t really in the same tier or providing the same service. It was an experiment in the first round of testing and it didn’t work. There are companies using Digital Ocean (FlyWheel, Kinsta, CloudWays) with actual customized WP stuff.

      Reply
  25. Milo

    Thanks for the great info! Super helpful and informative.

    Just to be clear in case I missed something. For ASO, was this the out of the box cloud VPS, or did you request any modifications?

    Thanks again, fantastic write-ups.

    Reply
  26. Pingback: Pantheon WordPress Hosting Review | Review Signal Blog

    1. Kevin Ohashi (Post author)

      Do you represent either company out of curiosity? Remarkable looks like a cpanel host and from what I’ve tested in the past that kind of generic hosting isn’t in the same league.

      Reply
      1. Mark Bailey

        Hi,

        Brandon doesn’t represent us, though we greatly appreciate the mention!

        I wanted to clarify that we also offer cPanel, but our setup is far from generic. We modify quite a lot to achieve our level of performance and security, and we also offer multiple layers of backup beyond what cPanel alone provides.

        I invite anyone to run speed tests on us.

        Thanks,

        Mark Bailey, CEO

        Reply
        1. Kevin Ohashi (Post author)

          Mark,

          Thanks for commenting. To be very clear, you want to participate in the next round of testing? If that’s the case, I will be sending you an email when I begin testing. I try to get most companies to opt-in at this point because these tests are grueling and the majority of companies don’t stand up to them.

          Reply
    1. Kevin Ohashi (Post author)

      I haven't even begun testing for round 3, so it won't be anytime soon. To give you some perspective, it takes about two months to do the actual testing; I wouldn't think it will be published before April/May.

      Reply
  27. Pingback: The Definitive Guide to Choosing a Host for Bloggers - Once Coupled

  28. Pingback: In search of the perfect web host - Catalyst Creations West

  29. Pingback: Review Signal WordPress Hosting Report Summary

  30. Pingback: Media Temple WordPress Hosting Review | Review Signal Blog

  31. Pingback: LightningBase WordPress Hosting Review | Review Signal Blog

  32. Pingback: Pagely WordPress Hosting Review | Review Signal Blog

  33. Pingback: WebSynthesis WordPress Hosting Review | Review Signal Blog

  34. Pingback: Kinsta WordPress Hosting Review | Review Signal Blog

  35. Pingback: Drupal and WordPress Have Sold Us Out | Review Signal Blog

  36. Andrei

    Hi Kevin. Any plan for a part 3?

    I’m currently looking at Kinsta to host several websites but I haven’t been able to find anything about them (reviews, opinions) since they changed the plans earlier this year.

    Reply
    1. Kevin Ohashi (Post author)

      Just starting to think about it. Got LoadStorm on board for round 3 this past week. I need to find some time and start talking with all the companies (another 5-10 want to be included in round 3). Mentally gotta prepare for the mountain of work it will be 🙂 If you’re needing data now or very soon, it’ll be at least two months away for me to organize, test, analyze and publish. Probably more like ~3 realistically.

      Reply
  37. Pingback: Endurance International Group – Profitable? | Review Signal Blog

  38. Dustin Meza

    Hey Kevin,

    Awesome stuff here, at WP Engine we have recently started upgrading customers to our new PHP 5.5 infrastructure and all new customers are placed on the new infrastructure as well. I’d be happy to ensure your existing account is on our PHP 5.5 environment for your round 3 testing or if you sign up for a new one for round 3 it will automatically go there. Just wanted to call this to your attention so you know we’ve been hard at work to improve the entire experience for our customers, and we look forward to round 3 showing it!

    Dustin Meza
    Senior Manager, Customer Experience Operations

    Reply
    1. Kevin Ohashi (Post author)

      Thanks Dustin. I’ve sent messages to a handful of people at WPEngine about round 3 but haven’t seen anyone sign up yet. Tomas Puig told me to contact Jason Cosper about when I was actually testing. But I’m waiting to see someone actually sign up. My initial email was sent to Sarah Pope and Evan Lacivita. If you wouldn’t mind checking what’s up with it. I can also send you a copy if you can help push this through for your participation in round 3.

      Reply
  39. Steven Haffley

    Funny how the only sites you review are the ones that give you kickbacks for commissions isn’t it? You can try cloaking your links all you want to, it’s still easy to find out what your master plan is on this post, and that’s reviewing hosts that pay you money when you sign up a customer.

    Reply
    1. Kevin Ohashi (Post author)

      Kinsta and Pantheon both do not have affiliate programs and did spectacularly well. FlyWheel didn’t have one when I did the testing (they do now and I signed up right away). PressLabs doesn’t have one either.

      Having an affiliate program has nothing to do with how well the companies performed or which ones I considered reviewing.

      That said, yes, I have a lot of affiliate deals with most companies. This is a business, I don’t spend 2 months of my time multiple times per year working on something for free. I disclaim that I may be compensated on every single page of this website and explain exactly how it works here: https://reviewsignal.com/howitworks It’s not some magical secret that I’m trying to hide, it’s very much public knowledge.

      The important question is, is what I am doing accurate and fair? To answer that, I ask you to feel free to message and ask any company I have tested and see what they think whether they did well or not. That’s the benchmark you should hold me accountable to. Not some silly ‘omg he’s trying to make money’ standard, because the answer to that is yes, I enjoy eating, paying for things and having income in general.

      Reply
  40. Pingback: Pathetic SEO Made Worse by Sluggish Website Speeds

  41. Pingback: Where Can You Find Good WordPress Hosting? - Lockedown Design

  42. Pingback: DreamPress 2: The evolution of managed WordPress at DreamHost | Welcome to the Official DreamHost Blog

  43. Pingback: DreamPress 2: The evolution of managed WordPress hostong at DreamHost | Welcome to the Official DreamHost Blog

  44. Thomas Zickell

    Hey Steve and Keven,
    I believe, after doing a significant amount of testing myself, that there was no bias. It is very true that Flywheel, Kinsta and Pantheon did not have affiliate programs at the time of the test. Pantheon has a program where you would actually have to create the site for the person in order to be compensated. No links.

    Kevin brings up the most important point: was this test biased?

    I do not believe so at all.

    Many companies that pay extremely high amounts of money for sign-ups did not do well in this test. So in my opinion this is much better than most tests that have come before, which are extremely biased.

    Obviously Kevin is in business; he has to be able to make money, and it would not make any sense for him not to monetize this.

    I believe he is smart enough and honest enough not to fake numbers; it would not be valuable to Kevin because it would not be valuable to us.

    Kinsta & Flywheel now have sections for affiliates.

    FYI Kevin, speak to Pedro at PressLabs; they have a unique affiliate program.

    At the time of the review, Flywheel did not have an affiliate program.

    I think Kevin realizes the value of the content being trustworthy is higher than throwing the test because somebody paid you to.

    I don't doubt the results. I do believe there should be more attention paid to security measures, and it seems like Kevin has been doing that as well.

    All the best,
    Tom

    Reply
  45. Marius

    Hi guys,

    I’m looking for managed WordPress hosting with a small budget (I’m just not sure about unmanaged, regarding security for example, I have no experience with maintaining a server).

    I'm currently with GoDaddy but it's a bit too managed for my taste. I can't create my own MySQL databases; instead I had to use their one-click WordPress installation, delete all its tables to import my own (since I migrated), and upload all my files through SFTP. And they install 4 of their own plugins automatically whenever there is a new WordPress update. I can delete them but they will be re-installed at some point, which I find annoying.

    So I was looking at Cloudways and tried their trial which all works fine. Since I’m on a budget and using the $15,- plan, I’m worried about the amount of visitors the chosen server can handle. They mention 30-70k with that plan.

    How can GoDaddy handle 400.000 on their server for $21,- I wonder.

    Any tips, idea’s?

    Reply
    1. Kevin Ohashi (Post author)

      How many visitors are you planning to get with a $15 budget? My suspicion is CloudWays should be able to handle it without much issue unless there is a huge disconnect between budget and the size of the website. On purely cacheable content their digital ocean box sustained 2000 concurrent hits/second. It struggled a bit more with 1000 logged in users (which breaks caching), but we’re talking heavy duty sites to get that sort of load. If you’re happy with CloudWays, I would stick with them. You can always scale there as you grow pretty easily.

      Reply
  46. Thomas Zickell

    You state that you would like to pay around $15 a month; my recommendation is Flywheel. It is $15 for a DO SSD VPS and they will migrate and manage the site. If you are honestly worried about having 30-70,000 visitors a month, you have a site that should be able to pay for itself with advertisements, depending on what type of site it is.

    You get 5,000 unique visitors for $15 a month with Flywheel; they count a visitor as one IP per 24-hour period.

    If you need to handle 70,000 visitors a month, please be certain to check your analytics and realize that there are ways to monetize your website with that many visitors.

    I would strongly recommend that you try them; I don't know of a better deal for managed WordPress at the $15 price point.

    I hope this is of help,
    Tom

    Reply
  47. Prasad Saxena

    Hello,

    I am hosted on GoDaddy's Pro Deluxe plan, which has managed WordPress, but it fails at just 100 clients/sec. I have tested it with loader.io and loadimpact.com.
    Both give the same results.

    Where you had over 900+ clients on Godaddy per second, I am unable to do it on my WordPress website.
    I bought it seeing your review here, but it fails under every other condition.

    I feel hopeless for this type of performance.

    Reply
    1. Kevin Ohashi (Post author)

      You may not see the same results for a variety of reasons. First off, you are using two completely different tools from the ones I used. The results aren’t comparable across services because each one does it differently. Second, GoDaddy may have security measures in place to prevent this kind of behavior. Most load testing looks similar to DDoS tools and is often blocked by many companies. I often have to work specifically with the security teams to bypass it to actually get these tests to run. This isn’t a problem when it’s real load from users all around the world using different computers, devices and IPs. But when a load testing service does it, it looks very different and often gets blocked.

      That said, if you can publicly link your results from those services, I am curious to see the results and others might be too.

      Reply
  48. Pingback: WordPress Hosting Performance Benchmarks (2015) | Review Signal Blog

    1. Kevin Ohashi (Post author)

      Justin,

      I have your company in my notes. The issue for me is, are you a web hosting company or a server management/configuration company closer to easyEngine? I had some ideas too which included testing you out, but I honestly didn’t think it fit in this set of tests. And if you think you belong in this grouping, please tell me why, I’m always willing to change my mind given arguments that make sense.

      Reply
  49. Alessandro

    Hello,

    Thank you for your review. I can confirm that Lightning Base offers an amazing service, with incredible uptime and speed. I have been using it for some months and it is really a great provider of managed hosting for WordPress. Soon they will also offer PHP 7, and the performance will be even more terrific! They also use Apache + Varnish Cache, with SSDs and a free CDN.

    Reply
  50. Rehbecca Lowder

    I am currently running WP with Woo on a HostGator top pro dedicated server. I am expecting some serious concurrent usage through our new membership site, which has digital downloads. I'm running the Total theme and have an SSL, and I'm getting an average of 1.1 sec page speed and a 98% score on non-SSL pages. But on SSL pages, I'm getting an average of 2.6 sec and a 92% score. Those numbers are with no traffic besides myself, so obviously they are great; we aren't live yet. Ya know, the top dedicated server is great for space, but I'm starting to wonder given the lack of SPDY, nginx, and memcaching on my machine. Should I stay on a dedicated server if I'm expecting to pull 3000 concurrent logged-in users at a time or more? Or should I make the switch to a managed provider? There is so little info out there comparing a HostGator dedicated pro machine to the new managed WordPress hosting options. I'm lost. Looking forward to your thoughts! And thanks for an EXCELLENT writeup!!

    Reply
    1. Kevin Ohashi (Post author)

      If you’re seriously getting 3000 concurrent logged in users then you need a managed WordPress company. That’s an insane amount of traffic and you need some experts to help keep it optimized. In a lot of cases, software is going to give you a lot more speed than hardware until you really start to scale up big. So throwing dedicated resources at something without the proper software to back it up really isn’t a great choice.

      Reply
  51. Pingback: Developer Tips for Writing Good WordPress Code

  52. Dawn G.

    Thank You so much for the thorough tests and reviews here Kevin.
    Based on your conclusions above, I ended up joining LightningBase at the beginning of this year and it was a very good choice. I monitor my sites, and my site hosted with them has not been down even once in over 10 months!
    Best of all I contacted support many times with various questions and needs and the support from the owner – Chris – has been absolutely OUTSTANDING!! I highly recommend them for anyone who is looking for an affordable and exceptional managed wp hosting.

    Reply
  53. Website Designer

    I wish all the hosts were more co-operative when it came to testing, that would have helped you fully test each platform. With the varying features of each host, it becomes super difficult to analyze them on equal merits but I have to commend your tests, they’re super informative!
    I would be interested in your personal ranking of each one though (I know I can count on you being unbiased).

    Joshua

    Reply
  54. Pingback: WordPress Hosting Performance Benchmarks (2016) | Review Signal Blog

  55. Pingback: Review Signal Publishes 2016 WordPress Hosting Performance Benchmarks – WordPress Tavern
