Tag Archives: wordpress

Kinsta WordPress Hosting Review

kinsta_logo_dark

This post is based off WordPress Hosting Performance Benchmarks (2014).

 

Overview

Kinsta is yet another newcomer to our testing with something to prove, and it easily shot to the top of our performance charts. Kinsta's plans have changed quite a bit since we tested them: when our testing was done they offered a $27/month plan, but they've since gone up-market and their cheapest plan is now $157/month. It seems they're targeting people who want serious performance.

The Plan

All testing was done on a shared account, which is no longer available. The plan tested allowed 1 WordPress site, 1GB SSD disk space and 50GB bandwidth, and cost $27/month.

Performance

LoadStorm

The first performance test was done with LoadStorm. Kinsta made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see Kinsta's result in this graph (click on it to play with the interactive results):

Load-Storm-Kinsta-2000

 

Kinsta aced the LoadStorm test. It had zero errors and one of the fastest average response times at 316ms. Kinsta also had the absolute lowest peak response time at 942ms. That's an astonishing feat: over 30 minutes Kinsta served nearly 250,000 requests and not a single one took over a second to be delivered. Amazing.

Blitz

The second load test that was run on Kinsta was Blitz. Blitz was used to test cached performance. It simply requested the home page, scaling from 1 to 2,000 concurrent users over 60 seconds.

Blitz-Kinsta-2000

I can't draw lines this straight. The response time was flat. As you would expect from a company that aced the cache busting test, they didn't struggle in the slightest. Full Blitz Results (PDF)

Uptime

Two third-party uptime monitoring services (StatusCake and UptimeRobot) tracked the test site for a month. The results for Kinsta were perfect. 100% uptime according to both sources.

WebPageTest

“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. Kinsta was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Company Dulles, VA Miami, FL Denver, CO Los Angeles, CA Average (all times in seconds)
Kinsta 0.759 0.752 0.947 0.592 0.7625
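
The Average column appears to be a simple mean of the four location load times; a minimal check using Kinsta's numbers from the table above:

    # Average column check: mean of the four test locations (seconds).
    times = {"Dulles": 0.759, "Miami": 0.752, "Denver": 0.947, "Los Angeles": 0.592}
    average = sum(times.values()) / len(times)
    print(round(average, 4))  # 0.7625, matching the table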

Kinsta had the second fastest average response time of all the companies we tested. No issues with this test in the slightest.

Conclusion

Kinsta, a new-comer to our testing, jumped straight to the top of the performance tiers. Kinsta’s performance was amazing in the Load Storm 2000 logged in user test. They had the lowest peak response time and zero errors over a 30 minute test. They didn’t struggle with any tests whatsoever and showed zero downtime. Kinsta’s performance was undoubtedly top tier.

Visit Kinsta


WebSynthesis WordPress Hosting Review

websynthesis-big

This post is based off WordPress Hosting Performance Benchmarks (2014).

 

Overview

WebSynthesis [Reviews] had an extremely strong showing in our first round of testing once I got past a security issue. They managed to defend their status as a top tier WordPress web host.

The Plan

All testing was done on a VPS account. The plan tested had 2 GB ram, 40 GB disk space, 650 GB bandwidth, 20,000 visitors/day and costs $97/month.

Performance

LoadStorm

The first performance test was done with LoadStorm. WebSynthesis made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see WebSynthesis's result in this graph (click on it to play with the interactive results):

Load-Storm-WebSynthesis-2000

 

WebSynthesis stayed under the threshold of a 0.5% error rate, but it was close. This grueling 2000 user test really put a strain on the server, as you can see from the spikes, but it held for 30 minutes without failing.
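
To put a number on "close", here's a back-of-envelope check using the raw 2000-user LoadStorm figures from the full benchmark post (roughly 240,000 requests with 1,173 errors):

    # Error-rate check against the 0.5% threshold, using WebSynthesis's
    # 2000-user LoadStorm figures from the benchmark results table.
    errors = 1173
    requests_total = 240305
    error_rate = errors / requests_total
    print(f"{error_rate:.2%}")   # ~0.49%
    print(error_rate < 0.005)    # True, but only just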

Blitz

The second load test that was run on WebSynthesis was Blitz. Blitz was used to test cached performance. It simply requested the home page, scaling from 1 to 2,000 concurrent users over 60 seconds.

Blitz-WebSynthesis-2000

WebSynthesis was better than flat. There is a slight downward trend in response time. WebSynthesis led the pack, again, delivering 57,776 hits in one minute with a single error. The best results of anyone on this test. Full Blitz Results (PDF)

Uptime

Two third-party uptime monitoring services (StatusCake and UptimeRobot) tracked the test site for a month. The results for WebSynthesis were 100% uptime according to both sources, again.

WebPageTest

“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. WebSynthesis was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Company Dulles, VA Miami, FL Denver, CO Los Angeles, CA Average (all times in seconds)
WebSynthesis 0.407 0.835 0.982 1.024 0.7812

WebSynthesis handled this test fine. In fact, they had the single fastest average page load from a single location of any company at 0.407 seconds from Dulles, VA.

Conclusion

WebSynthesis [Reviews] was teetering on the edge of the Load Storm error threshold (0.5%), but they stayed under it and handled the test quite well. They also had no weird security issues this time around, and WebSynthesis led the pack on the Blitz testing. They went from 871 hits/second last time to 963 hits/second this time, leading every provider on the Blitz tests with a whopping 1 error to boot. Sprinkle in some perfect uptime numbers and it's clear WebSynthesis is still a top tier provider and is continuing to get better.

Visit WebSynthesis


Pagely WordPress Hosting Review

pagely-full-blue-640x220

This post is based off WordPress Hosting Performance Benchmarks (2014).

 

Overview

Pagely came in with a title to defend. Pagely was one of the top tier web hosts in our first round of testing and didn't show any signs of struggling. My biggest complaint was that SFTP was an add-on, which they now include with every account. Performance-wise, Pagely was back at it again with another top tier showing.

The Plan

All testing was done on a shared account, the Personal / Business plan.  This plan allows for 1 WordPress site, 5GB disk space, 10GB bandwidth and costs $24/month.

Performance

LoadStorm

The first performance test was done with LoadStorm. Pagely made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see Pagely's result in this graph (click on it to play with the interactive results):

Load-Storm-Pagely-2000

 

Pagely did well on this test. There was one error total which caused a response time spike (blue line in the graph). Other than a single error, the performance was impeccable.

Blitz

The second load test that was run on Pagely was Blitz. Blitz was used to test cached performance. It simply requested the home page, scaling from 1 to 2,000 concurrent users over 60 seconds.

Blitz-Pagely-2000

Pagely's Blitz result was exemplary. There were only 43 timeouts and errors combined out of more than 51,000 requests, and the near flat response time means it had no trouble with the load. Pagely didn't blink at this test, as expected based on their performance last time. Full Blitz Results (PDF)

Uptime

Two third-party uptime monitoring services (StatusCake and UptimeRobot) tracked the test site for a month. The results for Pagely were 99.95% and 100% uptime. It's hard to complain about those numbers or find any issue with Pagely's uptime.

WebPageTest

“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. Pagely was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Company Dulles, VA Miami, FL Denver, CO Los Angeles, CA Average (all times in seconds)
Pagely 6.831 0.86 0.913 0.709 2.32825

Pagely was the only company that had any issue with the WebPageTest component of our testing. The test from Dulles had bizarrely high load times for no explicable reason. The other locations were all sub one second, so I dismissed it as a fluke networking issue rather than a real problem with the hosting.
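
To quantify why that Dulles run was dismissed: excluding it, Pagely's average from the other three locations is right in line with the rest of the pack. A quick sketch using the table values:

    # Pagely's WebPageTest average with and without the anomalous Dulles run (seconds).
    times = {"Dulles": 6.831, "Miami": 0.86, "Denver": 0.913, "Los Angeles": 0.709}
    with_dulles = sum(times.values()) / len(times)
    without_dulles = sum(v for k, v in times.items() if k != "Dulles") / 3
    print(round(with_dulles, 3), round(without_dulles, 3))  # 2.328 vs 0.827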

Conclusion

Pagely easily defended its title as one of the top tier WordPress hosts. They handled the Load Storm test with 1 error. Blitz results stayed similar to the last run. They handled more hits, but had a few more errors+timeouts (1 last time, 43 this time). If performance is the name of the game, Pagely continues to be at the forefront.

Visit Pagely


LightningBase WordPress Hosting Review

lightningbaselogo1600x290b

This post is based off WordPress Hosting Performance Benchmarks (2014).

 

Overview

LightningBase was a newcomer to our WordPress Hosting Performance Benchmarks. LightningBase's founder, Chris Piepho, was incredibly helpful, giving feedback on how the testing in Round 1 was done. His feedback played a big part in the differences you see in Round 2, namely cache busting. So it's no great surprise that a service run by someone who cares so deeply about performance did remarkably well in our testing.

The Plan

All testing was done on a shared account, the Personal plan.  The personal plan allows for 1 WordPress site, 10,000 visits/month, 1GB SSD disk space, 10GB bandwidth, 20GB CDN and costs $9.95/month.

Performance

LoadStorm

The first performance test was done with LoadStorm. LightningBase made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see LightningBase's result in this graph (click on it to play with the interactive results):

 

Load-Storm-Lightning-Base-2000

LightningBase handled the test with minimal errors (23) and showed minimal signs of struggling with the load. There appears to be a bit of delay every so often that looks like a cache update. Other than that minor detail it looks excellent.

Blitz

The second load test that was run on LightningBase was Blitz. Blitz was used to test cached performance. It simply requested the home page, scaling from 1 to 2,000 concurrent users over 60 seconds.

Blitz-LightningBase-2000

LightningBase's Blitz result looks textbook. There were no errors and no timeouts. There was a near flat response time which means it had no issues at all. LightningBase aced our Blitz testing. Full Blitz Results (PDF)

Uptime

Two third-party uptime monitoring services (StatusCake and UptimeRobot) tracked the test site for a month. The results for LightningBase were perfect uptime in both cases. In the uptime department, LightningBase had a flawless performance.

WebPageTest

“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. LightningBase was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Company Dulles, VA Miami, FL Denver, CO Los Angeles, CA Average (all times in seconds)
LightningBase 0.584 0.787 0.936 0.675 0.7455

There were absolutely no issues with their WebPageTest results. LightningBase had the fastest average load time of any host in our testing.

Conclusion

LightningBase is another newcomer that jumped straight to the top, and one of the cheapest options too, starting at under $10 per month. LightningBase aced the Blitz testing and did excellently on the Load Storm tests. There was zero downtime monitored. LightningBase belongs in the top tier of WordPress hosting companies and is delivering amazing value on top of their stellar performance benchmarks.

Visit Lightning Base


Media Temple WordPress Hosting Review

media-temple-logo

 

This post is based off WordPress Hosting Performance Benchmarks, where you can read the full details of how Media Temple performed against the competition.

Overview

MediaTemple [Reviews] is a new entrant into the managed WordPress hosting space along with its parent company GoDaddy. Media Temple was acquired by GoDaddy in 2013 and both have jumped head first into the WordPress space, sharing a lot of technology. Media Temple has a generally more positive reputation than its parent company and targets developers and designers with a premium offering. MT wasn't in our first round of testing, but they did very well in our second round. Media Temple also recently changed their plans and pricing structure, offering plans that scale much higher than the one-size-fits-all plan originally offered. The plan used during our testing was more expensive and had slightly fewer features, so it seems new customers get slightly better value and the ability to scale.

The Plan

All testing was done on Media Temple's WordPress hosting package. The plan had 20GB of SSD disk space, unlimited bandwidth usage, allowed 3 sites and had Git and Staging technology. The cost was $29/month.

Performance

LoadStorm

The first performance test was done with LoadStorm. Media Temple made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see MT's result in this graph (click on it to play with the interactive results):

 

Load-Storm-Media-Temple-2000

Media Temple handled this test barely showing signs of struggle: a staggeringly low error count of 9 (out of more than 249,000 requests) and one of the lowest peak response times at under 1.5 seconds.

Blitz

The second load test that was run on Media Temple was Blitz. Blitz was used to test cached performance. It simply requested the home page, scaling from 1 to 2,000 concurrent users over 60 seconds.

Blitz-Media-Temple-2000

Media Temple's Blitz results were near textbook. Flat response times while users scaled to 2000 and a <0.1% error+timeout rate. Full Blitz Results (PDF)

Uptime

Two third-party uptime monitoring services (StatusCake and UptimeRobot) tracked the test site for a month. The results for Media Temple showed 99.81% and 100% uptime respectively.

WebPageTest

“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. MT was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Company Dulles, VA Miami, FL Denver, CO Los Angeles, CA Average (all times in seconds)
Media Temple 1.516 0.983 0.955 0.555 1.00225

There were no issues with their WebPageTest results; it loaded very quickly with a great average speed of about one second.

Conclusion

MediaTemple [Reviews] is interesting because I was told it was running the same technology as GoDaddy (GoDaddy bought Media Temple a year ago). They have a few more premium features like Git and a staging environment. Media Temple’s performance was superb. It actually beat GoDaddy’s performance in just about every measure by a marginal amount on both Load Storm and Blitz’s load testing. If GoDaddy's WordPress Hosting has top tier performance, Media Temple certainly does as well.

 


Pantheon WordPress Hosting Review

This post is based off WordPress Hosting Performance Benchmarks (2014).

pantheon_logo_tagline

Overview

Pantheon is one of the newcomers to the managed WordPress space. They came over from the Drupal world, where they focused on developers and enterprise. They've taken their developer tools, brought them to the WordPress space and made quite a splash, leaping into our top tier of managed WordPress hosting companies.

The Plan

All testing was done on the Professional plan (shared) and using the WordPress stack. The plan had 20GB of space and allowed 100,000 visitors per month. The price was $100/month.

Performance

LoadStorm

The first performance test was done with LoadStorm. Pantheon made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see Pantheon's result in this graph (click on it to play with the interactive results):

 

Load-Storm-Pantheon-2000

Pantheon had a whopping zero errors and scaled without issue.  LoadStorm independently analyzed my testing and named Pantheon as performing the best of any WordPress company tested.

Blitz

The second load test that was run on Pantheon was Blitz. Blitz was used to test cached performance. It simply requested the home page, scaling from 1 to 2,000 concurrent users over 60 seconds.

Blitz-Pantheon-2000

Pantheon's result looks like it was from a textbook. It maintained roughly the same response time from one to two thousand concurrent users. Full Blitz Results (PDF)

Uptime

Two third-party uptime monitoring services (StatusCake and UptimeRobot) tracked the test site for a month. The results for Pantheon were perfect uptime in both cases. There's nothing more to say than that.

WebPageTest

“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. Pantheon was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Company Dulles, VA Miami, FL Denver, CO Los Angeles, CA Average (all times in seconds)
Pantheon 0.654 0.828 0.923 0.954 0.83975

There were absolutely no issues with their WebPageTest results; it loaded very quickly with a great average speed of under one second.

Conclusion

Pantheon specialized in Drupal hosting, so I was wondering how well it would translate to WordPress. The short answer is, it converted over really well. They had a flawless run on the LoadStorm test - zero errors and not even any spikes in response time over 30 minutes. Pantheon is one of the more expensive options on the market, but they make a very strong case for it. Perfect uptime and near flawless load testing sent them easily into the top tier.


 

 

A Small Orange WordPress Hosting Review

This post is based off WordPress Hosting Performance Benchmarks (2014).

asmallorange

Overview

A Small Orange [Reviews] has won numerous awards from Review Signal including Best Overall Web Host 2012, Best Shared Hosting Provider 2013 and Best Managed VPS Provider 2013. They've been consistently near the top of our rankings since the beginning. They stumbled a bit during our first round of WordPress testing.

But what differentiates a good hosting company from an average one? Accepting there was a shortcoming and improving on it. A Small Orange did exactly that in our second round of testing.

The Plan

All testing was done on a Cloud VPS running ASO's WordPress LEMP stack. The VPS had 1 GB Ram, 15 GB SSD disk space, 600 GB of bandwidth and cost $25/month.

Performance

LoadStorm

The first performance test was done with LoadStorm. A Small Orange made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see ASO's result in this graph (click on it to play with the interactive results):

 

Load-Storm-A-Small-Orange-2000

A Small Orange handled the test without a single error and showed no signs of struggling with the load. There isn't much more to say: ASO handled LoadStorm with grace and ease.

Blitz

The second load test that was run on A Small Orange was Blitz. Blitz was used to test cached performance. It simply requested the home page, scaling from 1 to 2,000 concurrent users over 60 seconds.

Blitz-A-Small-Orange-2000

A Small Orange's Blitz results were about what you'd expect based on the previous test. It showed minimal signs of load around ~1,800 users, where response times spiked a bit but stayed under 150ms, which is barely noticeable. Overall, it was a fantastic performance. Full Blitz Results (PDF)

Uptime

Two third-party uptime monitoring services (StatusCake and UptimeRobot) tracked the test site for a month. The results for A Small Orange were perfect uptime in both cases. StatusCake also recorded a blazingly fast average response time of 23ms, which led the pack by a wide margin. In the uptime department, ASO had a flawless performance.

WebPageTest

“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. ASO was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Company Dulles, VA Miami, FL Denver, CO Los Angeles, CA Average (all times in seconds)
A Small Orange 1.443 0.801 0.836 0.64 0.93

There were absolutely no issues with their WebPageTest results; it loaded very quickly with a great average speed of under one second.

Conclusion

A Small Orange [Reviews] is one of the top tier WordPress hosting providers when looking at performance. ASO improved their LEMP stack since the last time I tested, and they never buckled in any test. Their staff was incredibly friendly (special thank you to Ryan MacDonald) and they've stepped up their performance game. The one thing that isn't quite there yet is the documentation/user experience; there are a lot of improvements they could make to make their LEMP stack more accessible to the less tech savvy. All in all, the experience was in line with what I would expect from a company that has one of the highest support ratings on our site.

Visit A Small Orange and use the coupon code 'orangelover' for 15% off.


GoDaddy WordPress Hosting Review

GoDaddyLogo

This post is based off WordPress Hosting Performance Benchmarks, where you can read the full details of how GoDaddy performed against the competition.

Overview

GoDaddy [Reviews] is the company that sparked this series of WordPress performance testing. They said their WordPress hosting was as good as anyone else's WordPress hosting on the market. I wanted to see if that was true in our first round of WordPress testing, and it certainly was. The second round of testing did not disappoint either. GoDaddy maintained their position in the top tier of managed WordPress providers. This article summarizes GoDaddy's performance under multiple testing scenarios.

The Plan

All testing was done on GoDaddy's shared WordPress Hosting service, which cost $6.99/month. The plan allows for 1 site, 100GB of space and 25,000 visitors per month. It also had the usual features of automated backups, 24/7 support via phone or ticket, and free migrations.

Performance

LoadStorm

The first performance test was done with LoadStorm. GoDaddy made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see GoDaddy's result in this graph (click on it to play with the interactive results):

Load-Storm-GoDaddy-2000

 

GoDaddy actually pushed through triple the amount of data compared to every other web hosting company. Amazingly, it didn't struggle with this at all. The error rate was infinitesimal: only 92 errors in over 285,000 requests. Other than a little spike, it handled the entire test without missing a beat.

Blitz

The second load test that was run on GoDaddy was Blitz.io. Blitz was used to test cached performance. It simply requested the home page of our test site, scaling from 1 to 2,000 concurrent users over 60 seconds.

Blitz-GoDaddy-2000

 

GoDaddy's Blitz results look exactly like what you hope for. A small spike at the very beginning and then perfectly stable performance afterwards. There are no complaints against its cached performance. Full Blitz Results (PDF)

Uptime

Two third-party uptime monitoring services (UptimeRobot and StatusCake) tracked the test site for a month. The results for GoDaddy were 99.9% and 100% uptime respectively. That is the uptime level you would expect of any good service.

WebPageTest

“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. GoDaddy was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Company Dulles, VA Miami, FL Denver, CO Los Angeles, CA Average (all times in seconds)
GoDaddy 1.607 1.355 0.934 0.855 1.18775

There were no issues with their WebPageTest results; it loaded quickly, taking under 1.2 seconds on average to completely load.

Conclusion

GoDaddy [Reviews] is one of the top tier WordPress hosting providers when looking at performance. GoDaddy continues to surprise me. They flew through all the tests, including a weird issue where they transferred 3X the data during the LoadStorm test and didn't show any signs of stress. The only comparison I have to last time is the Blitz testing, where they eked out another 3000+ hits and raised their hits/second from 829 to 888. GoDaddy also raised their max hit rate marginally from 1750 to 1763. What's more impressive is they reduced their errors+timeouts from 686 to 93. More hits with fewer errors. From a performance perspective, they did excellently in absolute terms and relative to their last benchmarks.

Get 25% off with Coupon Code: cjcwp1


 

WordPress Hosting Performance Benchmarks (November 2014)

LoadStormLogo

Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

This is the second round of managed WordPress web hosting performance testing. You can see the original here. The latest (2015 Edition) can be found here.

Companies Tested

A Small Orange* [Reviews]
BlueHost [Reviews]
CloudWays* [Reviews]
DreamHost [Reviews]
FlyWheel* [Reviews]
GoDaddy* [Reviews]
Kinsta*
LightningBase*
MediaTemple* [Reviews]
Nexcess*
Pagely* [Reviews]
Pantheon* [Reviews]
PressLabs*
SiteGround*† [Reviews]
WebSynthesis* [Reviews]
WPEngine* [Reviews]

Note: Digital Ocean and Pressable were removed from testing.

*Company donated an account to test on. I checked to make sure I was on what appeared to be a normal server.

†Tests were performed with SiteGround's proprietary SuperCacher module turned on fully.

The Products (Click for Interactive Table)

 

wordpress hosting product chart screenshot

Methodology

The question I tried to answer is how well do these WordPress hosting services perform? I tested each company on two distinct measures of performance: peak performance and consistency.

All tests were performed on an identical WordPress dummy website with the same plugins except in cases where hosts added extra plugins. Each site was monitored for one month (July 2014) for consistency.

1. LoadStorm

LoadStorm was kind enough to give me unlimited resources to perform load testing on their platform and multiple staff members were involved in designing and testing these WordPress hosts. I created identical scripts for each host to load a site, login to the site and browse the site. Then I increased the user load until a web host started to fail. I stopped at 2000 concurrent users for the web hosts that were left unscathed by load testing. Logged in users were designed to break some of the caching and better simulate real user load which a lot of people (both readers and hosting companies) requested after the first round of testing.
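
The actual scripts were built inside LoadStorm, but conceptually each virtual user did something like the sketch below: log in through wp-login.php, then browse a few pages while carrying the login cookies, which forces many requests past the page cache. This is a rough, hypothetical illustration (the URL, credentials and paths are placeholders, not the real test script):

    # Illustrative sketch of one "logged-in user" iteration, not the actual
    # LoadStorm script: log into WordPress, then browse while authenticated.
    import requests

    SITE = "https://example.com"            # placeholder test-site URL
    USER, PASSWORD = "testuser", "secret"   # placeholder credentials

    session = requests.Session()
    # The standard WordPress login form posts 'log' and 'pwd' to wp-login.php.
    session.post(SITE + "/wp-login.php", data={
        "log": USER,
        "pwd": PASSWORD,
        "wp-submit": "Log In",
        "redirect_to": SITE + "/wp-admin/",
    })
    # Browse as a logged-in user; the auth cookies make full-page caches miss,
    # so these requests hit PHP and MySQL instead of a static cache.
    for path in ["/", "/wp-admin/", "/?p=1", "/sample-page/"]:
        response = session.get(SITE + path)
        print(path, response.status_code, round(response.elapsed.total_seconds(), 3))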

2. Blitz.io

I used Blitz again to compare against previous results. The first test was 60 seconds, scaling from 1-1000 concurrent users. The second test was 60 seconds, scaling from 1-2000 concurrent users.
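
Blitz handles the ramp itself, but as a mental model the test is just the home page being hit at a rate that climbs linearly over 60 seconds while successes, errors and timeouts are tallied. A toy, single-machine sketch of that idea (the URL is a placeholder; Blitz generated this load from distributed infrastructure, which a single box can't realistically reproduce at 2,000 requests per second):

    # Toy sketch of a Blitz-style "rush": ramp the request rate against the
    # home page over 60 seconds and tally successes vs errors/timeouts.
    # Purely illustrative; not the Blitz.io service.
    import threading
    import time
    import urllib.request

    TARGET = "https://example.com/"   # placeholder test-site URL
    DURATION = 60                     # seconds
    PEAK_RATE = 2000                  # requests/second at the end of the ramp

    counts = {"success": 0, "error_or_timeout": 0}
    lock = threading.Lock()

    def hit():
        try:
            with urllib.request.urlopen(TARGET, timeout=1):
                ok = True
        except Exception:
            ok = False
        with lock:
            counts["success" if ok else "error_or_timeout"] += 1

    start = time.time()
    while (elapsed := time.time() - start) < DURATION:
        rate = max(1, int(PEAK_RATE * elapsed / DURATION))  # linear ramp 1 -> PEAK_RATE
        for _ in range(rate):
            threading.Thread(target=hit, daemon=True).start()
        time.sleep(1)

    print(counts)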

3. Uptime (UptimeRobot and StatusCake)

Consistency matters. I wanted to see how well these companies performed over a longer period of time. I used two separate uptime monitoring services over the course of a month to test consistency.

4. WebPageTest.org

"WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster." WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. I tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Background Information

Before I go over the results I wanted to explain and discuss a few things. Every provider I tested had the latest version of WordPress installed. I had to ask a lot of companies to disable some security features to perform accurate load tests. Those companies were: GoDaddy, LightningBase, MediaTemple, SiteGround and WebSynthesis. I also asked DreamHost and WPEngine, but they refused my request.

Some companies were more cooperative than others. SiteGround spent hours with me customizing their security features to let the load testing tools bypass their security measures. PressLabs ran into an issue we were never able to resolve that prevented Load Storm from working properly on their servers. We spent hours trying to fix it, but couldn't find a solution. That's why they are missing some test data.

CloudWays is an interesting platform that lets you deploy your WordPress stack to either Digital Ocean or Amazon's EC2 servers. I was given a server on each platform of nearly comparable specs (EC2 Small 1.7GB vs Digital Ocean 2GB). So CloudWays is listed as CloudWays AWS and CloudWays DO to indicate which provider the stack was running on.

Pantheon was tested on their free development environment which I was told is identical to their production environment.

Results

Load Storm

I ran multiple Load Storm tests to get a sense of where to start testing. The first was 1-100 users, which not a single company struggled with. The second was 50-500 users, which again nobody struggled with. So the first meaningful test was 100-1000 users. For the companies that didn't struggle there, I did a 500-2000 user test. I ran these tests with an immense amount of help from Scott Price at LoadStorm. He spent hours with me, teaching me how to use LoadStorm, build tests and offering guidance/feedback on the tests themselves.

 Test 1. 100-1000 Concurrent Users over 30 minutes

 

Company Request Count Average RPS Peak Response Time (ms) Average Response Time (ms) Average Throughput (kB/s) Errors
A Small Orange 116127 64.52 2752 356 1318.55 41
BlueHost 107427 59.68 16727 1306 1159.55 13351
Cloudways DO 103359 55.57 16983 1807 1169.28 2255
Cloudways AWS 87447 47.01 16286 5436 821.75 18530
DreamHost 115634 62.17 15514 441 1244.31 4327
FlyWheel 116027 62.38 775 368 1287.86 0
GoDaddy 133133 71.58 1905 434 3883.42 0
Kinsta 116661 62.72 552 309 1294.77 0
LightningBase 117062 62.94 1319 256 1324.89 12
MediaTemple 116120 62.43 793 403 1304.27 0
Nexcess 116634 62.71 15085 294 1299.85 8
Pagely 119768 64.39 1548 461 1227.06 0
Pantheon 117333 63.08 528 264 1316.41 0
SiteGround 117961 63.42 939 165 180.09 0
WebSynthesis 116327 62.54 1101 332 1285.83 0
WPEngine 123901 68.83 10111 416 1302.44 2956

Discussion of Load Storm Test 1 Results

There was a pretty clear division of good and bad performance in this test. Most companies didn't struggle at all. A few collapsed: BlueHost, CloudWays AWS, CloudWays DO, and DreamHost. BlueHost started spewing 500 errors almost as soon as we started. CloudWays AWS started timing out immediately. CloudWays DO started having issues around 800 users and then started timing out. DreamHost started giving 503 Service Unavailable almost right away. It looks like our script triggered a security mechanism but they refused to work with me to test any further.

SiteGround ran into a security measure we weren't able to get around in time for publishing this article. The server seemed to just throttle the connection again.

PressLabs isn't listed because we couldn't get LoadStorm to work on their system. I am not sure what was different about their backend, but I tried to work with PressLabs and LoadStorm to get it working to no avail.

 

[Graphs: 2000-user LoadStorm results for A Small Orange, FlyWheel, GoDaddy, Kinsta, LightningBase, Nexcess, Pagely, Pantheon, SiteGround and WebSynthesis]

Test 2. 500 - 2000 Concurrent Users over 30 Minutes

I removed the hosts that failed and doubled the concurrent users for the second test.

Company Request Count Average RPS Peak Response Time (ms) Average Response Time (ms) Average Throughput (kB/s) Errors
A Small Orange 248249 133.47 5905 436 2639.68 0
FlyWheel 236474 127.14 3811 983 2499.11 16841
GoDaddy 285071 153.26 8896 371 8255.24 92
Kinsta 248765 133.74 942 316 2714.82 0
LightningBase 248679 133.7 3887 343 2763.92 23
MediaTemple 249125 133.94 1499 313 2748.32 9
Nexcess 243115 130.71 15097 388 2644.72 80
Pagely 256163 137.72 15078 446 2621.04 1
Pantheon 250063 134.44 1111 297 2754.67 0
WebSynthesis 240305 129.2 4389 743 2598.83 1173

Discussion of Load Storm Test 2 Results 

FlyWheel started to fail around 1500 users, causing 502 errors, and remained constant at that level of failure. I'm not sure what the bottleneck was; it didn't overload the server, but I suspect I/O somewhere bottlenecked, causing a certain number of requests to fail. WebSynthesis had a few errors as well: 5 separate spikes somewhat evenly spaced out. The server didn't show signs of failure; it looks like it might have been an issue with caches being refreshed and some requests failing in the meantime. WebSynthesis' error rate was still under 0.5%, so I don't have any real issue with those errors. The slower average response time can also be attributed to those spikes.

Remarkably, some companies didn't even struggle. Kinsta kept sub one second response times for 30 minutes and nearly a quarter million requests. Most companies had a spike or two causing a higher peak response time, but Kinsta and Pantheon didn't (and Media Temple had a tiny one at 1.5 seconds). Simply amazing performance.

Another interesting note, GoDaddy pushed triple the amount of data through because their admin screen had a lot more resources being loaded. That's why the average throughput is so high. Despite that fact, it didn't seem to impact their performance at all, which is astounding.
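
The "triple the data" observation falls straight out of the throughput column: multiplying average throughput by the 30-minute test duration gives a rough total transfer. A quick check, assuming the kB/s figures are decimal kilobytes:

    # Rough total-transfer estimate: average throughput (kB/s) x 30-minute duration.
    # Figures taken from the 2000-user LoadStorm table; assumes decimal kilobytes.
    duration_s = 30 * 60
    godaddy_kbps = 8255.24
    pantheon_kbps = 2754.67   # a typical value for the other hosts
    godaddy_gb = godaddy_kbps * duration_s / 1e6
    pantheon_gb = pantheon_kbps * duration_s / 1e6
    print(round(godaddy_gb, 1), round(pantheon_gb, 1))   # ~14.9 GB vs ~5.0 GB
    print(round(godaddy_gb / pantheon_gb, 1))            # ~3.0x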

Full Interactive Test Results

A Small Orange
FlyWheel
GoDaddy
Kinsta
LightningBase
MediaTemple
Nexcess
Pagely
Pantheon
SiteGround
WebSynthesis

Blitz.io

 Test 1. 1-1000 Concurrent Users over 60 seconds

Blitz Test 1. Quick Results Table

Company Success Errors Timeouts Avg Hits/Second Avg Response (ms)
A Small Orange 27595 14 0 460 67 ms
BlueHost 23794 1134 189 397 160 ms
CloudWays AWS 24070 162 148 401 138 ms
CloudWays DO 27132 118 127 452 49 ms
DreamHost 13073 45 7885 218 21 ms
FlyWheel 28669 20 10 478 27 ms
GoDaddy 26623 8 5 444 104 ms
Kinsta 27544 0 0 459 69 ms
LightningBase 27893 0 1 465 56 ms
MediaTemple 26691 8 9 445 102 ms
Nexcess 18890 2288 641 337 517 ms
Pagely 25358 9 0 423 156 ms
Pantheon 27676 21 0 461 64 ms
PressLabs 25903 143 0 432 89 ms
SiteGround 24939 0 0 416 152 ms
WebSynthesis 28913 0 0 482 19 ms
WPEngine 23074 121 4 385 247 ms

Discussion of Blitz Test 1 Results

I learned from the last round of testing that any host that isn't optimized at all for WordPress (default install) will get destroyed by these tests, so I didn't include any of them this time. There weren't any catastrophic failures this time around.

Who performed without any major issues?

A Small Orange, FlyWheel, GoDaddy, Kinsta, LightningBase, MediaTemple, Pagely, Pantheon, SiteGround and WebSynthesis all performed near perfectly. There's nothing more to say for these companies other than that they did excellently. All of their error/timeout rates were below 0.5%.

Who had some minor issues?

CloudWays AWS, CloudWays DO, PressLabs and WPEngine. All four of these providers had over 100 errors/timeouts, with error/timeout rates between 0.5% and 2%. Not a huge deal, but definitely not perfect.

Who had some major issues?

BlueHost, DreamHost, and Nexcess. BlueHost started showing stress around 40 seconds in and started to buckle around 47 seconds. DreamHost had a couple of spikes in response time and errors. However, it looks like the load testing tool may have hit some type of security limit, because requests started timing out while the responses that did come back were very fast, and it maintained roughly 250 hits/second constantly. It doesn't look like the server was failing. I couldn't get them to disable the security to really test it, so it's hard to say much more. Nexcess started to show stress around 20 seconds and buckle around 30 seconds.

 Test 2. 1-2000 Concurrent Users over 60 seconds

[Graphs: 2000-user Blitz results for each host tested]

Blitz Test 2. Quick Results Table

Company Success Errors Timeouts Avg Hits/Second Avg Response (ms)
A Small Orange 54152 26 1 903 77 ms
BlueHost 29394 14368 3408 490 234 ms
CloudWays AWS 25498 4780 8865 425 338 ms
CloudWays DO 53034 1477 49 884 58 ms
DreamHost 10237 5201 20396 171 201 ms
FlyWheel 56940 121 68 949 29 ms
GoDaddy 53262 29 64 888 105 ms
Kinsta 55011 32 0 917 69 ms
LightningBase 55648 0 0 927 58 ms
MediaTemple 53363 16 28 889 100 ms
Nexcess 25556 15509 4666 426 279 ms
Pagely 51235 41 2 854 147 ms
Pantheon 55187 91 0 920 65 ms
PressLabs 35547 4105 1569 592 326 ms
SiteGround 42645 490 220 711 276 ms
WebSynthesis 57776 1 0 963 20 ms
WPEngine 39890 304 333 665 364 ms

Discussion of Blitz Test 2 Results

Who performed without any major issues?

A Small Orange, FlyWheel, GoDaddy, Kinsta, LightningBase, MediaTemple, Pagely, Pantheon and WebSynthesis all performed near perfectly. All of their error/timeout rates were around 0.5% or lower.

Who had some minor issues?

SiteGround and WPEngine. Both of these providers had over 100 errors/timeouts, with error/timeout rates between 0.5% and 2%. SiteGround started to show some stress around 30 seconds but didn't start to have real issues (errors) until after 50 seconds. WPEngine started to show stress around 20 seconds and performed slightly erratically until the end of the test.

Who had some major issues?

BlueHost, CloudWays AWS, CloudWays DO, DreamHost, Nexcess, and PressLabs. The hosts that had major issues in the last round completely failed here, with error/timeout rates exceeding 50%. DreamHost, which looked like it was fine behind its security measures, buckled around 35 seconds into this test: it started returning errors, response times increased and hits/second dropped. CloudWays DO definitely started to stress and show signs of buckling around 50 seconds, but its error rate was still under 3%. I don't think it would have lasted much longer had the test gone further, but it was the least severe failure. PressLabs was a surprise; it started to show stress around 25 seconds and began to buckle around 35 seconds into the test.

 Full Blitz Results (PDFs)

A Small Orange, BlueHost, CloudWays AWS, CloudWays DO, DreamHost, FlyWheel, GoDaddy, Kinsta, LightningBase, MediaTemple, Nexcess, Pagely, Pantheon, PressLabs, SiteGround, WebSynthesis, WPEngine.

Uptime Monitoring

Both uptime monitoring solutions were third party providers that offer free services. All the companies were monitored over an entire month (July 2014).

Uptime Robot

Company Uptime (%)
A Small Orange 100
BlueHost 99.71
CloudWays AWS 100
CloudWays DO 99.93
DreamHost 99.92
FlyWheel 99.97
GoDaddy 99.9
Kinsta 100
LightningBase 100
MediaTemple 99.81
Nexcess 100
Pagely 99.95
Pantheon 100
PressLabs 100
SiteGround 100
WebSynthesis 100
WPEngine 100

According to UptimeRobot, not a single company was below 99.5% uptime. In fact, with the exception of Media Temple and BlueHost, they were all at or above 99.9% uptime. For reference, 99.5% uptime is roughly 3.5 hours of downtime per month; 99.9% is under 45 minutes of downtime per month. Overall, nothing to really complain about according to Uptime Robot.
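
Those downtime figures are easy to reproduce; a back-of-envelope conversion, assuming a 30-day month, gives roughly the same numbers quoted above:

    # Downtime implied by an uptime percentage, assuming a 30-day month.
    minutes_per_month = 30 * 24 * 60
    for uptime in (0.995, 0.999, 0.9971):   # 99.5%, 99.9%, and BlueHost's 99.71%
        downtime_min = (1 - uptime) * minutes_per_month
        print(f"{uptime:.2%} uptime -> about {downtime_min:.0f} minutes of downtime")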

StatusCake

Company Availability (1.0 = 100%) Response Time
A Small Orange 1 0.23s
BlueHost 0.9969 2.45s
CloudWays AWS 0.998 0.75s
CloudWays DO 1 2.41s
DreamHost 1 2.22s
FlyWheel 0.999 1.99s
GoDaddy 1 2.41s
Kinsta 1 2.13s
LightningBase 1 1.6s
MediaTemple 1 1.18s
Nexcess 1 2.33s
Pagely 1 2.49s
Pantheon 1 2.04s
PressLabs 1 1.49s
SiteGround 0.9993 1.64s
WebSynthesis 1 1.77s
WPEngine 1 2.76s

According to StatusCake, the results look even better. I used multiple services to monitor because there can be networking issues unrelated to a web host's performance. StatusCake only detected issues with four companies, which is fewer than UptimeRobot detected. It's hard to say which is better or right. But they both say that uptime didn't really seem to be an issue for any company.

StatusCake also provides an average response time metric. According to them, it's measured using a browser instance fully rendering the page, and they test from many different geographical locations. I don't have any further insight into these tools beyond what I can read in their documentation. If they are to be believed, A Small Orange has astonishingly fast performance. WPEngine had the slowest average load time at 2.76 seconds, which isn't that bad.

 

WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only. This was tested against the default install from every company.

Company Dulles, VA Miami, FL Denver, CO Los Angeles, CA Average (all times in seconds)
A Small Orange 1.443 0.801 0.836 0.64 0.93
BlueHost 1.925 1.321 1.012 0.785 1.26075
CloudWays AWS 0.655 0.867 0.967 0.746 0.80875
CloudWays DO 0.493 0.851 1.036 0.811 0.79775
DreamHost 1.177 0.863 1.067 1.147 1.0635
FlyWheel 0.497 0.864 1.066 1.109 0.884
GoDaddy 1.607 1.355 0.934 0.855 1.18775
Kinsta 0.759 0.752 0.947 0.592 0.7625
LightningBase 0.584 0.787 0.936 0.675 0.7455
MediaTemple 1.516 0.983 0.955 0.555 1.00225
Nexcess 1.433 1.139 1.196 0.859 1.15675
Pagely 6.831 0.86 0.913 0.709 2.32825
Pantheon 0.654 0.828 0.923 0.954 0.83975
PressLabs 0.715 1.018 1.213 0.723 0.91725
SiteGround 1.392 1.239 1.01 1.212 1.21325
WebSynthesis 0.407 0.835 0.982 1.024 0.812
WPEngine 0.821 1.086 0.839 0.685 0.85775

There isn't much surprising here. The pack is really tight, with only about half a second separating the average times of the top and bottom hosts if we exclude Pagely. I'm not sure what happened with their Dulles, VA test, but it seems like there was something terribly wrong with the network when I tested it. The average response times from every other location were incredibly fast (<1 second). I'm going to chalk it up to a bad node somewhere causing that particular test to perform so poorly; it's almost certainly not a reflection of their hosting.

What is interesting compared to last time is that these companies are getting faster. There was only one company with a sub one second average last time; now there are 10 (11 if you count Pagely). Three of them were above one second last time, so they are showing signs of improvement (Pagely, WebSynthesis, WPEngine). It also means there is a lot of new competition that isn't behind the entrenched players in terms of performance.

Conclusion

Every service seems to have its issues somewhere if you look hard enough. I try to avoid injecting my personal opinion and bias as much as possible, so I won't be ranking or outright saying any single company is the best. Some providers did exceptionally well and tended to clump together performance-wise; I will call those the top tier providers. This top tier designation relates to performance only and is claimed only from the results of these tests. What each of these companies is offering is different and may best suit different audiences depending on a variety of factors beyond performance, such as features, price, support, and scale (I tested mostly entry level plans). But I will provide a short summary and discussion of the results for each provider.

Top Tier WordPress Hosting Performance

A Small Orange, GoDaddy, Kinsta, LightningBase, MediaTemple, Pagely, Pantheon, WebSynthesis

Each of these companies was below the 0.5% error rate on all load testing, all the way up to 2000 concurrent users on both LoadStorm and Blitz.

Honorable Mention

FlyWheel gets an honorable mention. They performed really well on many of the tests, but fell apart on the final LoadStorm test with 2000 logged-in users. I'll explain in their individual section why this is still deserving of an honorable mention.

Amazon Web Services (AWS) vs Digital Ocean

One of the most interesting comparisons to me was CloudWays. They provide you with the ability to choose which VPS provider and type you want, and then set up their WordPress configuration (in an identical manner, from my understanding) on the VPS. I was granted access to one Amazon and one Digital Ocean VPS from them. The Amazon instance was a Small (1.7GB RAM) and the Digital Ocean instance had 2GB of RAM.

aws_vs_digital_ocean_loadstorm

The head-to-head LoadStorm results (1000 user test) above pretty clearly show Digital Ocean performing better in every category (with the exception of Peak Response Time, which was a timeout). Digital Ocean sent more data, had fewer errors, and did it all faster.

aws_vs_digital_ocean_blitz

The Blitz.io results show pretty clearly that Digital Ocean outperformed AWS by a wide margin as well. It delivered twice as many hits with fewer errors and timeouts.
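
Using the Blitz 2000-user numbers from the results table above, the gap is easy to quantify:

    # CloudWays Digital Ocean vs AWS on the 2000-user Blitz test,
    # using the success/error/timeout counts from the results table.
    do_box  = {"success": 53034, "errors": 1477, "timeouts": 49}
    aws_box = {"success": 25498, "errors": 4780, "timeouts": 8865}
    print(round(do_box["success"] / aws_box["success"], 2))    # ~2.08x the hits
    print(do_box["errors"] + do_box["timeouts"],               # 1,526 vs
          aws_box["errors"] + aws_box["timeouts"])             # 13,645 errors + timeouts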

It's pretty easy to conclude from these tests that, on low-end VPSs, Digital Ocean's hardware outperforms Amazon's.

Individual Host Analysis

A Small Orange

They've improved their LEMP stack since the last time I tested. They never buckled in any test and were definitely one of the best. Their staff was incredibly friendly (special thank you to Ryan MacDonald) and they've stepped up their performance game. The one thing that isn't quite there yet is the documentation/user experience; there are a lot of improvements they could make to make their LEMP stack more accessible to the less tech savvy. All in all, the experience was in line with what I would expect from a company that has one of the highest support ratings on our site.

BlueHost

Their WordPress offering is brand new. It struggled in every load test. Their price is on the middle-high end but the performance was not. Ultimately, they fell short of where I would expect based on pricing and the competition.

CloudWays

CloudWays was certainly an interesting company to test given that they had two entries, one running on Amazon Web Services (EC2) and another on Digital Ocean. The Digital Ocean VPS outperformed AWS in every category, which was interesting. The AWS instance was near the bottom of the pack performance-wise, but the Digital Ocean one was in the middle. It is a very interesting platform they have built, which allows deployment and management across providers. However, their performance isn't quite there yet. Other companies are running on the same hardware and getting better results. CloudWays doesn't do just WordPress, so it's easy to understand why their performance might not quite match some of their competitors who solely focus on WordPress.

DreamHost

DreamPress was another disappointment. The security features hid some of the performance weakness on the first Blitz test, but it completely failed on the second. DreamPress is described as having automatic RAM scaling, with each site run by two VPS instances. It's very unclear what resources you are really getting for your money. They are charging $50/month for a 1GB RAM VPS, so I get the feeling a lot of resources are shared and it may not be a true VPS.

FlyWheel

FlyWheel was excellent on every test except the final 2000 logged-in user test from LoadStorm. They are built on top of Digital Ocean and I was using the smallest VPS, yet their performance beat VPSs on Digital Ocean that had four times the resources (CloudWays DO). For cached content on the Blitz test, they had the second-highest hits/second and the second-fastest response time. I suspect the testing hit a hardware maximum. FlyWheel had the best performance with the lowest dedicated resources (512MB RAM). The companies that outperformed it had more resources dedicated to them, or shared resources which presumably allow access to far more than 512MB of RAM. It was an impressive performance given what they are selling, and combined with having the best reviews of any company Review Signal has ever tracked, FlyWheel certainly merits serious consideration.

GoDaddy

GoDaddy continues to surprise me. They flew through all the tests, including a weird issue where they transferred 3X the data during the LoadStorm test and didn't show any signs of stress. The only comparison I have to last time is the Blitz testing, where they eked out another 3000+ hits and raised their hits/second from 829 to 888. GoDaddy also raised their max hit rate marginally from 1750 to 1763. What's more impressive is they reduced their errors+timeouts from 686 to 93. More hits with fewer errors. From a performance perspective, they did excellently in absolute terms and relative to their last benchmarks.

Kinsta

A new-comer that jumped straight to the top of the performance tiers. Kinsta's performance was amazing in the Load Storm 2000 logged in user test. They had the lowest peak response time and zero errors over a 30 minute test. They didn't struggle with any tests whatsoever and showed zero downtime. Kinsta's performance was top tier.

LightningBase

Another new-comer that jumped straight to the top. One of the cheapest too starting at under $10. LightningBase aced the Blitz testing and did excellent on Load Storm. There was no downtime monitored. LightningBase belongs in the top tier and is delivering amazing value.

Media Temple

Media Temple is interesting because I was told it was running the same technology as GoDaddy (GoDaddy bought Media Temple a year ago). They have a few more premium features like Git and a staging environment. Media Temple's performance was superb. It actually beat GoDaddy's performance in just about every measure by a marginal amount on both Load Storm and Blitz's load testing. If GoDaddy has top tier performance, Media Temple certainly does as well.

Nexcess

Nexcess's performance was excellent in the Load Storm testing. However, it completely collapsed during the Blitz load testing. I'm really not sure what to make of those results. Perhaps the underlying shared hardware is very good but the static caching setup isn't quite up to snuff? It's probably not worth speculating, suffice to say, Nexcess ended up looking like a middle of the pack web host instead of a top tier one because of the Blitz test.

Pagely

Pagely put on another spectacular performance. They handled the Load Storm test with 1 error. Blitz results stayed similar to the last run. They handled more hits, but had a few more errors+timeouts (1 last time, 43 this time). Really not much to add here other than they continue to be in the top tier.

Pantheon

Pantheon specialized in Drupal hosting, so I was wondering how well it would translate to WordPress. The short answer is, it converted over really well. They had a flawless run on the LoadStorm test - 0 errors and not even any spikes in response time over 30 minutes. They are one of the most expensive (only second to PressLabs) options on this list, but definitely make a case for it. Perfect uptime and near flawless load testing sent them easily into the top tier.

PressLabs

It's hard to write much about PressLabs because we couldn't get LoadStorm to work properly to test out their hosting. However, their Blitz results were lackluster. For the most expensive plan we tested, it was a bit of a disappointment to see it not do stunningly well.

SiteGround

SiteGround sadly didn't do as well as they did last time. Their Blitz load testing score went down slightly. We couldn't bypass their security measures to properly test Load Storm. They obviously have some good protection measures to prevent malicious users from trying to access too many things, but it also meant I couldn't get a deeper look this time around. That was a change from the last round of testing. Slightly disappointing to see the performance dip, but I hope it was due to the extra security measures they put in place that made testing them difficult.

WebSynthesis

WebSynthesis was teetering on the edge of the Load Storm error threshold (0.5%), but they stayed under it and handled the test quite well. They also had no weird security issues this time around, and WebSynthesis led the pack on the Blitz testing. They went from 871 hits/second to 963 hits/second, leading every provider on the Blitz tests with a whopping 1 error to boot. Sprinkle in some perfect uptime numbers and it's clear WebSynthesis is still a top tier provider and is continuing to get better.

WPEngine

I feel like I could copy+paste my last conclusion about WPEngine. "WPEngine had some issues. Uptime was not one of them, they were perfect or upwards of 99.9% in that department. However, their performance shortcomings became apparent during the load tests." They didn't even make it to the final round of Load Storm testing. They were also middle of the pack on the Blitz testing. Compared to the last round of Blitz testing, the results were nearly identical, with slightly fewer errors+timeouts. I'm not sure if I should be disappointed to not see improvement or relieved to see them maintain the exact same performance and consistency. Their vaunted rankings on Review Signal's reviews have slipped relative to a few of the other providers on here (FlyWheel and WebSynthesis). While they were once leading the pack in technology, the rest of the pack is starting to catch up.

 

Thank Yous

A special thanks goes out to the sponsor of this post and an individual employee, Scott Price of Load Storm, who worked countless hours with me in order to perform these tests.

I want to thank all the companies that participated in these tests. I tested the support staff a fair bit at some of them and I thank them for their time and patience.

A special thanks goes to Chris Piepho from LightningBase, who also provided a lot of feedback based on the original article and helped improve the methodology for this round.

A huge thanks goes out to Mark Gavalda at Kinsta for his feedback and performance testing discussions. He's tested some more cutting-edge stuff than I have, like HHVM and php-ng performance. Also to their designer, Peter Sziraki, who designed the header image for this article.

 

Introducing Pagely and FlyWheel

I am happy to announce two new hosts on Review Signal today.

One of them was a long time in coming, Pagely. The original managed WordPress hosting company. In my original managed WordPress hosting performance benchmarks, Pagely came out at the top - having no trouble with any of the tests I threw at their services.

Pagely_october_2014

 

It's a bit disappointing to see that their reviews don't quite match their performance. From what I can tell, it looks like they've struggled with some major outages in the past. However, there is an upward trend in opinions about them. Their performance is top notch, it would be great to see the rest of the service catch up.

Our second addition to Review Signal is FlyWheel, which has an astounding 95% Overall Rating. FlyWheel is another managed WordPress hosting service, built on top of Digital Ocean. I don't think I've ever seen a company have such positive reviews. It's a struggle to find anyone saying something negative about them. It's wonderful to see such positive reviews for a new company.

Flywheel_october_2014

I hope they can keep it up, but my past experience says companies which start out so remarkably strong generally tend to come down to more 'normal' levels in the 70%-ish range. Their competitors WPEngine (82% -> 73%) and WebSynthesis (83% -> 76%) both did. So did the company they are built on top of, Digital Ocean, which went from 81% to 76%. Great service seems like the hardest problem to scale for a web hosting company. I hope FlyWheel can break the rules and continue its streak of excellence.