Tag Archives: cloudways

cloudways by digital ocean logo

Analyzing Digital Ocean’s First Major Move with Cloudways

Disclaimer: I own/have owned shares of $DOCN (Digital Ocean). None of the opinions or analysis here is financial advice. I attempt to be as impartial as possible, but disclosing the financial relationship is important for transparency.

Cloudways [Reviews] announced their first price increases since 2017 today on their blog, which will take effect April 1, 2023. This is interesting because it's the first major move since Cloudways was acquired by Digital Ocean [Reviews] in August 2022 for a whopping $350 million.

There is a very nice-looking chart that shows all the price changes by cloud provider.

cloudways pricing chart

Cloudways pricing change chart

The chart/tool is very easy to understand and clearly shows users the price increases. I appreciate the clarity, transparency and advance notice about the price increase.

The increase did pique my curiosity though. How were the price increases distributed by cloud provider, since Cloudways suddenly has a vested interest in Digital Ocean's success? There are so few publicly traded web hosting companies that I enjoy analyzing the few there are. Digital Ocean may be the only meaningful publicly traded company in the US that focuses solely on web hosting (sorry Rackspace, things don't look good on the financials, and the recent hack doesn't help); Amazon, Google, Microsoft and GoDaddy all compete in multiple spaces beyond hosting. I used to enjoy analyzing Endurance International Group (EIG, now Newfold Digital) when their financials were released, as they were a pure web hosting play, but they were taken private a few years ago. Today, I get to analyze Digital Ocean's first move with Cloudways, and later I can check how it actually played out when financials are released. Without further ado...

I spent longer than I care to admit creating the following spreadsheet. I wanted to know how much the providers charge directly, how much Cloudways charges now for the same instance, and how much that is going up in absolute and percentage terms.

The results were in line with what I expected: Digital Ocean, especially at the lower tiers, is getting an incredibly favorable deal versus its competitors.

If we look at 2GB RAM instances across providers, we see the following:

cloudways pricing 2gb

2GB Pricing Comparison

What's interesting is that Digital Ocean seems to have had the best deal before the price increases, by a significant margin. The 2GB instance is more expensive at the provider ($12, versus $10 at Linode/Vultr), yet it sold for $1-2 less on Cloudways before the price increases. The markup is 83% for Digital Ocean versus roughly 130% and 140% for Vultr and Linode respectively.

After the price increases the difference is even more pronounced, with Linode and Vultr both ending at a 180% markup while Digital Ocean sits at only 100%.
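The markup math behind these percentages is simple arithmetic. A quick sketch; the dollar figures here are approximations implied by the percentages above, not exact Cloudways pricing:

```python
def markup_pct(cloudways_price: float, provider_price: float) -> float:
    """Markup Cloudways charges over the provider's direct price, in percent."""
    return (cloudways_price - provider_price) / provider_price * 100

# Approximate 2GB prices implied by the chart (illustrative figures only)
print(markup_pct(22, 12))  # Digital Ocean before the increase: ~83%
print(markup_pct(24, 12))  # Digital Ocean after the increase: 100%
print(markup_pct(28, 10))  # Linode/Vultr after the increase: 180%
```

The takeaway is that markup is relative to the provider's direct price, so Digital Ocean's higher $12 base price and lower markup can still land at a cheaper Cloudways sticker price than competitors with a $10 base.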

I became even more curious. Amazon and Google don't seem like Digital Ocean's biggest direct competitors; their pricing is already much higher. I suspect if you're picking Amazon/Google, you're picking them for the brand, not the price-to-performance ratio. Linode and Vultr, however, have been comparable services to Digital Ocean for a long time. How did markups vary between them?

Digital Ocean vs Linode vs Cloudways - New Markup %

At the 2GB level, Linode and Vultr were standardized in the price increase, but the differences afterwards are interesting. Linode actually becomes more marked up at the next tier up. After that, all three decrease in markup percentage. It's interesting to note that Vultr's markup goes down more than Linode's despite the two starting equally at the 2GB tier.

Digital Ocean's markup goes negative at the 32GB tier (and the plan itself offers vastly more storage and data transfer than the publicly listed plan). So it's actually cheaper to get a 32GB plan through Cloudways than to buy direct from Digital Ocean. That is certainly one way to attract new higher-value customers to the platform.

Analysis

After spending time looking at the data, the price increases send a clear message: Digital Ocean owns Cloudways. It's making itself the most attractive option on the platform, in some cases even offering better deals than buying direct. That is exactly what I expected after the acquisition; Digital Ocean should be leveraging Cloudways to increase its value. It captures margin on both the underlying infrastructure provided by Digital Ocean and the management/services layer offered by Cloudways.

The concern as a consumer is being pushed towards certain infrastructure providers by pricing controlled by the middleman instead of by the infrastructure providers themselves. Does this open the door more widely for competitors to step in and offer multicloud management/service layers with less pricing inequality? Or are most consumers already price conscious and using Digital Ocean through Cloudways anyway?

As an investor, I would have similar concerns about what it would do for the brand and competition. I thought the acquisition of Cloudways was smart by Digital Ocean (perhaps not the purchase price, but the company itself makes sense to own). It meant for every customer on Cloudways, Digital Ocean would be getting a cut - even from customers using their competitors. That seemed like a great model. If Digital Ocean adds a slight incentive to use them as the infrastructure provider as well, it makes even more sense. I just hope the price point keeps them competitive as an option on the other clouds, so they can continue to take a cut of every hosting transaction regardless of the cloud provider, rather than becoming solely a funnel for Digital Ocean products.

The timing of the price increase also suggests Digital Ocean, the publicly traded company, which recently went through a round of layoffs, likely needs to become profitable; it recorded a loss last quarter and has only recorded a profit once, in the quarter before that. With rising interest rates and tech stocks collapsing, it probably isn't the best time to be growing by taking on more debt and running at a loss. We've seen a lot of providers raising rates lately, citing energy costs, inflation and other justifications. It's probably a reasonable time to pull the trigger on the increase given the macroeconomic circumstances.

Cloudways was expected to generate roughly $50 million in revenue in 2022, so a 5-10% price increase would add roughly $2.5-5 million on an annual basis, or $625K-1.25 million per quarter. Considering Digital Ocean had a loss of $10 million last quarter, that could make up over 10% of the shortfall from being profitable. Not an insignificant chunk.
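The napkin math is easy to reproduce; every input below is one of the rough estimates above, not a reported figure:

```python
# Rough, illustrative inputs (estimates, not reported financials)
annual_revenue = 50_000_000        # estimated Cloudways 2022 revenue
increase_low, increase_high = 0.05, 0.10
quarterly_loss = 10_000_000        # Digital Ocean's loss last quarter

quarterly_lift_low = annual_revenue * increase_low / 4    # $625K per quarter
quarterly_lift_high = annual_revenue * increase_high / 4  # $1.25M per quarter

# Best case, the increase covers this share of the quarterly loss
share_of_loss = quarterly_lift_high / quarterly_loss      # 12.5%
print(quarterly_lift_low, quarterly_lift_high, share_of_loss)
```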

I don't know what the math looks like behind the scenes; I am just doing napkin math and using a bit of intuition. I am sure someone calculated and estimated the impact of the price increases and how they were applied. I will be interested to watch the next few quarters of earnings to measure the impact, and I also wonder whether Cloudways revenue will be broken out separately. Hosting is often quite an inelastic product; it's a pain to move and change. If someone calculated everything correctly, this could go a long way towards helping Digital Ocean return to profitability in the short to medium term. I look forward to seeing how Digital Ocean performs.

CloudWays Review (2018)

CloudWays participated for the fourth time in our WordPress Hosting Performance Benchmarks (2018). This review is based on the results of that test. This year CloudWays participated in the $25-50, $51-100 and $201-500 price tiers.

In years past it's been interesting to see CloudWays compete with the same stack on different platforms; this year is the furthest departure from that we've seen so far. It's also the first time CloudWays has earned Top Tier status, doing so for two of the three plans that competed this year. It's important to note that the Digital Ocean plan was originally $70 when testing started, but Digital Ocean reduced their pricing, dropping the cost of the plan dramatically to $42; hence it competed a tier above its new price ($51-100).

The Products

Plan | Monthly Price | Visitors Allowed | Memory/RAM | Disk Space | Bandwidth | Sites Allowed
Vultr 4GB New York | $44 | Unlimited | 4GB | 60GB SSD | 3TB | Unlimited
DigitalOcean 4GB | $42 | Unlimited | 4GB | 80GB | 4TB | Unlimited
AWS EC2 2XL (USA, N. Virginia) | $495.50 | Unlimited | 32GB | Starts from 4GB (variable) | 2GB | Unlimited

Performance Review

Load Storm

Company | Total Requests | Total Errors | Peak RPS | Average RPS | Peak Response Time (ms) | Average Response Time (ms) | Total Data Transferred (GB) | Peak Throughput (MB/s) | Average Throughput (MB/s)
CloudWays AWS | 770,304 | 822 | 607.12 | 427.95 | 15,083 | 324 | 52.27 | 42.27 | 29.04
CloudWays Vultr | 328,015 | 0 | 249.5 | 182.23 | 7,372 | 360 | 22.27 | 18.55 | 12.37
CloudWays DO | 442,424 | 243 | 331.95 | 245.79 | 15,097 | 1,131 | 30.11 | 21.79 | 16.73

Sources: AWS, Vultr, DO

The Load Storm test is designed to simulate real users coming to the site, logging in and browsing the site bursting some of the caching mechanisms typically found on managed WordPress hosts.

The Vultr plan had no issues; AWS and DO had a few errors, but at around 0.1% they're negligible. The big issue was that Digital Ocean's response time started to climb as the load increased, which knocked it out of earning Top Tier status. AWS and Vultr did great overall. Given the price change, I wonder if Digital Ocean would have handled the smaller test in its new tier better.
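The ~0.1% figure comes straight from the table above: errors divided by total requests. A quick sketch with the Load Storm numbers:

```python
def error_rate(errors: int, total_requests: int) -> float:
    """Error rate as a percentage of total requests."""
    return errors / total_requests * 100

# Load Storm totals from the table above
print(round(error_rate(822, 770_304), 2))  # CloudWays AWS: ~0.11%
print(round(error_rate(243, 442_424), 2))  # CloudWays DO: ~0.05%
print(error_rate(0, 328_015))              # CloudWays Vultr: 0.0%
```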

Load Impact

Company | Requests | Errors | Data Transferred (GB) | Peak Average Load Time (s) | Peak Average Bandwidth (Mbps) | Peak Average Requests/Sec
CloudWays AWS | 1,049,669 | 0 | 57 | 0.36 | 908 | 1990
CloudWays Vultr | 335,275 | 0 | 18.21 | 0.544 | 278 | 624
CloudWays DO | 457,906 | 30 | 24.89 | 1.92 | 312 | 702

Sources: AWS, Vultr, DO.

The Load Impact test makes sure static caching is effective so that if a page gets a lot of traffic the site will keep responding without issue.

AWS and Vultr handled it perfectly with zero errors. Digital Ocean had a minuscule 30 errors but increasing response times as the test went on. A great showing overall on the Load Impact test for AWS and Vultr.

Uptime

Company | UptimeRobot | StatusCake
CloudWays AWS | 100 | 100
CloudWays Vultr | 100 | 100
CloudWays DO | 100 | 100

Perfect. Across the board, perfect uptime.

WebPageTest / WPPerformanceTester

Company | PHP Bench | WP Bench
CloudWays AWS | 8.831 | 266.5955745
CloudWays Vultr | 9.616 | 346.1405331
CloudWays DO | 13.421 | 135.7036233

The WPPerformanceTester results are within the normal ranges we typically see.

Location | CloudWays AWS | CloudWays Vultr | CloudWays DO
Dulles | 0.312 | 0.33 | 0.328
Denver | 1.137 | 1.305 | 1.101
LA | 0.924 | 1.014 | 1.069
London | 0.763 | 0.725 | 0.718
Frankfurt | 0.816 | 0.785 | 0.819
Rose Hill, Mauritius | 1.867 | 2.497 | 1.86
Singapore | 2.281 | 2.163 | 2.25
Mumbai | 1.65 | 2.287 | 1.646
Japan | 1.578 | 1.613 | 1.573
Sydney | 1.837 | 1.978 | 2.068
Brazil | 1.1 | 1.195 | 1.172

The WPT tests look good. The Dulles test scores were some of the fastest, especially the CloudWays AWS server which was located in the same testing data center.

Conclusion

Hard work does pay off. CloudWays has been participating for years and has continually improved, earning two Top Tier awards for the AWS and Vultr plans. The Digital Ocean plan unfortunately didn't share the same honor, but it was punching above its weight: the price drop would have put it one cost tier below where it competed.

A2, CloudWays, Heart Internet, HostPapa, OVH, Pantheon, ScaleWay and TsoHost Added to Review Signal

Happy to announce a lot of new additions to Review Signal, including our first UK companies (HeartInternet and TsoHost). UK companies are displayed with a UK flag in search results and on the company pages.

Overall score is in parentheses after the company.

A2 Hosting (49%)

CloudWays (65%)

HeartInternet (28%)

HostPapa (27%)

OVH (38%)

Pantheon (77%)

ScaleWay (62%)

Tsohost (70%)


$51-100/Month WordPress Hosting Performance Benchmarks (2016)

LoadStormLogo

Sponsored by LoadStorm, the easy and cost-effective load testing tool for web and mobile applications.

The full company list, product list, methodology, and notes can be found here

This post focuses only on the results of the testing in the $51-100/month price bracket for WordPress Hosting.

$51-100/Month WordPress Hosting Products

review_signal_table_100_updated


$51-100/Month WordPress Hosting Performance Benchmarks Results

1. Load Storm

Test 500-3000 Concurrent Users over 30 Minutes, 10 Minutes at Peak

Company | Total Requests | Total Errors | Peak RPS | Average RPS | Peak Response Time (ms) | Average Response Time (ms) | Total Data Transferred (GB) | Peak Throughput (MB/s) | Average Throughput (MB/s)
BlueHost | 322139 | 166336 | 267.9 | 178.97 | 20999 | 9268 | 9.425 | 7.086 | 5.236
CloudWays Amazon | 306701 | 73421 | 214.07 | 170.39 | 15256 | 4810 | 13.9 | 10.05 | 7.723
CloudWays Google | 267495 | 128912 | 199.23 | 148.61 | 15392 | 7341 | 8.35 | 6.595 | 4.639
Kinsta | 416335 | 544 | 324.57 | 231.3 | 15059 | 317 | 24.01 | 19.91 | 13.34
LightningBase | 456430 | 0 | 356.3 | 253.57 | 3909 | 261 | 23.65 | 19.41 | 13.14
LiquidWeb | 520072 | 2745 | 408.3 | 288.93 | 15322 | 525 | 24.04 | 19.69 | 13.35
Media Temple | 486702 | 8588 | 397.55 | 270.39 | 16001 | 582 | 25.43 | 23.08 | 14.13
Pagely | 392898 | 1952 | 298.8 | 218.28 | 15178 | 1593 | 21.38 | 16.85 | 11.88
Pantheon | 409962 | 57051 | 325.53 | 227.76 | 11682 | 762 | 20.74 | 17.97 | 11.52
Pressable | 569095 | 0 | 441.43 | 316.16 | 3152 | 239 | 24.35 | 20.19 | 13.53
Pressidium | 429538 | 0 | 335.78 | 238.63 | 3030 | 306 | 16.11 | 13.26 | 8.951
SiteGround | 449038 | 742 | 352.05 | 249.47 | 11247 | 383 | 22.93 | 19.26 | 12.74

Discussion of Load Storm Test Results

Kinsta, LightningBase, LiquidWeb [Reviews], Pressable, Pressidium and SiteGround [Reviews] all handled this test without any serious issues.

MediaTemple [Reviews] had some minor issues with spikes and increasing average response times.

Pagely [Reviews] had some spikes but more concerning was the increased response times which were averaging around 3000ms during the 10 minute peak of the test. It kept the website up and error rate low enough (0.5%), but it was definitely struggling to keep up.

BlueHost [Reviews], CloudWays [Reviews] (Amazon + Google) and Pantheon [Reviews] all struggled with this load test. BlueHost crashed (85% error rate). CloudWays Google had 48% errors. Amazon fared better with only 24%. Pantheon had the lowest error rate at 14%, but all of them were unacceptably high, along with increased response times.

2. Blitz.io

Test 1-2000 Concurrent Users over 60 seconds

Blitz Test Quick Results Table

Company | Hits | Errors | Timeouts | Average Hits/Second | Average Response Time (ms) | Fastest Response (ms) | Slowest Response (ms)
BlueHost | 28901 | 714 | 2710 | 482 | 654 | 185 | 1562
CloudWays Amazon | 55678 | 906 | 0 | 928 | 24 | 3 | 106
CloudWays Google | 38278 | 16248 | 158 | 638 | 102 | 83 | 226
Kinsta | 54273 | 7 | 0 | 905 | 84 | 83 | 86
LightningBase | 54946 | 0 | 0 | 916 | 71 | 71 | 73
LiquidWeb | 54574 | 0 | 4 | 910 | 78 | 77 | 82
Media Temple | 44598 | 442 | 85 | 743 | 261 | 195 | 614
Pagely | 57828 | 1 | 0 | 964 | 13 | 2 | 81
Pantheon | 55499 | 0 | 0 | 925 | 61 | 60 | 64
Pressable | 51781 | 0 | 0 | 863 | 135 | 134 | 136
Pressidium | 57348 | 1 | 0 | 956 | 27 | 25 | 30
SiteGround | 83437 | 0 | 0 | 1391 | 58 | 58 | 60

Discussion of Blitz Test 1 Results

This test simply checks whether the company is caching the front page and how well whatever caching system they have set up is performing (generally this hits something like Varnish or Nginx).

I also mistakenly ran an extra thousand users against SiteGround (1-3000), but since they performed perfectly, I figured why not just leave it. The chance for random network timeouts is always there, they got a perfect score, I let them keep it. That's why their numbers look higher than everyone else's.
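A quick sanity check on how much extra load that extra thousand users represents; this assumes Blitz's ramp from 1 to N users is linear, so average concurrency over the minute is roughly (1 + N) / 2:

```python
# Average concurrency under a linear 1 -> N ramp is about (1 + N) / 2
avg_users_others = (1 + 2000) / 2      # everyone else: ~1000 concurrent users on average
avg_users_siteground = (1 + 3000) / 2  # SiteGround's accidental 1-3000 run: ~1500
load_ratio = avg_users_siteground / avg_users_others
print(round(load_ratio, 2))  # ~1.5x the load

# The hit counts line up: SiteGround's 83437 hits vs ~55000 for the other
# top performers is a ratio of about 1.52.
print(round(83437 / 54946, 2))
```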

Who performed without any major issues?

Kinsta, LightningBase, LiquidWeb [Reviews], Pagely [Reviews], Pantheon, Pressable, Pressidium and SiteGround [Reviews] all handled this test without any serious issues.

Who had some minor issues?

MediaTemple [Reviews] had some minor issues with load starting to impact response times and some errors/timeouts at the end of the test.

CloudWays (Amazon) managed to keep the server up but started to lag around 35 seconds in with some errors at the very end.

Who had some major issues?

BlueHost [Reviews] and CloudWays (Google) both failed this test.

3. Uptime Monitoring

Both uptime monitoring solutions are third-party providers that offer free services; the UptimeRobot plan was a paid one, monitoring on a 1-minute interval. All the companies were monitored over approximately two months (May-June 2016).

Uptime Robot & StatusCake

Company | StatusCake | UptimeRobot
BlueHost | 99.98 | 99.98
CloudWays Amazon | 100 | 100
CloudWays Google | 99.99 | 99.99
Kinsta | 99.99 | 100
LightningBase | 100 | 100
LiquidWeb | 100 | 100
Media Temple | 99.94 | 99.97
Pagely | 100 | 100
Pantheon | 100 | 100
Pressable | 99.93 | 99.95
Pressidium | 100 | 99.99
SiteGround | 100 | 100

I can happily say every single company kept their servers up.

4. WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only.

Company | WPT Dulles | WPT Denver | WPT LA | WPT London | WPT Frankfurt | WPT South Africa
BlueHost | 0.94 | 0.813 | 0.995 | 1.525 | 1.861 | 5.923
CloudWays Amazon | 0.774 | 0.975 | 1.066 | 0.988 | 1.625 | 3.597
CloudWays Google | 0.706 | 0.644 | 0.929 | 1.107 | 1.706 | 3.37
Kinsta | 0.834 | 0.62 | 0.958 | 1.12 | 1.688 | 3.637
LightningBase | 0.542 | 0.465 | 0.955 | 1.013 | 1.569 | 4.541
LiquidWeb | 0.616 | 0.55 | 1.003 | 1.076 | 1.624 | 5.634
Media Temple | 0.904 | 0.537 | 0.855 | 1.318 | 1.932 | 2.809
Pagely | 0.808 | 0.542 | 1.04 | 1.137 | 1.675 | 5.583
Pantheon | 0.856 | 0.508 | 0.955 | 1.051 | 1.704 | 5.628
Pressable | 1.032 | 0.757 | 1.08 | 1.449 | 1.948 | 5.793
Pressidium | 0.738 | 0.727 | 1.171 | 1.292 | 1.67 | 5.747
SiteGround | 0.867 | 0.678 | 1.114 | 1.176 | 1.671 | 4.56

Company | WPT Singapore | WPT Shanghai | WPT Japan | WPT Sydney | WPT Brazil
BlueHost | 2.652 | 22.102 | 1.863 | 1.937 | 2.255
CloudWays Amazon | 2.236 | 23.404 | 1.781 | 1.75 | 1.752
CloudWays Google | 2.031 | 22.418 | 2.026 | 1.609 | 1.793
Kinsta | 2.235 | 24.017 | 2.109 | 1.602 | 1.851
LightningBase | 2.227 | 22.437 | 1.683 | 1.968 | 1.612
LiquidWeb | 2.335 | 23.238 | 1.885 | 1.96 | 1.635
Media Temple | 2.19 | 22.265 | 1.814 | 2.101 | 2.091
Pagely | 2.415 | 23.124 | 1.914 | 2.103 | 1.943
Pantheon | 2.093 | 25.209 | 1.781 | 1.975 | 1.804
Pressable | 2.382 | 23.897 | 2.234 | 2.821 | 2.132
Pressidium | 2.245 | 23.303 | 2.061 | 1.785 | 1.747
SiteGround | 2.309 | 22.746 | 2.017 | 2.935 | 1.907

LightningBase put up the fastest individual score of any bracket this year in this test with a blazingly fast 0.465-second average response in Denver. Other than that, nothing special here: all these companies seemed capable of delivering content quickly pretty much everywhere in the world except Shanghai.

5. WPPerformanceTester

Company | PHP Bench (seconds, lower = faster) | WP Bench (queries/sec, higher = faster)
BlueHost | 11.655 | 713.78
CloudWays Amazon | 10.993 | 324.99
CloudWays Google | 11.192 | 327.33
Kinsta | 11.333 | 318.47
LightningBase | 10.537 | 1067.24
LiquidWeb | 7.177 | 1084.6
Media Temple | 13.9 | 98.85
Pagely | 10.102 | 165.86
Pantheon | 11.687 | 202.92
Pressable | 10.952 | 492.61
Pressidium | 10.749 | 240.67
SiteGround | 11.522 | 1030.93

LiquidWeb put up one of the fastest scores on the PHP Bench at 7.177. Everyone else fell into the 10-14 range we generally see.

The WP Bench saw some slow scores from MediaTemple and Pagely, and a handful breaking the 1000 barrier: LightningBase, LiquidWeb, and SiteGround. Interestingly, scores seem to trend slower as you go up in price, as you get more non-local databases.

Conclusion

This is the last really crowded bracket as we go up in price; it sits right at the border between entry-level plans and the more serious stuff. This is also the first tier whose plans were tested more heavily than any plan last year. The results were very encouraging.

Top Tier WordPress Hosting Performance

review_signal_2016_trophy_100

Kinsta, LightningBase, LiquidWeb [Reviews], Pressable, Pressidium and SiteGround [Reviews] all earned top tier WordPress Hosting for the $51-100/month tier.

Honorable Mentions

MediaTemple [Reviews] and Pagely [Reviews] earn honorable mentions. They had some minor issues in the LoadStorm test and MediaTemple had some minor issues in the Blitz test.

Individual Host Analysis

BlueHost [Reviews]

BlueHost fell short again in the load tests.

CloudWays [Reviews] (Amazon + Google)

CloudWays is always interesting because you can compare head to head performance on different cloud platforms. I would pretty confidently say that Amazon outperformed Google in this instance with similar specs (although Amazon charges more).

Kinsta

Kinsta's entry level plan put on a fantastic performance. The higher end providers are starting to show up in this price tier and really showing why they charge their premium prices. Kinsta easily earned top tier status.

LightningBase

This is LightningBase's most expensive plan that we tested this year (although they offer higher ones), and for the third consecutive price tier (and year), they handled the tests flawlessly. A literally perfect score for LightningBase: 100% uptime on both monitors and 0 errors on all load tests. Simply perfection. Undoubtedly a top tier WordPress host.

LiquidWeb [Reviews]

LiquidWeb is a newcomer to this testing and this is their entry-level plan. Boy, did they make a positive splash: 100% uptime across the board and excellent load testing scores. They also had the fastest PHP Bench in this bracket (and the third fastest of any company this year). They have a fantastic reputation here at Review Signal in our reviews section, and I can confidently say they also have a top tier WordPress Hosting product to boot.

MediaTemple [Reviews]

Media Temple earned an honorable mention, which is a step in the right direction. They had some minor problems with the load tests. No major concerns; they just need to sort out the security issues and minor performance details to become top tier again.

Pagely [Reviews]

Pagely was a bit of a disappointment. They've been in the top tier in past years but fell to an honorable mention this year. The heavier LoadStorm test seemed to put some strain on the server and caused spikes and increased load times. Everything else looked very good, like previous years.

Pantheon [Reviews]

Pantheon, like Pagely, struggled with the LoadStorm test, but to a larger degree this year. It knocked them out of the top tier and didn't even earn an honorable mention in this price bracket. Everything else looked very good.

Pressable

Pressable showed up in a big way. No problems in any of the tests. Zero errors on both load tests. Easily in the top tier for this price bracket.

Pressidium

One error, nearly perfect uptime. Hard to really expect a better performance. Pressidium's entry level plan remains in the top tier for another year.

SiteGround [Reviews]

I screwed up with the Blitz load test and they got a perfect score with an extra thousand users which is impressive. They had a small spike at the start of the LoadStorm test but otherwise put on a flawless performance with 100% uptime on both monitors as well. SiteGround is in the top tier.

$25-50/Month WordPress Hosting Performance Benchmarks (2016)

LoadStormLogo

Sponsored by LoadStorm, the easy and cost-effective load testing tool for web and mobile applications.

The full company list, product list, methodology, and notes can be found here

This post focuses only on the results of the testing in the $25-50/month price bracket for WordPress Hosting.

$25-50/Month WordPress Hosting Products

review_signal_table_50

$25-50/Month WordPress Hosting Performance Benchmarks Results

1. Load Storm

Test 500-2000 Concurrent Users over 30 Minutes, 10 Minutes at Peak

Company | Total Requests | Total Errors | Peak RPS | Average RPS | Peak Response Time (ms) | Average Response Time (ms) | Total Data Transferred (GB) | Peak Throughput (MB/s) | Average Throughput (MB/s)
A2 | 193764 | 68658 | 148.28 | 107.65 | 17563 | 6541 | 7.647 | 11.37 | 4.248
CloudWays DO | 196963 | 54589 | 148.48 | 109.42 | 15809 | 5841 | 8.474 | 7.384 | 4.708
CloudWays Vultr | 207994 | 50049 | 144.13 | 115.55 | 16187 | 5339 | 9.439 | 8.398 | 5.244
Conetix | 169625 | 116960 | 134.43 | 94.24 | 18510 | 8578 | 2.635 | 3.898 | 1.464
LightningBase | 315348 | 1 | 238.4 | 175.19 | 3567 | 272 | 16.34 | 13.47 | 9.077
Pantheon | 268164 | 866 | 205.5 | 148.98 | 14422 | 315 | 6466 | 4927 | 3592
Pressable | 394405 | 26 | 294.6 | 219.11 | 15101 | 226 | 16.4 | 13.32 | 9.111
Pressjitsu | 300931 | 3913 | 228.47 | 167.18 | 11121 | 502 | 16.86 | 14.29 | 9.365
SiteGround | 300999 | 0 | 232.75 | 167.22 | 10926 | 462 | 15.83 | 14.35 | 8.972
WP Land | 294459 | 14976 | 235.63 | 163.59 | 15422 | 864 | 15.15 | 14.04 | 8.417
WPEngine | 348796 | 26572 | 270.23 | 193.78 | 15091 | 311 | 14.95 | 11.38 | 8.307
WPOven | 288369 | 0 | 217.85 | 160.21 | 5815 | 283 | 16.64 | 13.63 | 9.245


Discussion of Load Storm Test Results

Many companies handled this test without any sort of struggle: LightningBase, Pantheon [Reviews], Pressable, SiteGround [Reviews], and WPOven.com. In fact, SiteGround and WPOven managed zero errors, while LightningBase had one. Truly impressive performances put on by these companies.

Pressjitsu struggled a little bit. There were some errors and increased response times at the start of the test. It managed to stabilize for the last 22 minutes as load increased though.

WPEngine [Reviews] and WP.land struggled a bit more than Pressjitsu, but didn't completely fall apart. Both seemed to be having issues with the wp-login page, possibly security related.

A2 Hosting [Reviews], CloudWays [Reviews] (Digital Ocean & Vultr), and Conetix did not do well during this test. High error rates and slow response times show they were not equipped to handle this type of load.


2. Blitz.io

Test 1-1000 Concurrent Users over 60 seconds

Blitz Test Quick Results Table

Company | Hits | Errors | Timeouts | Average Hits/Second | Average Response Time (ms) | Fastest Response (ms) | Slowest Response (ms)
A2 | 51 | 14265 | 7339 | 1 | 800 | 411 | 1047
CloudWays DO | 28328 | 13 | 16 | 472 | 32 | 25 | 91
CloudWays Vultr | 28763 | 3 | 0 | 479 | 24 | 24 | 25
Conetix | 2359 | 1097 | 6070 | 39 | 1412 | 763 | 2410
LightningBase | 27460 | 0 | 0 | 458 | 72 | 71 | 72
Pantheon | 27755 | 0 | 0 | 463 | 61 | 60 | 67
Pressable | 25914 | 0 | 2 | 432 | 134 | 134 | 136
Pressjitsu | 23902 | 481 | 0 | 398 | 205 | 205 | 206
SiteGround | 26623 | 1 | 26 | 444 | 86 | 71 | 255
WP Land | 28352 | 0 | 1 | 473 | 39 | 38 | 40
WPEngine | 26281 | 69 | 0 | 438 | 117 | 114 | 127
WPOven | 26687 | 0 | 0 | 445 | 103 | 101 | 104

Discussion of Blitz Test 1 Results

This test simply checks whether the company is caching the front page and how well whatever caching system they have set up is performing (generally this hits something like Varnish or Nginx).

Who performed without any major issues?

CloudWays (Digital Ocean & Vultr), LightningBase, Pantheon, Pressable, SiteGround [Reviews], WPEngine [Reviews], WP.land, and WPOven.com all handled the Blitz test without any significant issues.

Who had some minor issues?

Pressjitsu again had what seems to be security related issues. A perfect flat response time but some timeouts at the end of the test.

Who had some major issues?

A2 Hosting and Conetix both failed the Blitz test.

3. Uptime Monitoring

Both uptime monitoring solutions are third-party providers that offer free services; the UptimeRobot plan was a paid one, monitoring on a 1-minute interval. All the companies were monitored over approximately two months (May-June 2016).

Uptime Robot & StatusCake

Company | StatusCake | UptimeRobot
A2 | 97.91 | 99.35
CloudWays DO | 100 | 100
CloudWays Vultr | 99.95 | 99.87
Conetix | 99.92 | 99.93
LightningBase | 100 | 100
Pantheon | 100 | 100
Pressable | 99.91 | 99.92
Pressjitsu | 99.78 | 99.65
SiteGround | 99.99 | 100
WP Land | 99.92 | 100
WPEngine | 100 | 99.99
WPOven | 100 | 100

A2 had significant downtime issues with StatusCake recording 97.91% and UptimeRobot recording 99.35% uptime. The CloudWays Vultr server had some issues with UptimeRobot recording 99.87%. Pressjitsu also had some uptime problems with StatusCake recording 99.78% and UptimeRobot 99.65%.
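To put those percentages in perspective, here's a rough conversion to actual downtime over the monitoring window (61 days is my approximation of "May-June", so treat the minute counts as ballpark figures):

```python
# Minutes in the ~2-month monitoring window (61 days assumed)
MINUTES = 61 * 24 * 60  # 87,840 minutes

def downtime_minutes(uptime_pct: float, total_minutes: int = MINUTES) -> float:
    """Convert an uptime percentage into minutes of downtime for the window."""
    return (1 - uptime_pct / 100) * total_minutes

print(round(downtime_minutes(97.91)))  # A2 per StatusCake: ~1836 min (~30.6 hours)
print(round(downtime_minutes(99.35)))  # A2 per UptimeRobot: ~571 min (~9.5 hours)
print(round(downtime_minutes(99.9)))   # the 99.9% cutoff: ~88 min
```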

Everyone else was above 99.9% on both monitors, with CloudWays Digital Ocean, LightningBase, Pantheon, and WPOven all recording perfect 100%/100% scores.

4. WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only.

Company | WPT Dulles | WPT Denver | WPT LA | WPT London | WPT Frankfurt | WPT South Africa
A2 | 0.879 | 0.747 | 1.237 | 1.61 | 2.029 | 5.657
CloudWays DO | 0.836 | 0.58 | 1.031 | 1.221 | 1.668 | 7.08
CloudWays Vultr | 0.713 | 0.676 | 1.087 | 1.109 | 1.636 | 7.643
Conetix | 2.328 | 2.078 | 2.242 | 3.845 | 3.497 | 8.69
LightningBase | 0.567 | 0.563 | 1.054 | 1.067 | 1.511 | 4.199
Pantheon | 0.86 | 0.583 | 1.024 | 1.259 | 1.649 | 7.625
Pressable | 0.945 | 0.715 | 1.162 | 1.533 | 2.013 | 9.377
Pressjitsu | 0.94 | 0.549 | 0.93 | 1.33 | 1.912 | 6.288
SiteGround | 0.838 | 0.655 | 1.043 | 1.063 | 1.693 | 6.927
WP Land | 0.816 | 0.622 | 1.002 | 1.189 | 1.693 | 3.307
WPEngine | 0.872 | 0.523 | 0.939 | 1.199 | 1.796 | 4.434
WPOven | 0.85 | 0.534 | 1.093 | 1.452 | 1.79 | 4.844

Company | WPT Singapore | WPT Shanghai | WPT Japan | WPT Sydney | WPT Brazil
A2 | 2.956 | 22.788 | 2.231 | 2.609 | 2.342
CloudWays DO | 2.123 | 22.42 | 1.898 | 1.713 | 1.755
CloudWays Vultr | 2.223 | 22.573 | 1.832 | 1.7 | 1.797
Conetix | 2.027 | 23.425 | 2.63 | 1.308 | 3.56
LightningBase | 2.041 | 23.977 | 1.717 | 1.848 | 1.667
Pantheon | 2.194 | 22.605 | 1.769 | 1.661 | 1.784
Pressable | 2.451 | 22.258 | 2.194 | 3.079 | 2.049
Pressjitsu | 2.046 | 22.352 | 1.73 | 1.416 | 2.055
SiteGround | 2.245 | 23.087 | 1.806 | 2.27 | 1.855
WP Land | 2.157 | 22.428 | 1.872 | 1.658 | 1.784
WPEngine | 2.121 | 24.584 | 1.87 | 2.051 | 1.863
WPOven | 2.089 | 2.82 | 1.796 | 1.712 | 1.859

What I learned is that getting traffic into China is terrible. Nobody really did well from the Shanghai location except WPOven, which somehow avoided the delay every other company experienced. South Africa is also really slow. Most servers were US based but were delivering content to most corners of the world in about 2 seconds or less, which is impressive.

Conetix is an Australia-focused company, and they delivered to Sydney faster than anyone, a relief in that it shows the geographic advantage is real. Beyond the Australian market, connectivity seemed slower to just about every other location. Australia has notoriously bad connectivity though, so I can see the advantage of having a company specializing in the local market.

I wish I could compare averages against last year, but one of the testing locations (Miami) was removed, and I ran a global test instead because that was something people wanted to see.

US connectivity is very fast though, with everyone delivering to Dulles (VA) and Denver (CO) in under a second (minus the Australian server), and LA at almost exactly one second for everyone.

5. WPPerformanceTester

Company | PHP Bench (seconds, lower = faster) | WP Bench (queries/sec, higher = faster)
A2 | 18.456 | 592.77
CloudWays DO | 15.028 | 353.98
CloudWays Vultr | 13.145 | 392.62
Conetix | 12.833 | 410.51
LightningBase | 10.795 | 1353.18
Pantheon | 12.292 | 189.54
Pressable | 11.062 | 525.21
Pressjitsu | 12.771 | 648.09
SiteGround | 11.414 | 1109.88
WP Land | 13.491 | 1094.09
WPEngine | 13.494 | 406.17
WPOven | 9.412 | 690.61

In this tier there was a much more normalized spread on the PHP Bench, with most companies in the 10-14 second range we saw last year. WPOven led the pack at 9.412; A2 was the slowest at 18.456.

The WP Bench scores varied a lot, again. LightningBase had another blazingly fast score of 1353.18. SiteGround and WP.land also broke the 1000 barrier, whereas last year's fastest was 889. At the bottom of the pack was Pantheon at 189.54, which I am sure they would attribute largely to infrastructure: anyone with a distributed/non-local SQL database will be slower by a lot. They would probably argue that's one of the trade-offs of scalability, and based on their load testing performance, it would be hard to argue against.
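For scale, those queries-per-second figures are easier to compare as average time per query, which is just the inverse expressed in milliseconds:

```python
def ms_per_query(queries_per_second: float) -> float:
    """Average time per query in milliseconds, from a QPS figure."""
    return 1000 / queries_per_second

print(round(ms_per_query(1353.18), 2))  # LightningBase: ~0.74 ms per query
print(round(ms_per_query(189.54), 2))   # Pantheon: ~5.28 ms per query
```

Roughly a 7x difference per query between the fastest and slowest in this bracket, consistent with a local database versus a networked one.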

Conclusion

A very crowded bracket with lots of competition. This range is still pretty entry level; not the cheapest stuff like the <$25/month plans I compared, but with the increased price came better performances. Two of the top tier companies in this bracket also account for two of the three top tier performers in the cheapest bracket, but it is nice to see some loose price-to-performance correlation in the market. Many of these plans are the entry level for their respective companies.

One of the interesting things to watch was the VPSs in this range (A2, CloudWays, Pressjitsu). They were outperformed by the shared/cloud providers, who can presumably burst more shared resources for any given site. So for spiky sites that expect a Reddit/Slashdot effect, there may be some advantage to being in that sort of environment (if you can't easily scale the VPS, which some providers make quite easy). But since these are dummy sites not really tested heavily over the two months, there is also the potential for bad neighbors negatively impacting you during such a spike, in which case you might want your own isolated VPS. I can see arguments for both sides.

Without further ado, I will tell you who had the best performance, who deserved an honorable mention and then analyze each host individually. I still don't believe in ranking in any particular order, only grouping companies by how well they performed.

Top Tier WordPress Hosting Performance

review_signal_2016_trophy_50

LightningBase, Pantheon, Pressable, SiteGround [Reviews] and WPOven.com.

Honorable Mentions

Unfortunately, no company earned honorable mention status this round. I give that status to companies that came close but weren't quite perfect, or that looked good but had something external (generally security measures) interfere with the tests while all other indications suggested they were performing well.

The closest in this bracket would have been Pressjitsu, except they had uptime issues, and I give no leeway for uptime under 99.9%.

Individual Host Analysis

A2 Hosting [Reviews]

I try to find bright spots in a company's performance, but A2 really didn't have one in this test. If you buy a VPS, there is also no default way to install WordPress beyond old-fashioned DIY. You have to pay extra for the Softaculous installer in the admin panel.

CloudWays [Reviews] Digital Ocean / Vultr

The most interesting part of CloudWays is being able to see the same stack tested on multiple providers. It's a small sample, but it looks like Vultr marginally outperforms Digital Ocean, although Digital Ocean was more stable (again, a small sample size for a head-to-head comparison). It was nice to see CloudWays do well in the Blitz tests and keep very good uptime, especially the Digital Ocean machine, which was perfect.

Conetix

Conetix had good uptime and connection to Australia, their target market. They strongly recommend using W3TC, but it didn't come fully installed, and I don't test anything beyond the default configuration because it gets into too much minutiae and conflict with hosts about what could be done to improve scores. I also believe most people stick with the defaults, based on all the user testing I've seen across various fields. Unfortunately, the result was that their load test performances didn't look very good.

(9/19/2019 Update) Conetix have issued their own statement regarding Review Signal's test and why they believe this methodology doesn't accurately represent their performance and why a unique Australian perspective is required when evaluating them. I recommend reading the full details.

LightningBase

LightningBase put on basically a perfect performance. 100% uptime on both monitors. 0 errors on blitz, 1 error on loadstorm. Unequivocally, a top tier performance.

Pantheon [Reviews]

Pantheon showed up again, in a good way. They earned themselves a top tier performance accolade. They had a few errors at the start of the LoadStorm test, but beyond that aced everything.

Pressable

Pressable is back for the first time since my original testing in 2013, now under new ownership (WordPress.com). They had some good tech back then, but it wasn't perfect and had some minor issues. I can happily say that has changed, as they delivered a top tier performance this year with no issues in any test.

Pressjitsu

Pressjitsu felt like 2013-era Pressable: the foundations of a really good company, but they just didn't get it all put together. The biggest problem was the sub-99.9% uptime. What appeared to be security measures marred the Blitz test, and they had some errors at the start of the LoadStorm test but managed to stabilize for the duration and put on a good showing.

SiteGround [Reviews]

SiteGround got even better this year. They jumped up from honorable mention to top tier status. Their Blitz and LoadStorm tests both improved while everything else remained at a high level. An all around fantastic performance which deserved top tier status.

WPEngine [Reviews]

WPEngine fell slightly this year. It may have been a security issue with wp-login during the LoadStorm test, but there were too many errors to award this plan the honorable mention status it earned last year. Everything else looked good though.

WP.land

WP.land, like WPEngine, had too many problems during the LoadStorm test to earn honorable mention status. Everything else looked very good for them, and it's great to see a strong new entrant.

WPOven.com

The knock on WPOven last year was their LoadStorm test. Everything else was perfect. I'm glad they maintained everything else, but this time they managed a perfect LoadStorm test to boot. A huge improvement and a very well deserved entry in the top tier of WordPress Hosts in the $25-50 range.

WordPress Hosting Performance Benchmarks (2016)

LoadStormLogo

Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

2018 WordPress Hosting Performance Benchmarks is now live.

This is the fourth round of managed WordPress web hosting performance testing. You can see the original, the 2014 version, and the 2015 version.

Companies Tested

A2 Hosting [Reviews]
BlueHost [Reviews]
CloudWays [Reviews]
Conetix
DreamHost [Reviews]
FlyWheel [Reviews]
GoDaddy [Reviews]
Incendia Web Works
Kinsta
LightningBase
LiquidWeb [Reviews]
MediaTemple [Reviews]
Pagely [Reviews]
Pantheon [Reviews]
Pressable (Formerly ZippyKid)
Pressed.net
Pressidium
Pressjitsu
PressLabs
Hosting Agency (German)
SiteGround [Reviews]
Traffic Planet Hosting
WordPress.com VIP
WPEngine [Reviews]
WP.land
WPOven.com

Companies that didn't participate this round but did in previous rounds: WebHostingBuzz, WPPronto, Nexcess, A Small Orange [Reviews] and WebSynthesis [Reviews].

Every plan was donated by the company for testing purposes with the strict stipulation that it would be the same as if any normal user signed up. There is a notes section at the end of this post that details the minutiae of changes made to plans. Nearly every single company had security issues that I had to get around, so they worked with me to make sure my testing went through properly. Load testing often looks like an attack, and working around such measures is the only way I can do these tests.

The Products

This year is a bit different than years past where every company and plan competed against one another. When I started the price gap was from $5/month to $29/month. Last year the gap was $5.95 to $299. I was only testing entry level plans but the market has dramatically changed since I first got started. Today, there is demand at many different price points and lots of companies have gone upscale with WordPress.com VIP at the top of the price bracket starting at $5,000/month. The only logical way to break things up was by price brackets. So below you will see the brackets and which companies participated. Specific details will be included on each bracket's write up.


<$25/m: A2 Hosting, Bluehost, DreamHost LLC, Flywheel, GoDaddy, Incendia Web Works, Lightning Base, Media Temple, Pressed, Hosting Agency.de, SiteGround, Traffic Planet Hosting, WP.land

$25-50/m: A2 Hosting, Conetix, Lightning Base, Pantheon, Pressable, Pressjitsu, SiteGround, WP Engine, WP.land, Cloudways (DigitalOcean), Cloudways (Vultr), WPOven

$51-100/m: LiquidWeb, Bluehost, Cloudways (AWS), Cloudways (Google), Kinsta, Lightning Base, Media Temple, Pagely, Pantheon, Pressable, Pressidium, SiteGround

$101-200/m: A2 Hosting, Bluehost, Conetix, Kinsta, Liquid Web, Pressable, Pressidium, Pressjitsu

$201-500/m: Kinsta, Media Temple, Pagely, Pantheon, Pressable, Pressidium, Presslabs, SiteGround

$500+/m: Kinsta, Pagely, Pantheon, Pressable, Pressidium, WordPress.com VIP, WP Engine


Methodology

The question I tried to answer is how well do these WordPress hosting services perform? I tested each company on two distinct measures of performance: peak performance and consistency. I've also included a compute and database benchmark based on a WordPress plugin.

All tests were performed on an identical WordPress dummy website with the same plugins except in cases where hosts added extra plugins. Each site was monitored for approximately two months for consistency.

1. LoadStorm

LoadStorm was kind enough to give me resources to perform load testing on their platform, and multiple staff members were involved in designing and testing these WordPress hosts. I created identical scripts for each host that load the site, log in to the site and browse the site. Logged-in users were designed to break some of the caching and better simulate real user load. The number of users varies by cost.

2. Blitz.io

I used Blitz again to compare against previous results. This tests the static caching of the homepage. I increased the number of users based on monthly cost this time.

3. Uptime (UptimeRobot and StatusCake)

Consistency matters. I wanted to see how well these companies performed over a longer period of time. I used two separate uptime monitoring services over the course of a month to test consistency.
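
As a point of reference for the 99.9% cutoff used in the write-ups above, the arithmetic is simple. A Python sketch (the 30-day month is an assumption for illustration):

```python
def allowed_downtime_minutes(uptime_pct: float, days: int = 30) -> float:
    """Minutes of downtime permitted in a period while still hitting uptime_pct."""
    total_minutes = days * 24 * 60
    return total_minutes * (1 - uptime_pct / 100)

def uptime_pct(down_minutes: float, days: int = 30) -> float:
    """Uptime percentage given observed downtime over the period."""
    total_minutes = days * 24 * 60
    return 100 * (1 - down_minutes / total_minutes)
```

At 99.9% over a 30-day month, a host gets roughly 43 minutes of downtime before falling below the cutoff, which is why even one extended outage can sink an otherwise strong showing.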

4. WebPageTest.org

WebPageTest with 9 runs, first view only, native connection. I tested from Dulles, Denver, Los Angeles, London, Frankfurt, South Africa, Singapore, Shanghai, Japan, Sydney, Brazil.

5. WPPerformanceTester (free plugin on WordPress.org)

I created a WordPress plugin to benchmark CPU, MySQL and WordPress DB performance. The CPU/MySQL benchmarks test compute power. The WordPress component tests actual calls to $wpdb, executing insert, select, update and delete queries.
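
As a rough illustration of what the WP DB portion does, here is a Python sketch that times batches of insert, select, update and delete queries. The real plugin is PHP and goes through $wpdb to MySQL; the in-memory SQLite database here is my stand-in, so absolute numbers won't match the plugin's output:

```python
import sqlite3
import time

def crud_benchmark(rows: int = 250) -> dict:
    """Time batches of insert/select/update/delete queries against an
    in-memory SQLite table, mirroring the shape of a WP DB benchmark."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE bench (id INTEGER PRIMARY KEY, val TEXT)")
    timings = {}

    start = time.perf_counter()
    for i in range(rows):
        conn.execute("INSERT INTO bench (val) VALUES (?)", (f"row{i}",))
    timings["insert"] = time.perf_counter() - start

    start = time.perf_counter()
    for i in range(rows):
        conn.execute("SELECT val FROM bench WHERE id = ?", (i + 1,)).fetchone()
    timings["select"] = time.perf_counter() - start

    start = time.perf_counter()
    for i in range(rows):
        conn.execute("UPDATE bench SET val = ? WHERE id = ?", (f"new{i}", i + 1))
    timings["update"] = time.perf_counter() - start

    start = time.perf_counter()
    for i in range(rows):
        conn.execute("DELETE FROM bench WHERE id = ?", (i + 1,))
    timings["delete"] = time.perf_counter() - start

    conn.close()
    return timings
```

Summing the four timings and dividing the total query count by that sum gives a queries-per-second figure of the kind reported in the WP Bench column.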


Notes - Changes made to Hosting Plans

A2 - VPS Servers can't install WordPress out of the box without extra payment for Softaculous. Disabled recaptcha.

Conetix - disabled WordFence and Stream plugins.

SiteGround - fully enable SuperCacher plugin

GoDaddy - 24 database connection limit increased if you notify them of heavy load

CloudWays - disabled WordFence