Tag Archives: bluehost

Endurance International Group 2016 Financials

Endurance International Group (NASDAQ: EIGI) published their 10-K on February 24, 2017.

As one of the biggest players in the space, I like to look through and see what's going on with them.

2016's biggest news for EIG was the acquisition of Constant Contact for 1.1 billion dollars. Their financials have been broken apart now between Web Presence (hosting, domains, etc) and Email (Constant Contact).

It was reinforced early that BlueHost and HostGator are their primary brands and they plan on pushing them with more brand advertising (tv, podcasts, etc). I wonder if we will see a BlueHost superbowl ad to compete with GoDaddy?

In 2015, our total subscriber base increased. In 2016, excluding the effect of acquisitions and adjustments, our total subscriber base was essentially flat, and in our web presence segment, ARPS decreased from $14.18 for 2015 to $13.65 for 2016. We expect that our total subscriber base will decrease in 2017. The factors contributing to our lack of growth in total subscribers and decrease in web presence segment ARPS during 2016 and our expected decrease in total subscribers during 2017 are discussed in “Item 7 -  Management’s Discussion and Analysis of Financial Condition and Results of Operations ”. If we are not successful in addressing these factors, including by improving subscriber satisfaction and retention, we may not be able to return to or maintain positive subscriber or revenue growth in the future, which could have a material adverse effect on our business and financial results.

Year Ended December 31 (subscriber counts and Adjusted EBITDA in thousands; average revenue per subscriber in dollars per month):

Consolidated metrics: 2014 / 2015 / 2016
Total subscribers: 4,087 / 4,669 / 5,371
Average subscribers: 3,753 / 4,358 / 5,283
Average revenue per subscriber: $13.98 / $14.18 / $17.53
Adjusted EBITDA: $171,447 / $219,249 / $288,396

Web presence segment metrics (2016):
Total subscribers: 4,827
Average subscribers: 4,789
Average revenue per subscriber: $13.65
Adjusted EBITDA: $172,135

Email marketing segment metrics (2016):
Total subscribers: 544
Average subscribers: 494
Average revenue per subscriber: $55.11
Adjusted EBITDA: $116,261

Overall, it's probably not a good sign to see Average Revenue Per Subscriber going down in their hosting segment, which is the core of the business. The Email segment is hiding/offsetting that a lot.
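
To see how much the Email segment props up the consolidated number, here's a quick back-of-the-envelope check using the segment figures from the table above (a rough sketch; subscriber counts are in thousands and ARPS is the reported monthly average):

```python
# Blended ARPS check using the 2016 segment averages from the 10-K table above.
web_subs, web_arps = 4789, 13.65      # web presence segment (thousands, $/month)
email_subs, email_arps = 494, 55.11   # email marketing segment (thousands, $/month)

blended = (web_subs * web_arps + email_subs * email_arps) / (web_subs + email_subs)
print(round(blended, 2))  # ~17.53, the consolidated ARPS reported for 2016
```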

[Our strategic brands, including] HostGator, iPage, Bluehost, and our site builder brand showed positive net subscriber adds in the aggregate during 2016, but these positive net adds were outweighed by the negative impact of subscriber losses in non-strategic hosting brands, our cloud storage and backup solution, and discontinued gateway products such as our VPN product. We expect total subscribers to decrease overall and in our web presence segment during 2017, due primarily to the impact of subscriber churn in these non-strategic and discontinued brands. We expect total subscribers to remain flat to slightly down in our email marketing segment.

The future doesn't look good based on these statements. Decreasing ARPS and decreasing subscriber base seem like a recipe for decline. They don't seem to even expect growth in the email marketing segment. I'm really having a hard time seeing any positive outlook on this.

In 2017, we are focused on improving our product, customer support and user experience within our web presence segment in order to improve our levels of customer satisfaction and retention. If this initiative is not successful, and if we are unable to provide subscribers with quality service, this may result in subscriber dissatisfaction, billing disputes and litigation, higher subscriber churn, lower than expected renewal rates and impairments to our efforts to sell additional products and services to our subscribers, and we could face damage to our reputation, claims of loss, negative publicity or social media attention, decreased overall demand for our solutions and loss of revenue, any of which could have a negative effect on our business, financial condition and operating results.
Our planned transfer of our Bluehost customer support operations to our Tempe, Arizona customer support facility presents a risk to our customer satisfaction and retention efforts in 2017. Although we believe that the move to Tempe will ultimately result in better customer support, the transition may have the opposite effect in the short term. We expect that the transition will take place in stages through the fourth quarter of 2017, and until the transition is complete, we may continue to handle some support calls from our current Orem, Utah customer support center. The morale of our customer support agents in Orem may be low due to the pending closure of the Orem office, and agents may decide to leave for other opportunities sooner than their scheduled departure dates. Either or both of these factors could result in a negative impact on Bluehost customer support, which could lead to subscriber cancellations and harm to our reputation, and generally impede our efforts to improve customer satisfaction and retention in the short term. In addition, we are consolidating our Austin, Texas support operation into our Houston, Texas support center, which could also negatively impact customer support provided from those locations during the transition period.

The story about BlueHost getting rid of hundreds of jobs in Orem was widely talked about. It also came up that A Small Orange was getting some of the same treatment, which would be in line with getting rid of the Austin office where ASO was based. It's interesting to see EIG selling this as a 'long term' move, unless it's entirely a financial one to reduce costs. I've yet to see a single EIG brand substantially increase its rating, but EIG has destroyed plenty of them (The Rise and Fall of A Small Orange or The Sinking of Site5). The companies they acquired often had much better ratings and knew how to provide customer support.

I did find one interesting bit in the contract with Tregaron India Holdings (operating as GLOWTOUCH or Daya): the line item for "New hire and ongoing training for all support positions." It makes it sound like this third-party company is responsible for training all EIG support staff, along with many other things like migrations. Those migrations have been absolutely disastrous and are how Arvixe, whose migration was handled by this group, ended up as one of Review Signal's lowest rated brands.

But who are Tregaron?

The Company has contracts with Tregaron India Holdings, LLC and its affiliates, including Diya Systems (Mangalore) Private Limited, Glowtouch Technologies Pvt. Ltd. and Touchweb Designs, LLC, (collectively, “Tregaron”), for outsourced services, including email- and chat-based customer and technical support, network monitoring, engineering and development support, web design and web building services, and an office space lease. These entities are owned directly or indirectly by family members of the Company’s chief executive officer, who is also a director and stockholder of the Company.

In 2016 EIG spent $14,300,000 with Tregaron. And it wasn't the only business connected to the CEO.

The Company also has agreements with Innovative Business Services, LLC (“IBS”), which provides multi-layered third-party security applications that are sold by the Company. IBS is indirectly majority owned by the Company’s chief executive officer and a director of the Company, each of whom are also stockholders of the Company. During the year ended December 31, 2014, the Company’s principal agreement with this entity was amended which resulted in the accounting treatment of expenses being recorded against revenue.

Another $5,100,000 for this particular company.

So how bad were those migrations?

A key purpose of many of our smaller acquisitions, typically acquisitions of small hosting companies, has been to achieve subscriber growth, cost synergies and economies of scale by migrating customers of these companies to our platform. However, for several of our most recent acquisitions of this type, migrations to our platform have taken longer and been more disruptive to subscribers than we anticipated. If we are unable to improve upon our recent migration efforts and continue to experience unanticipated delays and subscriber disruption from migrations, we may not be able to achieve the expected benefits from these types of acquisitions.

Understatement at its finest.

Overall, things look pretty glum at EIG, which was trading at over $9 on the day this came out and is now under $8/share.

I generally try to keep my opinions fairly limited, but some things need to be called out for the good of consumers. EIG has acquired a lot of talented people and managed to squander them repeatedly. I'm not sure why the company seems to be toxic towards retaining good talent. When EIG writes statements about trying to improve customer service, acquires some of the highest rated brands Review Signal tracks (like A Small Orange), and then dismantles them, it creates a cognitive dissonance.

Perhaps EIG needs to get rid of the top management. The incestuous relationships between the contracted companies and the CEO create some questionable incentives. Combined with the objectively poor results from those companies on things like migrations, it seems inexcusable. I'm not optimistic about anything EIG is doing and feel bad for some of the exceptional people I know that still work there.

The Sinking of Site5 – Tracking EIG Brands Post Acquisition

"You'll notice their ratings, in general, are not very good with Site5 (their most recent acquisition) being the exception. iPage was acquired before I started tracking data. BlueHost/HostMonster also had a decline, although the data doesn't start pre-acquisition. JustHost collapses post acquisition. NetFirms has remained consistently mediocre. HostGator collapses with a major outage a year after acquisition. Arvixe collapses a year after being acquired. Site5 is still very recent and hasn't shown any signs of decline yet." - The Rise and Fall of A Small Orange, January 2016

Review Signal Rating Calculated Pos/(Pos+Neg), without duplicate filtering (January 2016)

That's what I wrote at the beginning of 2016 as I watched A Small Orange's rating collapse in a pretty popular post called The Rise and Fall of A Small Orange, which documented ASO's Rise and Fall, but also the fall of many EIG brands. One thing I mentioned was the recent acquisition of Site5 (and Verio) which had a fairly good rating on Review Signal at the time of acquisition. The trend seemed to be roughly a year to see the drop in rating, post acquisition.

Site5 ~ 1 Year Later

The acquisition of Site5 was announced August 2015. Here's the updated EIG brand tracking graph. One thing to note: this now uses the new rating algorithm, which has a built-in decay function to weight older reviews less. The new graph uses the new algorithm, calculating each point in time as if the algorithm had always been in use. There will be some differences between it and the original graph (which prompted the change in algorithm). The difference is minimal for most brands; it only shows changes more quickly when there is a major shift in sentiment. Full details about the change can be read in Review Signal Ranking Algorithm Update.
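
For readers curious what a decay-weighted Pos/(Pos+Neg) score looks like in practice, here is a minimal sketch. The exponential decay and the one-year half-life are illustrative assumptions, not Review Signal's actual parameters; see the algorithm update post for the real details.

```python
from datetime import datetime

HALF_LIFE_DAYS = 365  # hypothetical half-life, for illustration only

def weight(review_date, as_of):
    """Older reviews count for less; weight halves every HALF_LIFE_DAYS."""
    age_days = (as_of - review_date).days
    return 0.5 ** (age_days / HALF_LIFE_DAYS)

def rating(reviews, as_of):
    """reviews: iterable of (review_date, is_positive) tuples."""
    pos = sum(weight(d, as_of) for d, p in reviews if p)
    neg = sum(weight(d, as_of) for d, p in reviews if not p)
    return pos / (pos + neg) if (pos + neg) else None

# Example: an old positive review is outweighed by a recent negative one.
reviews = [(datetime(2014, 6, 1), True), (datetime(2016, 5, 1), False)]
print(round(rating(reviews, datetime(2016, 11, 1)), 2))
```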

[Chart: Review Signal ratings of EIG brands over time, updated through late 2016]

What you can see is the reputation remained relatively stable until about April 2016 and then started a slow but steady decline where it has dipped below 50% for the first time recently. As with nearly every brand, except A Small Orange, the decline happened within a year.

Since the original post there also hasn't been much movement in any other brands beyond Site5 crashing and A Small Orange continuing to slide downward. Verio didn't see a dip post-acquisition, but it had a pretty low rating to start with that put it in the bottom half of EIG brand ratings already.

Why Do EIG Brands Go Down Post Acquisition?

The longer I am in this industry, the more stories I hear. A Small Orange was such an interesting exception and I've heard a lot about it from a lot of people. Its relative independence and keeping its staff seemed to be the key to maintaining a good brand even within the EIG conglomerate.

Site5 offers what I imagine is more business-as-usual in the EIG world: cut staff, migrate to EIG's platform and maximize profit (in the short term). Site5's founder, Ben, reached out to a competitor, SiteGround, and arranged for them to hire a large number of Site5 staff that EIG had no plans on keeping, according to SiteGround's blog. A very classy move from the former CEO and a seeming win for SiteGround, one of EIG's larger hosting competitors. I also saw similar behavior of long-time staff all leaving when A Small Orange started to go downhill and staff from other EIG brands showed up.

Beyond simply trying to cut costs, you have to wonder why you would spend all that money acquiring brands that have lots of customers, good reputations and talented staff who are obviously keeping the operation running successfully, only to get rid of nearly all of that except the customers. But once you gut the staff, it seems like the customers notice, because it certainly shows up in the data I track.

Conveniently, EIG just published their Q3 2016 10-Q.

We have certain hosting and other brands to which we no longer allocate significant marketing or other funds. These brands generally have healthy free cash flow, but we do not consider them strategic or growth priorities. Subscriber counts for these non-strategic brands are decreasing. While our more strategic brands, in the aggregate, showed net subscriber adds during the quarter ended September 30, 2016, the net subscriber losses in non-strategic brands and certain gateway brands contributed to a decrease in our total subscribers of approximately 42,000 during the quarter. We expect that total subscribers will continue to decrease in the near term.

Overall, our core hosting and web presence business showed relatively slow revenue and subscriber growth during the first nine months of 2016. We believe that this is due to flat marketing expenditures relative to 2015 levels on this business in the first half of 2016 as a result of our focus on gateway products during that period, and to trends in the competitive landscape, including greater competition for referral sources and an increasing trend among consumers to search for web presence and marketing solutions using brand-related search terms rather than generic search terms such as “shared hosting” or “website builder”. We believe this trend assists competitors who have focused more heavily than we have on building consumer awareness of their brand, and that it has made it more challenging and more expensive for us to attract new subscribers. In order to address this trend, during the third quarter of 2016, we began to allocate additional marketing investment to a subset of our hosting brands, including our largest brands, Bluehost.com, HostGator and iPage. We plan to continue this increased level of marketing investment in the near term, and are evaluating different marketing strategies aimed at increasing brand awareness.

So the result of their current strategy this past quarter has been a net loss of 42,000 customers. They say their strategic brands in aggregate had a net subscriber increase, named the largest ones (BlueHost, HostGator, iPage), and said they are going to focus on a subset of brands going forward. But the phrasing would seem to imply that some of the strategic brands experienced losses as well. It also means that the non-strategic brands lost more than 42,000 customers and pulled total net subscribers down to -42,000 last quarter.

To cap it all off, I got one of the most surprising emails from Site5 a couple days ago.

We wanted to let you know that we’ve decided to terminate the Site5 Affiliate program as of November 30th, 2016.

We want to thank you for your support of Site5, especially during our most recent move into Impact Radius, and we hope that you’ll consider promoting another one of Endurance’s other programs.

I guess Site5 isn't being considered a strategic brand if they are killing off the affiliate channel on it entirely, right after a big migration from Site5's custom affiliate program to Impact Radius. They also asked that affiliates promote HostGator now, which certainly fits in the strategic brand category.

It's extremely disappointing to see this trend continue of brands collapsing after a year in EIG's hands. What will be interesting going forward is that EIG hasn't acquired any new hosting brands for a while. They seem to be focused on their existing brands for now. I wonder if that will mean we will see any noticeable positive change or improvements in existing brands (or at least some of the strategic brands).

$101-200/Month WordPress Hosting Performance Benchmarks (2016)


Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

The full company list, product list, methodology, and notes can be found here

This post focuses only on the results of the testing in the $101-200/month price bracket for WordPress Hosting.

$101-200/Month WordPress Hosting Products

[Image: table of $101-200/month WordPress Hosting products tested]

$101-200/Month WordPress Hosting Performance Benchmarks Results

1. Load Storm

Test 500-4000 Concurrent Users over 30 Minutes, 10 Minutes at Peak

Company Total Requests Total Errors Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred (GB) Peak Throughput (MB/s) Average Throughput (MB/s)
A2 363070 163790 264.15 201.71 15443 6857 11.75 13.88 6.528
BlueHost 322139 166336 267.9 178.97 20999 9268 9.42 7.09 5.24
Conetix 341733 145110 243.3 189.85 16202 7347 11.74 13.87 6.52
Kinsta 546252 0 425.67 303.47 9078 286 31.47 24.95 17.48
LiquidWeb 635893 76 490.78 353.27 15097 360 31.3 25.19 17.39
Pressable 724499 1090 562.12 402.5 15024 447 30.91 26.07 17.17
Pressidium 563624 0 435.43 313.12 3561 272 30.82 24.44 17.12
Pressjitsu 434368 41339 339.37 241.32 15605 3173 22.5 18.67 12.5

Discussion of Load Storm Test Results

Kinsta, LiquidWeb [Reviews], Pressable, and Pressidium had no problems with this test.

A2 Hosting [Reviews], BlueHost [Reviews], Conetix, and Pressjitsu struggled with this test. BlueHost struggled off the bat. A2 and Conetix struggled a couple minutes in. Pressjitsu made it about 12 minutes in before it started erroring, but it had increasing load times starting around 6 minutes in. They all lasted varying amounts of time, but none were ready to handle this sort of load.

2. Blitz.io

Test 1-3000 Concurrent Users over 60 seconds

Blitz Test Quick Results Table

Company Hits Errors Timeouts Average Hits/Second Average Response Time Fastest Response Slowest Response
A2 120 43508 21784 2 518 304 733
BlueHost 28568 11753 7945 476 929 192 1889
Conetix 155 16827 13990 3 1470 872 2184
Kinsta 81397 3 0 1357 84 83 85
LiquidWeb 81393 47 10 1357 80 76 118
Pressable 77652 0 4 1294 134 141 133
Pressidium 85916 6 0 1432 27 25 31
Pressjitsu 67297 5833 0 1122 208 205 236

Discussion of Blitz Test 1 Results

This test simply checks whether the company is caching the front page and how well whatever caching system they have set up performs (generally this hits something like Varnish or Nginx).
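
Blitz itself is a hosted service, but the idea is easy to sketch: ramp up concurrent clients against the (cached) front page and watch whether error rates and response times stay flat. The URL and concurrency levels below are placeholders, not the actual test configuration:

```python
import time
import concurrent.futures
import requests

URL = "https://example-wordpress-site.com/"  # hypothetical test site

def hit(_):
    start = time.monotonic()
    try:
        ok = requests.get(URL, timeout=10).status_code == 200
    except requests.RequestException:
        ok = False
    return ok, time.monotonic() - start

def rush(concurrency, total_requests=200):
    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(hit, range(total_requests)))
    errors = sum(1 for ok, _ in results if not ok)
    avg_ms = 1000 * sum(t for _, t in results) / len(results)
    return errors, avg_ms

for level in (10, 50, 100, 250, 500):  # ramp up concurrency
    errors, avg_ms = rush(level)
    # A well-cached front page keeps avg_ms roughly flat as concurrency grows.
    print(f"{level:>4} concurrent: {errors} errors, {avg_ms:.0f} ms avg")
```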

Who performed without any major issues?

Kinsta, LiquidWeb [Reviews], Pressable, and Pressidium all handled this test without issue, again.

Who had some minor issues?

Pressjitsu kept a flat response time but had a lot of errors start to build as the test scaled up. Might have been a security measure blocking it.

Who had some major issues?

BlueHost [Reviews] managed to last about 22 seconds before it started to be impacted by the load.

A2 Hosting and Conetix were overloaded almost immediately.

3. Uptime Monitoring

Both uptime monitoring solutions are third-party providers that offer free services; UptimeRobot's paid tier was used to monitor on a 1-minute interval. All the companies were monitored over approximately two months (May-June 2016).
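
The monitors themselves are third-party services, but the mechanics are simple enough to sketch: check the site once a minute and report the percentage of successful checks. The URL is a placeholder and the 500-status cutoff is my own simplification:

```python
import time
import requests

URL = "https://example-wordpress-site.com/"  # placeholder site to monitor
checks = failures = 0

while True:
    checks += 1
    try:
        up = requests.get(URL, timeout=30).status_code < 500
    except requests.RequestException:
        up = False
    if not up:
        failures += 1
    uptime = 100 * (checks - failures) / checks
    print(f"uptime: {uptime:.2f}% over {checks} one-minute checks")
    time.sleep(60)  # 1-minute interval, like the paid UptimeRobot plan
```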

Uptime Robot & StatusCake

Company StatusCake UptimeRobot
A2 99.64 100
BlueHost 100 99.99
Conetix 99.52 99.7
Kinsta 99.98 99.99
LiquidWeb 100 100
Pressable 99.96 99.94
Pressidium 99.97 99.99
Pressjitsu 99.99 99.99

Conetix had some uptime issues, recording 99.52% and 99.7% on StatusCake and UptimeRobot respectively.

A2 had a strange discrepancy, with UptimeRobot showing 100% and StatusCake recording 99.64%.

Everyone else maintained above 99.9% on both monitors.

4. WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only.

Company WPT Dulles WPT Denver WPT LA WPT London WPT Frankfurt WPT South Africa
A2 0.924 0.654 1.199 1.554 1.989 5.118
BlueHost 0.969 0.588 0.988 1.684 2.006 6.23
Conetix 2.703 2.026 2.194 3.372 3.339 6.964
Kinsta 0.817 0.577 0.982 1.15 1.721 5.081
LiquidWeb 0.887 0.578 1.059 1.179 1.748 4.227
Pressable 0.969 0.738 1.135 1.493 1.95 7.669
Pressidium 0.639 0.627 1.174 1.187 1.705 5.303
Pressjitsu 0.915 0.677 0.87 1.302 1.786 6.433
Company WPT Singapore WPT Shanghai WPT Japan WPT Sydney WPT Brazil
A2 2.618 22.224 2.114 2.592 2.162
BlueHost 2.247 22.406 1.937 1.755 2.22
Conetix 3.092 22.465 2.818 1.493 3.448
Kinsta 2.054 22.743 2.064 1.704 2.345
LiquidWeb 2.215 22.378 1.983 1.977 1.823
Pressable 2.476 22.395 2.146 2.879 2.479
Pressidium 2.08 22.461 2.053 1.893 1.803
Pressjitsu 2.172 22.317 1.701 1.871 2.19

Everyone was pretty fast around the world without huge red flags anywhere.

Conetix had slow scores to a lot of locations, but thankfully they were the fastest in Sydney (because they are focused on the Australian market and based there).

5. WPPerformanceTester

Company PHP Bench [Seconds] (lower=faster) WP Bench [Queries Per Second](higher=faster)
A2 9.336 1440.92
BlueHost 12.276 956.94
Conetix 12.019 418.76
Kinsta 11.458 330.58
LiquidWeb 7.122 1102.54
Pressable 10.788 514.13
Pressidium 10.739 281.14
Pressjitsu 12.3 574.38

At this mid-range tier we see pretty much only VPS/dedicated and cloud/clustered solutions. LiquidWeb's VPS again got one of the fastest PHP Bench scores I've seen recorded. The VPS/dedicated offerings also generally put up much faster WP Bench scores, with A2 leading the way on their dedicated server. The cloud/clustered solutions (Kinsta, Pressable, Pressidium) were around 500 queries per second and below. The only exception was Conetix, which was a VPS.
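
For context on what those WP Bench numbers mean: the plugin times a loop of insert/select/update/delete queries and reports queries per second. The real plugin is PHP and goes through $wpdb; the sketch below is a rough Python analogue with placeholder connection details, assuming the mysql-connector-python package:

```python
import time
import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(user="bench", password="secret",
                               host="127.0.0.1", database="benchdb")
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS bench "
            "(id INT AUTO_INCREMENT PRIMARY KEY, v VARCHAR(64))")

N = 250  # 250 iterations x 4 statements = 1000 queries
start = time.monotonic()
for i in range(N):
    cur.execute("INSERT INTO bench (v) VALUES (%s)", (f"value-{i}",))
    cur.execute("SELECT v FROM bench WHERE id = LAST_INSERT_ID()")
    cur.fetchall()
    cur.execute("UPDATE bench SET v = %s WHERE id = LAST_INSERT_ID()", (f"updated-{i}",))
    cur.execute("DELETE FROM bench WHERE id = LAST_INSERT_ID()")
conn.commit()
print(f"{4 * N / (time.monotonic() - start):.0f} queries per second")

cur.execute("DROP TABLE bench")
conn.close()
```

Run against a local database this will look far faster than against a networked one, which is exactly the pattern the results above show for clustered platforms.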

Conclusion

Top Tier WordPress Hosting Performance


Kinsta, LiquidWeb [Reviews], Pressable, and Pressidium were the top tier in the $101-200/month price range.

Individual Host Analysis

A2 Hosting [Reviews]

The bright spot was the WP Bench test, where A2's dedicated server was way faster than the competition. The raw power of a dedicated machine is nice, but without the extra caching software that the top tier hosts were running, it unfortunately fell flat in the load tests.

BlueHost [Reviews]

Another disappointing performance in the load tests. The uptime and other tests were fine.

Conetix

Overall, they didn't perform that well. Uptime wasn't on par and the load test results were disappointing. The only bright spot was they were the fastest in Australia.

(9/19/2019 Update) Conetix have issued their own statement regarding Review Signal's test, explaining why they believe this methodology doesn't accurately represent their performance and why a unique Australian perspective is required when evaluating them. I recommend reading the full details.

Kinsta

Kinsta continues to earn top tier status. I can't find anything to say beyond the fact they performed near perfectly, again.

LiquidWeb [Reviews]

LiquidWeb's lower priced plan performed spectacularly, and their higher end product only continued that trend. They had a bracket-leading PHP Bench, perfect uptime, and aced the load tests. LiquidWeb has easily gone from brand new to top tier WordPress hosting status this year.

Pressable

Pressable continues its trend of excellent load tests, but at this tier they put everything together and earned themselves top tier status.

Pressidium

Another test, another top tier performance. Not much to say beyond, excellent.

Pressjitsu

Pressjitsu did better than the other companies that didn't earn top tier status, but found themselves clearly below the top tier with some struggles in the load tests. Security may have messed up the Blitz test, but the same can't be said about LoadStorm, which showed real signs of stress. It seems like a good foundation, but they need to get everything running a bit better to earn top tier recognition; hopefully next year.

$51-100/Month WordPress Hosting Performance Benchmarks (2016)


Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

The full company list, product list, methodology, and notes can be found here

This post focuses only on the results of the testing in the $51-100/month price bracket for WordPress Hosting.

$51-100/Month WordPress Hosting Products

[Image: table of $51-100/month WordPress Hosting products tested]

 

$51-100/Month WordPress Hosting Performance Benchmarks Results

1. Load Storm

Test 500-3000 Concurrent Users over 30 Minutes, 10 Minutes at Peak

Company Total Requests Total Errors Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred (GB) Peak Throughput (MB/s) Average Throughput (MB/s)
BlueHost 322139 166336 267.9 178.97 20999 9268 9.425 7.086 5.236
CloudWays Amazon 306701 73421 214.07 170.39 15256 4810 13.9 10.05 7.723
CloudWays Google 267495 128912 199.23 148.61 15392 7341 8.35 6.595 4.639
Kinsta 416335 544 324.57 231.3 15059 317 24.01 19.91 13.34
LightningBase 456430 0 356.3 253.57 3909 261 23.65 19.41 13.14
LiquidWeb 520072 2745 408.3 288.93 15322 525 24.04 19.69 13.35
Media Temple 486702 8588 397.55 270.39 16001 582 25.43 23.08 14.13
Pagely 392898 1952 298.8 218.28 15178 1593 21.38 16.85 11.88
Pantheon 409962 57051 325.53 227.76 11682 762 20.74 17.97 11.52
Pressable 569095 0 441.43 316.16 3152 239 24.35 20.19 13.53
Pressidium 429538 0 335.78 238.63 3030 306 16.11 13.26 8.951
SiteGround 449038 742 352.05 249.47 11247 383 22.93 19.26 12.74

Discussion of Load Storm Test Results

Kinsta, LightningBase, LiquidWeb [Reviews], Pressable, Pressidium and SiteGround [Reviews] all handled this test without any serious issues.

MediaTemple [Reviews] had some minor issues with spikes and increasing average response times.

Pagely [Reviews] had some spikes but more concerning was the increased response times which were averaging around 3000ms during the 10 minute peak of the test. It kept the website up and error rate low enough (0.5%), but it was definitely struggling to keep up.

BlueHost [Reviews], CloudWays [Reviews] (Amazon + Google) and Pantheon [Reviews] all struggled with this load test. BlueHost crashed (85% error rate). CloudWays Google had 48% errors; Amazon fared better with only 24%. Pantheon had the lowest error rate at 14%, but all of them were unacceptably high, along with increased response times.

2. Blitz.io

Test 1-2000 Concurrent Users over 60 seconds

Blitz Test Quick Results Table

Company Hits Errors Timeouts Average Hits/Second Average Response Time Fastest Response Slowest Response
BlueHost 28901 714 2710 482 654 185 1562
CloudWays Amazon 55678 906 0 928 24 3 106
CloudWays Google 38278 16248 158 638 102 83 226
Kinsta 54273 7 0 905 84 83 86
LightningBase 54946 0 0 916 71 71 73
LiquidWeb 54574 0 4 910 78 77 82
Media Temple 44598 442 85 743 261 195 614
Pagely 57828 1 0 964 13 2 81
Pantheon 55499 0 0 925 61 60 64
Pressable 51781 0 0 863 135 134 136
Pressidium 57348 1 0 956 27 25 30
SiteGround 83437 0 0 1391 58 58 60

Discussion of Blitz Test 1 Results

This test simply checks whether the company is caching the front page and how well whatever caching system they have set up performs (generally this hits something like Varnish or Nginx).

I also mistakenly ran an extra thousand users against SiteGround (1-3000), but since they performed perfectly, I figured why not just leave it. The chance for random network timeouts is always there, and they got a perfect score, so I let them keep it. That's why their numbers look higher than everyone else's.

Who performed without any major issues?

Kinsta, LightningBase, LiquidWeb [Reviews], Pagely [Reviews], Pantheon, Pressable, Pressidium and SiteGround [Reviews] all handled this test without any serious issues.

Who had some minor issues?

MediaTemple [Reviews] had some minor issues with load starting to impact response times and some errors/timeouts at the end of the test.

CloudWays (Amazon) managed to keep the server up but started to lag around 35 seconds in with some errors at the very end.

Who had some major issues?

BlueHost [Reviews] and CloudWays (Google) both failed this test.

3. Uptime Monitoring

Both uptime monitoring solutions are third-party providers that offer free services; UptimeRobot's paid tier was used to monitor on a 1-minute interval. All the companies were monitored over approximately two months (May-June 2016).

Uptime Robot & StatusCake

Company StatusCake UptimeRobot
BlueHost 99.98 99.98
CloudWays Amazon 100 100
CloudWays Google 99.99 99.99
Kinsta 99.99 100
LightningBase 100 100
LiquidWeb 100 100
Media Temple 99.94 99.97
Pagely 100 100
Pantheon 100 100
Pressable 99.93 99.95
Pressidium 100 99.99
SiteGround 100 100

I can happily say every single company kept their servers up.

4. WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only.

Company WPT Dulles WPT Denver WPT LA WPT London WPT Frankfurt WPT South Africa
BlueHost 0.94 0.813 0.995 1.525 1.861 5.923
CloudWays Amazon 0.774 0.975 1.066 0.988 1.625 3.597
CloudWays Google 0.706 0.644 0.929 1.107 1.706 3.37
Kinsta 0.834 0.62 0.958 1.12 1.688 3.637
LightningBase 0.542 0.465 0.955 1.013 1.569 4.541
LiquidWeb 0.616 0.55 1.003 1.076 1.624 5.634
Media Temple 0.904 0.537 0.855 1.318 1.932 2.809
Pagely 0.808 0.542 1.04 1.137 1.675 5.583
Pantheon 0.856 0.508 0.955 1.051 1.704 5.628
Pressable 1.032 0.757 1.08 1.449 1.948 5.793
Pressidium 0.738 0.727 1.171 1.292 1.67 5.747
SiteGround 0.867 0.678 1.114 1.176 1.671 4.56
Company WPT Singapore WPT Shanghai WPT Japan WPT Sydney WPT Brazil
BlueHost 2.652 22.102 1.863 1.937 2.255
CloudWays Amazon 2.236 23.404 1.781 1.75 1.752
CloudWays Google 2.031 22.418 2.026 1.609 1.793
Kinsta 2.235 24.017 2.109 1.602 1.851
LightningBase 2.227 22.437 1.683 1.968 1.612
LiquidWeb 2.335 23.238 1.885 1.96 1.635
Media Temple 2.19 22.265 1.814 2.101 2.091
Pagely 2.415 23.124 1.914 2.103 1.943
Pantheon 2.093 25.209 1.781 1.975 1.804
Pressable 2.382 23.897 2.234 2.821 2.132
Pressidium 2.245 23.303 2.061 1.785 1.747
SiteGround 2.309 22.746 2.017 2.935 1.907

LightningBase put up the fastest individual score of any bracket this year in this test, with a blazingly fast 0.465 second average response in Denver. Other than that, nothing special here: all these companies seemed capable of delivering content quickly pretty much everywhere in the world except Shanghai.

5. WPPerformanceTester

Company PHP Bench [Seconds] (lower=faster) WP Bench [Queries Per Second](higher=faster)
BlueHost 11.655 713.78
CloudWays Amazon 10.993 324.99
CloudWays Google 11.192 327.33
Kinsta 11.333 318.47
LightningBase 10.537 1067.24
LiquidWeb 7.177 1084.6
Media Temple 13.9 98.85
Pagely 10.102 165.86
Pantheon 11.687 202.92
Pressable 10.952 492.61
Pressidium 10.749 240.67
SiteGround 11.522 1030.93

LiquidWeb put up one of the fastest scores on the PHP Bench at 7.177. Everyone else fell into the 10-14 range we generally see.

The WP Bench saw some slow scores from MediaTemple and Pagely, and a handful of hosts breaking the 1000 queries per second barrier: LightningBase, LiquidWeb, and SiteGround. Interestingly, WP Bench scores tend to get slower as you go up in price, as you get more non-local databases.

Conclusion

This is the last really crowded bracket as we go up in price. It sits right at the border of entry level plans and the more serious stuff. This is also the first tier where plans were tested more heavily than any plan was last year. The results were very encouraging.

Top Tier WordPress Hosting Performance


Kinsta, LightningBase, LiquidWeb [Reviews], Pressable, Pressidium and SiteGround [Reviews] all earned top tier WordPress Hosting status for the $51-100/month tier.

Honorable Mentions

MediaTemple [Reviews] and Pagely [Reviews] earn honorable mentions. They had some minor issues in the LoadStorm test and MediaTemple had some minor issues in the Blitz test.

Individual Host Analysis

BlueHost [Reviews]

BlueHost fell short again in the load tests.

CloudWays [Reviews] (Amazon + Google)

CloudWays is always interesting because you can compare head to head performance on different cloud platforms. I would pretty confidently say that Amazon outperformed Google in this instance with similar specs (although Amazon charges more).

Kinsta

Kinsta's entry level plan put on a fantastic performance. The higher end providers are starting to show up in this price tier and really showing why they charge their premium prices. Kinsta easily earned top tier status.

LightningBase

This was LightningBase's most expensive plan that we tested this year (although they offer higher ones), and for the third consecutive price tier (and year), they handled the tests flawlessly. A literally perfect score for LightningBase: 100% uptime on both monitors and 0 errors on all load tests. Simply perfection. Undoubtedly a top tier WordPress host.

LiquidWeb [Reviews]

LiquidWeb is a newcomer to this testing and this is their entry level plan. Boy, did they make a positive splash: 100% uptime across the board and excellent load testing scores. They also had the fastest PHP Bench in this bracket (and third fastest of any company this year). They have a fantastic reputation in Review Signal's reviews section, and I can confidently say they also have a top tier WordPress Hosting product to boot.

MediaTemple [Reviews]

Media Temple earned an honorable mention, which is a step in the right direction. They had some minor problems with the load tests. No major concerns; they just need to figure out the security issues and minor performance items to get back to top tier.

Pagely [Reviews]

Pagely was a bit of a disappointment. They've been in the top tier the past years but fell to an honorable mention this year. The increased LoadStorm test seemed to put some strain on the server and caused spikes and increased load times. Everything else looked very good like previous years.

Pantheon [Reviews]

Pantheon, like Pagely, struggled with the LoadStorm test, but to a larger degree this year. It knocked them out of the top tier, and they didn't even earn an honorable mention in this price bracket. Everything else looked very good.

Pressable

Pressable showed up in a big way. No problems in any of the tests. Zero errors on both load tests. Easily in the top tier for this price bracket.

Pressidium

One error, nearly perfect uptime. Hard to really expect a better performance. Pressidium's entry level plan remains in the top tier for another year.

SiteGround [Reviews]

I screwed up the Blitz load test and ran an extra thousand users against them, and they still got a perfect score, which is impressive. They had a small spike at the start of the LoadStorm test but otherwise put on a flawless performance, with 100% uptime on both monitors as well. SiteGround is in the top tier.

Under $25/Month WordPress Hosting Performance Benchmarks (2016)


Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

The full company list, product list, methodology, and notes can be found here

This post focuses only on the results of the testing in the <$25/month price bracket for WordPress Hosting.

 

<$25/Month WordPress Hosting Products

[Image: table of <$25/month WordPress Hosting products tested]

 

<$25/Month WordPress Hosting Performance Benchmarks Results

1. Load Storm

Test 500-2000 Concurrent Users over 30 Minutes, 10 Minutes at Peak

Company Total Requests Total Errors Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred (GB) Peak Throughput (MB/s) Average Throughput (MB/s)
A2 310069 203981 249.08 172.26 15138 549 4.639 8.853 2.577
BlueHost 181995 153234 147.47 101.11 16000 7634 1.066 3.677 0.592
DreamHost 295685 43 224.1 164.27 15063 339 16.06 13.5 8.922
FlyWheel 265618 81491 205.22 147.57 15101 1154 11.5 9.361 6.391
GoDaddy 311172 1363 238.68 172.87 10100 340 16.07 13.31 8.927
Hosting Agency (DE) 182424 117939 132.65 101.35 15991 6743 3.823 10.53 2.124
IWW 272657 84 217.92 151.48 10096 266 14.93 13.77 8.293
LightningBase 314439 5 238.68 174.69 8989 255 16.24 13.24 9.023
Media Temple 327662 1466 258.45 182.03 10628 381 12.55 10.54 6.972
Pressed 289318 61 214.05 160.73 15029 266 16.25 13.01 9.03
SiteGround 301722 1 230.45 167.62 9374 447 15.9 13.76 8.833
TrafficPlanetHosting 289335 476 217.63 160.74 15216 570 16.15 14.08 8.974
WP Land 293166 11596 228.4 162.87 15608 644 15.47 13.3 8.594

Discussion of Load Storm Test Results

The companies that clearly didn't struggle at all with LoadStorm were DreamHost [Reviews], Incendia Web Works (IWW), LightningBase, Pressed, and SiteGround [Reviews]. GoDaddy [Reviews], MediaTemple [Reviews] and Traffic Planet Hosting had minor spikes at the start, but they seem nearly inconsequential in the grand scheme of the test.

WP.land seemed to have some security measures which struggled with wp-login being hit so frequently.

A2 Hosting [Reviews], BlueHost [Reviews], FlyWheel [Reviews] and Hosting Agency did not do well on this test. FlyWheel explicitly stated this was too much load for that size plan and recommended upgrading if this was the expected load.

2. Blitz.io

Test 1-1000 Concurrent Users over 60 seconds

Blitz Test Quick Results Table

Company Hits Errors Timeouts Average Hits/Second Average Response Time Fastest Response Slowest Response
A2 590 27255 390 10 92 55 167
BlueHost 23340 71 274 389 214 155 604
DreamHost 29337 0 1 489 4 3 7
FlyWheel 28530 0 0 476 28 21 146
GoDaddy 15222 11093 28 254 196 190 229
Hosting Agency (DE) 662 20862 3649 11 630 400 1556
IWW 28786 9 0 480 23 21 24
LightningBase 27488 0 0 458 71 71 72
Media Temple 15255 11260 5 254 200 188 318
Pressed 26228 0 0 437 80 5 389
SiteGround 26055 1 21 434 100 72 346
TrafficPlanetHosting 1018 8344 9718 17 266 102 843
WP Land 28344 0 0 472 39 38 39

Discussion of Blitz Test 1 Results

This test simply checks whether the company is caching the front page and how well whatever caching system they have set up performs (generally this hits something like Varnish or Nginx).

Who performed without any major issues?

DreamHost, IWW, LightningBase, SiteGround, WP Land all handled the test without any issues.

Who had some minor issues?

BlueHost had a couple spikes during the test which caused some errors and timeouts, but they weren't substantial.

FlyWheel had a spike at the very end of the test which caused a large increase in response times.

Pressed started to have a ramp up in response times but it never errored or timed out during the test.

Who had some major issues?

GoDaddy, MediaTemple and TrafficPlanetHosting seemed to pretty clearly hit security measures which couldn't be worked around. The response times were relatively stable, but errors shot up which is symptomatic of a security measure kicking in rather than the server being taxed. It's hard to know how they would have performed sans security measures.

A2 and Hosting Agency did not take kindly to the Blitz test and crashed almost immediately under load.

3. Uptime Monitoring

Both uptime monitoring solutions are third-party providers that offer free services; UptimeRobot's paid tier was used to monitor on a 1-minute interval. All the companies were monitored over approximately two months (May-June 2016).

Uptime Robot & StatusCake

Company StatusCake UptimeRobot
A2 99.92 99.91
BlueHost 30.22 18.06
DreamHost 99.97 99.97
FlyWheel 99.96 99.98
GoDaddy 99.96 99.98
Hosting Agency (DE) - 100
IWW 99.73 99.88
LightningBase 99.99 100
Media Temple 99.96 99.95
Pressed 100 99.87
SiteGround 99.97 99.98
TrafficPlanetHosting 99.98 99.98
WP Land 99.92 100

BlueHost screwed up and cancelled this account mid-testing, causing the uptime to look horrific. Their other two plans, which were not cancelled, had measurements of 99.98, 99.98, 100 and 99.99 uptime. I'm upset that it happened, there was a struggle to restore the account, and I have to take credit away for this type of screw up. But they were able to keep the other servers up with near perfect uptime, which I think should be stated here as well.

Hosting Agency for some reason couldn't be monitored by StatusCake (an http/2 issue StatusCake still hasn't fixed after nearly 9 months, which UptimeRobot fixed within 24 hours of me notifying them). But they had 100% on UptimeRobot, so it looks good.

IWW had a bunch of short outages and one longer one (2hr 33m) which brought its uptime down.

Pressed had a 1hr 51m downtime (502 error) recorded by UptimeRobot, but StatusCake never picked it up. I'm not sure what to make of that; it might be something wrong with UptimeRobot's servers connecting properly, since StatusCake never noticed an outage over an interval that long.

Everyone else had above 99.9% uptime.

4. WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only.

Company WPT Dulles WPT Denver WPT LA WPT London WPT Frankfurt WPT South Africa
A2 0.819 0.638 1.109 1.181 1.687 5.054
BlueHost 0.902 0.521 0.878 1.532 1.874 3.483
DreamHost 0.769 0.777 1.444 1.107 1.64 4.33
FlyWheel 0.74 0.722 1.077 1.082 1.649 5.241
GoDaddy 0.939 0.728 0.834 1.376 1.992 6.909
Hosting Agency (DE) 1.299 1.258 2.17 0.985 1.55 4.905
IWW 0.544 0.658 0.864 0.929 1.416 4.105
LightningBase 0.62 0.598 1.078 0.95 1.471 5.764
Media Temple 0.86 0.667 0.811 1.313 1.945 4.645
Pressed 0.773 0.902 1.276 1.176 1.691 4.845
SiteGround 0.741 0.64 1.048 1.06 1.721 4.94
TrafficPlanetHosting 0.793 0.562 1.26 1.212 1.723 3.522
WP Land 0.719 0.689 1.154 1.099 1.709 4.8

 

Company WPT Singapore WPT Shanghai WPT Japan WPT Sydney WPT Brazil
A2 2.244 22.287 1.974 2.003 1.895
BlueHost 2.255 22.728 1.809 1.467 2.274
DreamHost 1.93 22.186 2.028 1.954 1.747
FlyWheel 1.765 12.549 1.845 1.816 1.758
GoDaddy 2.173 22.373 1.826 1.959 2.103
Hosting Agency (DE) 2.311 22.406 2.651 2.772 2.596
IWW 1.98 22.547 1.615 1.96 1.535
LightningBase 1.999 19.731 1.708 1.913 1.661
Media Temple 2.113 22.141 1.802 1.959 2.135
Pressed 2.233 23.691 1.997 2.037 1.894
SiteGround 2.131 22.718 1.843 2.079 1.788
TrafficPlanetHosting 2.081 22.74 1.872 1.595 1.816
WP Land 2.25 22.305 1.852 1.959 1.752

What I learned was that getting traffic into China is terrible; nobody really did well from the Shanghai location. South Africa is also really slow. Most servers were US based but were delivering content to most corners of the world in about 2 seconds or less, which is impressive. Hosting Agency, based in Germany, was a bit disappointing: relatively slow to the US, and not even the fastest to London or Frankfurt. LightningBase and IWW were able to beat the German company by a large margin in the US and to Europe, which reinforces that geographic location isn't everything in terms of speed.

I wish I could compare averages against last year, but they removed one of the testing locations (Miami), and I did a global test instead because that was something people wanted to see.

5. WPPerformanceTester

Company PHP Bench [Seconds] (lower=faster) WP Bench [Queries Per Second](higher=faster)
A2 12.626 570.78
BlueHost 13.089 1083.42
DreamHost 17.104 446.23
FlyWheel 11.761 387.3
GoDaddy 13.804 278.47
Hosting Agency (DE) 6.501 45.28
IWW 7.637 1869.16
LightningBase 10 1315.79
Media Temple 12.241 339.79
Pressed 11.036 217.2
SiteGround 11.497 733.14
TrafficPlanetHosting 8.666 918.27
WP Land 14.485 684.93

What was enormously interesting about the WPPerformanceTester results this year was the much larger spread and faster results. Last year, almost everyone was around 10-14 seconds for the PHP Bench, with the outliers of PressLabs doing 8.9 and DreamHost at 27. DreamHost again has the dubious honor of the slowest PHP Bench, but it improved by a whopping 10 seconds, down to 17. The fastest was Hosting Agency at 6.5, more than a full 2 seconds faster than last year's fastest speed. IWW and TrafficPlanetHosting also managed sub-10-second speeds.

Last year's fastest WP Bench was 889 queries per second. That was blown away by this year's testing, with IWW leading the group at more than double the speed (1869). BlueHost, LightningBase and TrafficPlanetHosting all managed to beat last year's fastest benchmark as well. Unfortunately, Hosting Agency's incredibly fast PHP Bench is somewhat cancelled out by their WP Bench score, the slowest this year and slower than last year's slowest. It should be noted that transaction speed isn't always a great measure on distributed/clustered/cloud systems that may be running databases on different machines, but at the entry level that's less of an issue. Generally, the incredibly fast scores you see come from local databases with no network latency overhead.

Conclusion

It is nice to get back to a real entry level analysis with a much more level playing field. Having 13 different companies to choose from in the <$25/month range is fantastic. Despite the change in this year's format, the lower end plans still outperformed the fastest competitors from last year's tests, which included plans up to ~$300/month.

Despite the hard price cap in this bracket of testing, there were still some companies that handled all the tests without any serious issue. Many more did very well but ran into minor issues.

The number of companies jumping into the space is a fantastic win for consumers. In this tier we saw A2, Pressed, WP Land, Hosting Agency, IWW and Traffic Planet Hosting all enter for the first time. They target a variety of different niches within the space, and overall it's a win for us, the consumers, to have more good choices and options. From a performance standpoint, you can still get amazing performance value for the money even at the lowest tier.

Without further ado, I will tell you who had the best performance, who deserved an honorable mention and then analyze each host individually. I still don't believe in ranking in any particular order, only grouping companies by how well they performed.

Top Tier WordPress Hosting Performance


DreamHost [Reviews], LightningBase, and SiteGround [Reviews].

All three of these companies went through the full testing without any meaningful issues.

Honorable Mentions

Pressed had an odd uptime issue but also showed some signs of server stress during the Blitz test. For a brand new company they performed admirably, but I'm not quite comfortable awarding them top tier status yet when comparing their results against the three top tier companies. Still, they put on a very good showing.

WP.land did well in every test except LoadStorm, where it had a roughly 4% error rate. It looked like a security issue with wp-login, which isn't uncommon, but there were also some spikes/delays. It could just be security acting up, but it was a minor issue that kept it out of the top tier. It was still worthy of an honorable mention from yet another newcomer to this year's testing.

GoDaddy [Reviews]/MediaTemple [Reviews]: I combine these because they run on the same tech, the results look very similar, and both experienced the same security issues. You can pretty clearly see when the security measures kick in on Blitz, and I wasn't able to work with their tech team to come up with a way to responsibly bypass them. LoadStorm had a spike at the start with wp-login issues, but it resolved quickly and had a flat response time graph. It's possible their tech is just as good as the top tier hosts, but I wasn't able to accurately measure it because of the security measures. It looks very good and at least deserves the honorable mention.

Traffic Planet Hosting is another new entrant and had similar issues to GoDaddy/MediaTemple. Security issues caused some problems on the Blitz test, but it did start to show some load too. Not perfect, but it did well on LoadStorm as well.

Individual Host Analysis

A2 Hosting [Reviews]

A2 Hosting was a new entrant to this test, and as much as I love the competition in the space, A2 fell short. Other than their uptime monitoring, which was good, they struggled in all the load testing experiments.

BlueHost [Reviews]

BlueHost messed up my account during this test, and the uptime was terrible because of it. That alone ruined the uptime test, although as I stated in that section, the other servers, which were on different accounts, all maintained excellent uptime. They did OK in the Blitz test, but not in the LoadStorm test. They also surprisingly managed the fastest individual WebPageTest score of any host in this price range. Compared to last year, I don't see any huge signs of improvement with regards to performance.

DreamHost [Reviews]

Last year DreamHost's DreamPress product almost made the top tier except for some major downtime issues. This year, they had no such downtime issues and the performance remained top notch. DreamHost earned the top tier status for the <$25/month price bracket. It appears to be an excellent product priced very competitively.

FlyWheel [Reviews]

FlyWheel only entered one product this year, and it was less powerful than last year's. It struggled a bit more on the LoadStorm test, but the Blitz result was perfect (although for this price tier, Blitz was a weaker test than last year's). They explicitly stated that the plan was inappropriate for LoadStorm's level of traffic. They can probably handle bigger sites, but if we're comparing dollars to performance, they fell short in this price bracket on that metric. They are still rated as the most well liked company that we track at Review Signal, so they are clearly doing something right in terms of product and customer service.

GoDaddy [Reviews]

GoDaddy had a stalwart performance marred by what appeared to be security measures. They very well could have a top notch product but we couldn't work out a responsible way to bypass the security measures for the Blitz load test. LoadStorm looked pretty good, one small spike to start and steady up to 2000 users. GoDaddy earned an honorable mention status because the product didn't seem to encounter any non-artificial problems.

Incendia Web Works

IWW did a great job in both load tests. The only concern was uptime, where IWW had 99.73% and 99.88% as recorded by each service. The performance component is definitely there, but a little more consistency and we have another serious competitor in the space. The only reason they didn't earn honorable mention while Pressed did is that there were conflicting uptime reports for Pressed where one showed 100% and the other recorded sub 99.9% uptime. Two independent services showed IWW below 99.9%, so there isn't much doubt about it in my mind. Like DreamHost last year, they put on a great performance showing and I hope next year the servers are a bit more stable and I can award top tier status.

LightningBase

LightningBase continues to impress. The last two years they've put on consistently near perfect tests. Their Blitz result was perfect and their LoadStorm had only 5 errors out of 314439 requests. Combined with 100/99.99% uptime monitors, LightningBase is unquestionably in the top tier for the <$25/month WordPress hosting bracket.

MediaTemple [Reviews]

MediaTemple's results basically mirrored GoDaddy's. It would be hard to tell the graphs apart even if you removed the names. The MediaTemple/GoDaddy platform appears to be very solid, but we couldn't responsibly get past some security measures, so I couldn't award it top tier status; MT earned an honorable mention.

Pressed

Pressed earned itself an honorable mention. It had a weird uptime issue, but more importantly it started to show some signs of load during the Blitz test, where I would expect a flat response time from a static cache test. It's a very new product and I'm sure we'll continue to see tremendous improvements as time goes on; a very good performance from possibly the newest company in this year's testing.

Hosting Agency

Hosting Agency performed as expected; it appears to have no special WordPress optimizations. If you were to install a basic LAMP stack, this is the performance I would expect out of the box. They had perfect uptime and oddly found themselves on both ends of the spectrum in WPPerformanceTester. They weren't faster to England or Germany on WebPageTest, which I suspect is because there were no special caching technologies to accelerate delivery of pages despite being geographically closer. And they simply collapsed during the load tests, especially Blitz, which is essentially a static cache test (where they have no cache). Another important note is that their entire system is in German only.

SiteGround [Reviews]

SiteGround got even better this year. They jumped up from honorable mention to top tier status. Their Blitz and LoadStorm tests both improved while everything else remained at a high level. An all around fantastic performance which deserved top tier status.

Traffic Planet Hosting

Another newcomer to this year's testing. TPH put on a good show; there seemed to be some security measures which ruined the Blitz testing, but the LoadStorm test looked very solid. They earned an honorable mention because the only issue seemed artificial. I'm less confident about the quality of the product than GoDaddy/MediaTemple's, but it still seemed to warrant recognition.

WP.land

WP.land was the final new entrant and they put on a fantastic showing. Everything went near perfectly except the LoadStorm test, which seemed to have an issue with wp-login triggering some security measures. But the response rate was pretty stable and quick despite the ramp up to 2000 users. They also had a perfect Blitz test with no errors and a 1ms spread between fastest and slowest response times. WP.land earned honorable mention status because overall it was a very good performance with a small issue that might be security related.

 

WordPress Hosting Performance Benchmarks (2016)


Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

2018 WordPress Hosting Performance Benchmarks is now live.

This is the fourth round of managed WordPress web hosting performance testing. You can see the original, the 2014 version, and the 2015 version.

Companies Tested

A2 Hosting [Reviews]
BlueHost [Reviews]
CloudWays [Reviews]
Conetix
DreamHost [Reviews]
FlyWheel [Reviews]
GoDaddy [Reviews]
Incendia Web Works
Kinsta
LightningBase
LiquidWeb [Reviews]
MediaTemple [Reviews]
Pagely [Reviews]
Pantheon [Reviews]
Pressable (Formerly ZippyKid)
Pressed.net
Pressidium
Pressjitsu
PressLabs
Hosting Agency (German)
SiteGround [Reviews]
Traffic Planet Hosting
WordPress.com VIP
WPEngine [Reviews]
WP.land
WPOven.com

Companies that didn't participate this round but did in previous rounds: WebHostingBuzz, WPPronto, Nexcess, A Small Orange [Reviews] and WebSynthesis [Reviews].

Every plan was donated by the company for testing purposes with the strict stipulation that it would be the same as if any normal user signed up. There is a notes section at the bottom that details the minutiae of changes made to plans at the end of this post. Nearly every single company had security issues that I had to get around, so they worked to make sure my testing went through properly. Load testing often looks like an attack and it's the only way I can do these tests.

The Products

This year is a bit different than years past where every company and plan competed against one another. When I started the price gap was from $5/month to $29/month. Last year the gap was $5.95 to $299. I was only testing entry level plans but the market has dramatically changed since I first got started. Today, there is demand at many different price points and lots of companies have gone upscale with WordPress.com VIP at the top of the price bracket starting at $5,000/month. The only logical way to break things up was by price brackets. So below you will see the brackets and which companies participated. Specific details will be included on each bracket's write up.

 

<$25/month: A2 Hosting, Bluehost, DreamHost, Flywheel, GoDaddy, Incendia Web Works, Lightning Base, Media Temple, Pressed, Hosting Agency.de, SiteGround, Traffic Planet Hosting, WP.land

$25-50/month: A2 Hosting, Conetix, Lightning Base, Pantheon, Pressable, Pressjitsu, SiteGround, WP Engine, WP.land, Cloudways (DigitalOcean), Cloudways (Vultr), WPOven

$51-100/month: LiquidWeb, Bluehost, Cloudways (AWS), Cloudways (Google), Kinsta, Lightning Base, Media Temple, Pagely, Pantheon, Pressable, Pressidium, SiteGround

$101-200/month: A2 Hosting, Bluehost, Conetix, Kinsta, Liquid Web, Pressable, Pressidium, Pressjitsu

$201-500/month: Kinsta, Media Temple, Pagely, Pantheon, Pressable, Pressidium, Presslabs, SiteGround

$500+/month: Kinsta, Pagely, Pantheon, Pressable, Pressidium, WordPress.com VIP, WP Engine

 

Methodology

The question I tried to answer is how well do these WordPress hosting services perform? I tested each company on two distinct measures of performance: peak performance and consistency. I've also included a compute and database benchmark based on a WordPress plugin.

All tests were performed on an identical WordPress dummy website with the same plugins, except in cases where hosts added extra plugins. Each site was monitored for approximately two months to measure consistency.

1. LoadStorm

LoadStorm was kind enough to give me resources to perform load testing on their platform, and multiple staff members were involved in designing and running the tests against these WordPress hosts. I created identical scripts for each host that load the site, log in to it and browse it. Logged-in users were used to break some of the caching and better simulate real user load. The number of users varied by plan cost.
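The actual scenarios were built inside LoadStorm's platform, but as a rough sketch of what a single logged-in virtual user does, the PHP/cURL script below loads the login page, authenticates, and then browses a few pages. The hostname, credentials and paths are hypothetical placeholders, not the sites actually tested.

<?php
// Rough sketch of the logged-in "browse" scenario each virtual user ran.
// The real scripts were built inside LoadStorm's platform; this is only
// an illustration. Hostname, credentials and paths are placeholders.

$base    = 'https://example-host-under-test.com';
$user    = 'loadtest-user';
$pass    = 'loadtest-pass';
$cookies = tempnam(sys_get_temp_dir(), 'wpcookies');

function fetch(string $url, string $cookies, ?array $post = null): void {
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_COOKIEJAR      => $cookies,
        CURLOPT_COOKIEFILE     => $cookies,
    ]);
    if ($post !== null) {
        curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($post));
    }
    $start = microtime(true);
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    printf("%-55s %d  %.3fs\n", $url, $code, microtime(true) - $start);
}

// 1. Hit the login page first so WordPress sets its test cookie.
fetch("$base/wp-login.php", $cookies);

// 2. Log in; a logged-in session bypasses most full-page caches.
fetch("$base/wp-login.php", $cookies, [
    'log'        => $user,
    'pwd'        => $pass,
    'wp-submit'  => 'Log In',
    'testcookie' => '1',
]);

// 3. Browse a few pages as that logged-in user.
foreach (['/', '/?p=1', '/?page_id=2'] as $path) {
    fetch($base . $path, $cookies);
}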

2. Blitz.io

I used Blitz again to compare against previous results. It tests the static caching of the homepage. This time I scaled the number of users based on monthly cost.
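Blitz ran the real thing at scale from multiple regions, but conceptually it boils down to firing many concurrent anonymous requests at the homepage and seeing whether the static/full-page cache holds up. Here is a minimal PHP sketch of that idea; it fires one small burst against a hypothetical hostname rather than ramping users the way Blitz does.

<?php
// Minimal illustration of a concurrent, anonymous homepage burst.
// Blitz ramps thousands of users over time from multiple regions;
// this only fires one small local burst. Hostname is a placeholder.

$url         = 'https://example-host-under-test.com/';
$concurrency = 25;

$mh      = curl_multi_init();
$handles = [];

for ($i = 0; $i < $concurrency; $i++) {
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
    ]);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all transfers to completion.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);
    }
} while ($running && $status === CURLM_OK);

// Summarize status codes and per-request timings.
$errors = 0;
$times  = [];
foreach ($handles as $ch) {
    if (curl_getinfo($ch, CURLINFO_HTTP_CODE) !== 200) {
        $errors++;
    }
    $times[] = curl_getinfo($ch, CURLINFO_TOTAL_TIME);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);

printf("%d requests, %d errors, fastest %.3fs, slowest %.3fs\n",
    $concurrency, $errors, min($times), max($times));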

3. Uptime (UptimeRobot and StatusCake)

Consistency matters. I wanted to see how well these companies performed over a longer period of time, so I used two separate uptime monitoring services over the course of a month.

4. WebPageTest.org

WebPageTest with 9 runs, first view only, on a native connection. I tested from Dulles, Denver, Los Angeles, London, Frankfurt, South Africa, Singapore, Shanghai, Japan, Sydney and Brazil.
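For anyone who wants to reproduce this kind of run, WebPageTest exposes an HTTP API. The sketch below submits a test roughly matching the settings above (9 runs, first view only). The API key, test URL and location label are placeholders/assumptions, and the field names follow the classic runtest.php API, so verify them against the current WebPageTest documentation.

<?php
// Sketch of kicking off a WebPageTest run over its HTTP API.
// API key, test URL and location label are placeholders; field names
// follow the classic runtest.php API and should be double-checked.

$apiKey   = 'YOUR_WPT_API_KEY';
$testUrl  = 'https://example-host-under-test.com/';
$location = 'Dulles:Chrome'; // assumed location label

$endpoint = 'https://www.webpagetest.org/runtest.php?' . http_build_query([
    'url'      => $testUrl,
    'k'        => $apiKey,
    'runs'     => 9,
    'fvonly'   => 1,      // first view only
    'location' => $location,
    'f'        => 'json', // ask for a JSON response
]);

$response = json_decode(file_get_contents($endpoint), true);

if (isset($response['data']['testId'])) {
    echo "Test submitted: {$response['data']['testId']}\n";
    // The response also includes URLs where results appear once the run finishes.
    echo 'Results: ' . ($response['data']['userUrl'] ?? '(see full response)') . "\n";
} else {
    echo "Submission failed:\n" . print_r($response, true);
}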

5. WPPerformanceTester (free plugin on WordPress.org)

I created a WordPress plugin to benchmark CPU, MySQL and WordPress database performance. The CPU/MySQL benchmarks test raw compute power. The WordPress component actually calls $wpdb and executes insert, select, update and delete queries.
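This is not the plugin's actual code, but a minimal sketch of what benchmarking $wpdb CRUD looks like: time a loop of insert, select, update and delete queries against a throwaway table. The table name, round count and query-string trigger are arbitrary choices for illustration; only run something like this on a disposable site.

<?php
/**
 * Plugin Name: WPDB CRUD Benchmark (sketch)
 *
 * Not the WPPerformanceTester code, just an illustration of timing
 * insert/select/update/delete queries through $wpdb. Drop it in
 * wp-content/mu-plugins/ on a throwaway site and load any admin page
 * with ?run_wpdb_benchmark=1 appended.
 */

defined('ABSPATH') || exit;

add_action('admin_init', function () {
    if (!isset($_GET['run_wpdb_benchmark'])) {
        return; // only run when explicitly requested
    }

    global $wpdb;

    $table  = $wpdb->prefix . 'benchmark_tmp'; // hypothetical table name
    $rounds = 250;

    $wpdb->query("CREATE TABLE IF NOT EXISTS {$table} (
        id BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
        name VARCHAR(64) NOT NULL,
        value TEXT NOT NULL,
        PRIMARY KEY (id)
    )");

    $start = microtime(true);

    for ($i = 0; $i < $rounds; $i++) {
        // INSERT
        $wpdb->insert($table, ['name' => "key_{$i}", 'value' => str_repeat('x', 128)]);
        $id = $wpdb->insert_id;

        // SELECT
        $wpdb->get_row($wpdb->prepare("SELECT * FROM {$table} WHERE id = %d", $id));

        // UPDATE
        $wpdb->update($table, ['value' => 'updated'], ['id' => $id]);

        // DELETE
        $wpdb->delete($table, ['id' => $id]);
    }

    $elapsed = microtime(true) - $start;

    $wpdb->query("DROP TABLE IF EXISTS {$table}");

    error_log(sprintf('wpdb CRUD benchmark: %d rounds in %.3fs (%.1f queries/sec)',
        $rounds, $elapsed, ($rounds * 4) / $elapsed));
});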

 

Notes - Changes made to Hosting Plans

A2 - VPS servers can't install WordPress out of the box without paying extra for Softaculous. Disabled reCAPTCHA.

Conetix - disabled WordFence and Stream plugins.

SiteGround - fully enabled the SuperCacher plugin

GoDaddy - the 24 database connection limit is increased if you notify them of heavy load

CloudWays - disabled WordFence

WordPress.org Removes BlueHost + SiteGround and Relists Within Days

This story gets stranger.

You've probably read my previous coverage: WordPress.org Updates Hosting Recommendations, Nobody Knows Selection Criteria and WordPress.org Hosting Recommendations Listing Criteria.

BlueHost and SiteGround were removed May 27 or 28.

 

There was a rumor that it was about modifying config files, but that was denied by Hristo Pandjarov (SiteGround employee), who wrote, "we don't do nothing of the sort, the wp-config file as well as the entire installation is default including the themes. We have a plugin if you want to use the SuperCacher. We're still checking out why we've been removed from that page but that's definitelly[sic] not the case."

BlueHost and SiteGround reappeared today (June 1, 2016).

I reached out to Matt Mullenweg for a comment and was told:

"Temporary issues may cause recommended hosts to change at any time, but given the long track record of both I expect they'll be back. We might also try out different presentations and layouts there in the future."

So essentially, there's no new information, and even the web hosts themselves seem to be in the dark about what's going on.

People celebrated the removal of BlueHost and were shocked by the removal of SiteGround. But these erratic listing changes are exactly why transparency needs to come to the page. It's worth millions of dollars in business and is influential in one of the largest internet communities, yet nobody knows or understands what's really going on.

WordPress.org Hosting Recommendations Listing Criteria

UPDATE (5/13/2016 7:05 PM): Official comment from Matt Mullenweg was posted. Quoted below, click or scroll to the comment section to see the original.

“I would like to see some transparency in the process”

As stated on the page, the listing is completely arbitrary. The process was: There was a survey, four applicants were chosen, and the page was updated. That might repeat later in the year, or the process might change.

“how criteria are weighted”

There is no criteria or weighting. It ultimately is one person’s opinion. Information that is important is reflected in the questions asked in the survey, but that is not everything that is taken into account. (I have looked at this site in the past, for example.)

“who is making the decisions”

I am. James helped in sorting through the many submissions that came in, doing additional research, and digging on finalists, but ultimately the decision was mine. You can and should blame me fully for any issues you have with it. I appreciate James’ help in this go-round, but he will not be involved at all with any future updates. (So, please leave him alone.)

“how much money is involved”

There was no money involved. Obviously being listed on the page is hugely valuable and impacts the listed (or unlisted) businesses a great deal. This is why I take full responsibility for the listing, now and in the future — I have been fortunate to be extraordinarily successful and no financial or business consideration any of the applicants could offer matters to me. A host could offer $100,000,000 to be listed on the page for 1 day, and I would say no.

-Matt Mullenweg


Yesterday, I posted WordPress.org Updates Hosting Recommendations, Nobody Knows Selection Criteria, which naturally meant I was going to find out as much as I could about the process, because it's a big deal and my mission here at Review Signal is honest and transparent web hosting reviews.

I confirmed with multiple sources that the newly listed companies didn't pay any money to get listed. Everyone seems to have filled out the form and then heard nothing back until the updated page was published yesterday. Both the winners (BlueHost [Reviews], DreamHost [Reviews], FlyWheel [Reviews], SiteGround [Reviews]) and the losers (everyone else) that I talked to described the same process.

Great. The application process seems fair.

But the selection process is still a black box. With help from people who follow WordPress more closely than I do, I found James Huff (macmanx), a 12-year volunteer and 5-year Automattic employee who was directly involved with the new WP.org hosting recommendations.

[Screenshot: direct messages with James Huff, part 1]

I didn't hide who I was or my interest. The most concerning part of this exchange was that 'Absolutely no money changed hands, unless you consider sponsorship of WordCamps as monetary with regards to the "contributions to WordPress.org."'

No money changed hands except a lot of sponsorship dollars to the organization. Guess who the top global Gold community sponsors are? BlueHost (and Jetpack/WooCommerce, both owned by Automattic). Somehow BlueHost is also a Silver sponsor, along with GoDaddy. BlueHost is pouring a lot of money into WordCamps and the WordPress.org Foundation.

I'm sorry, but I do consider that money changing hands. They are giving a large sum of money - it's material enough to mention in their SEC filings.

[Screenshot: direct messages with James Huff, part 2]

We're still going to have to agree to disagree about what money changing hands means. He says it was fair, but fair is pretty meaningless when we don't have any insight into what standard of fairness is the goal, or how each criterion is being weighed and evaluated. Yet this is the list of hosts they can confidently tell everyone are good.

I'm not sold.

[Screenshot: direct messages with James Huff, part 3]

Historical perception seems to be a proxy for what marketers might call Net Promoter Score (NPS): how much consumers like and recommend something. That's essentially what I measure here at Review Signal, and my data has been incredibly close to what companies' internal data shows (LiquidWeb NPS Scores vs LiquidWeb Review Signal Rating).

It is arguably the most important factor in recommendations, and for service businesses it's about the best all-encompassing quality metric available.

But it's only part of the criteria, and that's fair. Still, should there be some minimum threshold? Can a company that scores a zero on quality but high on everything else be worthy of a listing? BlueHost's rating is 41%, which means roughly 6 out of 10 people don't recommend it or have anything good to say about them.

There are WordCamp sponsors that didn't make the cut. Of the three global community sponsors that are hosting companies, two did (BlueHost and DreamHost) while one didn't (GoDaddy). And the largest sponsor, BlueHost, not only made it but sits at the top of the list.

Moving on, James mentioned that Automattic has no role in the process, but he does wear multiple hats, which means he is aware of the potential perception of a conflict of interest.

[Screenshot: direct messages with James Huff, part 4]

Finally, a mention of Matt, which is important again when thinking about the context for potential conflicts of interest. I outlined what would happen in a dream world and what's realistic. I think honest disclosure and basic transparency are perfectly realistic. It's ok to make money; just be clear about where it's coming from. That's a standard I try to uphold here at Review Signal: see how we make money and read the entire process for how our rankings are calculated. See? It's not hard, and I still make money giving the best information available.

[Screenshot: direct messages with James Huff, part 5]

[Screenshot: AWP comment thread]

That is the comment thread I referenced. Not a single person said anything positive about BlueHost, and the assumption is that they simply paid to be listed. BlueHost being listed ruins the credibility of the recommendations when there is no transparency about what criteria were used.

[Screenshot: direct messages with James Huff, part 6]

Moving on, the survey itself has issues, which I brought up before. It asks for sensitive company information and is being handled by employees of a company that owns two competitors in the space (WordPress.com VIP, Pressable), took $15 million in investment from another (BlueHost), and is an investor in a fourth competitor (WP Engine).

That seems like a huge potential conflict of interest and I know it dissuaded at least one company from even applying.

[Screenshot: direct messages with James Huff, part 7]

[Screenshot: three tweets from James Huff]

It didn't end on the nicest note; I don't think James took my criticisms well. From his original messages, I think he knows and understands the perception of conflicts of interest, but admitting them in this context puts him in a very awkward position that I don't envy. He wears multiple hats and surely wants to wear them all fairly. Admitting that those multiple hats carry the potential for conflicts of interest isn't a weakness of character; it's an admission of humanity. I'm sure James is a great guy and has done a lot of good things for the community. But I think people who can be perceived as having a strong potential conflict of interest (which anyone connected to Automattic in this situation would have) shouldn't be managing this particular process.

I truly don't have any ill will towards James personally or Automattic. Even with BlueHost/EIG, I've been more than willing to give them the benefit of the doubt and continue to hope that they will get better (ASO did break my heart a bit; I thought they were turning EIG around). My data continues to show them being mediocre, with a seeming touch of death in terms of quality (their strategy does seem to be cost cutting and economies of scale). But I don't fault them for their behavior; I expect it, and it's well documented in their SEC filings.

Conclusion

I still think WordPress.org can do better with its hosting recommendations, and I'm not going to stop advocating until they are better. I would like to see some transparency in the process: how criteria are weighted, who is making the decisions and how much money is involved. I think the companies that applied would appreciate feedback about why they weren't selected and what makes them fall short of the companies that did make the cut. Or just call the listings Ads / Sponsors. Don't say they are the best and brightest and endorse them; say "we took money and this guy paid us the most." At least that meets the minimum threshold of honesty and transparency.

 

References

For posterity, the logs are available in their entirety below. It's long, so in the post above I cut some of it down to get to the most important bits, but I don't want to hide anything.

[Direct message archive with macmanx on the Making WordPress Slack, parts 1 and 2]

WordPress.org Updates Hosting Recommendations, Nobody Knows Selection Criteria

I've railed before about Drupal and WordPress Have Sold Us Out in terms of hosting recommendations. We've been waiting a long time (around a year now?) for WordPress.org to do its revamp of its hosting recommendation page.

The Winners

BlueHost, DreamHost, FlyWheel, SiteGround

I'm not shocked at all to see that BlueHost somehow still manages to be at the very top (albeit the list is alphabetical). They've continuously survived being listed; I guess that's what a million dollars will do.

Where is the transparency?

They requested hosts submit a ridiculous amount of private information. You can see the full survey below:

[Embedded document: 2016 WordPress Hosting Survey - WordPress.org]

It asks some deeply private questions, like the number of employees, how many 30-day active paying customers you have, and how many net paying customers you are gaining or losing each month.

Mind you, as far as anyone can tell, Matt has complete control over who shows up, and Automattic bought a majority stake in a company competing in the WordPress hosting space, Pressable. They also run WordPress.com VIP and are an investor in WP Engine. So some of the most secretive numbers a company competing in this space might have are potentially being disclosed to several of their biggest competitors through a process with no transparency and not even a person named as responsible for it.

That alone is worrisome for the process; it should definitely be run independently of Matt.

Everything else needs to be explained too. Who is responsible for this revamp? What were the selection criteria? How often will it be updated? Will existing companies be continuously re-evaluated?

wordpress_org_listing

It's not clear who 'we' is. They say the listing is arbitrary but then add criteria. I'm not sure they understand what arbitrary means. Or maybe they simply ignore the criteria they mention. Maybe it's just a terrible joke, like the process (or lack thereof) that seems to be in place.

A lot of it is pretty subjective: design, tone, ease of WP auto-install, historical perception? BlueHost, which has consistently been pretty poorly reviewed (along with just about all EIG brands) and continues a downward trend, is still listed.

[Chart: BlueHost Review Signal rating, April 2016]

Furthermore, it's the same criteria that have been on the page since at least 2010.

So maybe saying it's arbitrary gives them an escape to list whomever they want, especially considering the money involved.

Newly Listed Companies

I tried to find some explanation for how the three new companies were selected, but there really isn't much to go on. DreamHost is a Silver community sponsor for WordCamp, but so is GoDaddy, which did not make the cut.

FlyWheel only does WordPress, but DreamHost and SiteGround do a lot more.

DreamHost has a ton of forum threads on WordPress.org; SiteGround has only a few over 10 years; FlyWheel has one total.

I talked to someone at one of the newly listed hosting companies and they confirmed that the form was filled out and that was it; there was no financial consideration involved with the listing.

Which is very nice to hear, but doesn't really inspire confidence in the recommendations.

I've aired my concern with BlueHost multiple times.

But what about the new companies and their ratings?

DreamHost has a 59% rating on Review Signal, which is ok given the upper end of the shared hosting spectrum is SiteGround at 71%. FlyWheel, the specialized hosting company, has the highest rating of any company at a whopping 85%.

So the new companies are all far better than BlueHost (41%). But there are other very highly rated companies that didn't make the cut. For example, WP Engine (72%) is probably the biggest name not listed, based on size, brand in the WP community and rating at Review Signal.

Conclusion

I'm glad there are some much better companies than BlueHost listed, and at least one of them got there without paying for the privilege. There is still language about some hosts donating a portion of the fee back, which makes you think at least BlueHost still is.

I'm still unhappy with the lack of transparency in the entire process. The most influential page for people entering the WordPress community is recommending one very mediocre hosting company that has historically paid large sums to be listed and has a deep financial relationship with the person ultimately responsible for the recommendations. The revamp didn't change that.

I am disappointed and I don't expect to hear anything from WordPress.org/Matt clarifying the hosting page, again.

 

UPDATES

(5/13/2016)

There was a little discussion in the WordPress Slack. macmanx is James Huff, an Automattic employee. It seems they wanted only one managed WordPress host. Other details include around 100 applications. And even in the WordPress Slack, the first comment doubts that these are really the best hosts (well, one in particular, which almost everyone assumes to be BlueHost).

[Screenshots: James Huff's Slack comments on the hosting recommendations]

Free & Discount Web Hosting for Students

There are a handful of very good offers for students. The best package for student developers is easily the GitHub Student Pack: it includes offers from Amazon, DigitalOcean and Microsoft Azure along with a number of other free tools and services.

There are also a handful of companies that offer discounts and credits for students, which may be better suited for non-developers. Some of them (e.g. BlueHost) regularly offer cheaper public pricing during sales and promotions than their student offering, so the student deals may not be as good as they seem.

Company - Offer - Requirement
GitHub - Bonus Amazon AWS credits ($15), $50 DigitalOcean credit, Microsoft Azure credit - If you're a student aged 13+ and enrolled in a degree or diploma granting course of study, the GitHub Student Developer Pack is for you. All you need is a school-issued email address, valid student identification card, or other official proof of enrollment.
SiteGround - $1.99/month - .edu address
BlueHost - $4.95 - .edu address
InMotionHosting - 50% off - .edu address
Microsoft Azure - $200 credit - None
Kickassd - 6 months hosting - School email + ID

Bonus Offer

NameCheap for Education - free .ME domain for US, UK, Canadian and Australian universities.

 

Free & Discount Web Hosting for Educators, Teachers and Institutions

 

If you know of any deals that are missing, please comment or contact us so we can add them to this list.

Header image credit:  Icon made by http://www.flaticon.com/authors/freepik from www.flaticon.com