Category Archives: Web Hosting

The Rise and Fall of A Small Orange

If you're an unhappy A Small Orange customer and don't want to read why the quality went down, simply head over to our Web Hosting Reviews and find a better hosting company.

How did a small web hosting company have such a huge impact on Review Signal?

The Early Days

This story begins in October 2011, a year before Review Signal launched. Review Signal had been collecting data for months and early ratings data was starting to become meaningful. A tiny company was at the top of the rankings. A Small Orange.

The most worrisome part of this revelation was that A Small Orange did not have an affiliate program, which isn't a requirement at all for a listing on Review Signal.

However, after investing years of work, if the top-rated company ended up not having an affiliate program, the business was likely sunk before it even started. So I inquired early and heard back from the CEO at the time: “we don't have an affiliate program and at the moment, we have no plans for one.” This was a potential death knell, because the entire business model relies on making at least some money, even though I assumed my earnings would be far lower than those of competitors who simply sell their rankings to the highest bidder. But as any entrepreneur knows, almost everything is negotiable if you understand what the other person really wants and why. After talking further with the CEO, he explained his issue with web hosting review websites: “they typically have a pay for ranking sort of model and do it either through set rates or affiliate payouts. It varies. The economics at ASO don't really work out for a standard affiliate program.” A Small Orange didn't want to play the game that every other review site out there did: pay to play, quality be damned.

This CEO hated the games being played as much as I did.

That was all the opportunity I needed. Review Signal's mission has been to fight against that very same model, and I knew I had an early ally who could make this work. We ended up working out a deal where A Small Orange would pay out three months of whatever plan someone purchased, and he put a cap on my potential earnings at $250 before he would review the performance. Considering the most popular plans were $25/year and $5/month, this wasn't going to earn a lot, but at least it might start covering some of the very basic costs. The first month I earned $52.38 on 6 sales, for an average of $8.73 per sale with A Small Orange.
At least it was something. And a foot in the door was all I needed to prove this crazy idea called Review Signal might have some legs. A Small Orange opened that door and for that our histories will forever be intertwined.

The Good Times

The next few years were very good. I was their first affiliate. I was their biggest affiliate for many years, bringing in over a thousand new customers. I got to know many of the staff and would consider some of them friends. And A Small Orange continued to be the best rated shared hosting company through 2014. Everyone was happy - their customers, the company and Review Signal. I was happy to recommend them based on the data showing they had incredibly satisfied customers. I had people tell me personally they were very happy with them after signing up because of the data I publish here at Review Signal.


Free Swag and Annual Thank You Card from ASO

The EIG Acquisition

A Small Orange was quietly acquired in 2012 by a behemoth in the hosting industry called Endurance International Group (NASDAQ: EIGI), which owns dozens of brands, including some of the largest and best known hosting companies: BlueHost, HostGator, HostMonster, JustHost, Site5, iPage, Arvixe and more.

EIG has a very bad reputation in the web hosting world. If you ask most industry veterans, they will tell you to run for the hills when it comes to EIG. The oft-repeated story is that EIG acquires a hosting company, migrates it to their platform, and the quality of service falls off a cliff. The best example of this is perhaps the migration to their Provo, UT data center, which had a catastrophic outage in 2013. This outage was huge: the impact dropped four of EIG's largest brands many percentage points in the Review Signal rankings in a single day. And major outages have continued to happen, as recently as November 2015.

In a recent earnings call with shareholders, EIG CEO Hari Ravichandran talked about two recent acquisitions and their plans for them. “We expect to manage these businesses at breakeven to marginally profitable for the rest of the year as we migrate their subscriber bases onto our back-end platform. Once on platform, we expect to reach favorable economics and adjusted EBITDA contribution consistent with our previous framework for realizing synergies from acquisitions.”

The EIG Playbook

EIG's playbook has been to acquire web hosting brands, migrate them to their platform and 'reach favorable economics.' They've been doing it for years, and it seems to be working well enough for investors to continue to put money into the company. M&A to grow subscriber bases and economies of scale to lower costs: it's a very simple and straightforward business plan. It doesn't account for anything beyond spreadsheet math, though, such as brand value and customer loyalty. And according to all the data we've collected over the years across multiple acquired brands, those are certainly diminished or lost after an EIG acquisition. It's callous business accounting, but it makes perfect sense in the race-to-the-bottom industry that is commodity shared hosting.

Review Signal Rating Calculated Pos/(Pos+Neg), without duplicate filtering

You can see all the EIG brands tracked here on Review Signal in the chart above and their acquisition dates below:

iPage - 2009
BlueHost/HostMonster - 2010
JustHost - Feb 2011
NetFirms - March 2011
HostGator - June 2012
A Small Orange - July 2012
Arvixe - November 2014
Site5 - August 2015

You'll notice their ratings, in general, are not very good, with Site5 (their most recent acquisition) being the exception. iPage was acquired before I started tracking data. BlueHost/HostMonster also declined, although the data doesn't start pre-acquisition. JustHost collapses post-acquisition. NetFirms has remained consistently mediocre. HostGator collapses with a major outage a year after acquisition. Arvixe collapses a year after being acquired. Site5 is still very recent and hasn't shown any signs of decline yet.

The Expected Decline of A Small Orange

So nearly every industry veteran I talked to expected A Small Orange to collapse immediately after the acquisition. Except me. I was, am and will continue to be willing to give a company the benefit of the doubt until I am shown evidence.

For years post-acquisition, people were saying ASO's demise was right around the corner. For years, I waited for that evidence and for the prophecy to come true. But it didn't happen.

It often took EIG less than a year to ruin a brand. We don't have to look further than Arvixe, which was acquired in November 2014. Today, Arvixe has one of the lowest ratings of any company on Review Signal at a shockingly low 27%.

But A Small Orange continued to chug along. It didn't hear the naysayers or believe itself to be a victim of the EIG curse. Instead, ASO was the best shared host for years post-acquisition. It seemed to have a fair level of autonomy from the EIG conglomerate. The staff I knew there remained, and all indications showed it was still the same company.

Until it wasn't.

The Fall of A Small Orange

A Small Orange Historical Rating

The chart above shows Review Signal's rating of A Small Orange. The blue line is the rating as calculated by [Positive Reviews / (Positive Reviews + Negative Reviews)]. The red line calculates the rating from only the past 12 months of data. It's slightly different from Review Signal's actual calculation because, for quick analysis, I am not filtering out duplicates. The difference for A Small Orange is that when you remove the duplicates, the year 2015 had a 43% rating, indicating there were quite a few people writing multiple negative things about A Small Orange.
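As a rough illustration of the two lines, here is a minimal sketch, using made-up review data, of how a lifetime Pos/(Pos+Neg) rating and a trailing-12-month version of it can be computed; the real site also filters duplicate reviewers, which this sketch skips.

```python
from datetime import datetime, timedelta

# Hypothetical reviews: (date, is_positive) -- illustrative data only.
reviews = [
    (datetime(2013, 5, 1), True),
    (datetime(2014, 8, 17), True),
    (datetime(2015, 3, 2), False),
    (datetime(2015, 11, 20), False),
    (datetime(2016, 1, 4), False),
]

def rating(items):
    """Pos / (Pos + Neg), expressed as a percentage."""
    pos = sum(1 for _, good in items if good)
    neg = len(items) - pos
    return 100.0 * pos / (pos + neg) if items else None

now = datetime(2016, 1, 15)
lifetime = rating(reviews)  # the "blue line" number
trailing = rating([r for r in reviews if now - r[0] <= timedelta(days=365)])  # the "red line" number

print(f"lifetime: {lifetime:.0f}%, trailing 12 months: {trailing:.0f}%")
```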

Sometime in 2015, the A Small Orange that thousands of people trusted and raved about became another EIG brand. I tried to get the inside story. I reached out to the former CEO who sold the company to EIG and became an executive there for a couple years post acquisition. He reached out on my behalf to EIG's PR team to see if they would participate in this story. Both declined to participate.

So I'm left to speculate on what happened at A Small Orange, based on what's been publicly stated by their CEO and on watching EIG's strategy unfold for years across many companies and brands. My best guess is that EIG finally got involved with A Small Orange. They used to be a distributed/remote team; now all the positions they are hiring for are listed as being in Texas (their headquarters). I saw a HostGator representative get moved over to ASO's team, so the internal staff was changing and people were being moved to ASO from brands with less than stellar reputations. The former CEO left mid-2014, which likely left a leadership and responsibility gap. ASO could probably run on autopilot through the end of 2014, but having no champion for the brand in upper management eventually comes back to hurt it when decisions get made on simple economics alone.

Once 2015 rolled around, the service had noticeably declined. The overall rating for A Small Orange in 2015 was 43% (only using 2015 data). For years, they had been in the 70's. They also ended the year with a massive outage, ongoing since Christmas, for most, if not all, of their VPS customers. I personally received multiple messages from users of this site asking what was happening and alerting me to this decline in service quality.

ASO was also responsible for the Arvixe migration that went very poorly and caused the Arvixe brand to tank. I'm not sure why EIG doesn't have a dedicated migration team to handle these types of moves, considering how many acquisitions they go through and how large a role migrations play in their growth strategy. But that's a whole separate issue.
It's with great disappointment that I have to admit that the A Small Orange that played such a huge role in the founding and success of Review Signal, and provided a great service to many thousands of customers, is dead. It's become another hollow EIG brand where the quality has dropped to mediocre levels. And that seems perfectly ok to them, because it's probably more profitable for their bottom line.

Going Forward

This story has had a profound impact on Review Signal. One thing it made painfully obvious is that the ranking algorithm needs its first update since inception. The current ranking treats every review equally, which was fine when this site launched, because time hadn't had any opportunity to be a factor yet. But as this site continues to move forward, I need to acknowledge that a significant amount of time has passed between launch and today. A review from the beginning of Review Signal isn't as relevant as one from this past week in determining the current quality of a web hosting company. A Small Orange currently shows up around 64%, which is artificially high because of their long history of good service; it hasn't been brought down yet by the past year's decline, which is still small relative to the length of that history. But it's painfully clear that it's not a 64% rating company anymore.

Another thing to note is that the graphs here all used a simpler calculation, [Pos / (Pos + Neg)], without duplicate filtering. What this means is that the difference between the rating here and the actual rating on the live site is a measure of how repetitively people are being positive or negative about a company. If the rating here is higher than the published one, it means people are, on average, saying more than one good thing about the same company. If it is lower (as in most, if not all, cases here), it means people are saying more than one negative thing about the company. I'm not sure if this will factor into a new algorithm, but it is something to consider. My intuition says it would hinge around 50%: companies above that mark would likely have repeat supporters, and those below would have repeat detractors.
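To make that concrete, here is a small sketch with hypothetical authors and messages showing how the unfiltered number used in these charts diverges from a one-verdict-per-author version when the same person posts repeatedly:

```python
# Hypothetical messages: (author, is_positive). Illustrative data only.
messages = [("alice", True), ("bob", False), ("bob", False),
            ("bob", False), ("carol", True), ("dave", False)]

def rating(items):
    pos = sum(1 for _, good in items if good)
    return 100.0 * pos / len(items)

# Unfiltered: every message counts (the calculation used in this post's charts).
unfiltered = rating(messages)  # 2 positive out of 6 messages = ~33%

# Duplicate-filtered: keep one verdict per author (here, the first one seen),
# which is closer in spirit to the live site's calculation.
per_author = {}
for author, good in messages:
    per_author.setdefault(author, good)
filtered = rating(list(per_author.items()))  # 2 positive out of 4 authors = 50%

print(f"unfiltered: {unfiltered:.0f}%, duplicate-filtered: {filtered:.0f}%")
```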

In the coming months I will try to figure out a better way to generate the ranking number that more fairly represents the current state of a company. My initial thought is to use some sort of time discounting, so that the older the review, the less weight it would carry in the rankings. If anyone has experience working with this or wants to propose/discuss ideas, please reach out - comment here, email me, or tweet @ReviewSignal.
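For what it's worth, here is one possible shape for that idea, a minimal sketch of an exponentially time-discounted rating; the half-life value and the decay curve are purely illustrative assumptions, not a decided algorithm.

```python
import math
from datetime import datetime

HALF_LIFE_DAYS = 365.0  # illustrative assumption: a review loses half its weight per year

def weight(review_date, now):
    age_days = (now - review_date).days
    return 0.5 ** (age_days / HALF_LIFE_DAYS)

def discounted_rating(reviews, now):
    """Weighted Pos / (Pos + Neg): recent reviews count more than old ones."""
    pos = sum(weight(d, now) for d, good in reviews if good)
    neg = sum(weight(d, now) for d, good in reviews if not good)
    return 100.0 * pos / (pos + neg)

# Hypothetical data: years of positive reviews, then a recent negative run.
now = datetime(2016, 1, 15)
reviews = [(datetime(2012 + i // 4, 1 + 3 * (i % 4), 1), True) for i in range(12)]
reviews += [(datetime(2015, m, 1), False) for m in (6, 8, 10, 12)]

print(f"{discounted_rating(reviews, now):.0f}%")  # well below the unweighted 12/16 = 75%
```

Tuning the half-life (or choosing a different decay function entirely) would control how quickly a decline like A Small Orange's shows up in the published number.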

BlueHost, HostMonster, and JustHost Down (11/25/2015)

We are seeing a lot of people complaining about BlueHost, HostMonster, and JustHost being down right now. It doesn't seem to be affecting the other brands I checked (HostGator, iPage, A Small Orange, Arvixe, Site5).

It is very strange that HostGator isn't down this time. Their last major outage in 2013 affected all four of those brands (source: mashable). I am wondering if their infrastructure has been separated meaningfully between HostGator and the other three.

I've heard rumors of a DDoS attack from people saying that's what support told them. No official confirmation.

If you're considering changing web hosts, we track and publish what people think of most major web hosting companies here.

It's gotten to the point people are making memes. Not good.

LiquidWeb NPS Scores vs LiquidWeb Review Signal Rating

Yesterday (August 18, 2015), LiquidWeb [Reviews] sent out a press release announcing its NPS Score:

"...its [LiquidWeb] Net Promoter Score reached an all-time high of 74 at the end of Q2 2015. This represents a "best-in-class" rating in the consumer-driven metric. Liquid Web's score is particularly noteworthy as scores above 60 are extremely rare in the Web Hosting industry, where average scores historically hover in the single-digit realm. Regardless of industry, a score of 74 - on top of Liquid Web's confirmed 12-month rolling average of 67 - is strongly indicative of excellent customer satisfaction."

They further explain how NPS is measured:

"Net Promoter Score (or NPS®) is the result of a 3rd party study on customers' direct feedback and gauges their likelihood to recommend a business's products or services. The results calculated the percentage of Liquid Web's customers who qualify as "promoters" (rating the company 9 or 10 on a 0-to-10 point scale) minus the percentage who are "detractors" (rating 6 or lower). Scores can range from -100 to +100 with scores of +50 and higher considered a "best-in-class" customer service level."

NPS is fairly well known in the marketing and branding world as a way to measure how a company is doing. In fact, it's something I was studying while working on the technology that powers Review Signal in graduate school. The boiled-down version: the gap between the percentage of people who sing your praises and the percentage who say negative things about your company is a measure of how well your service is perceived.

What's interesting about a company publishing its NPS score is that I can compare it to the data I track here at Review Signal on people saying good and bad things publicly. There are some differences in how NPS is measured versus how I measure at Review Signal. Some of the bigger differences are the 0-10 scale in NPS versus the binary (Good/Bad) on Review Signal, and what type of messages are being looked at: Review Signal looks at all kinds of messages people publicly post, while NPS is generally done as a survey of customers. [If you're curious how/what Review Signal measures, it's all publicly explained at https://reviewsignal.com/howitworks]
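To show the mechanical difference between the two formulas, here is a small sketch using hypothetical survey scores: NPS subtracts the detractor percentage from the promoter percentage across all respondents, while a Pos/(Pos+Neg) style rating ignores passives entirely, so the two numbers aren't on the same scale even when they land in a similar range.

```python
# Hypothetical 0-10 survey scores, illustrative only.
scores = [10, 9, 9, 10, 8, 9, 10, 7, 9, 10, 3, 9, 10, 6, 9, 10, 9, 2, 9, 10]

promoters = sum(1 for s in scores if s >= 9)
detractors = sum(1 for s in scores if s <= 6)

# NPS: percentage of promoters minus percentage of detractors (-100 to +100).
nps = 100.0 * (promoters - detractors) / len(scores)

# A Review Signal style rating: positive / (positive + negative), passives ignored.
rating = 100.0 * promoters / (promoters + detractors)

print(f"NPS: {nps:+.0f}, rating: {rating:.0f}%")
```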

So without further ado, the numbers. LiquidWeb's all-time high is 74 and its 12-month average is 67. On Review Signal, LiquidWeb has a lifetime 70% rating, putting it squarely in line with its publicly disclosed NPS scores.


A lot of people question web hosting reviews and I constantly have to question what I'm doing and how I'm doing it. It's a very rare event when I'm given an external metric that I can compare Review Signal against. So when it matches up so cleanly, it's a great validation that what I'm doing here is working.

 


Site5 Acquired by Endurance International Group (EIG)

Endurance International Group yesterday announced in their second quarter results that they acquired Site5 and Verio.

During the quarter, the company acquired assets of Verio and Site5. The total cash consideration for these acquisitions is expected to be approximately $36 million.

Via MarketWatch.

EIG continues to acquire hosting companies as a growth strategy and doesn't seem to plan on stopping any time soon. The hope is that Site5, which is rated as one of the better companies on Review Signal, operates more like A Small Orange, which was acquired in 2012 and continues to be rated very highly. Time will tell how it plays out; I will certainly be watching the data and trends.

That brings the list of EIG companies here on Review Signal to:

 

WordPress Hosting Performance Benchmarks (2015)


Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

This is the third round of managed WordPress web hosting performance testing. You can see the original here, and the November 2014 version here.

New (9/14/2016) The 2016 WordPress Hosting Performance Benchmarks are live.

New (8/20/2015) This post is also available as an Infographic.

Companies Tested

A Small Orange [Reviews]
BlueHost [Reviews]
CloudWays [Reviews]
DreamHost [Reviews]
FlyWheel [Reviews]
GoDaddy [Reviews]
Kinsta
LightningBase
MediaTemple [Reviews]
Nexcess
Pagely [Reviews]
Pantheon [Reviews]
Pressidium
PressLabs
SiteGround† [Reviews]
WebHostingBuzz
WPEngine* [Reviews]
WPOven.com
WPPronto

Note:  Pressable and WebSynthesis [Reviews] were not interested in being tested this round and were excluded. WordPress.com dropped out due to technical difficulties in testing their platform (a large multi-site install).

Every company donated an account to test on. All were the WordPress-specific plans (e.g. GoDaddy's WordPress option). I checked to make sure I was on what appeared to be a normal server. The exception is WPEngine*. They wrote that I was "moved over to isolated hardware (so your tests don’t cause any issues for other customers) that is in-line with what other $29/month folks use." From my understanding, all testing was done on a shared plan environment with no actual users on the server to share resources with. This is almost certainly the best case scenario performance-wise, so I suspect the results look better than what most users would actually get.

†Tests were performed with SiteGround's proprietary SuperCacher module turned on fully with memcached.

The Products (Click for Full-Size Image)

wordpress_hosting_2015_product_chart

Methodology

The question I tried to answer is how well do these WordPress hosting services perform? I tested each company on two distinct measures of performance: peak performance and consistency. I've also included a new and experimental compute and database benchmark. Since it is brand new, it has no bearing on the results but is included for posterity and in the hope that it will lead to another meaningful benchmark in the future.

All tests were performed on an identical WordPress dummy website with the same plugins except in cases where hosts added extra plugins. Each site was monitored for over a month for consistency.

1. LoadStorm

LoadStorm was kind enough to give me unlimited resources to perform load testing on their platform, and multiple staff members were involved in designing and testing these WordPress hosts. I created identical scripts for each host to load a site, log in to the site and browse the site. I tested every company up to 2000 concurrent users. Logged-in users were designed to break some of the caching and better simulate real user load.
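The LoadStorm scripts themselves aren't reproduced in this post, but as a rough analogue, here is what that load / log in / browse scenario might look like in an open-source load testing tool like Locust; the paths are standard WordPress ones, while the credentials and post URL are placeholders.

```python
from locust import HttpUser, task, between

class WordPressVisitor(HttpUser):
    """Simulates a reader who logs in and then browses, roughly mirroring
    the load / login / browse scenario described above."""
    wait_time = between(1, 5)  # seconds between actions per simulated user

    def on_start(self):
        # Logging in makes later requests carry a session cookie,
        # which tends to bypass full-page caches like Varnish/Nginx.
        self.client.post("/wp-login.php",
                         data={"log": "testuser", "pwd": "testpass"})

    @task(3)
    def front_page(self):
        self.client.get("/")

    @task(1)
    def browse_post(self):
        self.client.get("/?p=1")  # placeholder post URL
```

Scaling the simulated user count toward 2,000 against a disposable test site would approximate the shape of the tests described here, though LoadStorm's own scripting and infrastructure were what was actually used.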

2. Blitz.io

I used Blitz again to compare against previous results. Since the 1000 user test wasn't meaningful anymore, I did a single test for 60 seconds, scaling from 1-2000 users.

3. Uptime (UptimeRobot and StatusCake)

Consistency matters. I wanted to see how well these companies performed over a longer period of time. I used two separate uptime monitoring services over the course of a month to test consistency.

4. WebPageTest.org

"WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster." WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. I tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

5. WPPerformanceTester

I created a WordPress plugin to benchmark CPU, MySql and WordPress DB performance. It is based on a PHP benchmark script I forked (available on GitHub) and adapted to WordPress. The CPU/MySql benchmarks are testing the compute power. The WordPress component tests actually calling $wpdb and executing insert, select, update and delete queries. This plugin will be open sourced once I clean it up and make it usable for someone beyond myself.
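The plugin itself is PHP and wasn't public at the time of writing, but the idea behind the database portion translates to any stack: time a fixed batch of insert/select/update/delete queries and report a throughput figure. Here is a rough sketch of that approach using SQLite purely for illustration; it is not the plugin's actual code.

```python
import sqlite3
import time

def db_bench(total_queries=1000):
    """Run a fixed batch of insert/select/update/delete queries and return
    queries per second -- the same shape of number as a database benchmark score."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE bench (id INTEGER PRIMARY KEY, val TEXT)")

    cycles = total_queries // 4  # each cycle issues 4 queries
    start = time.perf_counter()
    for i in range(cycles):
        rid = conn.execute("INSERT INTO bench (val) VALUES (?)", (f"row {i}",)).lastrowid
        conn.execute("SELECT val FROM bench WHERE id = ?", (rid,))
        conn.execute("UPDATE bench SET val = ? WHERE id = ?", ("updated", rid))
        conn.execute("DELETE FROM bench WHERE id = ?", (rid,))
    elapsed = time.perf_counter() - start

    conn.close()
    return 4 * cycles / elapsed

print(f"{db_bench():.2f} queries/second")
```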

Background Information

Before I go over the results I wanted to explain and discuss a few things. Every provider I tested had the latest version of WordPress installed. I had to ask a lot of companies to disable some security features to perform accurate load tests. Those companies were: DreamHost, Kinsta, LightningBase, Nexcess, Pagely, Pressidium, PressLabs, SiteGround, and WPEngine.

Every company that uses a VPS-based platform was standardized around 2GB of memory for their plan (or equivalent) in an effort to make those results more comparable. The exception is DreamHost, which runs on a VPS platform but uses multiple scaling VPSs.

CloudWays has a platform that lets you deploy your WordPress stack to multiple providers: Digital Ocean, Amazon (AWS) EC2, or Google Compute Engine. I was given a server on each platform of nearly comparable specs (EC2 Small 1.7GB vs Digital Ocean 2GB vs GCE 1.7GB g1 Small). So CloudWays is listed as CloudWays AWS, CloudWays DO, and CloudWays GCE to indicate which provider the stack was running on.

SiteGround contributed a shared and a VPS account, designated by 'Shared' or 'VPS' after the name.

Results

Load Storm

Since last round didn't show any real issues until 1000 users, I skipped all the smaller tests and began with 100-1000 users. I also ran the 500-2000 user test on every company instead of simply disqualifying companies. I ran these tests with an immense amount of help from Phillip Odom at LoadStorm. He spent hours with me, teaching me how to use LoadStorm more effectively and build tests, and offering guidance/feedback on the tests themselves.

 Test 1. 100-1000 Concurrent Users over 30 minutes

Company Total Requests Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred(GB) Peak Throughput(kB/s) Average Throughput(kB/s) Total Errors
A Small Orange 114997 90.27 61.83 1785 259 2.41 1878.14 1295.82 0
BlueHost 117569 93.62 63.21 15271 2522 5.41 4680.6 2909.16 23350
CloudWays AWS 138176 109.1 74.29 15086 397 7.15 6016.88 3844.49 44
CloudWays DO 139355 109.88 74.92 2666 321 7.21 5863.82 3876.3 0
CloudWays GCE 95114 76.22 52.84 15220 7138 3.63 3247.38 2014.92 23629
DreamHost 143259 113.57 77.02 15098 314 7.1 6136.75 3815.73 60
FlyWheel 128672 101.98 69.18 9782 571 7 6197.92 3764.6 333
GoDaddy 134827 104.6 72.49 15084 352 7.49 6368.32 4028.45 511
Kinsta 132011 102.98 70.97 3359 229 7.35 6078.95 3951.75 0
LightningBase 123522 100.73 68.62 4959 308 6.53 5883.15 3626.2 4
MediaTemple 134278 105.72 74.6 15096 363 7.45 6397.68 4140.7 640
Nexcess 131422 104.47 70.66 7430 307 7.17 6256.08 3854.27 0
Pagely 87669 70.8 47.13 7386 334 5.75 5090.11 3091.06 3
Pantheon 135560 106.42 72.88 7811 297 7.24 5908.27 3890.83 0
Pressidium 131234 103.03 70.56 7533 352 7.23 6092.36 3889.64 0
PressLabs 132931 107.43 71.47 10326 306 3.66 3264.02 1968.98 0
SiteGround Shared 137659 111.35 74.01 7480 843 6.85 5565.02 3683.04 111
SiteGround VPS 130993 103.45 70.43 15074 310 7.17 6061.82 3855.86 19
WebHostingBuzz
WPEngine 148744 117.15 79.97 15085 206 7.32 6224.06 3935.35 4
WPOven.com 112285 96.58 60.37 15199 2153 5.78 5680.23 3108.94 5594
WPPronto 120148 99.08 64.6 15098 681 5.61 4698.51 3018.33 19295

Discussion of Load Storm Test 1 Results

Most companies were OK with this test, but a few didn't do well: BlueHost, CloudWays GCE, WPOven and WPPronto. FlyWheel, GoDaddy and Media Temple had a couple of spikes but nothing too concerning. I was actually able to work with someone at DreamHost this time to bypass their security features, and their results look better than last time. I am also excited that we got PressLabs working this time around after the difficulties last round.

In general, the 1000 user test isn't terribly exciting, 7/21 companies got perfect scores with no errors. Another 6 didn't have more than 100 errors. Again, this test pointed out some weak candidates but really didn't do much for the upper end of the field.

Test 2. 500 - 2000 Concurrent Users over 30 Minutes

Note: Click the company name to see full test results.

Company Total Requests Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred(GB) Peak Throughput(kB/s) Average Throughput(kB/s) Total Errors
A Small Orange 242965 181.62 130.63 15078 411 5.09 3844.54 2737 1
BlueHost 201556 166.83 111.98 15438 8186 5.32 5229.07 2953.17 93781
CloudWays AWS 261050 195.23 145.03 15245 2076 13.13 9685.95 7296.4 11346
CloudWays DO 290470 218.17 161.37 15105 532 14.87 12003.3 8262.77 1189
CloudWays GCE 193024 147.22 107.24 15168 8291 4.72 4583.86 2622.85 93821
DreamHost 303536 232.27 163.19 15100 442 14.95 12619.67 8039.54 210
FlyWheel 253801 202.15 136.45 15218 1530 11.26 9939.17 6052.49 56387
GoDaddy 283904 221.12 152.64 15025 356 15.74 13731.97 8460.12 1432
Kinsta 276547 214.93 148.68 15025 573 15.16 13444.75 8151.37 1811
LightningBase 263967 211.12 141.92 7250 330 13.82 13061.01 7429.91 18
MediaTemple 286087 223.93 153.81 15093 355 15.83 14532.42 8512.11 1641
Nexcess 277111 207.73 148.98 15087 548 15 12313.29 8066.37 359
Pagely 181740 148.18 97.71 11824 791 11.82 10592.21 6355.09 1
Pantheon 287909 223.02 154.79 15039 276 15.28 13831.45 8217.49 3
Pressidium 278226 208.55 149.58 15044 439 15.28 12453.66 8213.63 12
PressLabs 280495 214.07 150.8 8042 328 7.66 6267.46 4118.34 0
SiteGround Shared 301291 231.93 161.98 15052 557 14.76 12799.09 7934.03 1837
SiteGround VPS 279109 209.67 150.06 12777 374 15.21 12506.79 8178.5 20
WebHostingBuzz
WPEngine 316924 241.67 170.39 7235 285 15.52 12989.23 8341.47 3
WPOven.com 213809 169.97 118.78 15268 4442 8.81 7153.5 4894.98 35292
WPPronto 258092 206.53 143.38 15246 539 10.85 9483.74 6026.26 76276

Discussion of Load Storm Test 2 Results 

The companies that previously struggled (BlueHost, CloudWays GCE, WPOven and WPPronto) didn't improve, which is to be expected. FlyWheel, which had a few spikes before, ran into more serious difficulties and wasn't able to withstand the load. CloudWays AWS ended up failing; their Digital Ocean machine spiked but was able to handle the load.

The signs of load were much more apparent this round, with a lot more spikes from many more companies. GoDaddy and Media Temple, which also had spikes in the first test, spiked again but seemed able to withstand the load. Kinsta spiked early but was stable for the duration of the test. SiteGround Shared had a steady set of small spikes but didn't fail.

Nobody matched last time's level of perfection with no spikes in response times. Only one company managed an error-free run this time (PressLabs), but many came close: A Small Orange went from 0 errors to 1, Pantheon went from 0 to 3, and Pagely had only 1 error, again.

The biggest change was WPEngine. It went from failing the 1000 user test last round to having one of the better runs in the 2000 user test. I have to emphasize, though, that it was a shared plan on isolated hardware with no competition for resources.

Blitz.io

 Test 1. 1-2000 Concurrent Users over 60 seconds

Blitz Test 1. Quick Results Table

Note: Click the company name to see full test results.

Company Hits Errors Timeouts Average Hits/Second Average Response Time(ms) Fastest Response(ms) Slowest Response(ms)
A Small Orange 51023 56 280 850 115 72 285
BlueHost 37373 475 2102 623 338 124 979
CloudWays AWS 56946 737 74 949 13 3 73
CloudWays DO 52124 1565 1499 869 35 23 87
CloudWays GCE 50463 1797 782 841 96 92 138
DreamHost 58584 1 0 978 4 4 4
FlyWheel 49960 3596 2022 833 30 24 140
GoDaddy 29611 26024 18 494 165 103 622
Kinsta 57723 1 0 962 20 20 21
LightningBase 54448 1 4 907 81 81 81
MediaTemple 29649 25356 126 494 162 104 1103
Nexcess 38616 4924 2200 644 221 70 414
Pagely 58722 1 0 979 3 2 5
Pantheon 55814 112 9 930 52 52 54
Pressidium 47567 1 9 793 233 233 234
PressLabs 58626 0 0 977 5 4 6
SiteGround Shared 49127 1123 1 819 172 171 178
SiteGround VPS 35721 75 4371 595 238 82 491
WebHostingBuzz
WPEngine 56277 827 1 938 27 21 70
WPOven.com 55027 10 2 917 69 68 71
WPPronto 54921 99 29 915 69 68 72


Discussion of Blitz Test 1 Results

This test just checks whether the company is caching the front page and how well whatever caching system they have set up is performing (generally this hits something like Varnish or Nginx).
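If you want to check what a front-page test like this is actually hitting, one quick sanity check is to look at the response headers; header names vary by host and cache layer (X-Cache, X-Varnish, Age and similar are common but not universal), so treat this as a hint rather than proof. The URL below is a placeholder.

```python
import requests

# Placeholder URL -- point this at your own test site.
resp = requests.get("https://example.com/", timeout=10)

# Collect common cache-related headers if the server exposes any of them.
cache_headers = {k: v for k, v in resp.headers.items()
                 if k.lower() in ("x-cache", "x-varnish", "x-cache-status",
                                  "age", "cf-cache-status")}
print(resp.status_code, resp.elapsed.total_seconds(), cache_headers)
```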

Who performed without any major issues?

DreamHost, Kinsta, LightningBase, Pagely, Pantheon, Pressidium, PressLabs, WPOven and WPPronto all performed near perfectly. There's nothing more to say for these companies other than that they were excellent.

Who had some minor issues?

A Small Orange started showing signs of load towards the end. CloudWays AWS had a spike and started to show signs of load towards the end. SiteGround Shared had a spike at the end that ruined a very beautiful looking run otherwise. WPEngine started to show signs of load towards the end of the test.

Who had some major issues?

BlueHost, CloudWays DO, CloudWays GCE, FlyWheel, GoDaddy, MediaTemple, Nexcess, and SiteGround VPS had some major issues. The CloudWays platform pushed a ton of requests (the only companies over 50,000) but also had a lot of errors and timeouts. The rest were below 50,000 (although FlyWheel was only a hair behind) and also had a lot of errors and timeouts. SiteGround VPS might be an example of how shared resources can get better performance versus dedicated resources. GoDaddy and Media Temple have near identical performance (again, it's the same technology I believe). Both look perfect until near the end where they crash and start erroring out. Nexcess just shows load taking its toll.

Uptime Monitoring

Both uptime monitoring solutions were third-party providers that offer free services. All the companies were monitored for over a month (May-June 2015).

Uptime Robot

Company Uptime % (30 Day)
A Small Orange 100
BlueHost 100
CloudWays AWS 100
CloudWays DO 100
CloudWays GCE 100
DreamHost 94.06
FlyWheel 100
GoDaddy 100
Kinsta 100
LightningBase 100
MediaTemple 100
Nexcess 100
Pagely 100
Pantheon 99.94
Pressidium 100
PressLabs 100
SiteGround Shared 100
SiteGround VPS 100
WebHostingBuzz 42.9
WPEngine 100
WPOven.com 100
WPPronto 100

At this point, I will finally address the odd elephant in the blog post. WebHostingBuzz has empty lines for all the previous tests. Why? Because their service went down and never came back online. I was told that I had put an incorrect IP address in the DNS. However, that IP worked when I started and was the IP address I was originally given (hence the 42% uptime; it was online when I started testing). It took weeks to even get a response, and once I corrected the IP, all I ever got was a configuration error page from the server. I've not received a response yet about this issue and have written them off as untestable.

The only other company that had any major issue was DreamHost. I'm not sure what happened, but they experienced some severe downtime while I was testing the system and returned an internal server error for 42 hours.
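As a quick sanity check, 42 hours of downtime in a roughly 30-day monitoring window lines up with the 94% figure UptimeRobot reported:

```python
window_hours = 30 * 24        # ~30 day monitoring window
downtime_hours = 42           # observed internal server errors
uptime = 100 * (1 - downtime_hours / window_hours)
print(f"{uptime:.2f}%")       # ~94.17%, close to the reported 94.06%
```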

Every other company had 99.9% uptime or better.

StatusCake

StatusCake had a slightly longer window available in its reporting interface, so the percentages are a little different; the difference is noticeable on companies like DreamHost.

Company StatusCake Availability (%) Response Time (s)
A Small Orange 99.96 0.21
BlueHost 99.99 0.93
CloudWays AWS 100 0.76
CloudWays DO 100 0.47
CloudWays GCE 100 0.69
DreamHost 97.14 1.11
FlyWheel 100 1.25
GoDaddy 100 0.65
Kinsta 100 0.71
LightningBase 99.99 0.61
MediaTemple 100 1.38
Nexcess 100 0.61
Pagely 99.99 0.47
Pantheon 99.98 0.56
Pressidium 99.99 0.94
PressLabs 100 0.65
SiteGround Shared 100 0.54
SiteGround VPS 100 0.9
WebHostingBuzz 58.1 0.67
WPEngine 100 0.71
WPOven.com 100 0.73
WPPronto 100 1.19

The results mirror UptimeRobot pretty closely. WebHostingBuzz and DreamHost had issues. Everyone else is 99.9% or better.

StatusCake uses a real browser to track response time as well. Compared to last year, everything looks faster. Only two companies were sub one second average response time last year. This year, almost every company maintained sub one second response time, even the company that had servers in Europe (Pressidium).

WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only.

Company Dulles, VA Miami, FL Denver, CO Los Angeles, CA Average (times in seconds)
A Small Orange 0.624 0.709 0.391 0.8 0.631
BlueHost 0.909 1.092 0.527 0.748 0.819
CloudWays AWS 0.627 0.748 0.694 1.031 0.775
CloudWays DO 0.605 0.751 0.635 1.075 0.7665
CloudWays GCE 0.787 0.858 0.588 1.019 0.813
DreamHost 0.415 0.648 0.522 0.919 0.626
FlyWheel 0.509 0.547 0.594 0.856 0.6265
GoDaddy 0.816 1.247 0.917 0.672 0.913
Kinsta 0.574 0.559 0.587 0.903 0.65575
LightningBase 0.544 0.656 0.5 0.616 0.579
MediaTemple 0.822 0.975 0.983 0.584 0.841
Nexcess 0.712 0.871 0.593 0.795 0.74275
Pagely 0.547 0.553 0.665 0.601 0.5915
Pantheon 0.627 0.567 0.474 0.67 0.5845
Pressidium 0.777 0.945 0.898 1.05 0.9175
PressLabs 0.542 1.257 0.723 0.732 0.8135
SiteGround Shared 0.721 0.85 0.478 0.808 0.71425
SiteGround VPS 0.667 0.651 0.515 0.657 0.6225
WebHostingBuzz 0
WPEngine 0.648 0.554 0.588 0.816 0.6515
WPOven.com 0.624 0.574 0.556 0.595 0.58725
WPPronto 0.698 0.809 0.443 0.721 0.66775

In line with the StatusCake results, the WebPageTest results were shockingly fast. The first time I did this testing, only one company had a sub-one-second average response time. Last year about half the companies were over one second average response time. The fastest last year was LightningBase at 0.7455 seconds; this year that would be in the slower half of the results. The fastest this year was LightningBase again, at 0.579 seconds. The good news appears to be that everyone is getting faster, and your content will reach visitors faster than ever no matter which company you choose.
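For reference, the Average column in the table above is simply the mean of the four locations; taking A Small Orange's row as an example:

```python
dulles, miami, denver, los_angeles = 0.624, 0.709, 0.391, 0.800
average = (dulles + miami + denver + los_angeles) / 4
print(f"{average:.3f} seconds")  # 0.631, matching the table
```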

WPPerformanceTester

Company PHP Ver MySql Ver PHP Bench (lower is better) WP Bench (higher is better) MySql Location
A Small Orange 5.5.24 5.5.42-MariaDB 13.441 406.67 LOCALHOST
BlueHost 5.4.28 5.5.42 12.217 738.01 LOCALHOST
CloudWays AWS 5.5.26 5.5.43 10.808 220.12 LOCALHOST
CloudWays DO 5.5.26 5.5.43 11.888 146.76 LOCALHOST
CloudWays GCE 5.5.26 5.5.43 10.617 192.2 LOCALHOST
DreamHost 5.5.26 5.1.39 27.144 298.6 REMOTE
FlyWheel 5.5.26 5.5.43 12.082 105.76 LOCALHOST
GoDaddy 5.4.16 5.5.40 11.846 365.76 REMOTE
Kinsta 5.6.7 10.0.17-MariaDB 11.198 619.58 LOCALHOST
LightningBase 5.5.24 5.5.42 12.369 768.64 LOCALHOST
MediaTemple 5.4.16 5.5.37 12.578 333.33 REMOTE
Nexcess 5.3.24 5.6.23 12.276 421.76 LOCALHOST
Pagely 5.5.22 5.6.19 10.791 79.79 REMOTE
Pantheon 5.5.24 5.5.337-MariaDB 12.669 194.86 REMOTE
Pressidium 5.5.23 5.6.22 11.551 327.76 LOCALHOST
PressLabs 5.6.1 5.5.43 8.918 527.7 REMOTE
SiteGround Shared 5.5.25 5.5.40 14.171 788.02 LOCALHOST
SiteGround VPS 5.6.99 5.5.31 11.156 350.51 LOCALHOST
WebHostingBuzz
WPEngine 5.5.9 5.6.24 10.97 597.37 LOCALHOST
WPOven.com 5.3.1 5.5.43 11.6 570.13 LOCALHOST
WPPronto 5.5.25 5.5.42 11.485 889.68 LOCALHOST

This test is of my own creation. I created a plugin designed to test a few aspects of performance and get information about the system it was running on. The results here have no bearing on how I am evaluating these companies because I don't have enough details to make them meaningful. My goal is to publish the plugin and get people to submit their own benchmarks. This would allow me to get a better picture of the real performance people are experiencing from companies and to track changes over time. The server details it extracted may be of interest to many people. Most companies were running PHP 5.5 or later, but a few weren't. Most companies seem to be running normal MySql, but ASO, Kinsta and Pantheon are all running MariaDB, which many people think has better performance. Considering where all three of those companies ended up performing in these tests, it's not hard to believe. Most companies run MySql on localhost (BlueHost, LightningBase, Nexcess, SiteGround, WPEngine and WPPronto, among others), while a handful use a remote MySql server (DreamHost, GoDaddy, MediaTemple, Pagely, Pantheon, PressLabs).

The PHP Bench was fascinating because most companies were pretty close with the exception of DreamHost which took nearly twice as long to execute.

The WP Bench was all over the place. Pagely had by far the slowest result but on every load test and speed test they went through, they performed with near perfect scores. The test simulates 1000 $wpdb calls doing the primary mysql functions (insert, select, update, delete). Other companies had outrageously fast scores like WPPronto or BlueHost but didn't perform anywhere near as well as Pagely on more established tests.

For those reasons, I don't think this benchmark is usable yet. But I would love feedback and thoughts on it from the community and the hosting companies themselves.

Attempting VPS Parity in Testing

One substantial change to the testing methodology this round was that all VPS providers were tested with the same amount of memory (2 GB RAM). Since the most interesting tests were the load tests, I have only graphed them below:

(Charts: 2GB VPS LoadStorm and 2GB VPS Blitz results)

The LoadStorm test had a huge spread in terms of performance. The Google Compute Engine server from CloudWays was by far the worst (an issue we touched on before that it's not a true VPS with dedicated resources). FlyWheel and WPOven also struggled to keep up with the LoadStorm test. Others like ASO, CloudWays DO, Kinsta, and SiteGround handled the test with minimal issues. On the other hand, it's very interesting to see how fairly consistent most of the VPSs perform in the Blitz test between 50,000 and roughly 55,000 hits. The error rates are a bit interesting though because this hardware should be about as close to the same as possible.

The easier result to explain is the Blitz performance. It is testing the ability of these companies to spit back a single page from cache (most likely Varnish or Nginx). So that level of caching seems to be pretty close to parity.

The LoadStorm test shows a wide difference in performance. The LoadStorm test is far more comprehensive and designed to bust through some caching and hit other parts of the stack. It really elucidates the difference in company's ability to tune and optimize their servers from both software and hardware perspectives.

Conclusion

Every service seems to have their issues somewhere if you look hard enough. I try to avoid injecting my personal opinion and bias as much as possible. As I've added more companies to the testing, drawing a line between which companies performed in the top tier and which did not has become blurrier. The closest test was the LoadStorm 2000 test where multiple companies (CloudWays DO, GoDaddy, Kinsta, Media Temple, SiteGround Shared) were on the absolute edge of being top tier providers. Last time I picked an arbitrary 0.5% error rate and these companies were all around the 0.5-0.7% mark. Last year the difference was quite large after that point. I openly admit to having personal connections with people at nearly all these companies and my ability to draw the line in this instance could be considered questionable. So this year I deferred the judgment to an independent party, Phillip Odom at LoadStorm, to determine what he thought of the performances. Phillip is the Director of Performance Engineering at LoadStorm and he has more experience with load testing and the LoadStorm product than almost anyone I know. His job was to determine if the performance could be considered top tier or not. He said a couple spikes early but a stable performance otherwise seemed top tier. The difference in 1/100 of a percent didn't seem like a big deal, especially over a 30 minute test where the issues were at the start as it ramped up to 2000 concurrent users. So the companies on the edge that exhibited that behavior were considered top tier for the LoadStorm test.

I won't be ranking or outright saying any single company is the best. Some providers did exceptionally well and tended to clump together performance-wise, I will call those the top tier providers. This top tier designation is related to performance only and is claimed only from the results of these tests. What each of these companies is offering is different and may best suit different audiences depending on a variety of factors beyond performance, such as features, price, support, and scale (I tested mostly entry level plans OR 2GB RAM plans for VPS providers). I will also provide a short summary and discussion of the results for each provider.

Top Tier WordPress Hosting Performance


A Small Orange, Kinsta, LightningBase, Pagely, Pantheon, Pressidium, PressLabs

Each of these companies performed with little to no failures in all tests and exhibited best in class performance for WordPress hosting.

Honorable Mentions

CloudWays gets an honorable mention because its Digital Ocean (DO) instance performed quite well overall. It had some issues with the Blitz test at the end but still managed to push through over 52,000 successful hits. Its Amazon stack performed better on the Blitz test but not as well on LoadStorm. I'm not sure why the performance of identical stacks differs so much across tests between AWS and DO, but they improved dramatically since the last test and are on the cusp of becoming a top tier provider.

SiteGround's Shared hosting also gets an honorable mention. It was on that edge for both LoadStorm and Blitz. It had one spike at the end of the Blitz test which caused its error rate to jump, but the response times didn't move.

WPEngine gets an honorable mention because they performed well on most tests. They struggled and showed signs of load on the Blitz test, though, which kept them out of the top tier of providers.

Individual Host Analysis

A Small Orange [Reviews]

Another top tier performance from ASO. They didn't really struggle much with any of the tests. Although their performances were slightly below their results last time, it's hard to beat things like having zero errors during LoadStorm's test. It's become easier to launch the LEMP VPS stack which is also nice. All in all, the experience was in-line with what I would expect from a company that has one of the highest support ratings on our site.

BlueHost [Reviews]

Improved against their last results but well below par in the performance department. The pricing and performance just don't match yet.

CloudWays [Reviews]

CloudWays is always a fun company to test. They added another provider since their last test: Google Compute Engine (GCE). Their Digital Ocean and Amazon performances both went up substantially, which tells me they've made major improvements to their WordPress stack. We did run into some huge flaws with GCE, though, which aren't CloudWays's fault. We used the g1.small server on GCE and ran into huge performance walls that were repeatable and inexplicable from a software standpoint. Google was contacted and we learned that the "g1 family has "fractional" CPU, meaning that not a full virtual CPU is assigned to a server. This also means that the CPU is shared with other VMs and "capped" if usage exceeds a certain amount. This is exactly what happened during the load test. The VM runs out of CPU cycles and has to wait for new ones being assigned on the shared CPU to continue to server requests." Essentially, it's not a real VPS with dedicated resources, and I was told a comparable option would be n1-standard-1, which is 2-3x the price of the comparable AWS/DO servers. That doesn't make GCE a very attractive platform to host on if you're looking for performance and cost efficiency. CloudWays did show major improvements this round and earned themselves that honorable mention. They were by far the most improved provider between tests.

DreamHost [Reviews]

DreamPress improved their performance a lot over last round. In fact they did fantastically well on every load test once I got the opportunity to actually work with their engineers to bypass the security measures. However, they failed pretty badly on the uptime metrics. I have no idea what happened but I experienced a huge amount of downtime and ran into some very strange errors. If it wasn't for the severe downtime issues, DreamPress could have been in the top tier.

FlyWheel [Reviews]

FlyWheel was excellent on most tests, though it struggled in the final seconds of the Blitz test and under the heavier LoadStorm load. Although they were just shy of the top tier, they are showing a lot of consistency, with very good performance earning an honorable mention the last two times. Just some minor performance kinks to work out. Not bad at all for a company with the best reviews of any company Review Signal has ever tracked. FlyWheel is definitely worth a look.

GoDaddy [Reviews]

GoDaddy's performance declined this round. It struggled with the Blitz test this time around. I'm not sure what changed, but it handled Blitz far worse than before and LoadStorm slightly worse. The performance between GoDaddy and Media Temple again looked near identical with the same failure points on Blitz. At the retail $6.99 price though, it's still a lot of bang for your buck compared to most providers who are in the $20-30/month range.

Kinsta

Kinsta had another top tier performance. There was a slight decline in performance but that could be explained by the fact we tested different products. Kinsta's test last year was a Shared plan they no longer offer. This year it was a 2GB VPS that we tested. Dedicated resources are great but sometimes shared gives you a little bit extra with good neighbors which could explain the difference. Either way, Kinsta handled all of the tests exceptionally well and earned itself top tier status.

LightningBase

LightningBase is another consistent performer on our list. Another test, another top tier rank earned. It had ridiculous consistency in the Blitz test, where the fastest and slowest responses were both 81ms. A textbook performance at an incredible value of $9.95/month.

Media Temple [Reviews]

Media Temple and GoDaddy are still running the same platform by all indications. Media Temple offers a more premium set of features like Git, WP-CLI, Staging but the performance was identical. It declined from last time and had the same bottlenecks as GoDaddy.

Nexcess

I feel like copy and paste is the right move for Nexcess. Nexcess's performance was excellent in the Load Storm testing. However, it collapsed during the Blitz load testing. This was the same behavior as last year. It handled the Blitz test better this year, but still not well enough. Nexcess ends up looking like a middle of the pack web host instead of a top tier one because of the Blitz test, again.

Pagely [Reviews]

Is the extra money worth it? Only if you value perfection. Pagely came through again with an amazing set of results. It handled more hits than anyone in the Blitz test at a staggering 58,722 hits in 60 seconds (979 hits/second). We're approaching the theoretical maximum at this point of 1000 hits/second. And Pagely did it with 1 error and a 3ms performance difference from the fastest to slowest responses. The original managed WordPress company continues to put on dominant performance results.

Pantheon [Reviews]

Another test, another top tier performance. Just another day being one of the most respected web hosts in the space. Everyone I talk to wants to compare their company to these guys. It's obvious why: they've built a very developer/agency-friendly platform that looks nothing like anything else on the market. It also performs fantastically well. They didn't perform the absolute best on any particular test, but they were right in the top echelon with minimal errors on everything.

Pressidium

Pressidium was a new entrant and it did exceptionally well. They are UK-based and suffered slightly on some performance tests because of latency between the UK and the US testing locations used. For example, the Blitz testing showed fewer responses, but their total of 10 errors and timeouts shows pretty clearly that this was a throughput-across-the-Atlantic issue rather than their service struggling, because there was only a 1ms spread from the fastest to the slowest response. Incredibly consistent performance. Despite their geographic disadvantage in this testing, they still managed to average a sub-one-second response across the four US testing locations in the WebPageTest testing. Overall, a top tier performance from a competitor across the pond.

PressLabs

We finally got PressLabs working with the LoadStorm testing software, and it was worth the wait. They were the only company to handle the 2000 logged-in user test with zero errors. Combined with the second-fastest Blitz test (again without a single error), that puts PressLabs firmly in the top tier, as you would expect from the most expensive offering tested this round.

SiteGround [Reviews]

It was nice that we finally worked out the security issues in testing SiteGround with LoadStorm. SiteGround's Shared hosting platform bounced back after last year's testing. Its Blitz performance went up substantially and put it back into the honorable mention category. The VPS performance was slightly worse on the Blitz test, but noticeably better on the much longer LoadStorm test. This could be a good example of when Shared hosting can outperform dedicated resources: Shared hosting generally has access to a lot more resources than smaller VPS plans, and depending on how they are set up and managed, you can often get more burst performance from Shared than from a small VPS. But over the longer term, dedicated resources are generally more stable (and guaranteed). SiteGround's Shared hosting definitely helps keep lower priced options with excellent performance a reality for many.

WebHostingBuzz

WebHostingBuzz asked to be included in this testing and then completely disintegrated to the point that I couldn't even test them. I still haven't heard anything from them in months. I would like to know what happened, but until I actually get a response, this one will remain a bizarre mystery.

WPEngine [Reviews]

This is a difficult one to write about. There were definite performance improvements; they jumped up to an honorable mention. Their engineers actually worked to resolve some security issues that hindered previous testing. My biggest concern is the isolated shared environment I was on. A shared environment has a lot more resources than many dedicated environments, and I was isolated to prevent the testing from affecting any customers (which is a reasonable explanation). But that means I was likely getting the absolute dream scenario in terms of resource allocation, so these results represent the very best case a normal user could see. WPEngine is certainly capable of delivering better performance than they did in the past, but I do have concerns about whether a new user can reasonably expect the same results.

WPOven

WPOven was another new entrant to this testing and they performed well in a couple of tests. They flew through the Blitz test without any issues. Their WebPageTest results were among the absolute fastest in an already fast pack. Their uptime was perfect. They did struggle with the LoadStorm tests, though, at both the 1000 and 2000 user levels. It's nice to see more competitors enter the space; WPOven put on a good first show, but there are still some serious improvements to make to catch up to the front of the field.

WPPronto

Another new entrant who ran into a severe testing issue which caused me to re-do all the tests. The server was given more resources than the plan specified while debugging some security issues. The results on the extra resources were on par with some of the top in the field, but not representative of what the actual plan would be able to achieve. I didn't believe it was malicious (they were quite transparent about what happened), so I gave the benefit of the doubt and re-did all testing in a closely monitored condition. With the default resource allocation, WPPronto couldn't withstand LoadStorm's test. The results were pretty easy to see in the 508 errors it started to throw on the properly resourced plan. It ran out of processes to handle new connections as expected. As with all new entrants that don't leap to the forefront, I hope they continue to improve their service and do better next round.

 

Thank You

Thank you to all the companies for participating and helping make this testing a reality. Thanks to LoadStorm and specifically Phillip Odom for all his time and the tools to perform this testing. Thanks to Peter at Kinsta for offering his design support.

 

Updates

8/13/2015 : The wrong PDF was linked for DreamHost and its Blitz numbers were adjusted to reflect their actual performance. This change has no effect on how they were ranked since the issue was with downtime.

 

WordPress.org Revamping Hosting Recommendations


Almost exactly three months ago I wrote about how WordPress sold us out with its web hosting 'recommendations', which were (and still are) just advertisements (now a single ad for BlueHost). So what changed?


It appears that WordPress.org is going to change its web hosting recommendations and acknowledges that the industry has changed (dramatically, I might add). They left the nonsense text at the top about how BlueHost is the best and brightest, while the new text at the bottom admits that isn't the case at all. I will assume that was a minor oversight in the updating of the page.

So what appears to be changing? They are taking requests from companies to be included with a very long survey which is fairly comprehensive covering demographics, staff, usability, technology, financials and more. It looks like it's either designed for a very thorough vetting process or to prevent companies from signing up because of the amount of disclosure required.

Questions like "How many net paying customers are you gaining/losing each month?" and "Approximately how many 30-day-active paying customers do you have?" are pretty revealing for private companies. Especially considering that there are potentially two competitors with a strong financial relationship to the murky organizational structure of the WordPress Foundation: Automattic (WordPress.com) and Endurance International Group (investor in Automattic, parent company of BlueHost, the primary affiliate of WordPress.org).

There doesn't seem to be any details about how this process will work, who will be managing it, or when the community will get to have any input.

Going Forward

It looks like a step in the right direction, because any direction is better than the current one. But I'm not sold yet.

What I would like to see is actual community input on how recommendations should be made. The current survey is for hosting companies only and doesn't cover the quality of service actually experienced. It asks very detailed questions about what the company says it is doing, but we all know that is the best case scenario that's communicated publicly. Naturally, as someone who runs a web hosting review website, I think reviews are an important feedback and decision mechanism for making informed choices. There are other ways to evaluate companies as well, like performance benchmarking. The community may or may not agree with any of these methods, but they should at least be given a chance to make their voice heard in this discussion. It's the people who know the least who are going to be reading and making decisions from WordPress.org's hosting page; it's our responsibility to help them as much as possible.

I would like transparency on how the companies are going to be evaluated. The black box magic that has given us BlueHost as the best WordPress host for a decade needs to end.

Proper disclosure is a must. The wishy-washy "these companies donate a portion of your fee back to us" crap needs to end. If you're going to run affiliate links, be up front about them and don't pretend they're something else. This ties back into transparency. If you're going to use affiliate programs and get kickbacks, it needs to be done honestly. People have a right to full information about the relationship between WordPress.org and the companies it recommends.

In conclusion, I'm glad to see that changes might be forthcoming, but I urge WordPress.org to do it properly and provide something that is honest and transparent.

Endurance International Group – Profitable?

Endurance International Group is one of the largest web hosting companies and owns many of the brands you see in the consumer space. EIG owns A Small Orange, BlueHost, HostGator, HostMonster and JustHost, to name a few of its most well known brands.

What caught my eye was an article on Nasdaq noting that EIGI (EIG's stock ticker) is up and at an all-time high. A lot of analysts are rating it a buy, and the price surge seems to indicate people are listening. But I'm not a financial adviser, nor am I interested in making stock recommendations.

What does interest me is web hosting, and since that is the core of EIG's business, the underlying numbers are quite fascinating.

EIG had its first year with positive operating income: $629.85 million in revenue against $617.37 million in total operating expenses, leaving $12.48 million in operating income. However, they still weren't profitable, because they have a lot of debt they are paying off; EIG's net income was a loss of $42.82 million.

"Total subscribers increased by 91,000 in the fourth quarter. Average monthly revenue per subscriber rose 12% year over year to $14.78. For all of 2014, the number of subscribers rose 17% to 4.087 million and the average monthly revenue per subscriber increased 11% to $14.48." - according to the article on Nasdaq

$14.48 per month, per subscriber. $173.76 per year, per subscriber. It's easy to understand how they are paying such high commissions with those numbers. That number also seems to be trending up, which is a good sign for the financial direction of the company.

How does that compare to other companies?

I dug up an old GoDaddy S-1 from 2014 [Godaddy Reviews], which states that their average revenue per user for the trailing 12 months was $105 (it has fluctuated between $93 and $105 over the past few years).

I also found Web.com's latest 10-K filing, which stated a monthly ARPU of $14.62, or $175.44 annually.

EIG and Web.com look very similar: just reaching positive operating income this year with very similar revenue per subscriber. It states pretty clearly in Web.com's filing: "The growth in average revenue per subscriber continues to be driven principally by our up-sell and cross-sell campaigns focused on selling higher revenue products to our existing customers as well as the introduction of new product offerings and sales channels oriented toward acquiring higher value customers."

It seems like common knowledge to anyone in the web hosting industry that these companies get users in cheap. Those ~$5/month hosting plans are obviously not the only thing being sold; on average, they appear able to roughly triple that monthly figure by selling other services.
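For anyone who wants to double-check the napkin math above, here is a minimal Python sketch using only the figures quoted from the filings (these are the numbers cited above, not fresh data, and the ~$5/month base plan is just the rough industry figure mentioned):

```python
# Napkin math using the figures quoted above (not fresh data).
eig_revenue = 629.85e6  # EIG 2014 revenue
eig_opex = 617.37e6     # EIG 2014 total operating expense
print(f"EIG operating income: ${eig_revenue - eig_opex:,.0f}")  # ~$12.48 million

monthly_arpu = {"EIG": 14.48, "Web.com": 14.62}
for company, arpu in monthly_arpu.items():
    print(f"{company} annualized ARPU: ${arpu * 12:.2f}")
# EIG: $173.76, Web.com: $175.44 -- versus GoDaddy's trailing-12-month $105

base_plan = 5.00  # a rough ~$5/month shared hosting plan
print(f"EIG ARPU vs. base plan: {monthly_arpu['EIG'] / base_plan:.1f}x")  # ~2.9x, i.e. roughly triple
```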

So the question in my mind becomes: what do those new products look like? We're seeing a jump into the managed WordPress hosting space. Is there actual innovation coming, or are these big companies simply going to carve out some of the high-margin services provided by niche providers? Is that going to be a win for consumers?

I don't have the answers, but I'm certainly interested to see how it plays out.

Drupal and WordPress Have Sold Us Out

Don't trust web hosting recommendations. They are always bullshit (at least that should be your attitude until proven otherwise*).

First off, I'm not the first person to know or say this. It's been known in the industry and by tech-savvy users since forever. TechCrunch even wrote that web hosting reviews are a cesspool when they covered this site. It's a pervasive semi-secret that everyone seems to have accepted, but it screws over countless people who aren't in on the secret. They get tricked into using sub-par services that authorities are recommending.


drupal_logo-blue

Let's take a look at Drupal's hosting recommendations page:

drupal_hosting

Oh that's interesting. What does that disclaimer say?

drupal_hosting_disclaimer

These companies are great choices, but you don't endorse them? That seems like a nonsense disclaimer. The only reason they have such a stupid disclaimer is that if you click through to the advertising policy, you find out those spots are sold to the highest bidder, based on nothing but how much a company is paying.

drupal_hosting_forsale

At least in the past they were up front about how much those spots cost. If you go further back, you can even see shared hosting prices.

drupal_hosting_forsale_2

$7,000/month for the top spot, and you must sponsor events, so we're talking roughly a $100,000/year commitment. They've since removed that pricing information, and one can only assume prices have gone up since 2013 (we saw the PaaS spot jump from $100 to $500/month). They changed the wording to "minimum monthly guarantee plus an affiliate commission once the guarantee is met." The hosting companies must still be making a lot of money; let's get even more from them.

So when did they give up even listing companies that didn't pay? It seems that around 2011 you could still be a bronze-level hosting provider without payment or an affiliate commission.

So when did they sell out? 2010-2011, it would seem. If you look at the 2010 version of the hosting advertising document, they still demanded affiliate links for tier 1 and tier 2, but at least they pretended to care with language like "Retains a high Better Business Bureau rating and consistent positive reviews around the Internet," and being a member of the Drupal organization didn't cost thousands. Tier 3 seems to have been the bronze equivalent, and free.

So since approximately 2010, quality hasn't even factored into their 'great choices.' What does factor into it today? Whether you're a member of the Drupal Hosting Supporter Program, which is just another fee. They even do a security test!

drupal_hosting_support_program

Oh right. You're not required to pass a Drupal security test to become a member. Thanks for your money; here, have a badge, you're awesome! That payment just ensures you get priority to pass us even more money for your advertisement to be masqueraded as a recommendation (but not really, since we say you're great yet don't endorse you).

Update (4/21/2015): I managed to find Drupal's 990 tax filings. It's fun to see that in 2010 there was zero hosting affiliate income. 2011 had $32,701 and 2012 had $219,824 (the 2011 and 2012 figures combined hosting and advertising revenue). 2013 had $247,927 in hosting affiliate revenue.

tl;dr: So selling out pays out pretty nicely.


wordpressorglogo

wordpress_donate

WordPress.org recommends BlueHost, DreamHost and Laughing Squid. They have for years. Apparently nothing has changed in the WordPress hosting space over that span. Except that there are dozens of companies specializing in WordPress hosting. BlueHost and DreamHost are only just joining the managed WordPress hosting game, and quite late. But these have been the best and brightest for years? Sorry, that's just not true, and we've got performance data and real consumer opinions to back that statement up. Less than 50% of the opinions we see on BlueHost are favorable. DreamHost isn't much better at 55%.

One thing I am sure of, though, is that they are probably pretty good at paying to be listed in places. BlueHost is listed first on both Drupal and WordPress. I can't fathom what sort of money that costs when Drupal is 2% of the CMS market and WordPress is 23.5%. Using some old Drupal numbers: $7,000/month x (23.5 / 2) = $82,250/month. That's ~$1,000,000 per year based on some napkin math.

Need more math? Take a look at the 10-K filing that Endurance International Group (parent company of BlueHost and many other brands) made with the SEC. "During the year ended December 31, 2014, the Company made a strategic investment of $15.0 million in Automattic, Inc. (“Automattic”), an entity that provides content management systems associated with WordPress. The investment represents less than 5% of the outstanding shares of Automattic and better aligns the Company with an important partner."

How important is that relationship? "In addition to word-of mouth, PPC and reseller and referral channels, we have also entered into strategic partnerships, such as our partnership with Google through the “Get Your Business Online” initiative in the United States, India, Africa and Southeast Asia and our strategic alliance with WordPress, which help us reach additional subscribers." It's mentioned as one of two partnerships that they called out specifically by name at the top of their filing. Pretty damn important.

But is that even plausible? Sure. More gems from the 10-K filing: "The Company engages in sales and marketing through various online marketing channels, which include affiliate and search marketing as well as online partnerships. The Company expenses sales and marketing costs as incurred. For the years ended December 31, 2012, 2013 and 2014, the Company’s sales and marketing costs were $83.1 million, $117.7 million and $146.8 million, respectively." So between PPC, affiliates and partnerships, they spent nearly $150 million last year.

Update (April 2, 2015): It's been pointed out that WordPress and Automattic are separate entities, but both are run by Matt. I've also managed to find the WordPress Foundation's 990 document for 2013. It shows $848,925 in contributions/grants. My best guess is that this is mostly the web hosting recommendation 'donations.' If I am correct, my napkin math doesn't seem too bad, considering that number more than doubled from 2012's $357,451. I can't wait to see 2014's numbers; if the trend continues, they should definitely clear the one million dollars I guessed.
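To make that napkin math and the 990 trend easy to check, here's a quick Python sketch using only the numbers already cited; the WordPress.org listing price is an extrapolation from Drupal's old rate card, not a disclosed figure:

```python
# Napkin math from the figures cited above; the WordPress.org estimate is an
# extrapolation from Drupal's old rate card, not a disclosed price.
drupal_top_spot = 7_000                    # $/month, Drupal's old published top-spot rate
drupal_share, wordpress_share = 2.0, 23.5  # % of the CMS market

wp_monthly_estimate = drupal_top_spot * (wordpress_share / drupal_share)
print(f"Estimated WordPress.org top spot: ${wp_monthly_estimate:,.0f}/month "
      f"(~${wp_monthly_estimate * 12:,.0f}/year)")  # $82,250/month, ~$987,000/year

# WordPress Foundation contributions/grants from its 990 filings
contributions = {2012: 357_451, 2013: 848_925}
growth = contributions[2013] / contributions[2012]
print(f"2012 -> 2013 contributions grew {growth:.2f}x")  # ~2.37x, i.e. more than doubled
```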

This is why we can't have nice things

Do I blame BlueHost (and EIG) or any of the other companies paying to be listed and get these deals? Nope. I've even privately told friends running companies in these spaces that it's probably worth it for them to pay to get listed. Competition is fierce, and whoever has the most customers and reach is winning the game. Ask GoDaddy or EIG. They will go to whatever lengths they need to in order to get new clients, and that's their job. I expect that behavior from the web hosting companies.

I am disappointed in WordPress and Drupal though. They've sold out their communities for what is presumably a fairly large sum of money. At least call it what it is and be transparent that these are ads, not real recommendations, instead of using intentionally misleading language. The best and brightest and great companies these are not. These are the companies that paid you the most money.

*So where does that leave people looking for honest web hosting reviews?

Here at Review Signal. Of course. We have a vested interest in telling this story, and if you listened to the first thing I wrote, you've got a healthy dose of skepticism (great!) about Review Signal too. Web hosting reviews are bullshit. That's why I started this site, and the only way to cut through bullshit is with transparency. Every algorithm we use to display and rank companies is published in our How It Works section. We also link every single review to its original source (on Twitter) so you can verify who said it and, if you're really interested, figure out their intent. We publish how and what we classify as positive and negative reviews. So what we're left with is Trust, But Verify. Our entire system can be verified by anyone with enough will to go through it.

Is it perfect? Nope. Given the scale of data we process, it's almost entirely automated, and automated classifiers make mistakes. Luckily, our users report mistakes, we fix them, and our system tries to learn from them, getting better and better over the long run.
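As an illustration only (this is not Review Signal's actual code or algorithm, and the hosts and labels below are made up), a transparent review-based ranking can be as simple as counting classified reviews per company and publishing the positive share, with every review kept linkable back to its source:

```python
# Illustrative sketch only -- not Review Signal's actual algorithm.
# Assumes each review has already been classified as positive or negative.
from collections import Counter

reviews = [  # (company, classification) pairs; made-up placeholder data
    ("HostA", "positive"), ("HostA", "negative"), ("HostA", "positive"),
    ("HostB", "negative"), ("HostB", "negative"), ("HostB", "positive"),
]

totals, positives = Counter(), Counter()
for company, label in reviews:
    totals[company] += 1
    if label == "positive":
        positives[company] += 1

# Rank by share of positive reviews; because the inputs stay linked to their
# sources, anyone can re-run the same math and verify the ranking.
for company in sorted(totals, key=lambda c: positives[c] / totals[c], reverse=True):
    share = positives[company] / totals[company]
    print(f"{company}: {share:.0%} positive ({totals[company]} reviews)")
```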

We also have affiliate deals with most companies. That's the most common complaint against us. It's impossible to compete for free and maintain this technology without making some money. Affiliate deals can be corrupting when they pay so much, but our belief is that transparency protects against us manipulating the rankings based on affiliate pay. Currently, the highest rated company on Review Signal, FlyWheel, doesn't even have an affiliate program. I hope for my financial sake that they do open one in the future, but will that change FlyWheel's ranking? Not one bit. (Update: FlyWheel did announce their affiliate program and Review Signal was just accepted into it.)

If you're ready to see what honest reviews look like, check out our web hosting reviews.

Kinsta WordPress Hosting Review

kinsta_logo_dark

This post is based on WordPress Hosting Performance Benchmarks (2014).


Overview

Kinsta is yet another newcomer in our testing with something to prove, and it easily shot to the top of our performance charts. Kinsta's plans have changed quite a bit since we tested them. When our testing was done, they offered a $27/month plan. However, they've since gone up-market, and their cheapest plan is now $157/month. It seems they're targeting people who want serious performance.

The Plan

All testing was done on a shared account, which is no longer available. The plan tested allowed 1 WordPress site and included 1 GB of SSD disk space and 50 GB of bandwidth, and cost $27/month.

Performance

LoadStorm

The first performance test was done with LoadStorm. Kinsta made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see Kinsta's result in this graph (click on it to play with the interactive results):

Load-Storm-Kinsta-2000


Kinsta aced the LoadStorm test. It had zero errors and one of the fastest average response times at 316ms. Kinsta also had the absolute lowest peak response time at 942ms. That's an astonishing feat: over 30 minutes, Kinsta served nearly 250,000 requests and not a single one took over a second to be delivered. Amazing.
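To put those LoadStorm numbers in perspective, here is a quick back-of-the-envelope calculation using the approximate figures above:

```python
# Rough throughput math from the approximate LoadStorm figures above.
requests = 250_000       # ~requests served during the test
duration_min = 30        # test length in minutes
concurrent_users = 2_000

print(f"~{requests / (duration_min * 60):.0f} requests/second sustained")  # ~139 req/s
print(f"~{requests / concurrent_users:.0f} requests per simulated user")   # ~125 each
# And with a 942ms peak, every one of those requests came back in under a second.
```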

Blitz

The second load test that was run on Kinsta was Blitz. Blitz was used to test cached performance. It simply requested the home page at rates from 1 to 2,000 times per second.
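Blitz itself is a hosted service, but conceptually a cached-page test just ramps up simple GET requests against the home page and watches errors and response times. The sketch below is only a crude stand-in for that idea (the URL is a placeholder and the step sizes are arbitrary), not how Blitz actually works:

```python
# Crude illustrative load ramp -- a placeholder for the idea behind a cached-page
# test like Blitz, not the real tool. The URL and step sizes are made up.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://example.com/"  # placeholder for the test site's home page

def hit(url):
    """Fetch the page once and return (status, elapsed seconds)."""
    start = time.time()
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            resp.read()
            return resp.status, time.time() - start
    except Exception:
        return None, time.time() - start

with ThreadPoolExecutor(max_workers=200) as pool:
    for burst in (10, 50, 100, 200):  # requests per step, ramping up
        results = list(pool.map(hit, [URL] * burst))
        errors = sum(1 for status, _ in results if status != 200)
        avg_ms = 1000 * sum(t for _, t in results) / len(results)
        print(f"{burst} requests: {errors} errors, {avg_ms:.0f} ms average")
```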

Blitz-Kinsta-2000

I can't draw lines this straight. The response time was flat. As you would expect from a company that aced the cache-busting test, they didn't struggle in the slightest. Full Blitz Results (PDF)

Uptime

Two third-party uptime monitoring services (StatusCake and UptimeRobot) tracked the test site for a month. The results for Kinsta were perfect. 100% uptime according to both sources.

WebPageTest

“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. Kinsta was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Load times in seconds:
Company | Dulles, VA | Miami, FL | Denver, CO | Los Angeles, CA | Average
Kinsta | 0.759 | 0.752 | 0.947 | 0.592 | 0.7625

Kinsta had the second fastest average response time of all the companies we tested. No issues with this test in the slightest.

Conclusion

Kinsta, a newcomer to our testing, jumped straight to the top of the performance tiers. Kinsta's performance was amazing in the LoadStorm 2,000 logged-in user test: they had the lowest peak response time and zero errors over a 30-minute test. They didn't struggle with any test whatsoever and showed zero downtime. Kinsta's performance was undoubtedly top tier.

Visit Kinsta

kinsta_logo_dark

WebSynthesis WordPress Hosting Review

websynthesis-big

This post is based on WordPress Hosting Performance Benchmarks (2014).


Overview

WebSynthesis [Reviews] had an extremely strong showing in our first round of testing once I got past a security issue. This time around, they managed to defend their status as a top-tier WordPress web host.

The Plan

All testing was done on a VPS account. The plan tested had 2 GB of RAM, 40 GB of disk space, 650 GB of bandwidth and an allowance of 20,000 visitors/day, and cost $97/month.

Performance

LoadStorm

The first performance test was done with LoadStorm. WebSynthesis made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see WebSynthesis's result in this graph (click on it to play with the interactive results):

Load-Storm-WebSynthesis-2000


WebSynthesis stayed under the 0.5% error-rate threshold, but it was close. This grueling 2,000-user test really put a strain on the server, as you can see from the spikes, but it held for 30 minutes without failing.

Blitz

The second load test that was run on WebSynthesis was Blitz. Blitz was used to test cached performance. It simply requested the home page at rates from 1 to 2,000 times per second.

Blitz-WebSynthesis-2000

WebSynthesis was better than flat; there is a slight downward trend in response time. WebSynthesis led the pack, again, delivering 57,776 hits in one minute with a single error. That was the best result of any company on this test. Full Blitz Results (PDF)
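The headline numbers in that Blitz result are easy to sanity-check:

```python
# Quick sanity check of the WebSynthesis Blitz figures quoted above.
hits, errors, duration_s = 57_776, 1, 60

print(f"~{hits / duration_s:.0f} hits/second")  # ~963 hits/second
print(f"{errors / hits:.5%} error rate")        # ~0.00173%, effectively negligible
```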

Uptime

Two third-party uptime monitoring services (StatusCake and UptimeRobot) tracked the test site for a month. The results for WebSynthesis were 100% uptime according to both sources, again.

WebPageTest

“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. WebSynthesis was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Load times in seconds:
Company | Dulles, VA | Miami, FL | Denver, CO | Los Angeles, CA | Average
WebSynthesis | 0.407 | 0.835 | 0.982 | 1.024 | 0.7812

WebSynthesis handled this test fine. In fact, they had the single fastest average page load from a single location of any company at 0.407 seconds from Dulles, VA.

Conclusion

WebSynthesis [Reviews] was teetering on the edge of the LoadStorm error threshold (0.5%), but they stayed under it and handled the test quite well. They also had no weird security issues this time around, and WebSynthesis led the pack on Blitz testing. They went from 871 hits/second last time to 963 hits/second this time, leading every provider on the Blitz tests with a whopping 1 error to boot. Sprinkle in some perfect uptime numbers and it's clear WebSynthesis is still a top-tier provider that is continuing to get better.

Visit WebSynthesis

websynthesis-big


Interested in seeing which web hosting companies people love (and hate!)? Click here and find out how your web host stacks up.