Tag Archives: a small orange

Review Signal’s Best Web Hosting Companies in 2016

2016 Year in Review

I like to take this opportunity to look back at the year and see how Review Signal has changed. This past year we added ~36,000 new reviews and one new company: WebFaction. 49.6% of all reviews were positive, while 52.1% of unique reviews were positive. What is interesting about the difference is that people with negative things to say were more likely to send multiple negative messages; as a whole, more individual people said positive things than negative.

This year was also full of interesting articles that took advantage of our unique position in the web hosting review space. The WordPress Hosting Performance Benchmarks (2016) was the biggest hit as usual. It grew massively in size/scope and tested companies across multiple price tiers up to Enterprise WordPress Hosting.

I also wrote about the Dirty, Slimy Secrets of the Web Hosting Review Underworld, and tracked some major changes in The Rise and Fall of A Small Orange and The Sinking of Site5, which followed Endurance International Group acquisitions and how their ratings fell post-acquisition. A Small Orange's fall from grace even caused the first ranking algorithm update in Review Signal's history.

Best Shared Hosting 2016 – SiteGround [Reviews] (74.2%)

Best Specialty Hosting 2016 – FlyWheel [Reviews] (83.7%)

Best Managed VPS Hosting 2016 – KnownHost [Reviews] (80.9%)

Best Unmanaged VPS Hosting 2016 – Digital Ocean [Reviews] (71.3%)

Best Support 2016 – SiteGround [Reviews]  (80.81%). KnownHost [Reviews], LiquidWeb [Reviews], WiredTree [Reviews] all tied for second at 80% (WiredTree was acquired by LiquidWeb in 2016).

A big congratulations goes out to all of this year's winners.

The Sinking of Site5 – Tracking EIG Brands Post Acquisition

"You'll notice their ratings, in general, are not very good with Site5 (their most recent acquisition) being the exception. iPage was acquired before I started tracking data. BlueHost/HostMonster also had a decline, although the data doesn't start pre-acquisition. JustHost collapses post acquisition. NetFirms has remained consistently mediocre. HostGator collapses with a major outage a year after acquisition. Arvixe collapses a year after being acquired. Site5 is still very recent and hasn't shown any signs of decline yet." - The Rise and Fall of A Small Orange, January 2016

Review Signal Rating Calculated Pos/(Pos+Neg), without duplicate filtering (January 2016)

That's what I wrote at the beginning of 2016 as I watched A Small Orange's rating collapse, in a fairly popular post called The Rise and Fall of A Small Orange, which documented ASO's rise and fall along with the fall of many other EIG brands. One thing I mentioned was the recent acquisition of Site5 (and Verio), which had a fairly good rating on Review Signal at the time of acquisition. The trend seemed to be that ratings dropped roughly a year post-acquisition.

Site5 ~ 1 Year Later

The acquisition of Site5 was announced in August 2015. Here's the updated EIG brand tracking graph. One thing to note: it now uses the new rating algorithm, which has a built-in decay function to weight older reviews less. The new graph recalculates each point in time as if the algorithm had always been in place, so there will be some differences from the original graph (which prompted the change in algorithm). The difference is minimal for most brands; only when there is a major change in sentiment does the new algorithm show the change more quickly. Full details about the change can be read in Review Signal Ranking Algorithm Update.

eig_brand_review_signal_ratings_2016

What you can see is that Site5's reputation remained relatively stable until about April 2016 and then began a slow but steady decline, recently dipping below 50% for the first time. As with nearly every brand except A Small Orange, the decline happened within a year.

Since the original post there also hasn't been much movement in any other brands beyond Site5 crashing and A Small Orange continuing to slide downward. Verio didn't see a dip post-acquisition, but it had a pretty low rating to start with that put it in the bottom half of EIG brand ratings already.

Why Do EIG Brands Go Down Post Acquisition?

The longer I am in this industry, the more stories I hear. A Small Orange was such an interesting exception, and I've heard a lot about it from a lot of people. Its relative independence and retention of staff seemed to be the key to maintaining a good brand even within the EIG conglomerate.

Site5 offers what I imagine is more business-as-usual in the EIG world: cut staff, migrate to EIG and maximize profit (in the short term). Site5's founder, Ben, reached out to a competitor, SiteGround, and arranged for them to hire a large number of Site5 staff that EIG had no plans on keeping, according to SiteGround's blog. A very classy move from the former CEO and a seeming win for SiteGround, one of EIG's larger hosting competitors. I saw similar behavior when A Small Orange started to go downhill: long-time staff left and staff from other EIG brands showed up.

Beyond simple cost cutting, you have to wonder why you would spend all that money acquiring brands with lots of customers, good reputations and talented staff who are obviously keeping the operation running successfully, only to get rid of nearly all of that except the customers. And once you gut the staff, the customers seem to notice, because it certainly shows up in the data I track.

Conveniently, EIG just published their Q3 2016 10-Q.

We have certain hosting and other brands to which we no longer allocate significant marketing or other funds. These brands generally have healthy free cash flow, but we do not consider them strategic or growth priorities. Subscriber counts for these non-strategic brands are decreasing. While our more strategic brands, in the aggregate, showed net subscriber adds during the quarter ended September 30, 2016, the net subscriber losses in non-strategic brands and certain gateway brands contributed to a decrease in our total subscribers of approximately 42,000 during the quarter. We expect that total subscribers will continue to decrease in the near term.

Overall, our core hosting and web presence business showed relatively slow revenue and subscriber growth during the first nine months of 2016. We believe that this is due to flat marketing expenditures relative to 2015 levels on this business in the first half of 2016 as a result of our focus on gateway products during that period, and to trends in the competitive landscape, including greater competition for referral sources and an increasing trend among consumers to search for web presence and marketing solutions using brand-related search terms rather than generic search terms such as “shared hosting” or “website builder”. We believe this trend assists competitors who have focused more heavily than we have on building consumer awareness of their brand, and that it has made it more challenging and more expensive for us to attract new subscribers. In order to address this trend, during the third quarter of 2016, we began to allocate additional marketing investment to a subset of our hosting brands, including our largest brands, Bluehost.com, HostGator and iPage. We plan to continue this increased level of marketing investment in the near term, and are evaluating different marketing strategies aimed at increasing brand awareness.

So the result of their current strategy this past quarter has been a net loss of 42,000 subscribers. They say their strategic brands in aggregate had a net subscriber increase, named the largest ones (BlueHost, HostGator, iPage), and will focus on a subset of brands going forward. But the phrasing would seem to imply that some of the strategic brands experienced losses as well. It also means the non-strategic brands lost more than 42,000 customers, pulling total subscribers down by a net 42,000 last quarter.

To cap it all off, I got one of the most surprising emails from Site5 a couple of days ago.

We wanted to let you know that we’ve decided to terminate the Site5 Affiliate program as of November 30th, 2016.

We want to thank you for your support of Site5, especially during our most recent move into Impact Radius, and we hope that you’ll consider promoting another one of Endurance’s other programs.

I guess Site5 isn't being considered a strategic brand if they are killing off the affiliate channel on it entirely, right after a big migration from Site5's custom affiliate program to Impact Radius. They also asked that affiliates promote HostGator now, which certainly fits in the strategic brand category.

It's extremely disappointing to see this trend continue of brands collapsing after a year in EIG's hands. What will be interesting going forward is that EIG hasn't acquired any new hosting brands for a while. They seem to be focused on their existing brands for now. I wonder if that will mean we will see any noticeable positive change or improvements in existing brands (or at least some of the strategic brands).

WordPress Hosting Performance Benchmarks (2016)

LoadStormLogo

Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

This is the fourth round of managed WordPress web hosting performance testing. You can see the original, the 2014 version and the 2015 version.

Companies Tested

A2 Hosting [Reviews]
BlueHost [Reviews]
CloudWays [Reviews]
Conetix
DreamHost [Reviews]
FlyWheel [Reviews]
GoDaddy [Reviews]
Incendia Web Works
Kinsta
LightningBase
LiquidWeb [Reviews]
MediaTemple [Reviews]
Pagely [Reviews]
Pantheon [Reviews]
Pressable (Formerly ZippyKid)
Pressed.net
Pressidium
Pressjitsu
PressLabs
Hosting Agency (German)
SiteGround [Reviews]
Traffic Planet Hosting
WordPress.com VIP
WPEngine [Reviews]
WP.land
WPOven.com

Companies that didn't participate this round but did in previous rounds: WebHostingBuzz, WPPronto, Nexcess, A Small Orange [Reviews] and WebSynthesis [Reviews].

Every plan was donated by the company for testing purposes with the strict stipulation that it would be the same as if any normal user signed up. There is a notes section at the bottom of this post that details the minutiae of changes made to plans. Nearly every single company had security systems that I had to get around, so they worked with me to make sure the testing went through properly. Load testing often looks like an attack, and it's the only way I can do these tests.

The Products

This year is a bit different from years past, where every company and plan competed against one another. When I started, the price gap was from $5/month to $29/month. Last year the gap was $5.95 to $299. I was only testing entry-level plans, but the market has changed dramatically since I first got started. Today there is demand at many different price points, and lots of companies have gone upscale, with WordPress.com VIP at the top of the price bracket starting at $5,000/month. The only logical way to break things up was by price bracket. Below you will see the brackets and which companies participated; specific details will be included in each bracket's write-up.

 

<$25/month: A2 Hosting, Bluehost, DreamHost, Flywheel, GoDaddy, Incendia Web Works, Lightning Base, Media Temple, Pressed, HostingAgency.de, SiteGround, Traffic Planet Hosting, WP.land

$25-50/month: A2 Hosting, Conetix, Lightning Base, Pantheon, Pressable, Pressjitsu, SiteGround, WP Engine, WP.land

$51-100/month: Bluehost, Cloudways (AWS, Google, DigitalOcean, Vultr), Kinsta, Lightning Base, LiquidWeb, Media Temple, Pagely, Pantheon, Pressable, Pressidium, SiteGround, WPOven

$101-200/month: A2 Hosting, Bluehost, Conetix, Kinsta, Liquid Web, Pressable, Pressidium, Pressjitsu

$201-500/month: Kinsta, Media Temple, Pagely, Pantheon, Pressable, Pressidium, Presslabs, SiteGround

$500+/month: Kinsta, Pagely, Pantheon, Pressable, Pressidium, WordPress.com VIP, WP Engine

 

Methodology

The question I tried to answer is how well do these WordPress hosting services perform? I tested each company on two distinct measures of performance: peak performance and consistency. I've also included a compute and database benchmark based on a WordPress plugin.

All tests were performed on an identical WordPress dummy website with the same plugins except in cases where hosts added extra plugins. Each site was monitored for approximately two months for consistency.

1. LoadStorm

LoadStorm was kind enough to give me resources to perform load testing on their platform and multiple staff members were involved in designing and testing these WordPress hosts. I created identical scripts for each host to load a site, login to the site and browse the site. Logged in users were designed to break some of the caching and better simulate real user load. The amount of users varies by cost.

2. Blitz.io

I used Blitz again to compare against previous results. This tests the static caching of the homepage. I increased the number of users based on monthly cost this time.

3. Uptime (UptimeRobot and StatusCake)

Consistency matters. I wanted to see how well these companies performed over a longer period of time. I used two separate uptime monitoring services over the course of a month to test consistency.

4. WebPageTest.org

WebPageTest with 9 runs, first view only, native connection. I tested from Dulles, Denver, Los Angeles, London, Frankfurt, South Africa, Singapore, Shanghai, Japan, Sydney, Brazil.

5. WPPerformanceTester (free plugin on WordPress.org)

I created a WordPress plugin to benchmark CPU, MySQL and WordPress database performance. The CPU/MySQL benchmarks test compute power. The WordPress component tests actually calling $wpdb and executing insert, select, update and delete queries.

 

Notes - Changes made to Hosting Plans

A2 - VPS Servers can't install WordPress out of the box without extra payment for Softaculous. Disabled recaptcha.

Conetix - disabled WordFence and Stream plugins.

SiteGround - fully enabled the SuperCacher plugin

GoDaddy - 24 database connection limit increased if you notify them of heavy load

CloudWays - disabled WordFence

Review Signal Ranking Algorithm Update

In a recent article, The Rise and Fall of A Small Orange, it became quite apparent that our ranking algorithm here at Review Signal needed an update. Review Signal launched on September 25, 2012 which was almost 3.5 years ago. At launch, we had data from as early as 2011, which means this site's data is up to 5 years old today. It wasn't an issue back then because the oldest data would be at most, two years old and still relevant.

Today, our older data isn't really as relevant as it once was. A Small Orange exposed that weakness. It was an issue I knew I would have to deal with eventually, but nobody has really made the system fail until now. Since writing about The Rise and Fall of A Small Orange, I've been working hard to figure out a good way to update the ranking algorithm.

The solution I have come up with is a decay function. Older reviews will be worth a fraction of their more recent counterparts.

(1/(ABS(TIMESTAMPDIFF(YEAR, NOW(), max(timestamp)))+1))

This is the mathematical formula that Review Signal will now be using to calculate the value of a review.

An English explanation would be that for every year old the review is, it becomes worth one divided by years old. A one year or less old review would be worth 1/1 or 1.00. A two year old review would be worth 1/2 or 0.5. A 3 year old review would be worth 1/3 or .33 and so on.

This allows old reviews to still be a part of a company's rankings, but with a strong bias towards more recent reviews so that if a company starts performing poorly, it will decline faster in the rankings.
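As a rough illustration of the idea, here is a minimal Python sketch of the decay weight and a decayed rating. It approximates MySQL's TIMESTAMPDIFF with 365-day years and uses a made-up review structure; it is an assumption-laden sketch, not Review Signal's actual code:

```python
from datetime import datetime

def decay_weight(review_time: datetime, now: datetime) -> float:
    # Complete years old, approximated with 365-day years; a review under
    # a year old weighs 1/1, one in its second year 1/2, and so on,
    # mirroring the SQL formula above.
    years_old = (now - review_time).days // 365
    return 1.0 / (years_old + 1)

def decayed_rating(reviews, now: datetime) -> float:
    # reviews: iterable of (is_positive, timestamp) pairs (hypothetical shape).
    pos = neg = 0.0
    for is_positive, ts in reviews:
        weight = decay_weight(ts, now)
        if is_positive:
            pos += weight
        else:
            neg += weight
    return pos / (pos + neg)
```

Under this weighting a review about a year and a half old counts for 0.5, so a company's old glory days fade while a burst of fresh negative reviews moves the rating quickly.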

Check out the full chart of how these changes affect the rankings and ratings of every published company.

Perhaps the most interesting column is how the Overall Ranking changed because of this algorithm update which I have included below. A Small Orange has the biggest change by a wide margin. HostWay lost a lot as well, but it was already at the bottom and the difference between 36% (old) and 27% (new) isn't very meaningful when you only fall two ranking spots.

HostMonster, Arvixe, HostGator, JustHost, BlueHost and some other EIG brands falling a bit more isn't surprising. It does highlight how old reviews were keeping them slightly higher than they should be, but none were ranked particularly well.

WebSynthesis dropping was a bit of a surprise. It still has a decent rating at 62%, but it suffered a pretty substantial drop of 7 ranking places, from 10th to 17th.

On the other end, there is a lot less change upwards. However, Pagely got a nice little boost which jumped it 8 places upwards to 12th.

Then there's MochaHost, which has the dubious honor of jumping up one slot, to not be the absolute worst company we track, which is now Arvixe.

Name Overall Change
A Small Orange -10.57
Hostway -8.55
Host Monster -6.89
WebSynthesis -6.13
Arvixe -6.02
Linode -5.35
HostGator -5.24
LunarPages -5.15
ServInt -4.99
JustHost -3.52
BlueHost -3.09
NetFirms -3.01
IX Web Hosting -2.53
Flywheel -2.49
West Host -2.29
SingleHop -1.73
Verio -1.36
iPage -1.32
RackSpace -1.06
Hetzner -1
MediaTemple -0.97
1 and 1 -0.95
SiteGround -0.83
LiquidWeb -0.49
WPEngine -0.38
Heroku -0.24
Digital Ocean -0.24
Godaddy -0.2
Site5 0.2
Azure 0.48
SliceHost 0.81
AN Hosting 0.93
InMotion 0.96
Amazon 1.35
GoGrid 1.42
MidPhase 1.5
SoftLayer 2.12
Dream Host 2.13
WiredTree 2.4
KnownHost 2.65
HostDime 2.67
MochaHost 3.34
Pagely 4.9

Review Signal’s Best Web Hosting Companies in 2015

Another year, another mountain of data added to the largest web hosting review site. This year we added over 49,000 new reviews (a slight increase from the 45,000 last year). We added two new companies, Arvixe and Site5, both of which are now owned by EIG. We published our first WordPress plugin, WPPerformanceTester, built for our WordPress Hosting Performance Benchmarks, which we ran yet again with our largest batch of companies ever. We even got some outside validation when LiquidWeb published its internal NPS benchmarks, which matched their Review Signal rating very closely.

But the year ended on a somewhat sour note with The Rise and Fall of A Small Orange. It tells the story of ASO and the huge role they've played on this site, including winning at least one of these awards every year since inception. But not anymore. So without further ado...

Best Shared Web Host: LiquidWeb [Reviews]

2015-best-shared-hosting-liquidweb

 

Best Web Hosting Support: SiteGround [Reviews]

2015-best-hosting-support-siteground

 

Best Specialty Web Hosting: FlyWheel [Reviews]

2014-best-specialty-flywheel

Best Unmanaged VPS: Digital Ocean [Reviews]

2014-best-unmanaged-vps-digitalocean

Best Managed VPS: LiquidWeb [Reviews]

2015-best-managed-vps-liquidweb

 

For the second year in a row FlyWheel [Reviews] has set the bar in terms of how high a company's rating can be. They won the best specialty web hosting award with their managed WordPress hosting.

For the first time ever, someone besides A Small Orange [Reviews] has won best shared web hosting. A huge congratulations to LiquidWeb [Reviews]! They also managed to pick up the Best Managed VPS hosting award.

Digital Ocean [Reviews] continues its massive growth and popularity, and has won Best Unmanaged VPS provider for the third year in a row.

Finally, SiteGround [Reviews] returned to our awards and won Best Web Hosting Support, an honor they last received in 2013.

The Rise and Fall of A Small Orange

How did a small web hosting company have such a huge impact on Review Signal?

The Early Days

This story begins in October 2011, a year before Review Signal launched. Review Signal had been collecting data for months and early ratings data was starting to become meaningful. A tiny company was at the top of the rankings. A Small Orange.

The most worrisome part of this revelation was that A Small Orange did not have an affiliate program. An affiliate program isn't a requirement for a listing on Review Signal, but the business depends on earning at least something from the companies it recommends.

However, after investing years of work, if the top rated company ended up not having an affiliate program, the business was likely sunk before it even started. So I inquired early and heard back from the CEO at the time, “we don't have an affiliate program and at the moment, we have no plans for one.” This was a potential death knell because the entire business model relies on making at least some money, even though I assumed it would be much lower than my competitors who simply sell their rankings to the highest bidder. But as any entrepreneur knows, almost everything is negotiable if you understand what the other person really wants and why. After talking further with the CEO, he explained his issue with web hosting review websites, “they typically have a pay for ranking sort of model and do it either through set rates or affiliate payouts. It varies. The economics at ASO don't really work out for a standard affiliate program.” A Small Orange didn't want to play the game that every other review site out there did. Pay to play, quality be damned.

This CEO hated the games being played as much as I did.

That was all the opportunity I needed. Review Signal's mission has been to fight against that very same model and I knew I had an early ally who could make this work. We ended up working out a deal to pay three months of whatever plan someone purchased and he put a cap on my potential earnings at $250 before he would review the performance. Considering the most popular plans were $25/year and $5/month, this wasn't going to earn a lot, but at least it might start covering some of the very basic costs. The first month I earned $52.38 on 6 sales for an average of $8.73 per sale with A Small Orange.
At least it was something. And a foot in the door was all I needed to prove this crazy idea called Review Signal might have some legs. A Small Orange opened that door and for that our histories will forever be intertwined.

The Good Times

The next few years were very good. I was their first affiliate. I was their biggest affiliate for many years, bringing in over a thousand new customers. I got to know many of the staff and would consider some of them friends. And A Small Orange continued to be the best rated shared hosting company through 2014. Everyone was happy - their customers, the company and Review Signal. I was happy to recommend them based on the data showing they had incredibly satisfied customers. I had people tell me personally they were very happy with them after signing up because of the data I publish here at Review Signal.

2014-01-20 13.34.07

Free Swag and Annual Thank You Card from ASO

The EIG Acquisition

A Small Orange was quietly acquired in 2012. They were acquired by a behemoth in the hosting industry called Endurance International Group (NASDAQ: EIGI) which owns dozens of brands including some of the largest and most well known hosting companies: Blue Host, Host Gator, Host Monster, Just Host, Site5, iPage, Arvixe and more.

EIG has a very bad reputation in the web hosting world. Ask most industry veterans and they will tell you to run for the hills when it comes to EIG. The oft-repeated story is that EIG acquires a hosting company, migrates it to their platform, and the quality of service falls off a cliff. Perhaps the best example is the migration to their Provo, UT data center, which had a catastrophic outage in 2013. This outage was huge: the impact dropped four of EIG's largest brands many percentage points in the Review Signal rankings in a single day. And these major outages continued to happen as recently as November 2015.

In a recent earnings call with shareholders, EIG CEO Hari Ravichandran talked about two recent acquisitions and their plans for them. “We expect to manage these businesses at breakeven to marginally profitable for the rest of the year as we migrate their subscriber bases onto our back-end platform. Once on platform, we expect to reach favorable economics and adjusted EBITDA contribution consistent with our previous framework for realizing synergies from acquisitions.”

The EIG Playbook

EIG's playbook has been to acquire web hosting brands, migrate them to their platform and 'reach favorable economics.' They've been doing it for years, and it seems to be working well enough for investors to continue putting money into the company. M&A to grow subscriber bases and economies of scale to lower costs: it's a very simple and straightforward business plan. It doesn't account for anything beyond spreadsheet math, though, such as brand value and customer loyalty. And according to all the data we've collected over years and multiple acquired brands, those are certainly diminished and lost post-acquisition. It's callous business accounting, but it makes perfect sense in the race-to-the-bottom industry that is commodity shared hosting.

Review Signal Rating Calculated Pos/(Pos+Neg), without duplicate filtering

You can see all the EIG brands tracked here on Review Signal in the chart above and their acquisition dates below:

iPage - 2009. BlueHost/HostMonster - 2010. JustHost - Feb 2011. NetFirms - March 2011. HostGator - June 2012. A Small Orange  - July 2012. Arvixe - November 2014. Site5 - August 2015.

You'll notice their ratings, in general, are not very good with Site5 (their most recent acquisition) being the exception. iPage was acquired before I started tracking data. BlueHost/HostMonster also had a decline, although the data doesn't start pre-acquisition. JustHost collapses post acquisition. NetFirms has remained consistently mediocre. HostGator collapses with a major outage a year after acquisition. Arvixe collapses a year after being acquired. Site5 is still very recent and hasn't shown any signs of decline yet.

The Expected Decline of A Small Orange

So nearly every industry veteran I talked to expected A Small Orange to collapse immediately after acquisition. Except me. I was, am and will continue to be willing to give a company the benefit of the doubt until I am shown evidence.

For years post-acquisition, people were saying ASO's demise was right around the corner. For years, I waited for that evidence and for the prophecy to come true. But it didn't happen.

It often took EIG less than a year to ruin a brand. We don't have to look further than Arvixe for an example of this, which was acquired in November 2014. Today, Arvixe has one of the lowest ratings of any company on Review Signal at a shockingly low 27%.

But A Small Orange continued to chug along. It didn't hear the naysayers or believe itself to be a victim of the EIG curse. Instead, ASO was the best shared host for years post-acquisition. It seemed to have a fair level of autonomy from the EIG conglomerate. The staff I knew there, remained there, and all indications showed they were still the same company.

Until it wasn't.

The Fall of A Small Orange

A Small Orange Historical Rating

A Small Orange Historical Rating

The chart above shows Review Signal's rating of A Small Orange. The blue line is the rating as calculated by [Positive Reviews / (Positive Reviews + Negative Reviews)]. The red line calculates the rating from only the past 12 months of data. It's slightly different from Review Signal's actual calculation because I am not filtering out duplicates for this quick analysis. The difference for A Small Orange is that when you remove the duplicates, the year 2015 had a 43% rating, indicating there were quite a few people writing multiple negative things about A Small Orange.

Sometime in 2015, the A Small Orange that thousands of people trusted and raved about became another EIG brand. I tried to get the inside story. I reached out to the former CEO who sold the company to EIG and became an executive there for a couple years post acquisition. He reached out on my behalf to EIG's PR team to see if they would participate in this story. Both declined to participate.

So, I'm left to speculate on what happened at A Small Orange based on what's been publicly stated by their CEO and watching their strategy unfold for years across many companies/brands. My best guess is EIG finally got involved with A Small Orange. They used to be a distributed/remote team, now all positions they are hiring for are listed as in Texas (their headquarters). I saw a HostGator representative get moved over to ASO's team, so the internal staff was changing and people were being moved from brands with less than stellar reputations to ASO. The former CEO left mid-2014, which likely left a leadership and responsibility gap. ASO could probably run on auto pilot through the end of 2014, but over time having no champion for your brand in upper management eventually will come back to hurt the brand when decisions get made based on simple economics.

Once 2015 rolled around, the service had noticeably declined. The overall rating for A Small Orange in 2015 was 43% (only using 2015 data). For years, they had been in the 70's. It also ended the year with a massive outage for most, if not all, of their VPS customers which has been going on since Christmas. I personally received multiple messages from users of this site asking about what was happening and alerting me to this decline in service quality.

ASO was also responsible for the Arvixe migration, which went very poorly and caused the Arvixe brand to tank. I'm not sure why EIG doesn't have a dedicated migration team to handle this type of move, considering how many acquisitions they go through and how large a role acquisitions play in their growth strategy. But that's a whole separate issue.
It's with great disappointment that I have to admit, the A Small Orange that played such a huge role in the founding and success of Review Signal and provided a great service to many thousands of customers is dead. It's become another hollow EIG brand where the quality has gone down to mediocre levels. And that seems perfectly ok to them, because it's probably more profitable for their bottom line.

Going Forward

This story has had a profound impact on Review Signal. One thing it made painfully obvious is that the ranking algorithm needs its first update since inception. The current ranking treats every review equally, which was fine when this site launched, because time hadn't had any opportunity to be a factor yet. But as this site continues to move forward, I need to acknowledge that a significant amount of time has passed between launch and today. A review from the beginning of Review Signal isn't as relevant as one from this past week in determining the current quality of a web hosting company. A Small Orange right now shows up at around 64%, which is artificially high because of their long history of good service; the past year's decline is still small relative to that history and hasn't dragged the rating down yet. But it's painfully clear that it's not a 64% rating company anymore.

Another thing to note is that the graphs here all used a simpler calculation [Pos / (Pos + Neg)] without duplicate filtering. What this means is that the difference between the rating here and the actual rating on the live site is a measure of how repetitively people are being positive or negative about a company. If the rating here is higher than the published rating, it means people are saying, on average, more than one good thing about the same company. If the rating is lower (as in most if not all cases here), it means people are saying more than one negative thing about the company. I'm not sure if this will factor into a new algorithm, but it is something to consider. My intuition says it would hinge around 50%: companies above would likely have repeat supporters, and those below would have repeat detractors.
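To make that distinction concrete, here is a minimal Python sketch comparing the unfiltered rating with a simplified duplicate-filtered one that counts each person's positive or negative sentiment once. The `reviews` structure is hypothetical, and Review Signal's real duplicate filtering is more involved than this:

```python
def unfiltered_rating(reviews) -> float:
    # reviews: iterable of (author, is_positive) pairs, one per message.
    pos = sum(1 for _, is_pos in reviews if is_pos)
    neg = sum(1 for _, is_pos in reviews if not is_pos)
    return pos / (pos + neg)

def deduped_rating(reviews) -> float:
    # Count each (author, sentiment) combination only once, so one person
    # venting five times contributes a single negative review.
    unique = {(author, is_pos) for author, is_pos in reviews}
    return unfiltered_rating(unique)

# One repeat detractor drags the unfiltered rating down
# without changing the deduplicated one.
reviews = [("alice", True), ("bob", True),
           ("carol", False), ("carol", False), ("carol", False)]
```

Here `unfiltered_rating(reviews)` comes out at 0.4 while `deduped_rating(reviews)` is about 0.67: the unfiltered number sitting below the deduplicated one is exactly the repeat-detractor pattern described above.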

In the coming months I will try to figure out a better way to generate the ranking number, one that more fairly represents the current state of a company. My initial thought is to use some sort of time discounting, so that the older a review is, the less weight it carries in the rankings. If anyone has experience working with this or wants to propose/discuss ideas, please reach out - comment here, email me, or tweet @ReviewSignal.
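As a sketch of what time discounting could look like (purely illustrative, not a decided design): give each review a weight that halves every fixed interval, then take the weighted positive ratio. The one-year half-life and the review counts below are arbitrary choices for demonstration.

```python
from datetime import datetime, timedelta

# Illustrative sketch only — not Review Signal's actual algorithm.
# Each review's weight halves every `half_life_days`; the half-life
# value is an arbitrary choice for demonstration.
def time_weighted_rating(reviews, now, half_life_days=365):
    """reviews: list of (date, is_positive) tuples."""
    pos = neg = 0.0
    for date, good in reviews:
        weight = 0.5 ** ((now - date).days / half_life_days)
        if good:
            pos += weight
        else:
            neg += weight
    return pos / (pos + neg)

now = datetime(2016, 1, 15)
reviews = ([(now - timedelta(days=3 * 365), True)] * 80   # old praise
           + [(now - timedelta(days=90), False)] * 20)    # recent complaints
# The plain Pos / (Pos + Neg) ratio would be 80% positive; with a
# one-year half-life the recent negatives pull it down to roughly 37%.
print(time_weighted_rating(reviews, now))
```

A scheme like this would capture exactly the A Small Orange situation: a long history of praise would no longer prop up a rating against a recent wave of complaints.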

Site 5 Web Hosting Logo

Site5 Acquired by Endurance International Group (EIG)

Endurance International Group yesterday announced in their second quarter results that they acquired Site5 and Verio.

During the quarter, the company acquired assets of Verio and Site5. The total cash consideration for these acquisitions is expected to be approximately $36 million.

Via MarketWatch.

EIG continues to acquire hosting companies as a growth strategy and doesn't seem to plan on stopping any time soon. The hope is that Site5, which is rated as one of the better companies on Review Signal, operates more like A Small Orange, which was acquired in 2012 and continues to be rated very highly. Time will tell how it plays out; I will certainly be watching the data and trends.

That brings the list of EIG companies here on Review Signal to:

 

Endurance International Group – Profitable?

Endurance International Group is one of the largest web hosting companies and owns many of the brands you see in the consumer space. EIG owns A Small Orange, BlueHost, HostGator, HostMonster and JustHost, to name a few of its most well-known brands.

What caught my eye was an article on Nasdaq, where EIGI (EIG's stock ticker) is up and at an all-time high. A lot of analysts rate it a buy, and the price surge seems to indicate people are listening. But I'm not a financial adviser, nor am I interested in making stock recommendations.

What does interest me is web hosting, and considering that is the core of EIG's business, the underlying numbers are quite fascinating.

EIG had its first year with positive operating income: $629.85 million in revenue against $617.37 million in total operating expenses, leaving $12.48 million in operating income. However, they still weren't profitable, because they are paying off a lot of debt; EIG's net income was a loss of $42.82 million.

"Total subscribers increased by 91,000 in the fourth quarter. Average monthly revenue per subscriber rose 12% year over year to $14.78. For all of 2014, the number of subscribers rose 17% to 4.087 million and the average monthly revenue per subscriber increased 11% to $14.48." - according to the article on Nasdaq

$14.48 per month per subscriber is $173.76 per year per subscriber. It's easy to understand how they can pay such high commissions with those numbers. That figure also seems to be trending up, which is a good sign for the company's financial direction.

How does that compare to other companies?

I dug up an old GoDaddy S1 from 2014 [Godaddy Reviews] which states their average revenue per user for the trailing 12 months is $105 (it's fluctuated between $93-$105 over the past few years).

I also found Web.com's latest 10-K filing, which stated a monthly ARPU of $14.62, or $175.44 annually.
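As a quick sanity check on the annualization arithmetic used above (purely illustrative):

```python
# Annualize the monthly ARPU figures quoted above.
monthly_arpu = {"EIG": 14.48, "Web.com": 14.62}
annual_arpu = {name: round(m * 12, 2) for name, m in monthly_arpu.items()}
print(annual_arpu)  # {'EIG': 173.76, 'Web.com': 175.44}
# GoDaddy's S-1 reports a trailing-twelve-month figure of $105 directly,
# so no conversion is needed for its number.
```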

EIG and Web.com look very similar: both just reached positive operating income this year and have very similar revenue per subscriber. Web.com's filing states it pretty clearly: "The growth in average revenue per subscriber continues to be driven principally by our up-sell and cross-sell campaigns focused on selling higher revenue products to our existing customers as well as the introduction of new product offerings and sales channels oriented toward acquiring higher value customers."

It seems like common knowledge to anyone in the web hosting industry that these companies get users in the door cheap. Those ~$5/month hosting plans are obviously not the only thing being sold; it would seem they are able, on average, to roughly triple that monthly figure by selling other services.

So the question in my mind becomes what do those new products look like? We're seeing a jump into the managed WordPress hosting space. Is there actual innovation that's going to happen or are these big companies simply going to carve out some of the high margin services provided by niche providers? Is that going to be a win for consumers?

I don't have the answers, but I'm certainly interested to see how it plays out.

The Best Web Hosting Companies in 2014

It's always interesting to look back at a year and analyze what happened. 2014 was the second full year of operation for Review Signal. Four new companies were added to Review Signal: Azure, FlyWheel, Pagely and WebSynthesis. We added roughly 45,000 new reviews (oddly enough, about half as many as last year). We ran two massive performance testing reviews of managed WordPress hosting companies (1, 2).

So I finally got around to slicing and dicing the data exclusively looking at data collected in 2014 and here are the awards:

 

Best Shared Web Host: A Small Orange [Reviews]

2014-best-shared-asmallorange

Best Web Hosting Support: FlyWheel [Reviews]

2014-best-support-flywheel

Best Specialty Web Hosting: FlyWheel [Reviews]

2014-best-specialty-flywheel

Best Unmanaged VPS: Digital Ocean [Reviews]

2014-best-unmanaged-vps-digitalocean

Best Managed VPS: KnownHost [Reviews]

2014-best-managed-vps-knownhost

Newcomer FlyWheel [Reviews] has set the bar for how high a company can fly (I'm sorry!). When I introduced FlyWheel they had the absolute highest numbers I've ever seen, and they continue to be in a tier of their own. They do WordPress hosting and that is it, so maybe there is some advantage to specialization. They took the Best Specialty Hosting and Best Support awards this year.

For the third consecutive year, A Small Orange [Reviews] has the best shared web hosting.

Digital Ocean [Reviews] has become the fourth largest web hosting company in under two years, according to Netcraft. It's easy to understand why when they take home the Best Unmanaged VPS award for a second year in a row.

Finally, a newcomer to our awards list, KnownHost [Reviews], managed to take the Best Managed VPS award this year, beating out last year's winner, A Small Orange.

 

A Small Orange WordPress Hosting Review

This post is based on the WordPress Hosting Performance Benchmarks (2014).

asmallorange

Overview

A Small Orange [Reviews] has won numerous awards from Review Signal including Best Overall Web Host 2012, Best Shared Hosting Provider 2013 and Best Managed VPS Provider 2013. They've been consistently near the top of our rankings since the beginning. They stumbled a bit during our first round of WordPress testing.

But what differentiates a good hosting company from an average one? Accepting that there was a shortcoming and improving. A Small Orange did exactly that in our second round of testing.

The Plan

All testing was done on a Cloud VPS running ASO's WordPress LEMP stack. The VPS had 1 GB of RAM, 15 GB of SSD disk space and 600 GB of bandwidth, and cost $25/month.

Performance

LoadStorm

The first performance test was done with LoadStorm. A Small Orange made it to the final round of testing, where 2,000 concurrent users logged into WordPress and browsed the test site. The test was designed to measure non-cached performance by logging users into WordPress, and it caused many hosting setups to crumble. You can see ASO's result in this graph (click on it to play with the interactive results):

 

Load-Storm-A-Small-Orange-2000

A Small Orange handled the test without a single error and showed no signs of struggling with the load. There isn't much more to say: ASO handled LoadStorm with grace and ease.

Blitz

The second load test run on A Small Orange was Blitz, which was used to test cached performance. It simply requested the home page while scaling from 1 to 2,000 concurrent users.

Blitz-A-Small-Orange-2000

A Small Orange's Blitz results were about what the previous test would predict. It showed minimal signs of load around ~1,800 users, where response times spiked a bit but stayed under 150ms, which is barely noticeable. Overall, it was a fantastic performance. Full Blitz Results (PDF)

Uptime

Two third-party uptime monitoring services (StatusCake and UptimeRobot) tracked the test site for a month. The results for A Small Orange were perfect uptime in both cases. StatusCake also recorded a blazingly fast average response time of 23ms, which led the pack by a wide margin. In the uptime department, ASO had a flawless performance.

WebPageTest

“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. ASO was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Load times in seconds:

Company        | Dulles, VA | Miami, FL | Denver, CO | Los Angeles, CA | Average
A Small Orange | 1.443      | 0.801     | 0.836      | 0.64            | 0.93

There were absolutely no issues with their WebPageTest results; the site loaded very quickly, with a great average speed of under one second.

Conclusion

A Small Orange [Reviews] is one of the top tier WordPress hosting providers when looking at performance. ASO improved their LEMP stack since the last time I tested, and they never buckled in any test. Their staff was incredibly friendly (special thank you to Ryan MacDonald) and they've stepped up their performance game. The one thing that isn't quite there yet is the documentation/user experience; there are a lot of improvements they could make to render their LEMP stack more accessible to the less tech-savvy. All in all, the experience was in line with what I would expect from a company that has one of the highest support ratings on our site.

Visit A Small Orange and use the coupon code 'orangelover' for 15% off.

asmallorange