Category Archives: Data

The Sinking of Site5 – Tracking EIG Brands Post Acquisition

"You'll notice their ratings, in general, are not very good with Site5 (their most recent acquisition) being the exception. iPage was acquired before I started tracking data. BlueHost/HostMonster also had a decline, although the data doesn't start pre-acquisition. JustHost collapses post acquisition. NetFirms has remained consistently mediocre. HostGator collapses with a major outage a year after acquisition. Arvixe collapses a year after being acquired. Site5 is still very recent and hasn't shown any signs of decline yet." - The Rise and Fall of A Small Orange, January 2016

Review Signal Rating Calculated Pos/(Pos+Neg), without duplicate filtering (January 2016)

That's what I wrote at the beginning of 2016 as I watched A Small Orange's rating collapse in a pretty popular post called The Rise and Fall of A Small Orange, which documented not only ASO's decline but also the fall of many EIG brands. One thing I mentioned was the recent acquisition of Site5 (and Verio), which had a fairly good rating on Review Signal at the time of acquisition. The trend seemed to be roughly a year to see the drop in rating, post acquisition.

Site5 ~ 1 Year Later

The acquisition of Site5 was announced in August 2015. Here's the updated EIG brand tracking graph. One thing to note: this now uses the new rating algorithm, which has a built-in decay function to weight older reviews less. The new graph applies the new algorithm retroactively, calculating each point in time as if the algorithm had always been in use. There will be some differences between it and the original graph (which prompted the change in algorithm). The difference is minimal for most brands; only when there is a major change in sentiment does the new graph show the change more quickly. Full details about the change can be read on Review Signal Ranking Algorithm Update.
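For intuition, here's a minimal sketch of what a decay-weighted rating might look like. The actual algorithm is documented in the Ranking Algorithm Update post; the exponential decay, the one-year half-life, and the review field names below are illustrative assumptions, not Review Signal's real parameters.

```php
<?php
// A hypothetical decay-weighted rating. The real Review Signal algorithm
// is described in the Ranking Algorithm Update post; the exponential decay
// and one-year half-life here are illustrative assumptions only.

function decay_weighted_rating(array $reviews, float $half_life_days = 365.0): float {
    $pos_weight = 0.0;
    $neg_weight = 0.0;
    $now = time();

    foreach ($reviews as $review) {
        // Assumed shape: ['sentiment' => 'pos'|'neg', 'timestamp' => unix time]
        $age_days = ($now - $review['timestamp']) / 86400;
        $weight   = pow(0.5, $age_days / $half_life_days); // weight halves every year

        if ($review['sentiment'] === 'pos') {
            $pos_weight += $weight;
        } else {
            $neg_weight += $weight;
        }
    }

    $total = $pos_weight + $neg_weight;
    return $total > 0 ? $pos_weight / $total : 0.0;
}
```

A brand with a long positive history but a recent wave of negative reviews drops faster under a scheme like this, which is exactly the behavior described above.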

[Chart: EIG brand Review Signal ratings, 2016]

What you can see is that the reputation remained relatively stable until about April 2016 and then began a slow but steady decline, recently dipping below 50% for the first time. As with nearly every other brand except A Small Orange, the decline happened within a year of acquisition.

Since the original post there also hasn't been much movement in any other brands beyond Site5 crashing and A Small Orange continuing to slide downward. Verio didn't see a dip post-acquisition, but it had a pretty low rating to start with, which already put it in the bottom half of EIG brand ratings.

Why Do EIG Brands Go Down Post Acquisition?

The longer I am in this industry, the more stories I hear. A Small Orange was such an interesting exception, and I've heard a lot about it from a lot of people. Its relative independence and keeping the staff seemed to be the key to maintaining a good brand even within the EIG conglomerate.

Site5 offers what I imagine is more business-as-usual in the EIG world: cut staff, migrate to EIG's platform and maximize profit (in the short term). Site5's founder, Ben, reached out to a competitor, SiteGround, and arranged for them to hire a large number of Site5 staff that EIG had no plans to keep, according to SiteGround's blog. A very classy move from the former CEO and a seeming win for SiteGround, one of EIG's larger hosting competitors. I saw similar behavior when A Small Orange started to go downhill: long-time staff all left and staff from other EIG brands showed up.

Beyond simply trying to cut costs, you have to wonder why you would spend all that money acquiring these brands, with their lots of customers, good reputations and talented staff that are obviously keeping the operation running successfully, only to get rid of nearly all of that except the customers. And once you gut the staff, the customers seem to notice, because it certainly shows up in the data I track.

Conveniently, EIG just published their Q3 2016 10-Q.

We have certain hosting and other brands to which we no longer allocate significant marketing or other funds. These brands generally have healthy free cash flow, but we do not consider them strategic or growth priorities. Subscriber counts for these non-strategic brands are decreasing. While our more strategic brands, in the aggregate, showed net subscriber adds during the quarter ended September 30, 2016, the net subscriber losses in non-strategic brands and certain gateway brands contributed to a decrease in our total subscribers of approximately 42,000 during the quarter. We expect that total subscribers will continue to decrease in the near term.

Overall, our core hosting and web presence business showed relatively slow revenue and subscriber growth during the first nine months of 2016. We believe that this is due to flat marketing expenditures relative to 2015 levels on this business in the first half of 2016 as a result of our focus on gateway products during that period, and to trends in the competitive landscape, including greater competition for referral sources and an increasing trend among consumers to search for web presence and marketing solutions using brand-related search terms rather than generic search terms such as “shared hosting” or “website builder”. We believe this trend assists competitors who have focused more heavily than we have on building consumer awareness of their brand, and that it has made it more challenging and more expensive for us to attract new subscribers. In order to address this trend, during the third quarter of 2016, we began to allocate additional marketing investment to a subset of our hosting brands, including our largest brands, Bluehost.com, HostGator and iPage. We plan to continue this increased level of marketing investment in the near term, and are evaluating different marketing strategies aimed at increasing brand awareness.

So the result of their current strategy this past quarter has been a net loss of 42,000 customers. They say their strategic brands had a net subscriber increase in aggregate, named the largest of them (BlueHost, HostGator, iPage), and are going to focus on a subset of brands going forward. But the phrasing would seem to imply that some of the strategic brands experienced losses as well. It also means the non-strategic brands lost more than 42,000 customers, enough to pull the company's net change down to -42,000 subscribers last quarter.

To cap it all off, I got one of the most surprising emails from Site5 a couple days ago.

We wanted to let you know that we’ve decided to terminate the Site5 Affiliate program as of November 30th, 2016.

We want to thank you for your support of Site5, especially during our most recent move into Impact Radius, and we hope that you’ll consider promoting another one of Endurance’s other programs.

I guess Site5 isn't being considered a strategic brand if they are killing off the affiliate channel on it entirely, right after a big migration from Site5's custom affiliate program to Impact Radius. They also asked that affiliates promote HostGator now, which certainly fits in the strategic brand category.

It's extremely disappointing to see the trend continue: brands collapsing after a year in EIG's hands. What will be interesting going forward is that EIG hasn't acquired any new hosting brands for a while. They seem to be focused on their existing brands for now. I wonder if that will mean we see any noticeable positive change or improvements in existing brands (or at least in some of the strategic brands).

The Rise and Fall of A Small Orange

If you're an unhappy A Small Orange customer looking to find a better web host and don't want to read why the quality went down, simply head over to our Web Hosting Reviews and find a better hosting company. 

How did a small web hosting company have such a huge impact on Review Signal?

The Early Days

This story begins in October 2011, a year before Review Signal launched. Review Signal had been collecting data for months and early ratings data was starting to become meaningful. A tiny company was at the top of the rankings. A Small Orange.

The most worrisome part of this revelation was that A Small Orange did not have an affiliate program, which isn't a requirement at all for a listing on Review Signal.

However, after investing years of work, if the top rated company ended up not having an affiliate program, the business was likely sunk before it even started. So I inquired early and heard back from the CEO at the time, “we don't have an affiliate program and at the moment, we have no plans for one.” This was a potential death knell because the entire business model relies on making at least some money, even though I assumed it would be much lower than my competitors who simply sell their rankings to the highest bidder. But as any entrepreneur knows, almost everything is negotiable if you understand what the other person really wants and why. After talking further with the CEO, he explained his issue with web hosting review websites, “they typically have a pay for ranking sort of model and do it either through set rates or affiliate payouts. It varies. The economics at ASO don't really work out for a standard affiliate program.” A Small Orange didn't want to play the game that every other review site out there did. Pay to play, quality be damned.

This CEO hated the games being played as much as I did.

That was all the opportunity I needed. Review Signal's mission has been to fight against that very same model, and I knew I had an early ally who could make this work. We ended up working out a deal that paid me three months of whatever plan someone purchased, and he capped my potential earnings at $250 before he would review the performance. Considering the most popular plans were $25/year and $5/month, this wasn't going to earn a lot, but at least it might start covering some of the very basic costs. The first month I earned $52.38 on 6 sales, for an average of $8.73 per sale with A Small Orange.
At least it was something. And a foot in the door was all I needed to prove this crazy idea called Review Signal might have some legs. A Small Orange opened that door and for that our histories will forever be intertwined.

The Good Times

The next few years were very good. I was their first affiliate. I was their biggest affiliate for many years, bringing in over a thousand new customers. I got to know many of the staff and would consider some of them friends. And A Small Orange continued to be the best rated shared hosting company through 2014. Everyone was happy - their customers, the company and Review Signal. I was happy to recommend them based on the data showing they had incredibly satisfied customers. I had people tell me personally they were very happy with them after signing up because of the data I publish here at Review Signal.


Free Swag and Annual Thank You Card from ASO

The EIG Acquisition

A Small Orange was quietly acquired in 2012 by a behemoth in the hosting industry called Endurance International Group (NASDAQ: EIGI), which owns dozens of brands including some of the largest and most well known hosting companies: BlueHost, HostGator, HostMonster, JustHost, Site5, iPage, Arvixe and more.

EIG has a very bad reputation in the web hosting world. If you ask most industry veterans, they will tell you to run for the hills when it comes to EIG. The oft-repeated story is that EIG acquires a hosting company, migrates it to their platform and the quality of service falls off a cliff. The best example of this is perhaps their migration to their Provo, UT data center, which had a catastrophic outage in 2013. This outage was huge. The impact dropped four of EIG's largest brands many percentage points in the Review Signal rankings in a single day. And these major outages continue to happen, as recently as November 2015.

In a recent earnings call with shareholders, EIG CEO Hari Ravichandran talked about two recent acquisitions and their plans for them. “We expect to manage these businesses at breakeven to marginally profitable for the rest of the year as we migrate their subscriber bases onto our back-end platform. Once on platform, we expect to reach favorable economics and adjusted EBITDA contribution consistent with our previous framework for realizing synergies from acquisitions.”

The EIG Playbook

EIG's playbook has been to acquire web hosting brands, migrate them to their platform and 'reach favorable economics.' They've been doing it for years, and it seems to be working well enough for investors to continue to put money into the company. M&A to grow subscriber bases and economies of scale to lower costs. It's a very simple and straightforward business plan. It doesn't speak to anything beyond spreadsheet math, though, such as brand value and customer loyalty. And those are certainly diminished and lost post-acquisition, according to all the data we've collected over years and multiple acquired brands. It's callous business accounting, but it makes perfect sense in the race-to-the-bottom industry that is commodity shared hosting.

Review Signal Rating Calculated Pos/(Pos+Neg), without duplicate filtering

You can see all the EIG brands tracked here on Review Signal in the chart above and their acquisition dates below:

  • iPage - 2009
  • BlueHost/HostMonster - 2010
  • JustHost - February 2011
  • NetFirms - March 2011
  • HostGator - June 2012
  • A Small Orange - July 2012
  • Arvixe - November 2014
  • Site5 - August 2015

You'll notice their ratings, in general, are not very good with Site5 (their most recent acquisition) being the exception. iPage was acquired before I started tracking data. BlueHost/HostMonster also had a decline, although the data doesn't start pre-acquisition. JustHost collapses post acquisition. NetFirms has remained consistently mediocre. HostGator collapses with a major outage a year after acquisition. Arvixe collapses a year after being acquired. Site5 is still very recent and hasn't shown any signs of decline yet.

The Expected Decline of A Small Orange

So nearly every industry veteran I talked to expected A Small Orange to collapse. Immediately after acquisition. Except me. I was, am and will continue to be willing to give the benefit of the doubt to a company until I am shown evidence.

For years post-acquisition, people were saying ASO's demise was right around the corner. For years, I waited for that evidence and for the prophecy to come true. But it didn't happen.

It often took EIG less than a year to ruin a brand. We don't have to look further than Arvixe for an example of this, which was acquired in November 2014. Today, Arvixe has one of the lowest ratings of any company on Review Signal at a shockingly low 27%.

But A Small Orange continued to chug along. It didn't hear the naysayers or believe itself to be a victim of the EIG curse. Instead, ASO was the best shared host for years post-acquisition. It seemed to have a fair level of autonomy from the EIG conglomerate. The staff I knew there remained, and all indications showed they were still the same company.

Until it wasn't.

The Fall of A Small Orange

A Small Orange Historical Rating

The chart above shows Review Signal's rating of A Small Orange. The blue line is the rating as calculated by [Positive Reviews / (Positive Reviews + Negative Reviews)]. The red line calculates the rating from only the past 12 months of data. It's slightly different from Review Signal's actual calculation because I am not filtering out duplicates for this quick analysis. The difference for A Small Orange is that when you remove the duplicates, the year 2015 had a 43% rating, indicating there were quite a few people writing multiple negative things about A Small Orange.
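To make the two lines concrete, here's a minimal sketch of the simplified calculation, assuming each review is stored with a 'pos'/'neg' sentiment and a unix timestamp (those field names are mine for illustration, not Review Signal's actual schema):

```php
<?php
// The simplified rating used in these charts: Pos / (Pos + Neg), with an
// optional cutoff for the trailing-12-months (red) line.

function simple_rating(array $reviews, ?int $since = null): float {
    $pos = 0;
    $neg = 0;

    foreach ($reviews as $review) {
        if ($since !== null && $review['timestamp'] < $since) {
            continue; // outside the trailing window
        }
        if ($review['sentiment'] === 'pos') {
            $pos++;
        } else {
            $neg++;
        }
    }

    return ($pos + $neg) > 0 ? $pos / ($pos + $neg) : 0.0;
}

// Given $reviews as an array of ['sentiment' => ..., 'timestamp' => ...] rows:
$blue = simple_rating($reviews);                          // all-time rating
$red  = simple_rating($reviews, strtotime('-12 months')); // trailing 12 months
```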

Sometime in 2015, the A Small Orange that thousands of people trusted and raved about became another EIG brand. I tried to get the inside story. I reached out to the former CEO who sold the company to EIG and became an executive there for a couple years post acquisition. He reached out on my behalf to EIG's PR team to see if they would participate in this story. Both declined to participate.

So I'm left to speculate on what happened at A Small Orange, based on what's been publicly stated by their CEO and on watching EIG's strategy unfold for years across many companies/brands. My best guess is EIG finally got involved with A Small Orange. They used to be a distributed/remote team; now all positions they are hiring for are listed as in Texas (their headquarters). I saw a HostGator representative get moved over to ASO's team, so the internal staff was changing and people were being moved from brands with less-than-stellar reputations to ASO. The former CEO left mid-2014, which likely left a leadership and responsibility gap. ASO could probably run on autopilot through the end of 2014, but over time, having no champion for the brand in upper management comes back to hurt it when decisions get made based on simple economics.

Once 2015 rolled around, the service had noticeably declined. The overall rating for A Small Orange in 2015 was 43% (using only 2015 data). For years, they had been in the 70s. The year also ended with a massive outage for most, if not all, of their VPS customers, which has been ongoing since Christmas. I personally received multiple messages from users of this site asking what was happening and alerting me to the decline in service quality.

ASO was also responsible for the Arvixe migration that went very poorly and caused the Arvixe brand to tank. I'm not sure why EIG doesn't have a dedicated migration team to handle these types of moves, considering how many acquisitions they go through and how large a role migrations play in their growth strategy. But that's a whole separate issue.
It's with great disappointment that I have to admit the A Small Orange that played such a huge role in the founding and success of Review Signal and provided a great service to many thousands of customers is dead. It's become another hollow EIG brand where the quality has sunk to mediocre levels. And that seems perfectly OK to them, because it's probably better for their bottom line.

Going Forward

This story has had a profound impact on Review Signal. One thing it made painfully obvious is that the ranking algorithm needs its first update since inception. The current ranking treats every review equally, which was great when this site launched, because time hadn't had any opportunity to be a factor yet. But as this site continues to move forward, I need to acknowledge that a significant amount of time has passed between launch and today. A review from the beginning of Review Signal isn't as relevant as one from this past week in determining the current quality of a web hosting company. A Small Orange right now shows up around 64%, which is artificially high because of their long history of good service; it hasn't yet been brought down by the past year's decline, which is still short relative to the site's full history. But it's painfully clear that it's not a 64% company anymore.

Another thing to note is that the graphs here all used a simpler calculation [Pos / (Pos + Neg)] without duplicate filtering. What this means is the difference between the rating here and the actual rating on the live site is a measure of the degree to which people are being positive or negative about a company. If the rating here is higher than the published one, it means people are, on average, saying more than one good thing about the same company. If the rating is below (as it is in most, if not all, cases here), it means people are saying more than one negative thing about the company. I'm not sure if this will factor into a new algorithm, but it is something to consider. My intuition says it would hinge around 50%: companies above would likely have more positive supporters, and those below would have detractors.

In the coming months I will try to figure out a better way to generate the ranking number that more fairly represents the current state of a company. My initial thought is to use some sort of time discounting, so that the older the review, the less weight it would carry in the rankings. If anyone has experience working with this or wants to propose/discuss ideas, please reach out - comment here, email me, or tweet @ReviewSignal.

WPPerformanceTester – A WordPress Plugin to Benchmark Server Performance

Everyone who read our most popular blog post, WordPress Hosting Performance Benchmarks, may have noticed a new test this year (2015) called WPPerformanceTester. It was something I built during the tests to add a new benchmark measuring the underlying performance of the servers the test websites were hosted on. It wasn't hugely meaningful because I had no real basis for comparison except the benchmarks I had just generated. So it really played no role in the actual rankings and outcomes of the testing.

But its vision and value have slowly become more apparent. In my testing, Pagely had an unusually slow WordPress benchmark (testing WordPress database functions). It was acknowledged by their team, and they have since announced a migration to a newer Amazon technology called Aurora, which gave Pagely a 3-4x performance increase.

So without further ado, I'd like to announce WPPerformanceTester is now live on GitHub and licensed under the GPLv3. All problems, errors and issues should be submitted on GitHub.

What Tests Does WPPerformanceTester Run?

  • Math - 100,000 math function tests
  • String Manipulation - 100,000 string manipulation tests
  • Loops - 1,000,000 loop iterations
  • Conditionals - 1,000,000 conditional logic checks
  • MySQL (connect, select, version, encode) - basic MySQL functions and 1,000,000 ENCODE() iterations
  • $wpdb - 250 insert, select, update and delete operations through $wpdb (a sketch of this style of test appears below)
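To give a flavor of what these tests look like, here's a minimal sketch of a $wpdb-style CRUD benchmark. This is not the plugin's actual code (that's on GitHub); the table name, schema and cleanup behavior here are assumptions for illustration.

```php
<?php
// Illustrative sketch of a $wpdb CRUD benchmark, in the spirit of the
// plugin's test list above. NOT the plugin's actual code; the table name
// and schema are hypothetical.

function wppt_wpdb_benchmark_sketch(): float {
    global $wpdb;
    $table = $wpdb->prefix . 'wppt_benchmark'; // hypothetical table name

    $wpdb->query("CREATE TABLE IF NOT EXISTS {$table} (
        id BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
        value VARCHAR(64) NOT NULL,
        PRIMARY KEY (id)
    )");

    $start = microtime(true);

    for ($i = 0; $i < 250; $i++) {
        $wpdb->insert($table, ['value' => "row-{$i}"]);                    // insert
        $id = $wpdb->insert_id;
        $wpdb->get_var("SELECT value FROM {$table} WHERE id = {$id}");     // select
        $wpdb->update($table, ['value' => "updated-{$i}"], ['id' => $id]); // update
        $wpdb->delete($table, ['id' => $id]);                              // delete
    }

    $elapsed = microtime(true) - $start;
    $wpdb->query("DROP TABLE IF EXISTS {$table}"); // clean up after ourselves

    return $elapsed; // seconds to complete 250 CRUD cycles
}
```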

Industry Benchmarks

WPPerformanceTester also allows you to see how your server's performance stacks up against our industry benchmark, which is the average of all submitted test results. After you run WPPerformanceTester, you will have the option to submit your benchmark with or without writing a review of your web host. Please consider submitting even if you don't write a review, so that our benchmark improves. If you feel inclined to write a review, please feel free. Reviews will be published in an upcoming project that ties together many of the projects I've been working on here at Review Signal.

Please Note

WPPerformanceTester is a single-node testing tool (if you're running a distributed/clustered system, it will not give a complete picture, only a picture of the servers that execution touches).

Furthermore, WPPerformanceTester is not the be-all and end-all of performance testing or web host quality. Our WordPress Hosting Performance Benchmarks runs a wider variety of tests, and even that only gives insight into performance. It doesn't look at customer service quality, pricing, and other important dimensions of a good web hosting service.

WPPerformanceTester should be used as one tool in a performance toolbox. I hope it's valuable and helpful, but please keep in mind the larger picture as well. If you care about service quality, we also maintain the largest web hosting review database. My goal is to cover every aspect, and WPPerformanceTester marks a small step in that direction of being able to give consumers a complete picture of web hosting quality in the WordPress space.

Amazon Giveaway Marketing Results and Advice

Amazon launched the Amazon Giveaway platform one week ago. I immediately thought it would be interesting to try out as a marketing channel. I've never done a giveaway before, although multiple hosting companies have offered to give away free plans if I advertised their services, which didn't seem ethical given what we do.

Here's what I learned.

What to Give Away?

The first question is what product would correlate well with the service I'm offering (web hosting reviews). I looked at web hosting books, and those were about the only hosting-related products on Amazon. But they looked terrible and I wouldn't want one, so why would my audience? So I had to get creative. I decided the easiest thing for me would be flash drives, a very common promotional item. Maybe I could spin a message about backing up your data (I honestly don't think the product selection and message were that good). The more tailored the giveaway is to your company, the better. If you can give away your own product, that would be best.

How Amazon Giveaway Works

Basically, you just browse around Amazon looking for a product that says

[Screenshot: the "Set up an Amazon Giveaway" option on a product page]

Then you choose how you want to run it. There are currently two options. The first simply gives your X items to the first X people who click, first come, first served. The second gives an item to every Nth person who clicks. I think you would be crazy to use option 1, and I'm not sure why it's even an option. So I will assume everyone selects giving away an item to every Nth person.

The second setting is whether entry is free or requires a visitor to follow you on Twitter. I think a Twitter follow should be the default setting. Otherwise you get nothing for your giveaway beyond people looking at the landing page; you have no idea who they are and no means to contact them again.

Then you pay for the X items you are giving away (plus shipping).

The giveaway lasts for one week, and at the end you get a refund of any unspent money.

The Marketing

How you market your giveaway probably has the biggest impact on how well it does. Results also probably correlate with the quality of the giveaway and how well it's targeted toward the audience you're after.

Our giveaway got the minimal marketing effort. It was posted on our Facebook page once. I also posted it on Twitter with the #AmazonGiveaway hashtag a few times, and it was posted to Reddit's /r/AmazonGiveaways. I didn't use any distribution targeted at my own audience, so it was the lowest common denominator of marketing.

However, I got really lucky and @Amazon retweeted me.

[Screenshot: @Amazon's retweet of the giveaway tweet]

Which caused this:

[Chart: the effect of the Amazon retweet on followers]

And this:

[Screenshot: Twitter analytics for the retweeted tweet]

So lots of followers, but definitely no conversions.

 

The Results

[Screenshot: the Review Signal Amazon Giveaway page]

I configured my giveaway so that every 500th person wins, and I was willing to give away 20 of them. I only gave away 3 and was refunded $244.76.

Amazon also emailed me some basic analytics (which I can't find on the dashboard)

[Screenshot: giveaway analytics emailed by Amazon]

So the net result was that I spent $40.24 to get 1,854 followers, of which 444 unfollowed me within the week. I started with 357 followers and now have 1,767. So roughly 24% of the followers I bought were worthless, and I only really gained 1,410 new followers, for a total cost of about 2.9 cents per follower ($40.24 / 1,410).

I was hoping to get in early on a new product and had to guess about a lot of things without anything to go on. Marketing basics still apply if you're doing a giveaway: you need to market it toward your audience and give away something they are likely to care about. You also can't expect great results from doing the bare minimum. I think I got lucky because Amazon retweeted me (early press was one of the hopes of jumping in early), but that's nothing to bank on.

If I were going to do it again, I would do it very differently. I wouldn't bother with the #AmazonGiveaway hashtag; instead, I would create a landing page specifically for the purpose, highlighting what I am giving away and what Review Signal does. I would probably try to capture an email address before giving someone the link to the giveaway. It's more of a barrier, but it would hopefully filter for the people most interested in my giveaway (which, ideally, would be closely targeted to my audience).

TL;DR Lessons:

  • Choose a product relevant to your company/business
  • Market the giveaway in places where your audience/customers are (not just tweeting #AmazonGiveaway)
  • Capturing Twitter followers is a mediocre reward; you should probably build some type of landing page to convert more users before they get the link to the giveaway

 

For the curious, some more analytics screenshots are below

The full analytics from the Tweet Amazon RT'd.


What the whole engagement/analytics picture looked like over the period, from Twitter Analytics:

The Amazon retweet vs. normal tweets with the #AmazonGiveaway hashtag:

What a normal non-giveaway tweet looked like:

Post Mortem of the EIG Outage (August 2, 2013) That Affected BlueHost, HostGator, JustHost and HostMonster

I first wrote about EIG's major outage as it was occurring and had to speculate on a few things before I had the data to support those guesses. This post is a more complete picture of what happened.

Recap

EIG had a major outage on August 2, 2013 that lasted for many hours because core switches in their Provo, Utah datacenter failed. This failure caused customers of BlueHost, HostGator, JustHost and HostMonster to be taken offline.

I speculated as to what would occur after the outage. How would the brands of the affected companies be perceived after such a catastrophic failure? I looked for a comparable event: the GoDaddy DNS outage in September 2012. What I observed from that event was a very quick return to normal volumes of messages and sentiment. GoDaddy regressed to the mean. 

GoDaddy

The charts I used in my original post were lacking. I didn't have time to really collect and analyze all the data, especially sentiment. I could eyeball the historical data and see the ratings bounced back to their original levels but it wasn't a granular look.

[Chart: GoDaddy DNS outage - tweet volume and sentiment]

This chart shows the actual outage, tweet volume and sentiment. It's immediately clear that negative sentiment has a huge spike. I also suspect that a lot of the positive messages are actually mis-categorized; Review Signal isn't perfect, and things like sarcasm are among the hardest things for sentiment analysis algorithms to categorize. The unusual volume lasts three days and then quickly drops back to a normal-looking pattern, with perhaps a slightly higher baseline volume. The actual rating goes back to hovering around 50%, which GoDaddy's long-term graph hovers around as well.

Let's get back to the EIG outage and the affected brands. I am only going to talk about two of the brands, BlueHost and HostGator, in this post, because the other two, HostMonster and JustHost, didn't have enough data at a granular level. It will take more time to develop a clear picture of the outage's effects on those brands.

BlueHost

[Chart: BlueHost tweet volume and sentiment]

I was wrong. So far, at least. BlueHost had an overall rating of 57% before August 2. It hasn't broken 50% since the outage. BlueHost did not, or has not yet, regressed back to the mean. What's interesting is that the volume of tweets about BlueHost's outage was more than double that of the similar GoDaddy outage, yet both quickly dropped back to normal volume within days of the event.

I will explore this a bit more, but to do that I need to show you the other brand.

HostGator

[Chart: HostGator tweet volume and sentiment]

HostGator's outage looks almost identical to GoDaddy's: around 1,000 negative messages on the day of the outage and back to normal within days. HostGator appears to have regressed to the mean as quickly as GoDaddy; its rating has been over 60% for two days, which is around pre-crash levels, where its average rating was 62%. HostGator behaved exactly as I predicted.

Weird Conclusions and Speculations

Why hasn't BlueHost regressed to the mean? One explanation, which I was alerted to by a kind reader (thanks, Linda!), is that not all of HostGator's customers were in the Provo, UT data center. So the outage may have disproportionately affected BlueHost customers compared to HostGator customers. BlueHost is also the larger hosting company by number of customers, although not by domain count.

That may explain the volume difference, but I don't think it explains the regression to the mean for one brand and not the other. Presumably the affected customers of both brands should be equally upset, and those lingering feelings should last equally long for both groups of customers.

I can't explain why we haven't seen BlueHost regress, but I can point out a few differences between this outage and the GoDaddy comparison which may be factors. One important factor is duration. GoDaddy's outage lasted 4-5 hours according to reports. The EIG outage lasted from the morning of August 2 until 9 PM. They were reporting 'intermittent instability' into August 3 according to their official website.

I could speculate that the combination of severity, duration and size of the affected brand has caused some sort of more permanent brand damage to BlueHost, but I think that's premature. BlueHost hasn't regressed yet, but I still think it will eventually. A company that large, with such a huge brand and marketing infrastructure will probably recover. I will be watching BlueHost carefully for the next few weeks or months along with the smaller brands to see if it happens. If it doesn't, this will be an interesting case study in branding, communication and perhaps social media.

 

Thank you for reading and if you have any ideas, feedback or suggestions please leave them in the comments below.

Service Interrupted: A Look at the EIG (BlueHost, HostGator, HostMonster, JustHost) Outage through Twitter

I woke up today and quickly found out that one of the major players in the hosting space was having a massive outage.  According to their own blog:

During the morning of August 2, 2013, Endurance International Group’s data center in Provo, UT experienced unexpected issues that impacted customers of bluehost, HostGator, HostMonster and JustHost. Company websites and some phone services were affected as well.

That sounds bad. Really bad. But how bad? Let's take a look at the data:

[Chart: tweets per day by company]

 

It's pretty clear that today was an outlier. A major outlier for all the affected companies.

Our data collection system here at Review Signal collected over 35,000 tweets today alone about these four companies. That is roughly 14 times the normal amount.

Interestingly enough, there are some very understanding customers out there too; it wasn't all negative.

[Screenshot: a positive tweet about HostGator]

 

How has it affected their rankings?

I must first note that most messages don't make it through our spam filtering systems, for a variety of reasons. So despite there being over 35,000 tweets, we did not get 35,000 new reviews. Many of the messages were not up to our quality standards, e.g. retweets, spam, duplicate messages and news. If you are interested in learning more about how we calculate scores and what kinds of messages count, see our How It Works section.

 

BlueHost

I am not sure why, but BlueHost was impacted a lot more than its bigger brother HostGator. BlueHost has 1.9 million domains on their servers. They also received over 15,000 tweets today (50% more than HostGator).

BlueHost was rated at 57% (Overall Rating) from over two years' worth of data collected. Today they dropped 8 points to 49%. There were over 1,500 negative reviews today (note: our data was pulled early to write this article; the day isn't fully over yet).

HostGator

HostGator is the largest of the bunch, with 2.15 million domains under management. They seem to have weathered the storm better than their brothers, with fewer tweets about them both in absolute number and relative to their size.

HostGator was rated at 62% (Overall Rating) and dropped 5 points to 57%. HostGator received approximately 700 negative reviews today.

HostMonster and JustHost

These are the babies of the bunch: HostMonster has 'only' 700,000 domains and JustHost has barely over 350,000.

HostMonster went from 56% (Overall Rating) to 48%, an 8-point decline. JustHost dropped from 46% to 41%, a total of 5 points.

Conclusion

Today was a pretty awful day for all the companies above, but some were affected more than others. I don't have any answer as to why that might be. There are many plausible theories, such as there being more BlueHost customers in the Provo, UT data center than customers of the other companies. But without further information, it's only speculation. UPDATE: I was told BlueHost actually has more customers than HostGator, even if HostGator customers have more domains. A simple explanation as to why BlueHost was impacted more.

What I can say is that a major screw-up definitely impacts a company's reputation. But large companies seem to regress to the mean.

GoDaddy is a good comparison. They had a major DNS outage around September 11-12. It left a noticeable dip in the overall rating, but it seemed to bounce back. February's dip is the Super Bowl effect, which brings a lot of attention to them (more negative than positive, but attention nonetheless). The long-term volume of tweets also doesn't appear to be affected after a few days.

[Charts: GoDaddy long-term rating and the DNS outage dip]

If we use GoDaddy as a benchmark, these companies will probably be back to their usual levels of service within a week, but today and the next couple of days will leave a very long-term impact on their ratings at Review Signal.

What did people say about GoDaddy’s 2013 Super Bowl Ad?

Many of us watched the Super Bowl last night. Some of us even watched only to see the commercials. GoDaddy made a tremendous impact this year, receiving 290,000 tweets from the Super Bowl. We here at Review Signal got curious: what did people say about GoDaddy?

[Word cloud: the most popular words in tweets about GoDaddy]

Where did this data come from?

We used our sample of ~27,000 tweets and found the most popular words used when talking about GoDaddy. The bigger the word(s), the more commonly they were used.
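For the curious, the counting behind a word cloud like this is straightforward. Here's a minimal sketch; the stop-word list and tokenization are simplified assumptions, not our actual pipeline.

```php
<?php
// Minimal word-frequency sketch for building a word cloud from tweets.
// Stop words and tokenization are simplified assumptions.

function top_words(array $tweets, int $limit = 50): array {
    $stop_words = ['the', 'a', 'an', 'and', 'or', 'to', 'of', 'in', 'is', 'it', 'for', 'on'];
    $counts = [];

    foreach ($tweets as $tweet) {
        // Lowercase, keep only letters, split on whitespace
        $clean = preg_replace('/[^a-z\s]/', ' ', strtolower($tweet));
        foreach (preg_split('/\s+/', $clean) as $word) {
            if ($word === '' || in_array($word, $stop_words, true)) {
                continue;
            }
            $counts[$word] = ($counts[$word] ?? 0) + 1;
        }
    }

    arsort($counts); // most frequent first
    return array_slice($counts, 0, $limit, true);
}
```

The resulting word-to-count map is what gets scaled into font sizes: the higher the count, the bigger the word.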

Watch The Commercial

[Embedded video: GoDaddy's 2013 Super Bowl commercial]

Hurricane in the Cloud: How Hurricane Sandy Impacted Web Hosting Companies

I thought it would be really fun to create an infographic about the effects of Hurricane Sandy on the web hosting companies we track. I learned my infographic skills are optimistically rated: very poor.

So here are the interesting stats and trends we saw during Hurricane Sandy:

418 People Tweeted about Sandy and a web hosting company.

119 of those Tweets were talking about Intel's Sandy Bridge technology

15 People were concerned about their web hosting

7 (/15) of those people were worried about Amazon

23 People Claimed to have issues related to Sandy

8 (/23) of those complaints were directed at BlueHost

The Most Popular Sandy Tweets:

"Unsure why Heroku is prepping for Sandy. I thought hurricanes were the strongest kind of cloud." - @tenderlove (42 RTs)

"Linode HQ weathered #sandy only to lose power hours later, late last night. It runs our VoIP phones so no calls until we can work around." -  @linode (15 RTs)

"Laughing Squid founder @ScottBeale is live tweeting post #Sandy recovery from Manhattan. Follow him updates: https://twitter.com/ScottBeale" - @LaughingSquid (9 RTs)

Some Angry/Happy Tweets:

"@GoDaddy UMADBRO?????? UMAD?!?!?!?!?!? I HOPE SANDY COMES AND DESTROYS ALL YOUR SERVERS." - @djdarrenmallett

"With Gawker, HuffPo and others experiencing outages, Sandy is IRL the equivalent of GoDaddy hosting." - @spydergrrl

"wow how in the hell did @linode have 100% uptime in Newark during #Sandy? that's some badass hosting right there." - @procdaddy

"Wow, so Jersey is getting HAMMERED by Hurricane Sandy right now, and yet my @linode in Newark is still up. THAT'S service right there!" - @bill_clark

Other stories we found in the data:

Heroku released regions to help deal with customers who might potentially be affected by Sandy.

We saw that BlueHost decided to offer flexible payments to those affected by Sandy.

Finally.

For your entertainment.

You can see my first ever attempt at creating an infographic (in the future, I will hire someone else to design these!)

[Infographic: Hurricane Sandy's impact on web hosting]


Interested in seeing which web hosting companies people love (and hate!)? Click here and find out how your web host stacks up.