SliceHost was acquired by RackSpace in 2008. The SliceHost brand hasn't operated in many years, so it was time to remove it from our listings.
It's strange writing a positive article about a competitor. This industry is so full of shit that it feels weird to be writing this. But if we don't recognize the good as well as the bad, what's the point?
> In order to bring you more accurate web hosting reviews, HostingReviews.io is now merged with HostingFacts.com and soon to be redirected to HostingFacts.com.
>
> - HostingReviews.io popup notice
HostingReviews.io was a project created by Steven Gliebe. It was basically a manually done version of Review Signal: he went through by hand, categorizing tweets and displaying data about hosting companies. Besides the automated vs. manual approach, the only other big difference was that he didn't use affiliate links. It was bought out by HostingFacts.com sometime last year and left alone.
I'm truly saddened because it's disappearing at some point 'soon.' It was the only real competitor whose data I trusted enough to compare myself against. So I thought I would take the opportunity to write about my favorite competitor.
I am consciously not linking to either site in this article because HostingFacts.com, which purchased HostingReviews.io, has HostGator listed as its #1 host in 2017, a company rated terribly by both Review Signal (42%) and HostingReviews.io (9%). So whatever their methodology purports to be, it's so drastically out of line with the data Steven and I have meticulously collected for years that I don't trust it one bit. Not to mention their misleading and bizarre rankings showing BlueHost at #4 while the footer recommends A Small Orange as the #4 rated company. In their actual review, they recommend A Small Orange; I guess they missed The Rise and Fall of A Small Orange and the fact that HostingReviews.io (March 2017) has ASO at 27%.
It would be easy to be upset that someone copied the same idea, but the reality is, it's quite helpful. Steven is a completely different person, with different interests, values and techniques. He also didn't use any affiliate links, which many people believe are inherently corrupting. Yet our results, for the most part, were very much the same. So the whole idea that affiliate links corrupt Review Signal's rankings is one I could pretty confidently throw out the door.
> I just want to clarify that I had actually built most of the site before seeing Review Signal. I probably wouldn't have started if I knew about yours first. We came up with the same idea independently. They are similar but nothing was copied. I was a bit disappointed to find out that I wasn't first but later happy, after chatting with you and seeing that we were kind of the Two Musketeers of web hosting reviews.
>
> - Steven Gliebe
I decided to actually look at how close we really were using old archive.org data from Jan 2015 when his site was still being actively updated a lot.
Comparing HostingReviews.io vs Review Signal Ratings (Jan 2015). I've only included companies that both sites covered for comparison's sake.
| Company | HR Rating | Review Signal Rating | Rating Difference | HR Rank | RS Rank | Rank Difference |
|---|---|---|---|---|---|---|
| A Small Orange | 74 | 77 | 3 | 5 | 2 | 3 |
| Amazon Web Services | 55 | 67 | 12 | 13 | 13 | 0 |
| IX Web Hosting | 0 | 38 | 38 | 34 | 31 | 3 |
The biggest difference is Pagely. I'm not sure why we're so different, but it could be a few factors: small sample size (HR had 42 reviews vs RS having 291), time frame (Review Signal has been collecting data on companies as early as 2011) or perhaps categorization methodology.
To calculate the actual ratings, we both used the same simple formula: % positive reviews (Review Signal has since changed its methodology to decrease the weight of older reviews). The differences in ratings are a lot greater than the differences in ranking order. This could also be a sampling or categorization issue, but the rankings were much closer than the rating numbers, especially at the bottom. The biggest differences were Pagely, WiredTree, WebSynthesis, Azure, RackSpace, HostGator, and LunarPages. Review Signal had most of those companies higher than HostingReviews, with the exceptions of Pagely and WiredTree. For WiredTree, the actual % difference isn't that high; it looks more distorted because of how many companies were ranked in that neighborhood. Pagely remains the only concerning discrepancy, some of which could be corrected by using a rating algorithm that compensates for small sample sizes (Wilson Confidence Interval). Using a Wilson Confidence Interval with 95% confidence, Pagely would be at 67%, which makes the difference only 11%. Something is still off, but I'm not sure what. Towards the bottom, HostingReviews had companies with much lower ratings in general. I'm not sure why that is, but it doesn't concern me greatly: whether a company is at 20% or 40%, that's pretty terrible through any lens.
The Wilson Confidence Interval is something I'm a big fan of, but the trouble is explaining it: it's not exactly intuitive, and most users won't understand it. To get around that problem here at Review Signal, I simply don't list companies with small sample sizes. Using it would also be unfair to small companies, because companies with few reviews will always have lower scores under a Wilson Confidence Interval.
I always thought that if you were going to list low-data companies, you would have to use it for the ratings to be meaningful. So I went ahead and applied it to HostingReviews, since they list low-data companies.
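To make the computation concrete, here is a minimal sketch of the Wilson score lower bound in Python. The function name and structure are my own illustration, not Review Signal's actual code:

```python
import math

def wilson_lower_bound(positive, total, z=1.96):
    """Lower bound of the Wilson score confidence interval for a
    proportion of positive reviews. z=1.96 ~ 95% confidence."""
    if total == 0:
        return 0.0
    phat = positive / total          # raw fraction of positive reviews
    z2 = z * z
    centre = phat + z2 / (2 * total)
    margin = z * math.sqrt((phat * (1 - phat) + z2 / (4 * total)) / total)
    return (centre - margin) / (1 + z2 / total)

# e.g. 113 positive out of 153 reviews (~74% raw) scores about 0.66,
# close to the 0.665 shown for A Small Orange in the table below.
```

Note how a company with zero positive reviews stays at zero, while a high raw percentage built on few reviews gets pulled down sharply, which is exactly the small-sample penalty discussed above.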
| Company | HR Rating | HR Wilson Score | RS Rating | Wilson HR Rank | RS Rank | Rank Difference |
|---|---|---|---|---|---|---|
| A Small Orange | 74 | 0.665222315806623 | 77 | 5 | 2 | 3 |
| Amazon Web Services | 55 | 0.50771491300961 | 67 | 10 | 13 | -3 |
| IX Web Hosting | 0 | 0 | 38 | 34 | 31 | 3 |
This actually made the ranked order even closer between Review Signal and HostingReviews. Pagely and WebSynthesis are still the two major outliers, which suggests a more fundamental difference in how the two sites have measured those companies. But overall, the ranks got closer together: the original rankings were off by a total of 124 (the sum of how far each company's rank differed between sites), while the Wilson rankings were off by 104, which is 16% closer. A win for the Wilson Confidence Interval and for addressing sample-size issues!
Bonus: HostingReviews with Wilson Confidence Interval vs Original Rating/Ranking
| Company | Rating | Reviews | Wilson Score | Rating Difference | Rank | Wilson Rank | Difference |
|---|---|---|---|---|---|---|---|
| A Small Orange | 74 | 153 | 0.665222315806623 | 7 | 7 | 7 | 0 |
| Google Cloud Platform | 70 | 101 | 0.604645970406924 | 10 | 11 | 11 | 0 |
| Amazon Web Services | 55 | 537 | 0.50771491300961 | 4 | 18 | 14 | 4 |
| Web Hosting Hub | 38 | 34 | 0.237049871468482 | 14 | 29 | 29 | 0 |
| IX Web Hosting | 0 | 45 | 0 | 0 | 49 | 49 | 0 |
You will notice the biggest differences: companies with more reviews generally move up in rank, while companies with small sample sizes move down. Because the sample sizes are so small for some companies, their % rating drops dramatically. But since most companies don't have a lot of data, it doesn't influence the rankings as much.
It was nice having HostingReviews.io around while it was being actively updated (the manual process is certainly overwhelming for any individual, I think!). I will miss having a real competitor against which to compare what I'm seeing in my data. I don't know the new owners, but I do consider Steven, the creator, a friend and wish him the best of luck going forward as he works on his primary business, ChurchThemes.com. It saddens me to see the new owners ruining his work with what looks like another mediocre affiliate review site pushing some of the highest-paying companies in the space. But it's yet another unfortunate reminder of why I'm so disappointed by the web hosting review industry.
Sources: All data was pulled from Archive.org.
This was originally written on July 7, 2015. The screenshots are mostly from that period, via archive.org. The site has changed (it no longer has a Top 10 that I can see, but it still misuses Review Signal in the exact same way). I was hesitant to bash competitors, but I decided I don't care: they are the ones behaving badly, and I will call them out on it.
This is Episode 2 of Dirty Slimy Shady Secrets of the Web Hosting Review World
I've long hated the fake review sites that plague the web hosting review business. But it just became even more personal. HostingAdvice.com decided to take reviews from Review Signal, edit them and selectively use them to promote companies with very poor ratings.
Let's take a look at what is happening at HostingAdvice.com (this links to an archive of their site in case they change it; I don't want them getting any benefit for the BS they are pulling).
They claim to be experts and say everyone else sucks, calling everyone else spammy and unreliable. It's hard to argue with the sentiment, considering I take the same stance here.
But let's take a look at their Top Hosts in 2015.
Media Temple at number one is not the most abusive ranking I've seen. They don't have the best reviews here, but they are at 58% (56% as of Jan 2017), which is second-tier-ish; at least more than half their customers are saying good things. BlueHost at #2? That's just nonsense. They have a 47% rating (40% as of Jan 2017), which means less than half their users are saying good things about them.
BUT WAIT, THERE'S MORE!
Remember that highfalutin rhetoric about them being different and not spammy/unreliable? How could you possibly need a disclosure like that if it were true? That's right: you're just like every other crap web hosting review site out there, trying to pimp the highest-paying affiliate program on unsuspecting visitors.
If that wasn't enough, there's always the coup de grâce:
Things are starting to make sense. But none of this has gotten personal yet.
So I took a look at the #3-ranked iPage and, to my absolute delight, found this under 'Customer Reviews':
Yes, those are the two highest rated positive comments about iPage on Review Signal.
Except they've been given 5 stars, which isn't something we do here. Also, they've edited this review without indicating they changed it (adding 'I'), which tells me they did this by hand rather than by scraping.
So that five star rating is made up. How made up?
So made up that this stolen review was given four stars. They are simply adding their own narrative and judgement to Review Signal's data.
At Review Signal, we only categorize as positive or negative.
Why does this matter and why is this so personal?
This matters because they were conscious enough of Review Signal to steal its content. They were also conscious enough to cherry-pick the data they wanted, to push the highest-paying affiliates, and to ignore the fact that they are selling out to some of the lowest-rated companies around. They have JustHost listed at #9 (as many fake review sites do in their top lists) when every indication shows that it has a terrible reputation: one of the absolute lowest on this site at 39% (31% as of Jan 2017), or 21% on a no-affiliate-link site that uses a similar methodology to Review Signal (now down to 7% as of Jan 2017).
2017 Update: iPage is still listed as 5 Stars with a 4.9/5 Rating as one of their best hosts in 2017.
Finally, what made this so personal is they are using the Review Signal brand to mislead consumers. This site was built to help consumers in a space filled with charlatans and it is painful to watch the brand be used by one of them to enhance their bottom line.
If you're not familiar with Review Signal, I suggest starting by looking at our full dataset. Alternatively, you can read about how it works, where our entire methodology is detailed, including the algorithms used to generate our ratings. The gist of it is that we use Twitter data to listen to the good and bad things people are saying about web hosting companies and publish the results. We validate our method using the few limited metrics available, like NPS scores, when given the opportunity.
Happy to announce a lot of new additions to Review Signal, including our first UK companies (HeartInternet and TsoHost). UK companies are displayed with a UK flag in search results and on the company pages.
Overall score is in parentheses after the company.
A2 Hosting (49%)
2016 Year in Review
I like to take this opportunity to look back at the year and see how Review Signal has changed. This past year we added ~36,000 new reviews and one new company: WebFaction. 49.6% of all reviews were positive, while 52.1% of unique reviews were positive. What is interesting about the difference is that people with negative things to say were more likely to send multiple negative messages, but as a whole, more individual people said positive things than negative.
This year was also full of interesting articles that took advantage of our unique position in the web hosting review space. The WordPress Hosting Performance Benchmarks (2016) was the biggest hit as usual. It grew massively in size/scope and tested companies across multiple price tiers up to Enterprise WordPress Hosting.
I also wrote about the Dirty, Slimy Secrets of the Web Hosting Review Underworld, and tracked some major changes with The Rise and Fall of A Small Orange and The Sinking of Site 5, which followed Endurance International Group acquisitions and how ratings fell post-acquisition. A Small Orange's fall from grace even caused the first ranking algorithm update in Review Signal's history.
A big congratulations goes out to all of this year's winners.
It leaves quite a large gap between SiteGround (72%) and pretty much everyone else still in the shared hosting space (<60%).
I do wonder if this is a bellwether for shared hosting becoming a thing of the past. There are still millions of people on it, and in all likelihood there will continue to be. But we've seen the rise of all sorts of specialty hosting, which is likely eating up a lot of the market. The rise of developer-oriented providers like Amazon, Azure, and Digital Ocean has opened up the floodgates for building services on top of them. We've seen numerous companies built on top of these providers and targeting niches, especially WordPress, like FlyWheel and Pagely. We've even seen configurable providers like CloudWays, which lets you select the cloud provider of your choice and install and manage your websites on it.
These new hosting providers are charging more and giving users different experiences. Developers have flocked to them and are building the next generation of web hosting services. High-quality companies seem to be moving upmarket, charging more and providing more, where I'm guessing the margins are substantially better than in the shared hosting space (unless you're trying to upsell everything).
It will be interesting to follow. Will we continue to see more consolidation à la EIG and GoDaddy? Is there room for another great shared hosting provider that grows very large? Or will shared hosting slowly fade away as superior technologies (VPS) and specialized companies eat away at it, providing the specific services people really want? We've also seen non-webhosts like SquareSpace, Wix and Weebly gain large market shares; BuiltWith estimates each of them powers between 880,000 and 1.6 million websites.
The one trend I am not a fan of is that there are fewer and fewer really good choices in the shared hosting space that are of significant scale.
WebFaction specializes in web hosting for developers in the VPS/Cloud market, with offers starting at just $10/month and all servers fully managed (hardware, OS, system).
WebFaction has some of the best reviews of any company we currently track, only trailing FlyWheel as of writing.
It's been approximately three and a half years since Review Signal launched.
The mission was simple: provide honest web hosting reviews.
(Almost) Everyone wants that. Consumers would love not to get screwed over by fake reviews/recommendations. Tech-savvy consumers have all but given up on honest web hosting reviews even existing.
So why has it been so difficult to spread the word about what Review Signal does and why it's different? How come nobody else is really making a strong effort to do the same?
The easiest explanation is money. 'Money corrupts everything' is a pretty common belief, and in the web hosting world it's practically the law of the land.
Many web hosting companies are willing to pay hundreds of dollars for each new customer you sign up with them. And it's generally not the ones you would, in good faith, recommend to a friend. These companies hire many people with the sole goal of convincing reviewers, bloggers, and anyone with a voice that they should sell out.
And it's worked.
From some of the largest players, like Drupal and WordPress, down to the small, anonymous review sites that plague Google's search results for web hosting reviews: they have sold consumers out, for millions of dollars into their pockets.
How Are Web Hosting Companies Paying Hundreds of Dollars for a $5/month plan?
Before we continue, let's look at the underlying numbers that make this whole business possible. It seems crazy that companies could offer hundreds of dollars per sale on such small purchases.
The basic goal is Lifetime Value of Customer (LTV) > Customer Acquisition Cost (CAC)
CAC is the hundreds of dollars they pay someone to send them a new customer. So the value they are getting from a referred sale must be greater than the X hundred dollars they pay.
So how are they getting hundreds of dollars per customer? Lock-ins and cross-/up-sells are the primary ways. They generally only give the super-discounted rates to customers who commit to long-term contracts (1-3 years, generally) and often require you to pay for it entirely up front. So that $5/month hosting deal may cost $180 up front ($5/month * 36 months = $180). That's before they have attempted to sell you any extra services such as backups, domains, security, premium features, etc. They don't have to make more on any specific customer, but they know in aggregate how many extras they are going to sell.
If you're really curious, I dug into the financials of some of the publicly traded companies (EIG, GoDaddy, Web.com) to see what some of those numbers looked like. They were getting between $100 and $180 per subscriber per year. I'm also fairly sure that most customers stay for longer than a year.
So if companies are extracting $180/year/subscriber, paying a $200 commission for a new subscriber is a no brainer if the new subscriber stays over 14 months. Suddenly, the economics of these incredibly high payouts should make sense.
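That break-even arithmetic can be sanity-checked in a few lines. The $200 commission and $180/year/subscriber figures come from the discussion above; the function itself is just an illustration:

```python
def payback_months(commission, annual_revenue_per_subscriber):
    """Months a referred subscriber must stay before the hosting
    company recoups the affiliate commission it paid out."""
    monthly_revenue = annual_revenue_per_subscriber / 12
    return commission / monthly_revenue

# A $200 commission against ~$180/year/subscriber:
print(round(payback_months(200, 180), 1))  # ~13.3 months to break even
```

So a subscriber who stays past the 14th month is pure profit on the commission, before counting any upsells.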
Back to the Corruption
Corruption doesn't happen in a bubble; someone has to be corrupted. In many cases, it would seem the pure motivation of making a lot of money is enough: most people create a site dedicated to pimping the highest-paying companies to their visitors.
In other cases, there is blatant astro-turfing going on.
A perfect example comes from the TechCrunch article about reviews being a cesspool (now deleted, of course).
But the most hidden corruption happens behind the scenes. It's the people with titles like Affiliate Program Manager and Partner Marketing Specialist. For many of these companies, their job is to try to convince people to use their brand/authority to sell the company's product for a commission.
What Happens Behind The Scenes of Operating a Web Hosting Review Site?
I'm going to show you exactly what kind of offers I get regularly here at Review Signal.
This is just a tiny sample of the 'offers' I get regularly. Most look like the email from dedicatedsolutions, trying to convince me to sign up for their affiliate program. Some, like Eli Saad from domain.com, straight up tell me that my rankings are for sale (really classy). Alec from tdwebservices won't stop spamming me and refuses to remove me from his list, while literally offering to provide reviews of his own company for me to publish (vomit). Scroll to the bottom for bonus Alec Mwali material. And I've redacted the name of someone from InMotionHosting because they were extremely apologetic, but they asked to be placed in the top 5 (sorry, rankings are based on actual reviews, not paid for).
A lot of companies just ask to be listed and mention their affiliate program as the reason it should happen. They don't even think twice about what they are implying; it's become so ingrained in the culture of web hosting reviews that everything is for sale that nobody takes a moment to realize how f***ed up that is.
Consider this me putting up notice, I will be periodically publishing the slimy emails and offers I get here at Review Signal. You may be named and shamed. So don't do it.
BONUS ALEC MWALI MATERIAL
Alec has contacted me on behalf of TDWebServices, Unihost and Codeguard. He has sent me full Word docs with fake reviews to publish. He repeatedly uses a fake 'Re:' subject line to get people to open and read his emails. When called out about it, he claims it 'was not meant to happen' and was a 'keyboard error' or an 'issue with my email platform'. I think you had better invest in a better keyboard and email platform, because your current ones seem to be stuck in spam mode.
Update: CodeGuard no longer works with Alec. (Source)
In a recent article, The Rise and Fall of A Small Orange, it became quite apparent that our ranking algorithm here at Review Signal needed an update. Review Signal launched on September 25, 2012, almost 3.5 years ago. At launch, we had data from as early as 2011, which means this site's data is up to five years old today. It wasn't an issue back then, because the oldest data would be at most two years old and still relevant.
Today, our older data isn't really as relevant as it once was. A Small Orange exposed that weakness. It was an issue I knew I would have to deal with eventually, but nobody has really made the system fail until now. Since writing about The Rise and Fall of A Small Orange, I've been working hard to figure out a good way to update the ranking algorithm.
The solution I have come up with is a decay function. Older reviews will be worth a fraction of their more recent counterparts.
`(1/(ABS(TIMESTAMPDIFF(YEAR, NOW(), max(timestamp)))+1))`
This is the mathematical formula that Review Signal will now be using to calculate the value of a review.
In plain English: a review's weight is one divided by (its age in full years plus one). A review less than a year old is worth 1/1, or 1.00. A review in its second year is worth 1/2, or 0.50. A review in its third year is worth 1/3, or about 0.33, and so on.
This allows old reviews to still be a part of a company's rankings, but with a strong bias towards more recent reviews so that if a company starts performing poorly, it will decline faster in the rankings.
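Here is a minimal Python sketch of that decay weight. It mirrors the SQL `TIMESTAMPDIFF(YEAR, ...)` semantics above, which count only full years elapsed; this is my own illustration, not Review Signal's production code:

```python
from datetime import datetime

def review_weight(review_date: datetime, now: datetime) -> float:
    """Decay weight of a review: 1 / (full years old + 1).
    Under a year old -> 1.0, one to two years old -> 0.5, etc."""
    full_years = now.year - review_date.year
    # Subtract one if the review's anniversary hasn't occurred yet this year
    if (now.month, now.day) < (review_date.month, review_date.day):
        full_years -= 1
    return 1 / (abs(full_years) + 1)

# A review from six months ago keeps full weight; an 18-month-old
# review counts half as much toward the rating.
```

A company's rating then becomes a weighted percentage of positive reviews, so recent sentiment dominates without old data vanishing entirely.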
Check out the full chart of how these changes affect the rankings and ratings of every published company.
Perhaps the most interesting column is how the Overall Ranking changed because of this algorithm update which I have included below. A Small Orange has the biggest change by a wide margin. HostWay lost a lot as well, but it was already at the bottom and the difference between 36% (old) and 27% (new) isn't very meaningful when you only fall two ranking spots.
HostMonster, Arvixe, HostGator, JustHost, BlueHost and some other EIG brands falling a bit more isn't surprising. It does highlight how old reviews were keeping them slightly higher than they should be, but none were ranked particularly well.
WebSynthesis dropping was a bit of a surprise. It still has a decent rating at 62%, but it suffered a pretty substantial fall of 7 ranking places, from 10th to 17th.
On the other end, there is a lot less change upwards. However, Pagely got a nice little boost which jumped it 8 places upwards to 12th.
| Company | Change |
|---|---|
| A Small Orange | -10.57 |
| IX Web Hosting | -2.53 |
| 1 and 1 | -0.95 |
Another year, another mountain of data added to the largest web hosting review site. This year we added over 49,000 new reviews (a slight increase from the 45,000 last year). We added two new companies, Arvixe and Site5, both of which are now owned by EIG. We published our first WordPress plugin, WPPerformanceTester, built for our WordPress Hosting Performance Benchmarks, which we performed yet again with our largest batch of companies ever. We even got some outside validation from LiquidWeb, which published its internal NPS benchmarks, and they matched very closely to its Review Signal Rating.
But the year ended on a somewhat sour note with The Rise and Fall of A Small Orange. It tells the story of ASO and how they've played such a huge role on this site, including winning at least one of these awards every year since inception. But not anymore. So without further ado...
For the first time ever, someone besides A Small Orange [Reviews] has won Best Shared Web Hosting. A huge congratulations to LiquidWeb [Reviews]! They also managed to pick up the Best Managed VPS Hosting award.