It's strange writing a positive article about a competitor. This industry is so full of shit, that it feels weird to be writing this. But if we don't recognize the good as well as the bad, what's the point?
"In order to bring you more accurate web hosting reviews, HostingReviews.io is now merged with HostingFacts.com and soon to be redirected to HostingFacts.com."
- popup notice on HostingReviews.io
HostingReviews.io was a project created by Steven Gliebe. It was essentially a manual version of Review Signal: he went through tweets by hand, categorizing them and displaying data about each hosting company. Besides the automated vs. manual approach, the only other big difference was that he didn't use affiliate links. It was bought by HostingFacts.com sometime last year and left alone.
I'm truly saddened that it's disappearing at some point 'soon.' It was the only real competitor whose data I trusted enough to compare myself against. So I thought I would take the opportunity to write about my favorite competitor.
I am consciously not linking to either site in this article because HostingFacts.com, which purchased HostingReviews.io, has HostGator listed as their #1 host for 2017, a company rated terribly by both Review Signal (42%) and HostingReviews.io (9%). So whatever their methodology purports to be, it's so drastically out of line with the data Steven and I have meticulously collected for years that I don't trust it one bit. Not to mention their misleading and bizarre rankings: BlueHost is shown at #4, while the footer recommends A Small Orange as the #4 rated company. In their actual review they recommend A Small Orange; I guess they missed The Rise and Fall of A Small Orange and the fact that HostingReviews.io (March 2017) has ASO at 27%.
It would be easy to be upset that someone copied the same idea, but in reality it's quite helpful. Steven is a completely different person, with different interests, values, and techniques. He also didn't use any affiliate links, which many people believe are inherently corrupting. Yet our results, for the most part, were very much the same. So the whole idea that affiliate links corrupt Review Signal's rankings is one I can pretty confidently throw out the door.
I just want to clarify that I had actually built most of the site before seeing Review Signal. I probably wouldn't have started if I knew about yours first. We came up with the same idea independently. They are similar but nothing was copied. I was a bit disappointed to find out that I wasn't first but later happy, after chatting with you and seeing that we were kind of the Two Musketeers of web hosting reviews.
- Steven Gliebe
I decided to actually look at how close we really were, using old Archive.org data from January 2015, when his site was still being actively updated.
Comparing HostingReviews.io vs Review Signal Ratings (Jan 2015). I've only included companies that both sites covered for comparison's sake.
| Company | HR Rating | Review Signal Rating | Rating Difference | HR Rank | RS Rank | Rank Difference |
|---|---|---|---|---|---|---|
| A Small Orange | 74 | 77 | 3 | 5 | 2 | 3 |
| Amazon Web Services | 55 | 67 | 12 | 13 | 13 | 0 |
| IX Web Hosting | 0 | 38 | 38 | 34 | 31 | 3 |
The biggest difference is Pagely. I'm not sure why we're so different, but it could be a few factors: small sample size (HR had 42 reviews vs. RS's 291), time frame (Review Signal has been collecting data on some companies since as early as 2011), or perhaps categorization methodology.
To calculate the actual ratings, we both used the same simple formula: the percentage of positive reviews (Review Signal has since changed its methodology to decrease the weight of older reviews). The ratings themselves differ far more than the ranking order does. This could also be a sampling or categorization issue, but the rankings were a lot closer than the rating numbers, especially at the bottom.

The biggest differences were Pagely, WiredTree, WebSynthesis, Azure, RackSpace, HostGator, and LunarPages. Review Signal had most of those companies higher than HostingReviews, with the exceptions of Pagely and WiredTree. For WiredTree, the actual % difference isn't that high; it looks more distorted because of how many companies were ranked in that neighborhood. Pagely remains the only concerning discrepancy, some of which could be corrected by using a different rating algorithm to compensate for small sample sizes (a Wilson Confidence Interval). With a Wilson Confidence Interval at 95% confidence, Pagely would be at 67%, which shrinks the difference to only 11%. Something is still off, but I'm not sure what.

Towards the bottom, HostingReviews had companies with much lower ratings in general. I'm not sure why that is, but it doesn't concern me greatly: whether a company is at 20% or 40%, that's pretty terrible through any lens.
The Wilson Confidence Interval is something I'm a big fan of, but the trouble is explaining it; it's not exactly intuitive, and most users won't understand it. To get around that problem here at Review Signal, I simply don't list companies with small sample sizes. I think it would otherwise be unfair to small companies, because small sample sizes will always produce lower scores under a Wilson Confidence Interval.
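For the curious, here is a minimal sketch of the Wilson lower bound. Neither site published its exact code, so this is my assumption of the standard formula with z = 1.96 (roughly 95% confidence), which does reproduce the scores in the tables below:

```python
import math

def wilson_lower_bound(positive_fraction: float, n: int, z: float = 1.96) -> float:
    """Lower bound of the Wilson score interval for a proportion of
    positive reviews. z = 1.96 corresponds to ~95% confidence."""
    if n == 0:
        return 0.0
    p = positive_fraction
    centre = p + z * z / (2 * n)
    margin = z * math.sqrt((p * (1 - p) + z * z / (4 * n)) / n)
    return (centre - margin) / (1 + z * z / n)

# Amazon Web Services: 55% positive across 537 reviews -> ~0.508,
# matching the 0.50771... value in the tables below.
print(wilson_lower_bound(0.55, 537))

# The small-sample penalty is the whole point: the same 74% positive
# rate scores very differently at 10 reviews versus 500.
print(wilson_lower_bound(0.74, 10))   # heavily penalized
print(wilson_lower_bound(0.74, 500))  # close to the raw 74%
```

In plain terms: the score is pulled toward 50% and then docked by an uncertainty margin that shrinks as the review count grows, which is exactly why low-data companies always score lower.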
I always thought if you were going to list low data companies, you would have to use it for the ratings to be meaningful. So I went ahead and applied it to HostingReviews since they list low data companies.
| Company | HR Rating | HR Wilson Score | RS Rating | Wilson HR Rank | RS Rank | Rank Difference |
|---|---|---|---|---|---|---|
| A Small Orange | 74 | 0.665222315806623 | 77 | 5 | 2 | 3 |
| Amazon Web Services | 55 | 0.50771491300961 | 67 | 10 | 13 | -3 |
| IX Web Hosting | 0 | 0 | 38 | 34 | 31 | 3 |
This actually made the ranked order between Review Signal and HostingReviews even closer. Pagely and WebSynthesis are still the two major outliers, which suggests a more fundamental difference in how the two sites measured those companies. But overall, the ranks moved closer together: the original rankings were off by a total of 124 (the sum of how far each company's rank differed between the two sites), while the Wilson rankings were off by 104, which is 16% closer. A win for the Wilson Confidence Interval on the sample-size issue!
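The rank-distance comparison above is just a sum of absolute differences. A sketch, using only the three companies shown in the tables (the full 124 vs. 104 totals come from every company both sites covered):

```python
def rank_distance(ranks_a, ranks_b):
    """Sum of absolute differences between two rank orderings
    of the same list of companies."""
    return sum(abs(a - b) for a, b in zip(ranks_a, ranks_b))

# The three companies shown above (HR rank vs. RS rank):
hr_ranks = [5, 13, 34]   # A Small Orange, AWS, IX Web Hosting
rs_ranks = [2, 13, 31]
print(rank_distance(hr_ranks, rs_ranks))  # -> 6

# Across all companies: 124 before the Wilson adjustment, 104 after.
improvement = (124 - 104) / 124
print(f"{improvement:.0%}")  # -> 16%
```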
Bonus: HostingReviews with Wilson Confidence Interval vs Original Rating/Ranking
| Company | Rating | Reviews | Wilson Score | Rating Difference | Rank | Wilson Rank | Difference |
|---|---|---|---|---|---|---|---|
| A Small Orange | 74 | 153 | 0.665222315806623 | 7 | 7 | 7 | 0 |
| Google Cloud Platform | 70 | 101 | 0.604645970406924 | 10 | 11 | 11 | 0 |
| Amazon Web Services | 55 | 537 | 0.50771491300961 | 4 | 18 | 14 | 4 |
| Web Hosting Hub | 38 | 34 | 0.237049871468482 | 14 | 29 | 29 | 0 |
| IX Web Hosting | 0 | 45 | 0 | 0 | 49 | 49 | 0 |
You will notice the biggest differences: companies with more reviews generally move up in rank, while companies with small sample sizes move down. Because the sample sizes are so small for some companies, their % ratings drop dramatically. But since most companies don't have a lot of data, it doesn't influence the rankings as much.
It's been nice having HostingReviews.io around when it was actively being updated (the manual process is certainly overwhelming for any individual I think!). I will miss having a real competitor to compare what I'm seeing in my data. I don't know the new owners, but I do consider Steven, the creator, a friend and wish him the best of luck going forward while he works on his primary business, ChurchThemes.com. It saddens me to see the new owners ruining his work with what looks like another mediocre affiliate review site pushing some of the highest paying companies in the space. But it's yet another unfortunate reminder of why I'm so disappointed by the web hosting review industry.
Sources: All data was pulled from Archive.org.
Thank you for the well wishes. It was always gratifying to see our data correlate. Press on with your excellent work.
I’d like to know what you think of this review website:
I think he has posted some interesting stuff. He keeps a good list of EIG brands. He’s certainly not a fan of them. The companies he is promoting are generally well rated here or too small for me to say. They look like darling companies (popular with a small group/niche/community). I’m wary of darling companies because the data doesn’t support their position. When you see one mentioned in a negative light in a community, the treatment is different than had you written the exact same thing about a disliked brand like EIG.
I think his intentions are likely good, but I am not sure I like his methodology. Take his performance monitoring: “The third place is taken by StableHost. Its uptime is 99.73%, average full page load time is 0.88 seconds and satisfactory Apdex is 99.60.” Why is he giving an award (a bronze medal) to a company at 99.73% uptime? He even writes that this uptime level is not OK (his written threshold is 99.9%, which I think is fine). But he monitored three companies and gave a third-place award to one that was a complete failure by his own metric. That seems misleading to me, and he admits he’s picking what he believes are the best companies and his favorites. This is the fundamental problem with darling companies in my mind: they get special treatment, where personal opinions influence the results. Also, hand-picking only the companies you like before you apply ‘objective’ methods isn’t an objective methodology. EIG has terrible ratings here across all their brands, but I’ve given them no different treatment from popular companies like SiteGround, LiquidWeb and WPEngine, which sit high on his list. To me, that’s objective: giving fair treatment to all brands regardless of personal feelings, or even those of the communities around you.
I also don’t see much value in the performance monitoring he does; seeing companies wildly flop about in the rankings every month doesn’t give anyone useful information, at least not at the level he presents it: awarding gold, silver, and bronze medals every month. At best it should be detecting longer-term problems with companies that have slow servers, are slowing down, or have bad networks.
Overall, I think he’s probably a good guy trying to do the best he can with good intentions. I don’t agree with his methods and think they could be improved; there is a little too much personal influence on what is said and tested for my liking. But it definitely looks better than most of the competition out there, and I can appreciate anyone who is trying to make this space better.
Yours and HostingReviews are rare sites. 98% of sites list HostGator and Bluehost as top hosts, not because they are actually top hosts, but because of the commissions they get, which is totally dishonest.
Way too many sites do this when you search for top hosts.