Goodbye HostingReviews.io, I Will Miss You

It's strange writing a positive article about a competitor. This industry is so full of shit that it feels weird to be writing this. But if we don't recognize the good as well as the bad, what's the point?

In order to bring you more accurate web hosting reviews, HostingReviews.io is now merged with HostingFacts.com and soon to be redirected to HostingFacts.com
-HostingReviews.io popup notice

HostingReviews.io was a project created by Steven Gliebe. It was basically a manually done version of Review Signal: he went through by hand, categorizing tweets and displaying data about hosting companies. Besides the automated vs. manual approach, the only other big difference was that he didn't use affiliate links. It was bought out by HostingFacts.com sometime last year and left alone.

I'm truly saddened because it's disappearing at some point 'soon.' It was the only real competitor whose data I trusted enough to compare myself against. So I thought I would take the opportunity to write about my favorite competitor.

I am consciously not linking to either site in this article because HostingFacts.com, who purchased HostingReviews.io, has HostGator listed as their #1 host in 2017, a company rated terribly by both Review Signal (42%) and HostingReviews.io (9%). Whatever their methodology purports to be, it's so drastically out of line with the data Steven and I have meticulously collected for years that I don't trust it one bit. Not to mention their misleading and bizarre rankings: Bluehost shows up at #4, while their footer recommends A Small Orange as the #4 rated company. In their actual review, they recommend A Small Orange; I guess they missed The Rise and Fall of A Small Orange and the fact that HostingReviews.io (March 2017) has ASO at 27%.

It would be easy to be upset that someone copied the same idea, but in reality it was quite helpful. Steven is a completely different person, with different interests, values, and techniques. He also didn't use any affiliate links, which many people believe are inherently corrupting. Yet our results, for the most part, were very much the same. So the whole idea that affiliate links corrupt Review Signal's rankings is one I could pretty confidently throw out the door.

I just want to clarify that I had actually built most of the site before seeing Review Signal. I probably wouldn't have started if I knew about yours first. We came up with the same idea independently. They are similar but nothing was copied. I was a bit disappointed to find out that I wasn't first but later happy, after chatting with you and seeing that we were kind of the Two Musketeers of web hosting reviews.

- Steven Gliebe

I decided to actually look at how close we really were, using old Archive.org data from January 2015, when his site was still being actively updated.

Comparing HostingReviews.io vs Review Signal Ratings (Jan 2015). I've only included companies that both sites covered for comparison's sake.

Company HR Rating Review Signal Rating Rating Difference HR Rank RS Rank Rank Difference
Flywheel 97 93 -4 1 1 0
Pagely 81 56 -25 2 18 -16
SiteGround 79 74 -5 3 4 -1
WiredTree 75 67 -8 4 12 -8
A Small Orange 74 77 3 5 2 3
Linode 72 74 2 6 5 1
WP Engine 72 74 2 7 6 1
LiquidWeb 70 70 0 8 8 0
DigitalOcean 66 75 9 9 3 6
MidPhase 61 59 -2 10 16 -6
HostDime 60 60 0 11 14 -3
Servint 59 60 1 12 15 -3
Amazon Web Services 55 67 12 13 13 0
SoftLayer 50 70 20 14 9 5
DreamHost 49 55 6 15 19 -4
Synthesis 47 74 27 16 7 9
Microsoft Azure 43 70 27 17 10 7
InMotion Hosting 43 52 9 18 20 -2
WestHost 43 51 8 19 21 -2
Rackspace 40 69 29 20 11 9
Media Temple 35 58 23 21 17 4
GoDaddy 30 45 15 22 25 -3
HostMonster 25 42 17 23 29 -6
Bluehost 22 46 24 24 24 0
Just Host 22 39 17 25 30 -5
Netfirms 21 44 23 26 26 0
Hostway 18 44 26 27 27 0
iPage 16 44 28 28 28 0
HostGator 13 49 36 29 22 7
Lunarpages 11 49 38 30 23 7
Mochahost 10 20 10 31 34 -3
1&1 6 36 30 32 32 0
Verio 5 35 30 33 33 0
IX Web Hosting 0 38 38 34 31 3

The biggest difference is Pagely. I'm not sure why we're so far apart, but it could be a few factors: small sample size (HR had 42 reviews vs. RS having 291), time frame (Review Signal has been collecting data on some companies since as early as 2011), or perhaps categorization methodology.

To calculate the actual ratings, we both used the same simple formula: % positive reviews (Review Signal has since changed its methodology to decrease the weight of older reviews). There is a much greater difference in the ratings themselves than in the ranking order. This could also be a sampling or categorization issue, but the rankings were a lot closer than the rating numbers, especially at the bottom.

The biggest differences were Pagely, WiredTree, WebSynthesis, Azure, Rackspace, HostGator, and Lunarpages. Review Signal had most of those companies rated higher than HostingReviews, with the exceptions of Pagely and WiredTree. For WiredTree, the actual % difference isn't that high; it looks more distorted because of how many companies were ranked in that neighborhood. Pagely remains the only concerning discrepancy, some of which could be corrected by using a different rating algorithm to compensate for small sample sizes (the Wilson Confidence Interval). Using a Wilson Confidence Interval at 95% confidence, Pagely would score 67%, which shrinks the difference to only 11%. Something is still off, but I'm not sure what.

Towards the bottom, HostingReviews had companies with much lower ratings in general. I'm not sure why that is, but it doesn't concern me greatly: whether a company is at 20% or 40%, that's pretty terrible through any lens.

The Wilson Confidence Interval is something I'm a big fan of, but the trouble is explaining it: it's not exactly intuitive, and most users won't understand it. To get around that problem here at Review Signal, I simply don't list companies with small sample sizes. Listing them would be unfair to small companies, because they will always have lower scores under a Wilson Confidence Interval.
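For readers curious what the Wilson scores in the tables below actually compute, here is a minimal sketch of the lower bound of a Wilson score interval at 95% confidence. The Pagely review counts are inferred from the article's figures (42 reviews at an 81% positive rating implies roughly 34 positive reviews), so treat that line as an illustration rather than the sites' exact inputs.

```python
import math

def wilson_lower_bound(positive, total, z=1.96):
    """Lower bound of the Wilson score interval for a binomial proportion.

    Unlike a raw % positive rating, this penalizes small sample sizes:
    a 100% rating from 5 reviews scores lower than 90% from 500 reviews.
    z=1.96 corresponds to 95% confidence.
    """
    if total == 0:
        return 0.0
    phat = positive / total
    denom = 1 + z * z / total
    center = phat + z * z / (2 * total)
    spread = z * math.sqrt(phat * (1 - phat) / total + z * z / (4 * total * total))
    return (center - spread) / denom

# Pagely (Jan 2015): ~34 positive of 42 reviews, i.e. the ~81% rating above
print(round(wilson_lower_bound(34, 42), 3))  # ~0.667, in line with the ~67% quoted
```

The key property is that adding more reviews at the same positive rate pushes the lower bound up toward the raw percentage, which is exactly why low-data companies sink in the Wilson-ranked tables below.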

I always thought that if you were going to list low-data companies, you would have to use something like the Wilson Confidence Interval for the ratings to be meaningful. So I went ahead and applied it to HostingReviews, since they list low-data companies.

Company HR Rating HR Wilson Score RS Rating HR Wilson Rank RS Rank Rank Difference
Flywheel 97 0.903664487437776 93 1 1 0
SiteGround 79 0.725707872732548 74 2 4 -2
Linode 72 0.667820465467272 74 3 5 -2
Pagely 81 0.667522785017387 56 4 18 -14
A Small Orange 74 0.665222315806623 77 5 2 3
WP Engine 72 0.664495409977001 74 6 6 0
DigitalOcean 66 0.632643582791056 75 7 3 4
WiredTree 75 0.62426879138105 67 8 12 -4
LiquidWeb 70 0.591607758450507 70 9 8 1
Amazon Web Services 55 0.50771491300961 67 10 13 -3
DreamHost 49 0.440663157139283 55 11 19 -8
Servint 59 0.410929374988646 60 12 14 -2
SoftLayer 50 0.387468960047243 70 13 9 4
MidPhase 61 0.3851843256587 59 14 16 -2
Microsoft Azure 43 0.379726217475451 70 15 10 5
HostDime 60 0.357464427565077 60 16 15 1
Rackspace 40 0.356282970938665 69 17 11 6
Synthesis 47 0.317886056933924 74 18 7 11
Media Temple 35 0.305726756042135 58 19 17 2
GoDaddy 30 0.277380620794128 45 20 25 -5
InMotion Hosting 43 0.252456868216651 52 21 20 1
WestHost 43 0.214851925523797 51 22 21 1
Bluehost 22 0.193489653693868 46 23 24 -1
Just Host 22 0.130232286167726 39 24 30 -6
HostGator 13 0.110124578122674 49 25 22 3
iPage 16 0.106548464670946 44 26 26 0
Netfirms 21 0.1063667334132 44 27 27 0
Hostway 18 0.085839112937093 44 28 28 0
Lunarpages 11 0.056357713906061 49 29 23 6
HostMonster 25 0.045586062644636 42 30 29 1
1&1 6 0.042907593743725 36 31 32 -1
Mochahost 10 0.017875749515721 20 32 34 -2
Verio 5 0.008564782830854 35 33 33 0
IX Web Hosting 0 0 38 34 31 3

This actually made the ranked order between Review Signal and HostingReviews even closer. Pagely and WebSynthesis are still the two major outliers, which suggests a more fundamental difference in how the two sites measured those companies. But overall, the ranks moved closer together: the original rankings were off by a total of 124 (the sum of how far each company's rank differed between the sites), while the Wilson rankings were off by 104, which is 16% closer. A win for the Wilson Confidence Interval on the sample-size issue!
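The "off by a total of" figure above is just the sum of absolute rank differences between the two sites, paired by company. A small sketch using a hypothetical five-company subset (not the actual 34-company table):

```python
def total_rank_distance(ranks_a, ranks_b):
    """Sum of absolute differences between two paired rank lists."""
    return sum(abs(a - b) for a, b in zip(ranks_a, ranks_b))

# Hypothetical example: five companies ranked by two review sites
hr_ranks = [1, 2, 3, 4, 5]
rs_ranks = [1, 4, 2, 5, 3]
print(total_rank_distance(hr_ranks, rs_ranks))  # 6

# The article's actual totals: 124 before Wilson, 104 after
print(round((124 - 104) / 124, 2))  # 0.16, i.e. 16% closer
```

A distance of 0 would mean the two sites agree on every company's position; larger values mean more disagreement.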

Bonus: HostingReviews with Wilson Confidence Interval vs Original Rating/Ranking

Company Rating Reviews Wilson Score Rating Difference Rank Wilson Rank Rank Difference
Flywheel 97 76 0.903664487437776 7 2 1 1
Kinsta 100 13 0.771898156944708 23 1 2 -1
SiteGround 79 185 0.725707872732548 6 5 3 2
Pantheon 84 49 0.713390268477418 13 3 4 -1
Linode 72 313 0.667820465467272 5 9 5 4
Pagely 81 42 0.667522785017387 14 4 6 -2
A Small Orange 74 153 0.665222315806623 7 7 7 0
WP Engine 72 278 0.664495409977001 6 10 8 2
DigitalOcean 66 1193 0.632643582791056 3 13 9 4
WiredTree 75 57 0.62426879138105 13 6 10 -4
Google Cloud Platform 70 101 0.604645970406924 10 11 11 0
LiquidWeb 70 79 0.591607758450507 11 12 12 0
Vultr 73 33 0.560664188794383 17 8 13 -5
Amazon Web Services 55 537 0.50771491300961 4 18 14 4
DreamHost 49 389 0.440663157139283 5 21 15 6
GreenGeeks 59 39 0.434429655157957 16 16 16 0
Servint 59 29 0.410929374988646 18 17 17 0
SoftLayer 50 72 0.387468960047243 11 19 18 1
Site5 49 83 0.385299666256405 10 22 19 3
MidPhase 61 18 0.3851843256587 22 14 20 -6
Microsoft Azure 43 358 0.379726217475451 5 24 21 3
HostDime 60 15 0.357464427565077 24 15 22 -7
Rackspace 40 461 0.356282970938665 4 28 23 5
Synthesis 47 36 0.317886056933924 15 23 24 -1
Media Temple 35 416 0.305726756042135 4 30 25 5
Arvixe 41 59 0.293772727671168 12 27 26 1
GoDaddy 30 1505 0.277380620794128 2 32 27 5
InMotion Hosting 43 23 0.252456868216651 18 25 28 -3
Web Hosting Hub 38 34 0.237049871468482 14 29 29 0
WebHostingBuzz 50 8 0.215212526824442 28 20 30 -10
WestHost 43 14 0.214851925523797 22 26 31 -5
Bluehost 22 853 0.193489653693868 3 34 32 2
Just Host 22 54 0.130232286167726 9 35 33 2
HostGator 13 953 0.110124578122674 2 41 34 7
iPage 16 128 0.106548464670946 5 39 35 4
Netfirms 21 34 0.1063667334132 10 36 36 0
Pressable 19 43 0.100236618545274 9 37 37 0
Hostway 18 34 0.085839112937093 9 38 38 0
Fasthosts 12 179 0.080208937918757 4 42 39 3
HostPapa 14 72 0.078040723708843 6 40 40 0
Globat 33 3 0.060406929099298 27 31 41 -10
Lunarpages 11 71 0.056357713906061 5 43 42 1
HostMonster 25 4 0.045586062644636 20 33 43 -10
1&1 6 540 0.042907593743725 2 46 44 2
Mochahost 10 10 0.017875749515721 8 44 45 -1
JaguarPC 10 10 0.017875749515721 8 45 46 -1
Verio 5 19 0.008564782830854 4 47 47 0
IPOWER 5 19 0.008564782830854 4 48 48 0
IX Web Hosting 0 45 0 0 49 49 0
PowWeb 0 10 0 0 50 50 0
MyHosting 0 9 0 0 51 51 0
WebHostingPad 0 5 0 0 52 52 0
HostRocket 0 2 0 0 53 53 0
Superb Internet 0 1 0 0 54 54 0

You will notice the biggest differences: companies with more reviews generally move up in rank, while those with small sample sizes move down. Because the sample sizes are so small for some companies, their % rating drops dramatically. But since most companies don't have a lot of data, it doesn't influence the rankings as much.

Conclusion

It was nice having HostingReviews.io around while it was actively being updated (the manual process is certainly overwhelming for any individual, I think!). I will miss having a real competitor against which to compare what I'm seeing in my data. I don't know the new owners, but I do consider Steven, the creator, a friend, and wish him the best of luck going forward as he works on his primary business, ChurchThemes.com. It saddens me to see the new owners ruining his work with what looks like another mediocre affiliate review site pushing some of the highest-paying companies in the space. But it's yet another unfortunate reminder of why I'm so disappointed by the web hosting review industry.

Sources: All data was pulled from Archive.org.

http://web.archive.org/web/20150113073121/http://hostingreviews.io/

http://web.archive.org/web/20150130063013/https://reviewsignal.com/webhosting/compare/

Kevin Ohashi is the geek-in-charge at Review Signal. He is passionate about making data meaningful for consumers. Kevin is based in Washington, DC.

4 thoughts on “Goodbye HostingReviews.io, I Will Miss You”

  1. Pingback: In Case You Missed It – Issue 18 – WordPress Tavern

  1. Kevin Ohashi (Post author)

      Rick,

      I think he has posted some interesting stuff. He keeps a good list of EIG brands. He’s certainly not a fan of them. The companies he is promoting are generally well rated here or too small for me to say. They look like darling companies (popular with a small group/niche/community). I’m wary of darling companies because the data doesn’t support their position. When you see one mentioned in a negative light in a community, the treatment is different than had you written the exact same thing about a disliked brand like EIG.

      I think his intentions are likely good, but I am not sure I like his methodologies. For example, in his performance monitoring: “The third place is taken by StableHost. Its uptime is 99.73%, average full page load time is 0.88 seconds and satisfactory Apdex is 99.60.” Why is he giving an award (a bronze medal) to a company at 99.73% uptime? He even writes that this uptime level is not ok (his written threshold is 99.9%, which I think is fine). But he monitored three companies and gave the third-place award to one that was a complete failure by his own metric. That seems misleading to me, and he admits he's picking what he believes are the best and his favorites. This is the fundamental problem with darling companies in my mind: they get special treatment, where personal opinions influence the results. And hand-picking only the companies you like before applying ‘objective’ methods isn't an objective methodology. EIG has terrible ratings here for all their brands, but I've given them no different treatment from popular companies like SiteGround, LiquidWeb and WP Engine, which sit high on his list. To me, that's objective: giving fair treatment to all brands regardless of personal feelings, or even those of the communities around you.

      I also don’t see much value in the performance monitoring he does, seeing companies wildly flop about on rankings every month doesn’t give anyone useful information in my mind. At least on the level he’s giving it: awarding gold, silver, bronze medals every month. At best it should be detecting longer term problems with companies who have slow servers/slowing down/bad networks.

      Overall, I think he’s probably a good guy trying to do the best he can with good intentions. I don’t agree with his methods and think they could be improved, there is a little too much personal influence on what is said and tested for my liking. But definitely looks better than most of the competition out there. So anyone who is trying to make this space better I can appreciate.
