Category Archives: Review Signal News

Playing With Data for Fun and Profit (SlideShare Presentation)

Original post from my personal blog

I was asked to speak to a digital business class at Howard University, which focuses on the use of Social Media, Mobile Apps & Platforms, Data Analytics, and Cloud Computing as strategic assets in business.

If anyone is curious to learn about the history of Review Signal and where the idea came from:

I also built a demo to let the students learn about and play with sentiment analysis in real-time. It used the Movie Review corpus from Cornell. It's a very primitive keyword-based system, but I thought it illustrated the concept well.

Sentiment Analysis Demo

The demo is close to my first attempt at sentiment analysis. What is in use at Review Signal is infinitely more complex, but if you're curious to learn about sentiment analysis and prefer visual learning, I think it suits that purpose well.
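
For readers who want to see what a keyword-based approach looks like in code, here is a minimal sketch of the idea. This is not the demo's actual code and nothing like Review Signal's production system; the tiny word lists are placeholders standing in for something like the Cornell movie review corpus.

    # Minimal keyword-based sentiment scorer: count positive and negative
    # word hits and compare. The word lists are placeholder examples.
    POSITIVE = {"great", "excellent", "fast", "love", "reliable", "helpful"}
    NEGATIVE = {"slow", "terrible", "hate", "awful", "broken", "downtime"}

    def keyword_sentiment(text):
        words = text.lower().split()
        score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
        if score > 0:
            return "positive"
        if score < 0:
            return "negative"
        return "neutral"

    print(keyword_sentiment("the support was excellent and the servers are fast"))
    # -> positive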

Amazon Giveaway Marketing Results and Advice

Amazon launched its Amazon Giveaway platform one week ago. I immediately thought it would be interesting to try out as a marketing channel. I had never done a giveaway before, although multiple hosting companies have offered to give away free plans if I advertised their services, which didn't seem ethical given what we do.

Here's what I learned.

What to Give Away?

The first question is what product would correlate well with the service I'm offering (web hosting reviews). I looked at web hosting books, which were about the only thing related to web hosting on Amazon. But they looked terrible and I wouldn't want one, so why would my audience? So I had to get creative. I decided the easiest thing for me would be flash drives, a very common promotional item. Maybe I could spin a message about backing up your data (I honestly don't think the product selection and message were that good). I think the more tailored the giveaway is to your company, the better. If you can give away your own product, that would be best.

How Amazon Giveaway Works

Basically, you just browse around Amazon looking for a product page that offers the 'Set up an Amazon Giveaway' option.

[Screenshot: setting up an Amazon Giveaway]

Then you choose how you want to run it. There are currently two options. The first simply gives your items away to the first people who click. The second gives away an item to every Nth person who clicks, until your items run out. I think you would be crazy to use option one, and I'm not sure why it's even an option. So I will assume everyone selects the option that gives away an item to every Nth person.

The second setting is whether entry is free or requires a visitor to follow you on Twitter. I think a Twitter follow should be the default setting. Otherwise you get nothing from your giveaway beyond people looking at the landing page: you have no idea who they are and have no means to contact them again.

Then you pay for the X items you are giving away (plus shipping).

The giveaway lasts for one week and you get a refund of any unspent money at the end.

The Marketing

How you market your giveaway probably has the biggest impact on how well it does. Results also probably correlate with the quality of the giveaway and how well it's targeted at the audience you're after.

Our giveaway got the minimal marketing effort. It was posted on our Facebook page once. I also posted it on Twitter with the #AmazonGiveaway hashtag a few times, and it was posted to Reddit's /r/AmazonGiveaways. I didn't use any distribution targeted at my audience, so it was the lowest common denominator of marketing.

However, I got really lucky and @Amazon retweeted me.

[Screenshot: @Amazon retweeting the giveaway tweet]

Which caused this:

[Screenshot: the effect of the Amazon retweet]

And this:

[Screenshot: Twitter analytics after the Amazon retweet]

So lots of followers, but definitely no conversions.

 

The Results

[Screenshot: the Review Signal giveaway]

I configured my giveaway so that every 500th person would win, and I was willing to give away 20 flash drives. I only gave away 3 and was refunded $244.76.

Amazon also emailed me some basic analytics (which I can't find on the dashboard).

[Screenshot: the giveaway analytics emailed by Amazon]

So the net result was that I spent $40.24 to get 1,854 followers, 444 of whom unfollowed me within that week. I started with 357 followers and now have 1,767. So roughly 24% of the followers I bought were worthless, and I only really got 1,410 new followers, for a total cost of roughly 2.8 cents per new follower.
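
For anyone who wants to check the math, the figures above reduce to a couple of lines:

    # Cost-per-follower math using the numbers from the post.
    net_spend = 40.24                 # total spend after Amazon's refund
    gained = 1854                     # followers gained during the giveaway
    unfollowed = 444                  # lost within the week (~24%)
    kept = gained - unfollowed        # 1410 real new followers
    print(net_spend / kept)           # ~0.0285, i.e. roughly 2.8-2.9 cents each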

I was hoping to get in early on a new product and had to guess about a lot of things without anything to go on. Marketing basics still apply if you're doing a giveaway: you need to market it to your audience and give away something they are likely to care about. You also can't expect great results by doing the bare minimum. I think I got lucky because Amazon retweeted me (early press was one of the hopes of jumping in early), but that's nothing to bank on.

If I were going to do it again, I would do it very differently. I wouldn't bother with the #AmazonGiveaway hashtag, and I would create a landing page specifically for the purpose. I would highlight what I am giving away and also highlight what Review Signal does. I would probably try to capture an email address before giving someone the link to the giveaway. It's more of a barrier, but it would also hopefully filter for the people most interested in my giveaway (which hopefully would be very targeted towards my audience).

TL;DR Lessons:

  • Choose a product relevant to your company/business
  • Market the giveaway in places where your audience/customers are (not just tweeting #AmazonGiveaway)
  • Capturing Twitter followers is a mediocre reward; you should probably build some type of landing page to convert more users before they get the link to the giveaway

 

For the curious, some more analytics screenshots are below.

The full analytics from the tweet Amazon retweeted:

[Screenshot: full Twitter analytics for the retweeted tweet]

What the overall engagement looked like over the period, from Twitter Analytics:

[Screenshot: Twitter Analytics engagement overview for the period]

The Amazon retweet vs. normal tweets with the #AmazonGiveaway hashtag:

[Screenshot: Twitter Analytics comparison of the Amazon retweet and ordinary #AmazonGiveaway tweets]

What a normal, non-giveaway tweet looked like:

[Screenshot: Twitter Analytics for a normal non-giveaway tweet]

The Best Web Hosting Companies in 2014

It's always interesting to look back at a year and analyze what happened. 2014 was the second full year of operation for Review Signal. Four new companies were published on Review Signal: Azure, FlyWheel, Pagely, WebSynthesis. We added roughly 45,000 new reviews (oddly enough about half as many as last year). We ran two massive performance testing reviews of managed WordPress hosting companies (1, 2).

So I finally got around to slicing and dicing the data, looking exclusively at data collected in 2014. Here are the awards:

 

Best Shared Web Host: A Small Orange [Reviews]


Best Web Hosting Support: FlyWheel [Reviews]


Best Specialty Web Hosting: FlyWheel [Reviews]


Best Unmanaged VPS: Digital Ocean [Reviews]


Best Managed VPS: KnownHost [Reviews]


Newcomer FlyWheel [Reviews] has set the bar in terms of how high a company can fly (I'm sorry!). When I introduced FlyWheel they had the absolute highest numbers I've ever seen, and they continue to be in a tier of their own. They do WordPress hosting and that is it, so maybe there is some advantage to specialization. They took the Best Specialty Web Hosting and Best Web Hosting Support awards this year.

For the third consecutive year, A Small Orange [Reviews] has the best shared web hosting.

Digital Ocean [Reviews] has become the fourth largest web hosting company in under two years, according to Netcraft. It's easy to understand why when they take home the Best Unmanaged VPS award for the second year in a row.

Finally, a newcomer to our awards list, KnownHost [Reviews], managed to take the Best Managed VPS award this year, beating out last year's winner, A Small Orange.

 

WordPress Hosting Performance Benchmarks (November 2014)


Sponsored by LoadStorm. The easy and cost-effective load testing tool for web and mobile applications.

This is the second round of managed WordPress web hosting performance testing. You can see the original here. The latest (2015 Edition) can be found here.

Companies Tested

A Small Orange* [Reviews]
BlueHost [Reviews]
CloudWays* [Reviews]
DreamHost [Reviews]
FlyWheel* [Reviews]
GoDaddy* [Reviews]
Kinsta*
LightningBase*
MediaTemple* [Reviews]
Nexcess*
Pagely* [Reviews]
Pantheon* [Reviews]
PressLabs*
SiteGround*† [Reviews]
WebSynthesis* [Reviews]
WPEngine* [Reviews]

Note: Digital Ocean and Pressable were removed from testing.

*Company donated an account to test on. I checked to make sure I was on what appeared to be a normal server.

†Tests were performed with SiteGround's proprietary SuperCacher module turned on fully.

The Products (Click for Interactive Table)

 

[Screenshot: WordPress hosting product comparison chart]

Methodology

The question I tried to answer is how well do these WordPress hosting services perform? I tested each company on two distinct measures of performance: peak performance and consistency.

All tests were performed on an identical WordPress dummy website with the same plugins except in cases where hosts added extra plugins. Each site was monitored for one month (July 2014) for consistency.

1. LoadStorm

LoadStorm was kind enough to give me unlimited resources to perform load testing on their platform, and multiple staff members were involved in designing and testing these WordPress hosts. I created identical scripts for each host to load the site, log in to the site, and browse the site. Then I increased the user load until a web host started to fail. I stopped at 2000 concurrent users for the web hosts that were left unscathed by load testing. Logged-in users were designed to break some of the caching and better simulate real user load, which a lot of people (both readers and hosting companies) requested after the first round of testing.
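
LoadStorm scripts only run on their platform, so as a rough illustration here is what an equivalent load/log in/browse user journey looks like in Locust, an open-source load testing tool. The host URL, credentials, and post ID are placeholders; this is only a sketch of the journey, not the scripts actually used.

    # Sketch of the load -> login -> browse journey using Locust (not LoadStorm).
    # Host, credentials, and paths below are placeholders for the dummy site.
    from locust import HttpUser, task, between

    class WordPressVisitor(HttpUser):
        wait_time = between(1, 5)  # think time between actions

        def on_start(self):
            # Log in so requests aren't all served from the page cache.
            self.client.post("/wp-login.php", data={
                "log": "testuser",
                "pwd": "testpassword",
                "wp-submit": "Log In",
            })

        @task(3)
        def front_page(self):
            self.client.get("/")

        @task(1)
        def browse_post(self):
            self.client.get("/?p=1")  # a sample post on the dummy site

    # Example run, ramping toward 1000 users over a 30 minute test:
    #   locust -f wp_journey.py --host https://example-host.test \
    #          --users 1000 --spawn-rate 1 --run-time 30m --headless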

2. Blitz.io

I used Blitz again to compare against previous results. The first test ran for 60 seconds, scaling from 1 to 1000 users. The second test ran for 60 seconds, scaling from 1 to 2000.

3. Uptime (UptimeRobot and StatusCake)

Consistency matters. I wanted to see how well these companies performed over a longer period of time. I used two separate uptime monitoring services over the course of a month to test consistency.
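
Under the hood, both services boil down to polling the site and recording whether it answered. A toy version of that check looks something like the sketch below (one location, a short window, placeholder URL); the real monitors poll from many locations for the entire month.

    # Toy uptime check: poll the site and record whether it answered with
    # a non-error status. Real monitors do this from many locations.
    import time
    import requests

    def is_up(url, timeout=10):
        try:
            return requests.get(url, timeout=timeout).status_code < 400
        except requests.RequestException:
            return False

    results = []
    for _ in range(5):                       # a real monitor runs all month
        results.append(is_up("https://example-host.test"))
        time.sleep(60)

    print(100.0 * sum(results) / len(results), "% uptime in the sample window")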

4. WebPageTest.org

"WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster." WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. I tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

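WebPageTest also exposes these runs through a public API. Below is a hedged sketch of kicking off one of the tests described above; the API key and target URL are placeholders, and the location identifiers for the other cities should be checked against WebPageTest's location list.

    # Sketch of starting a WebPageTest run matching the settings above
    # (Chrome, 9 runs, first view only). API key and host are placeholders.
    import requests

    resp = requests.get("https://www.webpagetest.org/runtest.php", params={
        "url": "https://example-host.test",
        "k": "YOUR_API_KEY",
        "location": "Dulles:Chrome",   # repeat for Miami, Denver, Los Angeles
        "runs": 9,
        "fvonly": 1,                   # first view only
        "f": "json",
    })
    resp.raise_for_status()
    print(resp.json()["data"]["testId"])   # poll jsonResult.php with this ID later
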
Background Information

Before I go over the results I wanted to explain and discuss a few things. Every provider I tested had the latest version of WordPress installed. I had to ask a lot of companies to disable some security features to perform accurate load tests. Those companies were: GoDaddy, LightningBase, MediaTemple, SiteGround and WebSynthesis. I also asked DreamHost and WPEngine, but they refused my request.

Some companies were more cooperative than others. SiteGround spent hours with me customizing their security features to let the load testing tools bypass their security measures. With PressLabs, we ran into an issue getting LoadStorm to work properly on their servers that we were never able to resolve. We spent hours trying to fix it but couldn't find a solution, which is why they are missing some test data.

CloudWays is an interesting platform that lets you deploy your WordPress stack to either Digital Ocean or Amazon's EC2 servers. I was given a server on each platform of near comparable specs (EC2 Small with 1.7GB of RAM vs. Digital Ocean with 2GB). So CloudWays is listed as CloudWays AWS and CloudWays DO to indicate which provider the stack was running on.

Pantheon was tested on their free development environment which I was told is identical to their production environment.

Results

Load Storm

I ran multiple Load Storm tests to get a sense of where to start testing. The first was 1-100 users, which not a single company struggled with. The second was 50-500 users, which again nobody struggled with. So the first meaningful test was 100-1000 users. For the companies that didn't struggle there, I did a 500-2000 user test. I ran these tests with an immense amount of help from Scott Price at LoadStorm. He spent hours with me, teaching me how to use LoadStorm, build tests and offering guidance/feedback on the tests themselves.

 Test 1. 100-1000 Concurrent Users over 30 minutes

 

Request Count Average RPS Peak Response Time (ms) Average Response Time (ms) Average Throughput (kB/s) Errors
A Small Orange 116127 64.52 2752 356 1318.55 41
BlueHost 107427 59.68 16727 1306 1159.55 13351
Cloudways DO 103359 55.57 16983 1807 1169.28 2255
Cloudways AWS 87447 47.01 16286 5436 821.75 18530
DreamHost 115634 62.17 15514 441 1244.31 4327
FlyWheel 116027 62.38 775 368 1287.86 0
GoDaddy 133133 71.58 1905 434 3883.42 0
Kinsta 116661 62.72 552 309 1294.77 0
LightningBase 117062 62.94 1319 256 1324.89 12
MediaTemple 116120 62.43 793 403 1304.27 0
Nexcess 116634 62.71 15085 294 1299.85 8
Pagely 119768 64.39 1548 461 1227.06 0
Pantheon 117333 63.08 528 264 1316.41 0
SiteGround 117961 63.42 939 165 180.09 0
WebSynthesis 116327 62.54 1101 332 1285.83 0
WPEngine 123901 68.83 10111 416 1302.44 2956

Discussion of Load Storm Test 1 Results

There was a pretty clear division of good and bad performance in this test. Most companies didn't struggle at all. A few collapsed: BlueHost, CloudWays AWS, CloudWays DO, and DreamHost. BlueHost started spewing 500 errors almost as soon as we started. CloudWays AWS started timing out immediately. CloudWays DO started having issues around 800 users and then started timing out. DreamHost started giving 503 Service Unavailable almost right away. It looks like our script triggered a security mechanism but they refused to work with me to test any further.

SiteGround ran into a security measure we weren't able to get around in time for publishing this article. The server seemed to just throttle the connection again.

PressLabs isn't listed because we couldn't get LoadStorm to work on their system. I am not sure what was different about their backend, but I tried to work with PressLabs and LoadStorm to get it working to no avail.

 

[Gallery: LoadStorm 2000-user test charts for each host]

Test 2. 500 - 2000 Concurrent Users over 30 Minutes

I removed the hosts that failed and doubled the concurrent users for the second test.

Request Count Average RPS Peak Response Time (ms) Average Response Time (ms) Average Throughput (kB/s) Errors
A Small Orange 248249 133.47 5905 436 2639.68 0
FlyWheel 236474 127.14 3811 983 2499.11 16841
GoDaddy 285071 153.26 8896 371 8255.24 92
Kinsta 248765 133.74 942 316 2714.82 0
LightningBase 248679 133.7 3887 343 2763.92 23
MediaTemple 249125 133.94 1499 313 2748.32 9
Nexcess 243115 130.71 15097 388 2644.72 80
Pagely 256163 137.72 15078 446 2621.04 1
Pantheon 250063 134.44 1111 297 2754.67 0
WebSynthesis 240305 129.2 4389 743 2598.83 1173

Discussion of Load Storm Test 2 Results 

FlyWheel started to fail around 1500 users, causing 502 errors, and remained constant at that level of failure. I'm not sure what the bottleneck was; it didn't overload the server, but I suspect I/O somewhere bottlenecked, causing a certain number of requests to fail. WebSynthesis had a few errors as well: five separate spikes, somewhat evenly spaced out. The server didn't show signs of failure; it looks like it might have been an issue with caches being refreshed and some requests failing in the meantime. WebSynthesis' error rate was still under 0.5%, so I don't have any real issue with those errors. The slower average response time can also be attributed to those spikes.

Remarkably, some companies didn't even struggle. Kinsta kept sub one second response times for 30 minutes and nearly a quarter million requests. Most companies had a spike or two causing a higher peak response time, but Kinsta and Pantheon didn't (and Media Temple had a tiny one at 1.5 seconds). Simply amazing performance.

Another interesting note, GoDaddy pushed triple the amount of data through because their admin screen had a lot more resources being loaded. That's why the average throughput is so high. Despite that fact, it didn't seem to impact their performance at all, which is astounding.

Full Interactive Test Results

A Small Orange
FlyWheel
GoDaddy
Kinsta
LightningBase
MediaTemple
Nexcess
Pagely
Pantheon
SiteGround
WebSynthesis

Blitz.io

 Test 1. 1-1000 Concurrent Users over 60 seconds

Blitz Test 1. Quick Results Table

Success Errors Timeouts Avg Hits/Second Avg Response (ms)
A Small Orange 27595 14 0 460 67 ms
BlueHost 23794 1134 189 397 160 ms
CloudWays AWS 24070 162 148 401 138 ms
CloudWays DO 27132 118 127 452 49 ms
DreamHost 13073 45 7885 218 21 ms
FlyWheel 28669 20 10 478 27 ms
GoDaddy 26623 8 5 444 104 ms
Kinsta 27544 0 0 459 69 ms
LightningBase 27893 0 1 465 56 ms
MediaTemple 26691 8 9 445 102 ms
Nexcess 18890 2288 641 337 517 ms
Pagely 25358 9 0 423 156 ms
Pantheon 27676 21 0 461 64 ms
PressLabs 25903 143 0 432 89 ms
SiteGround 24939 0 0 416 152 ms
WebSynthesis 28913 0 0 482 19 ms
WPEngine 23074 121 4 385 247 ms

Discussion of Blitz Test 1 Results

I learned from the last round of testing that any host that isn't optimized for WordPress at all (a default install) will get destroyed by these tests, so I didn't include any of them this time. There weren't any failures as catastrophic this time.

Who performed without any major issues?

A Small Orange, FlyWheel, GoDaddy, Kinsta, LightningBase, MediaTemple, Pagely, Pantheon, SiteGround, and WebSynthesis all performed near perfectly. There's nothing more to say about these companies other than that they did excellently. All of their error/timeout rates were below 0.5%.

Who had some minor issues?

CloudWays AWS, CloudWays DO, PressLabs and WPEngine. All four of these providers had over 100 errors/timeouts, with error/timeout rates between 0.5% and 2%. Not a huge deal, but definitely not perfect.

Who had some major issues?

BlueHost, DreamHost, and Nexcess. BlueHost started to show stress around 40 seconds in and started to buckle around 47 seconds. DreamHost had a couple of spikes in response time and errors. However, it looks like the load testing tool may have hit some type of security limit, because requests started timing out while the server kept giving very fast responses and maintained roughly 250 hits/second constantly; it doesn't look like the server was failing. I couldn't get them to disable the security to really test it, so it's hard to say much more. Nexcess started to show stress around 20 seconds and buckle around 30 seconds.
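
The groupings above come from a simple error/timeout rate, (errors + timeouts) / total requests, computed from the table. For example:

    # Error/timeout rate used for the groupings above, with WPEngine's
    # Blitz Test 1 numbers from the table as an example.
    def failure_rate(success, errors, timeouts):
        total = success + errors + timeouts
        return 100.0 * (errors + timeouts) / total

    print(round(failure_rate(23074, 121, 4), 2), "%")   # ~0.54%, the 0.5-2% band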

 Test 2. 1-2000 Concurrent Users over 60 seconds

[Gallery: Blitz 2000-user test charts for each host]

Blitz Test 2. Quick Results Table

Success Errors Timeouts Avg Hits/Second Avg Response (ms)
A Small Orange 54152 26 1 903 77 ms
BlueHost 29394 14368 3408 490 234 ms
CloudWays AWS 25498 4780 8865 425 338 ms
CloudWays DO 53034 1477 49 884 58 ms
DreamHost 10237 5201 20396 171 201 ms
FlyWheel 56940 121 68 949 29 ms
GoDaddy 53262 29 64 888 105 ms
Kinsta 55011 32 0 917 69 ms
LightningBase 55648 0 0 927 58 ms
MediaTemple 53363 16 28 889 100 ms
Nexcess 25556 15509 4666 426 279 ms
Pagely 51235 41 2 854 147 ms
Pantheon 55187 91 0 920 65 ms
PressLabs 35547 4105 1569 592 326 ms
SiteGround 42645 490 220 711 276 ms
WebSynthesis 57776 1 0 963 20 ms
WPEngine 39890 304 333 665 364 ms

Discussion of Blitz Test 2 Results

Who performed without any major issues?

A Small Orange, FlyWheel, GoDaddy, Kinsta, LightningBase, MediaTemple, Pagely, Pantheon, and WebSynthesis all performed near perfectly. All of their error/timeout rates were around 0.5% or lower.

Who had some minor issues?

SiteGround and WPEngine. Both providers had over 100 errors/timeouts, with error/timeout rates between 0.5% and 2%. SiteGround started to show some stress around 30 seconds and didn't start to have real issues (errors) until after 50 seconds. WPEngine started to show stress around 20 seconds and performed slightly erratically until the end of the test.

Who had some major issues?

BlueHost, CloudWays AWS, CloudWays DO, DreamHost, Nexcess, and PressLabs. The hosts that had major issues last round completely failed here, with error/timeout rates well into the double digits. DreamHost, which looked fine behind its security measures in the first test, buckled around 35 seconds into this one: it started returning errors, response times increased, and hits/second dropped. CloudWays DO definitely started to stress and show signs of buckling around 50 seconds, but its error rate was still under 3%. I don't think it would have lasted much longer had the test gone further, but it was the least bad failure. PressLabs was a surprise; it started to show stress around 25 seconds and started to buckle around 35 seconds into the test.

 Full Blitz Results (PDFs)

A Small Orange, BlueHost, CloudWays AWS, CloudWays DO, DreamHost, FlyWheel, GoDaddy, Kinsta, LightningBase, MediaTemple, Nexcess, Pagely, Pantheon, PressLabs, SiteGround, WebSynthesis, WPEngine.

Uptime Monitoring

Both uptime monitoring solutions were third party providers that offer free services. All the companies were monitored over an entire month (July 2014).

Uptime Robot

Uptime (%)
A Small Orange 100
BlueHost 99.71
CloudWays AWS 100
CloudWays DO 99.93
DreamHost 99.92
FlyWheel 99.97
GoDaddy 99.9
Kinsta 100
LightningBase 100
MediaTemple 99.81
Nexcess 100
Pagely 99.95
Pantheon 100
PressLabs 100
SiteGround 100
WebSynthesis 100
WPEngine 100

According to UptimeRobot, not a single company was below 99.5% uptime. In fact, with the exception of Media Temple and BlueHost, they were all at or above 99.9% uptime. For reference, 99.5% uptime is roughly 3.6 hours of downtime per month; 99.9% is under 45 minutes of downtime per month. Overall, nothing to really complain about according to Uptime Robot.
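
Those downtime figures are just the uptime percentage applied to a month of minutes (assuming a 30-day month):

    # Converting uptime percentages into downtime per 30-day month.
    MINUTES_PER_MONTH = 30 * 24 * 60          # 43,200

    def downtime_minutes(uptime_pct):
        return MINUTES_PER_MONTH * (1 - uptime_pct / 100)

    print(downtime_minutes(99.5))   # 216 minutes, roughly 3.6 hours
    print(downtime_minutes(99.9))   # 43.2 minutes, i.e. under 45 minutes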

StatusCake

Availability (1 = 100%) Response Time (s)
A Small Orange 1 0.23s
BlueHost 0.9969 2.45s
CloudWays AWS 0.998 0.75s
CloudWays DO 1 2.41s
DreamHost 1 2.22s
FlyWheel 0.999 1.99s
GoDaddy 1 2.41s
Kinsta 1 2.13s
LightningBase 1 1.6s
MediaTemple 1 1.18s
Nexcess 1 2.33s
Pagely 1 2.49s
Pantheon 1 2.04s
PressLabs 1 1.49s
SiteGround 0.9993 1.64s
WebSynthesis 1 1.77s
WPEngine 1 2.76s

According to StatusCake, the results look even better. I used multiple services to monitor because there can be networking issues unrelated to a web host's performance. StatusCake only detected issues with four companies, which is fewer than UptimeRobot detected. It's hard to say which is better or right. But they both say that uptime didn't really seem to be an issue for any company.

StatusCake also provides an average response time metric. According to them, it uses a browser instance and fully renders the page, and they test from many different geographical locations. I don't have any further insight into these tools beyond what I can read in their documentation. If they are to be believed, A Small Orange has astonishingly fast performance. WPEngine has the slowest average load time at 2.76 seconds, which isn't that bad.

 

WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only. This was tested against the default install from every company.

Company Dulles, VA Miami, FL Denver, CO Los Angeles, CA Average (load times in seconds)
A Small Orange 1.443 0.801 0.836 0.64 0.93
BlueHost 1.925 1.321 1.012 0.785 1.26075
CloudWays AWS 0.655 0.867 0.967 0.746 0.80875
CloudWays DO 0.493 0.851 1.036 0.811 0.79775
DreamHost 1.177 0.863 1.067 1.147 1.0635
FlyWheel 0.497 0.864 1.066 1.109 0.884
GoDaddy 1.607 1.355 0.934 0.855 1.18775
Kinsta 0.759 0.752 0.947 0.592 0.7625
LightningBase 0.584 0.787 0.936 0.675 0.7455
MediaTemple 1.516 0.983 0.955 0.555 1.00225
Nexcess 1.433 1.139 1.196 0.859 1.15675
Pagely 6.831 0.86 0.913 0.709 2.32825
Pantheon 0.654 0.828 0.923 0.954 0.83975
PressLabs 0.715 1.018 1.213 0.723 0.91725
SiteGround 1.392 1.239 1.01 1.212 1.21325
WebSynthesis 0.407 0.835 0.982 1.024 0.812
WPEngine 0.821 1.086 0.839 0.685 0.85775

There isn't much surprising here. The pack is really tight, with only about half a second separating the averages of the top and bottom hosts if we exclude Pagely. I'm not sure what happened with their Dulles, VA test, but it seems like there was something terribly wrong with the network when I tested it. The average response times from every other location were incredibly fast (<1 second). I'm going to chalk it up to a bad node somewhere causing that particular test to perform so poorly; it's almost certainly not a reflection of their hosting.

What is interesting, compared to last time is that these companies are getting faster. There was only one company with a sub 1 second average last time. Now there are 10 companies (11 if you count Pagely). Three of them were above one second last time, so they are showing signs of improvement (Pagely, WebSynthesis, WPEngine). It also means there is a lot of new competition that is not behind the entrenched players in terms of performance.

Conclusion

Every service seems to have its issues somewhere if you look hard enough. I try to avoid injecting my personal opinion and bias as much as possible, so I won't be ranking or outright saying any single company is the best. Some providers did exceptionally well and tended to clump together performance-wise; I will call those the top tier providers. This top tier designation relates to performance only and is based solely on the results of these tests. What each of these companies is offering is different and may best suit different audiences depending on a variety of factors beyond performance, such as features, price, support, and scale (I tested mostly entry level plans). But I will provide a short summary and discussion of the results for each provider.

Top Tier WordPress Hosting Performance

A Small Orange, GoDaddy, Kinsta, LightningBase, MediaTemple, Pagely, Pantheon, WebSynthesis

Each of these companies stayed below a 0.5% error rate in all load testing, all the way up to 2000 concurrent users, on both LoadStorm and Blitz.

Honorable Mention

FlyWheel gets an honorable mention. They performed really well on many of the tests. FlyWheel fell apart on the final LoadStorm test to 2000 logged in users. I'll explain more in their individual section as to why this is deserving of an honorable mention.

Amazon Web Services (AWS) vs Digital Ocean

One of the most interesting comparisons to me was CloudWays. They provide you with the ability to choose which VPS provider and type you want; it then sets up their WordPress configuration (in an identical manner, from my understanding) on the VPS. I was granted access to one Amazon and one Digital Ocean VPS from them. The Amazon instance was a Small (1.7GB of RAM) and the Digital Ocean instance had 2GB of RAM.

[Chart: CloudWays AWS vs. Digital Ocean, LoadStorm 1000-user test]

The head-to-head LoadStorm results (1000-user test) above pretty clearly show Digital Ocean performing better in every category (with the exception of peak response time, which was a timeout). Digital Ocean sent more data, had fewer errors, and did it faster.

[Chart: CloudWays AWS vs. Digital Ocean, Blitz test]

The Blitz.io results show pretty clearly that Digital Ocean is outperforming AWS by a wide margin as well. It delivered twice as many hits with fewer errors and timeouts.

Based on these tests, it's pretty easy to conclude that on low-end VPSs, Digital Ocean's hardware outperforms Amazon's.

Individual Host Analysis

A Small Orange

They've improved their LEMP stack since the last time I tested. They never buckled in any test and were definitely one of the best. Their staff was incredibly friendly (special thank you to Ryan MacDonald) and they've stepped up their performance game. The one thing that isn't quite there yet is the documentation/user experience; there are a lot of improvements they could make to make their LEMP stack more accessible to the less tech savvy. All in all, the experience was in line with what I would expect from a company that has one of the highest support ratings on our site.

BlueHost

Their WordPress offering is brand new. It struggled in every load test. Their price is on the middle-high end but the performance was not. Ultimately, they fell short of where I would expect based on pricing and the competition.

CloudWays

CloudWays was certainly an interesting company to test given that they had two entries, one running on Amazon Web Services (EC2) and another on Digital Ocean. The Digital Ocean VPS outperformed AWS in every category, which was interesting. The AWS instance was near the bottom of the pack performance-wise, while the Digital Ocean one was in the middle. It is a very interesting platform they have built, allowing deployment and management across providers. However, their performance isn't quite there yet; other companies are running on the same hardware and getting better results. CloudWays doesn't do just WordPress, so it's easy to understand why their performance might not quite match some of their competitors who focus solely on WordPress.

DreamHost

DreamPress was another disappointment. The security features hid some of the performance weakness on the first Blitz test, but it completely failed on the second. DreamPress is described as having automatic RAM scaling, with each site run by two VPS instances, but it's very unclear what resources you are really getting for your money. They are charging $50/month for a 1GB RAM VPS, so I get the feeling a lot of resources are shared and it may not be a true VPS.

FlyWheel

FlyWheel was excellent on every test except the final 2000 logged-in user test from LoadStorm. They are built on top of Digital Ocean and I was using the smallest VPS, yet their performance beat VPSs on Digital Ocean that had four times the resources (CloudWays DO). For cached content on the Blitz test, they had the second-best hits/second and response time. I suspect the testing hit a hardware maximum: FlyWheel had the best performance with the lowest dedicated resources (512MB of RAM), and the companies that outperformed it had more resources dedicated to them, or shared resources that presumably allow access to far more than 512MB of RAM. It was an impressive performance given what they are selling, and combined with the best reviews of any company Review Signal has ever tracked, FlyWheel certainly merits serious consideration.

GoDaddy

GoDaddy continues to surprise me. They flew through all the tests, including a weird quirk where they transferred 3X the data during the LoadStorm test and still didn't show any signs of stress. The only comparison I have to last time is the Blitz testing, where they eked out another 3000+ hits and raised their hits/second from 829 to 888. GoDaddy also raised their max hit rate marginally, from 1750 to 1763. What's more impressive is that they reduced their errors+timeouts from 686 to 93: more hits with fewer errors. From a performance perspective, they did excellently, both in absolute terms and relative to their last benchmarks.

Kinsta

A newcomer that jumped straight to the top of the performance tiers. Kinsta's performance was amazing in the LoadStorm 2000 logged-in user test: they had the lowest peak response time and zero errors over a 30 minute test. They didn't struggle with any tests whatsoever and showed zero downtime. Kinsta's performance was top tier.

LightningBase

Another newcomer that jumped straight to the top, and one of the cheapest too, starting at under $10. LightningBase aced the Blitz testing and did excellently on LoadStorm. There was no downtime monitored. LightningBase belongs in the top tier and is delivering amazing value.

Media Temple

Media Temple is interesting because I was told it was running the same technology as GoDaddy (GoDaddy bought Media Temple a year ago). They have a few more premium features like Git and a staging environment. Media Temple's performance was superb. It actually beat GoDaddy's performance in just about every measure by a marginal amount on both Load Storm and Blitz's load testing. If GoDaddy has top tier performance, Media Temple certainly does as well.

Nexcess

Nexcess's performance was excellent in the LoadStorm testing. However, it completely collapsed during the Blitz load testing. I'm really not sure what to make of those results. Perhaps the underlying shared hardware is very good but the static caching setup isn't quite up to snuff? It's probably not worth speculating; suffice it to say, Nexcess ended up looking like a middle-of-the-pack web host instead of a top tier one because of the Blitz test.

Pagely

Pagely put on another spectacular performance. They handled the Load Storm test with 1 error. Blitz results stayed similar to the last run. They handled more hits, but had a few more errors+timeouts (1 last time, 43 this time). Really not much to add here other than they continue to be in the top tier.

Pantheon

Pantheon specialized in Drupal hosting, so I was wondering how well that would translate to WordPress. The short answer is that it translated really well. They had a flawless run on the LoadStorm test: 0 errors and not even any spikes in response time over 30 minutes. They are one of the most expensive options on this list (second only to PressLabs), but they definitely make a case for it. Perfect uptime and near-flawless load testing put them easily into the top tier.

PressLabs

It's hard to write much about PressLabs because we couldn't get LoadStorm to work properly to test out their hosting. However, their Blitz results were lackluster. For the most expensive plan we tested, it was a bit of a disappointment to see it not do stunningly well.

SiteGround

SiteGround sadly didn't do as well as they did last time. Their Blitz load testing score went down slightly. We couldn't bypass their security measures to properly test Load Storm. They obviously have some good protection measures to prevent malicious users from trying to access too many things, but it also meant I couldn't get a deeper look this time around. That was a change from the last round of testing. Slightly disappointing to see the performance dip, but I hope it was due to the extra security measures they put in place that made testing them difficult.

WebSynthesis

WebSynthesis teetered on the edge of having too many errors (0.5%) in the LoadStorm test, but stayed under the threshold and handled the test quite well. They also had no weird security issues this time around, and WebSynthesis led the pack on Blitz testing. They went from 871 hits/second to 963 hits/second, leading every provider on the Blitz tests with a whopping 1 error to boot. Sprinkle in some perfect uptime numbers and it's clear WebSynthesis is still a top tier provider and is continuing to get better.

WPEngine

I feel like I could copy+paste my last conclusion about WPEngine. "WPEngine had some issues. Uptime was not one of them, they were perfect or upwards of 99.9% in that department. However, their performance shortcomings became apparent during the load tests." They didn't even make it to the final round of Load Storm testing. They were also middle of the pack on the Blitz testing. Compared to the last round of Blitz testing, the results were nearly identical, with slightly fewer errors+timeouts. I'm not sure if I should be disappointed to not see improvement or relieved to see them maintain the exact same performance and consistency. Their vaunted rankings on Review Signal's reviews have slipped relative to a few of the other providers on here (FlyWheel and WebSynthesis). While they were once leading the pack in technology, the rest of the pack is starting to catch up.

 

Thank Yous

A special thanks goes out to the sponsor of this post and an individual employee, Scott Price of Load Storm, who worked countless hours with me in order to perform these tests.

I want to thank all the companies that participated in these tests. I tested the support staff a fair bit at some of them and I thank them for their time and patience.

A special thanks goes to Chris Piepho from LightningBase, who also provided a lot of feedback on the original article and helped improve the methodology for this round.

A huge thanks goes out to Mark Gavalda at Kinsta for his feedback and our performance testing discussions; he's tested some things further out than I have, like HHVM and php-ng performance. Thanks also to their designer, Peter Sziraki, who designed the header image for this article.

 

Introducing Pagely and FlyWheel

I am happy to announce two new hosts on Review Signal today.

One of them was a long time coming: Pagely, the original managed WordPress hosting company. In my original managed WordPress hosting performance benchmarks, Pagely came out on top, having no trouble with any of the tests I threw at their services.

[Screenshot: Pagely's rating on Review Signal, October 2014]

 

It's a bit disappointing to see that their reviews don't quite match their performance. From what I can tell, it looks like they've struggled with some major outages in the past. However, there is an upward trend in opinions about them. Their performance is top notch, it would be great to see the rest of the service catch up.

Our second addition to Review Signal is FlyWheel, which has an astounding 95% overall rating. FlyWheel is another managed WordPress hosting service, built on top of Digital Ocean. I don't think I've ever seen a company have such positive reviews; it's a struggle to find anyone saying something negative about them. It's wonderful to see such positive reviews for a new company.

[Screenshot: FlyWheel's rating on Review Signal, October 2014]

I hope they can keep it up, but my past experience says the companies that start out so remarkably strong generally tend to come down to more 'normal' levels in the 70%-ish range. Their competitors WPEngine (82% -> 73%) and WebSynthesis (83% -> 76%) both did, and the company they are built on top of, Digital Ocean, went from 81% to 76%. Great service seems like the hardest thing for a web hosting company to scale. I hope FlyWheel can break the rules and continue its streak of excellence.

Bias, Negativity, Sentiment and Review Signal

Photo Credit: _Abhi_

People are more likely to express negative sentiments or give negative reviews than they are positive ones.

I hear this in almost every discussion about Review Signal and how it works. There are certainly lots of studies to back it up: one major study concluded that bad is stronger than good, and one company found people were 26% more likely to share bad experiences. There is plenty of research in the area of Negativity Bias for the curious reader.

Doesn't that create problems for review sites?

The general response I have to this question is no. It doesn't matter if there is a negativity bias when comparing between companies because it's a relative comparison. No company, at least not at the start, has an unfair advantage in terms of what their customers will say about them.

Negativity bias may kick in later when customers have had bad experiences and want to continually share that information with everyone and anyone despite changes in the company. Negative inertia or the stickiness of negative opinion is a real thing. Overcoming that is something that Review Signal doesn't have any mechanism to deal with beyond simply counting every person's opinion once. This controls it on an individual level, but not on a systemic level if a company has really strong negative brand associations.

What if a company experiences a disaster, e.g. a major outage, does that make it hard to recover in the ratings?

This was a nuanced question that I hadn't heard before and credit goes to Reddit user PlaviVal for asking.

Luckily, major outages are a rare event. They are fascinating to observe from a data perspective. The most recent and largest outage was the EIG (BlueHost, HostGator, JustHost, HostMonster) outage in August 2013. If we look at the actual impact of the event, I have a chart available here.

When I looked at the EIG hosts post-outage, there really hasn't been a marked improvement in their ratings. Review Signal's company profiles have a Trends tab for every company, which graphs ratings on a per-month basis so you can see how a company has done over the past 12 months.

[Charts: BlueHost and HostGator rating trends through May 2014]

There is definitely some variance, but poor ratings post-outage seem quite common. It's hard to make an argument that these companies have recovered to their previous status and are simply being held back by major outcries that occurred during the outage.

The only other company with a major outage I can track in the data is GoDaddy. GoDaddy has had numerous negative events in its timeline since we started tracking them: the elephant-killing scandal, SOPA, DNS outages, and multiple Super Bowl ads.

[Chart: GoDaddy rating trend, August 2012 - July 2013]

[Chart: GoDaddy rating trend, June 2013 - May 2014]

There are clear dips for events such as the September 2012 DNS outage and the Super Bowl in February. Their overall rating is 46% right now and the trend is slightly up, but they seem to hang around 45-50% historically and maintain that despite the dips from bad events. There is arguably some room for them to be rated higher depending on the time frame you think is fair, but we're talking a couple of percent at most.

What about outages affecting multiple companies, e.g. resellers, or infrastructure providers like Amazon that others host on top of? Are all the companies affected equally?

No. Just because there is an outage with a big provider that services multiple providers doesn't mean that all the providers will be treated identically. The customer reaction may be heavily influenced by the behavior of the provider they are actually using.

Let's say there is an outage at Data Center X (DC X), which hosts Host A and Host B. DC X has an outage lasting 4 hours. Host A tells customers 'sorry, it's all DC X's fault' and Host B tells customers 'We're sorry, our DC X provider is having issues; to make up for the downtime, your entire month's bill is free because we didn't meet our 99.99% uptime guarantee.' Even though Host A and Host B had identical technical issues, I imagine the responses from customers would be different. I've definitely experienced great customer service that changed my opinion of a company dramatically based on how they handled a shitty situation. I think the same applies here.

Customer opinions are definitely shaped by internal and external factors. The ranking system here at Review Signal definitely isn't perfect and has room for improvement. That said, right now, our rankings don't seem to be showing any huge signs of weakness in the algorithms despite the potential for issues like the ones talked about here to arise.

Going forward, the biggest challenge is going to be creating a decay function. How much is a review today worth versus a review from the past? At some point, a review of a certain age just isn't as informative as a recent one, and that's a problem I'm going to have to address and figure out. For now, it's on the radar, but it doesn't seem like a major issue yet.
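
For the curious, one common way to build such a decay function is to weight each review by an exponential function of its age and compute the rating as a weighted share of positive reviews. The sketch below is only an illustration with an arbitrary half-life, not anything Review Signal actually uses.

    # Illustrative exponential decay: a review loses half its weight every
    # `half_life_days`. Not Review Signal's actual algorithm.
    def review_weight(age_days, half_life_days=365.0):
        return 0.5 ** (age_days / half_life_days)

    def weighted_rating(reviews):
        # reviews: list of (is_positive, age_in_days) tuples
        total = sum(review_weight(age) for _, age in reviews)
        positive = sum(review_weight(age) for pos, age in reviews if pos)
        return 100.0 * positive / total if total else 0.0

    sample = [(True, 10), (True, 30), (True, 400), (False, 800)]
    print(round(weighted_rating(sample), 1))   # recent reviews dominate the score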

Introducing Windows Azure

I am happy to announce that we've added the Windows Azure hosting platform to Review Signal today. Azure is definitely a big player in the cloud server market and was a very notable absence from our listings. Now that we've added Azure and Amazon AWS in the past few months, our cloud listings for IaaS providers look a lot more complete.

Windows Azure comes in with a 70% overall rating, which is quite respectable and puts it right next to RackSpace in the rankings, although the support score is noticeably lower at an underwhelming 56%.

Want to see the full cloud provider rankings? Visit our complete rankings and click on the Cloud tab.

Introducing WebSynthesis – Managed WordPress Hosting

WebSynthesis [Reviews] has been added to Review Signal recently. They were quietly published alongside our recent article Managed WordPress Hosting Showdown – Performance Benchmarks Comparison. If you aren't familiar with WebSynthesis, you may be familiar with one of their other products, the Genesis WordPress theme/framework; both are owned by Copyblogger Media. WebSynthesis performed very well in our managed WordPress benchmarking, being classified as one of the top tier managed WordPress hosting providers. They aren't slacking in the review department either: as I write this, WebSynthesis has an 83% overall rating, currently the highest of any company we track.

“Synthesis provides business class WordPress hosting and caching solutions for organizations ranging from SMB to the Enterprise.   Our unique VPS based offerings provide business customers the ultimate balance of managed services and configuration autonomy.” – Derick Schaefer, Creator of WebSynthesis

A huge congratulations goes out to Derick Schaefer and his team at WebSynthesis for their great work. I hope you guys keep it up!

Check Out WebSynthesis Reviews at Review Signal

The Best Web Hosts of 2013 – Reflections and Awards

2013 was the first full year of operation for Review Signal. It has been an exciting and successful year. We added roughly 90,000 new reviews in 2013 and two new companies. One was a company we struggled to get listed because of its vast size and scope: Amazon. The other has been possibly the fastest growing new web hosting company of 2013: Digital Ocean. While I'm proud that we added these two companies, it isn't enough. We're tracking over one hundred companies and there are more that should be added soon. Adding a new company does take some extra work, but we need to find that time and make sure it gets done. Without further ado...

The Best Web Hosting Companies of 2013

Best Shared Web Host: A Small Orange


Best Web Hosting Support: SiteGround


Best Unmanaged VPS Provider: Digital Ocean


Best Managed VPS Provider: A Small Orange


A Small Orange continues to be the top web host among those offering a broad range of hosting options. It was only beaten out by newcomer Digital Ocean, which provides very specific unmanaged, SSD-backed VPSes. SiteGround is a newcomer to our best-of-the-year category; they eked out a less than 1% edge over A Small Orange in support rating. We would like to congratulate all of these companies on an excellent 2013 and hope that they continue their stellar performance in 2014.

I would also like to thank Cat Robinson for designing this year's awards. My favorite part of working with her was 'I don't know what a cute symbol is for VPS provider.' Me either. Thank you Cat.

*These scores were calculated the same way as last year: we only look at data collected in the 2013 calendar year, so there may be some differences from what you see live on the site, which keeps data from all years rather than just this year's.


Digital Ocean Downtime and Disclosure

Review Signal went down last night. My first alert was at 8:52 PM EST and it didn't come back online until 12:53 AM EST. That is four hours of downtime. The culprit was apparently a failed UPS at Digital Ocean's NY1 facility.

This outage took down Review Signal and its tracking systems. No data was collected during that period, which coincidentally would be the time most likely to contain complaints about Digital Ocean.

I am aware they are rated as one of the best providers on this site, and missing a critical downtime plays in favor of their rankings. This blog post is serving as a disclaimer and disclosure of that fact. Digital Ocean's rating is likely to be inflated by some marginal amount. I tried to manually track the volume of tweets I missed and I believe it to be in the neighborhood of 100-200, a large portion of which were coming from their official Twitter account. Not everything was a complaint. But for argument's sake, let's say 100 negative reviews were posted last night by affected customers. Their rating would drop from 80% to around 77%, which would still be within 1% of the next highest rated provider. [source]
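
As a sanity check on that estimate, if the rating is treated as simply positive reviews over total reviews, a hypothetical baseline sitting at 80% behaves the way described. The baseline counts below are made up; only the 80% starting point and the 100 extra negative reviews come from the post.

    # Back-of-the-envelope check of the estimated drop. Baseline counts are
    # hypothetical; only the 80% start and +100 negatives come from the post.
    def rating(positive, negative):
        return 100.0 * positive / (positive + negative)

    print(rating(2000, 500))          # 80.0% baseline
    print(rating(2000, 500 + 100))    # ~76.9%, roughly the 77% estimated above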

I am disappointed that I missed collection during such an event, but the impact for consumers seems minimal. It should also get smoothed out over time as more people share their opinions of the service.

Going forward, I hope to re-build some of the infrastructure and data collection systems to be more robust and spread across multiple providers.

