Tag Archives: godaddy

Ethics in Web Hosting – HostCamp Presentation

In June, I was the opening speaker for the inaugural HostCamp in Berlin, which was a side event for the larger WordCamp Europe.

My presentation topic was Ethics in WordPress Hosting, something the event organizer, Jonathan Wold, and I had talked about at length. The goal was to start a discussion about the ethical issues facing the industry: what sorts of behavior and policies exist, and how to address them.

The event was invitation-only and I cannot discuss what others shared in private. My goal was to convince web hosting company executives that ethics matter, and not just for the sake of being ethical. I wanted to show how, in the age of social media, even the perception of unethical behavior can financially harm a company. So please act properly; it's in your best financial interest. One of the case studies was DigitalOcean, which I wrote about previously and which inspired the talk.

 

Under $25/Month WordPress Hosting Performance Benchmarks (2016)


Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

The full company list, product list, methodology, and notes can be found here.

This post focuses only on the results of the testing in the <$25/month price bracket for WordPress Hosting.

 

<$25/Month WordPress Hosting Products

[Image: <$25/month WordPress hosting products comparison table]

 

<$25/Month WordPress Hosting Performance Benchmarks Results

1. LoadStorm

Test 500-2000 Concurrent Users over 30 Minutes, 10 Minutes at Peak

Company Total Requests Total Errors Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred (GB) Peak Throughput (MB/s) Average Throughput (MB/s)
A2 310069 203981 249.08 172.26 15138 549 4.639 8.853 2.577
BlueHost 181995 153234 147.47 101.11 16000 7634 1.066 3.677 0.592
DreamHost 295685 43 224.1 164.27 15063 339 16.06 13.5 8.922
FlyWheel 265618 81491 205.22 147.57 15101 1154 11.5 9.361 6.391
GoDaddy 311172 1363 238.68 172.87 10100 340 16.07 13.31 8.927
Hosting Agency (DE) 182424 117939 132.65 101.35 15991 6743 3.823 10.53 2.124
IWW 272657 84 217.92 151.48 10096 266 14.93 13.77 8.293
LightningBase 314439 5 238.68 174.69 8989 255 16.24 13.24 9.023
Media Temple 327662 1466 258.45 182.03 10628 381 12.55 10.54 6.972
Pressed 289318 61 214.05 160.73 15029 266 16.25 13.01 9.03
SiteGround 301722 1 230.45 167.62 9374 447 15.9 13.76 8.833
TrafficPlanetHosting 289335 476 217.63 160.74 15216 570 16.15 14.08 8.974
WP Land 293166 11596 228.4 162.87 15608 644 15.47 13.3 8.594

Discussion of LoadStorm Test Results

The companies that clearly didn't struggle at all with LoadStorm were DreamHost [Reviews], Incendia Web Works (IWW), LightningBase, Pressed, and SiteGround [Reviews]. GoDaddy [Reviews], MediaTemple [Reviews] and Traffic Planet Hosting had minor spikes at the start, but they seem nearly inconsequential in the grand scheme of the test.

WP.land seemed to have security measures that struggled with wp-login being hit so frequently.

A2 Hosting [Reviews], BlueHost [Reviews], FlyWheel [Reviews] and Hosting Agency did not do well on this test. FlyWheel explicitly stated this was too much load for that size plan and recommended upgrading if this was the expected load.

2. Blitz.io

Test 1-1000 Concurrent Users over 60 seconds

Blitz Test Quick Results Table

Company Hits Errors Timeouts Average Hits/Second Average Response Time Fastest Response Slowest Response
A2 590 27255 390 10 92 55 167
BlueHost 23340 71 274 389 214 155 604
DreamHost 29337 0 1 489 4 3 7
FlyWheel 28530 0 0 476 28 21 146
GoDaddy 15222 11093 28 254 196 190 229
Hosting Agency (DE) 662 20862 3649 11 630 400 1556
IWW 28786 9 0 480 23 21 24
LightningBase 27488 0 0 458 71 71 72
Media Temple 15255 11260 5 254 200 188 318
Pressed 26228 0 0 437 80 5 389
SiteGround 26055 1 21 434 100 72 346
TrafficPlanetHosting 1018 8344 9718 17 266 102 843
WP Land 28344 0 0 472 39 38 39

Discussion of Blitz Test 1 Results

This test simply checks whether the company is caching the front page and how well whatever caching system they have set up performs (generally the requests hit something like Varnish or Nginx).

Who performed without any major issues?

DreamHost, IWW, LightningBase, SiteGround, WP Land all handled the test without any issues.

Who had some minor issues?

BlueHost had a couple spikes during the test which caused some errors and timeouts, but they weren't substantial.

FlyWheel had a spike at the very end of the test which caused a large increase in response times.

Pressed started to have a ramp up in response times but it never errored or timed out during the test.

Who had some major issues?

GoDaddy, MediaTemple and TrafficPlanetHosting seemed to pretty clearly hit security measures which couldn't be worked around. The response times were relatively stable, but errors shot up, which is symptomatic of a security measure kicking in rather than the server being taxed. It's hard to know how they would have performed without the security measures.

A2 and Hosting Agency did not take kindly to the Blitz test and crashed almost immediately under load.

3. Uptime Monitoring

Both uptime monitoring solutions are third-party providers that offer free services; UptimeRobot was upgraded to a paid plan to monitor on a 1-minute interval. All the companies were monitored over approximately two months (May-June 2016).

Uptime Robot & StatusCake

Company StatusCake UptimeRobot
A2 99.92 99.91
BlueHost 30.22 18.06
DreamHost 99.97 99.97
FlyWheel 99.96 99.98
GoDaddy 99.96 99.98
Hosting Agency (DE) - 100
IWW 99.73 99.88
LightningBase 99.99 100
Media Temple 99.96 99.95
Pressed 100 99.87
SiteGround 99.97 99.98
TrafficPlanetHosting 99.98 99.98
WP Land 99.92 100

BlueHost screwed up and cancelled this account mid-testing, causing the uptime to look horrific. Their other two plans, which were not cancelled, had measurements of 99.98, 99.98, 100 and 99.99 percent uptime. I'm upset that it happened, there was a struggle to restore the account, and I have to take credit away for this type of screw-up. But they were able to keep the other servers up with near-perfect uptime, which I think should be stated here as well.

Hosting Agency for some reason couldn't be monitored by StatusCake (an HTTP/2 issue StatusCake still hasn't fixed after nearly 9 months; UptimeRobot fixed the same issue within 24 hours when I notified them). But Hosting Agency had 100% on UptimeRobot, so it looks good.

IWW had a bunch of short outages and one longer one (2hr 33m) which brought its uptime down.

Pressed had 1hr 51m of downtime (502 error) recorded by UptimeRobot that StatusCake never picked up. I'm not sure what to make of that; it might be a problem with UptimeRobot's servers connecting properly, since StatusCake detected nothing over an interval that long.

Everyone else had above 99.9% uptime.
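As a rough sanity check on these numbers, an outage's effect on an uptime percentage is simple arithmetic. For example, Pressed's single 1hr 51m outage over roughly 61 days of monitoring works out to about 99.87%, matching the UptimeRobot figure in the table (a small illustrative calculation, not part of the monitoring services' own math):

```python
def uptime_percent(downtime_minutes: float, monitored_days: float) -> float:
    """Share of the monitoring window the site was reachable, in percent."""
    total_minutes = monitored_days * 24 * 60
    return 100.0 * (1 - downtime_minutes / total_minutes)

# Pressed: one 111-minute (1hr 51m) outage over ~61 days of monitoring
print(round(uptime_percent(111, 61), 2))  # 99.87
```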

4. WebPageTest.org

Every test was run with the settings: Chrome Browser, 9 Runs, native connection (no traffic shaping), first view only.

Company WPT Dulles WPT Denver WPT LA WPT London WPT Frankfurt WPT South Africa
A2 0.819 0.638 1.109 1.181 1.687 5.054
BlueHost 0.902 0.521 0.878 1.532 1.874 3.483
DreamHost 0.769 0.777 1.444 1.107 1.64 4.33
FlyWheel 0.74 0.722 1.077 1.082 1.649 5.241
GoDaddy 0.939 0.728 0.834 1.376 1.992 6.909
Hosting Agency (DE) 1.299 1.258 2.17 0.985 1.55 4.905
IWW 0.544 0.658 0.864 0.929 1.416 4.105
LightningBase 0.62 0.598 1.078 0.95 1.471 5.764
Media Temple 0.86 0.667 0.811 1.313 1.945 4.645
Pressed 0.773 0.902 1.276 1.176 1.691 4.845
SiteGround 0.741 0.64 1.048 1.06 1.721 4.94
TrafficPlanetHosting 0.793 0.562 1.26 1.212 1.723 3.522
WP Land 0.719 0.689 1.154 1.099 1.709 4.8

 

Company WPT Singapore WPT Shanghai WPT Japan WPT Sydney WPT Brazil
A2 2.244 22.287 1.974 2.003 1.895
BlueHost 2.255 22.728 1.809 1.467 2.274
DreamHost 1.93 22.186 2.028 1.954 1.747
FlyWheel 1.765 12.549 1.845 1.816 1.758
GoDaddy 2.173 22.373 1.826 1.959 2.103
Hosting Agency (DE) 2.311 22.406 2.651 2.772 2.596
IWW 1.98 22.547 1.615 1.96 1.535
LightningBase 1.999 19.731 1.708 1.913 1.661
Media Temple 2.113 22.141 1.802 1.959 2.135
Pressed 2.233 23.691 1.997 2.037 1.894
SiteGround 2.131 22.718 1.843 2.079 1.788
TrafficPlanetHosting 2.081 22.74 1.872 1.595 1.816
WP Land 2.25 22.305 1.852 1.959 1.752

What I learned is that getting traffic into China is terrible; nobody really did well from the Shanghai location. South Africa is also really slow. Most servers were US-based but were delivering content to most corners of the world in about 2 seconds or less, which is impressive. Hosting Agency, based in Germany, was a bit disappointing: relatively slow to the US, and not even the fastest to London or Frankfurt. LightningBase and IWW were able to beat the German company by a large margin in the US and also to Europe, which reinforces that geographic location isn't everything in terms of speed.
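To see how much the Shanghai outlier distorts a simple summary, here is A2's row from the two tables above averaged with and without that location (a small illustrative calculation, not part of the original methodology):

```python
from statistics import mean

# A2's WebPageTest load times in seconds, in table order:
# Dulles, Denver, LA, London, Frankfurt, South Africa,
# Singapore, Shanghai, Japan, Sydney, Brazil
a2 = [0.819, 0.638, 1.109, 1.181, 1.687, 5.054,
      2.244, 22.287, 1.974, 2.003, 1.895]

print(round(mean(a2), 2))               # 3.72 seconds, Shanghai included
print(round(mean(a2[:7] + a2[8:]), 2))  # 1.86 seconds, Shanghai excluded
```

One slow location roughly doubles the global average, which is why the per-location tables are more informative than any single number.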

I wish I could compare averages against last year, but they removed one of the testing locations (Miami), and I ran a global test instead because that was something people wanted to see.

5. WPPerformanceTester

Company PHP Bench [Seconds] (lower=faster) WP Bench [Queries Per Second](higher=faster)
A2 12.626 570.78
BlueHost 13.089 1083.42
DreamHost 17.104 446.23
FlyWheel 11.761 387.3
GoDaddy 13.804 278.47
Hosting Agency (DE) 6.501 45.28
IWW 7.637 1869.16
LightningBase 10 1315.79
Media Temple 12.241 339.79
Pressed 11.036 217.2
SiteGround 11.497 733.14
TrafficPlanetHosting 8.666 918.27
WP Land 14.485 684.93

What was enormously interesting about the WPPerformanceTester results this year was the much larger spread and the faster results. Last year almost everyone was around 10-14 seconds for the PHP Bench, with the outliers of PressLabs doing 8.9 and DreamHost at 27. DreamHost again has the dubious honor of the slowest PHP Bench, but it improved by a whopping 10 seconds, down to 17. The fastest was Hosting Agency at 6.5, more than a full 2 seconds faster than last year's fastest. IWW and TrafficPlanetHosting also managed sub-10-second times.

Last year's fastest WP Bench was 889 queries per second. That was blown away by this year's testing, with IWW leading the group at more than double that speed (1869). BlueHost, LightningBase and TrafficPlanetHosting all managed to beat last year's fastest benchmark as well. Unfortunately, Hosting Agency's incredibly fast PHP Bench is somewhat cancelled out by their WP Bench score, the slowest in the group and slower than last year's slowest. It should be noted that transaction speed isn't always a great measure on distributed/clustered/cloud systems that may be running databases on different machines, but at the entry level that's less of an issue. Generally, the incredibly fast scores you see come from local databases with no network latency overhead.

Conclusion

It is nice to get back to a real entry-level analysis with a much more level playing field. Having 13 different companies to choose from in the <$25/month range is fantastic. Despite the change in this year's format, the lower-end plans still outperformed the fastest competitors from last year's tests, which included plans up to ~$300/month.

Despite the hard price cap in this bracket of testing, there were still some companies that handled all the tests without any serious issue. Many more did very well but ran into minor issues.

The number of companies jumping into the space is a fantastic win for consumers. In this tier we saw A2, Pressed, WP.land, Hosting Agency, IWW and Traffic Planet Hosting all enter for the first time. They target a variety of niches within the space, and it's a win for us, the consumers, to have more good choices and options. From a performance standpoint, you can still get amazing performance value for the money even at the lowest tier.

Without further ado, I will tell you who had the best performance, who deserved an honorable mention and then analyze each host individually. I still don't believe in ranking in any particular order, only grouping companies by how well they performed.

Top Tier WordPress Hosting Performance

[Image: Review Signal 2016 top tier award, <$25/month bracket]

DreamHost [Reviews], LightningBase, and SiteGround [Reviews].

All three of these companies went through the full testing without any meaningful issues.

Honorable Mentions

Pressed had an odd uptime issue and also showed some signs of server stress during the Blitz test. For a brand-new company they performed admirably, but comparing their results against the three top-tier companies, I'm not quite comfortable awarding them top-tier status yet. Still, they put on a very good showing.

WP.land did well in every test except LoadStorm, where it had a roughly 4% error rate. It looked like a security issue with wp-login, which isn't uncommon, but there were some spikes/delays as well. It could just be security acting up; either way, a minor issue kept it out of the top tier, but it was worthy of an honorable mention from yet another newcomer to this year's testing.

GoDaddy [Reviews]/MediaTemple [Reviews]: I combine these because they run on the same tech, the results look very similar, and both hit the same security issues. You can pretty clearly see when the security measures kick in on Blitz, and even working with their tech team we weren't able to come up with a way to responsibly bypass them. LoadStorm had a spike at the start from wp-login issues, but it resolved quickly and the response time graph was flat. It's possible their tech is just as good as the top-tier hosts'; I wasn't able to measure it accurately because of the security measures, but it looks very good and at least deserves the honorable mention.

Traffic Planet Hosting is another new entrant and had issues similar to GoDaddy/MediaTemple's. Security measures caused some problems on the Blitz test, though it did start to show some load too. Not perfect, but it did well on LoadStorm as well.

Individual Host Analysis

A2 Hosting [Reviews]

A2 Hosting was a new entrant to this test and, as much as I love competition in the space, A2 fell short. Other than uptime monitoring, where they did well, they struggled in all the load testing experiments.

BlueHost [Reviews]

BlueHost messed up my account in this test, and the uptime was terrible because of it. That alone ruined the uptime test, although, as I stated in that section, the other servers, which were on different accounts, all maintained excellent uptime. They did OK in the Blitz test, but not in the LoadStorm test. They also, surprisingly, managed the fastest individual WebPageTest score of any host in this price range. Compared to last year I don't see any huge signs of improvement in performance.

DreamHost [Reviews]

Last year DreamHost's DreamPress product almost made the top tier except for some major downtime issues. This year, they had no such downtime issues and the performance remained top notch. DreamHost earned the top tier status for the <$25/month price bracket. It appears to be an excellent product priced very competitively.

FlyWheel [Reviews]

FlyWheel only entered one product this year and it was less powerful than last year's. It struggled a bit more on the LoadStorm test but the Blitz was perfect (although for this price tier, it was a weaker test than last year's test). They explicitly stated for LoadStorm that the plan was inappropriate for that level of traffic. They can probably handle bigger sites, but if we're comparing dollars to performance, they fell short in this price bracket on that metric. But they are still rated as the most well liked company that we track at Review Signal, so they are clearly doing something right in terms of product and customer service.

GoDaddy [Reviews]

GoDaddy had a stalwart performance marred by what appeared to be security measures. They very well could have a top notch product but we couldn't work out a responsible way to bypass the security measures for the Blitz load test. LoadStorm looked pretty good, one small spike to start and steady up to 2000 users. GoDaddy earned an honorable mention status because the product didn't seem to encounter any non-artificial problems.

Incendia Web Works

IWW did a great job in both load tests. The only concern was uptime, where IWW had 99.73% and 99.88% as recorded by each service. The performance component is definitely there, but a little more consistency and we have another serious competitor in the space. The only reason they didn't earn honorable mention while Pressed did is that there were conflicting uptime reports for Pressed where one showed 100% and the other recorded sub 99.9% uptime. Two independent services showed IWW below 99.9%, so there isn't much doubt about it in my mind. Like DreamHost last year, they put on a great performance showing and I hope next year the servers are a bit more stable and I can award top tier status.

LightningBase

LightningBase continues to impress. The last two years they've put on consistently near perfect tests. Their Blitz result was perfect and their LoadStorm had only 5 errors out of 314439 requests. Combined with 100/99.99% uptime monitors, LightningBase is unquestionably in the top tier for the <$25/month WordPress hosting bracket.

MediaTemple [Reviews]

MediaTemple's results basically mirrored GoDaddy's. If you removed the names, it would be hard to tell the graphs apart. The MediaTemple/GoDaddy platform appears to be very solid, but we couldn't responsibly get past some security measures, so I couldn't award it top-tier status; MT earned an honorable mention.

Pressed

Pressed earned itself an honorable mention. It had a weird uptime issue, but more importantly it started to show signs of load during the Blitz test, where I would expect a flat response time from a static cache test. It's a very new product and I'm sure we'll continue to see tremendous improvements as time goes on; this was a very good performance from possibly the newest company in this year's testing.

Hosting Agency

Hosting Agency performed as expected: it appears to have no special WordPress optimizations. If you were to install a basic LAMP stack, this is the performance I would expect out of the box. They had perfect uptime and, oddly, found themselves on both ends of the spectrum in my WPPerformanceTester results. They weren't faster to England or Germany on WebPageTest, which I suspect is because there were no special caching technologies to accelerate delivery of pages despite being geographically closer. And they simply collapsed during the load tests, especially Blitz, which is essentially a static cache test (and they have no cache). Another important note: their entire system is in German only.

SiteGround [Reviews]

SiteGround got even better this year. They jumped up from honorable mention to top tier status. Their Blitz and LoadStorm tests both improved while everything else remained at a high level. An all around fantastic performance which deserved top tier status.

Traffic Planet Hosting

Another newcomer to this year's testing, TPH put on a good show. There seemed to be some security measures that ruined the Blitz testing, but the LoadStorm test looked very solid. They earned an honorable mention because the only issue seemed artificial. I'm less confident about the quality of the product than GoDaddy/MediaTemple, but it still seemed to warrant recognition.

WP.land

WP.land was the final new entrant, and they put on a fantastic showing. Everything went nearly perfectly except the LoadStorm test, where wp-login seemed to trigger some security measures. But response times were pretty stable and quick despite the ramp-up to 2000 users. They also had a perfect Blitz test with no errors and a 1ms spread from fastest to slowest response times. WP.land earned honorable mention status because overall it was a very good performance with a small issue that might be security related.

 

WordPress Hosting Performance Benchmarks (2016)


Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

2018 WordPress Hosting Performance Benchmarks is now live.

This is the fourth round of managed WordPress web hosting performance testing. You can see the original, the 2014 version, and the 2015 version.

Companies Tested

A2 Hosting [Reviews]
BlueHost [Reviews]
CloudWays [Reviews]
Conetix
DreamHost [Reviews]
FlyWheel [Reviews]
GoDaddy [Reviews]
Incendia Web Works
Kinsta
LightningBase
LiquidWeb [Reviews]
MediaTemple [Reviews]
Pagely [Reviews]
Pantheon [Reviews]
Pressable (Formerly ZippyKid)
Pressed.net
Pressidium
Pressjitsu
PressLabs
Hosting Agency (German)
SiteGround [Reviews]
Traffic Planet Hosting
WordPress.com VIP
WPEngine [Reviews]
WP.land
WPOven.com

Companies that didn't participate this round but did in previous rounds: WebHostingBuzz, WPPronto, Nexcess, A Small Orange [Reviews] and WebSynthesis [Reviews].

Every plan was donated by the company for testing purposes with the strict stipulation that it would be the same as if any normal user signed up. A notes section at the end of this post details the minutiae of changes made to plans. Nearly every single company had security measures that I had to get around, so they worked with me to make sure my testing went through properly. Load testing often looks like an attack, and cooperation is the only way I can run these tests.

The Products

This year is a bit different from years past, where every company and plan competed against one another. When I started, the price gap was from $5/month to $29/month. Last year the gap was $5.95 to $299. I was only testing entry-level plans, but the market has dramatically changed since I first got started. Today there is demand at many different price points, and lots of companies have gone upscale, with WordPress.com VIP at the top of the bracket starting at $5,000/month. The only logical way to break things up was by price bracket. Below you will see the brackets and which companies participated; specific details will be included in each bracket's write-up.

 

<$25/m $25-50/m $51-100/m $101-200/m $201-500/m $500+/m
A2 Hosting A2 Hosting LiquidWeb A2 Hosting Kinsta Kinsta
Bluehost Conetix Bluehost Bluehost Media Temple Pagely
DreamHost LLC Lightning Base Cloudways (AWS) Conetix Pagely Pantheon
Flywheel Pantheon Cloudways (Google) Kinsta Pantheon Pressable
GoDaddy Pressable Kinsta Liquid Web Pressable Pressidium
Incendia Web Works Pressjitsu Lightning Base Pressable Pressidium WordPress.com VIP
Lightning Base SiteGround Media Temple Pressidium Presslabs WP Engine
Media Temple WP Engine Pagely Pressjitsu SiteGround
Pressed WP.land Pantheon
Hosting Agency.de Cloudways (DigitalOcean) Pressable
SiteGround Cloudways (Vultr) Pressidium
Traffic Planet Hosting WPOven SiteGround
WP.land

 

Methodology

The question I tried to answer is how well do these WordPress hosting services perform? I tested each company on two distinct measures of performance: peak performance and consistency. I've also included a compute and database benchmark based on a WordPress plugin.

All tests were performed on an identical WordPress dummy website with the same plugins except in cases where hosts added extra plugins. Each site was monitored for approximately two months for consistency.

1. LoadStorm

LoadStorm was kind enough to give me resources to perform load testing on their platform, and multiple staff members were involved in designing and testing these WordPress hosts. I created identical scripts for each host to load a site, log in to the site and browse the site. Logged-in users were designed to break some of the caching and better simulate real user load. The number of users varies by cost.

2. Blitz.io

I used Blitz again to compare against previous results. This tests the static caching of the homepage. I increased the number of users based on monthly cost this time.

3. Uptime (UptimeRobot and StatusCake)

Consistency matters. I wanted to see how well these companies performed over a longer period of time. I used two separate uptime monitoring services over the course of a month to test consistency.

4. WebPageTest.org

WebPageTest with 9 runs, first view only, native connection. I tested from Dulles, Denver, Los Angeles, London, Frankfurt, South Africa, Singapore, Shanghai, Japan, Sydney, Brazil.

5. WPPerformanceTester (free plugin on WordPress.org)

I created a WordPress plugin to benchmark CPU, MySQL and WordPress database performance. The CPU/MySQL benchmarks test compute power. The WordPress component tests actually calling $wpdb and executing insert, select, update and delete queries.
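The plugin itself is PHP, but the WP Bench idea described above (time a fixed batch of insert/select/update/delete queries and report queries per second) can be sketched in a few lines. This is an illustrative Python version using SQLite as a stand-in for the MySQL database the real plugin reaches through $wpdb, so the absolute numbers are not comparable to the table above:

```python
import sqlite3
import time

def wp_bench(n: int = 1000) -> float:
    """Time n insert/select/update/delete cycles; return queries per second."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE bench (id INTEGER PRIMARY KEY, val TEXT)")
    start = time.perf_counter()
    for i in range(n):
        db.execute("INSERT INTO bench (id, val) VALUES (?, ?)", (i, f"row{i}"))
        db.execute("SELECT val FROM bench WHERE id = ?", (i,))
        db.execute("UPDATE bench SET val = ? WHERE id = ?", ("updated", i))
        db.execute("DELETE FROM bench WHERE id = ?", (i,))
    elapsed = time.perf_counter() - start
    return (n * 4) / elapsed  # four queries per cycle

print(f"{wp_bench():.2f} queries/second")
```

An in-memory local database like this is exactly the "no network latency overhead" case mentioned in the results discussion, which is why such setups post the highest scores.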

 

Notes - Changes made to Hosting Plans

A2 - VPS servers can't install WordPress out of the box without an extra payment for Softaculous. Disabled reCAPTCHA.

Conetix - disabled WordFence and Stream plugins.

SiteGround - fully enabled the SuperCacher plugin

GoDaddy - 24 database connection limit increased if you notify them of heavy load

CloudWays - disabled WordFence

LiquidWeb and HostDime no longer providing Shared Hosting

I've updated the site today to reflect that LiquidWeb and HostDime no longer provide shared hosting.

It leaves quite a large gap between SiteGround (72%) and pretty much everyone else still in the shared hosting space (<60%).

I do wonder if this is a bellwether for shared hosting becoming a thing of the past. There are still millions of people on it, and in all likelihood there will continue to be. But we've seen the rise of all sorts of specialty hosting, which is likely eating up a lot of the market. The rise of developer-oriented providers like Amazon, Azure and DigitalOcean has opened the floodgates for building services on top of them. We've seen numerous companies built on top of these providers and targeting niches, especially WordPress, such as FlyWheel and Pagely. We've even seen configurable providers like CloudWays, which let you select the cloud provider of your choice and install and manage your websites on it.

These new hosting providers are charging more and giving users different experiences. Developers have flocked to them and are building the next generation of web hosting services. High-quality companies seem to be moving upmarket, charging more and providing more, where I'm guessing the margins are substantially better than in the shared hosting space (unless you're trying to upsell everything).

It will be interesting to follow. Will we continue to see more consolidation à la EIG and GoDaddy? Is there room for another great shared hosting provider that grows very large? Or will shared hosting slowly fade away as superior technologies (VPS) and specialized companies eat away at it, providing the specific services people really want? We've also seen non-webhosts like SquareSpace, Wix and Weebly gain large market share, with BuiltWith estimates ranging from 880,000 to 1.6 million websites for each of them.

The one trend I am not a fan of is that there are fewer and fewer really good choices in the shared hosting space that are of significant scale.

WordPress.org Hosting Recommendations Listing Criteria

UPDATE (5/13/2016 7:05 PM): Official comment from Matt Mullenweg was posted. Quoted below, click or scroll to the comment section to see the original.

“I would like to see some transparency in the process”

As stated on the page, the listing is completely arbitrary. The process was: There was a survey, four applicants were chosen, and the page was updated. That might repeat later in the year, or the process might change.

“how criteria are weighted”

There is no criteria or weighting. It ultimately is one person’s opinion. Information that is important is reflected in the questions asked in the survey, but that is not everything that is taken into account. (I have looked at this site in the past, for example.)

“who is making the decisions”

I am. James helped in sorting through the many submissions that came in, doing additional research, and digging on finalists, but ultimately the decision was mine. You can and should blame me fully for any issues you have with it. I appreciate James’ help in this go-round, but he will not be involved at all with any future updates. (So, please leave him alone.)

“how much money is involved”

There was no money involved. Obviously being listed on the page is hugely valuable and impacts the listed (or unlisted) businesses a great deal. This is why I take full responsibility for the listing, now and in the future — I have been fortunate to be extraordinarily successful and no financial or business consideration any of the applicants could offer matters to me. A host could offer $100,000,000 to be listed on the page for 1 day, and I would say no.

-Matt Mullenweg


Yesterday, I posted WordPress.org Updates Hosting Recommendations, Nobody Knows Selection Criteria, which naturally meant I was going to find out as much as I could about the process, because it's a big deal, and my mission here at Review Signal is honest and transparent web hosting reviews.

I confirmed with multiple sources that the newly listed companies didn't pay any money to get listed. Everyone seems to have filled out the form and then heard nothing back until the updated page was published yesterday. Both the winners (BlueHost [Reviews], DreamHost [Reviews], FlyWheel [Reviews], SiteGround [Reviews]) and losers (everyone else) seemed to agree on this process based on everyone I talked to.

Great. The application process seems fair.

But the selection process is still a black box. With help from people who follow WordPress more closely than I do, I found James Huff (macmanx), a 12-year volunteer and 5-year employee at Automattic who was directly involved with the new WP.org hosting recommendations.

[Screenshot: conversation with James Huff, part 1]

I didn't hide who I was or my interest. The most concerning part of this exchange was that 'Absolutely no money changed hands, unless you consider sponsorship of WordCamps as monetary with regards to the "contributions to WordPress.org."'

No money changed hands, except a lot of sponsorship dollars to the organization. Guess who the top global Gold community sponsors are? BlueHost (and JetPack/WooCommerce, both owned by Automattic). BlueHost is somehow also a Silver sponsor, along with GoDaddy. BlueHost is pouring a lot of money into WordCamps and the WordPress.org Foundation.

I'm sorry, but I do consider that money changing hands. They are giving a large sum of money - it's material enough to mention in their SEC filings.

[Screenshot: conversation with James Huff, part 2]

We're still going to have to agree to disagree about what money changing hands means. He says it was fair, but fair is pretty meaningless when we don't have any insight into what standard of fairness is the goal, or how each criterion is being weighed and evaluated. Yet this is the list of hosts that they can confidently tell everyone are good.

I'm not sold.

[Screenshot: conversation with James Huff, part 3]

Historical perception seems to be a proxy for what marketers might call Net Promoter Score (NPS): how much consumers like and recommend something. That's essentially what I measure here at Review Signal, and my data has been incredibly close to what companies' internal data shows (LiquidWeb NPS Scores vs LiquidWeb Review Signal Rating).

It is arguably the most important factor in recommendations, and for service businesses it's about the best all-encompassing quality metric available.

But it's only part of the criteria, and that's fair. Should there be some minimum threshold, though? Can a company that scores zero in quality but high in everything else be worthy of a listing? BlueHost's rating is 41%, which means roughly 6 in 10 people don't recommend it or have anything good to say about them.

There are WordCamp sponsors that didn't make the cut. Of the three hosting companies among the global community sponsors, two made the list (BlueHost and DreamHost) while one didn't (GoDaddy). But the largest sponsor made it, sits at the top of the list, and is still BlueHost.

Moving on, James mentioned that Automattic has no role in the process, but he does wear multiple hats, which means he is aware of the potential perception of a conflict of interest.

James_huff4

Finally, a mention of Matt. Important again when thinking about the context for potential conflicts of interest. I outline what would happen in a dream world and what's realistic. I think honest disclosure and basic transparency is perfectly realistic. It's ok to make money, just be clear about where it's coming from. A standard I try to uphold here at Review Signal, see how we make money and read the entire process for how our rankings are calculated. See? It's not hard and I still make money giving the best information available.

James_huff5

AWP comments

That is the comment thread I referenced. Not a single person said anything positive about BlueHost, and the assumption is they simply paid for placement. Listing BlueHost ruins the credibility of the recommendations when there is no transparency about the criteria being used.

James_huff6

Moving on, the survey itself has issues which I brought up before. It's asking for sensitive company information and being handled by employees of a company that owns two competitors in the space (WP.com VIP, Pressable), took $15 million in investment from another (BlueHost), and is an investor in a fourth competitor (WP Engine).

That seems like a huge potential conflict of interest and I know it dissuaded at least one company from even applying.

James_huff7

james huff 3 tweets

It didn't end on the nicest note; I don't think James took my criticisms well. From his original messages, I think he knows and understands the perception of conflicts of interest, but admitting them in this context puts him in a very awkward position that I don't envy. He wears multiple hats and surely wants to wear them all fairly. Admitting that those multiple hats have the potential for conflicts of interest isn't a weakness of character, it's an admission of humanity. I'm sure James is a great guy who has done a lot of good for the community. But people who could be perceived as having a strong potential conflict of interest, which anyone connected to Automattic would have in this situation, shouldn't be managing this particular process.

I truly don't have any ill will towards James personally, or Automattic. Even with BlueHost/EIG, I've been more than willing to give them the benefit of the doubt and continue to hope that they will get better (ASO did break my heart a bit; I thought they were turning EIG around). My data continues to show them being mediocre, with a seeming touch of death in terms of quality (their strategy does appear to be cost cutting and economies of scale). But I don't fault them for their behavior; I expect it, and it's well documented in their SEC filings.

Conclusion

I still think WordPress.org can do better with its hosting recommendations, and I'm not going to stop advocating until they are better. I would like to see some transparency in the process: how criteria are weighted, who is making the decisions and how much money is involved. I think the companies that applied would appreciate feedback about why they weren't selected and how they fall short of the companies that made the cut. Or just call them Ads / Sponsors. Don't say they are the best and brightest and endorse them; say, "we took money and this company paid us the most." At least that meets the minimum threshold of honesty and transparency.

 

References

For posterity, the logs in their entirety are available below. It's long, so I tried to cut down some stuff to get to the most important bits. But I don't want to hide anything.

Direct Message Archive macmanx Making WordPress Slack Direct Message Archive macmanx Making WordPress Slack2

Endurance International Group – Profitable?

Endurance International Group is one of the largest web hosting companies, owning many of the brands you see in the consumer space. EIG owns A Small Orange, BlueHost, HostGator, HostMonster and JustHost, to name a few of their most well known brands.

What caught my eye was an article on Nasdaq, where EIGI (EIG's stock ticker) is up and at an all-time high. A lot of analysts are rating it a buy, and the price surge seems to indicate people are listening. But I'm not a financial adviser, nor am I interested in making stock recommendations.

What does interest me is web hosting and considering that is the core of EIG's business, the underlying numbers are quite fascinating.

EIG had its first year with positive operating income: $629.85 million in revenue against $617.37 million in total operating expenses, leaving $12.48 million in operating income. However, they still weren't profitable because they are paying off a lot of debt; EIG's net income was a loss of $42.82 million.

"Total subscribers increased by 91,000 in the fourth quarter. Average monthly revenue per subscriber rose 12% year over year to $14.78. For all of 2014, the number of subscribers rose 17% to 4.087 million and the average monthly revenue per subscriber increased 11% to $14.48." - according to the article on Nasdaq

$14.48 per month per subscriber works out to $173.76 per year per subscriber. It's easy to understand how they can pay such high affiliate commissions with those numbers. That figure also seems to be trending up, which is a good sign for the company's financial direction.

How does that compare to other companies?

I dug up an old GoDaddy S1 from 2014 [Godaddy Reviews] which states their average revenue per user for the trailing 12 months is $105 (it's fluctuated between $93-$105 over the past few years).

I also found Web.com's latest 10-K filing, which stated a monthly ARPU of $14.62, or $175.44 annually.
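The annualized figures above are simple multiplication; as a quick sanity check:

```python
# Average monthly revenue per subscriber, from the filings cited above
eig_monthly_arpu = 14.48  # EIG, full-year 2014 average
web_monthly_arpu = 14.62  # Web.com, latest 10-K

eig_annual = round(eig_monthly_arpu * 12, 2)  # 173.76 per subscriber per year
web_annual = round(web_monthly_arpu * 12, 2)  # 175.44 per subscriber per year
print(eig_annual, web_annual)
```

Both annual figures in the text check out against the monthly numbers from the filings.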

EIG and Web.com look very similar: both just reached positive operating income this year, with very similar revenue per subscriber. Web.com's filing states it pretty clearly: "The growth in average revenue per subscriber continues to be driven principally by our up-sell and cross-sell campaigns focused on selling higher revenue products to our existing customers as well as the introduction of new product offerings and sales channels oriented toward acquiring higher value customers."

It seems like common knowledge to anyone in the web hosting industry that these companies get users in cheap. Those ~$5/month hosting plans are obviously not the only thing being sold; on average, they manage to roughly triple that monthly figure by selling other services.

So the question in my mind becomes: what do those new products look like? We're seeing a jump into the managed WordPress hosting space. Will there be actual innovation, or are these big companies simply going to carve out some of the high-margin services provided by niche providers? Is that going to be a win for consumers?

I don't have the answers, but I'm certainly interested to see how it plays out.

GoDaddy WordPress Hosting Review

GoDaddyLogo

This post is based off WordPress Hosting Performance Benchmarks, where you can read the full details of how GoDaddy performed against the competition.

Overview

GoDaddy [Reviews] is the company that sparked this series of WordPress performance testing. They claimed their WordPress hosting was as good as anyone else's on the market. I wanted to see if that was true in our first round of WordPress testing, and it certainly was. The second round of testing did not disappoint either: GoDaddy maintained their position in the top tier of managed WordPress providers. This article summarizes GoDaddy's performance under multiple testing scenarios.

The Plan

All testing was done on GoDaddy's shared WordPress hosting service, which costs $6.99/month. The plan allows 1 site, 100GB of space and 25,000 visitors per month. It also has the usual features: automated backups, 24/7 support via phone or ticket, and free migrations.

Performance

LoadStorm

The first performance test was done with LoadStorm. GoDaddy made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see GoDaddy's result in this graph (click on it to play with the interactive results):

Load-Storm-GoDaddy-2000

 

GoDaddy actually pushed through triple the amount of data compared to every other web hosting company, and amazingly, it didn't struggle with this at all. The error rate was infinitesimal: only 92 errors in over 285,000 requests. Other than a little spike, it handled the entire test without missing a beat.

Blitz

The second load test that was run on GoDaddy was Blitz.io. Blitz was used to test cached performance. It simply requested the home page of our test site from 1-2000 times per second.

Blitz-GoDaddy-2000

 

GoDaddy's Blitz results look exactly like what you hope for. A small spike at the very beginning and then perfectly stable performance afterwards. There are no complaints against its cached performance. Full Blitz Results (PDF)

Uptime

Two third-party uptime monitoring services (UptimeRobot and StatusCake) tracked the test site for a month. The results for GoDaddy were 99.9% and 100% uptime respectively. That is the uptime level you would expect of any good service.

WebPageTest

“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. GoDaddy was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Company Dulles,VA Miami, FL Denver, CO Los Angeles, CA Average
GoDaddy 1.607 1.355 0.934 0.855 1.18775
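The average column is just the mean of the four test locations, which is easy to verify:

```python
# Full page load times in seconds from each WebPageTest location
times = {"Dulles": 1.607, "Miami": 1.355, "Denver": 0.934, "Los Angeles": 0.855}

average = sum(times.values()) / len(times)
print(round(average, 5))  # -> 1.18775
```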

There were absolutely no issues with their WebPageTest results; the site loaded very quickly, averaging under 1.2 seconds to load completely.

Conclusion

GoDaddy [Reviews] is one of the top-tier WordPress hosting providers when looking at performance. GoDaddy continues to surprise me. They flew through all the tests, including a weird quirk where they transferred 3X the data during the LoadStorm test without showing any signs of stress. The only direct comparison I have to last time is the Blitz testing, where they eked out another 3000+ hits and raised their average from 829 to 888 hits/second. GoDaddy also raised their max hit rate marginally, from 1750 to 1763. What's more impressive is they reduced their errors+timeouts from 686 to 93. More hits with fewer errors. From a performance perspective, they did excellently, both in absolute terms and relative to their last benchmarks.

Get 25% off with Coupon Code: cjcwp1

GoDaddyLogo

 

Bias, Negativity, Sentiment and Review Signal

Photo Credit: _Abhi_

People are more likely to express negative sentiments or give negative reviews than they are positive ones.

I hear this in almost every discussion about Review Signal and how it works. There are certainly lots of studies to back it up. One major study concluded that bad is stronger than good. One company found people were 26% more likely to share bad experiences. There is plenty of research in the area of Negativity Bias for curious readers.

Doesn't that create problems for review sites?

The general response I have to this question is no. It doesn't matter if there is a negativity bias when comparing between companies because it's a relative comparison. No company, at least not at the start, has an unfair advantage in terms of what their customers will say about them.

Negativity bias may kick in later, when customers have had bad experiences and want to keep sharing that information with everyone and anyone despite changes in the company. Negative inertia, the stickiness of negative opinion, is a real thing. Review Signal doesn't have any mechanism to overcome it beyond counting every person's opinion once. That controls it at an individual level, but not at a systemic level if a company has really strong negative brand associations.
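As a rough illustration of the "count every opinion once" rule, deduplicating by author before scoring might look like the following. This is a hypothetical sketch of the idea, not Review Signal's actual pipeline:

```python
def rating(reviews):
    """Fraction of authors whose most recent opinion is positive.
    `reviews` is a list of (author, timestamp, is_positive) tuples;
    each author counts exactly once, using only their latest review."""
    latest = {}
    for author, when, positive in reviews:
        if author not in latest or when > latest[author][0]:
            latest[author] = (when, positive)
    votes = [positive for _, positive in latest.values()]
    return sum(votes) / len(votes)

reviews = [
    ("alice", 1, True),
    ("bob", 2, False),
    ("bob", 5, False),   # bob's repeat complaint only counts once
    ("carol", 3, True),
]
print(rating(reviews))  # 2 of 3 authors positive
```

Repeated complaints from the same person don't drag the score down further, which is the individual-level control described above.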

What if a company experiences a disaster, e.g. a major outage, does that make it hard to recover in the ratings?

This was a nuanced question that I hadn't heard before and credit goes to Reddit user PlaviVal for asking.

Luckily, major outages are a rare event. They are fascinating to observe from a data perspective. The most recent and largest outage was the EIG (BlueHost, HostGator, JustHost, HostMonster) outage in August 2013. If we look at the actual impact of the event, I have a chart available here.

When I looked at the EIG hosts post-outage, there really hasn't been a marked improvement in their ratings. Review Signal's company profiles have a Trends tab for every company, graphing ratings on a per-month basis to show how a company has done over the past 12 months.

BlueHost-May2014 HostGator-May2014

There is definitely some variance, but poor ratings post-outage seem quite common. It's hard to make an argument that these companies have recovered to their previous status and are simply being held back by major outcries that occurred during the outage.

The only other company with a major outage I can track in the data is GoDaddy. GoDaddy has had numerous negative events since we started tracking them: the elephant killing scandal, SOPA, DNS outages and multiple Super Bowl events.

godaddy_chart

August 2012 - July 2013

Godaddy-May2014

June 2013 - May 2014

There are clear dips for events such as the September 2012 DNS outage and the Super Bowl in February. Their overall rating is 46% right now, and the trend is slightly up. But they seem to hover around 45-50% historically and maintain that despite the dips from bad events. There is arguably some room for them to be rated higher depending on the time frame you think is fair, but we're talking a couple percent at most.

What about outages affecting multiple companies, e.g. resellers or infrastructure providers like Amazon that others host on top of? Are all the affected companies treated equally?

No. Just because there is an outage with a big provider that services multiple providers doesn't mean that all the providers will be treated identically. The customer reaction may be heavily influenced by the behavior of the provider they are actually using.

Let's say there is an outage in Data Center X (DC X), which hosts Host A and Host B. DC X has an outage lasting 4 hours. Host A tells customers 'sorry, it's all DC X's fault,' while Host B tells customers 'We're sorry, our DC X provider is having issues; to make up for the downtime, your entire month's bill is free because we didn't meet our 99.99% uptime guarantee.' Even though Host A and Host B had identical technical issues, I imagine the responses from customers would be quite different. I've definitely had great customer service change my opinion of a company dramatically based on how they handled a shitty situation. I think the same applies here.
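To put Host B's gesture in numbers: a 99.99% monthly uptime guarantee allows only about four minutes of downtime in a 30-day month, so a 4-hour outage blows through it by a wide margin. A quick sketch:

```python
def monthly_uptime(downtime_minutes, days_in_month=30):
    """Uptime fraction for a month, given total downtime in minutes."""
    total_minutes = days_in_month * 24 * 60
    return (total_minutes - downtime_minutes) / total_minutes

outage = monthly_uptime(4 * 60)      # the 4-hour DC X outage
print(round(outage * 100, 2))        # ~99.44% uptime
print(outage >= 0.9999)              # guarantee met? False
```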

Customer opinions are definitely shaped by internal and external factors. The ranking system here at Review Signal definitely isn't perfect and has room for improvement. That said, right now, our rankings don't seem to be showing any huge signs of weakness in the algorithms despite the potential for issues like the ones talked about here to arise.

Going forward, the biggest challenge is going to be creating a decay function: how much is a review today worth versus a review from the past? At some point, a review of a certain age just isn't as valuable as a recent one, and that is a problem I'm going to have to address. For now, it's on the radar but doesn't seem like a major issue yet.
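One common shape for such a decay function is exponential, with a tunable half-life. A sketch of the idea (illustrative only; nothing like this is in production here, and the half-life would need to be chosen empirically):

```python
def review_weight(age_days, half_life_days=365.0):
    """Exponentially decay a review's weight with age: a review one
    half-life old counts half as much as one written today."""
    return 0.5 ** (age_days / half_life_days)

print(review_weight(0))    # 1.0  - today's review, full weight
print(review_weight(365))  # 0.5  - one year old, half weight
print(review_weight(730))  # 0.25 - two years old, quarter weight
```

The appeal of this form is that old reviews never drop to zero, they just matter less and less, which matches the intuition that stale opinions shouldn't dominate a rating.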

Managed WordPress Hosting Showdown – Performance Benchmarks Comparison

UPDATE: Round 2 of Testing (November 2014) is now available.

WordPress as a platform has become the most popular CMS around, claiming to power almost 19% of the web. As a result, managed WordPress hosting has become a very popular niche. Many companies in the managed WordPress space charge a steep premium over traditional shared web hosting providers. So beyond the marketing speak, what are you really getting? Most promise to make your life easier with features like automatic updates, backups, and security. They also claim to have great performance. It's hard to objectively test the ease-of-use features, but we can measure performance. There weren't many performance benchmarks that I could find, and the ones I could find were not very thorough. So I designed my own set of tests.

Companies Tested

A Small Orange* [Reviews]
Digital Ocean [Reviews]
GoDaddy* [Reviews]
Pagely
Pressable*
SiteGround*† [Reviews]
WebSynthesis* [Reviews]
WPEngine [Reviews]

*Company donated an account to test on. I checked to make sure I was on what appeared to be a normal server. GoDaddy had over 3000 domains on the same IP. SiteGround had 887 domains. A Small Orange was a VPS, so it should be isolated. Pressable and WebSynthesis didn't have any accounts on the same IP. I am not sure how isolated they are in their environments.

†Tests were performed with SiteGround's proprietary SuperCacher module turned on fully unless otherwise specified.

The Products

I created a comparison chart of all the companies and the product used in this test. It was mostly the basic/cheapest offer with the exception of SiteGround, because their cheapest hosting plan didn't have full WordPress caching built in, but it was still very much within the price range of other offers.

(Click to see full table)

comparison_chart_web

Methodology

The question I tried to answer is how well do these WordPress hosting services perform? I tested each company on two distinct measures of performance: peak performance and consistency.

1. Blitz.io

Load testing from the most well known load testing service. The first test ran for 60 seconds from 8 locations, each scaling from 1-125 concurrent users (1000 users total). For this test, each host ran an identical theme (Twenty Fourteen) with the out-of-the-box configuration. The second test ran for 60 seconds from 2 locations (Virginia/California), each scaling from 1-1000 users (2000 total). The configuration of each site was identical, with the Customizr theme and plugins.

2. Uptime (UptimeRobot and Uptime - a node.js/mongo project)

Consistency matters. I wanted to see how well these companies performed over a longer period of time. I used two separate uptime monitoring services: one existing third party service and one open source project.

3. WebPageTest.org

"WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster." WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. I tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

4. Unnamed Load Testing Service*

This service asked to remain nameless in this article. They do natural load testing and are in beta. I tested each WordPress host with the same theme (twenty fourteen) and the out of the box configuration for this test. I ran into some issues with this service which I will discuss later.

Background Information

Before I go over the results, I wanted to explain and discuss a few things. Every provider I tested had the latest version of WordPress installed. Every plugin that came with it was also up to date, with the exception of GoDaddy, which included an older version of JetPack (17 days out of date when I first set up).

I had some trouble getting set up on A Small Orange, the signup email was stuck in gmail's spam filter. I also found a potentially minor security issue in their customer system which they promptly responded to and fixed. I also had to specifically ask for the customized WordPress LEMP stack to be installed on my VPS.

GoDaddy stores SFTP and other critical details on a separate area away from your main GoDaddy account and WordPress admin (gateway.godaddy.com for anyone stuck looking).

I ran into issues with Pressable's CNAME redirect. It seemed to cache a coming soon page and didn't resolve itself by clearing any cache I could find. It resolved itself over a day or so, but being stuck with a coming soon page wasn't a pleasant first experience.

SiteGround includes CloudFlare, but I never got it working; it failed to configure on www, so I couldn't conduct the test with it enabled.

Pagely charges you extra for SFTP access (which I didn't pay for and made my own life a living hell while trying to do this test).

WebSynthesis came pre-installed with two themes that were out of date.

Results

Blitz.io

 Test 1. 1-125 Concurrent Users from 8 Locations over 60 seconds (Gallery)

 Discussion of Blitz Test 1 Results

The first thing I must note here is that two companies got absolutely destroyed by this test: Digital Ocean and A Small Orange.

My Digital Ocean VPS just died repeatedly. MySQL died and needed to be manually restarted. I thought it was a bad instance, so I spun up another and got the same result. I even tried installing a caching plugin to see if I could get any performance out of their WordPress stack. I had absolutely no luck. Given this result, I eliminated Digital Ocean from the rest of my testing. You can run high performance WordPress sites on Digital Ocean (Review Signal's blog is running on one currently), but it requires knowing what you're doing and isn't recommended for people looking for managed WordPress hosting. Digital Ocean is a self-managed VPS provider; it's not for beginners or those who need managed support of their WordPress site. I included Digital Ocean to see how their offer would fare against specialized companies. The short answer is, it doesn't compare, at all.

Another out-of-the-box install, this time with A Small Orange, got crushed by this test too. After consulting with A Small Orange support, it became apparent I wasn't on their customized WordPress setup. I asked for it to be installed, and all further tests were run on this much more performant setup. You will see two sets of results for ASO: the normal setup and the LEMP stack, which is their high-performance setup. One thing to note is that ASO offers less management on the customized WordPress setup because it no longer uses cPanel.

The lesson here is that WordPress, out-of-the-box with a LAMP stack, performs pretty badly. For a personal blog with low traffic, it probably won't matter, but for a site with any substantial amount of traffic, it will most likely crumble.

Who performed without any major issues?

A Small Orange (from now on, anytime I talk about ASO, it's about the specialized WordPress setup), Pagely, and SiteGround. Each of these companies had stable response times and few to no errors.

Who had some issues?

GoDaddy had an issue with errors in the middle of the test around 400 users but seemed to scale upward gracefully, maintaining steady load times, and the errors stopped. Pressable's response times were a bit varied, but Pressable didn't seem to have much trouble with the traffic: zero errors and minimal timeouts. WPEngine had a weird connection timeout issue around 600 users that resolved itself fairly quickly. WebSynthesis seemed to cap out at around 400 users/second with a few bursts; the response time remained steady, and it was erroring (connection reset) instead of timing out. WebSynthesis support told me "We analyzed the logs on the server and some of your requests are not being cached as your tests are throwing over 14K symbols in a single URL. This is not realistic for normal use cases of WordPress." Nevertheless, they made a tweak to the nginx (web server) config, and I tested it again in test 2.

Test 1. Quick Results Table

Success Errors Timeouts Avg Hits/second Avg Response (ms)
ASO 23788 18 2 396 241
GoDaddy 23962 165 0 399 227
Pagely 20132 1 0 336 459
Pressable 21033 0 19 351 412
SiteGround 19672 0 0 328 495
WebSynthesis 19995 4224 5 333 246
WPEngine 20512 192 196 342 395

GoDaddy, despite their small hiccups, managed to have the best average response time to 8 servers distributed across 5 continents (Virginia, Oregon, California, Singapore, Japan, Brazil, Australia, Ireland). Furthermore, they also managed to serve the most hits.

SiteGround had the slowest average response and lowest hits/second but also didn't have a single error or timeout and the response was consistent throughout the test.

A Small Orange's performance was stunningly consistent. The fastest response was 238ms and the slowest was 244ms, a difference of 6ms over nearly 24,000 requests. They were just barely behind GoDaddy in hits and average response.

Overall, other than WebSynthesis, no host seemed to have serious difficulty with this test.

 

 Test 2. 1-1000 Concurrent Users from 2 Locations over 60 seconds (Gallery)

Discussion of Blitz Test 2 Results

This test was designed to see just how much traffic these web hosts can handle. Blitz increased their pricing for multiple server locations while I was running this test. I had to reduce server locations from 8 down to 2 locations with higher user counts instead. The response times may be less meaningful, but I picked Virginia and California so that the test locations were on opposite sides of the US. I believe every server tested was in the US, so hopefully that was somewhat balanced, but the average response time may mean less than the stability of the response time.

Who performed without any major issues?

Pagely.

Who had some issues?

A Small Orange's setup definitely couldn't scale all the way up. Response times increased along with user counts, as did errors/timeouts. GoDaddy had some bizarre spikes that look similar to the one I saw in test 1, except three of them this time. Despite this, they pushed the most successful hits again and had the best ping of the hosts that didn't completely error out. Pressable had some spiky performance similar to GoDaddy's, but pushed a lot of successful requests and did recover from the spikes. SiteGround hit a major spike but then seemed to kick into high gear, performed even better, and finished out the test exceptionally strong and stable. WebSynthesis seemed to cap out at around 400 users/second with a few bursts again; the response time remained fairly steady, and it was erroring (connection reset) instead of timing out again. WPEngine's response times got worse as the load increased, and timeouts started to increase as well.

I included a screenshot from my uptime monitoring system. It's checking each host every 5 seconds, and I highlighted the hour in which all the tests took place. You can see some large spikes for companies that seemed to have latency struggles.

 

Test 2. Quick Results Table

Success Errors Timeouts Hits/second Avg Response (ms) Max Hit Rate (per second)
ASO 27057 777 518 451 739 597
GoDaddy 49711 685 1 829 148 1750
Pagely 48228 0 1 804 216 1580
Pressable 43815 503 9 730 271 1466
SiteGround 48735 12 19 812 263 1708
WebSynthesis 20855 35773 0 348 120 763
WPEngine 39784 25 1008 663 304 1149

GoDaddy seemed to have the best peak performance again. SiteGround and Pagely seemed to handle the load fantastically and didn't show any signs of performance issues (again). With the exception of A Small Orange, every host saw an improvement in average response time. As I wrote earlier, this may be because they were tested only from US locations. That caveat aside, the response times are a lot closer together and look pretty good for US based visitors. Still, this test also started to raise questions about many web hosts' ability to handle a heavy traffic load.

WebSynthesis Response to ECONNRESET Errors

WebSynthesis ran into the same issue in both tests, a strange ECONNRESET error. Suspecting something might be blocking the test requests as a security measure, I asked them to investigate. They made a change to their nginx config after the initial set of testing and wrote back "we made adjustments to handle the types of URLs you were hitting us with.  We did review our logs and do not see these in production thus will not put these kinds of changes in production as we feel they are unrealistic." Here are the results:

WebSynthesis2-blitz WebSynthesis2 (Download Full Report WebSynthesis2.pdf)

The new WebSynthesis results were pretty impressive: an average ping of 123ms (3ms slower than the initial test), 871 hits/second on average, a peak of 1704 hits/second, and only 94 errors (ECONNRESET again). The original tests did not suggest that either the hardware or software was starting to buckle, but the configuration change does indicate that they were probably blocking some of the requests. Load testing tools can't fully emulate users (they generally come from only a couple of machines), and it's conceivable that some security measures are triggered by their unusual behavior. Since I am testing these companies out of the box, I am keeping this result, where support got involved and changed configuration settings, separate.

Uptime

What is often more important than peak performance is how well a service does on average. To test this, I used two services: UptimeRobot and a NodeJS project called Uptime.

UptimeRobot Results

Monitored HTTP and ping every 5 minutes over a 10-day span.

HTTP Ping
ASO 1 1
GoDaddy 0.9979 -
Pagely 0.9862 -
Pressable 0.9995 1
SiteGround 0.9993 1
WebSynthesis 1 1
WPEngine 1 1

A Small Orange, WebSynthesis and WPEngine showed no downtime. Every server responded to pings 100% of the time, with the exception of GoDaddy and Pagely, which seemed to be blocking pings (at least from UptimeRobot).

Pagely's downtime was mostly my own doing (3 hours), from when I was editing a template to use some of these testing services. Only 5 minutes of the downtime was unrelated to that incident.

GoDaddy had 28 minutes of downtime. SiteGround had 9 minutes. Pressable had 5 minutes.

When you account for my screwup, only GoDaddy falls below the 99.9% uptime threshold.

Uptime (nodejs) Results

Uptime was configured to perform an HTTP check every 5 seconds on each host with a 1500ms slow threshold. This was executed from a Digital Ocean VPS in NYC.

Responsiveness is defined as the percentage of checks that responded within the slow threshold over the period. Availability is the uptime percentage.
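A sketch of these two metrics (my own illustration, assuming responsiveness counts checks that came back under the slow threshold, which is what the near-100% figures in the table imply):

```python
def summarize(checks, slow_ms=1500):
    """Summarize uptime checks. `checks` is a list of
    (is_up, response_ms) tuples from periodic HTTP probes."""
    availability = 100.0 * sum(1 for up, _ in checks if up) / len(checks)
    fast = sum(1 for up, ms in checks if up and ms <= slow_ms)
    responsiveness = 100.0 * fast / len(checks)
    return availability, responsiveness

# 3 fast checks, 1 slow, 1 down -> 80% available, 60% responsive
print(summarize([(True, 200), (True, 310), (True, 250),
                 (True, 2100), (False, 0)]))
```

A host can thus be fully available but poorly responsive if many checks come back slow, which is exactly the pattern in Pressable's row below.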

Availability (%) Downtime (m) Response Time (ms) Responsiveness (%)
ASO 99.998 1 204 99.97
GoDaddy 99.963 17 309 99.679
Pagely 99.998 1 237 99.974
Pressable 99.914 39 727 90.87
SiteGround 99.997 1 206 99.616
WebSynthesis 99.994 3 97 99.727
WPEngine 99.965 16 209 99.819

Nobody had a perfect record, although four companies (A Small Orange, Pagely, SiteGround and WebSynthesis) were above the 99.99% uptime mark. The rest were still above 99.9%. The most worrisome result was Pressable's: they had the most downtime and a very high average response time, though the latter might be caused by the monitoring server being far from their server. Below is a detailed graph of the response times:

pressable_response_time

The lowest ping I saw was around 172ms, and the relatively consistent bottom line of pings at around 300ms is reasonable. However, inconsistent performance with high spikes produces a very high average. Every other company had a fairly smooth graph in comparison, showing an occasional spike and/or some small variance (<100ms) at the baseline, but nobody came close to a graph like Pressable's. The next most interesting is A Small Orange's graph:

aso_response_time

Though within reasonable response times, it has a spike and a weird pattern bouncing between around 170ms and 270ms.

Giving Pressable the benefit of the doubt, I signed up for Pingdom and monitored what their service saw. This was done with 1-minute resolution.

pressable_pingdom_uptime

pressable_pingdom

 

The pings varied pretty wildly, the highest being 2680ms and the lowest 2150ms, a 530ms difference. And that was based on hourly averages; the variance within each hour may have been much greater. It seems to corroborate the results from the Uptime script I was running: performance fluctuates a lot.

 

WebPageTest.org

Every test was run with the same settings: Chrome browser, 9 runs, native connection (no traffic shaping), first view only. This was tested against the default install from every company. I also tested the multiple levels of SiteGround's SuperCacher technology from one location to see how much each improved performance; SuperCacher was left on for all the other tests performed. You will also notice both the original A Small Orange stack and their WordPress-optimized LEMP stack. Digital Ocean hadn't completely failed out at this point either.

| Company | Dulles, VA (s) | Miami, FL (s) | Denver, CO (s) | Los Angeles, CA (s) | Average Load (s) |
|---|---|---|---|---|---|
| A Small Orange | 1.894 | 2.035 | 2.381 | 1.648 | 1.9895 |
| ASO LEMP | 0.85 | 0.961 | 1.056 | 0.665 | 0.883 |
| Digital Ocean | 1.245 | 0.95 | 1.419 | 0.924 | 1.1345 |
| GoDaddy | 0.94 | 1.208 | 1.229 | 0.671 | 1.012 |
| Pressable | 0.642 | 1.174 | 1.721 | 0.981 | 1.1295 |
| SiteGround | 1.073 | 1.327 | 1.682 | 1.353 | 1.35875 |
| WPEngine | 0.812 | 1.235 | 1.06 | 1.08 | 1.04675 |
| Pagely | 0.924 | 1.083 | 1.46 | 0.748 | 1.05375 |
| WebSynthesis | 0.616 | 1.021 | 1.516 | 1.116 | 1.06725 |

SiteGround's caching variants, tested from a single location only:

- SiteGround (Varnish Dynamic Cache): 0.732
- SiteGround (Varnish Dynamic Cache, Memcached): 0.725
- SiteGround (PageSpeed, Memcached): 1.216
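The Average Load column is simply the mean of the four test locations. A quick sketch reproducing GoDaddy's row (the function is mine; values are copied from the table above):

```python
def average_load(times):
    """Mean WebPageTest load time across locations, in seconds."""
    return sum(times) / len(times)

# GoDaddy's row: Dulles, Miami, Denver, Los Angeles
godaddy = [0.94, 1.208, 1.229, 0.671]
print(round(average_load(godaddy), 3))  # 1.012
```

The same calculation reproduces every other row with a complete set of locations.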

You can see a huge performance difference between A Small Orange's default cPanel install and their optimized LEMP stack. Load times were reduced by more than half from every location. That should convince you that optimizing WordPress can dramatically improve performance. To a lesser degree, you can see the same happen when SiteGround's various SuperCacher options are turned on.

A Small Orange's LEMP stack leads the pack here. However, it's amazing how close the performance of most of these companies was on this test.

 

Conclusion

Every service seems to have its issues somewhere. I try to avoid injecting my personal opinion and bias as much as possible, so I won't rank the companies or outright say any single one is the best. Some providers did exceptionally well and tended to clump together performance-wise; I will call those the top tier providers. This top tier designation is based only on performance, and only on the results of these tests. What each of these companies offers is different and may best suit different audiences depending on a variety of factors beyond performance, such as features, price, support, and scale (I only tested entry level plans). But I will provide a short summary and discussion of the results for each provider.

A Small Orange

Once I moved away from the stock WordPress install on a normal VPS to their specialized LEMP WordPress VPS, it was a much better experience. Their uptime was near perfect on both services (1 minute of downtime total measured between them). In the first load test they performed incredibly well, coming in 2nd by only a few requests per second. ASO did buckle under the heavier load test, but it didn't fail out and managed to respond to most requests (including uptime monitoring) during the whole event. While their performance didn't scale as well as most of the competitors', the support I received was plentiful and quite responsive, in line with what I would expect from a company that has one of the highest support ratings.

Digital Ocean

They are not in the same business as the rest of these companies. I added them because I wanted to see how well a stock install of WordPress would compete with pretty good hardware that's low cost (SSD backed VPS). The results here aren't a knock on their service at all. As I said earlier, this blog is running on a Digital Ocean VPS. The difference is I have spent many hours configuring it myself to be somewhat high performance. Digital Ocean is designed for people who can administer their own servers. If you need managed WordPress hosting, stick to companies that manage WordPress for you. If you're comfortable doing it yourself, Digital Ocean is one of the highest rated companies we track.

GoDaddy

This whole test started from a statement made by Jeff King, a senior vice president at GoDaddy and GM of their hosting division. He wrote to me, "The new products are top of the market (really, you can’t get faster WordPress anywhere now) and we’re just beginning."  Challenge accepted.

GoDaddy surprised me, and in a good way. They have a pretty bad reputation in the web community, and it shows on this site, where their overall score is below 50%. Yet their WordPress hosting kept up with or led the pack in some of the performance tests. In both Blitz.io load tests, out of the box, GoDaddy had the highest number of successful requests and the highest number of concurrent users, and finished either 1st or 2nd in average response time (WebSynthesis did beat them after their support investigated the connection resets). There were some weird performance bumps during the load tests, but nothing major. The biggest blot on their performance was uptime: they had the most downtime (28 minutes) of any of the companies tracked in UptimeRobot's monitoring (which ran longer than my second Uptime monitoring setup). But that was still 99.8% uptime, not a huge knock.

Overall, I would say GoDaddy delivered on their claim, performance wise. They appear to be in the top tier of specialized WordPress hosting companies. Given their price, I think they have the potential to push down pricing on most of their competitors who charge 3-4 times what GoDaddy charges. If we take a more holistic view, beyond performance, they still don't have all the tools to cater to the different niches that the specialized companies are competing for (although there were some hints dropped that things like Git, Staging Environments and more were coming soon). And then there is a branding problem they are trying to overcome. But GoDaddy is definitely doing some things very right and should make the managed WordPress hosting space very interesting.

Pagely

Pagely's performance never seemed to be affected by any of the tests. They had a mere 5 minutes of downtime. The load testing services never seemed to cause any stress on their system; it was an impressively consistent performance. They didn't have the highest peak performance on the load tests, but they had a flat response time and only a single error or timeout in each Blitz load test. One thing that irritated me about their offering was charging extra for SFTP access. Every other company included it for free, and it's generally a given with a web hosting service. Still, a very impressive performance by Pagely; they are definitely in the top tier.

Pressable

Pressable had some issues during this test. I am not sure why, but performance seemed to spike repeatedly throughout my entire testing session. When it was good, it was performing at a level consistent with the top tier providers. The problem was, it wasn't always good. On the large Blitz load test there was consistent performance except for two spikes, which put it behind the front of the pack. The spikes caused low responsiveness scores and may have skewed some downtime calculations as well. The foundation of a top tier provider is there, and generously open sourced on GitHub. They just need to sort out this weird performance spikiness.

SiteGround

SiteGround was another very pleasant surprise. Not only are you getting cPanel hosting, you're getting top tier WordPress performance once you fully enable their SuperCacher plugin. They are one of the most well liked companies we track and have some of the best rated support. I honestly didn't know they were offering such high performance WordPress hosting. They didn't have the absolute fastest responses or push the highest concurrent users, but they kept pace. They had one of the stranger graphs on the heavy load test: for some reason, performance got even better after a big spike. They had excellent uptime, above 99.9% as measured by both services. Like GoDaddy, SiteGround looks like they could make this space interesting, with a $7.95 plan performing on par with plans 3-4x its cost. While I didn't get to try some of the more developer-centric features like a staging environment and Git, they are available on a $14.95 plan that's as little as half the cost of the competitors'. Definitely in the top tier of managed WordPress providers.

WebSynthesis

These guys are harder to evaluate. Their uptime was excellent: either perfect or upwards of 99.9% as measured by the two services. The load testing ran into a weird ECONNRESET error. Their support was very helpful and made some configuration changes that seemed to let the load testing service through. Once they did that, they outperformed every provider on almost every metric: highest average hits/second, fastest responses, and most successful hits, with relatively flat response times. As I wrote in my discussion about them, load testing tools aren't a perfect emulation of real users, but it looked like the tests were running into a security rule rather than actual strain on the service. If that assumption is correct, these guys are truly a top tier provider.

WPEngine

WPEngine had some issues. Uptime was not one of them: they were perfect or upwards of 99.9% in that department. However, their performance shortcomings became apparent during the load tests. They had the most errors and timeouts besides WebSynthesis in the first test, and seemed to buckle under the load in the second test, with rising errors and timeouts and slower response times. When WPEngine was first listed here on Review Signal, they had the highest rating of any company. They've fallen a bit since then, but WPEngine still remains near the front of the pack. They have a strong brand and seem to be doing some things right. They have some features that few other providers have, but this test was mostly about performance, and in that department they didn't quite match the level some of their competitors reached.

 

 

 Product Comparison Chart with Coupon Codes

 

 

Notes:

*Unnamed Load Testing Service

| Company | AVG Response | Failures | AVG Response (Heavy) |
|---|---|---|---|
| ASO | 2031 | No | |
| GoDaddy | 2120 | No | 5904 |
| Pagely | 2398 | No | |
| Pressable | 1360 | No | 15570 |
| SiteGround | 22659 | Yes | 25712 |
| WebSynthesis | 1929 | No | 3740 |
| WPEngine | 1835 | No | |

I didn't get to conduct a full test with this service because I may have caused the entire service to crash during testing. The table shows two tests: average response time, and whether failures of any type occurred. The second test is what caused the service to crash and is incomplete. The first test was 500 users/second from 1 machine; the second was 8000 users/second from 40 machines. The response times were pretty slow all around, and SiteGround seemed to have some major issues with this test. I am unsure as to why; when I re-ran the first test later, SiteGround handled it without any failures (errors). The testing system is in beta, and it's really hard to know what happened. SiteGround handled Blitz's heavier test without issue, and the second run here went fine, so it's hard to know whether the issue was on SiteGround's end or the testing service's. The heavy test was interesting: WebSynthesis ended up being the fastest, a similar result to the Blitz.io test once they fixed their nginx config. Perhaps this load test wasn't triggering any of their security measures? I could not complete the testing because the system went down prematurely.

I am not sure if there are useful inferences to be drawn from these tests. I was asked not to name the service because of the issues encountered but I wanted to include the partial results here in case someone did find some value in looking at the numbers.

I actually tried a third load testing service that was also in beta and it never was able to fully run the tests either. I am starting to feel like load testing kryptonite.

Thank You

First off, I want to thank the companies that agreed to participate voluntarily. I had nothing but pleasant experiences dealing with the people at each company. A few even took it a step beyond and offered a lot of help and insight about how this test might be conducted. There was a surprising amount of consistency of views about what and how to measure performance offered. A few of the individuals who stood out the most:

David Koopman at GoDaddy for his insights on performance and testing.

Vid Luther at Pressable was incredibly helpful and knowledgeable about performance. He's even written a great article here about performance. He also helped get at least one other company on board for testing and for that, I am thankful as well.

Tina Kesova at SiteGround has always been helpful, and this test was no exception. She had SiteGround on board almost instantly when I mentioned the seed of the idea back in November 2013.

A few friends of mine also helped in figuring out how to perform these tests and dealing with some of the technical challenges in benchmarking. Dave Lo, Eric Silverberg and Samuel Reed all offered their advice and helped me make the design of the tests as fair as possible.

A special thanks goes to people who read drafts of this article and provided feedback including Andrey Tarantsov, JR Harrel and my dad.

Anyone else I missed, I am sorry, and thank you too.

 

GoDaddy Media Temple

Changing the Story, MediaTemple’s New Spin on GoDaddy’s SOPA Story

I run Review Signal to be an unbiased source of information about web hosting companies. I try to avoid injecting my personal opinions into the discussion about a web host's quality and service. I try to explain what I see in the data collected from hundreds of thousands of people.

However, there are rare moments when you just need to call a company out because the issue is so important.

Today, during a live Google Hangout Q&A, Russ Reeder, the President and COO of Media Temple said,

"The employees at GoDaddy never supported it. It was one person, who had a voice, and they are no longer at GoDaddy. It was an employee, it wasn't the core management."

I understand GoDaddy just purchased your company. I've watched your Twitter account explode trying to prevent customers from leaving because of the acquisition. However, this just doesn't square with the story told during the actual SOPA incident.

Here's then GoDaddy CEO Warren Adelman's statement,

Go Daddy opposes SOPA because the legislation has not fulfilled its basic requirement to build a consensus among stake-holders in the technology and Internet communities. Our company regrets the loss of any of our customers, who remain our highest priority, and we hope to repair those relationships and win back their business over time.

And the full press release on the turn around, [emphasis added]

SCOTTSDALE, Ariz. (Dec. 23, 2011) - Go Daddy is no longer supporting SOPA, the "Stop Online Piracy Act" currently working its way through U.S. Congress.

"Fighting online piracy is of the utmost importance, which is why Go Daddy has been working to help craft revisions to this legislation - but we can clearly do better," Warren Adelman, Go Daddy's newly appointed CEO, said. "It's very important that all Internet stakeholders work together on this. Getting it right is worth the wait. Go Daddy will support it when and if the Internet community supports it."

Go Daddy and its General Counsel, Christine Jones, have worked with federal lawmakers for months to help craft revisions to legislation first introduced some three years ago. Jones has fought to express the concerns of the entire Internet community and to improve the bill by proposing changes to key defined terms, limitations on DNS filtering to ensure the integrity of the Internet, more significant consequences for frivolous claims, and specific provisions to protect free speech.

"As a company that is all about innovation, with our own technology and in support of our customers, Go Daddy is rooted in the idea of First Amendment Rights and believes 100 percent that the Internet is a key engine for our new economy," said Adelman.

In changing its position, Go Daddy remains steadfast in its promise to support security and stability of the Internet. In an effort to eliminate any confusion about its reversal on SOPA though, Jones has removed blog postings that had outlined areas of the bill Go Daddy did support.

"Go Daddy has always fought to preserve the intellectual property rights of third parties, and will continue to do so in the future," Jones said.

The statement from the CEO is slightly ambiguous, but the press office statement makes it pretty clear that the company was working on the SOPA legislation. GoDaddy and their General Counsel worked on it for months; there is no mention of one employee doing this without approval. The original statements don't make SOPA sound like some fringe issue. They frame GoDaddy as actively engaged in crafting the legislation, having since changed its judgment about the content and outcome of the bill.

I tried to get an answer from Russ Reeder and Demian Sellfors, but they ended the Q&A session right before my question.
media temple live stream question

The rogue agent story being passed off today just doesn't hold up. For an issue like SOPA, which is near and dear to so many internet users, this type of spin needs explanation and clarification.

If you want to sell out your company and earn millions of dollars, that's your prerogative (and congratulations on having enough money to do whatever you want for the rest of your life). All I ask is you be honest with your consumers.