Tag Archives: wordpress

WordPress.org Removes BlueHost + SiteGround and Relists Within Days

This story gets stranger.

You've probably read my previous coverage: WordPress.org Updates Hosting Recommendations, Nobody Knows Selection Criteria and WordPress.org Hosting Recommendations Listing Criteria.

BlueHost and SiteGround were removed May 27 or 28.


There was a rumor that it was about modifying config files, but that was denied by Hristo Pandjarov (SiteGround employee), who wrote, "we don't do nothing of the sort, the wp-config file as well as the entire installation is default including the themes. We have a plugin if you want to use the SuperCacher. We're still checking out why we've been removed from that page but that's definitelly[sic] not the case."

BlueHost and SiteGround reappeared today (June 1, 2016).

I reached out to Matt Mullenweg for a comment and was told:

"Temporary issues may cause recommended hosts to change at any time, but given the long track record of both I expect they'll be back. We might also try out different presentations and layouts there in the future."

So essentially, there's no new information, and even the web hosts themselves seem to be in the dark about what's going on.

People celebrated the removal of BlueHost and were shocked by the removal of SiteGround. But these erratic listing changes are exactly why transparency needs to be applied to the page. It's worth millions of dollars in business and is influential in one of the largest internet communities. But nobody knows or understands what's really going on.

WordPress.org Hosting Recommendations Listing Criteria

UPDATE (5/13/2016 7:05 PM): Official comment from Matt Mullenweg was posted. Quoted below, click or scroll to the comment section to see the original.

“I would like to see some transparency in the process”

As stated on the page, the listing is completely arbitrary. The process was: There was a survey, four applicants were chosen, and the page was updated. That might repeat later in the year, or the process might change.

“how criteria are weighted”

There is no criteria or weighting. It ultimately is one person’s opinion. Information that is important is reflected in the questions asked in the survey, but that is not everything that is taken into account. (I have looked at this site in the past, for example.)

“who is making the decisions”

I am. James helped in sorting through the many submissions that came in, doing additional research, and digging on finalists, but ultimately the decision was mine. You can and should blame me fully for any issues you have with it. I appreciate James’ help in this go-round, but he will not be involved at all with any future updates. (So, please leave him alone.)

“how much money is involved”

There was no money involved. Obviously being listed on the page is hugely valuable and impacts the listed (or unlisted) businesses a great deal. This is why I take full responsibility for the listing, now and in the future — I have been fortunate to be extraordinarily successful and no financial or business consideration any of the applicants could offer matters to me. A host could offer $100,000,000 to be listed on the page for 1 day, and I would say no.

-Matt Mullenweg

Yesterday, I posted WordPress.org Updates Hosting Recommendations, Nobody Knows Selection Criteria, which naturally meant I was going to find out as much as I could about the process, because it's a big deal and my mission here at Review Signal is honest and transparent web hosting reviews.

I confirmed with multiple sources that the newly listed companies didn't pay any money to get listed. Everyone seems to have filled out the form and then heard nothing back until the updated page was published yesterday. Both the winners (BlueHost [Reviews], DreamHost [Reviews], FlyWheel [Reviews], SiteGround [Reviews]) and losers (everyone else) seemed to agree on this process based on everyone I talked to.

Great. The application process seems fair.

But the selection process is still a black box. With help from people who follow WordPress more closely than I do, I found James Huff (macmanx), a 12 year volunteer and 5 year employee at Automattic who was directly involved with the new WP.org hosting recommendations.


I didn't hide who I was or my interest. The most concerning part of this exchange was that 'Absolutely no money changed hands, unless you consider sponsorship of WordCamps as monetary with regards to the "contributions to WordPress.org."'

No money changed hands except a lot of sponsorship dollars to the organization. Guess who the top global gold community sponsors are? BlueHost (and JetPack/WooCommerce, both owned by Automattic). Somehow BlueHost is also a Silver sponsor, along with GoDaddy. BlueHost is pouring a lot of money into WordCamps and the WordPress Foundation.

I'm sorry, but I do consider that money changing hands. They are giving a large sum of money; it's material enough to be mentioned in their SEC filings.


We're still going to have to agree to disagree about what money changing hands means. He says it was fair, but fair is pretty meaningless when we don't have any insight into what standard of fairness is the goal. How is each criterion being weighed and evaluated? Yet this is the list of hosts that they can confidently tell everyone are good.

I'm not sold.


Historical perception seems to be the proxy for what marketers might call Net Promoter Score (NPS): how much do consumers like/recommend something? That's essentially what I measure here at Review Signal, and my data has been incredibly close to what companies' internal data shows (LiquidWeb NPS Scores vs LiquidWeb Review Signal Rating).

It is arguably the most important factor in recommendations, and for service businesses it's about the best all-encompassing metric of quality available.

But it's only part of the criteria, and that's fair. Still, should there be some minimum threshold? Can a company that scores zero on quality but high on everything else be worthy of a listing? BlueHost's rating is 41%. That means roughly 6 in 10 people don't recommend it or have anything good to say about the company.

There are WordCamp sponsors that didn't make the cut. Of the three hosting companies among the global community sponsors, two made it (BlueHost and DreamHost) and one didn't (GoDaddy). But the largest sponsor made it and sits at the top, and it's still BlueHost.

But moving on: James mentioned Automattic has no role in the process, but he does wear multiple hats, which means he is aware of the potential perception of a conflict of interest.


Finally, a mention of Matt. Important again when thinking about the context for potential conflicts of interest. I outline what would happen in a dream world and what's realistic. I think honest disclosure and basic transparency is perfectly realistic. It's ok to make money, just be clear about where it's coming from. A standard I try to uphold here at Review Signal, see how we make money and read the entire process for how our rankings are calculated. See? It's not hard and I still make money giving the best information available.


AWP comments

That is the comment thread I referenced. Not a single person said anything positive about BlueHost, and the assumption is that they simply paid for placement. BlueHost being listed ruins the credibility of the recommendations when there is no transparency about the criteria being used.


Moving on, the survey itself has issues which I brought up before. It's asking for sensitive company information and being handled by employees of a company that owns two competitors in the space (WP.com VIP, Pressable), took $15 million in investment from another (BlueHost), and is an investor in a fourth competitor (WP Engine).

That seems like a huge potential conflict of interest and I know it dissuaded at least one company from even applying.


james huff 3 tweets

It didn't end on the nicest note; I don't think James took my criticisms well. From his original messages, I think he knows and understands the perception of conflicts of interest, but admitting them in this context puts him in a very awkward position that I don't envy. He wears multiple hats and surely wants to wear them all fairly. I would say admitting that those multiple hats have the potential for conflicts of interest isn't a weakness of character, it's an admission of humanity. I'm sure James is a great guy and has done a lot of good things for the community. But people who can be perceived as having a strong potential conflict of interest, which anyone connected to Automattic in this situation would have, shouldn't be managing this particular process.

I truly don't have any ill will towards James personally or Automattic. Even with BlueHost/EIG, I've been more than willing to give them the benefit of the doubt and continue to hope that they will be better (ASO did break my heart a bit; I thought they were turning EIG around). My data continues to show them being mediocre, with a seeming touch of death in terms of quality (their strategy does seem to be cost cutting and economies of scale). But I don't fault them for their behavior; I expect it. It's well documented in their SEC filings.


I still think WordPress.org can do better with its hosting recommendations, and I'm not going to stop advocating until they are better. I would like to see some transparency in the process: how criteria are weighted, who is making the decisions and how much money is involved. I think the companies that applied would appreciate feedback about why they weren't selected, and what makes them different from and fall short of the companies that did make the cut. Or just call them Ads / Sponsors. Don't say they are the best and brightest and endorse them. Say: we took money, and this company paid us the most. At least that meets the minimum threshold of honesty and transparency.



For posterity, the logs in their entirety are available below. It's long, so I tried to cut down some stuff to get to the most important bits. But I don't want to hide anything.

Direct Message Archive: macmanx, Making WordPress Slack (parts 1 and 2)

WordPress.org Updates Hosting Recommendations, Nobody Knows Selection Criteria

I've railed about Drupal and WordPress Have Sold Us Out in terms of hosting recommendations before. We've been waiting a long time (around a year now?) for WordPress.org to do its revamp of its hosting recommendation page.

The Winners

BlueHost, DreamHost, FlyWheel, SiteGround

I'm not shocked at all to see BlueHost somehow still manages to be at the very top (although the list is alphabetical). They've continuously survived being listed; I guess that's what a million dollars will do.

Where is the transparency?

They requested hosts submit a ridiculous amount of personal information. You can see the full survey below:

2016 WordPress Hosting Survey - WordPress.org

It asks some deeply private questions, like the number of employees, how many 30 day active paying customers you have, and how many net paying customers you are gaining or losing each month.

Mind you, as far as anyone can tell, Matt has complete control over who shows up, and Automattic bought a majority stake in a company competing in the WordPress hosting space, Pressable. They also run WordPress.com VIP. They are also an investor in WPEngine. So some of the most secretive numbers a company competing in this space might have are potentially being disclosed to several of its biggest competitors, through a process with no transparency and no person named as responsible for it.

That alone is worrisome for the process; it should definitely be run independently of Matt.

Everything else needs to be explained too. Who is responsible for this revamp? What were the selection criteria? How often will it be updated? Will existing companies be continuously re-evaluated?


It's not clear who 'we' is. They say the listing is arbitrary but then add criteria. I'm not sure they understand what arbitrary means. Or maybe they simply ignore the criteria they mention. Maybe it's just a terrible joke? Just like the process (or lack thereof) that seems to be in place.

A lot of it is pretty subjective: design, tone, ease of WP auto-install, historical perception? BlueHost, which has consistently been pretty poorly reviewed (along with just about all EIG brands) and continues a downward trend, is still listed.


Furthermore, it's the same criteria that's been written since at least 2010.

So maybe saying it's arbitrary gives them an escape to list whomever they want, especially considering the financial considerations involved.

Newly Listed Companies

I tried to find some explanation for how the three new companies were selected, but there really isn't much to go on. DreamHost is a Silver Community Sponsor for WordCamp, but so is GoDaddy who did not make the cut.

FlyWheel only does WordPress, but DreamHost and SiteGround do a lot more.

DreamHost has a ton of forum threads on WordPress.org, SiteGround has only a few over 10 years. FlyWheel has one total.

I talked to someone at one of the newly listed hosting companies and they confirmed that the form was filled out and that was it. Also, there was no financial consideration involved with the listing.

Which is very nice to hear, but doesn't really inspire confidence in the recommendations.

I've aired my concern with BlueHost multiple times.

But what about the new companies and their ratings?

DreamHost has a 59% rating on Review Signal, which is ok given that the upper end of the shared hosting spectrum is SiteGround at 71%. FlyWheel, the specialized hosting company, has the highest rating of any company at a whopping 85%.

So the new companies are all far better than BlueHost (41%). But there are other very highly rated companies that didn't make the cut. For example, WP Engine (72%) is probably the biggest  name not listed based on size, brand in the WP community and rating at Review Signal.


I'm glad there are some much better companies than BlueHost listed, and that at least one of them got there without paying for the privilege. There is still language about some hosts donating a portion of the fee back, which makes you think it's still at least BlueHost.

I'm still unhappy with the lack of transparency of the entire process. The most influential place for people entering the WordPress community is recommending one very mediocre hosting company who has historically paid large sums to be listed and has a deep financial relationship with the person ultimately responsible for the recommendations. The revamp didn't change that.

I am disappointed and I don't expect to hear anything from WordPress.org/Matt clarifying the hosting page, again.




There was a little discussion in the WordPress Slack. macmanx is James Huff, an Automattic employee. It seems they wanted only one managed WordPress host, and around 100 applications came in. Even in the WordPress Slack, the first comment doubts that these are really the best (well, one of them, which almost everyone assumes to be BlueHost).

James Huff Slack screenshots

WPPerformanceTester – A WordPress Plugin to Benchmark Server Performance

Everyone who read our most popular blog post, WordPress Hosting Performance Benchmarks, may have noticed a new test this year (2015) called WPPerformanceTester. It was something I built during the tests to add a new benchmark measuring the underlying performance of the servers the test websites were hosted on. It wasn't hugely meaningful because I had no real basis for comparison except the benchmarks I had just generated. So it played no role in the actual rankings and outcomes of the testing.

But its vision and value have slowly become more apparent. In my testing, Pagely had an unusually slow WordPress benchmark (testing WordPress database functions). It was acknowledged by their team, and they have since announced a migration to a newer Amazon technology called Aurora, which gave Pagely a 3-4x performance increase.

So without further ado, I'd like to announce WPPerformanceTester is now live on GitHub and licensed under the GPLv3. All problems, errors and issues should be submitted on GitHub.

What Tests Does WPPerformanceTester Run?

  • Math - 100,000 math function tests
  • String Manipulation - 100,000 string manipulation tests
  • Loops - 1,000,000 loop iterations
  • Conditionals - 1,000,000 conditional logic checks
  • MySql (connect, select, version, encode) - basic mysql functions and 1,000,000 ENCODE() iterations
  • $wpdb - 250 insert, select, update and delete operations through $wpdb
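
To make the list above concrete, here is a minimal sketch (in plain PHP) of what the CPU-bound portion of such a benchmark looks like: time a tight loop of standard library calls with microtime(). The specific functions and structure inside the actual plugin may differ; this is an illustration of the technique, not the plugin's source.

```php
<?php
// Minimal sketch of a WPPerformanceTester-style micro-benchmark:
// time a fixed number of iterations of standard PHP functions.
function bench_math( $count = 100000 ) {
    $start = microtime( true );
    for ( $i = 0; $i < $count; $i++ ) {
        // A handful of common math functions per iteration.
        $x = sin( $i ) + cos( $i );
        $x = sqrt( abs( $x ) ) + pow( $i % 10, 2 ) + log( $i + 1 );
    }
    return microtime( true ) - $start; // elapsed seconds
}

function bench_strings( $count = 100000 ) {
    $start = microtime( true );
    $s = 'the quick brown fox jumps over the lazy dog';
    for ( $i = 0; $i < $count; $i++ ) {
        // Common string operations: case changes, reversal, hashing.
        $t = strtoupper( $s );
        $t = strrev( $t );
        $t = md5( $t );
    }
    return microtime( true ) - $start;
}

printf( "math: %.3fs, strings: %.3fs\n", bench_math(), bench_strings() );
```

The MySql and $wpdb tests follow the same pattern, except the work inside the loop is a database query rather than a function call.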

Industry Benchmarks

WPPerformanceTester also allows you to see how your server's performance stacks up against our industry benchmark. Our industry benchmark is the average of all submitted test results. After you run WPPerformanceTester, you will have the option to submit the benchmark with or without writing a review of your web host. Please consider submitting without a review so that our benchmark improves. If you feel inclined to write a review, please feel free. They will be published in an upcoming project that ties together many of the projects I've been working on here at Review Signal.

Please Note

WPPerformanceTester is a single node testing tool (if you're running a distributed/clustered system, it will not give a complete picture, only of the servers that execution touches).

Furthermore, WPPerformanceTester is not the be-all and end-all of performance testing or web host quality. Our WordPress Hosting Performance Benchmarks performs a variety of tests, and even that only gives insight into performance. It doesn't look at customer service quality, pricing, and other important dimensions of a good web hosting service.

WPPerformanceTester should be used as one tool in a performance toolbox. I hope it's valuable and helpful, but please keep in mind the larger picture as well. If you care about service quality, we also maintain the largest web hosting review database. My goal is to cover every aspect, and WPPerformanceTester marks a small step in that direction of being able to give consumers a complete picture of web hosting quality in the WordPress space.

WordPress Hosting Performance Benchmarks (2015)


Sponsored by LoadStorm. The easy and cost effective load testing tool for web and mobile applications.

This is the third round of managed WordPress web hosting performance testing. You can see the original here, and the November 2014 version here.

New (9/14/2016) The 2016 WordPress Hosting Performance Benchmarks are live.

New (8/20/2015) This post is also available as an Infographic.

Companies Tested

A Small Orange [Reviews]
BlueHost [Reviews]
CloudWays [Reviews]
DreamHost [Reviews]
FlyWheel [Reviews]
GoDaddy [Reviews]
Kinsta [Reviews]
LightningBase [Reviews]
MediaTemple [Reviews]
Nexcess [Reviews]
Pagely [Reviews]
Pantheon [Reviews]
Pressidium [Reviews]
PressLabs [Reviews]
SiteGround† [Reviews]
WebHostingBuzz [Reviews]
WPEngine* [Reviews]
WPOven.com [Reviews]
WPPronto [Reviews]

Note:  Pressable and WebSynthesis [Reviews] were not interested in being tested this round and were excluded. WordPress.com dropped out due to technical difficulties in testing their platform (a large multi-site install).

Every company donated an account to test on. All were the WordPress specific plans (e.g. GoDaddy's WordPress option). I checked to make sure I was on what appeared to be a normal server. The exception is WPEngine*. They wrote that I was "moved over to isolated hardware (so your tests don’t cause any issues for other customers) that is in-line with what other $29/month folks use." From my understanding, all testing was done in a shared plan environment with no actual users sharing the server. This is almost certainly the best case scenario performance-wise, and I suspect the results look better than what most users would actually get.

†Tests were performed with SiteGround's proprietary SuperCacher module turned on fully with memcached.

The Products (Click for Full-Size Image)



The question I tried to answer is how well do these WordPress hosting services perform? I tested each company on two distinct measures of performance: peak performance and consistency. I've also included a new and experimental compute and database benchmark. Since it is brand new, it has no bearing on the results but is included for posterity and in the hope that it will lead to another meaningful benchmark in the future.

All tests were performed on an identical WordPress dummy website with the same plugins except in cases where hosts added extra plugins. Each site was monitored for over a month for consistency.

1. LoadStorm

LoadStorm was kind enough to give me unlimited resources to perform load testing on their platform, and multiple staff members were involved in designing and testing these WordPress hosts. I created identical scripts for each host to load a site, log in to the site and browse the site. I tested every company up to 2000 concurrent users. Logged in users were designed to break some of the caching and better simulate real user load.
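
The reason logged-in users break caching is standard WordPress behavior: full-page caches generally skip any request carrying a wordpress_logged_in_* cookie, so those requests fall through to PHP and MySQL. A rough sketch of what one scripted user session amounts to, using PHP and cURL (the site URL and credentials here are hypothetical placeholders):

```php
<?php
// Sketch of a scripted "logged-in user" session: authenticate against
// wp-login.php, keep the session cookies, then browse pages. Requests
// carrying a wordpress_logged_in_* cookie typically bypass full-page caches.
$base = 'http://example-test-site.com';           // hypothetical test site
$jar  = tempnam( sys_get_temp_dir(), 'cookies' ); // cookie jar for the session

$ch = curl_init( $base . '/wp-login.php' );
curl_setopt_array( $ch, array(
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query( array(
        'log' => 'testuser',     // hypothetical credentials
        'pwd' => 'testpassword',
    ) ),
    CURLOPT_COOKIEJAR      => $jar,
    CURLOPT_COOKIEFILE     => $jar,
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_RETURNTRANSFER => true,
) );
curl_exec( $ch );

// Browse a few pages as the logged-in user; these hit PHP/MySQL, not the cache.
curl_setopt( $ch, CURLOPT_HTTPGET, true );
foreach ( array( '/', '/?p=1', '/sample-page/' ) as $path ) {
    curl_setopt( $ch, CURLOPT_URL, $base . $path );
    curl_exec( $ch );
}
curl_close( $ch );
```

LoadStorm runs thousands of sessions like this concurrently; the sketch just shows the shape of a single one.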

2. Blitz.io

I used Blitz again to compare against previous results. Since the 1000 user test wasn't meaningful anymore, I did a single test for 60 seconds, scaling from 1-2000 users.

3. Uptime (UptimeRobot and StatusCake)

Consistency matters. I wanted to see how well these companies performed over a longer period of time. I used two separate uptime monitoring services over the course of a month to test consistency.

4. WebPageTest.org

"WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster." WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. I tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

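WebPageTest also exposes an HTTP API for starting runs like these programmatically (the runtest.php endpoint). A minimal sketch, assuming you have an API key; the parameter names below follow the public API as I understand it, so verify them against the current documentation:

```php
<?php
// Sketch: trigger a WebPageTest run matching the settings used in this
// testing (Chrome, multiple runs, first view only) via runtest.php.
$params = array(
    'url'      => 'http://example-test-site.com', // hypothetical test site
    'k'        => 'YOUR_API_KEY',                 // WebPageTest API key
    'location' => 'Dulles:Chrome',                // agent location and browser
    'runs'     => 9,                              // number of test runs
    'fvonly'   => 1,                              // first view only
    'f'        => 'json',                         // ask for a JSON response
);
$response = file_get_contents(
    'https://www.webpagetest.org/runtest.php?' . http_build_query( $params )
);
$result = json_decode( $response, true );
// On success the response includes URLs for polling the test results.
echo isset( $result['data']['jsonUrl'] ) ? $result['data']['jsonUrl'] : 'request failed';
```
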
5. WPPerformanceTester

I created a WordPress plugin to benchmark CPU, MySql and WordPress DB performance. It is based on a PHP benchmark script I forked (available on GitHub) and adapted to WordPress. The CPU/MySql benchmarks are testing the compute power. The WordPress component tests actually calling $wpdb and executing insert, select, update and delete queries. This plugin will be open sourced once I clean it up and make it usable for someone beyond myself.
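
The $wpdb portion can be sketched with WordPress's standard database API; $wpdb->insert(), get_results(), update() and delete() are the real methods, while the table name and loop count below are illustrative rather than the plugin's actual values:

```php
<?php
// Sketch of a $wpdb benchmark: time N insert/select/update/delete
// round-trips through WordPress's database layer. Must run inside
// WordPress (e.g. from a plugin), where $wpdb is available.
global $wpdb;

$table = $wpdb->prefix . 'benchmark_test'; // illustrative table name
$wpdb->query( "CREATE TABLE IF NOT EXISTS $table (
    id BIGINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    val VARCHAR(64) NOT NULL
)" );

$iterations = 250; // illustrative count
$start = microtime( true );
for ( $i = 0; $i < $iterations; $i++ ) {
    $wpdb->insert( $table, array( 'val' => md5( $i ) ) );
    $id = $wpdb->insert_id;
    $wpdb->get_results( $wpdb->prepare( "SELECT * FROM $table WHERE id = %d", $id ) );
    $wpdb->update( $table, array( 'val' => md5( $i + 1 ) ), array( 'id' => $id ) );
    $wpdb->delete( $table, array( 'id' => $id ) );
}
$elapsed = microtime( true ) - $start;
$wpdb->query( "DROP TABLE IF EXISTS $table" );

// Report as queries per second (4 queries per loop iteration).
printf( '%.2f queries/second', ( $iterations * 4 ) / $elapsed );
```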

Background Information

Before I go over the results I wanted to explain and discuss a few things. Every provider I tested had the latest version of WordPress installed. I had to ask a lot of companies to disable some security features to perform accurate load tests. Those companies were: DreamHost, Kinsta, LightningBase, Nexcess, Pagely, Pressidium, PressLabs, SiteGround, and WPEngine.

Every company that uses a VPS based platform was standardized around 2GB of memory for their plan (or equivalent) in an effort to make those results more comparable. The exception is DreamHost, which uses a VPS platform but with multiple scaling VPSs.

CloudWays has a platform that lets you deploy your WordPress stack to multiple providers: Digital Ocean, Amazon (AWS) EC2 servers or Google Compute Engine. I was given a server on each platform of nearly comparable specs (EC2 Small 1.7GB vs Digital Ocean 2GB vs GCE 1.7GB g1 Small). So CloudWays is listed as CloudWays AWS, CloudWays DO and CloudWays GCE to indicate which provider the stack was running on.

SiteGround contributed a shared and a VPS account, designated by the Shared or VPS after the name.


Load Storm

Since last round didn't show any real issues until 1000 users, I skipped all the smaller tests and began with 100-1000 users. I also ran the 500-2000 user test on every company instead of simply disqualifying companies. I ran these tests with an immense amount of help from Phillip Odom at LoadStorm. He spent hours with me, teaching me how to use LoadStorm more effectively, building tests and offering guidance/feedback on the tests themselves.

 Test 1. 100-1000 Concurrent Users over 30 minutes

Company Total Requests Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred(GB) Peak Throughput(kB/s) Average Throughput(kB/s) Total Errors
A Small Orange 114997 90.27 61.83 1785 259 2.41 1878.14 1295.82 0
BlueHost 117569 93.62 63.21 15271 2522 5.41 4680.6 2909.16 23350
CloudWays AWS 138176 109.1 74.29 15086 397 7.15 6016.88 3844.49 44
CloudWays DO 139355 109.88 74.92 2666 321 7.21 5863.82 3876.3 0
CloudWays GCE 95114 76.22 52.84 15220 7138 3.63 3247.38 2014.92 23629
DreamHost 143259 113.57 77.02 15098 314 7.1 6136.75 3815.73 60
FlyWheel 128672 101.98 69.18 9782 571 7 6197.92 3764.6 333
GoDaddy 134827 104.6 72.49 15084 352 7.49 6368.32 4028.45 511
Kinsta 132011 102.98 70.97 3359 229 7.35 6078.95 3951.75 0
LightningBase 123522 100.73 68.62 4959 308 6.53 5883.15 3626.2 4
MediaTemple 134278 105.72 74.6 15096 363 7.45 6397.68 4140.7 640
Nexcess 131422 104.47 70.66 7430 307 7.17 6256.08 3854.27 0
Pagely 87669 70.8 47.13 7386 334 5.75 5090.11 3091.06 3
Pantheon 135560 106.42 72.88 7811 297 7.24 5908.27 3890.83 0
Pressidium 131234 103.03 70.56 7533 352 7.23 6092.36 3889.64 0
PressLabs 132931 107.43 71.47 10326 306 3.66 3264.02 1968.98 0
SiteGround Shared 137659 111.35 74.01 7480 843 6.85 5565.02 3683.04 111
SiteGround VPS 130993 103.45 70.43 15074 310 7.17 6061.82 3855.86 19
WPEngine 148744 117.15 79.97 15085 206 7.32 6224.06 3935.35 4
WPOven.com 112285 96.58 60.37 15199 2153 5.78 5680.23 3108.94 5594
WPPronto 120148 99.08 64.6 15098 681 5.61 4698.51 3018.33 19295

Discussion of Load Storm Test 1 Results

Most companies were ok with this test, but a few didn't do well: BlueHost, CloudWays GCE, WPOven and WPPronto. FlyWheel, GoDaddy and Media Temple had a couple spikes but nothing too concerning. I was actually able to work with someone at DreamHost this time and bypass their security features and their results look better than last time. I am also excited that we got PressLabs working this time around after the difficulties last round.

In general, the 1000 user test isn't terribly exciting; 7 of the 21 companies got perfect scores with no errors, and another 6 had no more than 100 errors. Again, this test pointed out some weak candidates but really didn't do much to differentiate the upper end of the field.

Test 2. 500 - 2000 Concurrent Users over 30 Minutes

Note: Click the company name to see full test results.

Company Total Requests Peak RPS Average RPS Peak Response Time(ms) Average Response Time(ms) Total Data Transferred(GB) Peak Throughput(kB/s) Average Throughput(kB/s) Total Errors
A Small Orange 242965 181.62 130.63 15078 411 5.09 3844.54 2737 1
BlueHost 201556 166.83 111.98 15438 8186 5.32 5229.07 2953.17 93781
CloudWays AWS 261050 195.23 145.03 15245 2076 13.13 9685.95 7296.4 11346
CloudWays DO 290470 218.17 161.37 15105 532 14.87 12003.3 8262.77 1189
CloudWays GCE 193024 147.22 107.24 15168 8291 4.72 4583.86 2622.85 93821
DreamHost 303536 232.27 163.19 15100 442 14.95 12619.67 8039.54 210
FlyWheel 253801 202.15 136.45 15218 1530 11.26 9939.17 6052.49 56387
GoDaddy 283904 221.12 152.64 15025 356 15.74 13731.97 8460.12 1432
Kinsta 276547 214.93 148.68 15025 573 15.16 13444.75 8151.37 1811
LightningBase 263967 211.12 141.92 7250 330 13.82 13061.01 7429.91 18
MediaTemple 286087 223.93 153.81 15093 355 15.83 14532.42 8512.11 1641
Nexcess 277111 207.73 148.98 15087 548 15 12313.29 8066.37 359
Pagely 181740 148.18 97.71 11824 791 11.82 10592.21 6355.09 1
Pantheon 287909 223.02 154.79 15039 276 15.28 13831.45 8217.49 3
Pressidium 278226 208.55 149.58 15044 439 15.28 12453.66 8213.63 12
PressLabs 280495 214.07 150.8 8042 328 7.66 6267.46 4118.34 0
SiteGround Shared 301291 231.93 161.98 15052 557 14.76 12799.09 7934.03 1837
SiteGround VPS 279109 209.67 150.06 12777 374 15.21 12506.79 8178.5 20
WPEngine 316924 241.67 170.39 7235 285 15.52 12989.23 8341.47 3
WPOven.com 213809 169.97 118.78 15268 4442 8.81 7153.5 4894.98 35292
WPPronto 258092 206.53 143.38 15246 539 10.85 9483.74 6026.26 76276

Discussion of Load Storm Test 2 Results 

The companies that previously struggled (BlueHost, CloudWays GCE, WPOven and WPPronto) didn't improve, which is to be expected. FlyWheel, which had a few spikes before, ran into more serious difficulties and wasn't able to withstand the load. CloudWays AWS ended up failing, while their Digital Ocean machine spiked but was able to handle the load.

The signs of load were much more apparent this round, with a lot more spikes from many more companies. GoDaddy and Media Temple, which also had spikes in the first test, had spikes again but seemed to be able to withstand the load. Kinsta spiked early but was stable for the duration of the test. SiteGround Shared had a steady set of small spikes but didn't fail.

Nobody matched last round's perfection of no spikes in response times. Only one company managed an error-less run this time (PressLabs), but many achieved similar results: A Small Orange went from 0 errors to 1, Pantheon went from 0 to 3, and Pagely again had only 1 error.

The biggest change that occurred was WPEngine. It went from failing on the 1000 user test to having one of the better runs in the 2000 user test. I have to emphasize it was a shared plan on isolated hardware though with no competition for resources.


Blitz Test 1. 1-2000 Concurrent Users over 60 seconds

Blitz Test 1. Quick Results Table

Note: Click the company name to see full test results.

Company Hits Errors Timeouts Average Hits/Second Average Response Time(ms) Fastest Response(ms) Slowest Response(ms)
A Small Orange 51023 56 280 850 115 72 285
BlueHost 37373 475 2102 623 338 124 979
CloudWays AWS 56946 737 74 949 13 3 73
CloudWays DO 52124 1565 1499 869 35 23 87
CloudWays GCE 50463 1797 782 841 96 92 138
DreamHost 58584 1 0 978 4 4 4
FlyWheel 49960 3596 2022 833 30 24 140
GoDaddy 29611 26024 18 494 165 103 622
Kinsta 57723 1 0 962 20 20 21
LightningBase 54448 1 4 907 81 81 81
MediaTemple 29649 25356 126 494 162 104 1103
Nexcess 38616 4924 2200 644 221 70 414
Pagely 58722 1 0 979 3 2 5
Pantheon 55814 112 9 930 52 52 54
Pressidium 47567 1 9 793 233 233 234
PressLabs 58626 0 0 977 5 4 6
SiteGround Shared 49127 1123 1 819 172 171 178
SiteGround VPS 35721 75 4371 595 238 82 491
WPEngine 56277 827 1 938 27 21 70
WPOven.com 55027 10 2 917 69 68 71
WPPronto 54921 99 29 915 69 68 72


Discussion of Blitz Test 1 Results

This test simply checks whether the company is caching the front page and how well whatever caching system they have set up is performing (generally this hits something like Varnish or Nginx).
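
A quick way to check whether a host is serving the front page from such a cache layer is to look at the response headers: many Varnish/Nginx setups expose hints like Age, X-Cache or X-Varnish. A small sketch (the header names vary by host, so treat these as examples, not a definitive list):

```php
<?php
// Sketch: fetch the front page and print cache-related response headers.
// Header names (Age, X-Cache, X-Varnish, X-Proxy-Cache) vary by host.
$ch = curl_init( 'http://example-test-site.com/' ); // hypothetical site
curl_setopt_array( $ch, array(
    CURLOPT_NOBODY         => true, // headers only
    CURLOPT_HEADER         => true,
    CURLOPT_RETURNTRANSFER => true,
) );
$headers = curl_exec( $ch );
curl_close( $ch );

foreach ( explode( "\r\n", (string) $headers ) as $line ) {
    if ( preg_match( '/^(Age|X-Cache|X-Varnish|X-Proxy-Cache):/i', $line ) ) {
        echo $line, "\n"; // e.g. "X-Cache: HIT" suggests the cache answered
    }
}
```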

Who performed without any major issues?

DreamHost, Kinsta, LightningBase, Pagely, Pantheon, Pressidium, PressLabs, WPOven and WPPronto all performed near perfectly. There's nothing more to say for these companies other than that they did excellently.

Who had some minor issues?

A Small Orange started showing signs of load towards the end. CloudWays AWS had a spike and started to show signs of load towards the end. SiteGround Shared had a spike at the end that ruined an otherwise beautiful looking run. WPEngine started to show signs of load towards the end of the test.

Who had some major issues?

BlueHost, CloudWays DO, CloudWays GCE, FlyWheel, GoDaddy, MediaTemple, Nexcess, and SiteGround VPS had some major issues. The CloudWays platform pushed a ton of requests (the only ones in this group over 50,000) but also had a lot of errors and timeouts. The rest were below 50,000 (although FlyWheel was only a hair behind) and also had a lot of errors and timeouts. SiteGround VPS might be an example of how shared resources can deliver better performance than dedicated resources. GoDaddy and Media Temple had near identical performance (again, it's the same technology, I believe); both look perfect until near the end, where they crash and start erroring out. Nexcess just shows load taking its toll.

Uptime Monitoring

Both uptime monitoring solutions were third party providers that offer free services. All the companies were monitored over an entire month+ (May-June 2015).

Uptime Robot

Company Uptime % (30 Day)
A Small Orange 100
BlueHost 100
CloudWays AWS 100
CloudWays DO 100
CloudWays GCE 100
DreamHost 94.06
FlyWheel 100
GoDaddy 100
Kinsta 100
LightningBase 100
MediaTemple 100
Nexcess 100
Pagely 100
Pantheon 99.94
Pressidium 100
PressLabs 100
SiteGround Shared 100
SiteGround VPS 100
WebHostingBuzz 42.9
WPEngine 100
WPOven.com 100
WPPronto 100

At this point, I will finally address the odd elephant in the blog post: WebHostingBuzz has empty lines for all the previous tests. Why? Because their service went down and never came back online. I was told that I had put an incorrect IP address in the DNS. However, that IP worked when I started and was the IP address I was originally given (hence the 42.9% uptime; it was online when I started testing). It took weeks to even get a response, and once I corrected the IP, all it ever got was a configuration error page from the server. I've not received a response yet about this issue and have written them off as untestable.

The only other company that had any major issue was DreamHost. I'm not sure what happened, but they experienced some severe downtime while I was testing the system and returned an internal server error for 42 hours.

Every other company had 99.9% uptime or better.


StatusCake

StatusCake had a slightly longer window available from their reporting interface, so the percentages are a little different, noticeably so for companies like DreamHost.

Company Availability (%) Response Time (s)
A Small Orange 99.96 0.21
BlueHost 99.99 0.93
CloudWays AWS 100 0.76
CloudWays DO 100 0.47
CloudWays GCE 100 0.69
DreamHost 97.14 1.11
FlyWheel 100 1.25
GoDaddy 100 0.65
Kinsta 100 0.71
LightningBase 99.99 0.61
MediaTemple 100 1.38
Nexcess 100 0.61
Pagely 99.99 0.47
Pantheon 99.98 0.56
Pressidium 99.99 0.94
PressLabs 100 0.65
SiteGround Shared 100 0.54
SiteGround VPS 100 0.9
WebHostingBuzz 58.1 0.67
WPEngine 100 0.71
WPOven.com 100 0.73
WPPronto 100 1.19

The results mirror UptimeRobot pretty closely. WebHostingBuzz and DreamHost had issues. Everyone else is 99.9% or better.

StatusCake uses a real browser to track response time as well. Compared to last year, everything looks faster. Only two companies had a sub one second average response time last year; this year, almost every company maintained a sub one second response time, even the company with servers in Europe (Pressidium).


WebPageTest

Every test was run with the same settings: Chrome browser, 9 runs, native connection (no traffic shaping), first view only.

Company Dulles, VA Miami, FL Denver, CO Los Angeles, CA Average (load times in seconds)
A Small Orange 0.624 0.709 0.391 0.8 0.631
BlueHost 0.909 1.092 0.527 0.748 0.819
CloudWays AWS 0.627 0.748 0.694 1.031 0.775
CloudWays DO 0.605 0.751 0.635 1.075 0.7665
CloudWays GCE 0.787 0.858 0.588 1.019 0.813
DreamHost 0.415 0.648 0.522 0.919 0.626
FlyWheel 0.509 0.547 0.594 0.856 0.6265
GoDaddy 0.816 1.247 0.917 0.672 0.913
Kinsta 0.574 0.559 0.587 0.903 0.65575
LightningBase 0.544 0.656 0.5 0.616 0.579
MediaTemple 0.822 0.975 0.983 0.584 0.841
Nexcess 0.712 0.871 0.593 0.795 0.74275
Pagely 0.547 0.553 0.665 0.601 0.5915
Pantheon 0.627 0.567 0.474 0.67 0.5845
Pressidium 0.777 0.945 0.898 1.05 0.9175
PressLabs 0.542 1.257 0.723 0.732 0.8135
SiteGround Shared 0.721 0.85 0.478 0.808 0.71425
SiteGround VPS 0.667 0.651 0.515 0.657 0.6225
WebHostingBuzz 0
WPEngine 0.648 0.554 0.588 0.816 0.6515
WPOven.com 0.624 0.574 0.556 0.595 0.58725
WPPronto 0.698 0.809 0.443 0.721 0.66775

In line with the StatusCake results, the WebPageTest results were shockingly fast. The first time I did this testing, only one company had a sub one second average response time. Last year about half the companies were over one second on average. The fastest last year was LightningBase at 0.7455 seconds; this year that would be in the slower half of the results. The fastest this year was LightningBase again, at 0.579 seconds. The good news appears to be that everyone is getting faster, and your content will reach visitors faster than ever no matter who you choose.


WPPerformanceTester

Company PHP Ver MySql Ver PHP Bench (s) WP Bench (queries/s) MySql Host
A Small Orange 5.5.24 5.5.42-MariaDB 13.441 406.67 LOCALHOST
BlueHost 5.4.28 5.5.42 12.217 738.01 LOCALHOST
CloudWays AWS 5.5.26 5.5.43 10.808 220.12 LOCALHOST
CloudWays DO 5.5.26 5.5.43 11.888 146.76 LOCALHOST
CloudWays GCE 5.5.26 5.5.43 10.617 192.2 LOCALHOST
DreamHost 5.5.26 5.1.39 27.144 298.6 REMOTE
FlyWheel 5.5.26 5.5.43 12.082 105.76 LOCALHOST
GoDaddy 5.4.16 5.5.40 11.846 365.76 REMOTE
Kinsta 5.6.7 10.0.17-MariaDB 11.198 619.58 LOCALHOST
LightningBase 5.5.24 5.5.42 12.369 768.64 LOCALHOST
MediaTemple 5.4.16 5.5.37 12.578 333.33 REMOTE
Nexcess 5.3.24 5.6.23 12.276 421.76 LOCALHOST
Pagely 5.5.22 5.6.19 10.791 79.79 REMOTE
Pantheon 5.5.24 5.5.337-MariaDB 12.669 194.86 REMOTE
Pressidium 5.5.23 5.6.22 11.551 327.76 LOCALHOST
PressLabs 5.6.1 5.5.43 8.918 527.7 REMOTE
SiteGround Shared 5.5.25 5.5.40 14.171 788.02 LOCALHOST
SiteGround VPS 5.6.99 5.5.31 11.156 350.51 LOCALHOST
WPEngine 5.5.9 5.6.24 10.97 597.37 LOCALHOST
WPOven.com 5.3.1 5.5.43 11.6 570.13 LOCALHOST
WPPronto 5.5.25 5.5.42 11.485 889.68 LOCALHOST

This test is of my own creation. I created a plugin designed to test a few aspects of performance and get information about the system it was running on. The results here have no bearing on how I am evaluating these companies, because I don't have enough details to make them meaningful. My goal is to publish the plugin and get people to submit their own benchmarks, which would give me a better picture of the real performance people are experiencing and let me track changes over time.

The server details it extracted may be of interest to many people. Most companies were running PHP 5.5 or later, but a few weren't. Most companies seem to be running normal MySql, but ASO, Kinsta and Pantheon are all running MariaDB, which many people think has better performance. Considering where all three of those companies ended up on these tests, it's not hard to believe. Companies also split between running MySql on localhost (the majority, including BlueHost, LightningBase, Nexcess, SiteGround, WPEngine and WPPronto) and using a remote MySql server (DreamHost, GoDaddy, MediaTemple, Pagely, Pantheon, PressLabs).
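
The localhost/remote distinction in the table is visible directly in each install's wp-config.php via the standard DB_HOST constant; a plugin can classify it with a simple check like this sketch (the classification logic is mine, not necessarily the plugin's):

```php
<?php
// Sketch: classify the database host the way the table above does,
// using WordPress's standard DB_HOST constant from wp-config.php.
function db_host_type() {
    $host = defined( 'DB_HOST' ) ? DB_HOST : 'localhost';
    // Strip an optional :port or :socket suffix before comparing.
    $name = strtolower( preg_replace( '/:.*$/', '', $host ) );
    return in_array( $name, array( 'localhost', '127.0.0.1' ), true )
        ? 'LOCALHOST'
        : 'REMOTE';
}
echo db_host_type();
```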

The PHP Bench was fascinating because most companies were pretty close with the exception of DreamHost which took nearly twice as long to execute.

The WP Bench was all over the place. Pagely had by far the slowest result, yet on every load test and speed test they went through, they performed with near perfect scores. The test simulates 1000 $wpdb calls doing the primary mysql functions (insert, select, update, delete). Other companies, like WPPronto or BlueHost, had outrageously fast scores but didn't perform anywhere near as well as Pagely on the more established tests.

For those reasons, I don't think this benchmark is usable yet. But I would love feedback and thoughts on it from the community and the hosting companies themselves.

Attempting VPS Parity in Testing

One substantial change to the testing methodology this round was that all VPS providers were tested with the same amount of memory (2GB RAM). Since the most interesting tests were the load tests, I have only graphed them below:


The LoadStorm test had a huge spread in terms of performance. The Google Compute Engine server from CloudWays was by far the worst (an issue we touched on before: it's not a true VPS with dedicated resources). FlyWheel and WPOven also struggled to keep up with the LoadStorm test. Others, like ASO, CloudWays DO, Kinsta, and SiteGround, handled the test with minimal issues. On the other hand, it's very interesting to see how consistently most of the VPSs performed in the Blitz test, between 50,000 and roughly 55,000 hits. The error rates are a bit interesting though, because this hardware should be about as close to the same as possible.

The easier result to explain is the Blitz performance. It is testing the ability of these companies to spit back a single page from cache (most likely Varnish or Nginx). So that level of caching seems to be pretty close to parity.

The LoadStorm test shows a wide difference in performance. It is far more comprehensive and designed to bust through some caching and hit other parts of the stack. It really elucidates the differences in companies' ability to tune and optimize their servers from both software and hardware perspectives.


Every service seems to have its issues somewhere if you look hard enough. I try to avoid injecting my personal opinion and bias as much as possible. As I've added more companies to the testing, drawing a line between which companies performed in the top tier and which did not has become blurrier. The closest test was the LoadStorm 2000 test, where multiple companies (CloudWays DO, GoDaddy, Kinsta, Media Temple, SiteGround Shared) were on the absolute edge of being top tier providers. Last time I picked an arbitrary 0.5% error rate, and these companies were all around the 0.5-0.7% mark; last year the difference was quite large after that point.

I openly admit to having personal connections with people at nearly all these companies, and my ability to draw the line in this instance could be considered questionable. So this year I deferred the judgment to an independent party, Phillip Odom at LoadStorm, to determine what he thought of the performances. Phillip is the Director of Performance Engineering at LoadStorm, and he has more experience with load testing and the LoadStorm product than almost anyone I know. His job was to determine whether a performance could be considered top tier or not. He said a couple of spikes early but a stable performance otherwise seemed top tier; a difference of 1/100 of a percent didn't seem like a big deal, especially over a 30 minute test where the issues were at the start as it ramped up to 2000 concurrent users. So the companies on the edge that exhibited that behavior were considered top tier for the LoadStorm test.

I won't be ranking or outright saying any single company is the best. Some providers did exceptionally well and tended to clump together performance-wise; I will call those the top tier providers. This top tier designation relates to performance only and is based solely on the results of these tests. What each of these companies offers is different and may suit different audiences depending on a variety of factors beyond performance, such as features, price, support, and scale (I tested mostly entry level plans or 2GB RAM plans for VPS providers). I will also provide a short summary and discussion of the results for each provider.

Top Tier WordPress Hosting Performance


A Small Orange, Kinsta, LightningBase, Pagely, Pantheon, Pressidium, PressLabs

Each of these companies performed with little to no failures in all tests and exhibited best in class performance for WordPress hosting.

Honorable Mentions

CloudWays gets an honorable mention because its Digital Ocean (DO) instance performed quite well overall. It had some issues with the Blitz test at the end but still managed to push through over 52,000 successful hits. Its Amazon stack performed better on the Blitz test but not as well on LoadStorm. I'm not sure why the performance of identical stacks differs so much across tests between AWS/DO, but they improved dramatically since the last test and are on the cusp of becoming a top tier provider.

SiteGround's Shared hosting also gets an honorable mention. It was on the edge for both LoadStorm and Blitz. It had one spike at the end of the Blitz test which caused its error rate to jump, but the response times didn't move.

WPEngine gets an honorable mention because they performed well on most tests. However, they struggled and showed signs of load on the Blitz test, which kept them out of the top tier of providers.

Individual Host Analysis

A Small Orange [Reviews]

Another top tier performance from ASO. They didn't really struggle with any of the tests. Although their performance was slightly below last time's results, it's hard to beat things like zero errors during LoadStorm's test. It's also become easier to launch their LEMP VPS stack, which is nice. All in all, the experience was in line with what I would expect from a company that has one of the highest support ratings on our site.

BlueHost [Reviews]

Improved over their last results, but still well below par in the performance department. The pricing and performance just don't match yet.

CloudWays [Reviews]

CloudWays is always a fun company to test. They added another provider since their last test: Google Compute Engine (GCE). Their Digital Ocean and Amazon performances both went up substantially, which tells me they've made major improvements to their WordPress stack. We did run into some huge flaws with GCE though, which aren't CloudWays's fault. We used the g1.small server on GCE and ran into huge performance walls that were repeatable and inexplicable from a software standpoint. Google was contacted, and we learned that the "g1 family has 'fractional' CPU, meaning that not a full virtual CPU is assigned to a server. This also means that the CPU is shared with other VMs and 'capped' if usage exceeds a certain amount. This is exactly what happened during the load test. The VM runs out of CPU cycles and has to wait for new ones being assigned on the shared CPU to continue to serve requests." Essentially, it's not a real VPS with dedicated resources, and I was told a comparable machine would be the n1-standard-1, which is 2-3x the price of the comparable AWS/DO servers. That doesn't make GCE a very attractive platform to host on if you're looking for performance and cost efficiency. CloudWays did show major improvements this round and earned themselves that honorable mention. They were by far the most improved provider between tests.

DreamHost [Reviews]

DreamPress improved their performance a lot over last round. In fact, they did fantastically well on every load test once I got the opportunity to actually work with their engineers to bypass the security measures. However, they failed pretty badly on the uptime metrics. I have no idea what happened, but I experienced a huge amount of downtime and ran into some very strange errors. If it weren't for the severe downtime issues, DreamPress could have been in the top tier.

FlyWheel [Reviews]

FlyWheel was excellent on every test except the final seconds of the Blitz test. Although they were just shy of the top tier, they are showing a lot of consistency in very good performance, earning an honorable mention the last two times. Just some minor performance kinks to work out. Not bad at all for a company with the best reviews of any company Review Signal has ever tracked. FlyWheel is definitely worth a look.

GoDaddy [Reviews]

GoDaddy's performance declined this round. It struggled with the Blitz test this time around. I'm not sure what changed, but it handled Blitz far worse than before and LoadStorm slightly worse. The performance between GoDaddy and Media Temple again looked near identical, with the same failure points on Blitz. At the retail price of $6.99 though, it's still a lot of bang for your buck compared to most providers, who are in the $20-30/month range.


Kinsta [Reviews]

Kinsta had another top tier performance. There was a slight decline in performance, but that could be explained by the fact that we tested different products: Kinsta's test last year was on a Shared plan they no longer offer, while this year it was a 2GB VPS. Dedicated resources are great, but sometimes shared gives you a little extra with good neighbors, which could explain the difference. Either way, Kinsta handled all of the tests exceptionally well and earned itself top tier status.


LightningBase [Reviews]

LightningBase is another consistent performer on our list. Another test, another top tier rank earned. It had ridiculous consistency in the Blitz test, where the fastest and slowest responses were both 81ms. A textbook performance at an incredible value of $9.95/month.

Media Temple [Reviews]

Media Temple and GoDaddy are still running the same platform by all indications. Media Temple offers a more premium set of features, like Git, WP-CLI and staging, but the performance was identical. It declined from last time and had the same bottlenecks as GoDaddy.


Nexcess [Reviews]

I feel like copy and paste is the right move for Nexcess. Nexcess's performance was excellent in the Load Storm testing. However, it collapsed during the Blitz load testing, the same behavior as last year. It handled the Blitz test better this year, but still not well enough. Nexcess ends up looking like a middle of the pack web host instead of a top tier one because of the Blitz test, again.

Pagely [Reviews]

Is the extra money worth it? Only if you value perfection. Pagely came through again with an amazing set of results. It handled more hits than anyone in the Blitz test, a staggering 58,722 hits in 60 seconds (979 hits/second). We're approaching the theoretical maximum of 1000 hits/second at this point. And Pagely did it with 1 error and a 3ms difference between the fastest and slowest responses. The original managed WordPress company continues to put on dominant performance results.

Pantheon [Reviews]

Another test, another top tier performance. Just another day being one of the most respected web hosts in the space. Everyone I talk to wants to compare their company to these guys. It's obvious why, they've built a very developer/agency friendly platform that looks nothing like anything else on the market. It also performs fantastically well. They didn't perform the absolute best on any particular test but they were right in the top echelon with minimal errors on everything.


Pressidium [Reviews]

Pressidium was a new entrant, and it did exceptionally well. They are UK based and suffered slightly on some performance tests because of latency between the UK and the US testing locations used. For example, the Blitz testing showed fewer responses, but the total of 10 errors and timeouts shows pretty clearly that this was a throughput-across-the-Atlantic issue rather than their service struggling, because there was only a 1ms spread from the fastest to slowest response. Incredibly consistent performance. Despite their geographic disadvantage in this testing, they still managed to keep a sub one second response from four US testing locations in the WebPageTest testing. Overall, a top tier performance from a competitor across the pond.


PressLabs [Reviews]

We finally got PressLabs working with the LoadStorm testing software, and it was worth the wait. They were the only company to handle the 2000 logged in user test with zero errors. Combined with the second fastest Blitz test (again without a single error), that puts PressLabs firmly in the top tier, as you would expect from the most expensive offering tested this round.

SiteGround [Reviews]

It was nice that we finally worked out the security issues in testing SiteGround with LoadStorm. SiteGround's Shared hosting platform bounced back after last year's testing. Their Blitz performance went up substantially and put it back into the honorable mention category. The VPS performance was slightly worse on the Blitz test, but noticeably better on the much longer LoadStorm test. This could be a good example of when Shared hosting can outperform dedicated resources: Shared hosting generally has access to a lot more resources than smaller VPS plans, and depending on how they are set up and managed, you can often get more burst performance from Shared than from a small VPS. In the longer term, though, dedicated resources are generally more stable (and guaranteed). SiteGround's Shared hosting definitely helps keep lower priced options with excellent performance a reality for many.


WebHostingBuzz [Reviews]

WebHostingBuzz asked to be included in this testing and then completely disintegrated, to the point that I couldn't even test them. I never heard anything from them for months. I would like to know what happened, but until I actually get a response, this one will remain a bizarre mystery.

WPEngine [Reviews]

This is a difficult one to write about. There were definite performance improvements; they jumped up to an honorable mention, and their engineers actually worked to resolve some security issues that hindered previous testing. My biggest concern is the isolated shared environment I was on. A shared environment has a lot more resources than many dedicated environments, and I was isolated to prevent the testing from affecting any customers (which is a reasonable explanation). But that means I was likely getting the absolute dream scenario in terms of resource allocation, so these results represent the very best case a normal user might see. WPEngine is certainly capable of delivering better performance than they did in the past, but I do have concerns about whether a new user could reasonably expect the same results.


WPOven.com [Reviews]

WPOven was another new entrant to this testing, and they performed well in a couple of tests. They flew through the Blitz test without any issues. Their WebPageTest results were among the absolute fastest in an already fast pack. Their uptime was perfect. They did struggle with the LoadStorm tests though, at both the 1000 and 2000 user levels. It's nice to see more competitors enter the space; WPOven put on a good first show, but there are still some serious improvements to make to catch up to the front of the field.


WPPronto [Reviews]

Another new entrant, who ran into a severe testing issue which caused me to re-do all the tests. The server was given more resources than the plan specified while debugging some security issues. The results on the extra resources were on par with some of the top of the field, but not representative of what the actual plan would be able to achieve. I didn't believe it was malicious (they were quite transparent about what happened), so I gave them the benefit of the doubt and re-did all testing under closely monitored conditions. With the default resource allocation, WPPronto couldn't withstand LoadStorm's test. The results were pretty easy to see in the 508 errors it started to throw on the properly resourced plan; it ran out of processes to handle new connections, as expected. As with all new entrants that don't leap to the forefront, I hope they continue to improve their service and do better next round.


Thank You

Thank you to all the companies for participating and helping make this testing a reality. Thanks to LoadStorm and specifically Phillip Odom for all his time and the tools to perform this testing. Thanks to Peter at Kinsta for offering his design support.



8/13/2015 : The wrong PDF was linked for DreamHost and its Blitz numbers were adjusted to reflect their actual performance. This change has no effect on how they were ranked since the issue was with downtime.


WordPress.org Revamping Hosting Recommendations


Almost exactly three months ago I wrote about how WordPress sold us out with regards to their web hosting 'recommendations,' which were (and still are) just advertisements (now a single ad for BlueHost). So what changed?


It appears that WordPress.org is going to change their web hosting recommendations and acknowledges that the industry has changed (dramatically I might add). They left the nonsense text at the top about how BlueHost is the best and brightest while the new text at the bottom admits that isn't the case at all. I will assume it was a minor oversight in the updating of the page.

So what appears to be changing? They are taking requests from companies to be included with a very long survey which is fairly comprehensive covering demographics, staff, usability, technology, financials and more. It looks like it's either designed for a very thorough vetting process or to prevent companies from signing up because of the amount of disclosure required.

Questions like "How many net paying customers are you gaining/losing each month?" and "Approximately how many 30-day-active paying customers do you have?" are pretty revealing questions for private companies. Especially considering that there are potentially two competitors with a strong financial relationship to the murky organizational structure of the WordPress Foundation: Automattic (WordPress.com) and Endurance International Group (investor in Automattic, parent company of BlueHost, the primary affiliate of WordPress.org).

There doesn't seem to be any details about how this process will work, who will be managing it, or when the community will get to have any input.

Going Forward

It looks like a step in the right direction, because any direction is better than the current one. But I'm not sold yet.

What I would like to see is actual community input about how recommendations should be made. The current survey is for hosting companies only and doesn't cover the quality of service actually experienced. It asks very detailed questions about what the company says they are doing, but we all know that is the best case scenario that's communicated publicly. Naturally, as a web hosting review website, I think reviews are an important feedback and decision mechanism for making informed choices. There are other ways to evaluate companies like performance benchmarking as well. The community may or may not agree with any of these methods, but they should at least be given a chance to make their voice heard in this discussion. It's the people who know the least that are going to be reading and making decisions from WordPress.org's hosting page, it's our responsibility to help them as much as possible.

I would like transparency on how the companies are going to be evaluated. The black box magic that has given us BlueHost as the best WordPress host for a decade needs to end.

Proper disclosure is a must. The wishy-washy "these companies donate a portion of your fee back to us" crap needs to end. If you're going to run affiliate links, be up front about them and don't pretend they're something else. This ties back into transparency: if you're going to use affiliate programs and get kickbacks, it needs to be done honestly. People have a right to full information about the relationship between WordPress.org and the companies it recommends.

In conclusion, I'm glad to see that changes might be forthcoming, but I urge WordPress.org to do it properly and provide something that is honest and transparent.

Kinsta WordPress Hosting Review


This post is based on WordPress Hosting Performance Benchmarks (2014).



Kinsta is yet another newcomer to our testing with something to prove, and it easily shot to the top of our performance charts. Kinsta's plans have changed quite a bit since we tested them: when our testing was done they offered a $27/month plan, but they've since gone up-market and their cheapest plan is now $157/month. It seems they're targeting people who want serious performance.

The Plan

All testing was done on a shared account, which is no longer available. The plan tested allowed 1 WordPress site and included 1GB SSD disk space and 50GB bandwidth for $27/month.



The first performance test was done with LoadStorm. Kinsta made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see Kinsta's result in this graph (click on it to play with the interactive results):



Kinsta aced the LoadStorm test. It had zero errors and one of the fastest average response times at 316ms. Kinsta also had the absolute lowest peak response time at 942ms. That's an astonishing feat: over 30 minutes Kinsta served nearly 250,000 requests, and not a single one took more than a second to deliver. Amazing.
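For the curious, here's a rough sketch of what a single logged-in virtual user looks like against a stock WordPress install. This is only an illustration of the idea behind the test, not LoadStorm's actual scripting; the site URL and credentials are placeholders.

```python
# Minimal sketch of one "logged-in user" against a stock WordPress
# site, roughly what a cache-busting load test does per virtual user.
import requests

BASE = "https://example.com"           # placeholder test site
USER, PASSWORD = "testuser", "secret"  # placeholder credentials

session = requests.Session()
session.get(f"{BASE}/wp-login.php")    # sets WordPress's test cookie

session.post(f"{BASE}/wp-login.php", data={
    "log": USER,
    "pwd": PASSWORD,
    "wp-submit": "Log In",
    "testcookie": "1",
})

# Browse a few pages as the logged-in user. The auth cookie bypasses
# page caches, so these hits exercise PHP and the database.
for path in ["/", "/?p=1", "/wp-admin/"]:
    r = session.get(BASE + path)
    print(path, r.status_code, r.elapsed.total_seconds())
```

Because every request carries an auth cookie, page caches get bypassed and the server has to do real PHP and database work, which is exactly why this test separates the strong setups from the weak ones.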


The second load test that was run on Kinsta was Blitz. Blitz was used to test cached performance. It simply requested the home page, ramping from 1 to 2,000 requests per second.


I can't draw lines this straight. The response time was flat. As you would expect from a company that aced the cache-busting test, they didn't struggle in the slightest. Full Blitz Results (PDF)
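Blitz generated that ramp from its own infrastructure; if you want a feel for the mechanics, here's a toy single-machine sketch of a linear ramp. It's nowhere near a real distributed load test, and the URL, ramp size and duration are made-up values.

```python
# Toy sketch of a ramped load test in the spirit of the Blitz run:
# step the request rate up each second and record status + timing.
import asyncio
import time

import aiohttp

URL = "https://example.com/"  # placeholder test site

async def hit(session, results):
    start = time.monotonic()
    try:
        async with session.get(URL) as resp:
            await resp.read()
            results.append((resp.status, time.monotonic() - start))
    except aiohttp.ClientError:
        results.append(("error", time.monotonic() - start))

async def ramp(max_rps=50, seconds=30):
    results, tasks = [], []
    async with aiohttp.ClientSession() as session:
        for second in range(seconds):
            # linear ramp from 1 req/s up to max_rps
            rps = 1 + (max_rps - 1) * second // max(seconds - 1, 1)
            tasks += [asyncio.create_task(hit(session, results))
                      for _ in range(rps)]
            await asyncio.sleep(1)
        await asyncio.gather(*tasks)
    return results

results = asyncio.run(ramp())
print(len(results), "requests,",
      sum(1 for s, _ in results if s != 200), "non-200s")
```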


Two third-party uptime monitoring services (StatusCake and UptimeRobot) tracked the test site for a month. The results for Kinsta were perfect: 100% uptime according to both sources.
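An uptime monitor is conceptually simple; here's a bare-bones sketch of the idea. The URL and interval are placeholders, and real services like StatusCake and UptimeRobot poll from many locations and alert you, which is why I rely on them rather than something like this.

```python
# Bare-bones version of what an uptime monitor does: poll the site on
# an interval, count successes, and compute an uptime percentage.
import time

import requests

URL = "https://example.com/"   # placeholder monitored site
INTERVAL = 60                  # seconds between checks
CHECKS = 10                    # keep the demo short

up = 0
for _ in range(CHECKS):
    try:
        ok = requests.head(URL, timeout=10).status_code < 500
    except requests.RequestException:
        ok = False              # network failure counts as downtime
    up += ok
    time.sleep(INTERVAL)

print(f"uptime: {100 * up / CHECKS:.2f}%")
```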


“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. Kinsta was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Company | Dulles, VA | Miami, FL | Denver, CO | Los Angeles, CA | Average
Kinsta | 0.759 | 0.752 | 0.947 | 0.592 | 0.7625
(all load times in seconds)

Kinsta had the second fastest average response time of all the companies we tested. No issues with this test in the slightest.
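WebPageTest also has a public API, so tests like these can be scripted. The endpoint and field names below are from WebPageTest's documentation as I understand it; treat this as a sketch and verify against their docs. The API key, site URL and locations are placeholders.

```python
# Sketch of scripting WebPageTest's public API to repeat the
# multi-location test. Verify endpoints/fields against the WPT docs.
import time

import requests

API = "https://www.webpagetest.org"
KEY = "YOUR_API_KEY"                    # placeholder
SITE = "https://example.com/"           # placeholder test site
LOCATIONS = ["Dulles:Chrome", "ec2-us-west-1:Chrome"]  # examples

for loc in LOCATIONS:
    test = requests.get(f"{API}/runtest.php", params={
        "url": SITE, "k": KEY, "location": loc, "f": "json",
    }).json()
    test_id = test["data"]["testId"]

    while True:                          # poll until the run finishes
        result = requests.get(f"{API}/jsonResult.php",
                              params={"test": test_id}).json()
        if result.get("statusCode") == 200:
            break
        time.sleep(10)

    load_ms = result["data"]["average"]["firstView"]["loadTime"]
    print(loc, load_ms / 1000, "seconds")
```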


Kinsta, a newcomer to our testing, jumped straight to the top of the performance tiers. Kinsta's performance was amazing in the LoadStorm 2000 logged-in user test: they had the lowest peak response time and zero errors over a 30-minute test. They didn't struggle with any of the tests whatsoever and showed zero downtime. Kinsta's performance was undoubtedly top tier.

Visit Kinsta


WebSynthesis WordPress Hosting Review


This post is based on WordPress Hosting Performance Benchmarks (2014).



WebSynthesis [Reviews] had an extremely strong showing in our first round of testing once I got past a security issue. They managed to defend their status as a top tier WordPress web host.

The Plan

All testing was done on a VPS account. The plan tested had 2GB RAM, 40GB disk space, 650GB bandwidth, 20,000 visitors/day and costs $97/month.



The first performance test was done with LoadStorm. WebSynthesis made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see WebSynthesis's result in this graph (click on it to play with the interactive results):



WebSynthesis stayed under the 0.5% error-rate threshold, but it was close. This grueling 2000-user test really put a strain on the server, as you can see from the spikes, but it held for 30 minutes without failing.


The second load test that was run on WebSynthesis was Blitz. Blitz was used to test cached performance. It simply requested the home page, ramping from 1 to 2,000 requests per second.


WebSynthesis was better than flat; there is a slight downward trend in response time. WebSynthesis led the pack again, delivering 57,776 hits in one minute (roughly 963 hits per second) with a single error, the best result of any company on this test. Full Blitz Results (PDF)


Two third-party uptime monitoring services (StatusCake and UptimeRobot) tracked the test site for a month. The results for WebSynthesis were 100% uptime according to both sources, again.


“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. WebSynthesis was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Company | Dulles, VA | Miami, FL | Denver, CO | Los Angeles, CA | Average
WebSynthesis | 0.407 | 0.835 | 0.982 | 1.024 | 0.7812
(all load times in seconds)

WebSynthesis handled this test fine. In fact, they had the single fastest average page load from a single location of any company at 0.407 seconds from Dulles, VA.


WebSynthesis [Reviews] teetered on the LoadStorm test's 0.5% error threshold, but they stayed under it and handled the test quite well. They also had no weird security issues this time around, and WebSynthesis led the pack on Blitz testing. They went from 871 hits/second last time to 963 hits/second this time, leading every provider on the Blitz tests with a whopping 1 error to boot. Sprinkle in some perfect uptime numbers and it's clear WebSynthesis is still a top tier provider and is continuing to get better.

Visit WebSynthesis


Pagely WordPress Hosting Review


This post is based on WordPress Hosting Performance Benchmarks (2014).



Pagely came in with a title to defend. Pagely was one of the top tier web hosts in our first round of testing and didn't show any signs of struggling. My biggest complaint was that SFTP was an add-on, which they now include with every account. Performance-wise, Pagely was back at it again with another top tier showing.

The Plan

All testing was done on a shared account, the Personal / Business plan.  This plan allows for 1 WordPress site, 5GB disk space, 10GB bandwidth and costs $24/month.



The first performance test was done with LoadStorm. Pagely made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see Pagely's result in this graph (click on it to play with the interactive results):



Pagely did well on this test. There was one error in total, which caused a response time spike (the blue line in the graph). Other than that single error, the performance was impeccable.


The second load test that was run on Pagely was Blitz. Blitz was used to test cached performance. It simply requested the home page, ramping from 1 to 2,000 requests per second.


Pagely's Blitz result was exemplary. There were only 43 timeouts and errors combined, and the response time stayed nearly flat, which means it had no real trouble with the load. Pagely didn't blink at this test, as expected based on their performance on it last time. Full Blitz Results (PDF)


Two third-party uptime monitoring services (StatusCake and UptimeRobot) tracked the test site for a month. The results for Pagely were 99.95% and 100% uptime. It's hard to complain about those numbers or find any issue with Pagely's uptime.


“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. Pagely was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Company | Dulles, VA | Miami, FL | Denver, CO | Los Angeles, CA | Average
Pagely | 6.831 | 0.86 | 0.913 | 0.709 | 2.32825
(all load times in seconds)

Pagely was the only company that had any issue with the WebPageTest component of our testing. The test from Dulles had bizarrely high load times for no explicable reason. The other locations were all sub one second, so I dismissed the Dulles number as likely some fluke networking issue rather than a real problem with the host; still, it was a weird result worth noting.
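To put a number on how much that one Dulles run skews things, here's the arithmetic using the load times from the table above:

```python
# How much the anomalous Dulles run skews Pagely's average,
# using the load times from the table above (in seconds).
times = {"Dulles": 6.831, "Miami": 0.86, "Denver": 0.913, "LA": 0.709}

with_outlier = sum(times.values()) / len(times)
without = sum(v for k, v in times.items() if k != "Dulles") / 3

print(f"average with Dulles:    {with_outlier:.5f}s")   # 2.32825
print(f"average without Dulles: {without:.3f}s")        # ~0.827
```

Without the Dulles outlier, Pagely's average would have been right in line with the other top tier hosts.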


Pagely easily defended its title as one of the top tier WordPress hosts. They handled the LoadStorm test with 1 error. Blitz results stayed similar to the last run: they handled more hits, but had a few more errors and timeouts (1 last time, 43 this time). If performance is the name of the game, Pagely continues to be at the forefront.

Visit Pagely


LightningBase WordPress Hosting Review

This post is based on WordPress Hosting Performance Benchmarks (2014).



LightningBase was a newcomer to our WordPress Hosting Performance Benchmarks. LightningBase's founder, Chris Piepho, was incredibly helpful, giving feedback on how the testing in Round 1 was done. His feedback played a big part in the differences you see in Round 2, namely cache busting. So it's no great surprise that a service run by someone who cares so deeply about performance did remarkably well in our testing.
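For readers unfamiliar with the term, cache busting just means making requests that can't be served from a page cache, so the backend has to do real work. In Round 2 that was accomplished by logging users into WordPress, but the simplest illustration of the idea is a unique query string per request. This is a sketch only; the URL is a placeholder.

```python
# The essence of cache busting: tack a unique query string onto each
# request so page caches miss and the backend does real work.
import uuid

import requests

URL = "https://example.com/"  # placeholder test site

for _ in range(5):
    busted = f"{URL}?nocache={uuid.uuid4().hex}"  # unique per request
    r = requests.get(busted)
    print(busted, r.status_code, r.elapsed.total_seconds())
```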

The Plan

All testing was done on a shared account, the Personal plan. The Personal plan allows for 1 WordPress site, 10,000 visits/month, 1GB SSD disk space, 10GB bandwidth, 20GB CDN and costs $9.95/month.



The first performance test was done with LoadStorm. LightningBase made it to the final round of testing where 2000 concurrent users were logging into WordPress and browsing the test site. The test was designed to test non-cached performance by logging users into WordPress. It caused many hosting setups to crumble. You can see LightningBase's result in this graph (click on it to play with the interactive results):



LightningBase handled the test with minimal errors (23) and showed little sign of struggling with the load. There appears to be a bit of delay every so often that looks like a cache update; other than that minor detail, it looks excellent.


The second load test that was run on LightningBase was Blitz. Blitz was used to test cached performance. It simply requested the home page, ramping from 1 to 2,000 requests per second.


LightningBase's Blitz result looks textbook: no errors, no timeouts, and a near-flat response time, which means it had no issues at all. LightningBase aced our Blitz testing. Full Blitz Results (PDF)


Two third-party uptime monitoring services (StatusCake and UptimeRobot) tracked the test site for a month. The results for LightningBase were perfect uptime in both cases. In the uptime department, LightningBase had a flawless performance.


“WebPagetest is an open source project that is primarily being developed and supported by Google as part of our efforts to make the web faster.” WebPageTest grades performance and allows you to run tests from multiple locations simulating real users. LightningBase was tested from Dulles, VA, Miami, FL, Denver, CO, and Los Angeles, CA.

Company | Dulles, VA | Miami, FL | Denver, CO | Los Angeles, CA | Average
LightningBase | 0.584 | 0.787 | 0.936 | 0.675 | 0.7455
(all load times in seconds)

There were absolutely no issues with their WebPageTest results. LightningBase had the fastest average load time of any host in our testing.


LightningBase is another newcomer that jumped straight to the top, and one of the cheapest too, starting at under $10 per month. LightningBase aced the Blitz testing and did excellently on the LoadStorm test. There was zero downtime monitored. LightningBase belongs in the top tier of WordPress hosting companies and is delivering amazing value on top of their stellar performance benchmarks.

Visit Lightning Base