The full company list, product list, methodology, and notes can be found here.
This post focuses only on the results of the testing in the <$25/month price bracket for WordPress Hosting.
1. Other Price Tier Results
2. Hosting Plan Details
3. Load Storm Testing Results
4. Load Impact Testing Results
5. Uptime Monitoring Results
6. WebPageTest.org Results
7. WPPerformanceTester Results
8. Conclusion
Other Price Tier Results
Hosting Plan Details
Click the table below to see the full product specifications sheet.
Load Storm Testing Results
This test scaled from 500 to 2000 users over 30 minutes with 10 minutes at the peak.
| Company | Total Requests | Total Errors | Peak RPS | Average RPS | Peak Response Time (ms) | Average Response Time (ms) | Total Data Transferred (GB) | Peak Throughput (MB/s) | Average Throughput (MB/s) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Incendia Web Works | 294518 | 0 | 222.42 | 163.62 | 4070 | 348 | 21.29 | 17.14 | 11.83 |
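For readers who want to reproduce a similar ramp, the target-user curve of this test is easy to express in code. This is a minimal sketch of the load profile only (500 to 2000 users over 30 minutes, then a 10-minute hold at the peak); Load Storm is a hosted service, so the scheduling below is an assumption, not its actual implementation.

```python
def target_users(minutes_elapsed: float) -> int:
    """Target concurrent users at a point in the 40-minute run:
    linear ramp from 500 to 2000 over the first 30 minutes,
    then a 10-minute hold at the 2000-user peak."""
    start, peak, ramp_minutes = 500, 2000, 30
    if minutes_elapsed <= ramp_minutes:
        return round(start + (peak - start) * minutes_elapsed / ramp_minutes)
    return peak  # holding at peak for the final 10 minutes

# Sample points along the curve
for t in (0, 15, 30, 40):
    print(f"{t:>2} min -> {target_users(t)} users")
```

Halfway through the ramp (15 minutes) the target sits at 1250 users, and anything after minute 30 holds at the 2000-user peak.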
GreenGeeks, Incendia Web Works, LightningBase, and SiteGround [Reviews] all handled this test without issue. A2 Hosting [Reviews], Nexcess, and Pressable had some minor issues: A2 Hosting and Nexcess each had a couple of minor spikes, while Pressable had a strange issue with Load Storm's Tokyo data center that occurred every time, on every plan tested. Nestify is interesting: they never failed outright and had the second-fewest errors of any plan, but they definitely started slowing down under load, even though the server stayed up.
1&1 Internet [Reviews] seemed to be running into a security-related issue: they weren't able to turn off their DDoS proxy for a single site. Even so, they kept a relatively constant response time and error rate, which looks pretty decent given the security issue.
BlueHost [Reviews] Cloud and Shared both ran into security measures we couldn't work around. Theirs were perhaps the most bizarre graphs I have seen in all my years of testing, so it's hard to say much based on them.
Load Impact Testing Results
This test scaled from 1 to 1000 users over 15 minutes. Error count capped at 5,000.
| Company | Requests | Errors | Data Transferred (GB) | Peak Average Load Time (Seconds) | Peak Average Bandwidth (Mbps) | Peak Average Requests/Sec |
| --- | --- | --- | --- | --- | --- | --- |
| Incendia Web Works | 339517 | 0 | 17.06 | 0.414 | 253.9 | 617 |
GreenGeeks had a bit of a spike and started to increase in response time towards the end of the test.
IO Zoom started to lag slightly towards the end of the test, doubling in response time, but didn't error once.
Nexcess started to show a little load towards the end but managed to not error at all and stayed below 800ms response time.
Uptime Monitoring Results
| Company | UptimeRobot (%) | StatusCake (%) |
| --- | --- | --- |
| Incendia Web Works | 99.99 | 99.97 |
Most companies maintained above the 99.9% threshold on both uptime monitors.
A2 Hosting had some issues with UptimeRobot showing 99.95% and StatusCake 98.96%.
IO Zoom had a major downtime issue, with UptimeRobot showing 95.18% and StatusCake 95.39%. The cause was a server upgrade during which my account wasn't moved to the new PHP handler. Beyond that event everything was fine, but it produced the low numbers reported here.
Nestify had some issues with UptimeRobot showing 99.83% and StatusCake 99.92%.
Nexcess had a strange phenomenon where UptimeRobot recorded 100% while StatusCake showed 99.72%. StatusCake logged multiple relatively short downtimes across two days in August, which caused the low number. UptimeRobot, which was checking far more frequently, didn't record any of these issues, so I suspect it was more an issue with StatusCake being able to connect than with Nexcess actually being down.
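Both monitors report uptime as the share of successful probes, so check frequency changes how much a short outage moves the number. Here is a quick sketch of that arithmetic; the probe counts and intervals are illustrative assumptions, not either monitor's actual internals.

```python
def uptime_percent(checks: list[bool]) -> float:
    """Uptime as the percentage of successful checks,
    rounded to two decimals the way monitors report it."""
    return round(100.0 * sum(checks) / len(checks), 2)

# A monitor probing once a minute makes 43,200 checks in a 30-day month,
# so 99.9% uptime allows roughly 43 minutes of failed checks.
minute_checks = [True] * 43157 + [False] * 43
print(uptime_percent(minute_checks))

# The same ~43-minute outage seen by a 5-minute-interval monitor
# (8,640 checks, about 8 of them failing) reads slightly differently.
coarse_checks = [True] * 8632 + [False] * 8
print(uptime_percent(coarse_checks))
```

The two hypothetical monitors land on 99.9% and 99.91% for the same outage, which is why small disagreements between UptimeRobot and StatusCake aren't alarming on their own.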
WebPageTest.org Results

| Company | WPT Dulles | WPT Denver | WPT LA | WPT London | WPT Frankfurt | WPT Mauritius | WPT Singapore | WPT Mumbai | WPT Japan | WPT Sydney | WPT Brazil |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Incendia Web Works | 0.366 | 1.21 | 1.094 | 0.791 | 0.894 | 1.988 | 2.243 | 2.432 | 1.713 | 2.081 | 1.207 |
There isn't a whole lot to say since this metric doesn't impact the results. Nobody had huge issues.
One interesting comparison was the average response time against 2016's benchmarks in the 9 locations that were the same. The best improvements were Dulles, going from an average of 0.809 to 0.503, and Frankfurt, going from 1.698 to 1.270. The worst change by a wide margin was LA, going from 0.72 to 1.404. In aggregate, 2018's results were 0.471 seconds slower across those same 9 locations than 2016's. It could just be the testing location that day; I don't use these as an impacting measure for a reason.
WPPerformanceTester Results

| Company | PHP Bench | WP Bench |
| --- | --- | --- |
| Incendia Web Works | 6.63 | 749.6251874 |
Incendia Web Works led the way with the fastest PHP Bench score. Kickassd had the fastest WP Bench.
Compared in aggregate to 2016's results, we are seeing higher speeds on average. 2016 had an average PHP Bench of 11.573 while this year's average was 11.281 (lower is faster). WP Bench averaged 684 in 2016 versus 877 this year (higher is faster).
One thing to note is that WP bench scores can vary tremendously based on the database architecture. You'll often see faster scores on architectures running the database locally compared to ones that run separately or have redundancy built in.
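The effect shows up with back-of-the-envelope arithmetic: if a benchmark runs queries serially and each one pays its execution time plus a network round trip, even a fraction of a millisecond of added latency cuts the queries-per-second score noticeably. The timings below are illustrative assumptions, not measurements from any host tested.

```python
def queries_per_second(query_ms: float, rtt_ms: float) -> float:
    """Rough QPS estimate for a serial benchmark where every query
    pays its execution time plus one network round trip."""
    return round(1000.0 / (query_ms + rtt_ms), 1)

# Illustrative timings (assumptions): same 1 ms query, different DB placement.
local_socket = queries_per_second(query_ms=1.0, rtt_ms=0.05)  # DB on the same box
remote_db = queries_per_second(query_ms=1.0, rtt_ms=0.80)     # separate DB server
print(local_socket, remote_db)
```

Under these assumed numbers the local-socket setup scores roughly 952 queries/sec versus about 556 for the remote database, even though the database itself is doing identical work, which is why WP Bench scores say more about architecture than raw database speed.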
Conclusion

A2 Hosting [Reviews] had a borderline uptime performance that kept it out of the top tier. GreenGeeks had some response time issues on the Load Impact test, where it started to slow down. Nexcess had a strange uptime monitor discrepancy and was borderline on the load tests but never errored out.
Individual Host Analysis
1&1 Internet [Reviews] was a new participant this year, and they were interesting. While they didn't win any awards, their tests ran into issues because of a DDoS proxy they couldn't turn off at the account level. I wish we could have gotten a clean test through to see how they really stacked up. 1&1 kept a solid uptime above 99.97%.
34SP.com was another new participant, based in the UK. They were by far the fastest to connect to London and Frankfurt on WebPageTest, which is nice to see. On the load testing front, 34SP just couldn't quite handle the stress of so many visitors. They did manage to keep a good uptime during testing though, above 99.96%.
A2 Hosting [Reviews] earned an honorable mention this year. Their performance was quite good but there were a few issues. Their uptime was borderline with one monitor showing below 99.9% but the average of the two monitors above 99.9%. The LoadStorm test had a minor spike issue but nothing serious. LoadImpact was good with no errors. It was a huge improvement over last year where they failed the load tests. Nice to see such a big improvement.
BlueHost Cloud + BlueHost Shared
BlueHost [Reviews] Cloud and Shared were grouped together because I was hoping to see a meaningful difference in performance between the two offerings, but it wasn't clear that either was the obvious better choice. The Shared plan outperformed in Load Impact while Cloud outperformed in Load Storm. They had similar uptimes, both averaging 99.97%, which was perhaps the highlight of their results. We were unable to work around some security measures in their network to get clean tests through, which led to some of the strangest looking Load Storm tests I've ever seen.
FlyWheel [Reviews], while still maintaining the best score on our review site (81% at the time of writing), continues to underperform on the performance tests. It's a reminder that peak performance isn't everything for everyone. They didn't do well in either load test but managed a perfect 100% uptime on both monitors.
GreenGeeks is another new entrant who put up a stellar first performance, earning themselves an honorable mention. The only test they had any issue with was the Load Impact test, where they had a spike and showed slightly increased response times towards the end. Very happy to have another strong offering in the mix for people to choose from.
Incendia Web Works
Incendia Web Works had some uptime issues last year which marred their results. This year, no such thing happened. IWW continued to have great load testing results and earned itself Top Tier status. They had flawless load test results, being the only company with 0 errors on both Load Storm and Load Impact. It's nice to see companies improve their consistency and earn a higher ranking.
IO Zoom was another new entrant this year. The uptime issue was really unfortunate, and I hope their upgrade process is now better equipped to make sure things like that don't happen in the future. They had a good Load Impact test with just a minor increase in response times. Unfortunately, IO Zoom did poorly on the Load Storm test. These tests are difficult, and companies sometimes take years to pass them; I hope IO Zoom comes back next year swinging with better results.
Kickassd was yet another new entrant who impressed in some tests. Kickassd had perfect uptime at 100%. They also had a perfect Load Impact test with a flat response time and zero errors. Unfortunately, the Load Storm test was a bit much for the server and it didn't do as well. Overall, I'm really impressed with a new entrant who can manage to do well on so many of the tests. I also know a lot of improvements were deployed customer-wide because of the testing. Next year I hope to see those improvements earn Kickassd an award.
LightningBase. Another year, another near perfect run. 100% uptime. Great load tests. This is the 4th year in a row LightningBase has earned top tier recognition. It seems almost expected at this point.
Nestify was a new entrant this year. Their Load Storm test was a decent start: it definitely started to slow down, but they had the second-fewest errors of any company. Their Load Impact test went similarly, with increased response times and some errors, but not an overwhelming number. Nestify's uptime left something to be desired, averaging below 99.9%; according to them, this was caused by a block storage maintenance/failure at Vultr. During testing we realized one potentially limiting factor was the server's 100Mbps connection being saturated during the load tests. They have the makings of a very solid competitor if they can figure out a few issues.
Nexcess returned to the tests this year after not participating last year. They brought in the person responsible for A Small Orange's WordPress stack, which did so well in early versions of these benchmarks. The result was Nexcess earning honorable mention status. Their results were pretty good in all the meaningful tests. Their uptime was a bit strange, with one monitor showing 100% and the other 99.72%. The Load Storm test had a couple of spikes but nothing major. Load Impact showed some increased response time but stayed under 800ms, which is quite good. Glad to see their return, and they're making a mark at the entry level.
Pressable's uptime was 99.9%, which pretty much sums up their results as a whole: another Top Tier award for Pressable. The only strange issue, which happened consistently on all their plans, was some errors from Load Storm's Tokyo data center. It really didn't have much impact, with an error rate of 0.15%. Another great year for Pressable.
Another Top Tier award for SiteGround [Reviews]. They had the second-fewest cumulative errors at 18 across the Load Storm and Load Impact tests (14 + 4). Combined with a 99.99% uptime, they continue to look great.