Google PageSpeed Insights, GTmetrix and SpeedFactor are all great tools for identifying and fixing speed-related issues. Each has its advantages and disadvantages and, in my opinion, one is ultimately a better tool than the others. Keep in mind that many different variables go into the results and every tool is different.
Up until last year, GTmetrix was my go-to tool for page speed testing. It offers better information about exactly why your website is slow and shows better “real world numbers”. There was a problem, though, one it shares with Google PageSpeed Insights: I couldn’t package my findings and show them to my clients. Whether working as a freelancer or as an agency, I couldn’t show a before and after unless I manually took screenshots before and after.
A long time ago I used to do this by saving the results and the speed metrics in a spreadsheet, a process that took a lot of time across all my clients. I wanted daily data without having to think about collecting it. Set it and forget it. This is where SpeedFactor comes in: using the Google Lighthouse API and Simulated User Monitoring, it stores a ton of metrics and displays them in pretty charts. It also allows chart annotations, so I can show my clients when a specific fix was deployed or a specific feature was implemented.
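For anyone who still wants the spreadsheet route, the kind of daily automation described above can be sketched with the public PageSpeed Insights v5 API. This is a minimal sketch using only the standard library; the target site is a placeholder, and fetching and storing the JSON is left out.

```python
# Sketch: build a request URL for the PageSpeed Insights v5 API.
# The endpoint and the "url"/"strategy" parameters are from the
# published API; https://example.com is a placeholder target.
import urllib.parse

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_url(target, strategy="mobile"):
    """Return the API URL that reports Lighthouse results for `target`."""
    query = urllib.parse.urlencode({"url": target, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{query}"

print(build_psi_url("https://example.com"))
```

Fetching the resulting URL (for example with `urllib.request.urlopen`) returns JSON Lighthouse results that a daily cron job could append to a spreadsheet or database.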
A real world number is something you can use to understand how your page actually loads. A generic numerical grade of 1-100 doesn’t offer much insight. Google PageSpeed Insights offers that 1-100 grade for both mobile and desktop and suggests possible fixes.
With Google, you really don’t know what that grade means in practice. GTmetrix offers the same kind of 1-100 score, one from Google PageSpeed and one from YSlow, so you get two grades from two different sources. Those still aren’t what I would consider real world numbers, though. GTmetrix does, however, provide three metrics that I consider “real world numbers”: fully loaded time, total page size and the number of requests.
That’s exactly what I care about. I need to see the number of requests going down (or up) after deploying a fix or optimizing a certain area of the site.
I also want to see the total size of the page after I’ve removed that huge, full-HD image from the homepage.
SpeedFactor goes into even more detail: it lists all internal and external scripts, stylesheets and images and creates a comparison chart. Moreover, it saves everything and displays the daily evolution. I found this pretty useful with a client who kept adding tracking scripts without realizing the load impact they had on the site. Their explanation was “we are using Google Tag Manager so everything is asynchronous”. It is and it isn’t. Those scripts still need to load, and everything asynchronous still impacts the Time to First Byte (a metric SpeedFactor stores daily).
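Time to First Byte is simple enough to measure yourself. Here is a rough sketch using only the Python standard library: it times how long it takes from sending a request until the first bytes of the response arrive. The throwaway local server exists only to make the sketch self-contained; in practice you would point `measure_ttfb` at your own site’s URL.

```python
# Sketch: measure Time to First Byte (TTFB) with the standard library.
# The local test server is a stand-in for a real website.
import http.server
import threading
import time
import urllib.request

def measure_ttfb(url):
    """Seconds from sending the request until the response headers
    (the first bytes of the reply) have arrived."""
    start = time.perf_counter()
    resp = urllib.request.urlopen(url)
    ttfb = time.perf_counter() - start  # headers received here
    resp.read()
    resp.close()
    return ttfb

if __name__ == "__main__":
    # Throwaway local server so the example runs anywhere.
    server = http.server.HTTPServer(
        ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    print(f"TTFB: {measure_ttfb(f'http://127.0.0.1:{server.server_port}/') * 1000:.1f} ms")
    server.shutdown()
```

Against a real site, this number includes DNS lookup, connection setup and the server’s own processing time, which is exactly why piling on scripts can push it up.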
Fully loaded time is the time it takes to fully load your website: all the images, all the resources. This number is significant because, at the end of the day, it answers the question that brings you to these tools in the first place. Is your website slow? This number is the “speed”.
This metric deserves more focus than an arbitrary 1-100 number. I’ve seen websites with poor 1-100 grades load fast and websites with great 1-100 numbers load slowly, which is why focusing on the time in seconds is infinitely more helpful than a grade.
Total page size is the total size of your page after it’s fully loaded. If you have a 12 MB image on your website, it will be counted in the total. This metric factors in everything: images, stylesheets, scripts and even external resources such as Google Analytics or Hotjar. The value is shown in megabytes, which helps you understand how much data mobile users have to spend to access your website.
Requests are the total number of files your website needs in order to load. One request can be one image or one script. If this number is 40, your browser had to download 40 items to fully load your website, and the sizes of those 40 items add up to your total page size.
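To make the request count concrete, here is a rough, standard-library-only sketch of where the number comes from: each image, external script and stylesheet referenced in the HTML is one more file the browser must fetch. Real tools also count fonts, XHR calls and resources added by JavaScript, so treat this as an illustration, not a measurement.

```python
# Sketch: count the requests implied by a page's HTML.
from html.parser import HTMLParser

class RequestCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.requests = 1  # the HTML document itself is the first request

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.requests += 1          # one request per image
        elif tag == "script" and "src" in attrs:
            self.requests += 1          # one request per external script
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.requests += 1          # one request per stylesheet

sample = """
<html><head>
<link rel="stylesheet" href="style.css">
<script src="app.js"></script>
</head><body>
<img src="hero.jpg"><img src="logo.png">
</body></html>
"""
counter = RequestCounter()
counter.feed(sample)
print(counter.requests)  # 1 document + 1 css + 1 js + 2 images = 5
```

Every line you trim from a list like this is one fewer round trip the browser has to make.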
These numbers are still open to interpretation. According to GTmetrix, a 6-second load time and a 2 MB page are above average. If you know from analytics that your user base is mostly mobile, this could be a big problem and could increase your bounce rate. The information shouldn’t be taken as is; it should be applied to your website’s specific needs. If you know from analytics that your user base is entirely desktop users in a city with multiple high-speed internet options, these numbers aren’t nearly as detrimental. That doesn’t mean they shouldn’t be improved.
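A quick back-of-the-envelope calculation shows why the same 2 MB page can be fine for one audience and painful for another. The connection speeds below are illustrative assumptions (roughly the throttling profiles used by browser dev tools), not measurements:

```python
# Sketch: how long does 2 MB of page weight alone take to transfer?
# Speeds in Mbps are illustrative assumptions.
PAGE_SIZE_MB = 2.0

for label, mbps in [("slow 3G", 0.4), ("fast 3G", 1.6), ("broadband", 50.0)]:
    seconds = PAGE_SIZE_MB * 8 / mbps   # MB -> megabits, then divide by rate
    print(f"{label}: {seconds:.1f} s")
# slow 3G: 40.0 s
# fast 3G: 10.0 s
# broadband: 0.3 s
```

The transfer time alone, before any rendering or script execution, is already seconds on mobile connections, which is exactly why a mostly-mobile audience changes how you read these metrics.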
Follow the recommendations from Google, GTmetrix and SpeedFactor and you will see the numbers drop, in a good way. There are too many approaches to cover them all, but as an example, to enable caching, a big factor in page speed, you could configure your hosting server or use a WordPress plugin. Even then, how you do it on the server depends on what type of server you are running.
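As one concrete illustration of the server-side route: on an Apache server (a common setup for WordPress hosting), browser caching can be enabled with an `.htaccess` fragment like the one below. This is a sketch of one of the many approaches mentioned above; it assumes the `mod_expires` module is enabled, and the lifetimes shown are examples, not recommendations.

```apache
# .htaccess sketch: tell browsers how long they may cache each file type.
# Requires mod_expires to be enabled on the server.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/png "access plus 1 year"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```

On nginx the equivalent is an `expires` directive in the site config, and a WordPress caching plugin typically writes rules like these for you.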
I have completely stopped using Google PageSpeed Insights. I rarely use GTmetrix. I use SpeedFactor daily and improve it on a weekly basis. SpeedFactor has several useful tools, such as security testing and reporting, a basic SEOFactor section and lots of DNS tools, all of which feed data into your reports. SpeedFactor also has an uptime monitor and a weekly report.