Posted & filed under Web Development.



“You can't manage what you don't measure,” as the old adage goes. For simplicity’s sake, I’ll use page load time as our main performance metric:


Page load time = the time in seconds to load and render a page


For “site performance” we could use the average of individual page load times, either across all pages or across the most visited pages only. The home page and the most visited pages shape the user’s overall perception of site speed, so they deserve special attention.
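The visit-weighted average described above can be sketched in a few lines. The page paths, load times and visit counts below are made-up illustrative data, not measurements:

```python
def site_load_time(pages):
    """Visit-weighted average load time (seconds) across pages.

    `pages` maps a path to (avg_load_seconds, visits). Weighting by
    visits reflects that the home page and other heavily visited
    pages dominate the perceived speed of the whole site.
    """
    total_visits = sum(visits for _, visits in pages.values())
    return sum(t * visits for t, visits in pages.values()) / total_visits

pages = {
    "/":        (1.4, 9000),  # home page: fast and most visited
    "/about":   (2.1, 500),
    "/gallery": (4.8, 500),   # heavy page, but rarely visited
}

print(round(site_load_time(pages), 3))  # 1.605
```

Note how the slow gallery page barely moves the average: a plain (unweighted) mean of the three pages would be 2.77 seconds, but weighting by traffic gives 1.6 seconds, much closer to what a typical visitor sees.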




  • Should we care about web page speed?

  • Does this metric have a significant business value?

  • Do we care about performance?


If we care about performance:


  • We talk about it

  • We sell it to the client as a package option, a competitive advantage, or a project requirement

  • We have a wide range of performance metrics in place, and automated tests for QA

  • We measure it before, during and after development

  • We have performance goals, and shared best practices to meet these goals

  • We design and develop with performance in mind


When we design and code, do we think:


  • How big will the page be?

  • Are images optimized?

  • How fast will it load on a desktop and a mobile device?

  • Are we going to have high traffic campaigns for this site?

  • What is the range of acceptable performance agreed with the client (SLA)?

  • Is this new feature really worth the performance cost?


Website performance: people do care


There is, of course, huge interest in performance in the IT and web development community:


In our world of blazing fast high-speed internet connections, many developers are throwing site load time by the wayside as they focus on creating graphically rich website experiences. Does site size and load time still have any importance? Do we still need to resize our images, and worry about every kilobyte here and there?

Website owners are obsessed with their rankings, traffic quality, conversion rates, reputation management and the overall well-being of their online presence. It is quite frustrating how often website availability is overlooked.


Back in 1999 the acceptable load time for a site was 8 seconds. It decreased to 4 seconds in 2004, and 2 seconds in 2009. These figures are based on studies of online shopper behavior. Our expectations already exceed the 2-second rule, and we want it faster.


“There’s something called responsive web design and while it helps address the user experience aspect – the usability of the page by making it look like a mobile website – it actually hardly ever reduces the byte count”


Gomez’s own studies [Gomez is one of the biggest players in website testing] reveal the lack of visitor loyalty. By analyzing page abandonment data across more than 150 websites and 150 million page views, Gomez found that an increase in page response time from 2 to 10 seconds increased page abandonment rates by 38%.


There is a strong correlation between page load time and human behavior. If users cannot engage quickly with sites and apps, their user experience suffers and they will take their business elsewhere. While the focus on the back end (server) is on stability and scalability, 80 to 90 percent of page load time is spent on the front end (browser).


Some stats as food for thought:


  • 47% of web users expect a page load of 2 seconds or less

  • 14% of users will start shopping at a different site if a page loads too slowly

  • 40% of users will abandon a website that takes more than 3 seconds to load

  • 64% of shoppers who are dissatisfied with their site visit will go somewhere else to shop next time

  • 52% of online shoppers claim that quick page loads are important for their loyalty to a site


10 reasons why we should care


1) A fast page is an act of love (and skills)

A slow page that does not follow any of the standard best practices will give the user (especially a technically proficient user) a bad impression of the company beyond that page. Our websites are out there for everyone to see and test. They are a keyhole view into our inner processes: design, backend and frontend development, hosting and monitoring. We should assume that a potential client will check our past work and notice any obvious mistake or bad implementation. If we make the page as fast as technically possible (within reason), we show that we care about the user’s time and the client’s money.


2) Fast content is better content

Content will look better if it loads fast.


A 2004 study by Skadberg & Kimmel showed that speed affects people’s evaluation of both the attractiveness and the content of a website. In other words, if your website is slow, people will actually like your content less, even though the quality of the content is the same whether your site is fast or slow.


3) Tell me how fast you are and I will tell you who you are

As shallow as it may sound, a quick, performant website affects brand perception. A slow website will negatively impact a marketing campaign.


4) Higher search rankings

Google uses page load time as one of the factors in calculating website rankings.


Advanced SEOs know that improved user engagement helps improve website rankings (Google has a lot of tools at its disposal to track this, and it does). So even if Google did not measure site speed directly in its ranking algorithm, improving site speed would still help achieve better rankings.

The fact that Google does look at site speed as a ranking factor means there is a double impact when you improve it: you make Google happy with a faster site, and you improve user engagement.


5) Loss of revenue

The Gomez Peak Time Internet Usage Study, conducted by Equation Research on 1,500 consumers, confirms the negative impact of poor performance:

  • At peak traffic times, more than 75% of online consumers left for a competitor’s site rather than suffer delays

  • 88% of online consumers are less likely to return to a site after a bad experience

  • Almost half expressed a less positive perception of the company overall after a single bad experience

  • More than a third told others about their disappointing experience


6) Mobile is not just responsive design

Slow, heavy pages cost even more money and time for mobile users, who pay for data bandwidth. Saying that a site is responsive and therefore “mobile optimized” is not entirely true.


HTTP Archive suggests the average web page when browsing on a mobile device – accounting for the fact that some websites have mobile-optimized pages and others don’t – is about 720 kilobytes.

“Anything that is over the 700 kilobytes or 800 kilobytes range I would mark as too heavy for mobile.”


7) A penny saved is a penny earned

Someone has to pay for the traffic. A smaller page size for a high-traffic site can make a significant difference.

8) Make the server shine

Bandwidth is often a backend bottleneck. A page optimized for speed also usually means fewer requests to the server and less CPU usage, so smaller, cheaper servers can be used.


9) Fear not Stephen Fry

The so-called Fry Effect is the sudden, massive increase in traffic following a tweet or Facebook post by a “celebrity” with a large number of followers. Too often, charity sites endorsed by Stephen Fry in a single tweet have crashed under the weight of thousands of pageviews. An optimized home page can significantly reduce the risk of downtime at the worst possible moment.


10) Don’t follow the crowd, lead them

The “everybody-talks-about-it-I-want-it” effect. Think about the “responsive website” selling point. What if we could sell a site as “optimized for speed”, using standard performance scores like Google PageSpeed or YSlow as a competitive advantage? We could say the site is among the top 10% fastest sites on the web, or that our home page scores 95+ in the Pingdom test.


Love means never having to say “Sorry, I wasn’t monitoring you”


If we care, we need to monitor performance throughout the development cycle, and use the data to change our coding process. If performance is not integrated into design, development and maintenance, we don’t have a feedback loop in place: we only fix issues when they occur and never really improve the process.

Having an automated page speed test in place would help with troubleshooting and QA as well.


How do we measure page load time?

The first option is “synthetic” tests: QA testing the page in their browsers, online test tools, software running locally, etc. They are called synthetic because they run in a controlled, “sanitized” environment. They are useful for getting a reliable baseline of our website’s performance: for debugging, regression testing, etc. They are not, though, very representative of how real users actually experience our site out in the wild. Too many variables affect page load time, factors we can’t fully control or test for: network speed, internet congestion, device, browser, OS, etc.
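The repeatable-baseline idea behind synthetic testing can be sketched as a small timing harness. This is a simplified illustration: `action` here is any zero-argument callable, whereas a real synthetic test would load and render a page, for example via a headless browser:

```python
import time

def timed(action, runs=5):
    """Crude synthetic timing: run `action` several times and report
    the minimum and median wall-clock durations.

    The minimum is the most repeatable "baseline" figure (least
    affected by noise); the median is closer to a typical run.
    """
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        action()
        samples.append(time.perf_counter() - start)
    samples.sort()
    return {"min": samples[0], "median": samples[runs // 2]}

result = timed(lambda: sum(range(100_000)), runs=5)
```

Repeating the run and reporting robust statistics, rather than a single measurement, is what makes a synthetic baseline useful for regression testing.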


This is why we also need RUM tools, Real User Monitoring (or Measurement), to fill the gap and have a clear, complete picture of how fast our site really is.




Real User Monitoring (RUM) details the time it takes for your users' browsers to load your webpages, where those users come from, and what browsers they use. It provides valuable insight into what actual users experience on your site.


The issue with only showing synthetic data is that it typically makes a website appear much faster than it actually is. My rule-of-thumb is that your real users are experiencing page load times that are twice as long as their corresponding synthetic measurements.


RUM data, by definition, is from real users. It is the ground truth for what users are experiencing. Synthetic data, even when generated using real browsers over a real network, can never match the diversity of performance variables that exist in the real world: browsers, mobile devices, geo locations, network conditions, user accounts, page view flow, etc.


Google Analytics RUM

Page speed data are included in Google Analytics by default. Analytics restricts Site Speed collection to about 1% of visitors, which means the data are not very reliable for low-traffic sites.


Pingdom RUM

Pingdom’s RUM tool computes several statistics on the full set of user data, with no sampling. It also offers real-time stats.



Apdex defines a standard method for reporting and comparing the performance of software applications in computing. Its purpose is to convert measurements into insights about user satisfaction, by specifying a uniform way to analyze and report on the degree to which measured performance meets user expectations.


The Apdex method converts many measurements into one number on a uniform scale of 0-to-1 (0 = no users satisfied, 1 = all users satisfied). The resulting Apdex score is a numerical measure of user satisfaction with the performance of enterprise applications.
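The Apdex calculation is simple enough to sketch directly. With target threshold T, samples at or below T count as satisfied, samples up to 4T count as tolerating (worth half), and the rest as frustrated; the load-time samples below are illustrative:

```python
def apdex(load_times, threshold):
    """Standard Apdex formula: (satisfied + tolerating / 2) / total.

    A sample is 'satisfied' if <= T, 'tolerating' if <= 4 * T,
    and 'frustrated' otherwise.
    """
    satisfied = sum(1 for t in load_times if t <= threshold)
    tolerating = sum(1 for t in load_times if threshold < t <= 4 * threshold)
    return (satisfied + tolerating / 2) / len(load_times)

# With T = 2 s: three satisfied, one tolerating (5.0 s <= 8 s),
# one frustrated (9.5 s) -> (3 + 0.5) / 5 = 0.7
print(apdex([0.8, 1.5, 1.9, 5.0, 9.5], threshold=2.0))  # 0.7
```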



Deconstructing page load time


The first rule of performance optimization is:


80% of page load time is spent in the browser


This includes the time to download resources from the server, render the page, and process the scripts.

Generating the HTML page in the backend is usually very fast. So, from a pure load-time point of view, it makes sense to focus on the front end: browser caching, page size, and design improvements.


Page load time is mostly affected by:


  • Page size

  • Number of requests

  • Number of Javascript and CSS files

  • Complexity of the Javascript and CSS code


Performance score


Developers have distilled a standard set of best practices to help create faster websites.

Online tools rate a webpage based on how correctly and completely it implements some of these rules. Scores will differ between tools, since each gives different weights to different rules.

Most used:


  • Pingdom

  • Google PageSpeed

  • YSlow


For example, Google suggests a set of best practices grouped into six categories that cover different aspects of page load optimization:


  • Optimizing caching: keeping your application's data and logic off the network altogether

  • Minimizing round-trip times: reducing the number of serial request-response cycles

  • Minimizing request overhead: reducing upload size

  • Minimizing payload size: reducing the size of responses, downloads, and cached pages

  • Optimizing browser rendering: improving the browser's layout of a page

  • Optimizing for mobile: tuning a site for the characteristics of mobile networks and mobile devices
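As one concrete taste of the “minimizing payload size” category, text responses compress remarkably well. The toy payload below stands in for an HTML response; real pages are more varied, but markup is highly repetitive and typically shrinks dramatically under gzip:

```python
import gzip

# Toy payload standing in for an HTML response (illustrative only).
html = (b"<html><body>"
        + b"<p>Lorem ipsum dolor sit amet, consectetur.</p>" * 200
        + b"</body></html>")

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
print(f"{len(html)} -> {len(compressed)} bytes ({ratio:.0%})")
```

In practice the server does this transparently (e.g. gzip or Brotli negotiated via the Accept-Encoding header); the point is that enabling it cuts the bytes on the wire for HTML, CSS and JavaScript by a large factor at almost no cost.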


The home page is probably the most important page: it is the most visited and the one hit by traffic peaks (campaigns, tweets, etc.). It should be the fastest and lightest page.



[Chart: load time]
Design for speed

“Optimized for speed” is a selling point and a development framework.

Implement Google best practices

Evaluate sites with Google PageSpeed and implement Google rules.


Use Pingdom RUM and Google Analytics.


Evaluation of page performance and score becomes part of the QA process.

Automate tests

Setup automated page load tests in development and production.

Optimize homepage for speed

The homepage is one of the most visited pages and will significantly impact overall server performance, bandwidth usage and availability.

Target: home page load time < 2 seconds

Page size

Target: mobile-optimized sites < 1 MB

Target: desktop-optimized sites < 2 MB
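The targets above can be wired into the automated tests as a performance budget. This is a sketch: the metric names are invented for illustration, and in a real setup the measurements would come from the automated page load tests run after each deploy, failing the build on any violation:

```python
# Budget taken from the targets above; metric names are hypothetical.
BUDGET = {"home_load_s": 2.0, "mobile_page_kb": 1024, "desktop_page_kb": 2048}

def budget_violations(measurement, budget=BUDGET):
    """Return human-readable budget violations; an empty list means
    the measured page is within budget and the check passes."""
    return [
        f"{name}: {measurement[name]} exceeds {limit}"
        for name, limit in budget.items()
        if measurement.get(name, 0) > limit
    ]

print(budget_violations({"home_load_s": 1.6,
                         "mobile_page_kb": 900,
                         "desktop_page_kb": 1500}))  # []
```

A budget turns “we care about performance” into an enforceable rule: a feature that pushes the page over the limit is caught before it ships, not after users complain.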



Sources:

The Case for Speed: How Site Load Time has an Effect on your Readership


Poor Website Loading Times Irritate People, Matter to Search Engines, Plus Can Cost you Money


Bloated web pages costly for smartphone users


Effect of Website Speed on Users


Comparing RUM & Synthetic Page Load Times


Why Website Speed And Page Load Times Are So Important To Your Online Readers
