Google published "Is the web going faster?" with an impressive piece of information:
"While access from desktop is only a bit faster, it is still impressive given that the size of web pages has increased by over 56% during this period. It's great to see access from mobile is around 30% faster compared to last year."
So now you know why web performance is important: how can you render your site quickly when its size grows by 56% in one year?
Looking at the +56%, the breakdown is:
- Images: +50%
- HTML: +30%
- JS: +25%
- CSS: +20%
- Flash: stable
- Other (mp4, json, flv, fonts, etc.): +930%
If you also review the presentation "How fast are we going now?" by Steve Souders (Mr. Web Performance), slide 19 shows that a 2s increase in page load time leads to a 4.3% loss of revenue per user. On slide 18, you can also see that people are no longer patient on the web, and the following slides give other meaningful examples. There is also the Shopzilla case (slide 21), where adopting web performance best practices led both to an increase in conversion rate AND a decrease in required IT resources. So web performance is not only about spending more money.
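To make slide 19's figure concrete, here is a trivial back-of-the-envelope calculation; the revenue amount below is purely hypothetical, only the 4.3% rate comes from the presentation:

```python
# Back-of-the-envelope: what a +2s page load could cost per year.
# annual_revenue is a made-up example figure; the 4.3% loss per user
# for a 2s slowdown is the stat quoted from Souders' slide 19.
annual_revenue = 1_000_000  # hypothetical yearly online revenue
loss_rate = 0.043           # 4.3% revenue loss per user for +2s load time

estimated_loss = annual_revenue * loss_rate
print(f"Estimated yearly loss: {estimated_loss:.0f}")
```

Even with modest numbers, 2 seconds of extra load time translates into real money.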
Where does this lead us with our own network and our own apps?
- As web apps/sites tend to get richer and richer while network capacity does not grow as fast, we need to adopt web performance best practices to give our customers a good user experience; especially when apps are used worldwide (think of users in distant countries).
- Focus on the user's perception of page rendering; sometimes just changing the way the page is rendered makes users feel it is faster (even if the total loading time stays the same, simply because content is displayed differently).
- In my experience, there are simple quick wins that take almost no effort; you don't need a contract with a CDN like Akamai on day 1. Maybe later, and only if your app/site is on the public internet (a CDN does not help internal apps on an internal network).
- When the intranet was first released in 2008, I improved its YSlow score by 2 or 3 grades just by setting some server-side configuration (ETags, gzip compression, cache headers). The idea was to force caching on the client side, so that content is not downloaded again on the second and later visits when it has not changed. So yes, your experience with it could be worse :-P
- Putting calls to JS libraries at the bottom of your page will improve its loading time with no other impact.
- Some sites used to suffer a huge load from bots fetching information. Making the process asynchronous and taking it out of the eZ Publish session mechanism reduced the load on our infrastructure: divided by 4 on each of the 6 front-end servers and by 10 on the database side. In the end, even after going from 6 to 4 front-ends we still have overcapacity (we keep the spare servers for the disaster recovery plan if necessary), and we could recycle 2 front-ends for other projects.
- There are other best practices that are easy to apply, like putting your static content (images, CSS, JS, etc.) on a separate subdomain so that the browser can download more files in parallel (did you know that a browser downloads only about 4 files in parallel per domain, by the way?).
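As a concrete illustration of the server-side settings mentioned above (ETags, gzip compression, cache headers), here is a minimal Apache sketch; it assumes mod_deflate and mod_expires are enabled, and the file extensions and cache lifetime are examples, not recommendations:

```apache
# Compress text responses on the fly (mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Simple ETags based on file mtime and size, so browsers can get 304s
FileETag MTime Size

# Let browsers cache static assets for a month (mod_expires)
<FilesMatch "\.(css|js|png|jpg|gif)$">
    ExpiresActive On
    ExpiresDefault "access plus 1 month"
</FilesMatch>
```

A few lines like these are exactly the kind of quick win that costs almost nothing and avoids re-downloading unchanged content on repeat visits.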
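The "JS at the bottom" advice can be sketched like this (file names are hypothetical): the browser parses and displays the content first, so script downloads no longer block the initial rendering:

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example page</title>
  <!-- CSS stays in the head so the page renders styled -->
  <link rel="stylesheet" href="/css/site.css">
</head>
<body>
  <p>Content is parsed and displayed before any script is fetched.</p>

  <!-- JS libraries moved to the bottom, just before </body> -->
  <script src="/js/library.js"></script>
  <script src="/js/app.js"></script>
</body>
</html>
```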
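And for the separate static subdomain, a small sketch (static.example.com is a hypothetical host): serving assets from a second hostname gives the browser another pool of parallel connections on top of the per-domain limit:

```html
<!-- The page itself is served from www.example.com; static assets come
     from a dedicated subdomain, so they download on a second pool of
     parallel connections -->
<link rel="stylesheet" href="http://static.example.com/css/site.css">
<script src="http://static.example.com/js/app.js"></script>
<img src="http://static.example.com/img/logo.png" alt="Logo">
```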
Definitely, and especially for web apps on our internal network, adopting web performance best practices is not optional; you can do it well with almost nothing and, even better, you may save money.