The Problem

External javascript URLs placed in <head> are blocking: they will load and execute in the order they are specified. This is typically a good thing. For example, you might reference the jQuery library and then a jQuery plugin, and you need them to load in sequence. The external code is also loaded before any inline javascript in <body>, so that doesn’t fall over when it assumes jQuery is available.

This ease of development comes at a price: longer page load times for visitors to your site. Consider the example where you load jQuery and a jQuery plugin in <head>. The browser will, in sequence:

  1. load the HTML source
  2. request /js/jquery.js
  3. execute the jQuery source code
  4. request /js/jquery.plugin.js
  5. execute the plugin source code
  6. render the HTML page

Clearly, the 4 extra steps between loading the HTML and rendering it will increase page loading time. The significance of this time depends on the latency between the client and the server, and the hardware of the client (i.e. how fast their machine will execute the retrieved js code).
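
Concretely, the <head> markup behind this sequence is just two blocking script tags (using the file names from the list above):

<head>
  <script src="/js/jquery.js"></script>
  <script src="/js/jquery.plugin.js"></script>
</head>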

Caching, Aggregation, Compression, CDNs: a Solution?

Caching, aggregation and CDNs ease this burden. Referencing JS files with your app deployment version, like /js/jquery.js?v=12345, means you can serve them with headers that encourage the client to cache them heavily. In this example, serving jQuery from the Google CDN would give a low-latency request and a higher probability that the file is already in the client’s cache. Aggregating the two files into one, and serving just /js/jquery+plugin.js, removes a whole HTTP request. Compressing the javascript using tools like YUI Compressor or Google Closure can reduce its size for transmission, and potentially simplify the code itself.
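
As a rough sketch of those techniques (the version value, bundle name and jQuery CDN version below are illustrative, not prescriptive):

<!-- option 1: aggregate the library and plugin into one minified, version-stamped file -->
<script src="/js/jquery+plugin.js?v=12345"></script>

<!-- option 2: pull jQuery from the Google CDN and version-stamp only your own code -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
<script src="/js/jquery.plugin.js?v=12345"></script>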

Caching can remove the HTTP request latency on secondary page views: the first time a visitor hits your site the JS will have to be retrieved, but it will be picked up from cache on subsequent visits. Aggregation can reduce the number and size of HTTP requests required on first time visits. However, neither can remove the time taken to execute the javascript, which can be significant with complex js libraries on modest client hardware.

All these techniques will improve your page loading times. If you’re not doing them already, you should be: they are not difficult to implement and do not affect the way you currently develop your application.

Asynchronous Javascript: a Better Solution

If we think back to the sequential execution list of a browser, the absolute fastest way we can get a page to display is:

  1. load the HTML source
  2. render the HTML page

This is what loading javascript asynchronously does. Display the page as fast as possible, at all costs! That “cost” is making the javascript load in a parallel (non-blocking) way. When the page has loaded and been displayed to the client, the javascript may, or may not, be ready.

This undoubtedly makes the page load very fast. It no longer waits for the javascript, and blocks only on any external CSS files (which of course you’ve cached, compressed and aggregated). It also means that any javascript effects catch up with, rather than always precede, the page load. This changes the way you code: javascript now truly does progressively enhance your page, since the enhancements are applied independently of page load. Someone with a slow connection may start using the page before your javascript effect is applied.

An Example of Asynchronous Javascript

Consider focusing a search box on page load: obviously it is OTT to load jQuery for this simple task (and yes, HTML5 has autofocus for exactly this), but it succinctly demonstrates the concept.

<html>
<head>
  <script src="/js/jquery.js"></script>
</head>
<body>
  <form>
    <input id="search" type="search" name="query">
  </form>
  <script>
    $(document).ready(function(){
      $('#search').focus();
    });
  </script>
</body>
</html>

The jQuery inclusion in <head> blocks the page load: jQuery is retrieved, parsed and executed before the page is rendered. In the <body>, jQuery is immediately available, and we use the jQuery ready function to focus the search box once the DOM has loaded. Now consider a typical async javascript version:

<html>
<head>
  <script src="/js/asyncJs.js"></script>
</head>
<body>
  <form>
    <input id="search" type="search" name="query">
  </form>
  <script>
    asyncJs.load('/js/jquery.js', function(){
      $(document).ready(function(){
        $('#search').focus();
      });
    });
  </script>
</body>
</html>

In a chicken-and-egg situation we need a bootstrap script to provide helpers that actually load the heavier scripts, but this bootstrap is usually tiny. The page blocks on it, but page load is then independent of the jQuery load. Note that we load the jQuery script with a callback which fires once jQuery is available. This may be before or after the DOM is ready, so we still need the jQuery ready wrapper. The search box will receive focus at some point after the DOM is ready, but this may be after page load.

In this example we derive very little benefit, but page load time will scale well with added complexity: if you add 5 jQuery plugins the speedup will be significant. Typical async js libraries will be able to do the following (a minimal sketch of such a loader follows the list):

  • load scripts from a URL asynchronously
  • fire callback functions once a script has loaded
  • specify a dependency map so some scripts are loaded before others (e.g. a jQuery plugin depends on jQuery itself)
  • provide an event for DOM ready, which is harder for libraries to detect when operating asynchronously (it may have fired before the library loads)
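
To make the bootstrap less magical, here is a minimal sketch of what a loader like the hypothetical asyncJs above might do under the hood: inject a <script> element and fire a callback once it has loaded. This is an illustration only; real libraries add dependency maps, DOM ready detection and the browser-quirk workarounds mentioned below.

var asyncJs = {
  load: function (url, callback) {
    // create a script element and let it download without blocking the page
    var script = document.createElement('script');
    script.src = url;
    script.async = true;
    // fire the callback once the script has loaded and executed
    // (old IE needed onreadystatechange instead of onload)
    script.onload = function () {
      if (callback) { callback(); }
    };
    document.getElementsByTagName('head')[0].appendChild(script);
  }
};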

Asynchronous Javascript: the Tools

I started writing an async JS bootstrap myself: I stopped when I discovered how tricky it was to work around all the browser quirks. There are some decent existing libraries:

  • LAB.js is popular, with an intuitive interface (sketched below)
  • Require.js works well with CommonJS module formats
  • head.js is a good all-rounder
  • script.js, an impressive 643b of goodness from Dustin Diaz
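
For flavour, the search-box example above would look roughly like this with LAB.js’s chaining API (a sketch from its documentation as I remember it; check the current docs before relying on it):

$LAB.script('/js/jquery.js').wait(function(){
  $(document).ready(function(){
    $('#search').focus();
  });
});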

Async Javascript: the Pain!

A word of warning: asynchronous javascript is painful. You should think carefully whether it is really worth it for your application.

Your code will contain more bugs. The extra bugs will be ones based on JS loading sequences and timing, so they may be sporadic, or only show up for a client with a slow connection or poor hardware. These are the worst sort of bugs, and at some point you will curse the day you learnt the meaning of “asynchronous”. Don’t blame me, I did warn you.
