Web Performance - Payload Size

Minimizing the payload size of both dynamic and static resources can reduce network latency significantly. In addition, for scripts that are cached, cutting down their byte size speeds up the time the browser takes to parse and execute the code needed to render the page. The maximum packet size, technically called the MTU (Maximum Transmission Unit), is 1500 bytes on Ethernet networks (it varies on other network types); in practice, the goal is to decrease the number of packets your server transmits by keeping resources under 1500 bytes wherever possible.

Extracted from Google's Make the Web Faster project.


There are multiple ways to reduce the payload size of a website, but they essentially come down to compressing, optimizing, minifying, and deferring the loading of the resources used on each page.

  • Enable compression
  • Remove unused CSS
  • Minify JavaScript
  • Minify CSS
  • Minify HTML
  • Defer loading of JavaScript
  • Optimize images
  • Serve scaled images
  • Serve resources from a consistent URL

Enable compression

Compressing resources such as HTML, CSS, and JavaScript files with gzip (supported by all modern browsers) or deflate (which uses the same compression algorithm but is far less widely used) can reduce the number of bytes sent over the network. It is advisable to compress only resources above a minimum size somewhere between 150 and 1000 bytes, because payloads below 150 bytes can actually grow after compression. Remember not to compress binary files (images, videos, PDFs, etc.) because they are already compressed; instead, optimize images with software like Trimage (Linux) or ImageOptim (Mac).

Remove unused CSS

Removing or deferring style rules that are not used by a document avoids downloading unnecessary bytes and allows the browser to start rendering sooner. Even if a stylesheet is in an external file that is cached, rendering is blocked until the browser loads the stylesheet from disk. In addition, once the stylesheet is loaded, the browser's CSS engine has to evaluate every rule contained in the file to see if the rule applies to the current page. Often, websites reuse the same external CSS file for all of their pages, even if many of the rules defined in it don't apply to the current page.
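The idea can be sketched as a naive filter that keeps a rule only if its selector appears to be used in the markup. This is deliberately simplistic (it assumes simple tag and class selectors and a single `class` attribute value); dedicated tools and browser coverage reports do this analysis properly.

```javascript
// Naive sketch (assumption: simple selectors only, no combinators or media
// queries): keep a rule only if its selector appears in the markup.
function stripUnusedRules(cssRules, html) {
  return cssRules.filter(function (rule) {
    if (rule.selector.startsWith('.')) {
      // Very rough check: class selector against a class="..." attribute.
      return html.includes('class="' + rule.selector.slice(1) + '"');
    }
    // Tag selector: look for an opening tag of that name.
    return new RegExp('<' + rule.selector + '\\b').test(html);
  });
}

const rules = [
  { selector: 'p',     text: 'p { margin: 0; }' },
  { selector: '.hero', text: '.hero { color: red; }' },
];
const page = '<p class="intro">hi</p>';
console.log(stripUnusedRules(rules, page).map(r => r.selector)); // [ 'p' ]
```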

Minify JavaScript

Compacting JavaScript code by eliminating unnecessary bytes, such as extra spaces, line breaks, and indentation, can save many bytes of data and speed up downloading, parsing, and execution. It is advisable to minify any JS file that is 4096 bytes or larger; you should see a benefit for any file that can be reduced by 25 bytes or more (anything less will not yield an appreciable performance gain). There are several online and offline tools that minify not only JavaScript but CSS and HTML as well: Closure Compiler, JSMin, and the YUI Compressor.
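To make concrete what a minifier removes, here is a deliberately naive sketch that strips comments and collapses whitespace. Real tools like Closure Compiler parse the source properly (this regex version would, for example, mangle string literals containing `//`) and can also rename variables for further savings.

```javascript
// Naive sketch of what a minifier removes; real minifiers parse the source
// and are safe for strings, regex literals, etc.
function naiveMinify(source) {
  return source
    .replace(/\/\/[^\n]*/g, '')        // strip line comments
    .replace(/\/\*[\s\S]*?\*\//g, '')  // strip block comments
    .replace(/\s+/g, ' ')              // collapse runs of whitespace
    .trim();
}

const src = [
  '// add two numbers',
  'function add(a, b) {',
  '    return a + b;  /* sum */',
  '}',
].join('\n');

console.log(naiveMinify(src)); // "function add(a, b) { return a + b; }"
```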

Minify CSS

Minifying CSS has the same benefits as minifying JavaScript: reducing network latency, enhancing compression, and speeding up browser loading and execution. Several tools are freely available to minify CSS, including the YUI Compressor and CssMin.js.
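The CSS case is even simpler to sketch, since whitespace around punctuation and the final semicolon in each block carry no meaning. Again, this is an illustration of the idea, not a replacement for the tools above.

```javascript
// Naive CSS minifier sketch: strip comments, drop insignificant whitespace
// and the trailing semicolon in each declaration block.
function naiveMinifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, '')   // strip comments
    .replace(/\s*([{}:;,])\s*/g, '$1')  // drop space around punctuation
    .replace(/;}/g, '}')                // drop last semicolon in a block
    .trim();
}

const css = 'body {\n  margin: 0;\n  color: #333; /* text */\n}\n';
console.log(naiveMinifyCss(css)); // "body{margin:0;color:#333}"
```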

Minify HTML

Minifying HTML has the same benefits as minifying CSS and JavaScript: reducing network latency, enhancing compression, and speeding up browser loading and execution. Moreover, HTML frequently contains inline JavaScript and CSS code, so it is useful to minify those as well.

Note: This rule is experimental and is currently focused on size reduction rather than strict HTML well-formedness. Future versions of the rule will also take into account correctness.

For details on the current behavior, see the Page Speed wiki.
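As with CSS, the essence of HTML minification can be sketched in a few lines: drop comments and collapse the whitespace between tags. A real minifier must also respect whitespace-sensitive elements such as `<pre>` and `<textarea>`, which this sketch ignores.

```javascript
// Naive sketch: strip HTML comments and collapse whitespace between tags.
// Real minifiers respect <pre>, <textarea>, conditional comments, etc.
function naiveMinifyHtml(html) {
  return html
    .replace(/<!--[\s\S]*?-->/g, '')  // strip comments
    .replace(/>\s+</g, '><')          // collapse whitespace between tags
    .trim();
}

const page = '<ul>\n  <li>a</li>\n  <li>b</li> <!-- second item -->\n</ul>';
console.log(naiveMinifyHtml(page)); // "<ul><li>a</li><li>b</li></ul>"
```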

Defer loading of JavaScript

Deferring the loading of JavaScript functions that are not called at startup reduces the initial download size, allows other resources to be downloaded in parallel, and speeds up execution and rendering time. For AJAX-style applications that use many bytes of JavaScript code, loading everything up front can add considerable latency. To use this technique, first identify all of the JavaScript functions that are not actually used by the document before the onload event; for any file containing more than 25 uncalled functions, move those functions to a separate, external JavaScript file (for files containing fewer than 25 uncalled functions, the refactoring is not worth the effort).

// JavaScript event listener that loads the external file after the onload event.
function download_js_at_onload() {
    var element = document.createElement('script');
    element.src = 'deferred-functions.js';
    document.body.appendChild(element);
}

// Check for browser support of event-handling capability.
if (window.addEventListener) {
    window.addEventListener('load', download_js_at_onload, false);
} else if (window.attachEvent) {
    window.attachEvent('onload', download_js_at_onload);
} else {
    window.onload = download_js_at_onload;
}

Optimize images

Images saved from programs like Adobe Fireworks can contain kilobytes of extra comments and use far more colors than necessary, even though reducing the color palette may not perceptibly reduce image quality. Improperly optimized images can take up more space than they need, so we should perform both basic and advanced optimization on all images:

  • Reduce color depth to the lowest acceptable level,
  • Remove image comments,
  • Save the image in an appropriate format,
  • Compress JPEG and PNG files.

Additionally, the format chosen for an image can have a drastic impact on its file size. Use these guidelines:

  • PNGs are almost always superior to GIFs and are usually the best choice; they are supported by all modern web browsers, including alpha transparency. Try converting suitable images to PNG with GIMP (a powerful open-source image editor) using Indexed rather than RGB mode.
  • Use GIFs for graphics smaller than 10x10 pixels or with a palette of fewer than 3 colors, and prefer GIFs over PNGs for animations.
  • Use JPGs for all photographic-style images.
  • Do not use the BMP or TIFF file formats in any case.
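The guidelines above can be encoded as a small decision function. The property names (`animated`, `photographic`, `colors`) are illustrative, not any real API:

```javascript
// Sketch of the format guidelines above; property names are hypothetical.
function suggestFormat(image) {
  if (image.animated) return 'gif';      // prefer GIF over PNG for animations
  if (image.photographic) return 'jpg';  // JPG for photographic-style images
  if ((image.width < 10 && image.height < 10) ||
      (image.colors !== undefined && image.colors < 3)) {
    return 'gif';                        // tiny or near-monochrome graphics
  }
  return 'png';                          // the best choice for everything else
}

console.log(suggestFormat({ photographic: true }));   // jpg
console.log(suggestFormat({ animated: true }));       // gif
console.log(suggestFormat({ width: 8, height: 8 }));  // gif
console.log(suggestFormat({ width: 64, height: 64 })); // png
```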

Several tools are available that perform further, lossless compression on JPEG and PNG files with no effect on image quality. I recommend Trimage (for Linux) and ImageOptim (for Mac), which provide graphical and command-line interfaces to optimizers such as OptiPNG, AdvPNG, PNGCrush, PNGOut, Gifsicle, JPEGRescan, jpegtran, and JPEGOptim, depending on the file type (currently, PNG and JPG files are supported). All image files are losslessly compressed at the highest available compression levels.

Serve scaled images

Sometimes you may want to display the same image at various sizes, serving a single image resource and using HTML or CSS in the containing page to scale it. This makes sense if the actual image size matches at least one (the largest) of the instances on the page. However, if we serve an image that is larger than the dimensions used in all of the markup instances, we are sending unnecessary bytes over the wire. Use an image editor to scale images to match the largest size needed on the page, and make sure to specify those dimensions in the page using HTML attributes, not CSS.
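This check can be sketched as a small function: an image is oversized when its natural dimensions exceed every size it is displayed at in the markup. The function and its inputs are illustrative, not part of any real tool.

```javascript
// Sketch: flag images served larger than any size they are displayed at.
// displayedSizes lists every width/height used in the page's markup.
function isOversized(natural, displayedSizes) {
  return displayedSizes.every(function (d) {
    return natural.width > d.width || natural.height > d.height;
  });
}

// A 1024x768 source shown only at 200x150 and 400x300 wastes bytes.
console.log(isOversized({ width: 1024, height: 768 },
                        [{ width: 200, height: 150 },
                         { width: 400, height: 300 }])); // true

// Matching the largest instance exactly is fine.
console.log(isOversized({ width: 400, height: 300 },
                        [{ width: 400, height: 300 }])); // false
```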

Serve resources from a consistent URL

It’s important to serve each resource from a unique URL, to eliminate duplicate download bytes and additional round trips (RTTs). This typically happens when we reference images and resources shared across multiple pages in a site. A relative URL and an absolute URL are consistent if the hostname of the absolute URL matches that of the containing document.

// One resource referenced in two consistent ways on www.example.com:
// the browser fetches and caches it once.
1. /images/example.gif
2. www.example.com/images/example.gif

// The same image referenced from different domains: inconsistent,
// so the browser downloads it twice.
1. /images/example.gif
2. subdomain.example.com/images/example.gif
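The consistency check amounts to resolving every reference to the same absolute form and comparing. This minimal sketch assumes host-relative paths and ignores query strings; modern code would use the standard URL API instead.

```javascript
// Sketch: resolve references to a common absolute form so two spellings of
// the same resource compare equal (assumes host-relative paths only).
function normalize(ref, documentHost) {
  if (ref.startsWith('/')) return documentHost + ref;  // relative path
  return ref.replace(/^https?:\/\//, '');              // drop the scheme
}

const a = normalize('/images/example.gif', 'www.example.com');
const b = normalize('http://www.example.com/images/example.gif', '');
console.log(a === b); // true: same resource, one cache entry

const c = normalize('/images/example.gif', 'subdomain.example.com');
console.log(a === c); // false: different hosts mean a duplicate download
```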