One of the most fundamental rules for building fast websites is to optimize your resources, and when it comes to text-based content such as HTML, CSS, and JS, that means compression.
The de facto text-compression algorithm on the web is Gzip: about 80% of compressed responses use it, and the remaining 20% use the much newer Brotli.
Of course, these figures only cover the compressible responses that were actually compressed; there are still many millions of resources that could or should have been compressed but were not. For a more detailed breakdown of the numbers, see the "Compression" chapter of the Web Almanac.
Gzip is remarkably effective. The complete works of Shakespeare weigh in at 5.3 MB in plain-text format; after Gzip (compression level 6), that comes down to 1.9 MB. That's a 2.8× reduction in file size with no loss of data. Nice!
Better still, Gzip favours repetition: the more repeated strings a text file contains, the more effective Gzip can be. This is great news for the web, where HTML, CSS, and JS have a very consistent and repetitive syntax.
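This effect is easy to demonstrate with Python's standard library. As a minimal sketch (the CSS-like sample string is made up for illustration): a highly repetitive payload compressed at Gzip level 6 shrinks dramatically, while the same number of random bytes barely compresses at all.

```python
import gzip
import os

# Repetitive, CSS-like text: exactly the kind of input Gzip likes.
repetitive = b".btn { color: red; padding: 4px; }\n" * 500
# The same number of random bytes: almost incompressible.
noise = os.urandom(len(repetitive))

for label, data in [("repetitive", repetitive), ("random", noise)]:
    compressed = gzip.compress(data, compresslevel=6)
    print(f"{label}: {len(data)} -> {len(compressed)} bytes")
```

The repetitive input collapses to a tiny fraction of its original size, while the random input actually grows slightly, since Gzip still has to add its own header and block overhead.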
But while Gzip is very effective, it's old: it was released in 1992 (which certainly helps explain its prevalence). Twenty-one years later, in 2013, Google launched Brotli, a new algorithm that claims even better results than Gzip. The same 5.3 MB Shakespeare collection comes down to 1.7 MB when compressed with Brotli (compression level 6), a 3.1× reduction in file size. Cool!
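You can reproduce this kind of comparison on any text of your own. Note that Brotli is not in Python's standard library, so the sketch below assumes the third-party `brotli` package (`pip install brotli`) and falls back gracefully if it is missing; the HTML-like sample text is an invented stand-in for the Shakespeare corpus.

```python
import gzip

try:
    import brotli  # third-party: pip install brotli
except ImportError:
    brotli = None

# Repetitive, HTML-like sample text standing in for the Shakespeare corpus.
text = b'<div class="line"><p>To be, or not to be</p></div>\n' * 2000

gzipped = gzip.compress(text, compresslevel=6)
print(f"original:  {len(text)} bytes")
print(f"gzip -6:   {len(gzipped)} bytes")

if brotli is not None:
    # quality maps to Brotli's compression level (0-11).
    brotlied = brotli.compress(text, quality=6)
    print(f"brotli -6: {len(brotlied)} bytes")
```

On real-world HTML, CSS, and JS the gap between the two algorithms is usually smaller than on a toy input like this, but Brotli consistently comes out ahead at comparable levels.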