Google has come up with a new type of compression for web pages. The new open-source compression algorithm is called Zopfli, and it compresses web content about three to eight per cent more densely (PDF) than the standard zlib library. Zopfli's output can be unpacked by the standard decompression algorithm that is already part of almost all modern web browsers, so decompressing this compressed content will not be a problem.
This new Google technology can be used on servers for lower web page latencies and faster web content delivery, which is just what is needed to make the web and browsing it a little bit faster. The algorithm, Zopfli, was the work of Zurich-based Google engineer Lode Vandevenne as his 20% project. Google has this policy of letting its employees spend 20% of their work time on something of their own interest, and they usually come up with something really cool, like this one.
The algorithm is an implementation of the Deflate algorithm, the same one used by the ZIP and gzip data compression software and in PNG image compression. In fact, Zopfli's output is compatible with zlib, but the compression side is much more effective and advanced.
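This compatibility is easy to demonstrate. A minimal sketch, using Python's standard zlib module as a stand-in for Zopfli on the compression side (the actual zopfli library is not shown here): any compliant Deflate stream, whoever produced it, can be restored by the ordinary zlib decompressor that browsers already ship.

```python
import zlib

# Zopfli emits standard Deflate/zlib streams, so any zlib-based decoder
# can read them. Plain zlib stands in for Zopfli here; for the same
# input, Zopfli would produce a smaller but still compatible stream.
original = b"Repetitive web content compresses well. " * 100

# Compress at zlib's maximum setting (level 9).
compressed = zlib.compress(original, 9)

# Standard zlib decompression -- the same kind of routine browsers use --
# restores the original bytes exactly.
restored = zlib.decompress(compressed)

assert restored == original
print(len(original), "->", len(compressed), "bytes")
```

The key point is that only the compressor changes; the decompression side, and therefore the installed base of browsers, needs no update at all.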
As Vandevenne writes in the announcement today, “the exhaustive method is based on iterating entropy modeling and a shortest path search algorithm to find a low bit cost path through the graph of all possible deflate representations.”
But there is one drawback: the algorithm takes considerably longer to compress data than existing compressors. Decompression time, however, is unaffected, since the output is a standard Deflate stream. Indeed, as Vandevenne notes, “due to the amount of CPU time required — 2 to 3 orders of magnitude more than zlib at maximum quality — Zopfli is best suited for applications where data is compressed once and sent over a network many times, for example, static content for the web.”
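The “compress once, send many times” pattern Vandevenne describes can be sketched as follows. This is a hedged illustration using gzip at maximum level as a stand-in for Zopfli (the function name precompress is made up for this example); the idea is that the CPU cost is paid once at build or deploy time, and the server then hands out the precompressed file on every request.

```python
import gzip
import pathlib

def precompress(path: pathlib.Path) -> pathlib.Path:
    """Compress a static asset once, writing a sibling .gz file.

    gzip level 9 stands in for Zopfli here; a Zopfli-based tool would
    be a drop-in replacement producing a smaller, still gzip-compatible
    file at a much higher one-time CPU cost.
    """
    data = path.read_bytes()
    gz_path = path.with_suffix(path.suffix + ".gz")
    # Pay the compression cost once, at build/deploy time...
    gz_path.write_bytes(gzip.compress(data, compresslevel=9))
    return gz_path

# ...then a web server can serve the .gz file directly for every request
# (for example via nginx's gzip_static), with no per-request compression.
```

Since every request reuses the same precompressed bytes, even a compressor that is hundreds of times slower pays for itself through the bandwidth saved on each transfer.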
Source: Tech Crunch