While Google is promoting WebP, Mozilla thinks JPEG is good enough.

Though streaming video is a bigger user of bandwidth overall, it's images, not video, that consume the most bandwidth during regular browsing. A large proportion of that traffic comes from lossy image formats, chiefly JPEG, used to shrink photographic pictures to a more download-friendly size. The desire to make these images smaller, and hence faster to download, has inspired a lot of investigation into whether some other format might do the job better.

Google has been promoting the use of WebP, the still image derivative of its WebM video codec. Mozilla has also been looking at the issue, but the open source browser organization has come up with a different conclusion: we don't need a new image format, we just need to make better JPEGs.

To that end, the group has released its own JPEG compression library, mozjpeg 2.0, which reduces file sizes by around five percent compared to the widely used libjpeg-turbo. Facebook has announced that it will be testing mozjpeg 2.0 to reduce its bandwidth costs, similar to its WebP trial.

Mozilla's decision to stick with JPEG was driven by two factors. First, its study comparing JPEG to WebP, JPEG XR, and HEVC-MSP (a still image counterpart to the new H.265 video codec) showed that neither WebP nor JPEG XR offers any consistent advantage over JPEG (though, as we'll see, the comparison is complicated). Second, it felt that improving JPEG, which virtually every browser and device can already use, is going to be more useful to more people.

Mozilla has scope to improve JPEG because of the way the image format is defined. The JPEG specification describes only the behavior of the image decoder, the part that takes the already compressed data and reconstructs an image. Encoding is left unspecified; a piece of software has to produce something that can be decoded correctly, but how it goes about that is left open.

This might seem a little peculiar, but it is typical of lossy compression formats; H.264 and MP3, for example, are both specified in the same way. It's done because it allows future advances in, say, modelling the capabilities of the human eye (or, for MP3, the ear) to improve compression without the need for a whole new specification.
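To make that freedom concrete, here is a minimal sketch in Python using the Pillow imaging library (our own choice of tool, not anything Mozilla uses; the input filename is hypothetical). It encodes one source image with three different encoder decisions, and every output is a standard JPEG that the same unmodified decoder reads back.

```python
from io import BytesIO
from PIL import Image

# Hypothetical source image; any RGB image will do.
original = Image.open("photo.png").convert("RGB")

# Three different encoder decisions for the same source image.
variants = {
    "baseline_q75": dict(quality=75),
    "optimized_huffman_q75": dict(quality=75, optimize=True),            # lossless gain
    "progressive_q75": dict(quality=75, optimize=True, progressive=True),
}

for name, opts in variants.items():
    buf = BytesIO()
    original.save(buf, format="JPEG", **opts)
    size = buf.tell()
    # Every variant decodes with the same, unmodified JPEG decoder.
    decoded = Image.open(BytesIO(buf.getvalue()))
    decoded.load()
    print(f"{name}: {size} bytes, decodes to {decoded.size}")
```

The encoders differ in how hard they work and how small the result is, but the decoder's job never changes; that is exactly the gap mozjpeg exploits.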

Mozilla's work on mozjpeg 2.0 is focused primarily on some of the lossless parts of the compression process (while JPEG is a lossy format, compressors do perform some lossless operations, too). The work results in slightly higher CPU time to compress an image in the first place, but the payoff is files that are 5-15 percent smaller and can still be decoded by any software that already handles JPEGs (bugs notwithstanding). Mozilla says that it intends to make further improvements in the future.
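Because mozjpeg is built as a drop-in counterpart to libjpeg-turbo, comparing the two is mostly a matter of running both encoders on the same source and weighing the output. The rough sketch below does just that; the install paths and filenames are assumptions for illustration, not anything specified by the project.

```python
import os
import subprocess

SOURCE = "photo.ppm"    # cjpeg accepts PPM/BMP/Targa input; filename is hypothetical
QUALITY = "75"

encoders = {
    "libjpeg-turbo": "/usr/bin/cjpeg",        # assumed install location
    "mozjpeg": "/opt/mozjpeg/bin/cjpeg",      # assumed install location
}

for name, cjpeg in encoders.items():
    out = f"{name}.jpg"
    # Encode the same image at the same quality setting with each tool.
    subprocess.run([cjpeg, "-quality", QUALITY, "-outfile", out, SOURCE],
                   check=True)
    print(f"{name}: {os.path.getsize(out)} bytes")
```

Both outputs are ordinary JPEG files; the mozjpeg one should simply tend to be smaller.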

mozjpeg is free and permissively licensed, so Mozilla is hoping that it will see widespread adoption by those creating JPEG images. Facebook, which is also contributing money to fund future mozjpeg development, should be just the first company to test Mozilla's work.

As for why Mozilla isn't interested in WebP even though Google promotes it as superior, it all depends on how you measure image quality. All lossy compression formats contain errors relative to the original image, due to the data that is lost. The thing that makes comparing lossy formats hard is that not all errors are equal; the human visual system is more attuned to some kinds of error and less so to others. For example, it's more sensitive to errors in brightness than errors in color.

As such, there are both objective metrics, which just look at the raw error between the compressed and original image, and various subjective metrics, which strive to mimic the evaluations that real humans would make.
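As a small illustration of the difference (our own sketch using scikit-image, not Mozilla's or Google's test harness, and with hypothetical filenames), PSNR is a classic objective measure of raw pixel error, while SSIM is one commonly used metric designed to track perceived structural similarity instead.

```python
import numpy as np
from PIL import Image
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Hypothetical pair: the original image and a lossy-compressed version of it.
original = np.asarray(Image.open("original.png").convert("RGB"))
compressed = np.asarray(Image.open("compressed.jpg").convert("RGB"))

# Objective measure: raw pixel-level error.
psnr = peak_signal_noise_ratio(original, compressed)

# Perceptually motivated measure: structural similarity across color channels.
ssim = structural_similarity(original, compressed, channel_axis=-1)

print(f"PSNR: {psnr:.2f} dB, SSIM: {ssim:.4f}")
```

Two codecs can rank differently depending on which of these scores you trust, and that disagreement is essentially the dispute between Google's and Mozilla's results.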

According to Mozilla, Google's tests that demonstrate WebP's superiority depend on objective measures of image quality. Mozilla's own tests showing that JPEG can do just as well depend instead on these subjective measures. HEVC-MSP looks like it is superior according to both kinds of test, but it's complicated and subject to the same patent and licensing quagmire as H.264, making it uncompelling for the time being.