
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:media="http://search.yahoo.com/mrss/">
    <channel>
        <title><![CDATA[ The Cloudflare Blog ]]></title>
        <description><![CDATA[ Get the latest news on how products at Cloudflare are built, technologies used, and join the teams helping to build a better Internet. ]]></description>
        <link>https://blog.cloudflare.com</link>
        <atom:link href="https://blog.cloudflare.com/" rel="self" type="application/rss+xml"/>
        <language>en-us</language>
        <image>
            <url>https://blog.cloudflare.com/favicon.png</url>
            <title>The Cloudflare Blog</title>
            <link>https://blog.cloudflare.com</link>
        </image>
        <lastBuildDate>Mon, 13 Apr 2026 14:47:55 GMT</lastBuildDate>
        <item>
            <title><![CDATA[Uncovering the Hidden WebP vulnerability: a tale of a CVE with much bigger implications than it originally seemed]]></title>
            <link>https://blog.cloudflare.com/uncovering-the-hidden-webp-vulnerability-cve-2023-4863/</link>
            <pubDate>Thu, 05 Oct 2023 15:00:43 GMT</pubDate>
            <description><![CDATA[ Google announced a security issue in Chrome titled "Heap buffer overflow in WebP in Google Chrome." At first it seemed like just another bug, but it had implications that extended well beyond Chrome. ]]></description>
            <content:encoded><![CDATA[ <p></p><p>At Cloudflare, we're constantly vigilant when it comes to identifying vulnerabilities that could potentially affect the Internet ecosystem. Recently, on September 12, 2023, Google announced a security issue in Google Chrome, titled "Heap buffer overflow in WebP in Google Chrome," which caught our attention. Initially, it seemed like just another bug in the popular web browser. However, what we discovered was far more significant and had implications that extended well beyond Chrome.</p>
    <div>
      <h3>Impact much wider than suggested</h3>
      <a href="#impact-much-wider-than-suggested">
        
      </a>
    </div>
    <p>The vulnerability, tracked under <a href="https://nvd.nist.gov/vuln/detail/CVE-2023-4863">CVE-2023-4863</a>, was described as a heap buffer overflow in WebP within Google Chrome. While this description might lead one to believe that it's a problem confined solely to Chrome, the reality was quite different. It turned out to be a bug deeply rooted in the libwebp library, which is not only used by Chrome but by virtually every application that handles WebP images.</p><p>Digging deeper, this vulnerability was in fact first reported in an earlier CVE from Apple, CVE-2023-41064, although the connection was not immediately obvious. In early September, Citizen Lab, a research lab based out of the University of Toronto, reported on an apparent exploit that was being used to attempt to install spyware on the iPhone of "an individual employed by a Washington DC-based civil society organization." The advisory from Apple was also incomplete, stating that it was a “buffer overflow issue in ImageIO,” and that they were aware the issue may have been actively exploited. Only after Google released CVE-2023-4863 did it become clear that these two issues were linked, and there was a wider vulnerability in WebP.</p><p>The vulnerability allows an attacker to create a malformed WebP image file that makes libwebp write data beyond the buffer memory allocated to the image decoder. By writing past the legal bounds of the buffer, it is possible to modify sensitive data in memory, eventually leading to execution of the attacker's code.</p><p>WebP, introduced over a decade ago, has gained widespread adoption in various applications, ranging from web browsers to email clients, chat apps, graphics programs, and even operating systems. This ubiquity meant that this vulnerability had far-reaching consequences, affecting a vast array of software and virtually all users of the WebP format.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1VR5ms2oG6429Y6D8F3Hic/80023d781031236142015576080767c1/image2-3.png" />
            
            </figure><p><i>How the WebP vulnerability is exploited</i></p>
    <div>
      <h3>Understanding the technical details</h3>
      <a href="#understanding-the-technical-details">
        
      </a>
    </div>
    <p>So what exactly was the issue, how could it be exploited, and how was it shut down? We can get our best clues by looking at <a href="https://chromium.googlesource.com/webm/libwebp/+/902bc9190331343b2017211debcec8d2ab87e17a%5E%21/#F0">the patch that was made to libwebp</a>. This patch fixes a potential out-of-bounds (OOB) write in part of the image decoder – the Huffman tables – with two changes: additional validation of the input data, and a modified dynamic memory allocation model. A deeper dive into libwebp and the WebP image format built on top of it reveals what this means.</p><p>WebP is a combination of two different image formats: a lossy format similar to JPEG using the VP8 codec, and a lossless format using WebP's custom lossless codec. The bug was in the lossless codec's handling of Huffman coding.</p><p>The fundamental idea behind Huffman coding is that using a constant number of bits for every basic unit of information in a dataset – like a pixel color – is not the most efficient representation. We can instead use a variable number of bits, assigning the shortest sequences to the most frequently occurring values and longer ones to the least common values. The sequences of ones and zeros can be represented as a binary tree, with the shorter, more common codes near the root, and longer, less common codes deeper in the tree. Looking up values in the tree bit by bit is relatively slow. Practical implementations build lookup tables that allow matching many bits at a time.</p><p>Image files contain compact information about the shape of the Huffman tree, which the decoder uses to reconstruct the tree, and build lookup tables for the codes. The bug in libwebp was in the code building the lookup tables. A specially crafted WebP file can describe a very unbalanced Huffman tree with codes much longer than any normal WebP file would have, and this made the function generating the lookup tables write data beyond the buffer allocated for them. 
Libwebp did check the validity of the Huffman tree, but it performed that consistency check only after the invalid lookup tables had already been written.</p><p>The buffer for lookup tables is allocated on the heap, the area of memory where most of an application's data is stored. Code that writes data past its buffer allows attackers to modify and corrupt data that happens to be adjacent in memory. This can be exploited to make the application misbehave, and eventually to execute code supplied by the attacker.</p><p>The fixed version of libwebp validates the input data before building the tables, and allocates more memory when necessary to ensure the buffer is always big enough.</p><p>Libwebp is a mature library, maintained by seasoned professionals. But it's written in the C language, which has very few safeguards against programming errors, especially around memory use. Despite the care taken in the library's development, a single erroneous assumption led to a critical vulnerability.</p>
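<p>To make the failure mode concrete, here is a small Python sketch (not libwebp's actual code, which is C) of a flat Huffman lookup table indexed by the next <code>max_len</code> input bits. Such a table needs <code>2**max_len</code> entries, so a buffer sized for ordinary trees overflows when a crafted file declares unusually long codes; the fix amounts to validating (and resizing) before writing any entries:</p>

```python
def build_lookup(codes, buffer_entries):
    """codes maps symbol -> bit string; buffer_entries is the allocated table size."""
    max_len = max(len(bits) for bits in codes.values())
    needed = 1 << max_len
    # The libwebp fix amounts to doing this validation (and, if needed,
    # a larger allocation) *before* writing any table entries.
    if needed > buffer_entries:
        raise ValueError(f"table needs {needed} entries, only {buffer_entries} allocated")
    table = [None] * needed
    for sym, bits in codes.items():
        prefix = int(bits, 2) << (max_len - len(bits))
        for i in range(1 << (max_len - len(bits))):
            table[prefix + i] = (sym, len(bits))  # every extension of this code
    return table

# A balanced tree fits in a small table...
table = build_lookup({"a": "0", "b": "10", "c": "11"}, buffer_entries=4)
# ...while a degenerate tree with a 20-bit code would demand 2**20 entries.
```

<p>Here, writing the table entries before the size check is exactly the pattern that let a malicious file write past the allocated buffer.</p>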
    <div>
      <h3>Swift action</h3>
      <a href="#swift-action">
        
      </a>
    </div>
    <p>On the same day that Google's announcement caught our attention, we filed an internal security ticket to document and address the vulnerability.</p><p>Google was initially perplexed about the true source of the problem. They did not release a patched version of libwebp before announcing the vulnerability. We discovered the yet-unreleased patch for libwebp in its repository, and used it to update libwebp in our services. The libwebp project officially released the patch a day later.</p><p>Our image processing services are written in Rust. We've submitted patches to Rust packages that contained a copy of libwebp and filed RustSec advisories for them (<a href="https://rustsec.org/advisories/RUSTSEC-2023-0061.html">RUSTSEC-2023-0061</a> and <a href="https://rustsec.org/advisories/RUSTSEC-2023-0062.html">RUSTSEC-2023-0062</a>). This ensured that the broader Rust ecosystem was informed and could take appropriate action.</p><p>In an interesting turn of events, GitHub's vulnerability scanner was quick to recognize our RustSec reports as the first case of CVE-2023-4863, even before the issue gained widespread attention. This highlights the importance of having robust security reporting mechanisms in place and the vital role that platforms like GitHub play in <a href="https://www.cloudflare.com/the-net/oss-attack-detection/">keeping the open-source community secure</a>.</p><p>These quick actions demonstrate how seriously Cloudflare takes this kind of threat. We have a belt-and-suspenders approach to security that limits the binaries we run at our edge to those signed by us, and ensures that all vulnerabilities are identified and remedied as soon as possible. In this case, we have scrutinized our logs, and found no evidence that any attackers attempted to leverage this vulnerability against Cloudflare. 
We believe this exploit targeted individuals rather than the infrastructure of a company like Cloudflare, but we never take chances with our customers’ data, and so we fixed this vulnerability as quickly as possible, before it became well known.</p>
    <div>
      <h3>Conclusion</h3>
      <a href="#conclusion">
        
      </a>
    </div>
    <p>Google has now widened its description of this issue, correctly calling out that all uses of WebP are potentially affected. This widened description was originally filed as yet another new CVE – <a href="https://nvd.nist.gov/vuln/detail/CVE-2023-5129">CVE-2023-5129</a> – but then that was flagged as a duplicate of the original CVE-2023-4863, and the description of the earlier filing was updated. This incident serves as a reminder of the complex and interconnected nature of the Internet ecosystem. What initially seemed like a Chrome-specific problem revealed a much deeper issue that touched nearly every corner of the digital world. The incident also showcased the importance of swift collaboration and the critical role that responsible disclosure plays in mitigating security risks.</p><p>For each and every user, it demonstrates the need to keep all browsers, apps and operating systems up to date, and to install recommended security patches. All applications supporting WebP images need to be updated. We've updated our services.</p><p>At Cloudflare, we remain committed to enhancing the security of the Internet, and incidents like these drive us to continually refine our processes and strengthen our partnerships within the global developer community. By working together, we can make the Internet a safer place for everyone.</p> ]]></content:encoded>
            <category><![CDATA[Vulnerabilities]]></category>
            <category><![CDATA[Chrome]]></category>
            <category><![CDATA[WebP]]></category>
            <category><![CDATA[Security]]></category>
            <guid isPermaLink="false">5nwTZ5NB8OAO51Ut53JB1B</guid>
            <dc:creator>Willi Geiger</dc:creator>
            <dc:creator>Kornel Lesiński</dc:creator>
        </item>
        <item>
            <title><![CDATA[SVG support in Cloudflare Images]]></title>
            <link>https://blog.cloudflare.com/svg-support-in-cloudflare-images/</link>
            <pubDate>Wed, 21 Sep 2022 14:00:00 GMT</pubDate>
            <description><![CDATA[ Cloudflare Images now supports storing and delivering SVG files ]]></description>
            <content:encoded><![CDATA[ <p></p><p>Cloudflare Images was announced one year ago <a href="/announcing-cloudflare-images/">on this very blog</a> to help you solve the problem of delivering images in the right size, right quality and fast. Very fast.</p><p>It doesn’t really matter if you only run a personal blog, or a portal with thousands of vendors and millions of end-users. Doesn’t matter if you need one hundred images to be served one thousand times each at most, or if you deal with tens of millions of new, unoptimized, images that you deliver billions of times per month.</p><p>We want to remove the complexity of having to store, process, resize, re-encode and serve images across multiple platforms and vendors.</p><p>At the time we wrote:</p><blockquote><p><i>Images is a single product that stores, resizes, optimizes and serves images. We built Cloudflare Images, so customers of all sizes can build a scalable and affordable image pipeline in minutes.</i></p></blockquote><p>We supported the most common formats, such as JPG, WebP, PNG and GIF.</p><p>We did not feel the need to support SVG files. SVG files are inherently scalable, so there is nothing to resize on the server side before serving them to your audience. One can even argue that SVG files are documents that can generate images through mathematical formulas of vectors and nodes, but are not images <i>per se.</i></p><p>There was also the clear notion that SVG files were a potential risk due to known and <a href="https://www.fortinet.com/blog/threat-research/scalable-vector-graphics-attack-surface-anatomy">well documented</a> vulnerabilities. We knew we could do something from the security angle, but still, why go through that work if considering SVG a supported format <i>didn’t make sense</i> in the first place?</p><p>Not supporting SVG files, though, did bring a set of challenges to an increasing number of our customers. 
<a href="https://w3techs.com/technologies/details/im-svg">Some stats already show that around 50% of websites serve SVG files</a>, which matches the pulse we got from talking with many of you, customers and community.</p><p>If you relied on SVGs, you had to select a second storage location or a second image platform elsewhere. That commonly resulted in an egress fee when serving an uncached file from that source, and it went against what we want for our product: one image pipeline to cover all your needs.</p><p>We heard you loud and clear, and starting from today, you can store and serve SVG files, safely, with Cloudflare Images.</p>
    <div>
      <h3>SVG, what is so special about them?</h3>
      <a href="#svg-what-is-so-special-about-them">
        
      </a>
    </div>
    <p>The Scalable Vector Graphics file type is great for serving all kinds of illustrations, charts, logos, and icons.</p><p>SVG files don't represent images as pixels, but as geometric shapes (lines, arcs, polygons) that can be drawn with perfect sharpness at any resolution.</p><p>Let’s now use a complex image as an example, one filled with more than four hundred paths and ten thousand nodes:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4ruJDon2gjvBXwHi9DGsA7/997727a99a00188695871c37f08adf46/uHWAmWDUYVNmDByskHnBsSf_-poXNMAz7sxTw-bjNYHldqbU5ecTj_upSCKIHoXRnolnrlpPqvyDbBray-TaRkDJcGOO9CKQUdY3CpvwmaNn0rRkqqnPLAJJaE0D.png" />
            
            </figure><p>Unlike bitmaps, where individual pixels combine to create the visual perception of an image, that vector image can be resized with no quality loss. That happens because resizing the SVG to 300% of its original size redefines the vectors at 300%, rather than stretching pixels to 300%.</p><p>This becomes evident when we’re dealing with low-resolution images.</p><p>Here is the 100px-wide SVG of the Toroid shown above:</p><p><img src="http://staging.blog.mrk.cfdata.org/content/images/2022/09/Toroid.svg" /></p><p>and the corresponding 100px-wide PNG:</p><p><img src="http://staging.blog.mrk.cfdata.org/content/images/2022/09/image3-18.png" /></p><p>Now here is the same SVG with the HTML width attribute set at 300px:</p><p><img src="http://staging.blog.mrk.cfdata.org/content/images/2022/09/Toroid.svg" /></p><p>and the same PNG you saw before, upscaled by 3x so the width is also 300px:</p><p><img src="http://staging.blog.mrk.cfdata.org/content/images/2022/09/unnamed.png" /></p><p>The visual quality loss on the PNG is obvious when it gets scaled up.</p><p>Keep in mind: the Toroid shown above is stored in an SVG file of 142 KB, and that is already a very complex and heavy SVG file.</p><p>Now, if you do want to display a PNG with an original width of 1024px to present a high-quality image of the same Toroid, file size becomes an issue:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1haST58mkysZs17Fn2dNv4/d297d11a7aef91c24c8fcda6a89ddf4f/unnamed--1-.png" />
            
            </figure><p>The new 1024px PNG, however, weighs 344 KB. That’s about 2.4 times the weight of the single SVG that you could use at any size.</p><p>Think about the storage and bandwidth savings: to display the exact same image, all you need is <code>width="1024"</code> in your HTML, at less than half the kilobytes of the PNG.</p><p>Couple this with the flexibility of attributes like <code>viewBox</code>, and you can pan, zoom, crop and scale, all without ever needing anything other than the one original SVG file.</p><p>Here’s an example of an SVG being resized on the client side, with no visual quality loss:</p><div></div>
<p></p><p>Let’s quickly summarize what we covered so far: SVG files are wonderful for vector images like illustrations, charts and logos, and are infinitely scalable with no need to resize on the server side. The same image as a bitmap is either heavier than the SVG at high resolutions, or shows very noticeable loss of visual quality when scaled up from a lower resolution.</p>
    <div>
      <h3>So, what are the downsides of using SVG files?</h3>
      <a href="#so-what-are-the-downsides-of-using-svg-files">
        
      </a>
    </div>
    <p>SVG files aren't just images. They are XML-based documents that are as powerful as HTML pages. They can contain arbitrary JavaScript, fetch external content from other URLs or embed HTML elements. This gives SVG files much more power than expected from a simple image.</p><p>Throughout the years, numerous exploits have been identified and corrected.</p><p>Some old attacks were very rudimentary, yet effective. The famous <a href="https://en.wikipedia.org/wiki/Billion_laughs_attack">Billion Laughs</a> exploited how <a href="https://www.w3resource.com/xml/entities.php">XML uses Entities and declares them in the Document Type Definition</a>, and how it handles recursion.</p><p>Entities can be something as simple as a declaration of a text string, or a nested reference to other previous entities.</p><p>If you defined a first entity with a simple string, then a second entity referencing the first one 10 times, then a third referencing the second 10 times, and so on up to a tenth, you were asking the parser to generate an output of a billion copies of that very simple first string. This would most commonly exhaust resources on the server parsing the XML, and cause a <a href="https://www.cloudflare.com/en-gb/learning/ddos/what-is-a-ddos-attack/">DoS</a>. While that particular weakness in XML parsing was widely addressed through parser memory caps and lazy loading of entities, more complex attacks have become common in recent years.</p><p>The common themes in these more recent attacks have been <a href="https://www.cloudflare.com/learning/security/how-to-prevent-xss-attacks/">XSS (cross-site-scripting)</a> and foreign objects referenced in the XML content. In both cases, using SVG inside <code>&lt;img&gt;</code> tags in your HTML is an invitation for any ill-intended file to reach your end-users. So, what exactly can we do to make any SVG file you serve trustworthy?</p>
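<p>The arithmetic of the Billion Laughs attack is easy to sketch: ten levels of entities, each referencing the previous level ten times, expand a document of a few hundred bytes into a billion copies of the base string.</p>

```python
# The Billion Laughs growth in numbers: each of 10 entity levels expands
# to 10 copies of the level below it, so a naive parser ends up
# materializing a billion copies of the base string.

def expanded_copies(levels=10, refs_per_level=10):
    copies = 1
    for _ in range(levels - 1):  # level 1 is the literal string itself
        copies *= refs_per_level
    return copies

base = "lol"                                 # a few bytes in the file...
total_bytes = expanded_copies() * len(base)  # ...gigabytes when expanded
```

<p>This is why parser memory caps and lazy entity expansion are effective countermeasures: they refuse to materialize that output.</p>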
    <div>
      <h3>The SVG filter</h3>
      <a href="#the-svg-filter">
        
      </a>
    </div>
    <p>We've developed a filter that simplifies SVG files to only features used for images, so that serving SVG images from any source is just as safe as serving a JPEG or PNG, while preserving SVG's vector graphics capabilities.</p><ul><li><p>We remove scripting. This prevents SVG files from being used for cross-site scripting attacks. Although browsers don't run scripts in <code>&lt;img&gt;</code>, they would run scripts when SVG files are opened directly as a top-level document.</p></li><li><p>We remove hyperlinks to other documents. This makes SVG files less attractive for SEO spam and phishing.</p></li><li><p>We remove references to cross-origin resources. This stops third parties from tracking who is viewing the image.</p></li></ul><p>What's left is just an image.</p><p>SVG files can also contain embedded images in other formats, like JPEG and PNG, in the form of <a href="https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/Data_URLs">Data URLs</a>. We treat these embedded images just like other images that we process, and optimize them too. We don't support SVG files embedded in SVG recursively, though, as that opens the door to recursive parsing and resource exhaustion in the parser. While the most common browsers already limit SVG recursion to one level, the potential to exploit that door led us to leave this capability out of our filter, at least for now.</p><p>We set Content-Security-Policy (CSP) headers on all our HTTP responses to disable unwanted features, and that alone acts as a first line of defense, but filtering goes deeper in case these headers are lost (e.g. if the image was saved as a file and served elsewhere).</p><p>Our tool is <a href="https://github.com/cloudflare/svg-hush">open-source</a>. It's written in Rust and can filter SVG files in a streaming fashion without buffering, so it's fast enough for filtering on the fly.</p><p>The SVG format is pretty complex, with lots of features. 
If there is safe SVG functionality that we don't support yet, you can report issues and contribute to the development of the filter.</p><p>You can see how the tool actually works by looking at the tests folder in the open-source repository, where a sample unfiltered XML file and its filtered version are present.</p><p>Here’s what a diff of those files looks like:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/61edBaHRCc923sZoBnknKB/dd590659eaab4dafb9c637434e3411a4/image5-8.png" />
            
            </figure><p>Removed are the external references, foreignObjects and any other potential threats.</p>
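<p>As a rough illustration of the same ideas (in Python, not the actual Rust code of svg-hush, and far less thorough), a sanitizer can walk the XML tree, drop scripting and foreign content, and strip event-handler attributes and external references:</p>

```python
# A simplified, illustrative SVG sanitizer: drop scripting and embedded
# HTML, strip event handlers and external references. The real filter
# (svg-hush, in Rust) is streaming and handles many more cases.
import xml.etree.ElementTree as ET

def filter_svg(svg_text):
    root = ET.fromstring(svg_text)
    for parent in list(root.iter()):
        for child in list(parent):
            tag = child.tag.split("}")[-1]      # ignore the namespace prefix
            if tag in ("script", "foreignObject"):
                parent.remove(child)            # scripting / embedded HTML
                continue
            for attr in list(child.attrib):
                name = attr.split("}")[-1]
                if name.startswith("on"):       # event handlers (onload, ...)
                    del child.attrib[attr]
                elif name == "href" and not child.attrib[attr].startswith("#"):
                    del child.attrib[attr]      # external / cross-origin links
    return ET.tostring(root, encoding="unicode")

dirty = ('<svg xmlns="http://www.w3.org/2000/svg">'
         '<script>alert(1)</script>'
         '<rect width="10" height="10" onclick="steal()"/></svg>')
clean = filter_svg(dirty)  # keeps the rect, drops the rest
```

<p>What survives the pass is just drawing instructions, which is the whole point of the filter.</p>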
    <div>
      <h3>How you can use SVG files in Cloudflare Images</h3>
      <a href="#how-you-can-use-svg-files-in-cloudflare-images">
        
      </a>
    </div>
    <p>Starting now you can upload SVG files to Cloudflare Images and serve them at will. Uploading them works just like for any other supported format, <a href="https://developers.cloudflare.com/images/cloudflare-images/upload-images/dashboard-upload/">via UI</a> or <a href="https://developers.cloudflare.com/images/cloudflare-images/upload-images/upload-via-url/">API</a>.</p><div></div><p>Variants, <a href="https://developers.cloudflare.com/images/cloudflare-images/transform/resize-images/">named</a> or <a href="https://developers.cloudflare.com/images/cloudflare-images/transform/flexible-variants/">flexible</a>, are intended to transform bitmap (raster) images into whatever size you want to serve them.</p><p>SVG files, as vector images, do not require resizing inside the Images pipeline.</p><p>This results in a banner with the following message when you’re previewing an SVG in the UI:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7DBn12LoNf1pQdIV0yxlN3/737bebe7ab5bb17f2883448aaea23524/image1-30.png" />
            
            </figure><p>And as a result, all variants listed will show the exact same image in the exact same dimensions.</p><p>Because an image is worth a thousand words, especially when trying to describe behaviors, here is what it will look like if you scroll through the variants preview:</p><div></div><p>With Cloudflare Images you get a default Public Variant listed when you start using the product, and so you can immediately start serving your SVG files using it, just like this:</p><p><a href="https://imagedelivery.net/">https://imagedelivery.net/</a>&lt;your_account_hash&gt;/&lt;your_SVG_ID&gt;/public</p><p>And, as shown above, you can use any of your variant names to serve the image, as it won’t affect the output at all.</p><p>If you’re an Image Resizing customer, you can also benefit from serving your files with our tool. Make sure you head to the <a href="https://developers.cloudflare.com/images/image-resizing/">Developer Documentation</a> pages to see how.</p>
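<p>For illustration, here is a tiny hypothetical helper (not an official SDK; the account hash and image id are placeholders) that assembles the delivery URL pattern shown above:</p>

```python
# Hypothetical helper that builds the imagedelivery.net URL pattern:
# https://imagedelivery.net/<your_account_hash>/<your_SVG_ID>/<variant>
def delivery_url(account_hash, image_id, variant="public"):
    return f"https://imagedelivery.net/{account_hash}/{image_id}/{variant}"

# For an SVG, every variant name serves the same unresized file:
urls = [delivery_url("abc123", "my-logo", v) for v in ("public", "thumbnail")]
```

<p>Since SVG files skip resizing, the variant segment only changes the URL, not the bytes served.</p>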
    <div>
      <h3>What’s next?</h3>
      <a href="#whats-next">
        
      </a>
    </div>
    <p>You can subscribe to Cloudflare Images <a href="https://dash.cloudflare.com/?to=/:account/images">directly in the dashboard</a>, and starting from today you can use the product to store and serve SVG files.</p><p>If you want to contribute to further development of the filtering tool and help expand its abilities, check out our <a href="https://github.com/cloudflare/svg-hush">SVG-Hush Tool repo</a>.</p><p>You can also connect directly with the team in our <a href="https://discord.com/invite/cloudflaredev">Cloudflare Developers Discord Server</a>.</p> ]]></content:encoded>
            <category><![CDATA[GA Week]]></category>
            <category><![CDATA[General Availability]]></category>
            <category><![CDATA[Cloudflare Images]]></category>
            <guid isPermaLink="false">5Z8tlaSgZifZHEK46BkW2r</guid>
            <dc:creator>Paulo Costa</dc:creator>
            <dc:creator>Yevgen Safronov</dc:creator>
            <dc:creator>Kornel Lesiński</dc:creator>
        </item>
        <item>
            <title><![CDATA[Introducing support for the AVIF image format]]></title>
            <link>https://blog.cloudflare.com/generate-avif-images-with-image-resizing/</link>
            <pubDate>Sat, 03 Oct 2020 13:00:00 GMT</pubDate>
            <description><![CDATA[ We're adding support for the new AVIF image format. It compresses images significantly better than older-generation formats such as WebP and JPEG. ]]></description>
            <content:encoded><![CDATA[ 
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7pur7e2GCh3MalSZMrQS7q/10a180152a825d811ae832347dad8b66/BDES-1080_Cloudflare_Resize_Illustration_AV1-AVIF_Blog_Header.png" />
            
            </figure><p>We've added support for the new AVIF image format in <a href="https://developers.cloudflare.com/images/">Image Resizing</a>. It compresses images significantly better than older-generation formats such as WebP and JPEG. It's supported in Chrome desktop today, and support is coming to other Chromium-based browsers, as well as Firefox.</p>
    <div>
      <h3>What’s the benefit?</h3>
      <a href="#whats-the-benefit">
        
      </a>
    </div>
    <p>More than half of an average website's bandwidth is spent on images. Improved <a href="https://www.cloudflare.com/learning/performance/glossary/what-is-image-compression/">image compression</a> can save bandwidth and improve overall performance of the web. The compression in AVIF is so good that images can shrink to half the size of their JPEG and WebP equivalents.</p>
    <div>
      <h3>What is AVIF?</h3>
      <a href="#what-is-avif">
        
      </a>
    </div>
    <p>AVIF is a combination of the HEIF ISO standard, and the royalty-free AV1 codec developed by <a href="https://aomedia.org/">Mozilla, Xiph, Google, Cisco, and many others</a>.</p><p>Currently, JPEG is the most popular image format on the Web. It's doing remarkably well for its age, and it will likely remain popular for years to come thanks to its excellent compatibility. There have been many previous attempts at replacing JPEG, such as JPEG 2000, JPEG XR and WebP. However, these formats offered only modest compression improvements, and didn't always beat JPEG on image quality. Compression and image quality in <a href="https://netflixtechblog.com/avif-for-next-generation-image-coding-b1d75675fe4">AVIF are better than in all of them, and by a wide margin</a>.</p><table>
<tr>
<td>
    <img src="http://staging.blog.mrk.cfdata.org/content/images/2021/11/avif-test-timothy-meinberg.jpeg" />
</td>
<td>
    
    
    <img src="https://images.ctfassets.net/slt3lc6tev37/4JGMoM6h1eV1YAv8N3QnRC/7e61ac8dfe6836ce46d347311993c50b/avif-test-timothy-meinberg.webp" />
    
</td>
<td>
    
    
    <img src="http://staging.blog.mrk.cfdata.org/content/images/2021/11/avif-test-timothy-meinberg.avif.png" />
    
</td>
</tr>
<tr>
    <td>JPEG (17KB)</td>
    <td>WebP (17KB)</td>
    <td>AVIF (17KB)</td>
</tr>
</table>
    <div>
      <h3>Why a new image format?</h3>
      <a href="#why-a-new-image-format">
        
      </a>
    </div>
    <p>One of the big things AVIF does is fix WebP's biggest flaws. WebP is over 10 years old, and AVIF is a major upgrade over the WebP format. These two formats are technically related: they're both based on a video codec from the VPx family. WebP uses the old VP8 version, while AVIF is based on AV1, which is the next generation after <a href="https://en.wikipedia.org/wiki/VP10">VP10</a>.</p><p>WebP is limited to 8-bit color depth, and in its best-compressing mode of operation, it can only store color at half of the image's resolution (known as chroma subsampling). This causes edges of saturated colors to be smudged or pixelated in WebP. AVIF supports 10- and 12-bit color at full resolution, and high dynamic range (HDR).</p><table>
<tr><td><img src="http://staging.blog.mrk.cfdata.org/content/images/2020/10/avif-test-jpeg1.png" /></td><td>JPEG</td></tr>
<tr><td><img src="http://staging.blog.mrk.cfdata.org/content/images/2020/10/avif-test-webp1.png" /></td><td>WebP</td></tr>
<tr><td><img src="http://staging.blog.mrk.cfdata.org/content/images/2020/10/avif-test-webp2.png" /></td><td>WebP (sharp YUV option)</td></tr>
<tr><td><img src="http://staging.blog.mrk.cfdata.org/content/images/2020/10/avif-test-avif1.png" /></td><td>AVIF</td></tr>
</table><p>AV1 also uses a new compression technique: chroma-from-luma. Most image formats store brightness separately from color hue. AVIF uses the brightness channel to guess what the color channel may look like. They are usually correlated, so a good guess gives smaller file sizes and sharper edges.</p>
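<p>The chroma-from-luma idea can be sketched in a few lines of Python (a toy illustration of the principle only; AV1's actual CfL signals scaling parameters per block): fit chroma as a linear function of luma and keep only the residual, which is near zero for correlated channels and therefore compresses well.</p>

```python
# Toy chroma-from-luma: model chroma as alpha * luma + beta (least
# squares) and code only the residual left over after the prediction.

def cfl_residual(luma, chroma):
    n = len(luma)
    mean_l = sum(luma) / n
    mean_c = sum(chroma) / n
    cov = sum((l - mean_l) * (c - mean_c) for l, c in zip(luma, chroma))
    var = sum((l - mean_l) ** 2 for l in luma)
    alpha = cov / var
    beta = mean_c - alpha * mean_l
    return [c - (alpha * l + beta) for l, c in zip(luma, chroma)]

# Correlated channels leave near-zero residuals, which compress well:
luma = [10, 20, 30, 40]
chroma = [12, 22, 32, 42]          # exactly luma + 2
residual = cfl_residual(luma, chroma)
```

<p>Transmitting a small residual instead of the full chroma plane is where the file-size savings come from.</p>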
    <div>
      <h3>Adoption of AV1 and AVIF</h3>
      <a href="#adoption-of-av1-and-avif">
        
      </a>
    </div>
    <p>VP8 and WebP suffered from reluctant industry adoption. Safari only added WebP support very recently, 10 years after Chrome.</p><p>Chrome 85 already supports AVIF. It’s a work in progress in other Chromium-based browsers, and in Firefox. Apple hasn't announced whether Safari will support AVIF. However, this time Apple is one of the <a href="https://aomedia.org/membership/members/">companies in the Alliance for Open Media</a>, creators of AVIF. The AV1 codec is already seeing faster adoption than the previous royalty-free codecs. The latest GPUs from Nvidia, AMD, and Intel already have hardware decoding support for AV1.</p><p>AVIF uses the same HEIF container as the HEIC format used in iOS’s camera. AVIF and HEIC offer similar compression performance. The difference is that HEIC is based on a commercial, patent-encumbered H.265 format. In countries that allow software patents, H.265 is illegal to use without obtaining patent licenses. It's covered by thousands of patents, owned by hundreds of companies, which have fragmented into two competing licensing organizations. The costs and complexity of patent licensing used to be acceptable when videos were published by big studios, and the cost could be passed on to the customer in the price of physical discs and hardware players. Nowadays, when video is mostly consumed via free browsers and apps, <a href="https://blog.chiariglione.org/a-future-without-mpeg/">the old licensing model has become unsustainable</a>. This has created a huge incentive for Web giants like Google, Netflix, and Amazon to get behind the royalty-free alternative. AV1 is free to use by anyone, and the alliance of tech giants behind it will <a href="http://aomedia.org/press%20releases/the-alliance-for-open-media-statement/">defend it</a> from patent trolls' lawsuits.</p><p>Non-standard image formats can usually only be created using their vendor's own implementation. 
AVIF and AV1 are already ahead with multiple independent implementations: libaom, Intel SVT-AV1, libgav1, dav1d, and rav1e. This is a sign of a healthy adoption and a prerequisite to be a Web standard. Our Image Resizing is implemented in <a href="https://www.rust-lang.org/">Rust</a>, so we've chosen the <a href="https://github.com/xiph/rav1e">rav1e</a> encoder to create a pure-Rust implementation of AVIF.</p>
    <div>
      <h3>Caveats</h3>
      <a href="#caveats">
        
      </a>
    </div>
    <p>AVIF is a feature-rich format. Most of its features are for smartphone cameras, such as "live" photos, depth maps, bursts, HDR, and lossless editing. Browsers will support only a fraction of these features. Our implementation in Image Resizing supports only still, standard-range images.</p><p>The two biggest problems with AVIF are encoding speed and the lack of progressive rendering.</p><p>AVIF images don't show anything on screen until they're fully downloaded. In contrast, a progressive JPEG can display a lower-quality approximation of the image very quickly, while it's still loading. <a href="/parallel-streaming-of-progressive-images/">When progressive JPEGs are delivered well</a>, they make websites appear to load much faster. A progressive JPEG can look loaded at half of its file size. AVIF can fully load at half of a JPEG's size, so it somewhat makes up for the lack of progressive rendering with its sheer compression advantage. In this case only WebP is left behind, as it has neither progressive rendering nor strong compression.</p><p>Decoding AVIF images for display takes more CPU power than other codecs, but it should be fast enough in practice. Even low-end Android devices can play AV1 videos in full HD without the help of hardware acceleration, and AVIF images are just a single frame of an AV1 video. However, encoding of AVIF images is much slower: creating a single image may take several seconds. AVIF supports tiling, which accelerates encoding on multi-core CPUs. We have <a href="/cloudflares-gen-x-servers-for-an-accelerated-future/">lots of CPU cores</a>, so we take advantage of this to make encoding faster. To keep encoding speed reasonable, Image Resizing doesn’t use the maximum compression level possible in AVIF. Resized images are cached, so the encoding speed is noticeable only on a cache miss.</p><p>We will be adding AVIF support to <a href="https://support.cloudflare.com/hc/en-us/articles/360000607372-Using-Cloudflare-Polish-to-compress-images">Polish</a> as well. Polish converts images asynchronously in the background, which completely hides the encoding latency, and it will be able to compress AVIF images better than Image Resizing does.</p>
    <div>
      <h3>Note about benchmarking</h3>
      <a href="#note-about-benchmarking">
        
      </a>
    </div>
    <p>It's surprisingly difficult to judge fairly and accurately which lossy codec is better. Lossy codecs are specifically designed mainly to <a href="https://www.cloudflare.com/learning/performance/glossary/what-is-image-compression/">compress image details</a> that are too subtle for the naked eye to see, so "looks almost the same, but the file size is smaller!" is a common feature of all lossy codecs, and not specific enough to draw conclusions about their performance.</p><p>Lossy codecs can't be compared by file size alone. A lossy codec can easily produce a file of any size; its effectiveness is the ratio between the file size and the visual quality it can achieve.</p><p>The best way to compare codecs is to have each compress an image to exactly the same file size, and then compare the actual visual quality they've achieved. If the file sizes differ, any judgement may be unfair, because the codec that generated the larger file may have done so only because it was asked to preserve more details, not because it can't compress better.</p>
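To make the equal-file-size approach concrete, here is a minimal sketch of a benchmark harness. The `encode` callback is a hypothetical stand-in for any codec binding that returns compressed bytes; the only assumption (which holds for JPEG, WebP, and AVIF encoders in practice) is that higher quality settings produce larger files, so the quality can be binary-searched to hit a target size.

```javascript
// Binary-search the codec's quality setting until the output size matches
// the target within a tolerance, so two codecs can be judged at the same
// file size. `encode(image, quality)` is a hypothetical codec binding that
// returns the compressed bytes as a Uint8Array.
function encodeToTargetSize(encode, image, targetBytes, tolerance = 0.02) {
  let lo = 1, hi = 100, best = null;
  while (lo <= hi) {
    const quality = Math.floor((lo + hi) / 2);
    const data = encode(image, quality);
    // Remember the attempt whose size is closest to the target so far.
    if (best === null ||
        Math.abs(data.length - targetBytes) < Math.abs(best.data.length - targetBytes)) {
      best = { quality, data };
    }
    // Close enough: stop searching.
    if (Math.abs(data.length - targetBytes) <= targetBytes * tolerance) break;
    if (data.length < targetBytes) lo = quality + 1; else hi = quality - 1;
  }
  return best;
}
```

Running each codec through this at the same target size, and then scoring the decoded outputs with a perceptual metric such as DSSIM, gives a comparison that isn't skewed by file-size differences.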
    <div>
      <h3>How and when to enable AVIF today?</h3>
      <a href="#how-and-when-to-enable-avif-today">
        
      </a>
    </div>
    <p>We recommend AVIF for websites that need to deliver high-quality images with as little bandwidth as possible. This is important for users of slow networks and in <a href="https://whatdoesmysitecost.com/">countries where the bandwidth is expensive</a>.</p><p>Browsers that support the AVIF format announce it by adding <code>image/avif</code> to their <code>Accept</code> request header. To request the AVIF format from <a href="https://developers.cloudflare.com/images/">Image Resizing</a>, set the <code>format</code> option to <code>avif</code>.</p><p>The best method to detect and enable support for AVIF is to use <a href="https://developers.cloudflare.com/images/worker">image resizing in Workers</a>:</p>
            <pre><code>addEventListener('fetch', event =&gt; {
  const imageURL = "https://jpeg.speedcf.com/cat/4.jpg";

  const resizingOptions = {
    // You can set any options here, see:
    // https://developers.cloudflare.com/images/worker
    width: 800,
    sharpen: 1.0,
  };

  const accept = event.request.headers.get("accept") || "";
  const isAVIFSupported = /image\/avif/.test(accept);
  if (isAVIFSupported) {
    resizingOptions.format = "avif";
  }
  // The cf.image options must be passed to fetch(), not to respondWith()
  event.respondWith(fetch(imageURL, {cf: {image: resizingOptions}}));
})</code></pre>
            <p>The above script will auto-detect the supported format, and serve AVIF automatically. Alternatively, you can use URL-based resizing together with the <code>&lt;picture&gt;</code> element:</p>
            <pre><code>&lt;picture&gt;
    &lt;source type="image/avif" 
            srcset="/cdn-cgi/image/format=avif/image.jpg"&gt;
    &lt;img src="/image.jpg"&gt;
&lt;/picture&gt;</code></pre>
            <p>This uses our <a href="https://developers.cloudflare.com/images/about"><code>/cdn-cgi/image/…</code></a> endpoint to perform the conversion, and the alternative source will be picked only by browsers that support the AVIF format.</p><p>We have the <code>format=auto</code> option, but it won't choose AVIF yet. We're cautious, and we'd like to test the new format more before enabling it by default.</p> ]]></content:encoded>
            <category><![CDATA[Optimization]]></category>
            <category><![CDATA[WebP]]></category>
            <category><![CDATA[Speed & Reliability]]></category>
            <category><![CDATA[Product News]]></category>
            <guid isPermaLink="false">49mNQJofaty5LTpZMvcjPb</guid>
            <dc:creator>Kornel Lesiński</dc:creator>
        </item>
        <item>
            <title><![CDATA[Parallel streaming of progressive images]]></title>
            <link>https://blog.cloudflare.com/parallel-streaming-of-progressive-images/</link>
            <pubDate>Tue, 14 May 2019 16:00:00 GMT</pubDate>
            <description><![CDATA[ Image-optimized HTTP/2 multiplexing makes all progressive images across the page appear visually complete in half of the time. ]]></description>
            <content:encoded><![CDATA[ 
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/277BtPJSXREVafvPqyX3ZH/60d4273f08458c42a0190203a4bf9b7b/880BAE29-39B3-4733-96DA-735FE76443D5.png" />
            
            </figure><p>Progressive image rendering and HTTP/2 multiplexing technologies have existed for a while, but now we've combined them in a new way that makes them much more powerful. With Cloudflare progressive streaming <b>images appear to load in half of the time, and browsers can start rendering pages sooner</b>.</p>
<p>In HTTP/1.1 connections, servers didn't have any choice about the order in which resources were sent to the client; they had to send responses, as a whole, in the exact order they were requested by the web browser. HTTP/2 improved this by adding multiplexing and prioritization, which allow servers to decide exactly what data is sent and when. We’ve taken advantage of these new HTTP/2 capabilities to improve the perceived loading speed of progressive images by sending the most important fragments of image data sooner.</p>
    <div>
      <h3>What is progressive image rendering?</h3>
      <a href="#what-is-progressive-image-rendering">
        
      </a>
    </div>
    <p>Basic images load strictly from top to bottom. If a browser has received only half of an image file, it can show only the top half of the image. Progressive images have their content arranged not from top to bottom, but from a low level of detail to a high level of detail. Receiving a fraction of image data allows browsers to show the entire image, only with a lower fidelity. As more data arrives, the image becomes clearer and sharper.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/Mlb3C96uo4KLWgNe835l2/31ccbe118c20ce15f4c35c5ceb0cd377/image6.jpg" />
            
            </figure><p>This works great in the JPEG format, where only about 10-15% of the data is needed to display a preview of the image, and at 50% of the data the image looks almost as good as when the whole file is delivered. Progressive JPEG images contain exactly the same data as baseline images, merely reshuffled into a more useful order, so progressive rendering doesn’t add any cost to the file size. This is possible because JPEG doesn't store the image as pixels. Instead, it represents the image as frequency coefficients, which are like a set of predefined patterns that can be blended together, in any order, to reconstruct the original image. The inner workings of JPEG are really fascinating, and you can learn more about them from my recent <a href="https://www.youtube.com/watch?v=jTXhYj2aCDU">performance.now() conference talk</a>.</p><p>The end result is that images can look almost fully loaded in half the time, for free! The page appears to be visually complete and can be used much sooner. The rest of the image data arrives shortly after, upgrading images to their full quality, before visitors have time to notice anything is missing.</p>
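To illustrate why any prefix of the frequency coefficients already yields a usable preview, here is a toy version of the idea. Real JPEG works on 8×8 two-dimensional blocks with quantization and entropy coding; a single 1-D row of 8 samples is enough to show the principle:

```javascript
// Toy illustration of progressive JPEG's core idea: store a block as
// frequency (DCT) coefficients, and any prefix of those coefficients
// reconstructs a coarse approximation of the original samples.
const N = 8;

// Forward DCT-II: turn 8 samples into 8 frequency coefficients.
function dct(samples) {
  const coeffs = [];
  for (let k = 0; k < N; k++) {
    let sum = 0;
    for (let n = 0; n < N; n++) {
      sum += samples[n] * Math.cos((Math.PI / N) * (n + 0.5) * k);
    }
    coeffs.push(sum);
  }
  return coeffs;
}

// Inverse DCT using only the first `count` coefficients — the "preview".
function idctPartial(coeffs, count) {
  const samples = [];
  for (let n = 0; n < N; n++) {
    let sum = coeffs[0] / N; // DC term: the average brightness
    for (let k = 1; k < count; k++) {
      sum += (2 / N) * coeffs[k] * Math.cos((Math.PI / N) * (n + 0.5) * k);
    }
    samples.push(sum);
  }
  return samples;
}

const row = [10, 20, 40, 80, 80, 40, 20, 10]; // one row of pixel values
const coeffs = dct(row);
const preview = idctPartial(coeffs, 3); // low-frequency approximation
const full = idctPartial(coeffs, N);    // all coefficients: exact
```

The `preview` built from just 3 of the 8 coefficients is a smoothed approximation that already has the correct average brightness, while `full` reproduces the row exactly, which mirrors how a progressive JPEG refines as more data arrives.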
    <div>
      <h3>HTTP/2 progressive streaming</h3>
      <a href="#http-2-progressive-streaming">
        
      </a>
    </div>
    <p>But there's a catch. Websites have more than one image (sometimes even hundreds of images). When the server sends image files naïvely, one after another, the progressive rendering doesn’t help that much, because overall the images still load sequentially:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7KLYwVqXQSyp0QjnTTbCX4/b70c3a915dd7afaf7b0beec2aedf9f1d/image5.gif" />
            
            </figure><p>Having complete data for half of the images (and no data for the other half) doesn't look as good as having half of the data for all images.</p><p>And there's another problem: when the browser doesn't know image sizes yet, it lays the page out with placeholders instead, and re-lays out the page as each image loads. This can make pages jump during loading, which is inelegant, distracting and annoying for the user.</p><p>Our new progressive streaming feature greatly improves the situation: we can send all of the images at once, in parallel. This way the browser gets size information for all of the images as soon as possible, can paint a preview of all images without having to wait for a lot of data, and large images don’t delay loading of styles, scripts and other more important resources.</p><p>The idea of streaming progressive images in parallel is as old as HTTP/2 itself, but it needs special handling in low-level parts of web servers, and so far it hasn't been implemented at a large scale.</p><p>When we were improving <a href="/better-http-2-prioritization-for-a-faster-web">our HTTP/2 prioritization</a>, we realized it could also be used to implement this feature. Image files as a whole are neither high nor low priority. The priority changes within each file, and dynamic re-prioritization gives us the behavior we want:</p><ul><li><p>The image header that contains the image size is very high priority, because the browser needs to know the size as soon as possible to do page layout. The image header is small, so it doesn't hurt to send it ahead of other data.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7qRxoj4KuoVnSD7nEpWbMe/6bd553099bbb167660c6a65fb782ea7c/image7.jpg" />
            
            </figure></li><li><p>The minimum amount of data in the image required to show a preview of the image has a medium priority (we'd like to plug "holes" left for unloaded images as soon as possible, but also leave some bandwidth available for scripts, fonts and other resources)</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6Cm6w0yIFtMJctt42maKMF/f585afb934ef9e1f98ecd4d3750e7b09/secondhalf.png" />
            
            </figure></li><li><p>The remainder of the image data is low priority. It can be streamed last to refine image quality once there's no rush, since the page is already fully usable.</p></li></ul><p>Knowing the exact amount of data to send in each phase requires understanding the structure of image files, but it seemed weird to us to make our web server parse image responses and have format-specific behavior hardcoded at a protocol level. By framing the problem as a dynamic change of priorities, we were able to elegantly separate low-level networking code from knowledge of image formats. We can use Workers or offline image processing tools to analyze the images, and instruct our server to change HTTP/2 priorities accordingly.</p><p>The great thing about parallel streaming of images is that it doesn’t add any overhead. We’re still sending the same data, in the same amount; we’re just sending it in a smarter order. This technique takes advantage of existing web standards, so it’s compatible with all browsers.</p>
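As a sketch of the kind of analysis a Worker or offline tool could perform, the function below walks a JPEG's marker segments to find the byte offset of the first scan (the SOS marker, 0xFFDA) — everything before that offset holds the dimensions and decoding tables, i.e. the high-priority part. This is a simplified illustration, not our actual implementation: a production version would also need to handle padding bytes and truncated files.

```javascript
// Find where the JPEG header ends and the first entropy-coded scan begins.
// Between the SOI and SOS markers, every segment (APPn, DQT, SOF, DHT, COM)
// carries a 2-byte big-endian, self-inclusive length, so we can hop from
// marker to marker without decoding any image data.
function findFirstScanOffset(bytes) {
  if (bytes[0] !== 0xff || bytes[1] !== 0xd8) {
    throw new Error("not a JPEG (missing SOI marker)");
  }
  let pos = 2;
  while (pos + 4 <= bytes.length) {
    if (bytes[pos] !== 0xff) throw new Error("malformed marker segment");
    const marker = bytes[pos + 1];
    if (marker === 0xda) return pos; // SOS: scan data follows from here
    const length = (bytes[pos + 2] << 8) | bytes[pos + 3];
    pos += 2 + length; // skip marker bytes plus the segment payload
  }
  throw new Error("no SOS marker found");
}
```

An offset found this way is the kind of value a Worker could feed into a priority-change instruction for the server.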
    <div>
      <h3>The waterfall</h3>
      <a href="#the-waterfall">
        
      </a>
    </div>
    <p>Here are waterfall charts from <a href="https://webpagetest.org">WebPageTest</a> showing a comparison of regular HTTP/2 responses and progressive streaming. In both cases the files were exactly the same, the amount of data transferred was the same, and the overall page loading time was the same (within measurement noise). In the charts, blue segments show when data was transferred, and green shows when each request was idle.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4gWR5w8eEc3VASAU81KwBS/986a63df2f51f9f88237bc1bc74d5273/image8.png" />
            
            </figure><p>The first chart shows a typical server behavior that makes images load mostly sequentially. The chart itself looks neat, but the actual experience of loading that page was not great — the last image didn't start loading until almost the end.</p><p>The second chart shows images loaded in parallel. The blue vertical streaks throughout the chart are image headers sent early followed by a couple of stages of progressive rendering. You can see that useful data arrived sooner for all of the images. You may notice that one of the images has been sent in one chunk, rather than split like all the others. That’s because at the very beginning of a TCP/IP connection we don't know the true speed of the connection yet, and we have to sacrifice some opportunity to do prioritization in order to maximize the connection speed.</p>
    <div>
      <h3>The metrics compared to other solutions</h3>
      <a href="#the-metrics-compared-to-other-solutions">
        
      </a>
    </div>
    <p>There are other techniques intended to provide image previews quickly, such as low-quality image placeholders (LQIP), but they have several drawbacks: they add unnecessary data for the placeholders, they usually interfere with browsers' preload scanners, and they delay loading of full-quality images, because they depend on JavaScript to upgrade the previews to full images.</p><ul><li><p>Our solution doesn't cause any additional requests, and doesn't add any extra data. Overall page load time is not delayed.</p></li><li><p>Our solution doesn't require any JavaScript. It takes advantage of functionality supported natively in browsers.</p></li><li><p>Our solution doesn't require any changes to the page's markup, so it's very safe and easy to deploy site-wide.</p></li></ul><p>The improvement in user experience is reflected in performance metrics such as the <b>SpeedIndex</b> metric and time to visually complete. Notice that with regular image loading the visual progress is linear, but with progressive streaming it quickly jumps to mostly complete:</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/1lKfDWRojqzIpdPg7MwFGj/70c9f7df2546b5ea50d69e4bbafde7cc/image1-5.png" />
            
            </figure>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/n9HPq3RQ8Ew4CTxAf8NP4/7ed0c32f1592de89fd2f3a95b18d74e4/image4.png" />
            
            </figure>
    <div>
      <h3>Getting the most out of progressive rendering</h3>
      <a href="#getting-the-most-out-of-progressive-rendering">
        
      </a>
    </div>
    <p>Avoid ruining the effect with JavaScript. Scripts that hide images and wait until the <code>onload</code> event to reveal them (with a fade in, etc.) will defeat progressive rendering. Progressive rendering works best with the good old <code>&lt;img&gt;</code> element.</p>
    <div>
      <h3>Is it JPEG-only?</h3>
      <a href="#is-it-jpeg-only">
        
      </a>
    </div>
    <p>Our implementation is format-independent, but progressive streaming is useful only for certain file types. For example, it wouldn't make sense to apply it to scripts or stylesheets: these resources are rendered as all-or-nothing.</p><p>Prioritizing image headers (which contain the image size) works for all file formats.</p><p>The benefits of progressive rendering are unique to JPEG (supported in all browsers) and JPEG 2000 (supported in Safari). GIF and PNG have interlaced modes, but these modes come at the cost of worse compression. WebP doesn't support progressive rendering at all. This creates a dilemma: WebP is usually 20%-30% smaller than a JPEG of equivalent quality, but a progressive JPEG <i>appears</i> to load 50% faster. There are next-generation image formats that support progressive rendering better than JPEG, and compress better than WebP, but they're not supported in web browsers yet. In the meantime you can choose between the bandwidth savings of WebP and the better perceived performance of progressive JPEG by changing the Polish settings in your Cloudflare dashboard.</p>
    <div>
      <h3>Custom header for experimentation</h3>
      <a href="#custom-header-for-experimentation">
        
      </a>
    </div>
    <p>We also support a custom HTTP header that allows you to experiment with, and optimize, streaming of other resources on your site. For example, you could make our servers send the first frame of animated GIFs with high priority and deprioritize the rest. Or you could prioritize loading of resources mentioned in the <code>&lt;head&gt;</code> of HTML documents before the <code>&lt;body&gt;</code> is loaded.</p><p>The custom header can be set only from a Worker. The syntax is a comma-separated list of file positions, each with a priority and concurrency. The priority and concurrency are the same as in the whole-file cf-priority header described in the previous blog post.</p>
            <pre><code>cf-priority-change: &lt;offset in bytes&gt;:&lt;priority&gt;/&lt;concurrency&gt;, ...</code></pre>
            <p>For example, for a progressive JPEG we use something like (this is a fragment of JS to use in a Worker):</p>
            <pre><code>let headers = new Headers(response.headers);
headers.set("cf-priority", "30/0");
headers.set("cf-priority-change", "512:20/1, 15000:10/n");
return new Response(response.body, {headers});</code></pre>
            <p>This instructs the server to use priority 30 initially, while it sends the first 512 bytes; then to switch to priority 20 with some concurrency (<code>/1</code>); and finally, after sending 15000 bytes of the file, to switch to low priority and high concurrency (<code>/n</code>) to deliver the rest of the file.</p><p>We’ll try to split HTTP/2 frames to match the offsets specified in the header, to change the sending priority as soon as possible. However, priorities don’t guarantee that data of different streams will be multiplexed exactly as instructed, since the server can prioritize only when it has data of multiple streams waiting to be sent at the same time. If some of the responses arrive much sooner from the upstream server or the cache, the server may send them right away, without waiting for other responses.</p>
    <div>
      <h3>Try it!</h3>
      <a href="#try-it">
        
      </a>
    </div>
    <p>Enable our <a href="/better-http-2-prioritization-for-a-faster-web"><i>Enhanced HTTP/2 Prioritization</i></a> feature, and JPEG images optimized by <a href="/introducing-polish-automatic-image-optimizati/"><i>Polish</i></a> or <a href="/announcing-cloudflare-image-resizing-simplifying-optimal-image-delivery/"><i>Image Resizing</i></a> will be streamed automatically.</p> ]]></content:encoded>
            <category><![CDATA[Speed & Reliability]]></category>
            <category><![CDATA[Speed Week]]></category>
            <category><![CDATA[Optimization]]></category>
            <guid isPermaLink="false">75xuDiShO45CbDQuf5hkLY</guid>
            <dc:creator>Andrew Galloni</dc:creator>
            <dc:creator>Kornel Lesiński</dc:creator>
        </item>
    </channel>
</rss>