This appears to be very well written and easy to understand even if you only know the basics of digital image encoding.
I found the parts about patching and frames with different blend modes fascinating. I wonder if it would be possible to build a GUI DCC app that uses JPEG XL as its project format. It seems it could support layers, splines, symbols (transformed instances of layers), blend modes, and animations without "baking" any of it to pixels.
Thanks!
Yes, my hope is that jxl can become an interoperable format for layered images. It does not have all the functionality of image-editor formats (PSD, XCF, etc.), but it does have a very useful subset of that (named layers with basic blend modes). For interchange it would be very suitable: it has state-of-the-art compression (both lossy and lossless) and does not force you to merge the layers before exporting, yet the images can be viewed directly by any application that supports jxl. Since the blending happens inside the decoder, the viewing application can be blissfully unaware that it is even a layered image; it just gets the merged image from the decoder unless it asks for the individual layers.
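To make the "blending happens inside the decoder" point concrete, here's a toy sketch of flattening a layer stack. This is not libjxl code; the layer representation is invented for the sketch, and JXL's actual blend-mode semantics are more involved. It just shows the standard source-over ("normal") operator applied per pixel, bottom layer to top:

```rust
// Toy illustration of what a decoder does when it merges layers:
// composite each layer over the canvas with source-over ("normal") blending.
// NOT libjxl code; the (pixels, alphas) layer representation is invented.

/// Source-over blend for one channel value: `src` over `dst`, alpha in [0, 1].
fn over(dst: f64, src: f64, alpha: f64) -> f64 {
    src * alpha + dst * (1.0 - alpha)
}

/// Flatten a stack of (pixels, alphas) layers onto a canvas, bottom to top.
fn merge_layers(canvas: &[f64], layers: &[(Vec<f64>, Vec<f64>)]) -> Vec<f64> {
    let mut out = canvas.to_vec();
    for (pixels, alphas) in layers {
        out = out
            .iter()
            .zip(pixels.iter())
            .zip(alphas.iter())
            .map(|((d, s), a)| over(*d, *s, *a))
            .collect();
    }
    out
}

fn main() {
    // A 3-pixel canvas with two layers: one fully opaque, one with per-pixel alpha.
    let canvas = [0.0, 0.0, 0.0];
    let layers = [
        (vec![1.0, 1.0, 1.0], vec![1.0, 1.0, 1.0]), // opaque layer replaces canvas
        (vec![0.0, 0.0, 0.0], vec![0.5, 0.0, 1.0]), // alpha: half, none, full
    ];
    println!("{:?}", merge_layers(&canvas, &layers)); // [0.5, 1.0, 0.0]
}
```

A viewer that only wants the flattened result never needs to know this loop ran; the decoder hands it the final `out` buffer.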
JPEG XL had so much going for it. Kinda sad it was killed off just like that.
As it currently stands there should be over a billion devices that natively support JPEG-XL, as it was introduced in all Apple OSs since September 2023[1].
On the web alone it should be close to a billion users with support for JXL due to Safari’s market share.
[1]: https://cloudinary.com/blog/jpeg-xl-how-it-started-how-its-g...
It's also supported in Windows, GNOME, KDE, pretty much all image editors/viewers, and pretty much every other relevant program except for chromium based browsers.
Not just Chromium-based browsers, Firefox as well. Might not make much of a difference for user counts, but it does mean that so far its availability on the web is limited to a single vendor.
It's worth noting that it is "supported" in Firefox; however, it's not enabled at compile time for release builds (it is enabled for nightly and testing/validation builds).
Full release/production support will come when the (more or less drop-in replacement) Rust rewrite of libjxl is production-ready.
> rust rewrite of libjxl
See:
* https://github.com/libjxl/jxl-rs
Firefox has it implemented (behind a preference on nightly). They just don't want to ship it if Chromium isn't going to, because it would fragment the web and leave them something to maintain forever for a minority of sites (most sites won't bother if Chromium-based browsers don't support it).
Tbh it's less about having to maintain it forever and more about not wanting to maintain a C++ library codebase that would widen the browser's potential attack surface (memory bugs, etc.). They are fine adopting it as long as it's in Rust, which is being worked on; see sibling comments.
Considering Firefox is a small fraction of Safari's size, I don't think it would fragment the web that much.
The Mozilla "organizations" are a two-headed grift piggy-backed on a non-profit shell so the IRS keeps smiling.
Firefox hasn't made a technical decision without first forwarding the minutes to Mountain View and Redmond since roughly 2017.
Every nine-figure Google wire that lands promptly converts into $450k-per-head salary vapor and off-site "all-hands," while the same week another 250 actual engineers get an email that begins: "You're talented and valued BUT-."
Servo? Jettisoned.
MDN? Gutted.
Security teams? Re-org'd into a Slack channel no one reads.
And the Foundation helpfully reminds donors:
"Your gifts don't pay for Firefox engineering."
No kidding. They pay for glossy pamphlets proclaiming the open-web gospel, first-class flights to "advocacy summits," and Mitchell Baker's $2.5 million thank-you note. Firefox isn't a browser; it's a loss-leader Google keeps in the closet for the next antitrust subpoena.
They did say "relevant". Though arguably Chromium will probably rethink their decision if both Safari and Firefox support it.
Firefox does support jxl (in the sense that the code is there and works), but it's disabled by default.
So does Chrome if you check out the right commits and enable it.
But if you go to getfirefox.com and click "Download Firefox", there will be no JXL support, not even behind a configuration flag. So no, it doesn't support it. There are also no plans to enable support with the current implementation.
It wasn't killed off. Support was removed from Chrome, for what appears to be rather spurious reasons, but practically everyone else is busy implementing it.
Sadly removing support from Chrome is effectively the same as killing it off. And the reason is Google wants people to use webp instead.
Not really? Chrome dropped support but Google is actually supporting the JPEG-XL rust port that Firefox is waiting on.
Friendly reminder that WebP is trash https://www.youtube.com/watch?v=w7UDJUCMTng
While I'm not the biggest fan of WebP, using generation loss as a metric isn't representative of a real-world scenario. I can't think of any actual instance where an image needs to be re-encoded, say, 10 times, let alone 100+ times.
What do you think happens to images shared and re-shared between people online?
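For what it's worth, the mechanics here are easy to model: generation loss compounds only if the decode/re-encode round trip isn't idempotent. A toy sketch with a bare quantizer standing in for a real codec (no actual WebP or JPEG code involved):

```rust
// Toy model of generation loss: "encode" = snap each pixel value to the nearest
// multiple of Q. A real codec is far more complex, but the idempotence point
// stands: if re-encoding a decoded image reproduces it exactly, the loss stops
// at generation 1 instead of compounding.

const Q: f64 = 16.0;

/// One lossy encode/decode round trip.
fn encode_decode(pixels: &[f64]) -> Vec<f64> {
    pixels.iter().map(|p| (p / Q).round() * Q).collect()
}

fn main() {
    let original = vec![3.0, 47.0, 100.0, 201.0];
    let gen1 = encode_decode(&original);
    let gen2 = encode_decode(&gen1);

    println!("{:?}", gen1);        // values snapped to multiples of 16
    println!("{}", gen1 == gen2);  // true: no new loss after the first generation
}
```

Real re-sharing breaks the idempotence: each platform recompresses at a different quality, size, or format, which is why images passed around repeatedly do degrade in practice.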
JXL is still alive and well, it's just taking time to reach the prime time.
- macOS, iOS, and Safari support JPEG-XL
- Windows has first-party JPEG-XL support as of this year (admittedly it's opt-in rather than default)
- Essentially every major image processing app, editor, or drawing app supports JPEG-XL
- Firefox has preliminary support for JPEG-XL gated behind a feature flag in nightly releases.
- The JPEG-XL team is writing a direct Rust port of the reference libjxl library[1]. A third-party Rust port by some of the mainline contributors already exists, and it ironed out a lot of the porting issues before this first-party effort began. The first-party port is intended to be gradually brought up to a hardened, production-ready state.
- Mozilla has stated they have no objections to fully adopting JPEG-XL in Firefox once the Rust port is production-ready [2].
The last major barrier, other than getting the Rust code production-ready, will be Chrome's and Android's first-party support/adoption.
------
TLDR: JPEG-XL is very much not dead and instead people are nose down working hard to continue pushing its adoption forward.
------
1. https://github.com/libjxl/jxl-rs
2. https://github.com/mozilla/standards-positions/pull/1064
> To address this concern, the team at Google has agreed to apply their subject matter expertise to build a safe, performant, compact, and compatible JPEG-XL decoder in Rust, and integrate this decoder into Firefox.
I was not aware of this. Also, judging by this and the sibling comments, it looks like the momentum didn't die despite Google's apathy. Hopefully the fact that their own team is now developing the Rust port, along with the growing support on other platforms, is enough to make Google reconsider its choices.
> it looks like the momentum didn't die despite Google's apathy.
Google is a founding organization of JPEG XL and a core part of the team. Chromium punted on it, but Google as an organization hasn't exactly abandoned it: they haven't pulled out of JPEG XL itself or removed their engineers from it.
Big companies are big, they do conflicting things from time to time. Or often.
I am still surprised that WUFFS isn't being used to address safety concerns with the JPEG-XL reference library.
IMHO it's because the WUFFS code for just vanilla JPEGs is in the most polite terms "jaw droppingly horrific" and JPEG-XL is an order of magnitude more complex.
It wasn't "killed", it was always disabled by default in Chrome, and removed for really quite reasonable reasons: literally every other image decoder has had serious vulnerabilities. Enabling it by default would expose a gigantic attack surface that almost certainly will be exploited sooner or later.
This is also why Firefox doesn't support it by default (IIRC it doesn't even link against libjxl by default in release builds – only nightly ones).
There is nothing preventing the Chrome or Firefox people from revisiting all of this in the future.
It seems to me the Rust implementation of JPEG XL is by far the best path forward for broad JPEG XL support in Firefox, Chrome, and other browsers. While Rust is of course not a complete guarantee there will never be any security issues, it does eliminate virtually all of the major exploits that have targeted image decoders in the past. Both Firefox and Chrome have expressed interest in this.
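The class of bug Rust removes here is worth making concrete: the pattern behind many image-decoder CVEs is trusting a length field read from the file. A minimal sketch with an invented one-byte length-prefixed format (nothing to do with the real JXL bitstream):

```rust
// Toy "decoder" for a length-prefixed payload, the pattern behind many
// image-decoder vulnerabilities. The format is invented: [len: 1 byte][payload].
// In C, trusting `len` can walk past the end of the buffer; in Rust, checked
// slicing returns None instead of reading out of bounds.

fn decode(data: &[u8]) -> Option<&[u8]> {
    let (&len, rest) = data.split_first()?;  // read the declared payload length
    rest.get(..len as usize)                 // bounds-checked: None if `len` lies
}

fn main() {
    let good = [3u8, 10, 20, 30];            // claims 3 payload bytes, has 3
    let evil = [200u8, 10, 20, 30];          // claims 200 payload bytes, has 3
    assert_eq!(decode(&good), Some(&[10u8, 20, 30][..]));
    assert_eq!(decode(&evil), None);         // rejected, no out-of-bounds read
    println!("ok");
}
```

The equivalent C code would happily read past the buffer unless every such check is written and maintained by hand; here the malformed file simply fails to decode.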
And because they wanted to push WebP
WebP got added about 15 years ago or so. Chrome (and Firefox) learned the lessons from the problems that caused.
And "push WebP" for that purpose? Google as a whole benefits hugely from reduced image sizes.
Firefox also doesn't implement JXL as I mentioned. Are they trying to "push WebP" too now? This is such conspiratorial nonsense. No evidence for it at all. Doesn't even make any logical sense. Google literally worked (and continues to work) on JXL.
If anything is being pushed these days, it'd be AVIF.
Then why did they develop libjxl, and why are they working on jxl-rs? https://github.com/libjxl/libjxl/blob/main/AUTHORS https://github.com/libjxl/jxl-rs/blob/main/AUTHORS
Maybe their stated reason for not enabling support in Chrome is the actual reason.
...which overall is a pretty mediocre image format.
WebP is two image formats bolted together.
First, there's lossy WebP, based on VP8 video compression. It is better than JPEG, but mediocre by today's standards. Lossy AVIF and lossy JXL greatly outclass lossy WebP.
Second, there's lossless WebP, which is not in any way based on VP8. Lossless WebP is a stellar image format that not only compresses very well but also decompresses very quickly. Its biggest competition is lossless JXL, which usually compresses to a smaller file, but decoding that file is slow enough to be annoying. Sometimes lossless WebP produces a smaller file than lossless JXL.
Yes, you are right that the lossless format is much more notable but also much less common than the lossy one. It is quite an improvement over PNG, which is the only real competitor on the web.
VHS was a pretty mediocre medium for video. It didn’t stop JVC.
https://youtu.be/hWl9Wux7iVY
There's nothing stopping you from using it in your own applications. Just not directly in the browser for now.
It works in Safari and is coming to Firefox.
JPEG XL is alive and well (as is ublock origin as it happens).
I use JPEG XL / JXL all the time; the fact that it was "killed off" is news to me. I also use Firefox and not Chrome, so maybe that has something to do with it. If Google decides to divide the web by being stupid and failing to follow standards, we have very little path to change that, but it certainly does not create any form of consensus or resolute outcome. Google removing JPEG XL from Chrome because they want to force everyone onto a much worse standard they control (WebP) doesn't mean anything about the future of JPEG XL.