I'm sure all distros would consider shipping a patch if one existed.
I wouldn't be so sure of that. Some distros, like openSUSE/SLED and RH/Fed, err on the side of caution when it comes to items that could create legal issues.
Originally Posted by agd5f
NASA likes to release big images from Mars in JPEG2000.
Originally Posted by dopehouse
The trouble is, very little actually supports the alpha transparency feature. In fact, ImageMagick aborts with an assertion failure if you try loading a transparent JPEG 2000 image, and I don't think it's the only program with this issue.
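If you want to check whether your own stack round-trips JP2 alpha, here's a minimal sketch using Pillow (whose JPEG 2000 support comes from OpenJPEG, not the JasPer path ImageMagick uses, so results may differ); the feature check guards against builds compiled without the codec, and the file name is just a placeholder:

```python
from PIL import Image, features

# Build a half-transparent RGBA test image in memory.
img = Image.new("RGBA", (32, 32), (255, 0, 0, 128))

if features.check("jpg_2000"):  # codec may be absent in some Pillow builds
    img.save("alpha_test.jp2")
    restored = Image.open("alpha_test.jp2")
    restored.load()             # force a full decode, not just a header read
    print(restored.mode)        # "RGBA" if the alpha channel survived
else:
    print("Pillow built without JPEG 2000 support")
```

A clean round-trip here only tells you that one library handles it; the complaint above is precisely that other consumers of the same file may not.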
Originally Posted by chithanh
Well, JPEG 2000 isn't really any better quality-wise than JPEG, it's much slower (especially the open source implementations), and it's patent-encumbered. Why would it see widespread adoption?
Originally Posted by b15hop
Because you are wrong. JPEG 2000 is A LOT better than JPEG.
http://en.wikipedia.org/wiki/JPEG_2000#Features *shrug* Seems to be better than JPEG mostly in limited-bandwidth scenarios. If you're free to use as much space as you want for the images, the difference is negligible.
>Compared to the previous JPEG standard, JPEG 2000 delivers a typical compression gain in the range of 20%, depending on the image characteristics. Higher-resolution images tend to benefit more, where JPEG 2000's spatial-redundancy prediction can contribute more to the compression process. In very low-bitrate applications, studies have shown JPEG 2000 to be outperformed by the intra-frame coding mode of H.264. Good applications for JPEG 2000 are large images and images with low-contrast edges, e.g. medical images.
It also takes ten times longer to decompress than normal JPEG, at least with the open-source implementations. I'm not exaggerating - I actually tested this:
Originally Posted by energyman
[aidan@yarrow 4 tmp] 0$ identify test-in.jpg
test-in.jpg JPEG 1920x1200 1920x1200+0+0 8-bit DirectClass 863kb
[aidan@yarrow 4 tmp] 0$ time convert test-in.jpg test-out.pnm
[aidan@yarrow 4 tmp] 0$ time convert test-in.jp2 test-out.pnm
(This is JasPer, since that's what ImageMagick uses. Unfortunately, it looks like the other library, OpenJPEG, is just as slow. OpenJPEG 2.0 may be faster, but it's still in the alpha stage.)
Unless you're downloading a very big image to a very fast PC on a very slow connection, I'm not convinced it's worth it.
Why would image conversion speed have any bearing on whether it should be adopted? Surely the only part that matters is how fast it's rendered.
Unless graphics cards have gained hardware JPEG 2000 support without me noticing, rendering the image requires converting it to an uncompressed format - which is what those commands are testing.
Originally Posted by nanonyme