LZHAM 1.0 Lossless Data Compression Codec Released


  • LZHAM 1.0 Lossless Data Compression Codec Released

    Phoronix: LZHAM 1.0 Lossless Data Compression Codec Released

    Version 1.0 of LZHAM, the lossless data compression codec spearheaded by Rich Geldreich, the former Valve developer involved in their Linux and OpenGL activities, has been released...


  • #2
    Great! Now, hopefully we can get AMD to release Mantle as an open spec, and bring it to Linux.



    • #3
      Originally posted by xeekei View Post
      Great! Now, hopefully we can get AMD to release Mantle as an open spec, and bring it to Linux.
      Are you sure this is the right thread?
      What does a compression algorithm have to do with Mantle?



      • #4
        Originally posted by CrystalGamma View Post
        Are you sure this is the right thread?
        What does a compression algorithm have to do with Mantle?
        The person coding LZHAM is the same guy who was complaining about OpenGL, as the article even mentions. LZHAM could be seen as a step towards improving Linux gaming, and I suggested another step.



        • #5
          Originally posted by xeekei View Post
          The person coding LZHAM is the same guy who was complaining about OpenGL, as the article even mentions. LZHAM could be seen as a step towards improving Linux gaming, and I suggested another step.
          A better use of time would be to make OpenGL Next not suck, since it's an API rewrite anyway.



          • #6
            Originally posted by siavashserver
            I just gave the latest git version a try. Here are my test results on binary game content (geometry vertices, normals, etc.):

            Code:
            CPU : Intel Core2 Quad Q9550 2.8GHz
            RAM : Corsair Dominator Twin Dual Channel DDR2 4GB 1066MHz 
            HDD : Seagate Barracuda 320GB 7200RPM (ST3320613AS)
            
            OS  : Arch Linux 64bit
            
            Original File Size : 10.7MB
            
            Codec | Size  | Ratio | Decomp. time
            XZ    | 4.6MB | 42%   | 0.452s
            LZHAM | 4.9MB | 45%   | 0.143s
            LZ4   | 6.8MB | 63%   | 0.033s
            
            *** Decompression time includes disk I/O + CPU
            Input file was compressed using the highest available compression settings in all tests.


            My humble opinion on LZHAM:

            I see LZHAM as a great compression tool for delivering content such as game assets and packages distributed by Linux distributions; it's friendly to both network traffic and old/weak hardware. However, it isn't (yet) suitable for content (game assets stored on disk) that is going to be loaded from disk and decompressed over and over again. Since disk space is cheap these days, I suggest compressing game content that is going to be stored on disk using LZ4, and recompressing it using LZHAM just for content delivery purposes.
            Was that LZ4 or LZ4HC (aka LZ4 -9 if you use the CLI)?
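
            The two-tier scheme described above (fast codec on disk, strong codec for delivery) can be sketched with Python's standard library. This is a minimal sketch under stated assumptions: zlib at level 1 stands in for a fast LZ4-class codec, and lzma (the XZ/LZMA family) stands in for a strong LZHAM-class delivery codec, since LZ4 and LZHAM themselves need third-party bindings.

            ```python
            import lzma
            import zlib

            # Stand-ins (assumption): zlib level 1 plays the role of a fast
            # codec like LZ4 for on-disk assets; lzma plays the role of a
            # strong codec like LZHAM/XZ for network delivery.
            asset = b"vertex normal uv " * 4096  # fake binary game content

            # On disk: fast codec, cheap to decompress on every load.
            on_disk = zlib.compress(asset, level=1)

            # For delivery: recompress the original with the strong codec.
            delivery = lzma.compress(asset, preset=9 | lzma.PRESET_EXTREME)

            # Both round-trip back to the original asset.
            assert zlib.decompress(on_disk) == asset
            assert lzma.decompress(delivery) == asset

            print(len(asset), len(on_disk), len(delivery))
            ```

            In a real pipeline you would keep only the fast-codec file on disk and ship the strong-codec file over the wire, recompressing on the client after download.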



            • #7
              Originally posted by siavashserver
              Here are the parameters I passed to them:
              Code:
              XZ    : xz -z -k -9 -e
              LZHAM : lzhamtest -m4 -x -e c
              LZ4   : lz4c -9 -BD
              Oh, I feel like these benchmarks of LZHAM's decompression performance are way off compared to what I've seen lately. Going to stick with LZ4, I guess.
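
              A rough re-creation of the benchmark methodology above can be done with Python's stdlib codecs standing in for the CLI tools (an assumption: zlib as the LZ4-class codec, lzma as the XZ/LZHAM-class codec; `xz -9 -e` corresponds to lzma preset 9 plus the extreme modifier). Unlike the table above, the timings here exclude disk I/O.

              ```python
              import lzma
              import time
              import zlib

              # ~1 MB of synthetic "asset" data (highly compressible).
              data = bytes(range(256)) * 4096

              for name, comp, decomp in [
                  ("zlib", lambda d: zlib.compress(d, 9), zlib.decompress),
                  ("lzma", lambda d: lzma.compress(d, preset=9 | lzma.PRESET_EXTREME),
                   lzma.decompress),
              ]:
                  # Compress at the highest setting, then time decompression only.
                  packed = comp(data)
                  t0 = time.perf_counter()
                  out = decomp(packed)
                  dt = time.perf_counter() - t0
                  assert out == data
                  print(f"{name}: {len(packed)} bytes, {dt:.4f}s decompress")
              ```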



              • #8
                Originally posted by siavashserver
                I see LZHAM as a great compression tool for delivering content such as game assets and packages distributed by Linux distributions; it's friendly to both network traffic and old/weak hardware. However, it isn't (yet) suitable for content (game assets stored on disk) that is going to be loaded from disk and decompressed over and over again. Since disk space is cheap these days, I suggest compressing game content that is going to be stored on disk using LZ4, and recompressing it using LZHAM just for content delivery purposes.
                On the other hand, the higher compression saves I/O time. Whether that's worth it depends on the balance of disk vs. CPU speed, of course, but it's not a given that you'll save much time by moving to weaker, faster compression.
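
                That trade-off can be made concrete with back-of-the-envelope numbers. All throughput and ratio figures below are illustrative assumptions, not measurements: total load time is modeled as read time for the compressed file plus decompression time for the original size.

                ```python
                # Illustrative assumptions: a 100 MB/s HDD, a fast codec that
                # compresses to 63%, and a strong codec that compresses to 45%
                # but decompresses more slowly (ratios echo the table above).
                DISK_MB_S = 100.0

                def load_time(size_mb, ratio, decomp_mb_s):
                    """Seconds to read the compressed file and decompress it."""
                    compressed = size_mb * ratio
                    return compressed / DISK_MB_S + size_mb / decomp_mb_s

                original_mb = 10.7  # file size from the benchmark above
                fast = load_time(original_mb, 0.63, 2000.0)   # LZ4-class
                strong = load_time(original_mb, 0.45, 150.0)  # LZHAM-class

                print(f"fast codec:   {fast:.3f}s")
                print(f"strong codec: {strong:.3f}s")
                ```

                With these numbers the fast codec wins, but lower the disk throughput (or raise the strong codec's decompression speed) and the balance tips the other way, which is the point being made here.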



                • #9
                  Originally posted by zanny View Post
                  A better use of time would be to make OpenGL Next not suck, since it's an API rewrite anyway.
                  Khronos said that OpenGL Next is a long way out, and that it probably won't be quite as low-level as Mantle. As long as Mantle gets released as an open spec and the HLSL shader dependence gets removed, it could easily be adopted as the new Linux graphics API.



                  • #10
                    The LZ4 dev is also working on a new compression algorithm, Zstandard, that sits somewhere between LZMA and LZ4 in the speed/ratio trade-off, although it is in "Early Access" state right now.

                    Zstandard - Fast real-time compression algorithm. Contribute to Cyan4973/zstd development by creating an account on GitHub.
