
Thread: Khronos ASTC: Royalty-Free Next-Gen Texture Compression

  1. #21
    Join Date
    Oct 2008
    Posts
    2,911

    Default

    Quote Originally Posted by MaxToTheMax View Post
    Based on what I know of the two algorithms, ASTC is totally different, except for the basic similarity of both being block-based. They have basically zero chance of being binary compatible. Transcoding to S3TC on the fly is doable, but you lose the image quality benefits of ASTC and introduce a dependency on the S3TC patents as well.
    Yeah, my understanding is that the S3TC patents are so generic that transcoding in that manner would still infringe them - which is why a simple solution like that hasn't already been added to Mesa.

    However, ASTC is supposed to cover the same use cases that S3TC is good for, so hopefully it will fully replace it soon enough. The S3TC patents should expire around 2018-19 or so, which is probably around when ASTC will be in common use anyway.

  2. #22
    Join Date
    Jul 2009
    Posts
    221

    Default

    so now that S3TC has an official replacement, maybe developers would be allowed to at least use S2TC by default.

    and wtf is "below 1bit"??? always on or always off???

  3. #23
    Join Date
    Jan 2007
    Posts
    10

    Default

    Quote Originally Posted by jakubo View Post
    and wtf is "below 1bit"??? always on or always off???
    These compression algorithms work on large blocks of pixels. ASTC can, for example, encode a 12x12 block in 128 bits: that's 144 pixels in 128 bits, or ~0.89 bits per pixel. I doubt the quality will be acceptable for the majority of content at that bitrate, though.

    ASTC is interesting in that every block is 128 bits, but the block's pixel dimensions can vary, giving different effective bitrates and allowing a fair bit of fine-tuning based on the content being compressed.
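    The arithmetic above generalizes to the other block sizes. A quick sketch (the footprints listed are a subset of the 2D block sizes the ASTC specification defines):

    ```python
    # Every ASTC block occupies 128 bits regardless of its pixel footprint,
    # so the effective bitrate is simply 128 / (width * height).
    ASTC_BLOCK_BITS = 128

    # A few of the 2D block footprints ASTC supports.
    footprints = [(4, 4), (6, 6), (8, 8), (10, 10), (12, 12)]

    for w, h in footprints:
        bpp = ASTC_BLOCK_BITS / (w * h)
        print(f"{w}x{h}: {bpp:.2f} bits/pixel")
    # 4x4 gives 8.00 bits/pixel; 12x12 gives the ~0.89 figure discussed above.
    ```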

  4. #24
    Join Date
    Aug 2012
    Posts
    2

    Wink

    ARM has released an eval ASTC coder/decoder! Joy!

    http://www.malideveloper.com/develop...compressor.php

  5. #25
    Join Date
    Sep 2008
    Posts
    112

    Default

    I have a question about how games approach this:
    Do they compress the textures beforehand, or compress them live (when you load the game)?
    The latter might sound silly, but the main point is to save video memory, right? If so, then you would be able to choose the compression method.
    (Although for a mobile platform, hard disk space might be a bit of an issue as well)


    Also, I'm tempted to write an export/import plugin for ASTC in GIMP.
    Last edited by Micket; 08-07-2012 at 10:04 AM.

  6. #26
    Join Date
    Sep 2009
    Posts
    116

    Default

    ASTC for GIMP would be awesome!

    Both approaches are often used. I believe on desktop systems it's most common to have the graphics driver encode it for you on the fly, since it makes the data files of the game easier to work with.

  7. #27
    Join Date
    Sep 2007
    Location
    Connecticut,USA
    Posts
    941

    Default

    Quote Originally Posted by MaxToTheMax View Post
    Based on what I know of the two algorithms, ASTC is totally different, except for the basic similarity of both being block-based. They have basically zero chance of being binary compatible. Transcoding to S3TC on the fly is doable, but you lose the image quality benefits of ASTC and introduce a dependency on the S3TC patents as well.
    Ahh, so most likely games (and drivers) will have to add native ASTC support, such as detecting which compression libraries are installed and prompting the user to select one during initial game configuration.

  8. #28
    Join Date
    Sep 2009
    Posts
    116

    Default

    Yeah, something like that.

  9. #29
    Join Date
    Sep 2008
    Posts
    112

    Default

    So. The license for the encoder/decoder released by ARM was nonfree, so I suppose one has to start by doing a free implementation.
    How tedious.

  10. #30
    Join Date
    Jan 2007
    Posts
    10

    Default

    Quote Originally Posted by Micket View Post
    I have a question about how games approach this;
    Do they compress the texture beforehand, or compress it live (when you load the game)?
    The /vast/ majority of (mobile, at least) games compress textures beforehand, so the artists can decide which compression methods/modes are used and verify they all look acceptable for their purpose.

    There is some interesting work on applying extra compression to an already compressed texture, such as Crunch (http://code.google.com/p/crunch/). (The compressed texture formats a graphics card uses must allow each block to be decompressed independently, so they can't exploit repetition or patterns between blocks.) I also believe Civ5 uses CUDA to decompress something like JPEG into DXT on the graphics card, since the bottleneck is moving the data from CPU memory to graphics card memory. On mobile this is a bit different, as the CPU and GPU tend to share memory and bandwidth.
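    To put numbers on why that upload bandwidth matters: DXT1/BC1 stores each independent 4x4 block in 8 bytes (these are the standard format parameters), versus 4 bytes per pixel for uncompressed RGBA8. A rough sketch for a single 2048x2048 texture:

    ```python
    # DXT1/BC1: each independent 4x4 block fits in 8 bytes (0.5 bytes/pixel).
    # Block independence is what lets the GPU fetch any texel without
    # decompressing its neighbours - and why cross-block redundancy is left
    # for a second compression pass like Crunch.
    def dxt1_bytes(width, height):
        blocks = (width // 4) * (height // 4)
        return blocks * 8

    w = h = 2048
    raw = w * h * 4            # uncompressed RGBA8, 4 bytes/pixel
    dxt = dxt1_bytes(w, h)
    print(f"RGBA8: {raw / 2**20:.0f} MiB, DXT1: {dxt / 2**20:.0f} MiB")
    # An 8x reduction in what has to cross the bus to the GPU.
    ```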
