I think he was pretty clear the issue only appears WITH texture compression, not without it. He also understands that Red Hat has no interest in fixing texture compression, which is why he's looking into the code in the first place.
Originally Posted by glisse
I have to agree with glisse here; it seems simple enough: whatever the problem is that you are referring to, provide a way to demonstrate it and tell them.
Originally Posted by LiquidAcid
It seems you are a dev and so can provide a simple GL test program as requested; if you're not a dev and are looking for support, then at least describe the exact steps you took to reproduce the problem you want fixed.
Without that, you're just going around in circles and wasting time.
Hmm, I'm not so sure about that. There's already some code related to S3TC in r600g, and afaik that was written by Red Hat employee Dave Airlie. Reading the bug report linked to by LiquidAcid (which btw was filed by a VMware employee), VMware also requires the drivers to have S3TC support, so it's not only useful for games.
Originally Posted by smitty3268
I don't think he's attempting to write an algorithm to actually do the compressing, something that would be patent-encumbered. Instead, he's just trying to get the driver to pass pre-compressed textures to the hardware, which can natively handle them without any special algorithms on the software side. You just need to tell the hardware which format the texture is in, and it should work automatically.
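To make that concrete, here's a small sketch of the arithmetic involved. S3TC/DXT formats store 4x4 texel blocks (8 bytes per block for DXT1, 16 for DXT3/DXT5, per the EXT_texture_compression_s3tc spec), so the driver only needs to know the compressed size of each image it hands to the hardware. The helper name below is my own; the commented-out GL call is the standard upload path for pre-compressed data.

```c
#include <assert.h>
#include <stddef.h>

/* Compressed byte size of one S3TC image. block_bytes is 8 for DXT1
 * and 16 for DXT3/DXT5 (per EXT_texture_compression_s3tc). */
static size_t s3tc_level_size(size_t width, size_t height, size_t block_bytes)
{
    /* Each dimension is rounded up to a whole number of 4-texel blocks;
     * even a 1x1 image still occupies one full block. */
    size_t blocks_w = (width + 3) / 4;
    size_t blocks_h = (height + 3) / 4;
    return blocks_w * blocks_h * block_bytes;
}

/* An application with pre-compressed data then uploads one level at a
 * time, with no software (de)compression anywhere in the stack, e.g.:
 *
 *   glCompressedTexImage2D(GL_TEXTURE_2D, level,
 *                          GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
 *                          w, h, 0,
 *                          s3tc_level_size(w, h, 8), data);
 *
 * The hardware samples the DXT blocks directly; the driver's job is
 * just to program the right format and validate the buffer sizes. */
```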
Originally Posted by popper
Well, it would be good if that's true. I'm just judging by the fact that it hasn't been done yet while they are working on a lot of other features. Plus the responses in this very thread by Glisse didn't sound like they were very interested. If they are planning to add S3TC support, it clearly isn't very high on their priority list, but perhaps it will happen eventually. I'd still rather not count on it, though, so if a community member can add the support then so much the better.
Originally Posted by monraaf
Actually I would be very much interested in some pointers on where to start. I'm currently looking at the code of the r300 CS checker and trying to understand how the compressed formats are handled there. It seems natural to me to first get the DRM to accept these formats (instead of just disabling all texture checks) before moving into gallium territory.
And yes, I'm just looking at the hardware support for pre-compressed textures. I don't think libtxc_dxtn needs to be reimplemented, since it already does its work well.
@smitty3268: Like monraaf pointed out, there is already code in r600g to handle texture compression. It's just disabled by default (like tiling), and even when forcibly enabled it clashes with the CS checker. The checker can, however, be disabled for these specific tests. Since I wanted to know what already worked, I disabled the CS texture check. It turns out that base levels are rendered correctly, but mip levels are not. And that's all I explained in the bug report and here. Nothing more, nothing less.
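The base-level-works-but-mips-don't symptom is plausible, because mip levels of a compressed texture are where the block rounding gets awkward: levels smaller than 4x4 still occupy a whole 4x4 block, which is exactly the kind of edge case a CS checker or driver layout calculation can get wrong. A hypothetical sketch of the per-level DXT1 sizes (the function name is mine, not from the driver):

```c
#include <assert.h>
#include <stddef.h>

/* Walk a texture's mip chain and record the byte size of each DXT1
 * level (8 bytes per 4x4 block). Returns the number of levels.
 * Note that the tail levels (2x2, 1x1) still take one full block. */
static int dxt1_mip_sizes(size_t width, size_t height,
                          size_t *sizes, int max_levels)
{
    int level = 0;
    while (level < max_levels) {
        sizes[level++] = ((width + 3) / 4) * ((height + 3) / 4) * 8;
        if (width == 1 && height == 1)
            break;
        /* Standard mip halving, clamped at 1 texel. */
        width  = width  > 1 ? width  / 2 : 1;
        height = height > 1 ? height / 2 : 1;
    }
    return level;
}
```

For a 16x16 DXT1 texture this gives five levels of 128, 32, 8, 8 and 8 bytes; any size or offset check that forgets the block rounding on the small levels would reject or misplace exactly the mip data while leaving the base level intact.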