
Thread: Gallium3D Now In Mainline Mesa Code-Base!

  1. #31 bridgman (Join Date: Oct 2007 | Location: Toronto-ish | Posts: 7,545)

    Believe it or not, I do understand your point.

    I have one too -- that none of the really demanding Windows games run usefully on Linux today, and that by the time they do, you will probably like fglrx a lot better.

  2. #32 (Join Date: Jan 2009 | Location: Columbus, OH, USA | Posts: 323)

    bridgman, you mentioned that a good GLSL compiler is one major missing component. What are your thoughts on LLVM and its potential to shore up this weakness? Could it be done? At speed?

  3. #33 bridgman (Join Date: Oct 2007 | Location: Toronto-ish | Posts: 7,545)

    I think LLVM will be a great compiler in terms of being versatile and predictable. My only concern is not a criticism of LLVM per se, just an observation that most of the optimizing happens before instructions are emitted, which is fine for a normal CPU but not so fine for a superscalar machine.

    In the case of 6xx/7xx we use a 5-way superscalar instruction word (up to 5 independent operations per VLIW instruction), and I don't think LLVM's built-in optimizations will be able to optimize for that. This is not a showstopper; it just means that either the code generator for 6xx/7xx will need to have some hard-coded optimization, or someone will need to figure out how to make the VLIW slot-packing constraints visible to the standard LLVM optimizers.

    In the meantime, we'll probably average something like 60% utilization of the superscalar hardware. The proprietary driver uses a very clever shader compiler which is able to achieve much higher utilization numbers.
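
    To make the packing problem and the utilization arithmetic concrete, here is a minimal sketch of the kind of greedy bundling a 6xx/7xx code generator has to do. This is hypothetical illustration code, not anything from Mesa or LLVM: independent operations can share a 5-slot VLIW word, while an operation that reads a result produced in the current word has to wait for the next one.

    Code:
    # Hypothetical sketch of greedy packing for a 5-way VLIW machine.
    # Each op is (dest, srcs); an op joins the current 5-slot bundle
    # unless it reads a register written earlier in that same bundle.

    SLOTS = 5

    def pack_bundles(ops):
        bundles, current, written = [], [], set()
        for dest, srcs in ops:
            if any(s in written for s in srcs) or len(current) == SLOTS:
                bundles.append(current)          # close the current word
                current, written = [], set()
            current.append((dest, srcs))
            written.add(dest)
        if current:
            bundles.append(current)
        return bundles

    # A chain of dependent ops packs terribly; independent ops pack fully.
    chain = [("r1", ["r0"]), ("r2", ["r1"]), ("r3", ["r2"])]
    indep = [("r%d" % i, ["c0"]) for i in range(5)]
    print(len(pack_bundles(chain)))  # 3 words for 3 ops -> 20% utilization
    print(len(pack_bundles(indep)))  # 1 word for 5 ops -> 100% utilization

    Real shader code sits between those extremes; filling roughly three of the five slots on average is what the ~60% figure above corresponds to.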

    Big picture, my guess is that for a while all the drivers will use LLVM for software rendering (e.g. running vertex shaders on our pre-7xx IGP parts, which don't have vertex shader hardware) and hard-code shader opcode emission for vertex and pixel shaders on the GPU rather than using LLVM. Not sure about this, but it's my impression.
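
    As a rough illustration of that split, a driver's shader-compilation dispatch might look like the sketch below. Every name here is invented for illustration; none of this is Mesa's actual API, and the real decision involves far more than one flag.

    Code:
    from dataclasses import dataclass

    # Hypothetical compile-path split: LLVM JIT for shaders that must
    # run in software on the CPU, a hand-written opcode emitter for
    # shaders that run on the GPU.  All names are made up.

    @dataclass
    class Shader:
        stage: str   # "vertex" or "pixel"
        ir: list     # driver-internal IR; contents don't matter here

    @dataclass
    class Gpu:
        has_vertex_hw: bool

    def llvm_jit_compile(ir):
        return ("cpu-code", ir)   # stand-in for an LLVM JIT pipeline

    def emit_gpu_opcodes(ir):
        return ("gpu-code", ir)   # stand-in for a hard-coded VLIW emitter

    def compile_shader(shader, gpu):
        if shader.stage == "vertex" and not gpu.has_vertex_hw:
            # e.g. pre-7xx IGP parts with no vertex shader hardware
            return llvm_jit_compile(shader.ir)
        return emit_gpu_opcodes(shader.ir)

    print(compile_shader(Shader("vertex", []), Gpu(has_vertex_hw=False)))
    print(compile_shader(Shader("pixel", []), Gpu(has_vertex_hw=True)))
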
    Last edited by bridgman; 02-12-2009 at 11:19 AM.

  4. #34 BlackStar (Join Date: Oct 2007 | Location: Under the bridge | Posts: 2,149)

    Quote Originally Posted by RealNC:
        I don't think he's testing Oblivion and Crysis and stuff
    It makes sense, too. The bottleneck with these games is not your GPU but Wine's D3D emulation layer. Quoting from the WineHQ entry for Crysis:
        What does not work:
        • The sound. Had to disable it to get past the EA logo.
        • The FPS. I get something like ~1 FPS, even though the game runs perfectly on Windows.
    Edit: Late reply, by about a whole page of posts...
    Last edited by BlackStar; 02-12-2009 at 08:17 AM.
