
Thread: radeon with DRI2 slower?

  1. #21 - bridgman
    Join Date: Oct 2007 | Location: Toronto-ish | Posts: 7,458

    Actually, I just noticed that the site has been updated very recently and includes numbers for 5xx and 6xx hardware as well. There is a FAQ that states pretty clearly that they understand glxgears is not a good benchmark, so that's a start.

    Oh well... that's the curse of the internet. Anyone who makes the effort to put up a big collection of useful information ends up getting abuse a couple of years later when the world has changed but their information has become the canonical reference for anyone searching for answers. Retesting everything would be a big task, even with glxgears, but maybe one or two lines at the start of the page might be a good compromise.

    EDIT - I guess in the meantime we could tweak glxgears to add an option to make it at least vaguely useful as a benchmark, by drawing the gears 50 times between calls to glXSwapBuffers or something. It would still suck (if only because every draw would have the same Z values) but would definitely suck less.
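    Something along these lines, as a rough sketch (hypothetical code, not a patch from the real glxgears: DRAWS_PER_SWAP and draw_frame() are made-up names, and dpy/win stand for the display and window that glxgears' main() already sets up):

    Code:
    #define DRAWS_PER_SWAP 50   /* hypothetical repeat count */
    
    static void
    draw_frame(void)
    {
       int i;
    
       glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    
       /* Draw the gears many times per frame so the clear/swap overhead
        * stops dominating what the fps counter measures. */
       for (i = 0; i < DRAWS_PER_SWAP; i++)
          draw();   /* assumed to only render the gears, not clear or swap */
    
       glXSwapBuffers(dpy, win);
    }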
    Last edited by bridgman; 09-20-2009 at 11:47 AM.

  2. #22 - nanonyme
    Join Date: Aug 2008 | Location: Finland | Posts: 1,606

    Quote Originally Posted by bridgman:
    EDIT - I guess in the meantime we could tweak glxgears to add an option to make it at least vaguely useful as a benchmark, by drawing the gears 50 times between calls to glXSwapBuffers or something. It would still suck (if only because every draw would have the same Z values) but would definitely suck less.
    Or we could just disable the fps counter again and have it output useful stuff like the OpenGL renderer instead (like most of the other demos do).

  3. #23 - bridgman
    Join Date: Oct 2007 | Location: Toronto-ish | Posts: 7,458

    Quote Originally Posted by nanonyme:
    Or we could just disable the fps counter again and have it output useful stuff like the OpenGL renderer instead (like most of the other demos do).
    I think there's an option for that already.

  4. #24 - nanonyme
    Join Date: Aug 2008 | Location: Finland | Posts: 1,606

    Code:
    #define BENCHMARK
    
    #ifdef BENCHMARK
    
    /* XXX this probably isn't very portable */
    ...
    You mean this?
    PS. I honestly don't know why that code is enabled instead of removed, since the authors know full well that it's platform-dependent and useless.
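    (For reference, the non-portable part being flagged is the timing helper under that #ifdef; roughly, and from memory, so the real glxgears.c may differ in the details, it boils down to:)

    Code:
    #include <sys/time.h>   /* gettimeofday() is POSIX-only, hence the XXX comment */
    #include <unistd.h>
    
    /* return current time (in seconds) */
    static double
    current_time(void)
    {
       struct timeval tv;
       struct timezone tz;
    
       (void) gettimeofday(&tv, &tz);
       return (double) tv.tv_sec + tv.tv_usec / 1000000.0;
    }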
    Last edited by nanonyme; 09-20-2009 at 11:54 AM.

  5. #25 - bridgman
    Join Date: Oct 2007 | Location: Toronto-ish | Posts: 7,458

    I was thinking of the "-info" option:

    Code:
       if (printInfo) {
          printf("GL_RENDERER   = %s\n", (char *) glGetString(GL_RENDERER));
          printf("GL_VERSION    = %s\n", (char *) glGetString(GL_VERSION));
          printf("GL_VENDOR     = %s\n", (char *) glGetString(GL_VENDOR));
          printf("GL_EXTENSIONS = %s\n", (char *) glGetString(GL_EXTENSIONS));
       }
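    (That block runs when glxgears is started as "glxgears -info"; the strings get printed once at startup, before the fps lines.)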

  6. #26 - nanonyme
    Join Date: Aug 2008 | Location: Finland | Posts: 1,606

    Ah, right. Well, that looks almost fine to me, except I'm not sure it'd make sense to output the extensions by default; that takes quite a lot of space.
    Just out of interest I decided to try what would happen if you removed the benchmark trigger. Apparently the whole of glxgears breaks down...
    Apparently the thing that should be used for this kind of stuff in any case is gears, not glxgears. glxgears contains unportable code for calculating fps (and this might not be possible to fix), whereas gears uses GLUT for gathering the necessary information.
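    To illustrate the difference (a sketch only, with made-up names; the real gears.c will look different): with GLUT the fps bookkeeping can use glutGet(GLUT_ELAPSED_TIME), which is portable, instead of gettimeofday():

    Code:
    #include <stdio.h>
    #include <GL/glut.h>
    
    static int frames = 0;
    static int t0 = 0;
    
    /* Call once per displayed frame; prints an fps line roughly every 5 seconds. */
    static void
    count_frame(void)
    {
       int t = glutGet(GLUT_ELAPSED_TIME);   /* milliseconds since glutInit() */
    
       frames++;
       if (t - t0 >= 5000) {
          double seconds = (t - t0) / 1000.0;
          printf("%d frames in %.1f seconds = %.3f FPS\n",
                 frames, seconds, frames / seconds);
          frames = 0;
          t0 = t;
       }
    }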

  7. #27
    Join Date: Jan 2009 | Posts: 117

    Quote Originally Posted by nanonyme:
    Ah, right. Well, that looks almost fine to me, except I'm not sure it'd make sense to output the extensions by default; that takes quite a lot of space.
    Just out of interest I decided to try what would happen if you removed the benchmark trigger. Apparently the whole of glxgears breaks down...
    Apparently the thing that should be used for this kind of stuff in any case is gears, not glxgears. glxgears contains unportable code for calculating fps (and this might not be possible to fix), whereas gears uses GLUT for gathering the necessary information.
    Let's just start asking distributions to ship gears, and instead use tunnel or engine. They at least demand a bit more rendering power from older hardware. They're still useless for benchmarking, but a bit better.

  8. #28 - kernelOfTruth
    Join Date: Jan 2009 | Location: Vienna, Austria; Germany; hello world :) | Posts: 638

    Quote Originally Posted by amphigory:
    kernelOfTruth, you are one lucky guy to be in Vienna. I'd give my eye teeth to live there... even though I cannot abide Sachertorte.

    Off-Topic:

    it's probably only half as great when you're living here - not being here as a tourist - but I'm still loving it

    there's really a lot to discover in this city

    I think we'd find something other than the Sachertorte that you'd like

    from what I've seen so far, Seattle is also a pretty nice city
    On-Topic:

    last time I tried KMS it was significantly slower than non-KMS and still pretty unstable

    reading the latest topics it seems to have stabilized significantly,

    now I'll have to wait until 2.6.32 gets ready (rc6+) and fglrx gains support for 2.6.32, so that I can switch between the two in case KMS isn't stable/fast enough yet

    great work guys!
