
Thread: The May 2012 Open-Source Radeon Graphics Showdown

  1. #31
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411

    Default

    Quote Originally Posted by droste View Post
    Problem:
    you want to enable (2D) color tiling -> you need a xorg.conf (or xorg.conf.d)

    Fact:
    you don't use something -> it's obsolete

    Solution:
    you don't use (2D) color tiling -> it's obsolete
    (2D) color tiling is obsolete -> you don't need a xorg.conf (or xorg.conf.d)

    problem solved
    LOL, very funny... someone should write a nice GUI to turn on stuff like PCIe 2.0 or color tiling.

    Or even better, a white list to turn the stuff on automatically...

    Yes, I know, no one has time for usability stuff...

  2. #32
    Join Date
    Apr 2010
    Posts
    1,946

    Default

    Quote Originally Posted by Qaridarium View Post
    LOL, very funny... someone should write a nice GUI to turn on stuff like PCIe 2.0 or color tiling.

    Or even better, a white list to turn the stuff on automatically...

    Yes, I know, no one has time for usability stuff...
    I thought we've had "driconf" for years already.

  3. #33
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411

    Default

    Quote Originally Posted by crazycheese View Post
    I thought we've had "driconf" for years already.
    Nice try. I checked it, and there is no option for color tiling or PCIe 2.0.

    So now we get a GUI without any options that are actually helpful if you want to improve performance.
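    That is probably by design: driconf only edits Mesa's per-application options in ~/.drirc, while color tiling is an xf86-video-ati option and PCIe 2.0 is a radeon kernel module parameter, so neither can appear there. For what driconf does cover, a minimal ~/.drirc sketch (the driver attribute is an assumption and may need to match your setup) looks like:

    Code:
    <driconf>
        <device screen="0" driver="dri2">
            <application name="Default">
                <!-- Mesa option: 0 = never sync to vblank -->
                <option name="vblank_mode" value="0" />
            </application>
        </device>
    </driconf>
    The same option can also be set per run via the environment, e.g. vblank_mode=0 before the game binary.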

  4. #34
    Join Date
    Jul 2010
    Posts
    499

    Default

    Quote Originally Posted by Pontostroy View Post
    Only etqw-demo has a problem; the other games and tests run fine.

    I've been building a live CD with the latest mesa-git, xf86-drivers-git, and kernel-git every 14-20 days for over a year (the latest build includes mesa-git 20120518). I tweaked Mesa, the kernel, and the drivers for maximum performance and ran some tests, and I have never seen Lightsmark show less than 100 FPS. How Michael got 63 FPS, I do not know.
    The-May-2012-Open-Source-Radeon-Graphics-Showdown radeon-tweaked does not show the real data, certainly not for the HD 6770.
    I wasn't able to download your live CD; the server wasn't very responsive. I did a test run with Ubuntu 12.04 and xorg-edgers, and I am getting the same results as Michael.
    http://openbenchmarking.org/result/1...AR-EDGERSLM817

    It would be interesting to figure out how you got over 100 FPS.

  5. #35
    Join Date
    Jul 2010
    Posts
    499

    Default

    Just captured a GL trace (vdrift) under Ubuntu 12.04/xorg-edgers and Win7/Catalyst 12.4, and retraced it with glretrace -p.

    It looks like the GL calls might have a "somewhat" lower overhead on Linux. Buffer swapping and flushing (where all the exciting stuff happens, I guess) is what's killing the Mesa driver.
    Code:
    windows
    3960396 [0 usec] glEnableClientState(array = GL_VERTEX_ARRAY)
    3960397 [0 usec] glEnableClientState(array = GL_TEXTURE_COORD_ARRAY)
    3960398 [0.392849 usec] glTexCoordPointer(size = 2, type = GL_FLOAT, stride = 0, pointer = blob(224))
    3960399 [0 usec] glVertexPointer(size = 3, type = GL_FLOAT, stride = 0, pointer = blob(336))
    3960400 [0.785698 usec] glDrawElements(mode = GL_TRIANGLES, count = 42, type = GL_UNSIGNED_INT, indices = blob(168))
    3960401 [0.392849 usec] glDisableClientState(array = GL_TEXTURE_COORD_ARRAY)
    3960402 [0.392849 usec] glDisableClientState(array = GL_NORMAL_ARRAY)
    3960403 [0.392849 usec] glDisableClientState(array = GL_VERTEX_ARRAY)
    3960404 [15.714 usec] glActiveTexture(texture = GL_TEXTURE0)
    3960405 [1.5714 usec] glBindTexture(target = GL_TEXTURE_2D, texture = 12)
    3960406 [0 usec] glEnableClientState(array = GL_VERTEX_ARRAY)
    3960407 [0.392849 usec] glEnableClientState(array = GL_TEXTURE_COORD_ARRAY)
    3960408 [0.392849 usec] glTexCoordPointer(size = 2, type = GL_FLOAT, stride = 0, pointer = blob(32))
    3960409 [0 usec] glVertexPointer(size = 3, type = GL_FLOAT, stride = 0, pointer = blob(48))
    3960410 [1.96425 usec] glDrawElements(mode = GL_TRIANGLES, count = 6, type = GL_UNSIGNED_INT, indices = blob(24))
    
    3960634 [23.5709 usec] wglSwapBuffers(hdc = 0x7c010395) = true
    
    linux
    1505427 [0.112 usec] glEnableClientState(array = GL_VERTEX_ARRAY)
    1505428 [0.109 usec] glEnableClientState(array = GL_TEXTURE_COORD_ARRAY)
    1505429 [0.305 usec] glTexCoordPointer(size = 2, type = GL_FLOAT, stride = 0, pointer = blob(224))
    1505430 [0.206 usec] glVertexPointer(size = 3, type = GL_FLOAT, stride = 0, pointer = blob(336))
    1505431 [3.972 usec] glDrawElements(mode = GL_TRIANGLES, count = 42, type = GL_UNSIGNED_INT, indices = blob(168))
    1505432 [0.184 usec] glDisableClientState(array = GL_TEXTURE_COORD_ARRAY)
    1505433 [0.103 usec] glDisableClientState(array = GL_NORMAL_ARRAY)
    1505434 [0.122 usec] glDisableClientState(array = GL_VERTEX_ARRAY)
    1505435 [0.138 usec] glActiveTexture(texture = GL_TEXTURE0)
    1505436 [0.445 usec] glBindTexture(target = GL_TEXTURE_2D, texture = 12)
    1505437 [0.144 usec] glEnableClientState(array = GL_VERTEX_ARRAY)
    1505438 [0.133 usec] glEnableClientState(array = GL_TEXTURE_COORD_ARRAY)
    1505439 [0.287 usec] glTexCoordPointer(size = 2, type = GL_FLOAT, stride = 0, pointer = blob(32))
    1505440 [0.209 usec] glVertexPointer(size = 3, type = GL_FLOAT, stride = 0, pointer = blob(48))
    1505441 [6.085 usec] glDrawElements(mode = GL_TRIANGLES, count = 6, type = GL_UNSIGNED_INT, indices = blob(24))
    
    1505677 [432.663 usec] glXSwapBuffers(dpy = 0x22e08b0, drawable = 67108879)
    24 vs. 433 usec: that's almost a factor of 20 in this snippet...

    Another thing I've learned is that trying to profile from the user side is completely pointless.
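    If you want to aggregate those per-call timings instead of eyeballing them, a small sketch (assuming the `<call-no> [<time> usec] <function>(...)` line format glretrace printed above) that sums the time spent per GL entry point:

    ```python
    import re
    from collections import defaultdict

    # Matches glretrace -p style lines: "<call-no> [<time> usec] <function>(..."
    LINE_RE = re.compile(r"^\d+\s+\[([0-9.]+)\s+usec\]\s+(\w+)\(")

    def sum_per_call(trace_text):
        """Return total microseconds spent in each GL entry point."""
        totals = defaultdict(float)
        for line in trace_text.splitlines():
            m = LINE_RE.match(line.strip())
            if m:
                totals[m.group(2)] += float(m.group(1))
        return dict(totals)

    # A few lines copied from the Linux trace above:
    snippet = """
    1505431 [3.972 usec] glDrawElements(mode = GL_TRIANGLES, count = 42, type = GL_UNSIGNED_INT, indices = blob(168))
    1505441 [6.085 usec] glDrawElements(mode = GL_TRIANGLES, count = 6, type = GL_UNSIGNED_INT, indices = blob(24))
    1505677 [432.663 usec] glXSwapBuffers(dpy = 0x22e08b0, drawable = 67108879)
    """
    totals = sum_per_call(snippet)
    print(totals)  # glXSwapBuffers dominates the total
    ```

    Sorting the resulting dict by value makes the buffer-swap overhead obvious even in a full multi-megabyte trace.
    
    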

  6. #36
    Join Date
    Dec 2008
    Location
    Vermont
    Posts
    103

    Default

    Quote Originally Posted by Qaridarium View Post
    It's obsolete because on my system there isn't an /etc/X11/xorg.conf at all, and there is also no xorg.conf.d,

    because it's all "auto-detected".

    What you really want to say is something like this: you have to open up a file if you want this option.
    Actually, it has been done the right way. Everything "Just Works (TM)", and there are no messy, ugly, incomprehensible configuration files that require tampering. On the other hand, if "Just Works (TM)" fails, or if you want something more than the default, that good old-fashioned text file will still be respected if you create it. You can still fix the system if it's failing to start correctly. You can still enhance it if you want more out of your system than the default. Plus, if you try and mess it up, you can still repair things in text mode, which remains available after the boot-to-GUI fails.

    PCIe 2.0 mode is enabled when you load the radeon module. For Gentoo that is set in /etc/conf.d/modules, and for Red Hat 6.2 in a file under /etc/modprobe.d/. I don't have the exact parameter name handy, because that system is at home, powered down.
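    On the modprobe.d side, a sketch of such a file would look like the following (the file name is arbitrary, and pcie_gen2 is the parameter name that turns up elsewhere in this thread; confirm it exists for your kernel before relying on it):

    Code:
    # /etc/modprobe.d/radeon.conf -- any *.conf file in this directory works
    # Verify the parameter name with: modinfo radeon | grep parm
    options radeon pcie_gen2=1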

    I tried to turn on 2D color tiling by creating an /etc/X11/xorg.conf file with a simple Device stanza. X11 told me it was an invalid option, so either I got the name wrong, or my particular driver revision (xf86-video-ati-6.14.3) doesn't have it.

    If anyone knows the correct option name to use for 2D color tiling, I'd appreciate finding out what it is.

  7. #37
    Join Date
    Jul 2010
    Posts
    499

    Default

    Quote Originally Posted by phred14 View Post
    If anyone knows the correct option name to use for 2D color tiling, I'd appreciate finding out what it is.
    I've got it from the test logs: http://openbenchmarking.org/system/1...aked/xorg.conf
    Code:
    Section "Device"
    	Identifier  "Card0"
    	Driver      "radeon"
    	Option "SwapbuffersWait" "0"
    	Option "ColorTiling" "1"
    	Option "ColorTiling2D" "1"
    EndSection
    It was only recognized with Mesa 8.1, though.

  8. #38
    Join Date
    Dec 2008
    Location
    Vermont
    Posts
    103

    Default

    I'll try to give this a shot tonight. Well, first I'll check which Mesa I've got installed, and maybe I'll go ahead and install 8.1; I believe I'm at 8.0.1. A quick look at an available Gentoo system shows that 8.1 isn't in Portage yet. I don't feel like moving a critical system to an overlay, so I think I'll file this one away and wait a little bit.

    Thanks for the info; sorry I can't act on it yet.

  9. #39
    Join Date
    Jul 2010
    Posts
    499

    Wink

    Quote Originally Posted by phred14 View Post
    I'll try to give this a shot tonight. Well, first I'll check which Mesa I've got installed, and maybe I'll go ahead and install 8.1; I believe I'm at 8.0.1. A quick look at an available Gentoo system shows that 8.1 isn't in Portage yet. I don't feel like moving a critical system to an overlay, so I think I'll file this one away and wait a little bit.

    Thanks for the info; sorry I can't act on it yet.
    I fully share your concerns; I'm running it from an extra "victim" partition. BTW, I haven't noticed much difference yet in the few benchmarks I've tried.

  10. #40
    Join Date
    Feb 2011
    Location
    Ukraine
    Posts
    139

    Default

    Quote Originally Posted by log0 View Post
    I wasn't able to download your live CD; the server wasn't very responsive. I did a test run with Ubuntu 12.04 and xorg-edgers, and I am getting the same results as Michael.
    http://openbenchmarking.org/result/1...AR-EDGERSLM817

    It would be interesting to figure out how you got over 100 FPS.
    Something is wrong with Ubuntu or Unity.

    I tested with the worst settings and got 63 FPS:
    pcie_gen2=0, vblank_mode=1, swapbuffers=on, colortiling=off, kwin effects=on
    http://openbenchmarking.org/result/1...BY-GOG09979773

    Enabling all optimizations, I got ~130 FPS without any magic or patches.
