
Thread: AMD RadeonSI Gallium3D Performance Has A Long Way To Go

  1. #11
    Join Date
    Nov 2007
    Posts
    77

    Default

    Do one covering radeon.dpm=1

  2. #12
    Join Date
    Nov 2008
    Location
    Madison, WI, USA
    Posts
    865

    Default

    Quote Originally Posted by storm_st View Post
Don't forget that 7XXX cards' BIOS tends to start at the lowest power profile; you need to change it to high, for example as root, before benchmarking:

    Code:
    echo "high" > /sys/class/drm/card0/device/power_profile
I am sure the ~8x speed difference is because of that. My 7850 works almost perfectly here after that trick: 2D scrolling, gnome-shell transitions, everything becomes very smooth.
    You're right about the power profiles... I guess that VBIOS info was misleading.

    From /sys/kernel/debug/dri/0/radeon_pm_info (root-only) for my 7850:

    default engine clock: 860000 kHz
    current engine clock: 149990 kHz
    default memory clock: 1200000 kHz
    current memory clock: 149990 kHz
    voltage: 1075 mV
    PCIE lanes: 16
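For reference, here is a minimal sketch of the trick described above, assuming the profile-based radeon power management sysfs interface on card0 (the card slot and paths can differ per system; root required):

```shell
# Assumed path: profile-based radeon power management on card0.
card=/sys/class/drm/card0/device

# Select profile-based power management, then force the "high" profile.
echo profile > "$card/power_method"
echo high    > "$card/power_profile"

# Re-check the clocks afterwards (debugfs, root only).
cat /sys/kernel/debug/dri/0/radeon_pm_info
```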

  3. #13
    Join Date
    Jun 2010
    Location
    ฿ 16LDJ6Hrd1oN3nCoFL7BypHSEYL84ca1JR
    Posts
    1,045

    Default

    Hm..... http://openbenchmarking.org/result/1...UT-1306287SO77
    drm-next-3.11-wip-5, llvm svn 185061, latest libdrm git, mesa git, xf86-video-ati-git, glamor git

    Maybe it was because of kwin's xrender compositing or maybe it was because of prime.
    (kwin opengl compositing has the problem that often it only shows a black screen for 3d using prime until you disable and reenable compositing with alt+shift+f12)

    xonotic on ultra was bugged, most of the level did not render. But every other test was rendering fine.
    Code:
warning: failed to translate tgsi opcode DDX to LLVM
Failed to translate shader from TGSI to LLVM
EE si_state.c:1951 si_shader_select - Failed to build shader variant (type=1) -22
[the three lines above repeat 12 times in total]



And this DPM really needs auto power-off for hybrid GPUs that are not in use. DPM definitely seems to work, as the fan goes off now and then, but after a little while it comes on again. And while the test was running it never went off.
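Until that lands, the inactive GPU of a hybrid setup can be powered down by hand through vga_switcheroo; a rough sketch, assuming debugfs is mounted and the driver has registered with vga_switcheroo (root required):

```shell
# Show the state of both GPUs (IGD = integrated, DIS = discrete).
cat /sys/kernel/debug/vgaswitcheroo/switch

# Power off whichever GPU is currently inactive.
echo OFF > /sys/kernel/debug/vgaswitcheroo/switch
```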
    Last edited by ChrisXY; 06-28-2013 at 04:28 PM.

  4. #14
    Join Date
    Feb 2008
    Location
    Linuxland
    Posts
    5,061

    Default

    "6450 beats the 7850"

    Go 6450!

  5. #15
    Join Date
    Oct 2008
    Posts
    3,099

    Default

    Quote Originally Posted by ChrisXY View Post
    xonotic on ultra was bugged, most of the level did not render. But every other test was rendering fine.
    Code:
    warning: failed to translate tgsi opcode DDX to LLVM
If those DDX opcode warnings are the problem, then it might be solved pretty soon. I think I've seen patches on the mailing list implementing DDX and DDY.

    I also think that's one of the last things needed before GL3 support is done, although there might be a few others.
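One way to watch that progress is to check what the driver currently advertises; a sketch assuming glxinfo (from mesa-utils/mesa-demos) is installed and an X session is running:

```shell
# Report the GL and GLSL versions the radeonsi driver currently exposes.
glxinfo | grep -E "OpenGL (version|shading language version) string"
```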

  6. #16
    Join Date
    Aug 2012
    Posts
    390

    Default

    The same can be said for Catalyst, which performs horribly for GCN.

  7. #17
    Join Date
    Sep 2008
    Location
    Vilnius, Lithuania
    Posts
    2,538

    Default

    Quote Originally Posted by ChrisXY View Post
    Schrödinger's cat. Come on, you have unicode support, don't you?
    Indeed, and got to love that Compose button ☺

    As a side note, the Fedora codenames are pretty fun. I used both Spherical Cow and Schrödinger's Cat quite extensively lately, since one of the courses I had was Physics.

  8. #18
    Join Date
    Oct 2009
    Location
    Brisbane, Queensland, Australia
    Posts
    154

    Default

As others have said, RadeonSI performance can be greatly improved either by selecting profile-based power management and then setting /sys/class/drm/card0/device/power_profile to high, or by using a 3.11 kernel. Interestingly, with a 3.11 kernel it seems it will not be necessary to enable the new "DPM" power management (by adding radeon.dpm=1 to the GRUB kernel boot options) to get good performance, although of course enabling DPM will reduce power consumption and heat output. I have found the following patch in the 3.11 kernel:

    author Alex Deucher 2013-07-05 17:14:30 (GMT)
    committer Alex Deucher 2013-07-05 22:08:54 (GMT)
    commit c6cf7777a32da874fabec4fd1c2a579f0ba4e4dd
    tree 22a8b1f3b98714760a24b69f7d45d56c716dcfe0
    parent 338a95a95508537e23c82d59a2d87be6fde4b6ff

    drm/radeon: set default clocks for SI when DPM is disabled

    Fix patching of vddc values for SI and enable manually forcing clocks to default levels as per NI.

    This improves the out of the box performance with SI asics.

    Signed-off-by: Alex Deucher
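For completeness, a sketch of enabling DPM for testing, assuming a GRUB2 setup (the config path and update command vary by distro):

```shell
# Add radeon.dpm=1 to the kernel command line in /etc/default/grub:
#   GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.dpm=1"
# then regenerate the GRUB config and reboot:
sudo update-grub   # Debian/Ubuntu; on Fedora: grub2-mkconfig -o /boot/grub2/grub.cfg

# After rebooting, confirm DPM took over:
dmesg | grep -i "radeon.*dpm"
```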

  9. #19
    Join Date
    Aug 2012
    Posts
    390

    Default

    Quote Originally Posted by madbiologist View Post
    As others have said, RadeonSI performance can be greatly improved either by selecting profile based power management and then setting /sys/class/drm/card0/device/power_profile to high, or by using a 3.11 kernel.
HAHAHAHAHAHAHA no. Actually get a Radeon HD 7xxx and try it for yourself. Performance sucks and this will not help much at all. The only things playable are 2D indie games or old games like HL1, and even then the framerate is low. If you use a 3.11 kernel and Mesa git, many games are currently broken.
    Last edited by mmstick; 07-29-2013 at 05:51 PM.

  10. #20
    Join Date
    Oct 2009
    Location
    Brisbane, Queensland, Australia
    Posts
    154

    Default

    Quote Originally Posted by mmstick View Post
HAHAHAHAHAHAHA no. Actually get a Radeon HD 7xxx and try it for yourself. Performance sucks and this will not help much at all. The only things playable are 2D indie games or old games like HL1, and even then the framerate is low. If you use a 3.11 kernel and Mesa git, many games are currently broken.
    Setting the power_profile to high does seem to help a fair bit - see http://phoronix.com/forums/showthrea...160#post341160

    And yes, we know that rendering in Xonotic was broken a month ago. Remember that the 3.11 kernel is still a work in progress.
