
Thread: Running radeonsi on Ubuntu 13.04

  1. #1
    Join Date
    Mar 2013

    Default Running radeonsi on Ubuntu 13.04

    There does not seem to be much information about how to run radeonsi, so I tried it with my HD 7750 on (K)Ubuntu Raring and documented what I did.

    Since glamor is required, and glamor does not support the X server version shipped in Ubuntu, xorg-server has to be downgraded.
    During this process I had to use a bit of force (dpkg --force-depends -P) to remove conflicting packages, but in the end apt is in an OK state.

    1. Fetch, build and install debian xorg-server 1.12.4 (git:// branch debian-unstable).
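
    For step 1, the full cycle looks roughly like this (the repository URL is written from memory and may be wrong; the Debian X packaging has moved around over the years, so treat it as an assumption):

```shell
# Sketch: fetch the Debian packaging of xorg-server and build 1.12.4
# binary packages. The repository URL and branch layout are assumptions,
# not copied from my shell history.
git clone git://anonscm.debian.org/pkg-xorg/xserver/xorg-server.git
cd xorg-server
git checkout debian-unstable

# Install the build dependencies, then build unsigned binary packages.
sudo apt-get build-dep xorg-server
dpkg-buildpackage -us -uc -b

# The resulting .debs land in the parent directory.
sudo dpkg -i ../xserver-xorg-core_*.deb ../xserver-common_*.deb
```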

    2. Blacklist xorg 1.13 in apt preferences, to prevent reinstallation.
    Package: xserver-common
    Pin: version 2:1.13*
    Pin-Priority: -1

    Package: xserver-xorg-core
    Pin: version 2:1.13*
    Pin-Priority: -1

    Package: xserver-xorg-dev
    Pin: version 2:1.13*
    Pin-Priority: -1
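
    To actually apply those pins, the three stanzas go into a file under /etc/apt/preferences.d/; a sketch (the filename is arbitrary, my choice):

```shell
# Write the three pin stanzas to a local file; installing it under
# /etc/apt/preferences.d needs root, so that part stays commented out.
cat > xorg-113-blacklist <<'EOF'
Package: xserver-common
Pin: version 2:1.13*
Pin-Priority: -1

Package: xserver-xorg-core
Pin: version 2:1.13*
Pin-Priority: -1

Package: xserver-xorg-dev
Pin: version 2:1.13*
Pin-Priority: -1
EOF

# sudo mv xorg-113-blacklist /etc/apt/preferences.d/
# apt-cache policy xserver-xorg-core   # should now show the pin in effect
```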

    3. Fetch mesa 9.1 (git:// branch ubuntu). Add --enable-gbm to debian/rules. Build and install.
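
    The debian/rules tweak in step 3 is just appending --enable-gbm to the configure options. Demonstrated here on a stand-in file; "confflags" is an assumption about how the Ubuntu mesa packaging collects its options, so check the real debian/rules first:

```shell
# Stand-in for a debian/rules fragment; edit the real file in the tree.
printf 'confflags += --enable-texture-float\n' > rules.demo

# Append --enable-gbm as an extra configure flag.
printf 'confflags += --enable-gbm\n' >> rules.demo

cat rules.demo
```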

    4. Fetch glamor (git://). Configure it by hand and resolve missing dependencies yourself, since there is no packaging (./configure --enable-glx-tls --prefix=/usr), then run make install.

    5. Fetch xf86-video-ati (git:// branch ubuntu).
    In debian/control remove the version dependency for xserver-xorg-dev. Also remove the dependencies on xserver-xorg-video-r128 and xserver-xorg-video-mach64 completely.
    Build and install.
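
    The debian/control edits in step 5 amount to dropping the version constraint on xserver-xorg-dev and deleting the r128/mach64 entries. Shown here on illustrative stand-in lines (the field contents are not copied from the real file):

```shell
# Stand-ins for the relevant debian/control lines; run the same seds
# against the real debian/control in the xf86-video-ati tree.
printf 'Build-Depends: xserver-xorg-dev (>= 2:1.13), libdrm-dev\n' > control.demo
printf 'Depends: xserver-xorg-video-r128, xserver-xorg-video-mach64, ${shlibs:Depends}\n' >> control.demo

# Drop the version constraint from xserver-xorg-dev.
sed -i 's/xserver-xorg-dev ([^)]*)/xserver-xorg-dev/' control.demo

# Remove the r128 and mach64 dependencies entirely.
sed -i 's/xserver-xorg-video-r128, //; s/xserver-xorg-video-mach64, //' control.demo

cat control.demo
```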

    6. To get the mouse and keyboard working, I also had to rebuild xserver-xorg-input-evdev. If you use other input drivers, those would have to be recompiled as well.
    Fetch xf86-input-evdev (git:// branch ubuntu). In debian/control remove the version dependency for xserver-xorg-dev.
    Build and install.

    Those were the minimal changes with which I was able to get it running.
    LLVM 3.2 in raring already has the r600g driver backported, and kernel 3.8 works.

    I also tried the latest LLVM git, Mesa git, and kernel 3.9-rc1+, but saw no significant differences.

    KDE Desktop effects work.
    Nexuiz runs at 22 fps at 1920x1200.
    I thought I had Rochard running, but trying again just gives a black screen.
    Torchlight immediately locks up the GPU, but the machine still responds over ssh.
    Lightsmark 2008 completely freezes the machine after a few tests.

    I could not test Trine, since it crashes the X server both with nouveau and radeonsi.

  2. #2
    Join Date
    Mar 2013


    I forgot two steps in xf86-video-ati.

    In debian/rules add
    dh_auto_configure -- --enable-glamor
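
    Assuming the packaging uses a dh-style rules file (which the Ubuntu branch did at the time), that line goes into an override target; sketched here on a stand-in file:

```shell
# Append the override target to a stand-in debian/rules (edit the real
# debian/rules in the xf86-video-ati tree). The recipe line must be
# tab-indented, hence printf with \t.
printf 'override_dh_auto_configure:\n\tdh_auto_configure -- --enable-glamor\n' >> ati-rules.demo

cat ati-rules.demo
```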

    In debian/patches/series remove

  3. #3
    Join Date
    Mar 2013


    And another thing I forgot, /etc/X11/xorg.conf needs to be configured:

    Section "Module"
        Load "dri2"
        Load "glamoregl"
    EndSection

    Section "Device"
        Identifier "ati"
        Driver "ati"
        Option "AccelMethod" "glamor"
    EndSection

  4. #4
    Join Date
    Mar 2013


    Well, that was a lot of unnecessary work:
    it did not even occur to me to try the stock xorg-server, since the Glamor page says it would be unreliable with 1.13+.

    I guess steps 1, 2 and 6 are unnecessary.

  5. #5
    Join Date
    Jun 2010
    ฿ 16LDJ6Hrd1oN3nCoFL7BypHSEYL84ca1JR


    Did you actually have any luck trying it? I tried prime with it, and it looks good so far: X.Org X Server 1.14.0, xf86-video-intel with uxa, and xf86-video-ati compiled with glamor.

     ~ % xrandr --listproviders
    Providers: number : 2
    Provider 0: id: 0x70 cap: 0xb, Source Output, Sink Output, Sink Offload crtcs: 3 outputs: 8 associated providers: 0 name:Intel
    Provider 1: id: 0x45 cap: 0xd, Source Output, Source Offload, Sink Offload crtcs: 6 outputs: 0 associated providers: 0 name:radeon
     ~ % xrandr --setprovideroffloadsink 1 0
     ~ % DRI_PRIME=0 glxinfo | grep OpenGL
    OpenGL vendor string: Intel Open Source Technology Center
    OpenGL renderer string: Mesa DRI Intel(R) Ivybridge Mobile
    OpenGL core profile version string: 3.1 (Core Profile) Mesa 9.2-devel (git-11b8df0)
    OpenGL core profile shading language version string: 1.40
    OpenGL core profile context flags: (none)
    OpenGL core profile extensions:
    OpenGL version string: 3.0 Mesa 9.2-devel (git-11b8df0)
    OpenGL shading language version string: 1.30
    OpenGL context flags: (none)
    OpenGL extensions:
    But as soon as I do "DRI_PRIME=1 glxinfo", X segfaults:
    [  5449.109] (EE) Backtrace:
    [  5449.112] (EE) 0: /usr/bin/X (xorg_backtrace+0x36) [0x589a36]
    [  5449.112] (EE) 1: /usr/bin/X (0x400000+0x18d849) [0x58d849]
    [  5449.112] (EE) 2: /usr/lib/ (0x7f7193af7000+0xf1e0) [0x7f7193b061e0]
    [  5449.112] (EE) 3: /usr/bin/X (0x400000+0x15a46b) [0x55a46b]
    [  5449.112] (EE) 4: /usr/bin/X (DRI2Connect+0x77) [0x55c317]
    [  5449.112] (EE) 5: /usr/bin/X (0x400000+0x15d4d4) [0x55d4d4]
    [  5449.112] (EE) 6: /usr/bin/X (0x400000+0x37d46) [0x437d46]
    [  5449.113] (EE) 7: /usr/bin/X (0x400000+0x2680a) [0x42680a]
    [  5449.113] (EE) 8: /usr/lib/ (__libc_start_main+0xf5) [0x7f7192983a15]
    [  5449.113] (EE) 9: /usr/bin/X (0x400000+0x26b4d) [0x426b4d]
    [  5449.113] (EE)
    [  5449.113] (EE) Segmentation fault at address 0x28
    As far as I know this is because prime needs acceleration, i.e. glamor, and there is no error reporting yet for the case where it doesn't work.

    I even tried "LD_PRELOAD=/usr/lib/xorg/modules/ X :1" but I only got "sh: symbol lookup error: /usr/lib/xorg/modules/ undefined symbol: serverClient"
    Last edited by ChrisXY; 03-11-2013 at 06:32 PM.

  6. #6
    Join Date
    Mar 2013


    Yes, I am using xserver-xorg-core 2:1.13.2-0ubuntu3 now. But I have also upgraded other packages (LLVM git, Mesa git, Linux git), since some crashing bugs are gone with them.

    I have not tried using prime.
