
Thread: Radeon Gets Multi-Screen Reverse Optimus Support

  1. #11
    Join Date
    Aug 2012

    That sounds fantastic. So with that patch I can use my Radeon and my Intel card together (Sony VAIO VPC-Z21 with a graphics card in the docking station)?
    Right now it is not possible to use the external displays connected to the docking station together with the internal display.

    Does anyone have a reference for how to use that feature? Maybe an xorg.conf example?

  2. #12
    Join Date
    Apr 2008


    Quote Originally Posted by halfmanhalfamazing View Post
    Or is this part of some "foundational groundwork" which will help lead to crossfire/SLI support in the OSS driver? (Ditto for nouveau)
    The purpose of CrossFire and SLI is to have both GPUs working together on the same output, as if they were one huge graphics card with multiple GPUs on it.
    The whole array of cards renders one output (either by having the cards draw alternating frames, by splitting the frame in two and having each card render one half, or by much more complicated schemes).

    The purpose of DMA-BUF (the technology behind the open-source implementation of Optimus and reverse Optimus) is to send the output from one card to another. The cards are not collaborating on one output.
    The first card renders the frame entirely on its own. In addition to its normal output ports (HDMI connector 1, DisplayPort connectors 2 and 3, etc.), thanks to DMA-BUF it also sees a virtual output port. Displaying on this port actually transfers the finished frame to another graphics card which had nothing to do with rendering it.

    Optimus example:
    - You have a laptop.
    - The laptop has an onboard Intel GPU; this Intel is connected to the DVI output port and to the built-in screen.
    - The laptop has a GeForce card; this card isn't connected to absolutely anything and can't output a thing on its own.
    - You set up the GeForce as your main desktop.
    - Thanks to DMA-BUF you create a "virtual connector" on the GeForce, as if it had a DVI output.
    - This virtual connector actually sends (or maps) the output to the embedded Intel.
    - The embedded Intel then displays the output using its own output connectors.
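    On the open-source stack this wiring is done at runtime with the RandR 1.4 provider commands rather than an xorg.conf. A minimal sketch, assuming the discrete card is provider 0 and the Intel is provider 1 (the real indices and names vary per machine, so check `xrandr --listproviders` first):

    ```shell
    # Show the GPUs X knows about, with their provider indices and
    # capabilities ("Source Output", "Sink Output", etc.).
    xrandr --listproviders

    # Let the Intel GPU's physical connectors scan out frames rendered
    # by the GeForce. "1 0" are example indices taken from the listing
    # above; on another machine they may be swapped or named strings.
    xrandr --setprovideroutputsource 1 0
    ```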

    Reverse example:
    - You have a workstation, on which you also develop OpenCL code.
    - You have a small, puny Radeon embedded in the APU of your main processor.
    - You have the latest possible overkill monster as a discrete card.
    - You want to test OpenCL code on the big card, with nothing else disturbing it (you want 100% of its resources producing your bitcoins :-P ).
    - The embedded Radeon is more than enough to display the desktop.
    - BUT you still want to be able to use your 3-monitor setup (2 or 3 of them are connected to the big card).
    - You set up the embedded GPU as your main desktop.
    - Thanks to DMA-BUF, in addition to the crappy built-in VGA out of the motherboard, you give your embedded card access to a few virtual connectors.
    - These connectors send (or share) the output frame with the big graphics card.
    - The big card displays the output on your 2-3 monitors.
    - All the drawing is done by the embedded GPU.
    - The big card doesn't do much besides scanning out the frames; most of its RAM is free, and its GPU sits idle.
    - So all its resources are free for your precious OpenCL bitcoins.
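    The reverse setup above maps to the same RandR 1.4 provider commands, just with the roles swapped. A hedged sketch (indices and output names below are illustrative, not from any particular machine):

    ```shell
    # Find which provider is the integrated APU GPU and which is the
    # discrete card.
    xrandr --listproviders

    # Attach the discrete card's connectors as outputs for frames
    # rendered by the integrated GPU (here provider 1 = discrete,
    # provider 0 = integrated).
    xrandr --setprovideroutputsource 1 0

    # The discrete card's monitors now appear in plain `xrandr` and can
    # be enabled like any other output (output names are examples).
    xrandr --output DP-1-1 --auto --right-of VGA-0
    ```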

    USB example:
    - You bought a nice all-in-one picture frame / portable TV / external display combo.
    - When used as an external display, the widget works over USB
    (that's simply a framebuffer that can be displayed; no 3D-acceleration-over-USB or anything like that).
    - There is no actual connector like HDMI, DVI, DP, VGA, ..., aside from a useless composite-in.
    - But you want full 3D.
    - So you use your main machine (laptop, desktop, whatever) as normal (with your normal big monitor connected to its output).
    - Thanks to DMA-BUF, you create an additional virtual connector for the USB display.
    - This additional connector writes the output frames into the framebuffer of the USB display.
    - Bam! You get 3D acceleration on an accelerator-less USB monitor.
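    For USB displays this is what the in-kernel udl driver enables: the panel shows up as its own DRM/KMS device and RandR provider, and you feed it frames the same way. A sketch under the assumption of a DisplayLink-style panel (provider indices and output names are made up for illustration):

    ```shell
    # The udl module exposes the DisplayLink-style USB panel as a KMS
    # device of its own.
    modprobe udl

    # It then appears as an extra provider with "Sink Output" capability.
    xrandr --listproviders

    # Route frames from the main GPU (provider 0) to the USB display
    # (provider 1), then enable it next to the laptop panel.
    xrandr --setprovideroutputsource 1 0
    xrandr --output DVI-1-0 --auto --left-of LVDS-1
    ```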

    The same strategy could also power an in-kernel, always-available VNC server working directly on top of KMS/DMA-BUF (you keep the display even if X or Wayland crashes and kicks you to the console; VNC behaves as if it were a normal monitor connected to an extra DVI port).
    It could also work for gameplay streaming (playing PC games actually running on your workstation while controlling them from some under-powered portable console or tablet).

  3. #13
    Join Date
    Jun 2006


    If anyone cares, yesterday I managed to get a multiscreen OpenGL-composited desktop across two Radeon X1250s on the same machine, using the latest Kubuntu nightly (it even set them up for me automatically).

    It worked fine for two minutes and then was hit by massive slowdowns, probably some kind of software fallback. I tried multiple times and it always happened, even with the desktop just sitting there doing nothing.

    So... it works, but not perfectly, for now. I'll retry when 13.10 is nearing release, and maybe file some bug reports, although with a mysterious X server slowdown it will be interesting to decide what to file them against.

    (I also tried with a Radeon and an Nvidia 6200 LE, but couldn't even get both of them to light up at the same time.)

  4. #14
    Join Date
    Jul 2012


    I've been running this for a month with a Radeon HD 6750 as the primary and a Radeon X300 as an output slave. The biggest annoyance was cursor flickering on the outputs connected to the primary graphics card, which I solved by disabling the "framebuffer object" and "always use buffer swapping" options in the OpenGL plugin of Compiz in CCSM. I can even run Unigine Heaven across 3 monitors in this configuration.

  5. #15
    Join Date
    May 2013

    There are AMD Athlon X2 laptops with Nvidia graphics

    Quote Originally Posted by halfmanhalfamazing View Post
    Have there been any AMD-based laptops which have an Nvidia discrete GPU built into them?
    Yes, my sister has one, so I have to make sure all kernel updates are done at my place, in case
    the NVIDIA blob doesn't work right after the update. It stays on Ubuntu 10.04 to preserve the
    GNOME 2 environment, and works great. If a newer OS were ever needed, it would have to be Mint/MATE.

    In 10.04, Nouveau was still very new. The "experimental" 3D driver would work, but that machine has heat
    issues unless constantly blown clear of dust, and I don't experiment on machines used by people
    who are not hackers and need months of reliability between visits.
