
Thread: NVIDIA Has Major New Linux Driver: Optimus, RandR 1.4

  1. #41
    Join Date
    Sep 2012
    Posts
    7

    Default

    Quote Originally Posted by LLStarks View Post
    I'm pretty close to getting "Optimus" working. Unfortunately, my Nvidia card isn't wired to HDMI or any output and the LVDS doesn't want to play nice.

    I'm getting a blank, backlit screen. X will start without complaining, run programs invisibly and glxinfo will report Nvidia, but there's nothing on the LVDS or HDMI.


    Providers for Nvidia and modesetting (or Intel, I've tested both) report properly.
    Same here. If you make any progress, it would be great if you could send me a PM to point me in the right direction. Btw, I wonder if this error in Xorg.0.log
    could be related:
    [ 2791.538] (EE) Screen 1 deleted because of no matching config section.
    [ 2791.538] (II) UnloadModule: "intel"
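For what it's worth, that "(EE) Screen 1 deleted" line is commonly harmless. A blank-but-backlit panel while glxinfo reports NVIDIA often just means the output source was never wired up after X started. A sketch of the post-startup commands from NVIDIA's readme, assuming the usual default provider names ("modesetting" and "NVIDIA-0"), which may differ on your system:

```shell
# List the providers the running server knows about
xrandr --listproviders
# Tell the integrated provider to scan out the image rendered on the NVIDIA screen
xrandr --setprovideroutputsource modesetting NVIDIA-0
# Enable the outputs (LVDS/HDMI) at their preferred modes
xrandr --auto
```

These need a running X session; run them from the same display you started.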

  2. #42
    Join Date
    Apr 2007
    Location
    Arctic circle, Finland
    Posts
    297

    Default

    Quote Originally Posted by zzippy View Post
    Same here. If you make any progress, it would be great if you could send me a PM to point me in the right direction. Btw, I wonder if this error in Xorg.0.log
    could be related:
    [ 2791.538] (EE) Screen 1 deleted because of no matching config section.
    [ 2791.538] (II) UnloadModule: "intel"
Have you configured your X as suggested in the readme:
    http://us.download.nvidia.com/XFree8...E/randr14.html

You will need an xorg modesetting driver for KMS (in Ubuntu it's named xserver-xorg-video-modesetting).
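For reference, the layout that readme suggests looks roughly like this (a sketch; the BusID is machine-specific, so check `lspci | grep -i nvidia` for yours):

```
Section "ServerLayout"
    Identifier "layout"
    Screen 0 "nvidia"
    Inactive "intel"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    BusID "PCI:1:0:0"   # example BusID, replace with your own
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
EndSection

Section "Device"
    Identifier "intel"
    Driver "modesetting"
EndSection

Section "Screen"
    Identifier "intel"
    Device "intel"
EndSection
```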

  3. #43
    Join Date
    Apr 2013
    Posts
    2

    Default It works - but with some tearing

    Hi,
    I managed to get the NVidia GPU to render through the Intel GPU to the laptop screen as well as over HDMI to an external monitor.

    Process detailed at http://www.barunisystems.com/index.p...page?view=blog
    Regards,
    Chaitanya

  4. #44
    Join Date
    Sep 2012
    Posts
    7

    Default

    Quote Originally Posted by tuke81 View Post
    Have you configured your X as suggested in the readme:
    http://us.download.nvidia.com/XFree8...E/randr14.html
    Yep, if you mean the xorg.conf file. If you mean the kernel CONFIG_DRM parameters, I admit that's beyond my knowledge. Using Ubuntu mainline 3.9 rc6. Could that be the problem?
    Quote Originally Posted by tuke81 View Post
    You will need a xorg modesetting driver for kms(in ubuntu it's named xserver-xorg-video-modesetting).
    I thought intel >= 2.21.5 should also work? I started with the modesetting driver; same result.

  5. #45
    Join Date
    Aug 2012
    Posts
    493

    Default

    Is the tearing issue fixable by Nvidia or Intel via driver changes or is this something that the technology just has to live with?

  6. #46
    Join Date
    Apr 2007
    Location
    Arctic circle, Finland
    Posts
    297

    Default

    Quote Originally Posted by dh04000 View Post
    Is the tearing issue fixable by Nvidia or Intel via driver changes or is this something that the technology just has to live with?
    GLX_EXT_buffer_age should fix those problems. It's the compositing window managers that have to take advantage of it.

  7. #47
    Join Date
    Nov 2007
    Posts
    1,024

    Default

    Quote Originally Posted by Ericg View Post
    Proper Optimus: Seamless transition back and forth as needed
    Nope. That isn't even possible with OpenGL or D3D. The cards have different capabilities; you must recreate your device/context and re-query extensions/caps to switch. There's no signal in GL to tell an app to do this, and while you could do it in D3D, applications don't.

    This is an issue for web games since either all pages render on Intel or all render on NVIDIA. There's no API to ask for the low-power or high-speed device, so even a multi-process arch like Chrome is stuck.

    Optimus is a stop-gap until GL/D3D offers a real solution in its API.

    Optimus is per-app. There's a system list of apps which use the NVIDIA GPU, requiring updates so new games work. It also requires extra steps to make your own projects use it; by default any binary you build will only use the Intel GPU.

  8. #48
    Join Date
    Aug 2012
    Posts
    493

    Default

    Quote Originally Posted by tuke81 View Post
    GLX_EXT_buffer_age should fix those problems. It's the compositing window managers that have to take advantage of it.
    http://www.phoronix.com/scan.php?pag...tem&px=MTI1MTM

    Does this mean 13.04 will have this feature? Does intel support it yet?

  9. #49
    Join Date
    Apr 2007
    Location
    Arctic circle, Finland
    Posts
    297

    Default

    Quote Originally Posted by dh04000 View Post
    http://www.phoronix.com/scan.php?pag...tem&px=MTI1MTM

    Does this mean 13.04 will have this feature? Does intel support it yet?
    Hmm, Ubuntu 13.04 seems to have compiz 0.9.9, and this is just being merged into compiz 0.9.10...

    Dunno about intel; Mesa has EGL_EXT_buffer_age, so I presume intel supports that, but not GLX_EXT_buffer_age.
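You can check what your stack actually advertises from a terminal (a quick sketch; glxinfo comes from mesa-utils on Ubuntu, eglinfo typically from mesa-utils-extra, and both need a running session):

```shell
# Count matches for each extension string; 0 means not advertised
glxinfo | grep -c GLX_EXT_buffer_age
eglinfo | grep -c EGL_EXT_buffer_age
```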

  10. #50
    Join Date
    Apr 2013
    Posts
    2

    Default Newer Kernel / xrandr 1.4 / intel driver

    To get this working, you need the following:

    a) Kernel 3.9
    b) randr 1.4
    c) intel driver set in the xorg file

    I've got this working; a kernel older than 3.9 is a no-go, as is an X server older than 1.14.

    You don't need to set the intel GPU to the modesetting driver; that doesn't work here.

    I have a writeup at http://www.barunisystems.com/index.p...page?view=blog for more details.
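The three requirements above can be verified from a terminal (a sketch; the expected version strings are what these tools typically print, and xrandr/X need a working install):

```shell
uname -r                                  # kernel: want 3.9 or newer
X -version 2>&1 | grep "X.Org X Server"   # want 1.14 or newer
xrandr --version                          # want "RandR version 1.4" (needs a running X session)
```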
