
Thread: A Batch Of Graphics Cards On Gallium3D

  1. #31
    Join Date
    Jan 2009
    Posts
    615

    Default

    Quote Originally Posted by whitecat View Post
    OK, I understand. So in my case, on my system I currently have:
    - 64-bit kernel
    - both 64/32-bit libdrm packages
    - both 64/32-bit xorg-x11-drv-ati packages
    - both 64/32-bit mesa-dri-drivers, mesa-libGL and mesa-libGLU packages
    - both 64/32-bit libtxc_dxtn packages
    - a 32-bit game

    I shouldn't expect 3D problems?
    Yes, it should be OK with that config. Better to compile a 32-bit glxinfo and glxgears and see whether you get direct rendering. I don't think you need the 32-bit xorg-x11-drv-ati. Also, libdrm is no longer required for Gallium.
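    The check suggested above can be sketched roughly as follows. The gcc flags and paths are illustrative (mesa-demos sources plus a multilib toolchain are assumed), and the printf line only simulates typical glxinfo output so the grep pattern is visible; the real command is `./glxinfo32 | grep "direct rendering"`:

    ```shell
    # Sketch: build 32-bit copies of the mesa-demos tools
    # (flags/paths are illustrative, not exact):
    #   gcc -m32 glxinfo.c -o glxinfo32 -lGL -lX11
    #   gcc -m32 glxgears.c -o glxgears32 -lGL -lX11 -lm
    #
    # Then look for the direct-rendering line: "Yes" means the 32-bit
    # stack found a working DRI driver, "No" means indirect rendering.
    # The printf below just simulates glxinfo output for illustration;
    # the real check is: ./glxinfo32 | grep "direct rendering"
    printf 'name of display: :0\ndirect rendering: Yes\n' | grep "direct rendering"
    ```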

  2. #32
    Join Date
    Feb 2011
    Location
    France
    Posts
    185

    Default

    Quote Originally Posted by marek View Post
    Better compile a 32-bit glxinfo and glxgears and see if you get direct rendering.
    Good idea, I will check.
    Thank you for your answers.

  3. #33
    Join Date
    Feb 2011
    Posts
    1

    Cool Intel rocks @ OSS

    Looking at this test and some other Phoronix graphics tests, it seems to me that using a discrete graphics card (other than X1xxx) with an open-source driver provides only a minimal advantage over running a 2500K IGP (excluding Lightsmark). Just stunning. Or did I get the wrong impression?

  4. #34

    Default

    Quote Originally Posted by whitecat View Post
    Really? How do you disable S3TC in ta-spring? On my machine with r600g the textures are corrupted!
    Spring can be run with S3TC without libtxc_dxtn:
    http://spring.bochs.info/phpbb/viewt...24720&p=461420

    However, this still doesn't work with Gallium drivers:
    https://bugs.freedesktop.org/show_bug.cgi?id=29012
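
    For what it's worth, Mesa of this era also has a `force_s3tc_enable` driconf/environment toggle that can let S3TC-using apps start without libtxc_dxtn. This is a hedged sketch, not a fix for the bug linked above: the toggle only helps apps that upload already-compressed textures, and anything that needs software (de)compression (which is what libtxc_dxtn provides) will still break. The `./spring` path is illustrative:

    ```shell
    # Hedged sketch: force_s3tc_enable makes Mesa advertise S3TC support
    # even without libtxc_dxtn. Only apps shipping pre-compressed textures
    # benefit; software (de)compression still requires libtxc_dxtn.
    # Launch the game with the variable set (illustrative binary path):
    #   force_s3tc_enable=true ./spring
    #
    # Sanity check that the variable actually reaches the child process:
    force_s3tc_enable=true sh -c 'echo "$force_s3tc_enable"'
    ```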

  5. #35
    Join Date
    Aug 2007
    Posts
    6,607

    Default

    @bernstein

    I have not been able to test a Sandy Bridge CPU's onboard graphics yet, but for basic games Intel gfx has usually been enough. It is definitely no solution for hardcore gamers, as you do not even get OpenGL 3.x functionality with Linux (the chip has DX10.1/OpenGL 3.x hardware features). Current binary-only drivers (ATI+NV) support OpenGL 4.x - the Unigine engine can already use it for tessellation effects (but really needs a very powerful card). But for a simple game from time to time, Intel gfx should be enough.

    Much more interesting would be how the integrated media ENCODER can be used; that would speed up H.264 encoding very much...

  6. #36
    Join Date
    Oct 2008
    Posts
    3,038

    Default

    Quote Originally Posted by Michael View Post
    Because it's not "out of the box" configuration... While I'm sure there's a fair number of active Phoronix members that may install it, as far as overall Linux usage goes, how many people do you think will actually go forward and do it or even know about it? Not many at all.
    While I understand and support this decision for most of the tests Phoronix does, I believe S3TC is different. Without it, apps simply will not work. Once that happens, users will ask around on forums and figure out how to get it working - or else they won't. I think it's useful to know the performance of these applications for those who do, and those who don't won't care anyway. It's not some obscure setting to speed things up; it's just whether or not you can actually get a game to run.

    Otherwise, you're still going to be testing the Quake 3 engine 10 years from now, and that's already basically a useless test on today's hardware. It's like glxgears, a good test to run to ensure basic functionality, but not much good for anything more than that.
