Results 41 to 50 of 65

Thread: Gallium3D / LLVMpipe With LLVM 2.8

  1. #41
    Join Date
    Feb 2008
    Location
    Linuxland
    Posts
    5,049

    Default

    Yeah, I've heard no complaints about the Intel drivers.

    Thread derailment? That's par for the course on Phoronix

  2. #42
    Join Date
    Jan 2010
    Location
    Portugal
    Posts
    945

    Default

    AFAIK, when SSDs have errors it's in the write process, whereas HDDs tend to fail during reads, so you only become aware of errors when you try to read back the data you thought was safe. In that regard, at least, SSDs seem safer. I don't know about real-life expectancy, though.

  3. #43
    Join Date
    Aug 2009
    Posts
    2,264

    Default

    There are still too many problems with SSDs. For example, when a bad HDD write happens, it fails during transmission to the HDD and the file-save dialog tells you it failed.

    Due to the nature of SSDs, the OS can successfully transmit the file to the SSD, but the SSD controller may then fail to write it to its own flash without the user knowing until it's already too late.

    On paper SSDs rule, but in practice they still don't. I'm sure they'll be ready by the time I'm in the market for internal computer storage again, though...

  4. #44
    Join Date
    Aug 2009
    Posts
    2,264

    Default

    Excuse me for the weird wrong characters and double posts, but my phone is almost dead; broken touch screen and lots of weird Java errors everywhere too. I'll get a new one tomorrow... <_<'

  5. #45
    Join Date
    Aug 2007
    Location
    Poland
    Posts
    215

    Default

    Quote Originally Posted by V!NCENT View Post
    BlackStar, just shut the fuck up. I'm not even going to make any effort defending software fallback. If a user doesn't like 2fps Vs. a blank screen the problem is simply between keyboard and chair. Nobody who is able to influence the Gallium3D code will think otherwise. Noobs be noobs.
    It reminds me of the time I ran Quake 1 on a NexGen CPU (a 586 without an FPU, recognised by apps as a 386). I did some workarounds, modifying a few different FPU emulators for DOS plus some other quirks, to make it advertise itself as a 486DX so the game would run. I was very, very happy when I finally saw it running, even though it was less than 1 FPS, but it worked.

  6. #46
    Join Date
    Aug 2007
    Location
    Poland
    Posts
    215

    Default

    Quote Originally Posted by xeros View Post
    It reminds me of the time I ran Quake 1 on a NexGen CPU (a 586 without an FPU, recognised by apps as a 386). I did some workarounds, modifying a few different FPU emulators for DOS plus some other quirks, to make it advertise itself as a 486DX so the game would run. I was very, very happy when I finally saw it running, even though it was less than 1 FPS, but it worked.
    Edit: it was actually ~5-20 seconds per frame on the 100MHz CPU with software FPU emulation.

  7. #47
    Join Date
    Jan 2010
    Location
    Portugal
    Posts
    945

    Default

    Quote Originally Posted by xeros View Post
    It reminds me of the time I ran Quake 1 on a NexGen CPU (a 586 without an FPU, recognised by apps as a 386). I did some workarounds, modifying a few different FPU emulators for DOS plus some other quirks, to make it advertise itself as a 486DX so the game would run. I was very, very happy when I finally saw it running, even though it was less than 1 FPS, but it worked.
    My 486DX2 66MHz only ran it at something like 15 FPS at minimum resolution.

  8. #48
    Join Date
    Oct 2010
    Posts
    25

    Default

    How do I enable llvmpipe? I'm on Kubuntu 10.04 (Lucid) with the xorg-edgers packages. My machine has an integrated ATI Xpress X1250 card. XBMC runs very slowly on it because the r300 DRI driver lacks a good GLSL implementation. I want to try llvmpipe to see if it's better than r300-dri, but I don't know how to enable it.
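    A minimal sketch of one way to try llvmpipe, assuming a Mesa build that has it compiled in (`LIBGL_ALWAYS_SOFTWARE` is a standard Mesa environment variable, but exact behaviour depends on the Mesa version):

    ```shell
    # Ask Mesa to skip the hardware driver and use its software
    # rasterizer; on a Gallium build with llvmpipe compiled in,
    # llvmpipe is the renderer that gets picked.
    export LIBGL_ALWAYS_SOFTWARE=1

    # Check which renderer is actually in use (if glxinfo is installed).
    command -v glxinfo >/dev/null && glxinfo | grep "renderer string" || true
    ```

    Launching XBMC from the same shell afterwards should then render through llvmpipe instead of the r300 driver.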

  9. #49
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,433

    Default

    Have you tried r300g (the Gallium3D HW-accelerated driver) on your hardware? AFAIK that should give you the best of both worlds -- LLVM JIT-compiled vertex shaders running on the CPU plus HW-accelerated fragment shaders running on the GPU.

  10. #50
    Join Date
    Oct 2010
    Posts
    25

    Default

    Quote Originally Posted by bridgman View Post
    Have you tried r300g (the Gallium3D HW-accelerated driver) on your hardware? AFAIK that should give you the best of both worlds -- LLVM JIT-compiled vertex shaders running on the CPU plus HW-accelerated fragment shaders running on the GPU.
    My glxinfo shows:

    OpenGL vendor string: X.Org R300 Project
    OpenGL renderer string: Gallium 0.4 on RS690
    OpenGL version string: 2.1 Mesa 7.10-devel
    OpenGL shading language version string: 1.20

    Does that mean I'm running r300g? If so, it's still very slow for XBMC; I get only 15 FPS. Not much improvement over five months ago:

    http://www.mail-archive.com/xorg-dri.../msg14535.html
