
Thread: LLVMpipe With Intel's GLSL2 Compiler


  1. #1
    Join Date
    Jan 2007
    Posts
    15,138

    Default LLVMpipe With Intel's GLSL2 Compiler

    Phoronix: LLVMpipe With Intel's GLSL2 Compiler

    Last month we tested out Intel's new GLSL compiler for Mesa when running the ATI Radeon classic Mesa and Gallium3D drivers, to see how this GL Shading Language compiler, designed by Intel employees for their hardware and open-source driver, works with the other open-source drivers, since all of the Mesa drivers will be affected once this "GLSL2" compiler is merged into the Mesa code-base by month's end. The experience using Intel's new shader compiler with the ATI Radeon graphics driver was fine except in Warsow, where serious regressions were visible; the other games capable of running off Mesa behaved normally. What we have been curious to test since then with this new OpenGL shader compiler is the LLVMpipe driver -- a Gallium3D driver we have been very excited about, as it finally provides a better software rasterizer for Linux by leveraging Gallium3D and the Low-Level Virtual Machine (LLVM) compiler infrastructure to accelerate the Mesa state tracker atop a modern multi-core CPU with SSE4 instructions. We have now finished running tests of Intel's GLSL2 branch with the most recent LLVMpipe driver code.

    http://www.phoronix.com/vr.php?view=15197

  2. #2
    Join Date
    Oct 2009
    Posts
    2,122

    Default

    Actually, regarding the texture-from-pixmap glx extension, that is NOT NECESSARILY REQUIRED for compiz to operate! The 0.9 branch of compiz uses the "copytex" plugin in its place.

    http://forum.compiz.org/viewtopic.ph...t=10402#p75616

    In other words, you can build the 0.9 branch and test again... you might get compiz-on-cpu.
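
    If you want to double-check before rebuilding whether your stack even advertises the extension, something like this rough sketch works (assuming the glxinfo utility from mesa-utils is installed; it just greps glxinfo's output, nothing more):

        # Rough check: does the current GLX stack advertise
        # GLX_EXT_texture_from_pixmap (needed by compiz 0.8,
        # worked around by the 0.9 branch's copytex plugin)?
        # Assumes glxinfo from mesa-utils is on the PATH.
        import subprocess

        out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
        if "GLX_EXT_texture_from_pixmap" in out:
            print("texture_from_pixmap is advertised -> compiz 0.8 should work as-is")
        else:
            print("not advertised -> try the compiz 0.9 branch and its copytex plugin")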

  3. #3

    Default

    Quote Originally Posted by droidhacker View Post
    Actually, regarding the texture-from-pixmap glx extension, that is NOT NECESSARILY REQUIRED for compiz to operate! The 0.9 branch of compiz uses the "copytex" plugin in its place.

    http://forum.compiz.org/viewtopic.ph...t=10402#p75616

    In other words, you can build the 0.9 branch and test again... you might get compiz-on-cpu.
    Oh yeah, didn't think about Compiz 0.9 yet. Will give that a shot.

  4. #4
    Join Date
    Jul 2009
    Posts
    260

    Default

    So it's basically up to a 30% performance loss for 1-5% less energy used? Eureka!
    Is it normalized for the same number of frames, i.e. effectively frames per joule (or joules per frame)?
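
    Normalizing is just average power times run time divided by frames rendered (or the inverse). A toy sketch with made-up numbers, not the article's actual figures or methodology:

        # Toy example of energy-per-frame normalization; the numbers are
        # invented for illustration, not taken from the article.
        def joules_per_frame(avg_power_watts, runtime_seconds, frames_rendered):
            """Energy spent per rendered frame."""
            return avg_power_watts * runtime_seconds / frames_rendered

        # Same demo, same runtime, two compiler branches (hypothetical values):
        master = joules_per_frame(avg_power_watts=60.0, runtime_seconds=120.0, frames_rendered=3600)
        glsl2 = joules_per_frame(avg_power_watts=57.0, runtime_seconds=120.0, frames_rendered=2700)
        print(f"master: {master:.2f} J/frame, glsl2: {glsl2:.2f} J/frame")
        # ~30% fewer frames for ~5% less power means more joules per frame, not fewer.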

  5. #5
    Join Date
    Sep 2009
    Posts
    119

    Default

    You realize Open Arena doesn't even use GLSL, right? Or I assume it doesn't, as it uses a fairly stock ioQuake3 engine, and ioQuake3 doesn't use GLSL without being patched, at least as of the last time I checked (which was a few minutes ago). It also doesn't *look* like it uses GLSL.

    In fact, do any of those games use GLSL? I'm not sure that any of them do. (Anyone should feel free to correct me on this.)

    Maybe this compiler is also involved in the other types of shaders, or it has some kind of "idle overhead"; that's the only way I can explain the performance differences measured in the other games.

  6. #6
    Join Date
    Oct 2008
    Location
    Sweden
    Posts
    983

    Default

    Quote Originally Posted by MaxToTheMax View Post
    You realize Open Arena doesn't even use GLSL, right? Or I assume it doesn't, as it uses a fairly stock ioQuake3 engine, and ioQuake3 doesn't use GLSL without being patched, at least as of the last time I checked (which was a few minutes ago). It also doesn't *look* like it uses GLSL.
    I grabbed the timedemo and config PTS uses and ran it with MESA_GLSL=log, which dumps all shaders used to files, and ended up with nothing - so OpenArena doesn't seem to use shaders...
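
    For anyone who wants to repeat the check, this is roughly the workflow, scripted; the binary name and demo arguments below are placeholders rather than the exact PTS invocation:

        # Launch the timedemo with MESA_GLSL=log so Mesa writes every GLSL
        # shader it compiles to files (shader_N.vert / shader_N.frag in the
        # current directory), then see whether anything showed up.
        # "openarena" and the demo arguments are placeholders.
        import glob
        import os
        import subprocess

        env = dict(os.environ, MESA_GLSL="log")
        subprocess.run(["openarena", "+timedemo", "1", "+demo", "demo088-test1"], env=env)

        dumps = glob.glob("shader_*")
        print(f"{len(dumps)} shader file(s) dumped" if dumps else "no shaders dumped -> no GLSL in use")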

    Regarding the breakage in Warsow, I still can't find a bug filed about it; that's kind of a disappointment.

  7. #7
    Join Date
    Oct 2008
    Posts
    3,176

    Default

    This bug will break shaders in gallium - probably a lot of them.

    https://bugs.freedesktop.org/show_bug.cgi?id=29490

    While the article is interesting, the code tested is from 10 days ago and might already be out of date. Intel has been doing tons of work fixing bugs and optimizing GLSL2 in order to get it ready for merging this Friday.

  8. #8
    Join Date
    Jan 2009
    Posts
    88

    Default Low CPU usage

    Why wouldn't you want the CPU to be pegged at 100%? Something's wrong if it isn't, especially since rendering is supposed to be easily parallelizable. Every game I've ever played uses as much CPU as it can. In this case of software rendering, it seems like something's going wrong; maybe data isn't moving around as fast as it needs to and is clogging the pipes.

  9. #9
    Join Date
    Nov 2007
    Posts
    1,024

    Default

    Quote Originally Posted by garytr24 View Post
    Why wouldn't you want the CPU to be pegged at 100%? Something's wrong if it isn't, especially since rendering is supposed to be easily parallelizable. Every game I've ever played uses as much CPU as it can. In this case of software rendering, it seems like something's going wrong; maybe data isn't moving around as fast as it needs to and is clogging the pipes.
    Because these games are built on low-end, ancient technology. If they peg a modern CPU at 100%, the game is horrendously poorly written, and/or you have vsync turned off and you're just burning power to render frames you will never see.

    People also forget about little things like laptops, where extraneous power usage means more than just a slightly higher electric bill and some hostility toward the environment. It actually decreases how long you can play the game on a plane or train or automobile. (Or boat, zeppelin, gondola, etc.)

    You can easily see the difference between an experienced, intelligent game developer and one who thinks like you when you play games on an iPhone or the like. I have played gorgeous 3D games that let the battery last a good 6 hours, and I have also played silly little 2D games with no crazy effects that drain the battery in 2.5 hours.

    A game's simulation has a maximum rate at which it needs to run. Twitch-heavy shooters and even most other action games usually want to peg out at your monitor's refresh rate, but many other games have no need to run any faster than 30 FPS, and then only for the sake of maintaining some UI/character animations while waiting for player input.

    Even for twitch-heavy games like the Quake-based ones, there's no need to run any faster than the monitor's refresh rate. I keep hearing "pro gamers" (read: losers whose knowledge of game internals and hardware comes from gamer forums filled with other clueless losers all parroting each other's old wives' tales about performance and latency) talk about how the latency of vsync hurts their game, but that's just a load of crap. The maximum 1/60th of a second of latency you might get is dwarfed by the 1/20th of a second (or more, in many cases) of latency between user input and visible output that even the best gaming PC still has.

    Usually, the people complaining that vsync causes latency are people who don't always win, refuse to accept that they're not the God of Gaming, and scapegoat everything they can: they'll end up turning off every feature and spending thousands on high-end computer equipment, and then just continue to yell and scream every time they get fragged that their Internet connection lag-spiked, or they got distracted, or the other guy is hax0ring. They're pretty much only happy when they're fighting other gamers who are really good but not quite as good as they are, making the game appear challenging even though there's only a small chance of losing. You see similar people in non-computer games as well, like amateur sports teams or the majority of US martial arts dojos. Not people you should listen to. Ever.
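
    To put rough numbers on that (ballpark figures, not measurements):

        # Ballpark comparison, purely illustrative: worst-case extra latency
        # from vsync at 60 Hz versus a typical end-to-end input-to-photon delay.
        vsync_worst_case = 1.0 / 60.0   # waiting one full refresh interval
        pipeline_latency = 1.0 / 20.0   # rough end-to-end latency on a good PC
        print(f"vsync adds at most {vsync_worst_case * 1000:.1f} ms "
              f"on top of roughly {pipeline_latency * 1000:.0f} ms already in the pipeline")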

  10. #10
    Join Date
    Aug 2009
    Posts
    2,264

    Default

    I don't consider myself a pro gamer or anything, and the vsync implementations in the games I've played were probably sucky, but I seriously notice vsync lag the same way I notice wireless Logitech G7 mouse lag after playing with a mouse-o-phile Razer mouse for a long time. Just like with rockets in Quake 3, the mind adapts in a few minutes, but still.

    Call me whatever you want, but before I even knew what vsync was I immediately turned it back off due to some feeling of lag =x
