
Thread: Intel Linux Graphics Shine With Fedora 12

  1. #21


    Quote Originally Posted by Kano View Post
    @AdamW

    How do I use your repo with Rawhide?
    just add it like any other repo, with a file in /etc/yum.repos.d/. follow the format of the other files in there, change the name, and use the appropriate path for my repo. something like:

    [video-experimental]
    name=Experimental video for Fedora Rawhide
    baseurl=http://www.happyassassin.net/video-experimental/rawhide/x86_64
    enabled=1
    metadata_expire=7d
    gpgcheck=0

    ought to do it. remember to change the arch if appropriate for your system.
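
    if you'd rather do the whole thing from a shell, something like this ought to work too (an untested sketch - the filename is arbitrary, and run it as root):

    # write the repo definition into /etc/yum.repos.d/
    # (any filename ending in .repo will do; swap x86_64 for i386 if needed)
    cat > /etc/yum.repos.d/video-experimental.repo <<'EOF'
    [video-experimental]
    name=Experimental video for Fedora Rawhide
    baseurl=http://www.happyassassin.net/video-experimental/rawhide/x86_64
    enabled=1
    metadata_expire=7d
    gpgcheck=0
    EOF

    # drop the cached metadata and pull in the experimental packages
    yum clean metadata && yum update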

  2. #22
    Join Date: Dec 2008 · Posts: 160

    I second the call to include Ubuntu 9.10's performance here... is there a benefit (video-wise) to running a distribution with newer packages?

  3. #23
    Join Date: Aug 2007 · Posts: 437

    Now we're waiting for the new stack to run on a real 3D accelerator, not this Intel 3D joke. If Radeons can run as well as this, it will be in a class of its own. Sigh...

  4. #24
    Join Date: Aug 2009 · Location: south east · Posts: 339

    Quote Originally Posted by FunkyRider View Post
    Now we're waiting for the new stack to run on a real 3D accelerator, not this Intel 3D joke. If Radeons can run as well as this, it will be in a class of its own. Sigh...
    Intel is real 3D acceleration. The tests should have included Windows XP as a control group; that would give a baseline to compare against.

    Consider adding UT2004 as a litmus test.

  5. #25
    Join Date: Oct 2008 · Location: Sweden · Posts: 983

    I guess you could argue about what "real" means here; Intel really doesn't target gamers.

    Anyway, comparisons with Win would be welcome.

  6. #26
    Join Date: Aug 2007 · Posts: 437

    Oh please, not a single 'real' Intel 3D accelerator can beat the bottom-of-the-line GeForce 8500 or even a 785G integrated chipset, and if you've seen any PC games in the last ten years or so, you know there is no place in them for Intel's graphics. We are living in the full-HD gaming era, where people treat 1920x1080 with 4x AA as "normal". Function-wise, Intel's graphics can't even render an acceptable frame in modern games. I don't know of a better word than 'joke' to describe this. Intel never took gaming seriously. Or maybe they tried, and it's either just too easy or too hard.

    “They’re the world’s leading designers and manufacturers of CPUs – how hard could it be to build a GPU? I mean, come on, how hard could it be? That crummy little company down the road builds them – we could build them in our sleep. Come on, how hard could it be?” (David Kirk, NVIDIA)

  7. #27
    Join Date: Dec 2008 · Posts: 160

    Quote Originally Posted by FunkyRider View Post
    Oh please, not a single 'real' Intel 3D accelerator can beat the bottom-of-the-line GeForce 8500 or even a 785G integrated chipset, ...I don't know of a better word than 'joke' to describe this. Intel never took gaming seriously. Or maybe they tried, and it's either just too easy or too hard.
    I don't think anyone was disputing that Intel accelerators haven't served the needs of gamers. I would guess that 80% of the broader market doesn't care about games or 3D to any significant degree. So while Intel chips have not served the needs of gamers, that does not mean Intel is out of touch with the broader market.

    It's also not hard to argue that 3D serves a useful need on the general desktop, that there are innovative UI designs that can be accelerated by a GPU, and that the GPU's role in general-purpose computing is becoming very significant (offloading video decoding from the CPU, compositing desktops, accelerating photo editing, etc.). And Intel can be seen bringing stronger 3D offerings to its product lineup.

    Either way, Intel is also very engaged in the open source community with open source drivers, the Moblin platform, etc., so describing this with a word other than "joke", and softening your tone, would show more respect to a significant contributor to the community.

  8. #28
    Join Date: Oct 2007 · Location: Under the bridge · Posts: 2,124

    Either way, Intel is also very engaged in the open source community with open source drivers, the Moblin platform, etc., so describing this with a word other than "joke", and softening your tone, would show more respect to a significant contributor to the community.
    One word: Poulsbo.

    Not to mention shitty, non-conformant OpenGL drivers across the board. Check out the OpenGL forums if you wish to see how developers feel about the nightmare of supporting Intel hardware.

    You may not like it, but Intel is single-handedly holding back the adoption of OpenGL. It's one of the leading reasons why Direct3D is the only real option for consumer graphics.

    So you think *that* is good for the community? Heh, good one - tell us more!

  9. #29
    Join Date: Nov 2009 · Posts: 6

    Quote Originally Posted by FunkyRider View Post
    Oh please, not a single 'real' Intel 3D accelerator can beat the bottom-of-the-line GeForce 8500 or even a 785G integrated chipset, and if you've seen any PC games in the last ten years or so, you know there is no place in them for Intel's graphics. We are living in the full-HD gaming era, where people treat 1920x1080 with 4x AA as "normal". Function-wise, Intel's graphics can't even render an acceptable frame in modern games. I don't know of a better word than 'joke' to describe this. Intel never took gaming seriously. Or maybe they tried, and it's either just too easy or too hard.

    “They’re the world’s leading designers and manufacturers of CPUs – how hard could it be to build a GPU? I mean, come on, how hard could it be? That crummy little company down the road builds them – we could build them in our sleep. Come on, how hard could it be?” (David Kirk, NVIDIA)
    Please think before you post.

    Intel graphics cards are not aimed at "gamers" - Intel doesn't give a shit about those people. Integrated graphics are extremely useful for business people and mom/dad types who only want decent 2D and Google Earth. And guess what, they are the most lucrative market, since they represent 80% of graphics card sales.

    Myself, I just bought a new laptop with the Intel X4500 because of 3 things:
    a) I don't play games
    b) Integrated graphics give 25%+ more battery life
    c) Intel has excellent, working open source drivers, and having been burned by Ati and their crappy fglrx, I really appreciate that

    Sorry if I'm being aggressive, but I'm tired of seeing people complain about integrated graphics performance in games - come on, do you really expect them to play games? Just don't buy them for that.


    BlackStar, can you provide a link for that claim? According to Intel, the latest drivers are OpenGL 2.1 compliant...
    Last edited by caramerdo; 11-25-2009 at 03:54 PM.

  10. #30
    Join Date: Oct 2007 · Location: Under the bridge · Posts: 2,124

    Quote Originally Posted by caramerdo View Post
    BlackStar, can you provide a link for that claim? According to Intel, the latest drivers are OpenGL 2.1 compliant...
    According to the 2.1 spec (PDF), "OpenGL 2.1 implementations must support at least revision 1.20 of the OpenGL shading language." (page 351, J.1). The latest drivers I could find only support GLSL 1.10 - and badly at that.
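
    If you want to see what a driver actually advertises on your own box, glxinfo will print both strings (a quick sanity check, not a conformance test):

    # shows the "OpenGL version string" and "OpenGL shading language
    # version string" lines reported by the driver; a real 2.1 driver
    # must report GLSL 1.20 or later in the second one
    glxinfo | grep -i "version string"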

    Personally, I spent the better part of last week rewriting an application to work on an X4500 chip: downgraded to GLSL 1.10; disabled FBO blits, floating point attachments, and MRT (which translates to no HDR, bloom, shadows, or antialiasing); reduced texture resolution; and finally... the driver produced an utterly broken picture. Imagine memory corruption, similar to TV snow, overlaid on top of the actual rendering.
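
    (If you want to check up front whether a driver even exposes those paths, grep the extension list - this is my rough mapping of the features above to extension names, so treat it as a sketch:)

    # FBO blits          -> GL_EXT_framebuffer_blit
    # float attachments  -> GL_ARB_texture_float
    # MRT                -> GL_ARB_draw_buffers
    # no output for a name means the driver doesn't advertise it
    glxinfo | grep -o -e "GL_EXT_framebuffer_blit" -e "GL_ARB_texture_float" -e "GL_ARB_draw_buffers"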

    That was on Vista, by the way. On Ubuntu, the driver simply refused to render any geometry touched by a vertex shader. In the end I gave up: Ati and Nvidia cards run the application just fine, but Intel ones do not offer any meaningful OpenGL support. Maybe if you limited yourself to 15-year-old GL 1.1-level features the drivers might work correctly - but that's simply not an option.

    It's sad to see the largest IHV (50%+ market share) produce such garbage. Their competitors are shipping OpenGL 3.2 while Intel is still struggling with 2.1 - and yes, this *does* drive developers away from OpenGL and into Direct3D (and Microsoft).

    Edit: Search for Intel on opengl.org to see how depressing the situation really is.
    Last edited by BlackStar; 11-25-2009 at 07:42 PM.
