
Thread: Intel Haswell Graphics Driver To Be Opened Up Soon

  1. #1
    Join Date
    Jan 2007
    Posts
    15,413

    Default Intel Haswell Graphics Driver To Be Opened Up Soon

    Phoronix: Intel Haswell Graphics Driver To Be Opened Up Soon

    While the Ivy Bridge launch is still a number of weeks out, Intel will soon be publishing their initial hardware enablement code for next year's Haswell micro-architecture...

    http://www.phoronix.com/vr.php?view=MTA1MzU

  2. #2
    Join Date
    Jun 2010
    Posts
    9

    Default

    xf86-video-ati DDX
    I think you mean xf86-video-intel

  3. #3
    Join Date
    Oct 2009
    Posts
    353

    Default

    If it's DX11.1 compliant then it should be OpenGL 4.x compliant, not 3.2.

  4. #4
    Join Date
    Jun 2011
    Posts
    139

    Default

    Quote Originally Posted by cl333r View Post
    If it's DX11.1 compliant then it should be OpenGL 4.x compliant, not 3.2.
    Yes and no. Intel have a bad track record of shipping hardware capable of DirectX 10.0 while the drivers never supported it.

  5. #5
    Join Date
    Apr 2008
    Location
    Zagreb, Croatia
    Posts
    115

    Default

    Let others talk the talk while we tick 'n' tock.
    Open source bits for a uarch.next().next() -
    now that's an open source commitment. Congrats.

  6. #6
    Join Date
    Nov 2007
    Posts
    1,024

    Default

    Quote Originally Posted by cl333r View Post
    If it's DX11.1 compliant then it should be OpenGL 4.x compliant, not 3.2.
    You'd like to think that, wouldn't you?

    While likely very true from a hardware perspective, the Windows drivers simply aren't there for it. Intel GPUs that offer D3D 10 features only offer GL 3.0 on the Windows drivers, despite the hardware theoretically being capable of full GL 3.3. I think the Ivy Bridge drivers will do GL 3.1, despite offering D3D 10.1.
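
    For what it's worth, what bites you in practice is what the installed driver reports, not what the silicon could theoretically do. A minimal sketch of checking that at runtime (plain C, assuming a GL context has already been created and made current by your windowing layer of choice):

    Code:
        #include <stdio.h>
        #include <GL/gl.h>

        /* Must be called with an OpenGL context current. The strings describe
         * what the driver exposes, which may be well below what the hardware
         * could support. */
        void print_exposed_gl_version(void)
        {
            printf("Renderer:   %s\n", (const char *)glGetString(GL_RENDERER));
            printf("GL version: %s\n", (const char *)glGetString(GL_VERSION));
        }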

    This is one of the many, many reasons why OpenGL is just best avoided if you're only developing for Windows. You can argue all you want whether or not it's OpenGL's/Khronos' fault, but the reality is that 60% of the GPUs in the world have more features and better performance using D3D on the operating system used by 90% of people. :/

    (The other reason is that GL is just a poorly designed API -- e.g. binding Uniform Buffer Objects by slot directly in the shader wasn't added to Core until 4.2, and the ARB extension that adds the support to older cards is only supported by NVIDIA's drivers; likewise, binding vertex attributes by slot wasn't added to Core until 3.3. Major D3D-class features came to GL over a year later, e.g. Uniform Buffer Objects, Texture Buffer Objects, primitive restart, and instancing weren't added to GL until 3.1 and it took until GL 3.2 to add geometry shader support to Core. Those features existed as extensions, but they were neither universally available nor universally high-quality, so you couldn't actually use them in a shipping product. Granted, even once in Core, the implementations tended to be buggy, likely due to a lack of any kind of test suite for implementations to be verified against.)
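
    A concrete example of the binding-by-slot complaint, sketched from memory rather than from any particular codebase; the "Lights" block name and slot 0 are just placeholders. Before GL 4.2 the application has to look the uniform block up by name and wire it to a binding slot itself; with GL 4.2 or GL_ARB_shading_language_420pack the shader can declare the slot directly.

    Code:
        /* Assumes core-profile GL function pointers are already loaded
         * (e.g. via GLEW or glad) and a linked program object exists. */
        #include <GL/glcorearb.h>

        /* Pre-GL 4.2 path (core since GL 3.1): the application looks the
         * uniform block up by name and assigns the binding slot itself. */
        void bind_lights_block_gl31(GLuint program, GLuint ubo)
        {
            GLuint block = glGetUniformBlockIndex(program, "Lights");
            glUniformBlockBinding(program, block, 0);    /* block  -> slot 0 */
            glBindBufferBase(GL_UNIFORM_BUFFER, 0, ubo); /* buffer -> slot 0 */
        }

        /* GL 4.2 / GL_ARB_shading_language_420pack path: the shader declares
         * the slot itself, so only the buffer needs binding from the C side:
         *
         *     layout(std140, binding = 0) uniform Lights { vec4 color; };
         *
         * Likewise, since GL 3.3 vertex attributes can pick their slot in
         * the shader:
         *
         *     layout(location = 0) in vec3 position;
         */
        void bind_lights_block_gl42(GLuint ubo)
        {
            glBindBufferBase(GL_UNIFORM_BUFFER, 0, ubo);
        }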

  7. #7
    Join Date
    Oct 2008
    Posts
    3,216

    Default

    The graphics unit on Haswell is expected to be Direct3D 11.1 and OpenGL 3.2 compliant.
    This refers to the state of the Windows drivers, not a hardware limitation. Haswell could be fully GL4 compliant if the driver support is there.

    Sandy Bridge has been quite impressive performance-wise for being Intel integrated graphics, but with Ivy Bridge this performance is going to be upped substantially (as much as twice as fast as Sandy Bridge).
    I've been hearing more like 50% faster, but either way it should be a nice boost.

    This will happen again with Haswell where I'm told its integrated graphics should be comparable to a mid-to-high-end discrete GPU.
    Even if we're talking double IB, which is in turn double SB, that's definitely mid-range discrete territory, not high end. And that's current-generation mid-range - by the time Haswell is released, that will likely be low-end again.

  8. #8
    Join Date
    Feb 2012
    Posts
    52

    Default Come the heck on, AMD!

    Your main competitor has already fully embraced the open-source driver effort and releases hardware enablement code a full YEAR before releasing the actual hardware. You don't see them losing sales or profits; in fact, they're doing pretty well. Why can't AMD just kill off their pathetic excuse for a driver bundle for GNU/Linux (proprietary, at that) and focus all efforts on Mesa/Gallium3D?

  9. #9
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,543

    Default

    Quote Originally Posted by blinxwang View Post
    Your main competitor has already fully embraced the open-source driver effort and releases hardware enablement code a full YEAR before releasing the actual hardware. You don't see them losing sales or profits; in fact, they're doing pretty well.
    Yep. They don't make 3D workstation hardware, so an open-source-only model works quite well for them. If our hardware focus were the same, we would probably take the same open-source-only Linux driver approach; in fact, that's what we did in the pre-R200 days.

    Quote Originally Posted by blinxwang View Post
    Why can't AMD just kill off their pathetic excuse for a driver bundle for GNU/Linux (proprietary, at that) and focus all efforts on Mesa/Gallium3D?
    Because we would lose a big customer base which needs the 3D performance and features that can only be delivered cost-effectively by a proprietary driver. I have answered this question a lot of times already.

    If you ignore the 3D workstation market and then say you can't see any reason for fglrx to exist, it's hard for me to give good answers.

  10. #10
    Join Date
    Jan 2009
    Posts
    1,738

    Default

    Quote Originally Posted by elanthis View Post
    (The other reason is that GL is just a poorly designed API -- e.g. binding Uniform Buffer Objects by slot directly in the shader wasn't added to Core until 4.2, and the ARB extension that adds the support to older cards is only supported by NVIDIA's drivers; likewise, binding vertex attributes by slot wasn't added to Core until 3.3. Major D3D-class features came to GL over a year later, e.g. Uniform Buffer Objects, Texture Buffer Objects, primitive restart, and instancing weren't added to GL until 3.1 and it took until GL 3.2 to add geometry shader support to Core. Those features existed as extensions, but they were neither universally available nor universally high-quality, so you couldn't actually use them in a shipping product. Granted, even once in Core, the implementations tended to be buggy, likely due to a lack of any kind of test suite for implementations to be verified against.)
    Off topic, but I often have these visions of elanthis walking into the Khronos offices with explosives and detonating them after yelling "D3D Akbar".

    P.S.
    If they can't do it well enough, DIY.
