
Thread: Intel Sandy/Ivy Bridge Gallium3D Driver Merged


  1. #1
    Join Date
    Jan 2007
    Posts
    14,538

    Default Intel Sandy/Ivy Bridge Gallium3D Driver Merged

    Phoronix: Intel Sandy/Ivy Bridge Gallium3D Driver Merged

    The modern Gallium3D graphics driver for supporting Intel Sandy Bridge "Gen6" and Ivy Bridge "Gen7" graphics has been merged into mainline Mesa!..

    http://www.phoronix.com/vr.php?view=MTM1OTc

  2. #2
    Join Date
    Jan 2013
    Posts
    971

    Default

    Intel should use Gallium; there is simply no excuse not to use it, except NIH. Even rewriting it from scratch would be time well spent! It would streamline the kernel and Xorg.
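
    For context, "using Gallium" means implementing Gallium3D's hardware-neutral interface of function pointers, which shared state trackers (OpenGL, video decode, etc.) then sit on top of, instead of writing a complete classic driver per API. A minimal sketch of that pattern, using simplified stand-in structs rather than the real Mesa headers:

    Code:
    /* Sketch of the Gallium "pipe driver" pattern: the driver fills in a
     * table of function pointers behind a hardware-neutral interface.
     * Structs and names here are simplified stand-ins, not Mesa's real
     * headers or signatures. */
    #include <stdio.h>
    #include <stdlib.h>

    struct pipe_screen {
        const char *(*get_name)(struct pipe_screen *screen);
        void        (*destroy)(struct pipe_screen *screen);
    };

    /* What a hypothetical Gen6/Gen7 pipe driver would provide. */
    static const char *gen_get_name(struct pipe_screen *screen)
    {
        (void)screen;
        return "hypothetical Gen6/Gen7 pipe driver";
    }

    static void gen_destroy(struct pipe_screen *screen)
    {
        free(screen);
    }

    static struct pipe_screen *gen_screen_create(void)
    {
        struct pipe_screen *screen = malloc(sizeof(*screen));
        if (!screen)
            return NULL;
        screen->get_name = gen_get_name;
        screen->destroy  = gen_destroy;
        return screen;
    }

    int main(void)
    {
        /* A state tracker only ever sees the generic interface. */
        struct pipe_screen *screen = gen_screen_create();
        if (!screen)
            return 1;
        printf("driver: %s\n", screen->get_name(screen));
        screen->destroy(screen);
        return 0;
    }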

  3. #3
    Join Date
    Oct 2007
    Posts
    1,267

    Default

    Quote Originally Posted by brosis View Post
    Intel should use Gallium; there is simply no excuse not to use it, except NIH
    ...and it would help their competitors if Intel spent its massive resources optimizing Gallium3D infrastructure...

  4. #4
    Join Date
    Jan 2013
    Posts
    971

    Default

    Quote Originally Posted by DanL View Post
    ...and it would help their competitors if Intel spent its massive resources optimizing Gallium3D infrastructure...
    That would help Intel, not so much its competitors. The "competitors" introduced Gallium first, mind you. Given limited manpower, it is stupid to build walled gardens in the graphics stack. This is definitely the most negative point about Intel's drivers.

  5. #5
    Join Date
    Jan 2009
    Posts
    1,325

    Default

    Quote Originally Posted by DanL View Post
    ...and it would help their competitors if Intel spent its massive resources optimizing Gallium3D infrastructure...
    They're already helping their competitors with their work on Mesa, X, and Wayland.

  6. #6

    Default

    Quote Originally Posted by DanL View Post
    ...and it would help their competitors if Intel spent its massive resources optimizing Gallium3D infrastructure...
    Intel's GPUs don't compete anyway; the silicon just doesn't have the performance. The idea of not doing it is moronic and comes from management that doesn't understand the basics of the technology: Intel is a HARDWARE company, and failing to make sure every aspect of your hardware does exactly what it is capable of, because you skimp on drivers, is only detrimental to your sales.

  7. #7
    Join Date
    Aug 2012
    Location
    Pennsylvania, United States
    Posts
    1,873

    Default

    Quote Originally Posted by Kivada View Post
    Intel's GPUs don't compete anyway; the silicon just doesn't have the performance. The idea of not doing it is moronic and comes from management that doesn't understand the basics of the technology: Intel is a HARDWARE company, and failing to make sure every aspect of your hardware does exactly what it is capable of, because you skimp on drivers, is only detrimental to your sales.
    I would disagree with this statement, Kivada. For high-end gaming, you're right. But Sandy Bridge and newer... at least for me, new Intel CPUs with a built-in GPU will replace everything from Nvidia and ATI mid-range and down. High-mid and up, I will still go discrete because I'm assuming those will be workstations or gaming machines, but mid-range and down? Just go Intel with the integrated. It really is more than enough.

    *I type this from a Sandy Bridge low-voltage (aka underclocked) ultrabook, and for the most part I couldn't be happier. I will be upgrading to Broadwell or whatever the next architecture name is after *well, and I'm very excited to see how much more the performance improves.

  8. #8

    Default

    Quote Originally Posted by Ericg View Post
    I would disagree with this statement, Kivada. For high-end gaming, you're right. But Sandy Bridge and newer... at least for me, new Intel CPUs with a built-in GPU will replace everything from Nvidia and ATI mid-range and down. High-mid and up, I will still go discrete because I'm assuming those will be workstations or gaming machines, but mid-range and down? Just go Intel with the integrated. It really is more than enough.

    *I type this from a Sandy Bridge low-voltage (aka underclocked) ultrabook, and for the most part I couldn't be happier. I will be upgrading to Broadwell or whatever the next architecture name is after *well, and I'm very excited to see how much more the performance improves.
    I take it you haven't used any of the AMD APU systems in the same price bracket, then; they are considerably better in the graphics department than Intel's GPUs. In any case, Intel, like AMD and Nvidia, is a HARDWARE company, and the drivers should be considered a necessary part of that hardware, because what use is hardware without a driver to make full use of it?

    As for Nvidia, you are right, but that is because Nvidia pigeonholed themselves by having no x86 CPU and were dependent on AMD and Intel allowing them to make motherboard chipsets. Nvidia's only way out, for a chance at still existing 10 years from now, was going to the much more competitive ARM market, but they currently have little to show for it. They are barely holding on to their GPGPU market via their early push to get as many devs as possible onto the CUDA bandwagon, knowing that if they went with OpenCL there would be very little incentive to buy only Nvidia hardware over whatever gets the highest performance per watt at the time of purchase.

    Back in 2008 Nvidia should have either bought out VIA/S3 or fought to force Intel to grant them a license to make x86 hardware. By now they'd likely have an interesting product on the market.

  9. #9
    Join Date
    Jun 2009
    Posts
    531

    Default

    Quote Originally Posted by Ericg View Post
    I would disagree with this statement, Kivada. For high-end gaming, you're right. But Sandy Bridge and newer... at least for me, new Intel CPUs with a built-in GPU will replace everything from Nvidia and ATI mid-range and down. High-mid and up, I will still go discrete because I'm assuming those will be workstations or gaming machines, but mid-range and down? Just go Intel with the integrated. It really is more than enough.

    *I type this from a Sandy Bridge low-voltage (aka underclocked) ultrabook, and for the most part I couldn't be happier. I will be upgrading to Broadwell or whatever the next architecture name is after *well, and I'm very excited to see how much more the performance improves.
    Performance has improved greatly, to the point where the GT2 graphics on Haswell can comfortably compete with mid-range dedicated hardware, but you are still constrained by system memory. And last I checked, DDR3 is still much slower than GDDR5. Plus, onboard graphics leech off system RAM; for some people that's a no-no if you need every last bit of memory available in the system.
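
    To put rough numbers on that gap, here is a back-of-the-envelope peak-bandwidth comparison. The parts chosen (dual-channel DDR3-1600 versus a mid-range 192-bit GDDR5 card at 6 GT/s) are illustrative assumptions, not measurements:

    Code:
    /* Theoretical peak memory bandwidth:
     * bytes/s = transfer rate (MT/s) * bus width (bytes) * channels.
     * Figures below are nominal for circa-2013 parts (assumption). */
    #include <stdio.h>

    static double peak_gb_per_s(double mega_transfers, int bus_bits, int channels)
    {
        return mega_transfers * 1e6 * (bus_bits / 8.0) * channels / 1e9;
    }

    int main(void)
    {
        /* Dual-channel DDR3-1600: 1600 MT/s on two 64-bit channels. */
        printf("DDR3-1600 dual channel: %5.1f GB/s\n",
               peak_gb_per_s(1600, 64, 2));   /* ~25.6 GB/s */

        /* Mid-range GDDR5 card: 6000 MT/s on a single 192-bit bus. */
        printf("GDDR5 192-bit @ 6 GT/s: %5.1f GB/s\n",
               peak_gb_per_s(6000, 192, 1));  /* ~144.0 GB/s */
        return 0;
    }

    Roughly 25.6 GB/s versus 144 GB/s, which is why integrated graphics are so sensitive to memory speed, and why sharing that bandwidth with the CPU hurts.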

    Also, there has been one very annoying issue with Intel hardware: when compared side by side, a machine using Intel's onboard graphics always seems to have a very blurred display, whereas machines with AMD and Nvidia graphics appear sharp and clear. I have no idea why this is the case, though.

  10. #10
    Join Date
    Aug 2012
    Location
    Pennsylvania, United States
    Posts
    1,873

    Default

    Quote Originally Posted by brosis View Post
    Intel should use Gallium; there is simply no excuse not to use it, except NIH. Even rewriting it from scratch would be time well spent! It would streamline the kernel and Xorg.
    It's not a matter of NIH; the devs have said many times that the only reason they didn't move to Gallium was that they had already spent so much time optimizing the classic driver and didn't want all that work to be for nothing. They were very happy with the classic driver they had written and decided to stick with it.
