
Thread: Gallium3D To Enter Mainline Mesa Code

  1. #11
    Join Date
    Jun 2007
    Location
    The intarwebs
    Posts
    385

    Default

    Quote Originally Posted by chelobaka View Post
There is no reason to implement D3D in Gallium3D considering the amount of work needed. WINE does its job well enough. Most of the apps already written for D3D will never be ported to Linux even if we have a D3D-compatible API, because their devs are stuck on the M$ platform.
I'm talking about implementing a Direct3D API that WINE can just pass D3D calls to, instead of having the WINE folks try to reimplement it in OpenGL, so they can focus on the actual Win32 API. What we have now is NOT okay: it's buggy as heck, and there's a performance hit from translation and poor optimization that Gallium should eliminate. I'm thinking it'll be a ton easier to code it with Gallium than it would be to do it in GLSL.
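A toy sketch, in Python, of the two call paths being compared: the current translation route pays a per-call conversion plus an extra GL state call before anything reaches the driver, while a hypothetical state tracker maps D3D semantics directly onto the driver interface. None of these names are real Wine, OpenGL, or Gallium functions; the whole model is illustrative.

```python
# Toy model of the two call paths: every name here is made up for
# illustration, not taken from Wine or Gallium.

call_log = []

def driver_clear(rgba):
    """Stand-in for the driver's actual clear operation."""
    call_log.append(("driver_clear", rgba))

def d3d_clear_via_gl(argb):
    """Translation path: D3D packs clear colors as 0xAARRGGBB ints,
    OpenGL wants normalized floats, so every call pays a conversion
    and an intermediate GL state call before the driver sees anything."""
    rgba = tuple(((argb >> s) & 0xFF) / 255.0 for s in (16, 8, 0, 24))
    call_log.append(("glClearColor", rgba))  # extra GL state-machine step
    driver_clear(rgba)

def d3d_clear_via_state_tracker(argb):
    """State-tracker path: D3D semantics go straight to the driver."""
    rgba = tuple(((argb >> s) & 0xFF) / 255.0 for s in (16, 8, 0, 24))
    driver_clear(rgba)

d3d_clear_via_gl(0xFF0000FF)             # opaque blue: logs 2 calls
d3d_clear_via_state_tracker(0xFF0000FF)  # same clear: logs 1 call
```

The point of the toy is only the shape of the paths: same end result at the driver, one fewer layer of state conversion in between.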

  2. #12
    Join Date
    May 2007
    Location
    Third Rock from the Sun
    Posts
    6,587

    Default

Rumors have it Intel is going to try to push its own API:

    http://www.fudzilla.com/index.php?op...11361&Itemid=1

  3. #13
    Join Date
    Dec 2007
    Location
    Germany
    Posts
    365

    Default

    Quote Originally Posted by mirza View Post
Of course we will use some API, but drawing a scene with ray tracing is completely different than current rendering. Currently, there is a lot of work the CPU must do to paint an image that "feels" realistic. That's why we really need C/C++ right now. In ray tracing, you just set up a scene: no object simplification (removing points) is needed, no tricks for shadows, no calculating which parts of the scene you can/can't see (reducing scene size for faster rendering), etc. You can just send a massive scene to the GPU and use the CPU only to re-arrange objects there. That would greatly simplify the code for the graphical part of a game. Still, AI and physics must be done on the CPU, but Java (or .NET in the case of MS) can do that, I am pretty sure. The benefits are obvious (debugging a multithreaded app is one clear example).
Yeah, that's why I was pointing to the release of DirectX 11 (or 11.1?), which will support ray tracing (or which, at least, is said to support it).

However, I doubt that game developers will really accept Intel's own API (if it really is creating one of its own) unless Intel does something to maintain backwards compatibility. On the other hand, you can't really implement a backwards-compatible ray tracer on current systems, as that would simply screw up the whole API ;D

Look at how the company behind PhysX (now NVIDIA, ofc) handled its new API: it provided a general set of drawing functions that made use of PhysX technology if available, but it also provided a fallback mechanism with "conservative" methods. Thus, quite a few developers adopted it, as there was no risk of losing support for older hardware.
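The scene-description model mirza describes can be sketched in a few lines. This is a minimal, purely illustrative Python ray caster (not any real API): the "scene" is just a list of spheres, and every ray simply tests every object, with no culling, LOD, or shadow tricks in front of it.

```python
# Minimal ray-caster sketch: the scene is a flat list of (center, radius)
# spheres, and each ray tests every object. Illustrative only.
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance along the ray, or None.
    The direction is assumed normalized, so the quadratic's 'a' term is 1."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def render(scene, width, height):
    """Shoot one ray per pixel straight down -z; return the id of the
    nearest object hit at each pixel (or None for a miss)."""
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            origin = (x + 0.5, y + 0.5, 0.0)
            direction = (0.0, 0.0, -1.0)
            # No visibility pre-pass: every object is tested per ray.
            best = None
            for obj_id, (center, radius) in enumerate(scene):
                t = intersect_sphere(origin, direction, center, radius)
                if t is not None and (best is None or t < best[0]):
                    best = (t, obj_id)
            row.append(best[1] if best else None)
        image.append(row)
    return image

scene = [((2.0, 2.0, -5.0), 1.5)]  # one sphere, nothing else to prepare
img = render(scene, 4, 4)
```

The CPU-side setup really is just building that list; all the per-pixel work is uniform and could be pushed to the GPU wholesale.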

  4. #14
    Join Date
    Nov 2008
    Posts
    781

    Default

    Quote Originally Posted by ethana2 View Post
I'm talking about implementing a Direct3D API that WINE can just pass D3D calls to, instead of having the WINE folks try to reimplement it in OpenGL, so they can focus on the actual Win32 API.
The problem with implementing DirectX is figuring out what DX is supposed to do. It's poorly documented, and the internals are unknown to anyone but a few black-voodoo priests at Microsoft. A black-box implementation will always be flawed, no matter whether it's built on top of OGL or G3D.

There are also versions of Wine for Mac, *BSD and possibly others. OGL is available on all of those; G3D isn't. Even if there were a DX implementation on top of G3D on Linux, the DX->OGL wrapper in Wine would still have to be maintained. Adding G3D support does not simplify maintenance; it adds additional code.


Still, having a DX API on top of G3D might be good for the few companies trying to port their games, but most of them use engines with an OGL backend anyway (UT/id). And there might be some performance improvements in Wine.
If you compare that to the work of a full DX implementation on G3D, though, it's hardly worth it.

  5. #15
    Join Date
    Jun 2008
    Posts
    86

    Question

    Quote Originally Posted by deanjo View Post
Rumors have it Intel is going to try to push its own API:

    http://www.fudzilla.com/index.php?op...11361&Itemid=1
    That doesn't make much sense to me. From what I've read, the idea behind Larrabee is that, while DirectX and OpenGL will be supported through software libraries, developers will eventually program for it directly since it's x86, so making a brand new API would be a waste of effort.

  6. #16
    Join Date
    Jun 2007
    Posts
    145

    Default

Right, and Intel's API won't have anything to do with graphics (at least not primarily): http://techresearch.intel.com/articl...Scale/1514.htm

  7. #17
    Join Date
    Dec 2007
    Location
    Germany
    Posts
    365

    Default

    Quote Originally Posted by chaos386 View Post
    That doesn't make much sense to me. From what I've read, the idea behind Larrabee is that, while DirectX and OpenGL will be supported through software libraries, developers will eventually program for it directly since it's x86, so making a brand new API would be a waste of effort.
Uhm... about programming it directly... Wasn't Larrabee supposed to have something like 100 cores? I'm curious what percentage of game programmers have enough knowledge of multi-core systems to split the ray-tracing algorithm optimally across all those cores.
Well yeah, that's why we'll be stuck with DirectX 11 in the future too, and Larrabee won't be the big thing that solves the whole graphics API mess.
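The work-splitting question raised above is usually answered with tiling: ray tracing is embarrassingly parallel per pixel, so the image is cut into rectangular tiles and the tiles are farmed out to cores. A minimal, hypothetical Python sketch (a real Larrabee-class scheduler would also balance load dynamically, since tiles can vary wildly in cost):

```python
# Tile-parallel rendering sketch: cut the image into tiles, hand each
# tile to a worker, reassemble. Illustrative only; `shade` stands in
# for tracing one ray.
from concurrent.futures import ThreadPoolExecutor

def tiles(width, height, tile):
    """Yield (x0, y0, x1, y1) rectangles covering the whole image."""
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            yield (x, y, min(x + tile, width), min(y + tile, height))

def shade(x, y):
    # Stand-in for tracing one ray; returns a deterministic fake value.
    return (x * 31 + y * 17) % 256

def render_tile(rect):
    x0, y0, x1, y1 = rect
    return rect, [[shade(x, y) for x in range(x0, x1)]
                  for y in range(y0, y1)]

def render_parallel(width, height, tile=8, workers=4):
    image = [[0] * width for _ in range(height)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for (x0, y0, x1, y1), block in pool.map(
                render_tile, tiles(width, height, tile)):
            for dy, row in enumerate(block):
                image[y0 + dy][x0:x1] = row  # stitch tile back in place
    return image

img = render_parallel(16, 16, tile=8, workers=4)
```

Because tiles are independent, the result is identical to a sequential render; the hard part hinted at in the post is not the split itself but keeping ~100 cores evenly busy.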

  8. #18
    Join Date
    Apr 2008
    Location
    /dev/random
    Posts
    218

    Default

According to an earlier Phoronix post, Gallium3D has a Direct3D state tracker internally at Tungsten Graphics; it just isn't public yet...

If Gallium3D implements Direct3D, then Wine can simply pass Direct3D calls through to G3D, which might allow the OSS drivers to have better Windows gaming support than NVIDIA's.

Also, Gallium makes a generic XvMC state tracker instantly available to any Gallium driver; soon VA-API will also be supported.

I'm wondering if Gallium will support VDPAU (to some extent, at least...).

  9. #19
    Join Date
    Jun 2006
    Posts
    3,046

    Default

    Quote Originally Posted by chaos386 View Post
    That doesn't make much sense to me. From what I've read, the idea behind Larrabee is that, while DirectX and OpenGL will be supported through software libraries, developers will eventually program for it directly since it's x86, so making a brand new API would be a waste of effort.
Considering that D3D and OpenGL do nothing to actually describe CSG-type ray-traced rendering, the thing they're plugging (even in the latest Game Developer Magazine they're doing it...), they're going to have to come up with something that handles that description. Adding extensions onto D3D and OpenGL doesn't make sense because it would make things quite a bit more painful to do. As for directly programming it... heh... I doubt that everyone's going to be driving it all themselves. There'll be some API or toolchain (à la IXP Microengine C for their former network-engine chips...), or both, that people will use to make things go.
