There are no plans inside Wine to implement DirectX on top of Gallium 3D.
Wine is meant to support platforms that don't use Gallium3D (Apple, the *BSDs, the DirectX replacement in VirtualBox, etc.), so the D3D->OGL wrapper needs to be maintained anyway.
I was under the impression that most desktop *nixes (including the *BSDs and Solaris) were slowly moving toward Gallium3D. I suspect having VirtualBox talk directly to a Gallium3D DX state tracker would be far better than using Wine's DX libraries to convert DX calls to OpenGL calls and then passing them on to Gallium3D as OpenGL.
I asked the developers about this very thing toward the beginning of this year, and the reason they gave is that DX requires parts of the WinAPI; that's the real reason it would be troublesome to implement on top of Gallium3D.
Another idea I've heard is Wine skipping Gallium3D altogether and implementing DX by talking to the GPU over the GEM-ified DRM, but that's another story... (And it would likely never work with closed-source drivers anyway, so given nVidia's stance it's kinda moot.)
We're getting off-topic from Gallium3D, so I'll stop here.
Bloat doesn't come from X11. Enlightenment and XFCE are proof positive of that.
X11 isn't the bloat; X11 is a very good protocol for the purpose it was designed for. The problem is that the X server is getting bloated because people are trying to mold the X11 protocol into something it isn't. Frameworks like Gallium that move stuff out of the X server are a good thing. Frankly, I would like to see a new revision of the X protocol to suit the new demands on graphics these days. An X12, if you will. I doubt we will see it any time soon, but I think eventually it will happen. Graphics demands are changing rapidly, and something will eventually have to give. The question is: will it be the platform or the devs?
The problem is that the X server is getting bloated because people are trying to mold the X11 protocol into something it isn't.
... what? Do you have anything resembling technical grounds for claims like that?
The X server does relatively little, and is one of the lightest pieces in the entire desktop stack. It handles input, window area management, pixmap management, and a handful of rendering protocols. That's about it.
Even if you cut X out of the picture, you just have to replace it with something that does 80% of the exact same thing, and the other 20% doesn't actually go away; it just gets moved somewhere else.
Or you get something like Wayland which -- while cool, yes -- is inherently less flexible and more error prone as it requires far more functionality to be built into the display server itself, since it doesn't rely on a separate window manager or compositing manager process.
On top of it all, X is modular. If you do think some particular X extension is bloat, don't worry -- it can be removed. E.g., if you think RENDER is bloat, the good news is that once RENDER is no longer a real gain, it can simply be dropped from the server (it's not core protocol, so applications and toolkits are already required to check for it before using it), and the toolkits can drop support for it as well. That's the same as designing an X12, except you don't have to arbitrarily break the whole desktop in the process.
If you really really just want a new protocol Just Because(tm), then you probably want to look at the aforementioned Wayland.