
Thread: R600 Open-Source Driver With GLSL, OpenGL 2.0

  1. #181
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,146


    Quote Originally Posted by Qaridarium View Post
    That's wrong! Wine wins tons of benchmarks!
    Wine wins on 3DMark2000 and 3DMark2001!!!
    These do not use HLSL. They are entirely different beasts.

    You have a wrong understanding of the HLSL-to-GLSL bridge.
    There is no need to translate it all the time!

    Only the game starts slower!

    After that, the completely translated GLSL code is loaded onto the card and runs nonstop.
    In theory there is no speed loss, and you can even do optimizations...
    You can handle DX8 code in DX10/DX11 style...
    You translate the code once, when the shader is compiled, but the translated code carries runtime overhead. In the worst case, the overhead can exceed the capabilities of your card, meaning the resulting code will not run or will fall back to software emulation.

    Meaning you might need a newer card to run old code through wine, when an older card would have sufficed in native D3D.
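
    To make that concrete, here is a sketch of the kind of epilogue an HLSL-to-GLSL translator has to append to every vertex shader, since D3D and GL disagree on clip-space depth and window origin. Illustrative only, not Wine's actual code; posFixup is a made-up uniform name.
    Code:
    #version 120
    // Sketch: fixups a D3D-to-GL shader translator must insert (illustrative).
    uniform mat4 mvp;
    uniform vec4 posFixup;   // hypothetical: y-flip sign and half-pixel offset
    attribute vec4 position;

    void main()
    {
        gl_Position = mvp * position;   // the game's own math ends here
        // --- extra instructions appended by the translator: ---
        gl_Position.z = gl_Position.z * 2.0 - gl_Position.w; // D3D z in [0,1] -> GL [-1,1]
        gl_Position.y = gl_Position.y * posFixup.y;          // flip the window-y convention
        gl_Position.xy += posFixup.zw * gl_Position.ww;      // half-pixel offset, scaled by w
    }
    Every translated shader pays for those few extra instructions on every vertex it processes.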

    A DX9-based game runs well on an X1950... but the same game loses in Wine on that card...
    Yet a much slower card like the 4350 or 54xx can "win".
    That's because Wine translates the old code into new OpenGL 3.2-style code...

    Much better texture compression saves RAM bandwidth and brings more FPS!
    Win in support, yes (see above). Win in speed, not really; at least not with the specific cards you quoted.

    what da fu.k?????

    "EXT_geometry_shader" is an Nvidia-only extension, but OpenGL 3.2 does not need it for the same functionality, because OpenGL 3.2 has geometry shaders in core!
    Oh, please.

    Code:
    $ glxinfo
    [...]
    OpenGL renderer string: ATI Radeon HD 4800 Series
    OpenGL version string: 3.2.9232
    [...]
    , GL_EXT_geometry_shader4,
    This is on my Ati 4850 with 9.12 drivers.
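
    And either way, a GL 3.2 context has geometry shaders in core GLSL 1.50, no vendor extension needed. A minimal pass-through stage, just as a sketch:
    Code:
    #version 150
    // Minimal pass-through geometry shader: core in OpenGL 3.2,
    // no GL_EXT_geometry_shader4 or NV-specific extension required.
    layout(triangles) in;
    layout(triangle_strip, max_vertices = 3) out;

    void main()
    {
        for (int i = 0; i < 3; ++i) {
            gl_Position = gl_in[i].gl_Position; // forward each vertex unchanged
            EmitVertex();
        }
        EndPrimitive();
    }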

    You can also emulate a 'tessellation shader'; that's because of the AMD OpenGL extensions! ...
    DX11-level tessellation works differently from ATI's DX10-level tessellation hardware. It's close but not identical, and all discussions I've read on this indicate that these extensions can't be used to emulate DX11-level tessellation. Feel free to prove me wrong, though.

    You do not get the point of Wine... Wine isn't an emulator...

    There is no emulator!...

    Wine also does not emulate HLSL shader code... Wine is a compiler!
    Wine is a shader compiler: it compiles old shaders into new-style shaders,
    compiles HLSL shaders into GLSL shaders...

    There is no emulator! Native hardware speed! NO emulator!
    Yeah right, Wine is not an emulator because it recompiles HLSL code to GLSL. I guess PCSX2 is not an emulator either, then? Hey, it recompiles MIPS code into x86!

  2. #182
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411


    Quote Originally Posted by BlackStar View Post
    These do not use HLSL. They are an entirely different beasts.
    Why does Wine win??? By over 40%???




    Quote Originally Posted by BlackStar View Post
    You translate the code once, when the shader is compiled, but this translation has runtime overhead. In the worst case, the overhead can overflow the capabilities of your card, meaning the resulting code will not run or will fall back to software emulation.
    If your card can handle the output of the compiler, the result can even be faster!
    Wine is up to 50% faster than Vista in WoW (DirectX 9) on the same hardware!



    Quote Originally Posted by BlackStar View Post
    Meaning you might need a newer card to run old code through wine, when an older card would have sufficed in native D3D.
    You only need a new card for the new extensions...

    You do not need a faster card...

    DX11 hardware, for example, can get more FPS out of lower RAM bandwidth just because of its extremely good texture compression!




    Quote Originally Posted by BlackStar View Post
    Win in support yes (see above). Win in speed not really, at least not with these specific cards you quoted.
    A long time ago I tested this... X850 vs HD4350...

    Theoretically the X850 is much faster: more shader power, more RAM bandwidth...

    but in Wine the HD4350 is over 30% faster in 3DMark03!

    And yes, 3DMark03 uses shaders!



    Quote Originally Posted by BlackStar View Post
    DX11-level tesselation works differently than Ati's DX10-level tesselation hardware. It's close but not identical and all discussions I've read on this indicate that these extensions can't be used to emulate DX11-level tesselation. Feel free to prove me wrong, though.
    You are wrong, if only because you are the only person talking about DX10 hardware...
    You can handle DX11 tessellation on a 5870 by using OpenGL!
    Yes, you can't use old hardware for new extensions, but the same hardware can do the same thing...


    Quote Originally Posted by BlackStar View Post
    Yeah right, Wine is not an emulator because it recompiles HLSL code to GLSL. I guess pcsx2 is not an emulator either then? Hey, it recompiles mips code into x86!
    PCSX2 emulates the hardware! Wine does not emulate any hardware!

    MIPS to x86 is not the same as HLSL to GLSL...

    Low-level assembly code vs a high-level programming language.

    PCSX2 also emulates in real time...

    Wine does not translate the HLSL code in real time; Wine does most of the work before the game starts.

    The FPS does not drop, because once the code is running there is no need to recompile it.

  3. #183
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,514


    Q, BlackStar is saying that when Wine translates shaders it often has to insert additional instructions into the shader code, and it's those additional instructions that could slow down execution relative to running natively on Windows.

    If you reply with "but 3DMarkxxx is faster so that's not true", I'm going to vote for a ban.

  4. #184


    Quote Originally Posted by bridgman View Post
    Q, BlackStar is saying that when Wine translates shaders it often has to insert additional instructions into the shader code, and it's those additional instructions that could slow down execution relative to running natively on Windows.
    But isn't the point of all the OpenGL 3.2 "Wine extensions" to obviate the need to do this?

  5. #185
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,146


    Quote Originally Posted by Qaridarium View Post
    A long time ago I tested this... X850 vs HD4350...

    Theoretically the X850 is much faster: more shader power, more RAM bandwidth...

    but in Wine the HD4350 is over 30% faster in 3DMark03!
    You originally said X1950 vs HD4350 and I really doubt the latter will outperform the former in any meaningful test. X850 is very different in capabilities from the X1950 (SM2.0b vs SM3.0), so the result of this comparison does not transfer to the former.

    Not to mention that this 30% number is meaningless on its own. Did you use the same system? CPU? OS? Driver version? Wine version?

    You can handle DX11 tessellation on a 5870 by using OpenGL!
    No, you cannot. Not yet. AMD_vertex_shader_tessellator is a very different beast from DX11 tessellation shaders, and we'll have to wait for OpenGL 3.3/4.0 before the necessary functionality is exposed. My guess is that this won't happen before Nvidia releases its own DX11 hardware.
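
    For reference, once that functionality arrives, a DX11-style hull stage should map onto something like the following tessellation control shader. This is a sketch of the expected GLSL 4.00-level syntax, not something any current driver accepts:
    Code:
    #version 400
    // Sketch: a trivial "hull shader" equivalent as a GL tessellation
    // control shader; hypothetical until GL 4.0-class drivers ship.
    layout(vertices = 3) out;

    void main()
    {
        // Pass the patch's control points through unchanged.
        gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
        if (gl_InvocationID == 0) {
            gl_TessLevelOuter[0] = 4.0;  // fixed tessellation factors here,
            gl_TessLevelOuter[1] = 4.0;  // where DX11 would compute them
            gl_TessLevelOuter[2] = 4.0;  // per patch in the hull shader
            gl_TessLevelInner[0] = 4.0;
        }
    }
    As far as I can tell, AMD's extension only exposes a global tessellation factor, not this programmable per-patch stage, which is why it can't stand in for the DX11 model.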

    Yes, you can't use old hardware for new extensions, but the same hardware can do the same thing...
    Yes, iff the drivers expose this functionality.

    I won't argue the point on Wine/emulation, other than to say that HLSL-to-GLSL recompilation had not even been conceived when the "Wine Is Not an Emulator" motto was penned. The "not an emulator" part refers to x86 instructions, not shader code.

  6. #186
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,146


    Quote Originally Posted by Alex W. Jackson View Post
    But isn't the point of all the OpenGL 3.2 "Wine extensions" to obviate the need to do this?
    Nope. The new interop extensions improve compatibility in a few parts of the pipeline (e.g. VBO loading, polygon rendering) but they don't affect shaders directly.

  7. #187


    Quote Originally Posted by BlackStar View Post
    Nope. The new interop extensions improve compatibility in a few parts of the pipeline (e.g. VBO loading, polygon rendering) but they don't affect shaders directly.
    From the definition of ARB_fragment_coord_conventions on opengl.org (emphasis added):

    What is the primary goal of this extension?

    RESOLVED: The goal is to increase the cross-API portability
    of fragment shaders. Most fragment shader inputs (texture
    coordinate sets, colors) are treated identically among OpenGL
    and other 3D APIs such as the various versions of Direct3D.
    The chief exception is the fragment coordinate XY values which
    depend on the 3D API's particular window space conventions.

    We seek to avoid situations where shader source code must be
    *non-trivially modified* to support differing window-space
    conventions. We also want to minimize the performance effect on
    fragment shader execution. Rather than an application modifying
    the shader source to add extra operations and parameters/uniforms
    to adjust the native window coordinate origin, we want to control
    the hardware's underlying convention for how the window origin
    is provided to the shader.
    ?

  8. #188
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,146


    Quote Originally Posted by Alex W. Jackson View Post
    From the definition of ARB_fragment_coord_conventions on opengl.org (emphasis added):
    Bah, forgot about the coordinate conventions. This could have some positive impact, but wasn't this available as an NV-specific extension prior to GL 3.2?
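
    For reference, the extension lets a fragment shader simply declare the D3D convention instead of having flip math patched in. A minimal sketch:
    Code:
    #version 130
    #extension GL_ARB_fragment_coord_conventions : require
    // Request D3D9-style window coordinates: origin at the top-left,
    // pixel centers on integers, so no per-fragment flip is needed.
    layout(origin_upper_left, pixel_center_integer) in vec4 gl_FragCoord;

    out vec4 color;

    void main()
    {
        // gl_FragCoord.y now counts down from the top, matching D3D.
        color = vec4(fract(gl_FragCoord.xy / 64.0), 0.0, 1.0);
    }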

  9. #189
    Join Date
    Oct 2009
    Posts
    71


    The article states that I don't get to play Unigine Heaven on Linux, but... I do get to play chromium-bsu AND glchess! Playing 1080p movies also works just fine, so I'm happy enough for now.

  10. #190
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411


    Quote Originally Posted by bridgman View Post
    If you reply with "but 3DMarkxxx is faster so that's not true" I'm going to vote for a ban
    That means... I don't care about theoretical explanations of why Wine is slower, or must be slower, if Wine in reality is faster...

    In the past I did a lot of benchmarks, Windows vs Linux...

    Wine and zlib... Wine is 244% faster than Windows XP on my system!...
    Wine is faster in 3DMark2001/2000 too...
    Wine is faster in WoW...

    I did some more benchmarks, Wine vs Windows:

    Everest AES: Windows XP 13893, Linux 14074
    Everest Queen: Windows XP 15851, Linux 16031
    Everest zlib: Windows XP 26219, Linux 64104
    Everest PhotoWorxx: Windows XP 8814, Linux 10038
    7-Zip: Windows XP 7544, Linux 7825

    So much bullshit talk about why Wine should be slower than native DirectX...

    I'm asking: why is Wine so fast?


    "I'm going to vote for a ban"

    Admit it: you get paid by AMD for this.
