
Thread: R600 Open-Source Driver WIth GLSL, OpenGL 2.0

  1. #201
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411

    Default

    Quote Originally Posted by Mickabouille View Post
    This must be a bot.

    Would this thread qualify as a failed Turing test?

    Note: I'm not a native English speaker either, but I try to make sure my sentences have at least partially sound structure. In fact, it must be even harder for me to parse those sentences than for people who speak English fluently.

    BTW, what is "praxis"?
    Sorry, praxis = practice.

  2. #202
    Join Date
    Aug 2009
    Location
    Russe, Bulgaria
    Posts
    513

    Default

    Hello. Since this thread is active and I couldn't find a bug report like this, I am posting it here. I am using the following configuration:
    Linux 2.6.32
    libgl 7.7
    xf86-video-ati 6.12.4
    libdrm 2.4.17

    I have the following bugs, apart from being unable to turn on KMS.
    I am using glxgears as a benchmark (yes, I know it isn't really one).
    In a non-compositing environment, when the glxgears window origin is moved off the current desktop (negative left/top coordinates), rendering stops (the last rendered frame stays, clipped by the window border of course) and the frame rate shoots up to ~4000 FPS on my hardware (RS780).
    On the other hand, when the glxgears window is entirely under another window, the frame rate drops to ~100 FPS.
    When the origin is outside the desktop and the window is entirely covered at the same time, the FPS is ~100 too.

    In a compositing environment, when I move a window with a GL context, the window moves (its content black), but the GL rendering context stays (and continues to render) in the old place, above all other windows. When I finish the move, the rendering moves back into the GL window as usual. When the window origin is moved beyond the desktop dimensions, the window content stays black.

    If this is not the appropriate place, please tell me where I should post these issues.
    Thanks.
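    A minimal sketch of how the system details above could be collected before filing such a report (the file name and the log patterns here are assumptions, not part of any official triage procedure; adjust for your hardware and distro):

    ```shell
    #!/bin/sh
    # Hypothetical helper: gather the details developers usually ask for
    # when triaging a KMS/3D bug (kernel, boot parameters, DRM log, GL stack).
    {
      echo "== kernel =="
      uname -r
      echo "== boot parameters (look for radeon.modeset=1) =="
      cat /proc/cmdline 2>/dev/null || echo "no /proc/cmdline"
      echo "== recent DRM messages =="
      dmesg 2>/dev/null | grep -i drm | tail -n 20
      echo "== OpenGL renderer =="
      if command -v glxinfo >/dev/null 2>&1; then
        glxinfo | grep -E "OpenGL (renderer|version)"
      else
        echo "glxinfo not installed"
      fi
    } > gpu-report.txt
    echo "wrote gpu-report.txt"
    ```

    Attaching the resulting gpu-report.txt (plus your Xorg log) to a bug report usually saves a round-trip of questions.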

  3. #203
    Join Date
    May 2008
    Location
    Germany/NRW
    Posts
    510

    Default

    Quote Originally Posted by Mickabouille View Post
    This must be a bot.

    Would this thread qualify as a failed Turing test?

    Note: I'm not a native English speaker either, but I try to make sure my sentences have at least partially sound structure. In fact, it must be even harder for me to parse those sentences than for people who speak English fluently.

    BTW, what is "praxis"?
    It's actually quite funny when you're a native German speaker.
    He often slips German words into his sentences, e.g. praxis, tausend, ...

    Anyway, to move this thread to a slightly less off-topic topic:
    has anyone had any luck yet running a real-world >=OGL2 application (= a game, obviously :P) on r300g? I keep trying to run HoN, Nexuiz, the Penny Arcade games and some others, but so far without success.
    It does pass some of the piglit tests, though...
    I guess it's too early to file bug reports yet, or isn't it?

  4. #204
    Join Date
    Oct 2007
    Location
    Under the bridge
    Posts
    2,144

    Default

    Quote Originally Posted by Qaridarium View Post
    Quote Originally Posted by BlackStar
    As for wine graphics performance, games like Diablo 2, Counterstrike, World of Goo, Plants vs Zombies are almost unplayable on my Atom/945 netbook
    I'm so sorry about this. In the future it will be faster; Intel only needs OpenGL 3.2.
    Intel is generally very slow in 3D; a 20€ HD 4350 is 200-300% faster than your Atom GPU...
    Yes, but you forgot to quote the rest of my sentence, which said that these games are also slower on my Core2/Quadro notebook. This Quadro card supports OpenGL 3.2, so if what you said was true, it should be faster through wine than native. It isn't, however.

    I don't doubt that wine can give a speed increase in some rare cases, but this speed increase either happens on software that doesn't really need it (WOW is 5 years old now? 3d mark 03 is 6 years old?) or on hardware that doesn't have a problem anyway (e.g. DX10/GL3.2 cards running DX8 software).

    One final nitpick: WOW minimaps use GL pbuffers, which are *not* Windows-specific. However, no one sane wants to deal with this ugly 6-year-old POS extension, which is why the OSS drivers won't get it anytime soon. This is why WOW/DX tends to run better than WOW/GL in wine.

    In the past, AMD got foaming rabies from some of my posts in this forum.
    Lol

  5. #205
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,458

    Default

    Quote Originally Posted by Qaridarium View Post
    The king himself, Lord Bridgman, gave me a tiny chance; that is the truth.

    The tiny point is that there is a real chance that in CAD software, gamer cards run faster in DirectX than in OpenGL.
    YOU stated that CAD software ran faster in DX on gamer cards than in GL, and stated it as a fact. I didn't have any information to prove or disprove it so said it was "possible", based on your prior statement. If you don't have any justification for that statement, you can't use my response as justification either.

    Quote Originally Posted by Qaridarium View Post
    That's because in the OpenGL world they have more control: they cut out more extensions, and then these features need to be emulated by the CPU, which slows down the OpenGL CAD software on a gamer card.

    On the DirectX side, M$ can bypass this and integrate optimizations into the DX core.
    One more time, we don't cut out extensions on the gaming drivers. The workload is different between workstation and gamer applications (workstation tends to be much more vertex-intensive, gaming tends to be fragment/pixel-intensive) and we optimize differently for those workloads. More specifically, if an optimization improves the way a workstation application runs but slows down or interferes with a game then there's no way that change could make it into the regular drivers but it *would* go into the workstation drivers.

    The other appeal of the workstation drivers is that the hardware/software combination is ISV-certified on a range of workstation applications, which is usually not possible for regular drivers.

    Quote Originally Posted by Qaridarium View Post
    But yes, I cannot prove this in practice!
    With respect, if you can't prove it in theory *or* in practice then you don't have a very strong argument. You really would be better off presenting it as a theory or a possibility, with a chance that someone else might provide additional information, than presenting it as a statement of fact and having people throw feces at you.
    Last edited by bridgman; 12-30-2009 at 10:46 AM.

  6. #206
    Join Date
    Oct 2008
    Posts
    81

    Default

    Keep up the great work, bridgman.
    Not enough appreciation going on around here.

  7. #207
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,458

    Default

    Thanks, although the appreciation should keep going to the hands-on developers.

    I'm just here trying to prevent Crimes against Logic

  8. #208
    Join Date
    Oct 2008
    Posts
    81

    Default

    Right, on behalf of the developers *cough*.

  9. #209
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,458

    Default

    I knew that

  10. #210
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411

    Default

    Quote Originally Posted by bridgman View Post
    YOU stated that CAD software ran faster in DX on gamer cards than in GL, and stated it as a fact. I didn't have any information to prove or disprove it so said it was "possible", based on your prior statement. If you don't have any justification for that statement, you can't use my response as justification either.



    One more time, we don't cut out extensions on the gaming drivers. The workload is different between workstation and gamer applications (workstation tends to be much more vertex-intensive, gaming tends to be fragment/pixel-intensive) and we optimize differently for those workloads. More specifically, if an optimization improves the way a workstation application runs but slows down or interferes with a game then there's no way that change could make it into the regular drivers but it *would* go into the workstation drivers.

    The other appeal of the workstation drivers is that the hardware/software combination is ISV-certified on a range of workstation applications, which is usually not possible for regular drivers.



    With respect, if you can't prove it in theory *or* in practice then you don't have a very strong argument. You really would be better off presenting it as a theory or a possibility, with a chance that someone else might provide additional information, than presenting it as a statement of fact and having people throw feces at you.
    In the end, Windows users use RivaTuner to turn a Radeon into a FireGL...

    In the end, everyone who buys a FirePro/FireGL is an idiot!
    Last edited by Qaridarium; 12-30-2009 at 09:03 PM.
