
People incorrectly assume that AMD drivers suck. They don't.


  • #21
    Radeon cards suck. Every generation except the 7000 series (and we don't know that one for sure yet) has bad OpenGL (so many errors), offloads too much onto the CPU (they need more CPU performance), and has bad code morphing and bit manipulation (not good for the many instructions that need emulation). Personal experience: RIFT under Wine, Core 2 Duo SSSE3 CPU @ 3 GHz, Radeon 4670 @ ~500 GFLOPS: 15 fps (low-quality renderer), 18 fps (high-quality renderer, low settings). With an Nvidia card at ~500 GFLOPS it's 25+ fps. Nvidia pushes more graphics work, while the CPU is maxed out on both.



    • #22
      Originally posted by crazycheese View Post
      Wait, wait, wait, the reason I don't buy ATI/AMD is much simpler. You see, in 2008, when I tried Ubuntu 8.04, which worked perfectly with Nvidia's 6800 card, on an HD4670 Catalyst simply froze my desktop, and it kept freezing the system whenever I switched to a TTY. It was a clean install. In fact, the only thing that worked was the OpenGL-based OpenArena. Later came AMD's open-source story about plans for a niche driver of third-class citizenship. Right now I'm looking at Intel, which does this stuff right, because AMD's open-source driver is still underperforming, and those who disagree are tossed back into proprietary-driver territory. I'm already in that territory with Nvidia; it works stably from here.

      By the way, you should be asking programmers who actually work with OpenGL about the quality of drivers. Ask Carmack, for example.

      I apologize to the ATI/AMD developers and hackers out there, because the paragraph above applies only to the "official" management.
      It is nobody's fault but your own if YOU are too STUPID to use the proper RADEON driver, not the blob shit, whether AMD or nvidia.



      • #23
        Actually, somebody is at fault; it should "just work". Since it doesn't, and the user managed to use the "wrong" driver, somebody in the chain that provided him with the driver is at fault.



        • #24
          Originally posted by boast View Post
          My conclusion? Nvidia is better on linux, so I use it. I would love to get an AMD card since they're usually better bang for buck, but I can't.
          First, your ATI experiences are very old; they don't reflect what the AMD blob is like today, as somebody already posted...

          But my main point is that this assumption can only be true if you really don't care at all whether you have a free or a non-free driver. And it's not only a question of philosophy. For example, I have installed the latest Ubuntu alpha, but after that come the betas, and there are other Linux distributions with unstable versions too. With the free driver I can be 99.999% sure that the graphics driver will work with the kernel, even if it's not a "stable" kernel. With a blob, on the other hand, there is a big chance it will not work because of X or the kernel; even on a stable Ubuntu release you sometimes don't get a good experience with the blobs. Then you may have to wait a few months; on one API change I think Nvidia and AMD choked for nearly a year before they fixed it. That can happen again in the future...

          And then you can only use your card for a few years before buying a new one, because at some point they stop supporting it, and if you then update your Linux it will no longer work... They could even say tomorrow, "because Linus insulted us so much, we are dropping Linux support entirely", and after the next API change in X you couldn't use their drivers or newer distros at all...

          So you can't just compare free and non-free drivers feature by feature; you have to see that a free driver is a feature in itself...

          You may now say that this matters less than not being able to play games; maybe that's true for you, but acting as if a free driver is no feature at all is stupid...
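          To make that concrete, here is a minimal sketch, not from the original post, of how you can check whether the GPU driver loaded on your box is an in-tree free driver or an out-of-tree blob. It just scans /proc/modules; the module names listed are only the usual suspects and are my own assumption:

          ```c
          /* Illustration only: scan /proc/modules for well-known GPU driver
           * modules. In-tree free drivers (radeon, nouveau, i915) ship with
           * the kernel itself; the blobs (nvidia, fglrx) are out-of-tree and
           * have to be rebuilt or re-released to follow kernel/X ABI changes. */
          #include <stdio.h>
          #include <string.h>

          int main(void)
          {
              static const char *free_drv[] = { "radeon", "nouveau", "i915" };
              static const char *blob_drv[] = { "nvidia", "fglrx" };
              char line[512], name[64];
              FILE *f = fopen("/proc/modules", "r");
              if (!f) {
                  perror("/proc/modules");
                  return 1;
              }
              while (fgets(line, sizeof(line), f)) {
                  if (sscanf(line, "%63s", name) != 1)
                      continue;
                  for (size_t i = 0; i < sizeof(free_drv) / sizeof(*free_drv); i++)
                      if (!strcmp(name, free_drv[i]))
                          printf("%s: in-tree free driver, updates with the kernel\n", name);
                  for (size_t i = 0; i < sizeof(blob_drv) / sizeof(*blob_drv); i++)
                      if (!strcmp(name, blob_drv[i]))
                          printf("%s: out-of-tree blob, must match the kernel/X ABI\n", name);
              }
              fclose(f);
              return 0;
          }
          ```

          Build it with any C compiler and run it; if it prints radeon or nouveau, you are on the free stack.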



          • #25
            Originally posted by artivision View Post
            Radeon cards suck. Every generation except the 7000 series (and we don't know that one for sure yet) has bad OpenGL (so many errors), offloads too much onto the CPU (they need more CPU performance), and has bad code morphing and bit manipulation (not good for the many instructions that need emulation). Personal experience: RIFT under Wine, Core 2 Duo SSSE3 CPU @ 3 GHz, Radeon 4670 @ ~500 GFLOPS: 15 fps (low-quality renderer), 18 fps (high-quality renderer, low settings). With an Nvidia card at ~500 GFLOPS it's 25+ fps. Nvidia pushes more graphics work, while the CPU is maxed out on both.

            This can easily be fixed... And you just cite some random Nvidia card, but fail to provide the actual Nvidia graphics card information. Anyway, the real reason the card you mention doesn't perform as well on Linux is that there aren't enough end users providing usable information to the developers.

            Originally posted by droidhacker View Post
            It is nobody's fault but your own if YOU are too STUPID to use the proper RADEON driver, not the blob shit, whether AMD or nvidia.
            Yet again, the issue is not the binary driver. It's also neither AMD nor Nvidia... It's all about how many users are able to return reasonable bug reports that enable the developers (AMD, NVIDIA, and whoever else writes drivers) to fix the bugs.


            Originally posted by Vadi View Post
            Actually, somebody is at fault; it should "just work". Since it doesn't, and the user managed to use the "wrong" driver, somebody in the chain that provided him with the driver is at fault.
            The only people who can fairly be described as at fault are the beta testers and end users, for not writing good bug reports. Developers are powerless to fix a bug unless they are given the right information, and it definitely helps if you can reproduce the bug on two or more machines.



            • #26
              Originally posted by asdx
              Blobs suck, FOSS kernel drivers rock. End of story.
              I agree, but the FOSS driver performance is horrible.



              • #27
                Originally posted by RussianNeuroMancer View Post
                Even tear-free video playback in a compositing environment with the nVidia proprietary drivers? Not likely. Even nVidia employees confirm that it cannot be tear-free (I would like to give a proof link, but the nvnews forum is down).
                I have just tested this, running Gnome Player with VDPAU-accelerated "Solaris (1975)" under XFCE/Compiz. It's tear-free.



                • #28
                  Originally posted by droidhacker View Post
                  It is nobody's fault but your own if YOU are too STUPID to use the proper RADEON driver, not the blob shit, whether AMD or nvidia.
                  The "proper RADEON" driver was doing 5 fps in QuakeArena back then.

                  And right now, the "proper RADEON" driver's performance makes me not want to purchase any mid- to high-end AMD card.



                  • #29
                    Originally posted by curaga View Post
                    That's... just awful. How the fuck do they both, with their huge driver teams and decades of writing drivers, never test two apps running at once?
                    Partly because, of the people who run their drivers, some 98% never use OpenGL. Even on Windows, the OpenGL stability of both AMD and NVIDIA is _terrible_. We shipped a game four months ago using OpenGL that literally crashed on every single NVIDIA driver except the very, very latest, and we had to spend tons of time ripping out and rearranging bits of the graphics architecture until we found out what was causing the crash and how to get things rendering without triggering it. (Hence the interest of Valve and others in FOSS drivers.)

                    The D3D drivers are far more stable. In this case the stability has a bit less to do with D3D being a better API and much more to do with D3D actually being used. Recall that Linux users still frequently run non-GL-accelerated desktops and use DDX drivers and EXA/RENDER to get their basic apps on screen. OS X uses an entirely Apple-written driver architecture. Very few Windows apps use OpenGL, and even most of the ones people think use OpenGL actually use D3D: all Windows implementations of WebGL run over ANGLE, many of the "big 3D content apps" have switched over to D3D on Windows, and the games that use OpenGL are almost entirely simple little 2D indie titles that don't do anything even remotely interesting with the GPU.

                    Granted, Microsoft also has tests for D3D and properly designed the driver model so that every vendor didn't have to reimplement all of D3D internally, while Khronos still doesn't even offer a test suite, much less a core OpenGL framework for the IHVs to build on (and even if they did, at this point the IHVs have too much invested in their internal implementations, so a switch is unlikely to happen without some strong-arming).
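                    Purely as an illustration of the kind of defensive plumbing described above (this is not code from the game in question, and using GLFW for context creation is my own arbitrary choice), here is a small C sketch that records GL_VENDOR/GL_RENDERER/GL_VERSION, the strings a team ends up keying driver-specific workarounds and bug reports on, and drains glGetError() after suspect calls:

                    ```c
                    /* Hypothetical probe: create a GL context with GLFW, log the
                     * driver identification strings, and flush any pending GL errors. */
                    #include <stdio.h>
                    #include <GLFW/glfw3.h>

                    static void check_gl_errors(const char *where)
                    {
                        GLenum err;
                        while ((err = glGetError()) != GL_NO_ERROR)
                            fprintf(stderr, "GL error 0x%04x after %s\n", err, where);
                    }

                    int main(void)
                    {
                        if (!glfwInit())
                            return 1;
                        GLFWwindow *win = glfwCreateWindow(640, 480, "driver probe", NULL, NULL);
                        if (!win) {
                            glfwTerminate();
                            return 1;
                        }
                        glfwMakeContextCurrent(win);

                        /* These strings are what you match against when enabling
                         * vendor-specific workarounds or filing a bug report. */
                        printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
                        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
                        printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
                        check_gl_errors("startup queries");

                        glfwDestroyWindow(win);
                        glfwTerminate();
                        return 0;
                    }
                    ```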



                    • #30
                      Originally posted by crazycheese View Post
                      I have just tested this, running Gnome Player with VDPAU-accelerated "Solaris (1975)" under XFCE/Compiz. It's tear-free.
                      Try these samples: http://rghost.ru/37220226 http://rghost.ru/37220233 http://rghost.ru/37220247
