AMD Radeon RX 7800 XT & RX 7700 XT Linux Performance

  • AMD Radeon RX 7800 XT & RX 7700 XT Linux Performance

    Phoronix: AMD Radeon RX 7800 XT & RX 7700 XT Linux Performance

    Last month AMD announced the Radeon RX 7700 XT and RX 7800 XT graphics cards, and today these cards go on sale for $449 and $499 USD, respectively. Today also marks the review embargo lift, so I'm now able to talk about the Linux support and performance of these new RDNA3 graphics cards designed for 1440p gaming.


  • #2
    Some tests were CPU bound (like CS:GO on the more powerful GPUs), which skews the perf-per-watt metric.

    The 7800 XT looks like a nice sweet spot.
    Last edited by kruger; 06 September 2023, 09:21 AM.
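
    To see why a CPU limit skews perf-per-watt, here is a minimal sketch with invented numbers (the fps cap, uncapped rates, and board power are all assumptions for illustration): both cards land at the same CPU-limited frame rate, so the card that draws more power scores worse on fps per watt even though it is the faster GPU.

      # Toy model of a CPU-bound benchmark (all numbers are made up).
      CPU_CAP_FPS = 400  # hypothetical CS:GO CPU limit

      gpus = {
          "RX 7700 XT": (550, 200),  # (uncapped fps, watts), invented
          "RX 7800 XT": (700, 250),
      }

      for name, (uncapped_fps, watts) in gpus.items():
          fps = min(uncapped_fps, CPU_CAP_FPS)  # both cards hit the CPU cap
          print(f"{name}: {fps / watts:.2f} fps per watt")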


  • #3
    Official support for ROCm would be great for sure. The cards look like a nice sweet spot.

    I'm wondering if a self-compiled ROCm would work, though, or which parts do not work. Many of the ROCm components list gfx1100, gfx1101 and gfx1102 in their CMakeLists.txt or in other places in the source code.
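
    As a quick smoke test for a self-compiled stack, one common approach is to run a tiny HIP workload through a ROCm build of PyTorch (which exposes HIP devices via the torch.cuda API). A minimal sketch, assuming a ROCm PyTorch build is installed; the HSA_OVERRIDE_GFX_VERSION=11.0.0 override is a widely used workaround for RDNA3 parts not yet on the official support list, on the assumption that gfx1101/gfx1102 are close enough to gfx1100:

      import os
      # Must be set before the HIP runtime initializes (i.e., before importing torch).
      os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "11.0.0")

      import torch

      if torch.cuda.is_available():  # ROCm builds report HIP devices here
          print("HIP device:", torch.cuda.get_device_name(0))
          x = torch.rand(1024, 1024, device="cuda")
          print("matmul checksum:", (x @ x).sum().item())
      else:
          print("No HIP device visible; check the ROCm install and kernel driver.")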


  • #4
    Where is the perf-per-dollar geometric mean? (A sketch of how one could be computed follows below.)
    ## VGA ##
    AMD: X1950XTX, HD3870, HD5870
    Intel: GMA45, HD3000 (Core i5 2500K)
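
    For reference, such a metric is just the geometric mean of the per-test results divided by the card's price. A minimal sketch with invented fps numbers (only the $449/$499 launch prices come from the article):

      from math import prod

      # {card: ([per-test fps, invented], launch price in USD)}
      results = {
          "RX 7700 XT": ([88, 120, 64], 449),
          "RX 7800 XT": ([104, 141, 77], 499),
      }

      for card, (fps, price) in results.items():
          geomean = prod(fps) ** (1 / len(fps))
          print(f"{card}: {geomean:.1f} fps geomean, {geomean / price * 100:.2f} fps per $100")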


  • #5
    Originally posted by kruger
    Some tests were CPU bound (like CS:GO on the more powerful GPUs), which skews the perf-per-watt metric.

    The 7800 XT looks like a nice sweet spot.
    The 7700 XT was pretty nice in regards to having low thermals with moderate performance.

    NGL, I think it should be priced down to $399: at $449 vs $499, the extra $50 is rather moot whether you're broke AF and have to save for months on end or have money to spend and splurge. As someone in the broke-AF category, I'm a bit biased: I paid $330 for a new 6700 XT from Newegg nine months back (which is what they're selling for now, if y'all need a new GPU), and I don't think a 36% increase in price will net a 36% increase in performance.


  • #6
    Well, the benchmarks would be better if they compared against the last-generation GPUs like the RX 6700 XT and RX 6800 XT.


  • #7
    Originally posted by Zyten
    Official support for ROCm would be great for sure. The cards look like a nice sweet spot.

    I'm wondering if a self-compiled ROCm would work, though, or which parts do not work. Many of the ROCm components list gfx1100, gfx1101 and gfx1102 in their CMakeLists.txt or in other places in the source code.
    If they could actually make ROCm usable by the general populace (i.e. like Nvidia did to build their dev pool), it would totally change the game for AMD and actually make them relevant in the exploding ML/AI market!


  • #8
    The most useless review I have seen in a while, in the absence of both the 4060 Ti and the 4070. And yes, I know the reasons, but they don't make the review more useful.


  • #9
    I must be getting old and cantankerous, or maybe I have been out of work for so long that I have a different appreciation for the value of a dollar, but I can't bring myself to spend close to $500 for a video card, or any individual component for that matter, just to play video games.

    I tend to be a practical man, and as such I would rather buy an Xbox or PS5 for gaming than build a gaming PC.

    I also realize that the human eye, even that of highly trained Air Force pilots, can't see more than 120 fps. So telling me that a card can achieve 300+ fps means little, especially when the refresh rate on most monitors is capped well below that number: even if a card is rendering that many frames, it is still prevented from displaying them.
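
    As a toy illustration of that refresh-rate cap (assuming vsync, ignoring variable refresh rate and tearing, and with the 144 Hz panel picked arbitrarily):

      # With vsync, the panel's refresh rate bounds how many frames are shown.
      def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
          return min(rendered_fps, refresh_hz)

      for fps in (120, 300, 500):
          print(f"{fps} fps rendered -> {displayed_fps(fps, 144):.0f} fps shown on a 144 Hz panel")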

    All in all, gaming cards have been a waste of money for a while now; I would say more than two decades, going all the way back to the GeForce 2 Ultra.

    Now, if I were making money with the card, for instance doing 3D rendering or scientific analysis, then I could see spending that much cash on a video card.

    But for gaming? No way.


  • #10
    Originally posted by bug77
    The most useless review I have seen in a while, in the absence of both the 4060 Ti and the 4070. And yes, I know the reasons, but they don't make the review more useful.
    All Nvidia cards could very well be omitted and I wouldn't care, because they are no more than bricks on Linux, but at least AMD's past generation should have been present.
    ## VGA ##
    AMD: X1950XTX, HD3870, HD5870
    Intel: GMA45, HD3000 (Core i5 2500K)
