I have the following questions about AMD:

  1. If I want to switch to an AMD GPU, do I need to change my motherboard? Or do all motherboards work with both AMD and Nvidia GPUs?
  2. Do I need to buy an AMD CPU as well? Or can I use my existing Intel CPU with an AMD GPU?
  3. How does the AMD GPU naming convention work? More specifically, what is AMD’s equivalent of the RTX 4070? (I want to get a 4070 but I figured it would be a good idea to research AMD’s options)
    1. All motherboards work with any PCIe GPU; PCIe is a platform-agnostic standard. If your motherboard is old enough that it doesn’t support PCIe 4.0, though, you should definitely upgrade, because a CPU from that era will probably bottleneck a 4070.

    2. Again, platform agnostic: there are some internal benefits to running all-AMD hardware, but they’re minimal. Intel will work perfectly well. Refer to point 1 to see whether you should upgrade or not.

    3. Generally they mirror Nvidia’s numbering scheme: the x900 GPUs are the halo cards, followed by x800, x700, and x600.

    Unfortunately, AMD’s current lineup is a little behind in the midrange market. Aside from the 4060-targeted RX 7600 XT, their only midrange cards are last-generation RDNA2 6000-series cards, which makes the closest AMD equivalent to the 4070 the RX 6900 XT/6950 XT. The 6950 XT is currently floating around the same $600-700 price point as the 4070 and roughly matches it in raster performance.

    Ray tracing is significantly worse though, it uses more power, and AMD’s software stack (frame interpolation, recording/encode, etc.), while usable, just isn’t quite on par with Nvidia’s. So if any of that really matters to you, I unfortunately have to recommend the 4070.
    However, if you can handle those downsides and just need to play traditional raster games, please please buy the AMD card and do not feed Nvidia’s idiotic greedy monopoly machine. Jensen is smoking crack and the last thing anyone needs to do is feed that man’s overinflated head more fucking cash.

    Or even better: wait until 2024 if you are not in need of an immediate upgrade. Intel’s first-generation Arc GPUs had a rough launch but have improved massively over the past year with driver updates, to the point of being fantastic first-generation products. Their second-generation Battlemage GPUs are expected at the beginning of 2H 2024 and could potentially be a killer value in the midrange segment.

    • However, if you can handle those downsides and just need to play traditional raster games, please please buy the AMD card and do not feed Nvidia’s idiotic greedy monopoly machine. Jensen is smoking crack and the last thing anyone needs to do is feed that man’s overinflated head more fucking cash.

      Most of what you wrote seems fine, but I don’t agree with this, since we’re seeing that AMD is not much better than Nvidia and will try to fuck over consumers as much as possible themselves.

      Unless OP needs a new card right now, I’d recommend waiting as well.

  • Unsure about the 3rd, but as for the first 2: no.

    Most graphics cards will work with most motherboards. I won’t say it’s totally 1:1, so it wouldn’t hurt to look it up and make sure. You do not need an AMD CPU to use an AMD GPU, though.

    • For 1, all motherboards should work with both AMD and Nvidia GPUs.
    • For 2, you can mix and match; there are some benefits to using both an AMD CPU and GPU, namely Smart Access Memory.
    • For 3, AMD’s naming is confusing as hell, but basically they used to have R5, R7, R9, etc. for different tiers of cards; now everything goes under the “RX” prefix.
    • They use the XT (and now XTX) suffix to differentiate the fastest versions or revisions of the same tier of card, kind of like Nvidia’s Ti suffix.
    • The equivalent of a 4070 is the RX 6950 XT, as far as I’m aware.
  • Do I need to buy an AMD CPU as well? Or can I use my existing Intel CPU with an AMD GPU?

    You don’t need an AMD CPU. Smart Access Memory (aka Resizable BAR), which is what you’d come across if you tried to research matching AMD CPUs with AMD GPUs, works with any combination of CPU and GPU. Yes, even Nvidia GPUs have ReBAR now. SAM is just AMD’s marketing term for ReBAR.

    AMD is currently lagging behind in the temporal upscaling department. This would be your DLSS, XeSS, FSR, and game-engine-specific temporal upscalers. DLSS 3 is currently the best of the lot, but it’s locked to Nvidia cards because of the way it works: it requires the Tensor cores found only on Nvidia GPUs. The main reason to use these upscalers at the moment is to get usable FPS when running ray tracing at high resolutions. They can also extend the life of your GPU by letting it render at a lower resolution and upscale to a higher one, e.g. 1080p → 1440p. Fair warning that Nvidia’s DLSS implementations aren’t very backwards compatible, since they rely heavily on GPU architecture. FSR and XeSS are theoretically platform agnostic, but XeSS is better optimised for Intel’s Arc.
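To put that render-then-upscale trick in rough numbers (a back-of-the-envelope sketch only; real-world gains depend on the game and on the upscaler’s own overhead), here’s the pixel-count arithmetic for rendering at 1080p and upscaling to 1440p:

```python
# Back-of-the-envelope: how many fewer pixels the GPU shades per frame
# when it renders at 1080p internally and upscales to a 1440p output.

def pixels(width: int, height: int) -> int:
    """Total pixel count for a given resolution."""
    return width * height

native_1440p = pixels(2560, 1440)  # 3,686,400 pixels
render_1080p = pixels(1920, 1080)  # 2,073,600 pixels

savings = 1 - render_1080p / native_1440p
print(f"Rendered: {render_1080p:,} px vs native {native_1440p:,} px")
print(f"~{savings:.0%} fewer pixels shaded per frame")  # ~44%
```

Shading cost scales roughly with the number of pixels rendered, which is why cutting the internal resolution by ~44% can translate into a large FPS uplift once the upscaler fills in the rest.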

  • From a gaming standpoint AMD lags slightly behind their equivalent Nvidia cards. But they’re still great gaming cards.

    Once you start trying to do things like video editing or other GPU intensive things the lack of CUDA really shows.

    • FSR really needs more hardware-level help to match Nvidia. DLSS is an absolute game-changing feature that can often be indistinguishable from native rendering in game. FSR has never looked that good, and I expect it will never improve at the same pace as DLSS if AMD only uses the open-source assets they created.

  •  ilidur   ( @ilidur@beehaw.org ) · 1 year ago

    All the features people talk about are very much candy that goes away when you’re actually playing your games. If you need framerate and resolution, AMD can definitely supply that with an equivalent-performance board.

    •  DdCno1   ( @DdCno1@beehaw.org ) · 1 year ago

      No, ray-tracing isn’t just candy. Fully implemented, it massively transforms the look of games and influences the gameplay as well. Just look at Metro Exodus Enhanced Edition vs. the standard game. Proper light bounce through RT global illumination alone makes it worth it. I’ve been dreaming of RT in games for over 20 years, ever since I first saw real-time ray-tracing with the amazing heaven seven demo from 2000, which still works on modern systems, by the way. Here’s a link, use the “high res” download.

      DLSS on the other hand is just useful, providing free enhanced image quality and a massive performance boost, while AMD’s competing FSR can only do the latter at the cost of the former, absolutely butchering image quality, especially with regard to edges and transparent textures (foliage, fences, hair). With DLSS taken into account, Nvidia comes out ahead practically all the time in both performance and image quality, allowing cheaper and older cards from the 2000 series onward to punch far above their weight. Since it enjoys widespread support, even to the point that modders are adding it to games that ship without it (often because of AMD sponsorship deals prohibiting it), it needs to be taken into account. There’s a DLSS mod for Skyrim VR, for example, which stabilizes performance and results in much more clarity.

      AMD does often ship their cards with more VRAM, which is an advantage, and their raw performance figures are highly competitive in several price brackets, but while I do like the generous amount of VRAM, raw performance is pretty worthless in the age of machine learning image enhancement. There is no reason not to take advantage of it.

      Now, if AMD’s FSR, which has already been massively improved over the years, does at some point catch up to DLSS or at least Intel’s XeSS, then AMD cards might become a solid choice again, but until then, I can’t really recommend them, especially given that driver quality still isn’t on par after decades of failing to catch up. I’m saying this as someone who has had several ATI and then AMD cards over the years. Drivers have always been an annoying Achilles’ heel.

      I get that AMD is in the underdog position compared to Nvidia, but it’s really only by comparison. They are just as much of a massive cold, faceless multinational corporation and are engaging just as much in dubious business practices. It’s only due to their comparatively smaller market share that they can’t bully the competition around as much.