Why Are Reviewers Ripping Apart the AMD Radeon RX 6500 XT?

After several successful entries in the Radeon RX 6000 series, AMD has launched the Radeon RX 6500 XT, a new budget GPU. With a $199 price tag, it seemed like a boon for gamers looking for a card to handle basic gaming workloads. Yet reviewers who tried out AMD's new GPU didn't love it.

All of that raises the question: if this is a brand-new card, made on a bleeding-edge 6nm process and based on RDNA 2, why is it getting so much hate? And is it even deserved?

AMD’s RX 6500 XT Has Too Little VRAM



One of the card's many weaknesses is its memory. While the RX 6600, its immediate step-up, has 8GB of GDDR6 memory, the Radeon RX 6500 XT cuts that in half and ships with only 4GB. That's barely adequate by today's standards, and it gets worse once you look at the memory bus: the RX 6500 XT has a 64-bit bus, while the RX 6600 has a 128-bit one. Yikes.

Both the RTX 3050, NVIDIA's direct competitor, and even this card's predecessor, the RX 5500 XT, come with up to 8GB of VRAM, and both use a wider 128-bit memory bus. This card's memory is flat-out insufficient even by budget-GPU standards.
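To see why the bus width matters so much, you can estimate peak memory bandwidth from the bus width and the memory's effective data rate. The data rates below are assumptions for illustration, not official specs for either card:

```python
# Theoretical GDDR6 bandwidth: bus width (bits) times effective data
# rate (Gbps), divided by 8 to convert bits to bytes.
def gddr6_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

# 64-bit bus at an assumed 18 Gbps effective rate (RX 6500 XT class)
narrow = gddr6_bandwidth_gbs(64, 18)
# 128-bit bus at an assumed 14 Gbps effective rate (RX 6600 class)
wide = gddr6_bandwidth_gbs(128, 14)

print(f"64-bit bus:  {narrow:.0f} GB/s")   # 144 GB/s
print(f"128-bit bus: {wide:.0f} GB/s")     # 224 GB/s
```

Even with faster memory chips, a 64-bit bus simply can't move data as quickly as a 128-bit one, and 4GB of VRAM means the card spills into (slow) system memory more often, too.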

What was AMD's reasoning here? The one silver lining is that the reduced memory bandwidth makes the card effectively useless for crypto mining, which is good news for people who want these cards for gaming. But why would gamers want a GPU that's potentially bad at gaming too?

RX 6500 XT’s PCIe Lanes in Short Supply



Another odd aspect of the card is the PCI Express interface itself. Typically, unless you're buying something dirt-cheap like a GT 710, a GPU comes with a full set of 16 PCIe lanes. Although the Radeon RX 6500 XT has a full-length x16 connector, only 4 PCIe 4.0 lanes are actually wired, so the card has just a quarter of the interface bandwidth that other RDNA 2 cards have.



Now, this isn't much of a problem in modern systems: PCIe 4.0 is fast enough that four lanes can keep a healthy link between the GPU and the rest of the system. The trouble starts when a system only supports PCIe 3.0, and plenty of computers still do. Intel didn't support PCIe 4.0 until its 11th-gen Rocket Lake chips, and while AMD platforms have had it for some time, cheaper chipsets like the A520, X470, and B450 don't.

Cheaper systems are exactly the kind that would pair with an RX 6500 XT, so this is a real problem. You'd need a PCIe 4.0-capable system for the card to perform properly, and that would likely set you back a bit more money.
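A rough back-of-the-envelope calculation shows why dropping from PCIe 4.0 to 3.0 stings so much on a four-lane card. The figures below are approximate usable rates after the standard 128b/130b encoding overhead, for illustration only:

```python
# Approximate usable PCIe bandwidth per lane in GB/s: raw transfer
# rate (GT/s) times 128/130 encoding efficiency, divided by 8 bits/byte.
PER_LANE_GBS = {
    "PCIe 3.0": 8.0 * 128 / 130 / 8,   # 8 GT/s  -> ~0.99 GB/s per lane
    "PCIe 4.0": 16.0 * 128 / 130 / 8,  # 16 GT/s -> ~1.97 GB/s per lane
}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Approximate one-direction link bandwidth in GB/s."""
    return PER_LANE_GBS[gen] * lanes

print(f"x4 link, Gen4:  {link_bandwidth('PCIe 4.0', 4):.1f} GB/s")   # ~7.9 GB/s
print(f"x4 link, Gen3:  {link_bandwidth('PCIe 3.0', 4):.1f} GB/s")   # ~3.9 GB/s
print(f"x16 link, Gen3: {link_bandwidth('PCIe 3.0', 16):.1f} GB/s")  # ~15.8 GB/s
```

On a PCIe 3.0 board, the RX 6500 XT's four lanes get roughly half the bandwidth they would on PCIe 4.0, and only about a quarter of what a normal x16 card gets on the same board. Combined with the small 4GB frame buffer, which forces more traffic over that link, the bottleneck shows up in real games.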

Other Odd Omissions in the RX 6500 XT

AMD's Radeon RX 6500 XT cuts corners in other departments too. For instance, it doesn't support H.264 or H.265/HEVC hardware encoding, which is essential for streamers, and the card packs just two display outputs: one HDMI 2.1 and one DisplayPort.

So not only is this card worse in terms of performance, but it also gets rid of standard features that other GPUs in the same price category offer.

Avoid the RX 6500 XT at All Costs



The Radeon RX 6500 XT uses the Navi 24 GPU die, which was designed primarily for laptops, and that explains things like the reduced PCIe lanes and the 64-bit memory bus. Regardless, it paints a terrible picture for AMD, even with the GPU shortage taken into account.


AMD has a clearly inferior product trying to compete with NVIDIA's RTX 3050 at a similar price point. Anyone doing their research will likely aim for the 3050 instead, if they can find one in stock: scalpers have already scooped up the RX 6500 XT and are selling it for north of $400. The RTX 3050 will likely meet a similar fate once it goes on sale.

