Fine, AMD. You win. I’m jumping ship.
With the launch of the RX 7900 XTX and 7900 XT, this Nvidia fan was finally convinced to pick up an AMD graphics card as my next upgrade. I can’t believe I’m saying it, but for the first time ever, I couldn’t be more excited to be going Team Red.
I was never a fan of AMD
Yes, I admit — I was never much of an AMD fan. Seeing as PC hardware has always been my thing, I kept up to date with AMD and its rivals in equal measure, but one bad experience with an AMD processor years ago put me off enough that I never really went back. Around 15 years have passed — ancient history, as far as computing is concerned — and beyond testing and building for others, I never put an AMD CPU or GPU into a personal build of my own.
Over time, this reluctance toward AMD hardened into a habit, and it was often justified — I picked Intel and Nvidia because I trusted them more and their hardware was simply better. This was years before the GPU shortage, when components were still affordable enough that I was OK with spending a little more if it meant I'd be putting good stuff into my new builds.
Of course, as time went by, AMD improved. With the launch of Ryzen CPUs and RDNA 2 GPUs, I was ready to acknowledge that AMD is solid again, but still not quite ready to cut the cord and say farewell to Nvidia.
So there I was, an Nvidia fan planning out my next build, until the last few weeks finally broke me. AMD’s launch of the RX 7900 XTX was the final nail in the coffin of my “no AMD” phase.
I tried to stick with Nvidia
Despite the soaring prices during the GPU shortage and the fact that AMD's range was more affordable (even though none of it really was at the time), my upgrade plan had involved an Nvidia card for months. I prepared different builds, ranging from an RTX 3070 Ti to an RTX 3090, and kept my eye on prices — still high in my area — waiting for a deal I'd consider worth it.
But my resolve was slowly melting. There I was, with AMD’s graphics cards within reach; perhaps not quite as good as Nvidia in ways like ray tracing, but still more than sufficient. Still, knowing that both manufacturers would be releasing new lineups this year, I made the common mistake of waiting to find out what we were getting instead of building my PC right away.
Cue the RTX 4090. It’s a real beast of a graphics card, with a pretty high power requirement and a much, much higher price. In our testing, the card proved to be pretty incredible in terms of performance, but in my mind, that still wasn’t enough to sway me to spend $1,600 on a graphics card. Not that I had the option to, anyway — despite the price tag, the GPU sold out in minutes, and I’m not going to be giving a few hundred dollars extra to a scalper just to be able to play Cyberpunk 2077 in seamless 4K.
Of course, I could wait for the RTX 4080 — the 16GB version, that is, because Nvidia promptly “unlaunched” the overpriced mistake of a card that was the RTX 4080 12GB. Unfortunately, the version with more memory didn’t convince me, either. Maybe I’m being cheap, or maybe I just want to pay reasonable prices for my hardware; either way, I wasn’t feeling up to it.
A steady decline
The last few weeks have been rough for Nvidia, even with the initial success of its new Ada Lovelace generation of GPUs. First, the EVGA controversy — no matter how you spin it, it's just not a good look. Then, the controversy surrounding the RTX 40-series GPUs started, and I was quickly running out of ways to defend my own choices.
Jensen Huang, Nvidia’s CEO, said it himself: “The idea that the chip is going to go down in price is a story of the past.” The timing of that statement could not have been worse, given the fact that many Nvidia enthusiasts, myself included, were pretty unhappy with the way Nvidia chose to price the next generation of graphics cards. Huang basically made it clear that things are not going to get any better in that regard.
Now, it turns out that the RTX 4090, and therefore also the RTX 4080, may have some melting issues due to the power adapter. A quick PSA: don’t bend your cables if you want to avoid a fire hazard. Don’t get me wrong — despite these problems, the RTX 4090 does seem pretty outstanding in a lot of ways, and in all likelihood, the RTX 4080 will also be a significant upgrade over the previous gen.
Somehow, that just doesn’t matter to me anymore. After 15 years, it’s time to give AMD another shot.
AMD couldn’t have picked a better time
With the disappointment in Nvidia leaving a bitter taste in my mouth, I found myself getting excited about the announcement of RDNA 3 GPUs. I had already toyed with the idea of picking up an AMD CPU for my next PC, and I was ready to make the same choice when it came to the graphics card.
Watching AMD’s announcement, I knew that I was on board. It’s sad that we’re at a time when a $1,000 GPU is a thrilling prospect, but it is — especially if we’re talking about a flagship that will likely become one of the best graphics cards of this generation.
The two new AMD flagships, the Radeon RX 7900 XTX and the RX 7900 XT, sound pretty great. We won't know their true performance until they land in the hands of eager reviewers, but AMD promises a 54% increase in performance per watt over RDNA 2, with the flagship up to 1.7 times faster than the RX 6950 XT at 4K; DisplayPort 2.1 support (for the 8K monitors supposedly coming soon); and second-generation ray tracing that could help it catch up to Nvidia in that regard. AMD also claims that AI performance will be up to 2.7 times better than in the previous generation of GPUs.
AMD keeps the power requirements more conservative, maxing out the TDP at 355W for the 7900 XTX, and it's not using Nvidia's ill-fated 12VHPWR adapter, which, so far, seems to be the cause of those melted RTX 4090s.
All of that is nice, but the best part is that AMD, unlike Nvidia, didn’t raise its prices. The flagship will cost $999 for the reference model, followed by $899 for the 7900 XT.
We don’t all need an RTX 4090
Some readers may chime in here and tell me that there’s no way the RX 7900 XTX will keep up with the RTX 4090, and in all likelihood, they’d be right. However, the truth is that not all of us need an RTX 4090 — in fact, most of us don’t. There still aren’t many games that really need that kind of power, and even if they do, you can still run them on a cheaper GPU if you sacrifice a little bit of frame rate or take the settings down a notch.
Not many people really need an RTX 4090. Some do, but I am certainly not one of them; at least, not at that price.
I believe the market needs more of what AMD is serving up — semi-affordable hardware that's accessible to more users — and less of the ultra-high-end components that most gamers just can't justify in their build budgets.
AMD’s flagships sound like the perfect middle ground between the expensive enthusiast-only sector and the mid-range segment where you have to compromise on some settings in certain games. They’re likely to run most AAA titles on max settings, but they’re still priced at a level I can get behind.
I’m ready, AMD. It’s going to be nice to see you again.