It seems as though every year comes packed with something new on the technology front. New cell phones, new video game consoles, new everything. But one aspect of tech that might be the most foreign to people is PC building: sourcing all of the components yourself and assembling the machine on your own. It's daunting, but rewarding in simple, joyful ways.
And graphics cards may be the hardest part to keep track of if you don't know your way around. You might find it difficult to distinguish between the many brands and models, leaving you wondering about the difference between an Nvidia GTX 980 and an AMD R9 300-series card. What are these things, and what do they mean for gamers? Well, sometimes, everything.
It's important to understand that there's a world of difference between an average, middle-of-the-road card that runs $200 and a high-end, prestige card that will tear upwards of $500 to $600 out of your wallet. These aren't just simple upgrades, but a crucial distinction for gamers who want the best performance possible.
So, what's the difference between a high-end card and an average card?
At first glance it may seem to come down to price alone, but there's a clear distinction between paying $150 for a card and paying $600. The obvious answer is that the more expensive card will deliver more power to your system, but there are a lot of things at work here.
The high-end card will typically have faster clock speeds, more VRAM, newer architecture and a higher boost clock. And while all of that is great, you're ultimately paying for the whole picture. It's not just one thing that makes a prestige card worth buying; it's everything.
GPU architecture is key, but ultimately, higher-end video cards are all about the package deal. Nvidia's new GTX 1080 runs on the hotly anticipated Pascal microarchitecture, the successor to the popular Maxwell architecture. But in a few years, Pascal will be a thing of the past as Volta takes over. It's a vicious cycle.
So what does that mean for lower-end cards? Well, it just means that the guy buying the brand new GTX 1080 this year is going to run every game ever made, all on the highest possible settings with all the bells and whistles, seemingly until the end of time itself. Or, at least, until the next card comes out a year or two from now.
And at the end of the day, the guy buying the GTX 960 for $200 to $300 will ultimately be just fine. He'll be able to run nearly everything in the foreseeable future, likely at high settings. For most people, that's more than enough, but the community of GPU enthusiasts isn't one to rest on its laurels.
$600? That seems excessive. Who buys these cards?
So then who, you may be asking, is buying these cards? Well, as it turns out, not a lot of people. According to KitGuru, GPU sales in Q2 2015 hit a ten-year low. In the face of a market that has its own unique share of ebbs and flows, the question of who actually upgrades doesn't have a simple answer. In fact, it's easier to tell you who isn't upgrading.
We ran a straw poll with the fine folks over at the PC Master Race subreddit, who were kind enough to participate in our little survey. They also engaged in some interesting discussion about GPU upgrading. The results? They might not be quite what you expect.
Yes, there are people who upgrade their GPU every year. But generally, these people aren't buying a GTX 1080 for $600. Rather, they're the people who have been rocking a GTX 780 like it's still 2013. They're not buying enthusiast cards; they're simply replacing the card they have for $150 to $250.
As Reddit user Stroggnonimus points out,
I upgrade when then the cards starts to burn out and can't keep up with the game graphics. So it's 2-3 years between upgrades since I buy mid-high range cards and not enthusiast class. I use R9 280X for almost 3 years and going to upgrade when Vega comes out. GTX 550ti that I had before lasted 2 years, and so did GT 230 before 550ti.
And as user Reckesta states, the idea of an "enthusiast" card is something that doesn't come around every fiscal year.
There is one major enthusiast card a generation per company, for example the Fury/X and the 980TI. There are many different mid tier and low tier ones, however. Mid Tier: 390X and 380X + 980 and 970. Low Tier: 960 and below, and 370 and below.
So at the end of the day, we have a community of gamers who all have different reasons and timelines for upgrading. It all depends on how much performance each gamer is willing to let go of, which shapes what they buy and when they buy it. For some, it's spending $150-$200 a year just to keep themselves battle-ready. But buying a $600 beast every year? Less likely.
And that's exactly what user ScarySpikes told us, as well.
I figure most people upgrade when they feel like their rigs can't play enough games that they are interested in, or not at graphics settings that make them happy, to justify the cost. Depending on the card that could mean every year or two, or every 5 or 6 years for people that invest in high end cards.
Because in the long run, you're probably going to get a lot of mileage out of that $200 card. You may not be able to run The Witcher 3: Wild Hunt with every single detail at its maximum, but you're going to have a rollicking good time with it anyway. And for those people running beast-mode cards on the reg? Good on you.
What about consoles? Where do they fit in?
Most people are probably aware that when you turn on your gaming console, you're essentially turning on a computer. They're both made up of largely the same components, just utilized in different ways. It's been a constant struggle for developers to make the experiences seamless between consoles and beastly gaming rigs.
And while consoles haven't quite been able to match the power levels of gaming PCs just yet, there are signs of encouragement.
It's hard to call it a one-to-one conversion, but the current generation shows that the gap isn't quite what it used to be. There was a time when the idea of a mid-to-high-range GPU inside a console was laughable, leaving gamers who opted to battle with a controller instead of a mouse and keyboard in the dark.
AMD has had a hand in the development of console GPUs, having contributed to both the PS4 and the Xbox One, as well as the Wii U. These consoles take advantage of the company's impressive APUs (Accelerated Processing Units), which combine a CPU and a GPU on a single chip.
They aren't quite as strong as the discrete cards you'll find in modern gaming rigs, but they've done an admirable job of keeping Sony, Nintendo and Microsoft in the race. Furthermore, rumors continue to suggest that AMD will supply the GPU for Nintendo's upcoming NX console, planned to launch in March of 2017.
Are you Team Green or Team Red? Nvidia or AMD?
Ultimately, there are fans on both sides; Nvidia supporters proudly rock the green while AMD loyalists show their love of the red. And the enthusiast cards will continue to play an important role in the brand identity of these two companies, as both have paraded out their high-end products with no less than the red carpet treatment.
Nvidia has often promoted their cards with advanced effects, made possible by GameWorks and their proprietary physics engine known as PhysX, a feature that has been used by games such as Mirror's Edge and Bulletstorm. And all of Nvidia's cards tie directly into GeForce Experience, the desktop app that keeps all of their products in tip-top shape.
Meanwhile, AMD has been, well, basically just being AMD.
From the jump, they've taken a "less is more" approach, delivering high-quality products with a "more bang for your buck" attitude. And while AMD may not have the market share that Nvidia has, they still maintain a loyal audience of fans who live by their products.
What this all comes down to is quite simple: both companies know they have supporters, and they do what they can to support them right back. They produce these high-end graphical monsters because they know that there is a fanbase for enthusiast GPUs.
They know that they can go out there at CES with a sense of swagger, a sense of attitude that defines their brand, and show off GPUs with confidence. Because while not everyone is going to lay down the funds, enough people will.
A prestige-level GPU may not come around very often, but when one does launch, the data shows that there are people out there willing to put down the cash. The number two best-selling GPU of 2015? The Nvidia Quadro, which sells for $789.
Gamers, hobbyists, builders, all of them build with passion and hope for an even beefier card to come out in a couple years. It's all for the love of the game.
What about you? Do you upgrade your GPU every year, or do you simply stick with what you've got? Would you ever buy a prestige-level GPU? Let us know in the comments!