
PC Learning: Everything You Need to Know About GPUs

A promotional shot of a CyberPower PC with an Nvidia RTX 3080 GPU

It’s been a busy period in the world of computer graphics. Just a few months ago, we wrote on these pages about Nvidia CEO Jensen Huang, who had announced his company’s new line of cards from the comfort of his ornately-furnished kitchen. Not to be outdone, AMD CEO Lisa Su recently took to our screens to make all kinds of promises about Team Red’s new roster. 

What these announcements had in common was a procession of impressive-seeming graphs and charts. While both presentations did a respectable job of getting their points across in a way that might be understood by those of us who don’t have three degrees in electrical engineering, there are still plenty of obscure terms being bandied about – you could be forgiven for not fully understanding exactly what it all means. What’s RDNA2? What is an Infinity Cache? And what, pray tell, is a Big Navi?

Given that so many of us will be investing in a new card in the near future, it’s a good time to refresh our memories about these devices and what they’re for. So, are you ready to learn about GPUs?

Why do I need a GPU?

The Graphics Processing Unit, you’ll be unsurprised to learn, is the bit of your computer tasked with handling all of those graphical elements that get thrown onscreen during a gaming session. If it’s being drawn, in other words, it’s being drawn by your GPU. Everything that doesn’t get drawn – whether it be enemy AI, physics, or fundamental computational housekeeping – falls within the purview of the CPU (of which we’ll speak more next week).

This raises an obvious question: why bother with a GPU at all? Why not just render everything on the CPU? Well, prior to the release of the first graphics cards, that’s precisely what happened. Early 3D games like Doom and Ultima ran entirely on the CPU – but modern games demand a massively parallel, highly specialised kind of processing to which general-purpose CPUs simply aren’t suited.

What’s new in DX12?

DX12 is the twelfth version of Microsoft’s DirectX API (that’s application programming interface). APIs like this give games more direct access to the graphics hardware, rather than having Windows act as an intermediary. DX12 provides developers with a raft of new rendering capabilities – and this latest batch of GPUs provides the horsepower needed to make them actually happen.

Ray-tracing

Traditionally, games are rendered using a process known as rasterisation. In a rasterised environment, objects are composed of a ‘mesh’ of tiny virtual triangles. The corners of these triangles are points of information called vertices (singular: vertex). When you look at an object, your machine works out what to display for each pixel on the screen, based on which vertices are in view and the information attached to them.
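
To make that a little more concrete, here’s a bare-bones sketch in Python of how a rasterised mesh might be represented. The class names and fields are our own inventions for illustration – real engines pack far more data into each vertex:

```python
from dataclasses import dataclass

@dataclass
class Vertex:
    # A point in 3D space, plus the extra information attached to it
    x: float
    y: float
    z: float
    colour: tuple = (255, 255, 255)

@dataclass
class Triangle:
    # One of the tiny virtual triangles making up a mesh
    a: Vertex
    b: Vertex
    c: Vertex

def project(v: Vertex, screen_w: int = 1920, screen_h: int = 1080):
    """Work out where on the screen a vertex lands (simple perspective)."""
    return (screen_w / 2 + v.x / v.z * screen_w / 2,
            screen_h / 2 - v.y / v.z * screen_h / 2)

# A 'mesh' is simply a list of triangles; real models use many thousands
mesh = [Triangle(Vertex(0, 0, 5), Vertex(1, 0, 5), Vertex(0, 1, 5))]
print(project(mesh[0].b))  # where the second corner appears onscreen
```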

Ray-tracing works very differently. It actually simulates bounced light, albeit in reverse. Rays are traced from the player’s point of view to the objects in the game world. Like light in real life, they’ll bounce from object to object – and this will inform the image onscreen. This creates extremely realistic results – for those whose cards have the power to take advantage.
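
At the heart of the process is an intersection test: for every ray, find the first thing it hits. Here’s a toy Python version for a single sphere – a hand-rolled illustration of the principle, nothing like the hardware-accelerated routines the new cards actually run:

```python
import math

def ray_hits_sphere(origin, direction, centre, radius):
    """Return the distance along the ray to the first hit, or None."""
    # Vector from the ray's origin to the sphere's centre
    oc = [o - c for o, c in zip(origin, centre)]
    # Quadratic coefficients for |origin + t*direction - centre|^2 = radius^2
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = b * b - 4 * a * c
    if discriminant < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(discriminant)) / (2 * a)
    return t if t > 0 else None

# Fire one ray straight ahead from the player's point of view
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # ~4.0
```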

Variable Rate Shading

This is a technique through which different parts of an image receive attention from the graphics card in proportion to their complexity. If there’s a big patch of blue taking up most of the screen (like the sky) and a complicated object in the foreground (like a human face), then it makes sense to devote more resources to the latter. Moreover, the player’s attention is usually focused on a relatively small part of the screen (often the centre), so lowering the ‘shading rate’ everywhere else can often yield substantial performance improvements without any major compromise to the actual experience.

Think of an FPS or a racing game – how often do you really look at the edges of the screen? Would you really notice if these areas were rendered at half, or even a quarter the resolution? The answer is – probably not. But you would notice if your frame rate were kicked up proportionally. So that’s what VRS does!
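
Here’s a toy sketch of that idea in Python: tiles near the centre of the screen are shaded at full rate, while tiles towards the edges share one shading result between several pixels. The tile grid and thresholds are made up purely for illustration – the real rates are chosen by the game and the hardware:

```python
def shading_rate(tile_x, tile_y, tiles_wide, tiles_high):
    """Return how many pixels share one shading result (1 = full rate)."""
    # Normalised distance of this tile from the screen centre
    dx = tile_x / tiles_wide - 0.5
    dy = tile_y / tiles_high - 0.5
    distance = (dx * dx + dy * dy) ** 0.5
    if distance < 0.2:
        return 1   # full resolution where the player is looking
    if distance < 0.4:
        return 2   # half rate: one shade shared between two pixels
    return 4       # quarter rate at the edges of the screen

# Print the rate chosen for each tile on a small 8x4 grid
for y in range(4):
    print([shading_rate(x, y, 8, 4) for x in range(8)])
```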

The new Nvidia cards

We took a detailed look at the Nvidia 30 series when it was announced more than a month ago. Since then, the RTX 3070, 3080 and 3090 have found their way into Cyberpower machines – and they’re in high demand!

Jensen’s claim that the 3070 would deliver equivalent performance to the last generation’s 2080 Ti (a card which costs more than a grand) turned quite a few heads, and benchmarking from Digital Foundry and others has since proven him, more or less, correct. The major caveat here is that the 3070 has just 8GB of video memory to the 2080 Ti’s 11GB, which might impose a performance ceiling in high-resolution gaming.

DLSS 2.0

The feature that promises to set the Ampere lineup apart from the competition is DLSS 2.0. Deep Learning Super Sampling (DLSS) allows games to be rendered at a lower resolution and then intelligently upscaled using the Tensor Cores built into every RTX card. The original DLSS required per-game groundwork: developers had to supply high-resolution (16K) images of their game to Nvidia for training. DLSS 2.0 replaces this with a single generalised network, so developers need only integrate the technique and feed it data such as the renderer’s motion vectors – it then reconstructs every frame in real-time, sharpening edges and creating fidelity from seemingly nothing.
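
Conceptually, the pipeline looks something like the Python sketch below – render cheap, then upscale to the display resolution. Here the ‘upscaler’ is crude nearest-neighbour duplication; the whole point of DLSS is that a trained neural network does this step far more intelligently:

```python
def render_frame(width, height):
    """Stand-in renderer: cheaper at lower resolutions (hypothetical)."""
    return [[(x + y) % 256 for x in range(width)] for y in range(height)]

def upscale(image, factor):
    """Naive nearest-neighbour upscale; DLSS replaces this step with AI."""
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in image for _ in range(factor)]

# Render at half resolution, then upscale 2x to the display resolution
low_res = render_frame(960, 540)
full_res = upscale(low_res, 2)
print(len(full_res[0]), len(full_res))  # 1920 1080
```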

The new AMD cards

The AMD 6000 series is based on the RDNA2 (that’s Radeon DNA, or ‘Big Navi’) architecture, which boasts up to 5120 shaders spread across 80 ‘Compute Units’. It’s the successor to the RDNA1 architecture that dropped in 2019 (which, in turn, succeeded GCN).

AMD is targeting an improvement of 50% in performance-per-watt with every generational leap – and they’re doing this by fiddling around under the hood. Without delving too deeply into the technical details, it’s worth knowing that RDNA2 is an evolution of what was going on in RDNA1, with a few extra capabilities thrown in there to take advantage of the DirectX12 spec.
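
It’s worth pausing on what that 50% figure actually means. Here’s a quick back-of-the-envelope calculation, with made-up numbers purely for illustration:

```python
# What a 50% performance-per-watt gain implies (illustrative figures only)
old_fps, old_watts = 60, 300      # hypothetical last-gen card
old_ppw = old_fps / old_watts     # 0.2 frames per second per watt
new_ppw = old_ppw * 1.5           # the targeted 50% uplift

print(new_ppw * old_watts)        # 90 fps at the same 300 W...
print(old_fps / new_ppw)          # ...or just 200 W for the same 60 fps
```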

Benchmarks of the AMD Radeon 6000 series

Given Nvidia’s extensive expertise in AI research, it’s probably not a surprise that AMD will need to play catch-up to provide an answer to DLSS. The answer comes in the form of what AMD calls ‘super-resolution’ technology, which will perform much the same task when it’s implemented post-launch.

As for the cards themselves, there are three to be concerned with: the RX 6800 and RX 6800 XT, which hit shelves on November 18th, and the RX 6900 XT, which arrives on December 8th.

What should I know before buying a GPU?

If you’d like to know which of the new cards is best suited to your favourite game, then you’ll need to wait for independent benchmarks.

There are, however, a few general considerations worth making. AMD hardware will be powering both new consoles – and so its cards may benefit from greater optimisation in cross-platform games. If you’re going to be using adaptive sync (FreeSync or G-Sync), then you’ll need to be sure that your monitor can be synced to the card you have in mind. Often, this can be achieved through a firmware update.

As ever, your needs will depend on the maximum refresh rate of your monitor, the resolution you’ll be playing at, and the games you’ll be playing. Games like Watch Dogs: Legion and Cyberpunk 2077 require a monstrous rig; games like Hades and Among Us are less demanding.

 

