Sunday, 31 January 2021

What is VRR? Variable refresh rate explained

What is VRR? It’s one of many acronyms floating around TV functionality these days, but you’ll want to get your head around it to make sure you’re fully informed when buying a new TV – especially if you’re going to be using it to play on an Xbox Series X / Xbox Series S or PS5.

VRR, or ‘variable refresh rate’, as it’s known, is a key feature for getting a smooth, artefact-free picture when gaming – ensuring a clean image for both offline and competitive games.

But how does it work, and how much does it actually make a difference? You’ll find all the answers in the guide below.

What is VRR?

The main job of VRR is to eliminate screen tearing when playing games. Tearing is a kind of visual glitch, where the image on your TV shudders mid-frame before carrying on as before. But what is actually going on here? 

Screen tearing happens when your TV’s refresh of its image is out of sync with the rate at which your console or PC graphics card delivers frames. You end up with an on-screen image where, for example, the top half of the screen displays one frame and the bottom half the next. 

This happens because TVs don’t refresh their entire screen image instantly. The driver of a display rapidly scans down the screen, usually from top to bottom, updating the state of each pixel. It just happens too fast for our eyes and brains to notice, until it causes a visual aberration. 

Tearing becomes noticeable when, for example, you use a 60Hz TV and the game’s framerate fluctuates between 45fps and 60fps. It’s particularly obvious in fast-motion games like first-person shooters. Turn around quickly in-game and the on-screen information can change dramatically from one frame to the next. 

It’s a jarring look. 
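To make the timing mismatch concrete, here’s a minimal Python sketch (not any console’s or TV’s actual pipeline) that estimates where the tear line lands when a new frame arrives partway through a 60Hz scanout. The row count and arrival time are illustrative assumptions.

  REFRESH_HZ = 60
  ROWS = 2160                       # rows on a 4K panel
  SCANOUT_MS = 1000 / REFRESH_HZ    # ~16.7ms to scan the whole screen, top to bottom

  def tear_row(frame_arrival_ms):
      """Row where a tear appears if a new frame lands mid-scanout."""
      if not 0 <= frame_arrival_ms < SCANOUT_MS:
          return None               # frame arrived between refreshes: no tear
      return round(ROWS * frame_arrival_ms / SCANOUT_MS)

  # A frame delivered 10ms into the 16.7ms scanout tears about 60% of the way down:
  print(tear_row(10.0))             # 1296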

VRR eliminates this by syncing the refresh rate of the display to that of the console’s output. You get no more tearing, with no performance hit because the console or PC is the pace-setter, not the display. 

The Last of Us Part II (PS4) (Image credit: Sony/Naughty Dog)

VRR over HDMI 2.1

This concept of matching display refresh to rendered frames is nothing new, but the tech levelled up recently and became far more accessible. 

VRR is now part of the HDMI 2.1 standard – that also supports eARC – and is a feature of the next-gen Xbox Series X, Series S and PS5 consoles. 

Frame sync is no longer just for PC gaming nerds – and VRR supports resolutions up to 4K and frame rates up to 120fps, which is the current ceiling of what these consoles and the most popular TVs can output. 

VRR via HDMI 2.1 is an important standardization of the process: before it, we had to rely on G-Sync and FreeSync, the proprietary techniques from Nvidia and AMD, which arrived long before HDMI 2.1. While you do get G-Sync on LG OLED TVs, for example, it’s not as widespread across smart TVs as VRR.

VRR support: which TVs, graphics cards and consoles have it?

OK, so we already know the latest Sony and Microsoft consoles support VRR. But what else does?

Perhaps surprisingly, the Xbox One S and Xbox One X do too. They use AMD FreeSync, because they have AMD graphics processors, but have also been updated to support VRR over HDMI.

The most tech savvy among you may wonder how that’s possible when the Xbox One X and One S are not HDMI 2.1 consoles. 

Here’s where things get a little more confusing. HDMI 2.1 is not a single standard, but a collection of technologies. It’s a little like 5G in that respect. Some HDMI 2.0 devices support VRR over HDMI, but the lower bandwidth of HDMI 2.0 means it works at up to 60Hz rather than 120Hz on the Xbox One X.
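A rough back-of-the-envelope calculation shows why that bandwidth gap matters. The figures below ignore blanking intervals, chroma subsampling and link-encoding overhead, so real requirements are higher still, but even raw 8-bit pixel data at 4K/120 exceeds HDMI 2.0’s nominal 18Gbps:

  WIDTH, HEIGHT = 3840, 2160       # 4K resolution
  BITS_PER_PIXEL = 24              # 8-bit RGB, before HDR's 10-bit bump

  def raw_gbps(fps):
      """Raw pixel data rate in Gbps, ignoring blanking and link overhead."""
      return WIDTH * HEIGHT * BITS_PER_PIXEL * fps / 1e9

  print(round(raw_gbps(60), 1))    # 11.9 - within HDMI 2.0's nominal 18Gbps
  print(round(raw_gbps(120), 1))   # 23.9 - needs HDMI 2.1's 48Gbps link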

This HDMI fragmentation is also why some of the latest HDMI 2.1 TVs don’t support VRR: it is not a given just because you have an HDMI 2.1 socket. It will be less of a headache by the end of 2021, when VRR over HDMI will likely become a standard feature of mid-range TVs and higher.

But as it’s a patchwork of support right now, here’s an overview of which of the most popular high-end TV and console/GPU series support VRR. 

Consoles

  • Xbox Series X: HDMI / FreeSync
  • Xbox Series S: HDMI / FreeSync
  • Xbox One X: HDMI  / FreeSync
  • Xbox One S: HDMI / FreeSync
  • PS5: HDMI
  • PS4 Pro: N/A
  • PS4: N/A
  • Nintendo Switch: N/A

Graphics cards

  • Nvidia RTX 3000 series: HDMI / G-Sync
  • Nvidia RTX 2000 series: HDMI / G-Sync
  • Nvidia GTX 1000 series: G-Sync (using DisplayPort connector only)
  • AMD Radeon RX 6000 series: HDMI / FreeSync
  • AMD Radeon RX 5000 series: HDMI / FreeSync
  • AMD Radeon RX 500 series: FreeSync

TVs

  • LG OLED CX/GX range: HDMI / FreeSync Premium / G-Sync
  • LG OLED BX range: HDMI / FreeSync Premium / G-Sync
  • Sony OLED A8: N/A
  • Panasonic HZ2000: N/A
  • Panasonic HZ1000: N/A
  • Samsung Q90T/Q95T: HDMI / FreeSync Premium
  • Samsung Q80T: HDMI / FreeSync 

What does this tell us? Top-end Samsung and LG TVs are easily the best around in terms of catering for the features of the next-gen consoles. 

However, there are some other complicating factors. 

LG CX OLED (2020) (Image credit: LG)

Problem one: refresh rate range

Each VRR-capable TV or monitor has a working range: the span of refresh rates at which it can operate while using VRR. This is typically something like 40-120Hz, as in the wonderful LG CX OLED. 

This means it will not work for games that massively prioritize visual quality over frame rate and aim for 30fps, which falls below that 40Hz floor. However, there is a solution. 

Some VRR displays have a feature called LFC (low framerate compensation). This makes the screen refresh at double the rate of the rendered frames. So they remain synced, but the TV works twice as hard.
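As a rough sketch of the idea (the real decision logic lives in the TV and console firmware, and the exact VRR window varies by set), LFC amounts to repeating each frame until the refresh rate lands back inside the panel’s supported range:

  VRR_MIN_HZ, VRR_MAX_HZ = 40, 120   # e.g. the LG CX's quoted working range

  def refresh_for(fps):
      """Pick a refresh rate inside the VRR window for a given frame rate."""
      if fps > VRR_MAX_HZ:
          return VRR_MAX_HZ          # the panel can't refresh any faster
      refresh = fps
      while refresh < VRR_MIN_HZ:    # LFC: draw each frame 2x, 3x... as needed
          refresh += fps
      return refresh

  print(refresh_for(30))   # 60 - each frame is drawn twice
  print(refresh_for(45))   # 45 - already inside the 40-120Hz window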

It’s important because while the Xbox Series X and PS5 are marketed as ‘120fps’ consoles, 30fps games will likely live on. Why? By aiming for a lower frame rate, and perhaps even sub-4K resolution, developers can use more of a console’s power for advanced ray-traced lighting, texture or shadow effects. In slower-paced adventure games, these will likely improve immersion more than a high frame rate. 

Problem two: AV receivers

We have bad news. You may also need to upgrade your home cinema receiver if you have a traditional surround sound setup, as it needs to support VRR as well. And unless you have a brand new receiver, it almost certainly doesn’t right now. 

Thankfully, there’s a workaround. 

You can connect your PC or game console directly to your TV, and use either the TV’s optical audio output or an ARC or eARC-enabled HDMI socket to send the audio to your receiver. 

ARC and eARC then turn one of your TV’s HDMI inputs into an audio output.

eARC (enhanced audio return channel) is the better of the two. Its higher bandwidth connection allows for pass-through of very high bit-rate formats like Dolby TrueHD and DTS-HD. 

AV receiver (Image credit: TechRadar)

Wait, what about FreeSync, V-Sync and G-Sync?

To fully understand why VRR over HDMI 2.1 is special, it’s a good idea to look back to the precursors of this technology. Let’s start with V-Sync, the original solution to the image tearing problem. 

V-Sync flips things around by making the graphics processor work at the speed of the display’s refresh rate, which would traditionally have been 60Hz. The graphics processor times its delivery of frames to match the display’s ability to draw them.

Tearing is solved, but similarly jarring visual issues pop up if the renderer can’t match the pace of the display refresh. You’ll see points where the same frame is displayed twice or more in a row, resulting in judder caused by an intermittent halving (or quartering) of the frame rate.
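A quick worked example, assuming a 60Hz display: with V-Sync, frames can only be shown on ~16.7ms refresh boundaries, so a frame that takes even slightly longer than that to render has to wait for the next boundary, and the effective frame rate drops in steps rather than smoothly:

  import math

  REFRESH_HZ = 60
  REFRESH_MS = 1000 / REFRESH_HZ     # ~16.7ms between refreshes

  def vsync_effective_fps(render_ms):
      """Effective frame rate when every frame waits for a refresh boundary."""
      refreshes_waited = math.ceil(render_ms / REFRESH_MS)
      return REFRESH_HZ / refreshes_waited

  print(vsync_effective_fps(16.0))   # 60.0 - the frame was ready in time
  print(vsync_effective_fps(18.0))   # 30.0 - one missed boundary halves the rate
  print(vsync_effective_fps(35.0))   # 20.0 - two missed boundaries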

This was addressed with Adaptive V-Sync, introduced by Nvidia in 2012. It simply switches off V-Sync when your frames-per-second count dips below the refresh rate of the monitor. 

Neither method was ideal, leading to the introduction of Nvidia G-Sync in 2013 and AMD FreeSync in 2015. These are very similar to the VRR implementation in HDMI 2.1, making the screen alter its behaviour rather than the PC.

VRR's OLED problem

Now we’ve given you some idea of the history of this tech, we need to pull back to give you a more technical view of what’s happening behind the scenes. In some senses, VRR, G-Sync and FreeSync don’t actually change how the screen behaves as much as you’d imagine.

Much of the display’s behaviour is still determined by its maximum refresh rate. Let's take a 120Hz TV as an example.

It can refresh its screen image 120 times a second, or once roughly every 8.3 milliseconds. Each interval is a window of time in which the TV can draw an image, and these remain the same regardless of the refresh rate VRR seeks to emulate.

The display simply waits for the frame to be finished and then slots it into one of these 8.3ms windows.
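A minimal sketch of that timing, assuming a 120Hz panel and an arbitrary frame-completion time: the frame is presented at the start of the next ~8.3ms window after it finishes rendering.

  import math

  PANEL_HZ = 120
  WINDOW_MS = 1000 / PANEL_HZ        # ~8.3ms per scanout window

  def presentation_ms(frame_ready_ms):
      """Start of the next window after the frame finishes rendering."""
      return math.ceil(frame_ready_ms / WINDOW_MS) * WINDOW_MS

  # A frame that finishes at 21ms is shown at the 25ms window boundary:
  print(round(presentation_ms(21.0), 1))   # 25.0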

Samsung Q80T QLED TV (2020) (Image credit: Samsung)

There’s no major problem here for LCD TVs, because of how they work. The state of an LCD’s pixels and the light that illuminates them are somewhat independent. LCDs, including Samsung QLEDs, have LED backlight arrays that sit either behind the pixels or at the sides of the screen. 

OLED TVs have self-emissive pixels, and this seems to affect their performance when using VRR. Here are some impressions from regular TechRadar contributor John Archer, over at Forbes:

“The biggest issue, and one that affects both 2019 and 2020 LG OLED sets, is that when VRR is activated, the image undergoes a brightness/gamma shift that makes dark areas in games look greyer and more washed out than they do with VRR turned off. I’ve seen this for myself recently on an LG OLED48CX.”

Altering the gamma curve could be a technique used to moderate display brightness and avoid the flickering VRR can cause in some panel types. Oh, and some LG OLED owners have complained about VRR-related flickering too. 

What does this mean? The perfect OLED TV for VRR gaming hasn’t been made yet. But you can be sure it’s in the works.


