So you’ve built a gaming PC with enough RGB lights to light up a ’70s roller rink and hardware powerful enough to run your video game, editing software, music, web browser and live stream, all at the same time. (Disclaimer: no one needs to do that, ever.) Now it’s time to put that power to use. After all, why bother putting all that effort into your fancy gaming rig if you are running it on an ancient CRT monitor?
With 4K resolution being all the rage right now (let’s not get into discussions over 8K and beyond), it’s easy to think that all you need to do is pick up a 4K screen from a good brand and hope for the best. Unfortunately, as any technology enthusiast will know, that is not the case. There are numerous considerations, including refresh rate, high dynamic range, input ports and, now, G-Sync and FreeSync.
Nvidia introduced G-Sync back in 2013, with an initial range of small, bulky monitors and expansion modules which were quickly abandoned. With the passage of time and the inevitable AMD response to the technology in the form of FreeSync, the range of screens available today has increased tenfold and the top brands for gaming tech, including Acer and ASUS, have released impressive high-end screens incorporating the technologies.
So what do G-Sync and FreeSync actually do? You may have noticed screen tearing and artefacts in busy scenes when gaming on your PC. This occurs when the rate at which your graphics card renders frames falls out of step with your monitor’s fixed refresh rate: a new frame arrives partway through a scanout, and the screen briefly shows slices of two different frames at once. G-Sync screens include a built-in module that lets the monitor refresh the moment a new frame is ready, rather than on a fixed schedule, which for the most part eliminates tearing without any additional load on your graphics card. FreeSync achieves a similar result, but instead of a proprietary module it builds on the VESA Adaptive-Sync standard, with the variable refresh timing coordinated by the graphics card’s display output.
G-Sync and FreeSync are essentially an evolution of the V-Sync technology that games have used for years. When V-Sync (or one of its various iterative forms) is enabled, it eliminates screen tearing by forcing the graphics card to wait for the monitor’s next refresh before presenting a frame. The catch is that whenever the card misses a refresh deadline it has to sit idle until the next one, which commonly results in frame rate drops or visual stuttering instead. G-Sync and FreeSync avoid that penalty by turning the relationship around: instead of the graphics card waiting for the monitor, the monitor refreshes whenever the graphics card has a frame ready.
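To make the difference concrete, here is a toy Python sketch of the idea, using entirely hypothetical per-frame render times; real drivers and display controllers are far more involved. On a fixed 60 Hz display, any frame that finishes between refresh boundaries lands mid-scanout and tears, while a variable-refresh display simply starts its scanout when the frame arrives.

```python
REFRESH_MS = 1000 / 60  # fixed display refresh interval, ~16.7 ms

def finish_times(render_ms):
    """Cumulative timestamps at which the GPU finishes each frame."""
    t, out = 0.0, []
    for r in render_ms:
        t += r
        out.append(t)
    return out

def tears_fixed(times, refresh=REFRESH_MS, tol=0.5):
    """Fixed-refresh display with V-Sync off: a frame that completes
    away from a refresh boundary lands mid-scanout and tears."""
    return sum(1 for t in times if tol < t % refresh < refresh - tol)

def tears_adaptive(times):
    """Variable refresh (G-Sync/FreeSync model): the display waits for
    each frame and begins scanout when it arrives, so nothing tears."""
    return 0

# Hypothetical uneven render times (ms), as in a busy scene
uneven = [14, 22, 15, 30, 16, 12]
print(tears_fixed(finish_times(uneven)))     # → 6 (every frame tears)
print(tears_adaptive(finish_times(uneven)))  # → 0
```

The point of the toy model is only the inversion of who waits: with V-Sync the GPU stalls for the display’s clock; with adaptive sync the display’s clock follows the GPU.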
I recently purchased an Acer Predator XB280HK G-Sync monitor, which is 28” and has a native 4K resolution. The screen is far from the best on the market, as it lacks HDR compatibility, has a refresh rate of 60 Hz, and some would argue that 28” is too small for 4K. However, for an entry point into 4K gaming at a decent price, I have found the results of G-Sync to be astounding. My gaming rig boasts an Nvidia GTX 1080 GPU, Intel Core i5-7600K processor and 16GB of DDR4 RAM. This setup has proved more than sufficient to run the games I have tested on high settings at 4K on this screen, at 60 frames per second with minimal drops, thanks to both my rig’s hardware and G-Sync. I have tested a number of games at the same settings with G-Sync disabled and found either significant screen tearing or, with V-Sync on, huge frame rate drops.
It comes down to individual choice as to whether the advantages offered by G-Sync and FreeSync are enough to justify the premium cost, as most of these screens are significantly more expensive than their non-G-Sync/FreeSync counterparts. Some gamers may also be more interested in other new screen technologies, such as high dynamic range, which is arguably just as much of a leap in quality (at least in respect of colour depth) as the jump from 1080p to 4K resolution. Larger screens with G-Sync and FreeSync are also much harder to come by; a 28” screen will certainly not replace your 55” living room TV. G-Sync monitors featuring all of the desired elements (let’s not forget a high refresh rate!) come at significant cost.
My experience of G-Sync has been entirely positive. Based on my personal setup (which I obtained purely by the tried and true method of bargain-hunting), it is hard to see how I would be able to game at 4K so smoothly without it. Gaming at 4K even on a smaller screen comes with unexpected benefits too. I am particularly picky when it comes to jagged edges in games, having always whacked the anti-aliasing settings up to their highest in the past, and I have noticed a reduction in the appearance of such jaggies when playing at a higher resolution. An impressive example is Creative Assembly’s wonderful Alien: Isolation, whose proprietary engine features shader technology that unfortunately produces jagged edges that cannot be rectified by anti-aliasing. A small flaw in an otherwise stunning game. However, by bumping the resolution up to 4K, I have noticed a considerable reduction in the appearance of jaggies in this game, improving the visual fidelity and adding to the atmosphere of horror the game designers worked so hard to create.
With E3 just around the corner (the annual trade show where the big players in gaming make their key announcements), Microsoft stands poised to lift the veil on its upcoming project ‘Scorpio’, the 4K iteration of its Xbox One console. The Scorpio is slated to include support for FreeSync technology and to use next-generation HDMI hardware. If the past successes of consoles spearheading new technologies are anything to go by (see how the PS3 drove the success of the Blu-ray format), we will be seeing much more of G-Sync and FreeSync in the future.