Ever since G-SYNC technology came to the fore, one aspect of every monitor that uses one of its modules has drawn criticism: the price premium. FreeSync arrived as the competition, but it really isn't, never has been, and doesn't even pretend to be, since it uses no dedicated hardware; like G-SYNC Compatible, it relies entirely on software synchronization, so there is no price increase. But do you know how much a G-SYNC module really costs? You'll be amazed.
If you're one of those users who complains about overpriced G-SYNC monitors, after this article you might even think they're a bargain. We're not going to compare what a FreeSync monitor is worth against a G-SYNC or G-SYNC Compatible one; that makes little sense, mainly because they don't compete for the same user or gaming experience. But after a little research at Igor's Lab, we now know a bit more about what NVIDIA does with these physical modules and chips.
G-SYNC vs G-SYNC Ultimate, Different Modules
It may seem obvious, but the reality is that, as a general rule, and especially when the monitors being compared differ greatly in size and performance, NVIDIA uses different G-SYNC modules, validating a series of qualities of each panel.
The module we are going to discuss here belongs to the brand's top tier, G-SYNC Ultimate, and the monitor in question is an AOC Agon AG353UCG, which costs between 1,800 and 2,300 euros depending on the country and the store. In this monitor, as in many others, the NVIDIA module has a serious problem: it gets very hot.
For this reason, many manufacturers have had to resort to active cooling with a small fan, paired with a heatsink that could well serve some low-end CPUs. What we find behind this heatsink is an MXM card carrying an Altera Arria 10AX048H2F34E1HG-ND chip coupled with two VRAM modules from Micron (in most monitors).
This card is actually an FPGA, supplied by Intel and, curiously, manufactured by TSMC on a 20 nm process. It operates at between 0.8 V and 0.9 V and at temperatures from 0 to 100 °C. Its power consumption, however, is not specified.
A crazy price for the G-SYNC module
If we look for this chip in the listings of the main vendors, we see something really surprising: $2,604 if we decide to buy just one. Assuming NVIDIA buys literally millions of units, how far could that price drop in the end? We don't know, and we'll probably never have that information, but considering that it's hard to find a monitor with a G-SYNC module for less than $800 these days… what's going on?
How can NVIDIA offer manufacturers an MXM card with its module at an attractive, low price? And if the price couldn't be lowered dramatically (and it would have to be dramatic), who would bear the cost of selling a monitor for almost the price of the chip alone?
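To put the question in numbers, here is a back-of-the-envelope sketch. Only the $2,604 single-unit listing and the roughly $800 monitor floor come from the figures above; the "share of retail price" tiers are invented purely for illustration, since neither NVIDIA's volume pricing nor its bill-of-materials split is public.

```python
# Hypothetical sketch: how far would the FPGA's single-unit list price
# have to fall for an ~$800 G-SYNC monitor to be viable?
# Assumed inputs: $2,604 list price, $800 monitor floor (from the article).
# The chip-share tiers below are illustrative assumptions, not known data.

LIST_PRICE = 2604      # single-unit price of the Arria 10 FPGA (USD)
MONITOR_FLOOR = 800    # cheapest typical G-SYNC monitor (USD)

def required_discount(max_chip_share: float) -> float:
    """Minimum discount off the list price needed if the chip may only
    account for `max_chip_share` of the monitor's retail price."""
    max_chip_cost = MONITOR_FLOOR * max_chip_share
    return 1 - max_chip_cost / LIST_PRICE

for share in (0.10, 0.25, 0.50):
    print(f"chip <= {share:.0%} of retail -> discount >= {required_discount(share):.0%}")
```

Even under the generous assumption that the module could swallow half of an $800 monitor's retail price, the volume price would still need to sit roughly 85% below the single-unit listing, which gives a sense of how steep NVIDIA's discount would have to be.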
And of course, given that price, and assuming a brutal discount for the purchase of millions of units, why doesn't NVIDIA design its own FPGA chip? Could its purchase of ARM also have something to do with this? These are unanswered questions that, as events unfold, we may eventually be able to answer.
In any case, perhaps G-SYNC monitors aren't so expensive after all.