Is Nvidia's 20 Series Disappointing?

September 19th, 2018 was an exciting day for PC hardware enthusiasts. The review embargo on Nvidia's long-awaited 20 series lifted, and the tech sphere came alive, publishing performance results of the new cards for all to see. The overall reaction was mixed at best and disappointed at worst. So why did Nvidia miss the mark with their new series of cards? The previous 10 series launched back in 2016, so the community had been eagerly awaiting something new from Nvidia, especially with the GPU (graphics processing unit, or graphics card) market being what it is in the wake of the mining craze. So what led to such a poor reception?

Gloomy Outlook

Rewind to June 2018, a mere three months ago. Computex 2018 was in full swing, and of course Nvidia's CEO Jensen Huang was in attendance. Huang stated that "New, next-gen GPUs will not be available for a long time." This news alone disappointed a lot of people. Keep in mind that earlier this year the hardware market was in shambles. The 'crypto craze' had graphics cards selling for double or even triple their MSRP, and on top of that the price of RAM was inflated due to manufacturing shortages. Samsung, one of the world's top three memory manufacturers, met complaints about high prices by announcing new fabs (semiconductor fabrication plants) to meet demand, but said it would be a few years before they were fully built and operational. To make matters even worse, solid state drives (SSDs), which had become staples in even entry level builds, looked about to follow the same trend as RAM. The community was looking for some sort of pricing relief. Whether it was Nvidia announcing and launching a new line of cards to refresh the market, or at the very least manufacturing new stock of the 10 series, or Samsung announcing it could magically produce more memory, people wanted something to be done. The hardware market was rough at the time (admittedly it still is, though the outlook is better now in September), so Huang's statement did not go over well.

“You’ve Been Busy”

Since launching its last mainstream series of cards back in 2016 (the 10 series), Nvidia had been relatively quiet. Computing conventions came and went without any significant news for the enthusiast market. In those two years of semi-dormancy, however, Nvidia was actually hard at work.

Between the 10 series launch and now the 20 series launch, Nvidia made several computing strides, primarily for enterprise and heavy workstations: significant, innovative progress with their massive Volta cards, deep investment in deep learning AI, and ever smarter architectures for handling those calculations. To Nvidia's credit, however, these goodies are not confined to the enterprise side of the business.

Many of the innovations and technology Nvidia has developed for professional purposes have trickled down to the enthusiast market, and the results can be seen in the 20 series of cards. The flagship feature and marketing tool of the 20 series is real time ray tracing, a highly advanced way of computing lighting across the surfaces of a scene as it is rendered. Animated movies have used this technology for years; interestingly enough, one of the best examples of ray tracing can be found in Shrek, but that's beside the point. Animated movies have the luxury of being pre-rendered: an animation studio has access to "render farms", rooms upon rooms full of PCs, to supply the enormous computational power that highly accurate ray tracing demands. This is where Nvidia steps in. While highly accurate ray tracing previously required massive amounts of power, Nvidia has made it efficient enough that their enthusiast graphics cards can do these calculations in real time, i.e. as the game is being rendered on your PC.
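
To make "tracing a ray" concrete, here is a deliberately tiny, hypothetical Python sketch: one sphere, one light, one ray per pixel. It is nothing like Nvidia's actual RTX pipeline, which runs billions of these intersection tests per second in dedicated hardware, but the core loop is the same idea: fire a ray through each pixel, find what it hits, and shade that point based on the light.

```python
import math

# Toy ray tracer: one sphere, one light, Lambertian (diffuse) shading.
# Purely illustrative; not Nvidia's RTX implementation.

def ray_sphere_intersect(origin, direction, center, radius):
    """Return distance t along the ray to the nearest hit, or None on a miss."""
    # Solve |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                        # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)   # nearest of the two intersections
    return t if t > 0 else None

def shade(point, normal, light_pos):
    """Lambertian shading: brightness = max(0, N . L)."""
    to_light = [l - p for l, p in zip(light_pos, point)]
    length = math.sqrt(sum(v * v for v in to_light))
    to_light = [v / length for v in to_light]
    return max(0.0, sum(n * l for n, l in zip(normal, to_light)))

def render(width=32, height=16):
    center, radius = (0.0, 0.0, -3.0), 1.0
    light = (2.0, 2.0, 0.0)
    for y in range(height):
        row = ""
        for x in range(width):
            # One primary ray per pixel, from the eye through the image plane.
            u = (x / width) * 2 - 1
            v = 1 - (y / height) * 2
            direction = (u, v, -1.0)
            t = ray_sphere_intersect((0, 0, 0), direction, center, radius)
            if t is None:
                row += " "
            else:
                hit = tuple(t * d for d in direction)
                normal = tuple((h - c) / radius for h, c in zip(hit, center))
                row += ".:-=+*#%@"[min(8, int(shade(hit, normal, light) * 9))]
        print(row)

render()
```

Run it and an ASCII-shaded sphere appears. Now imagine doing that with millions of rays, bounced several times each, sixty times a second: that is the scale of the problem the RTX hardware is built to accelerate.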

Another technology that made it over to the enthusiast side of things, to an extent, is Nvidia's deep learning AI. The 20 series cards contain the new Turing architecture (the 10 series used Pascal), which adds support for Nvidia's DLSS system. DLSS, or 'Deep Learning Super Sampling', is a new and allegedly more efficient approach to anti-aliasing, which is essentially image edge smoothing. While previous techniques like MSAA (Multi-Sample Anti-Aliasing) were performance hogs, Nvidia claims DLSS is more efficient and produces an overall superior image compared to other anti-aliasing techniques, all thanks to a deep learning AI enabled by the new Turing architecture.
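
For intuition on why anti-aliasing is traditionally so expensive, here is a toy Python sketch of brute-force supersampling (SSAA): render many samples per pixel and average them, which smooths edges at the cost of many times the shading work. This is not DLSS itself; DLSS's pitch is to skip most of that work and let a trained network infer the smooth result. The scene function is a made-up stand-in for a real renderer.

```python
# Brute-force supersampling (SSAA): n*n sub-samples per pixel, averaged.
# Great edges, n^2 times the shading cost -- the cost DLSS aims to avoid.

def scene(x, y):
    """A hard-edged circle: 1.0 inside, 0.0 outside. Aliasing-prone on purpose."""
    return 1.0 if (x - 0.5) ** 2 + (y - 0.5) ** 2 < 0.16 else 0.0

def render(width, height, samples_per_axis=1):
    n = samples_per_axis
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(n):
                for sx in range(n):
                    # Regular grid of sub-pixel sample positions.
                    x = (px + (sx + 0.5) / n) / width
                    y = (py + (sy + 0.5) / n) / height
                    total += scene(x, y)
            row.append(total / (n * n))     # average the sub-samples
        image.append(row)
    return image

aliased = render(16, 16, samples_per_axis=1)   # 1 sample/pixel: jagged edges
smooth  = render(16, 16, samples_per_axis=4)   # 16 samples/pixel: 16x the work
for row in smooth:
    print("".join(" .:-=+*#%@"[int(v * 9.999)] for v in row))
```

Quadrupling the samples per axis means sixteen times the shading work per pixel, which is exactly why older AA techniques were such performance hogs.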

Nvidia has certainly not been slacking these past two years. All of these features, among others, have made it into the new 20 series of cards. But the blunt question has to be asked: do enthusiasts really care?

MORE FRAMES!

As previously mentioned, the performance metrics for the 20 series are out. Put shortly, the 20 series costs a lot more than the 10 series, and in the case of the RTX 2080, it performs only around 3% better than the previous GTX 1080 Ti. In essence, the RTX 2080 is effectively the same card as the GTX 1080 Ti with a few of the new features built in. Where you start seeing real performance gains, on top of those features, is the RTX 2080 Ti, which boasts an average 28% performance increase over the previous 1080 Ti at 4K. While the 2080 Ti has finally achieved 'uncompromised' 4K gaming in most titles (that is, hitting 60 FPS at 4K without turning settings down) and offers Nvidia's new suite of rendering technology, it costs a pretty penny. The MSRP of the 2080 Ti is $1199 USD, roughly 72% higher than the 1080 Ti's $699 launch price. That massive price increase explains why the dollar-per-frame chart for the 2080 Ti looks so horrible.
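
The dollar-per-frame complaint is simple arithmetic. Here is a sketch of the metric in Python, using the launch prices above ($799 is the commonly cited RTX 2080 Founders Edition price) and placeholder 4K frame rates scaled to the ~3% and ~28% deltas; real benchmark numbers vary by title.

```python
# Dollar-per-frame: the value metric behind the community's complaints.
# Frame rates below are hypothetical placeholders, not real benchmark data.

cards = {
    # name: (launch price in USD, hypothetical average 4K FPS)
    "GTX 1080 Ti": (699, 60.0),
    "RTX 2080":    (799, 61.8),   # ~3% faster than the 1080 Ti
    "RTX 2080 Ti": (1199, 76.8),  # ~28% faster than the 1080 Ti
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${price / fps:5.2f} per frame  ({fps:.0f} FPS at ${price})")
```

Even with the 2080 Ti's real performance lead, it lands around $15.60 per frame against the 1080 Ti's $11.65: more frames, but each one costs noticeably more.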

So why would the 20 series be considered so disappointing?

Admittedly, all of the reviews surrounding the 20 series are positive, and when the cards are evaluated objectively, it makes sense that a favorable verdict comes out. Where most of the community's disappointment lies is that the brand new features of the 20 series, things like real time ray tracing and DLSS, are cool and all, but they're not what a lot of people want from their enthusiast grade cards.

The endless quest for the highest frame rates possible continues in the enthusiast community. So when ASUS put a monitor on the market with a 3840×2160 (4K/UHD) resolution and a 144Hz refresh rate, many people were hoping the 20 series could push frame rates high enough to match it. Unfortunately, that was not the case. While 'uncompromised' above-60 FPS performance at 4K is nice to have, it comes off as though Nvidia "wasted", so to speak, its time implementing what could be considered superficial features into the new line of cards. Sure, ray tracing looks pretty, but in the heat of gameplay it is something that won't really be noticed. The same principle underlies the common advice that, to maximize frame rate, it's okay to turn down details like shadows, because the player probably won't notice the difference between medium and high in the middle of a match. The same can be said for the new technology in the 20 series: real time ray tracing looks incredibly pretty, but it is hard to appreciate at the pace of gameplay, it costs performance, and it is ultimately up to game developers to implement in their games.
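
Some quick arithmetic shows why the 4K 144Hz target is so brutal compared to the 60 FPS standard: the per-frame time budget shrinks while the pixel count quadruples. A small illustrative calculation, naively assuming shading cost scales with pixels per second:

```python
# Frame-time budgets and relative GPU load for common targets.
# Resolutions and refresh rates are real; the "relative load" column is a
# naive pixels-per-second estimate, not a measured benchmark.

targets = [
    ("1080p @ 60Hz",  1920 * 1080,  60),
    ("4K    @ 60Hz",  3840 * 2160,  60),
    ("4K    @ 144Hz", 3840 * 2160, 144),
]

base_pixels_per_second = 1920 * 1080 * 60
for name, pixels, hz in targets:
    budget_ms = 1000 / hz                           # time allowed per frame
    load = (pixels * hz) / base_pixels_per_second   # naive relative GPU work
    print(f"{name}: {budget_ms:5.2f} ms per frame, ~{load:.1f}x the work of 1080p60")
```

By this rough measure, 4K at 144Hz demands nearly ten times the pixel throughput of 1080p at 60Hz, with under 7 ms to finish each frame, which is why many hoped Nvidia would spend its transistor budget on raw speed rather than on ray tracing.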

Conspiracy

So what happened? What was Nvidia thinking, tacking on superficial features instead of simply pushing maximum frame rates at 4K? What was Nvidia thinking, releasing cards priced far higher than their predecessors, and in one instance performing about the same? While I have no insider knowledge, I believe their logic is easy to follow. Nvidia is certainly proud of their computational innovations and wants to expose them to as many people as possible, so they implemented these features in their latest mainstream graphics architecture. At the same time, they were able to pack enough transistors and cores onto the board to push 4K just past the widely accepted 60 FPS standard while looking as pretty as possible; in their own words, 'uncompromised'. Once Nvidia had the performance to push 4K into acceptable frame rates without turning down settings, their mentality shifted to packing in as many new features as they could scale down, so the average consumer could get those benefits. Rather than going for raw performance, Nvidia decided to strike a balance between performance and features.

That, of course, is a very positive perspective that gives Nvidia the benefit of the doubt. I must criticise the 2080, however: its performance is stupidly lackluster for its price, barely edging out the now much cheaper 1080 Ti. The price and performance of the 2080 Ti, meanwhile, have led to an interesting theory.

It goes something like this: the 2080 Ti wasn't originally planned as the flagship of an entirely new series of cards. It was supposed to be the next Titan, a semi-standalone card demonstrating Nvidia's new architecture, hence the price and the performance. Nvidia, however, figured they could "pull an AMD" and relaunch what is effectively the same lineup as the 10 series, packing in a new architecture and new features to somehow justify the price increase. The exception is the high end, which boasts the most significant performance gain, but at a very steep premium.

Make what you will of the theory. But as a result of the 20 series, I predict the GPU market will at least calm down some, and possibly normalize. Perhaps Nvidia’s price increase is meant to deter miners from scooping up the cards? Merely a playful speculation. Another prediction is that the GPU market will stagnate, much like the CPU market did before Ryzen, especially considering AMD doesn’t have any significant GPU announcements laid out this year.

Of course, the other side is that Nvidia is capitalizing on their majority market share and pricing things higher because they know they can get away with it. Only Nvidia knows their true intentions.

Are People Actually Disappointed?

And so we return to the original question of disappointment. That, of course, is for you to decide. From an objective standpoint, it is hard to deny Nvidia a victory here. For someone who likes to "wall-lick" (that is, turn everything in a game to max and stare at the pretty textures), it is also a victory. For someone hoping to make use of their new $2000 ASUS 4K 144Hz monitor, this series is a disappointment. The same goes for the consumer who was eagerly awaiting the 20 series to upgrade an aging card: unless they absolutely must have features like real time ray tracing, it is probably smarter to pick up a 10 series card, whose price will most likely drop as a result of the new series.

Finally, my personal opinion. As someone in the camp chasing the glorious 4K 144Hz benchmark (even though my monitor is 60Hz), I was disappointed in Nvidia. However, after writing this piece and thinking through everything surrounding the 20 series, I came to an understanding of what Nvidia was trying to do. And while I certainly won't be buying a 20 series card, a market certainly exists for this line. It will also be interesting to see how the future 2070 and 2060 stack up.

So what’s your opinion? Let me know in our community Discord.
