Nvidia should release an RTX 3090
X90 branding from Nvidia means multiple GPUs on a single card
Some of my readers will remember Nvidia's X90 branding, and others may be too new for it. I may be dating myself a bit here, but the first occurrence of it that I knew of was the GeForce GTX 690, a single card featuring two separate Kepler GK104 GPUs, released in 2012. Many readers probably remember the Fermi-based GTX 590 released roughly a year before, or even the GTX 295 a couple of years before that. These cards were known to be large, hot, and powerful, but also somewhat limited by the fickle mistress that was proper SLI implementation from developers.
We're not going to talk a whole lot about these dual-GPU cards, why they were once somewhat popular, or why they've faded away. What's relevant to the discussion is that today, in 2020, using multiple GPUs in gaming is largely dead. There aren't enough people who use it, so there aren't enough devs working on good implementations in their games, making it not worth it even if someone were to want it. In fact, in some cases the overhead of having the cards communicate with each other can decrease performance when the game developer doesn't design for multiple GPUs.
The result is that today, SLI is dead. The 690, released just over 8 years ago, is the last consumer GPU to feature dual GPUs. I'm not counting the R9 295X2, as it cost $1499 at launch in 2014, nearly 3x the price of its single-GPU variant, or the Titan Z from around the same time at an eye-watering $2999 price point.
It's time to let the X90 branding represent something other than mGPU cards
Since SLI is dead, and we haven't seen any consumer multi-GPU card since 2014, or any gaming mGPU card that could in any way be argued for since 2012, it's fair at this point to dismiss the notion of reserving the X90 naming for an mGPU card. Until Nvidia unlocks the secret sauce of proper MCM (Multi-Chip Modules) that would allow cards to carry multiple GPUs without requiring extra work from devs to fully use them in games, I would say the chances of getting a GeForce card with more than one GPU on it are approaching nil.
The history of how the 80 Ti branding for the top card came about
Many of our readers will probably know that not every consumer graphics card uses the same physical chip, but I'll briefly describe it by looking at a few generations of cards. It's common for GPU manufacturers to design only a few different GPUs to cover their entire product stack. In the RTX 2000 series, Nvidia has only 3 different GPUs: the TU102, TU104, and TU106. Counter-intuitively, as the number gets larger, the size of the GPU die decreases.
Note on SMs: What AMD refers to as a CU, or Compute Unit (you may have heard the PS5 will have 36 CUs and the Xbox 52 CUs), Nvidia calls a Streaming Multiprocessor. It's the smallest individually addressable chunk of the GPU, containing 64 CUDA cores, as well as, in some cases, other specialized hardware such as Tensor cores, Ray Tracing cores, or INT32 units, each with their own specializations. In general, more SMs means more performance.
| Product | GPU die | SMs Active | SMs Total | Launch Price |
|---|---|---|---|---|
| RTX 2060 | TU106 | 30 | 36 | $349 |
| RTX 2070 | TU106 | 36 | 36 | $499 |
| RTX 2080 | TU104 | 46 | 48 | $699 |
| RTX 2080 Ti | TU102 | 68 | 72 | $1199 |
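If you're curious how many SMs your own card has, the CUDA runtime reports it directly. Here's a minimal sketch, assuming you have the CUDA toolkit installed and compile with nvcc; note that the 64-cores-per-SM figure is a Turing-specific assumption I'm hardcoding, not something the API tells you:

```cpp
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int deviceCount = 0;
    if (cudaGetDeviceCount(&deviceCount) != cudaSuccess || deviceCount == 0) {
        printf("No CUDA device found\n");
        return 1;
    }
    for (int i = 0; i < deviceCount; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // multiProcessorCount is the number of SMs enabled on the card as
        // shipped -- the "SMs Active" column above, not the full die's total.
        printf("%s: %d SMs\n", prop.name, prop.multiProcessorCount);
        // Assumption: 64 FP32 CUDA cores per SM, which holds for the
        // RTX 20 series (Turing) but not for every architecture.
        printf("  ~%d CUDA cores (assuming 64 per SM)\n",
               prop.multiProcessorCount * 64);
    }
    return 0;
}
```

A 2080 Ti, for instance, should report 68 here, matching the Active column above rather than the full TU102's 72.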
As you can see, the 2080 Ti is actually a different die from the 2080. This alone is part of my reasoning for the X90 branding to be brought back. In fact, for both the 900-series Maxwell and 1000-series Pascal generations, the X70 and X80 both used 104-class GPUs, making the 970 and 1070 more similar to the 980 and 1080 than to the 980 Ti and 1080 Ti. However, there was once a good argument for the Ti naming.
Before the 700 series, the top-of-the-line Nvidia GPU was the dual-GPU card denoted by the X90 branding, with the X80 being the top single GPU. With the release of the 700 series, this changed somewhat. They initially launched the 700 series with the flagship 780, something that had been done plenty of times before. However, half a year later, they released a full version of that card's GK110 with more performance. Since it was a consumer version of the same GPU that was faster than the X80, but not a dual-GPU, it made a lot of sense to call it the 780 Ti, which they did. This marked the first case of top-end X80 Ti branding, a trend that has continued to this day.
At the time, they likely took this strategy because the big GK110 was being sold under the Titan branding for $999, which made sense given their yields of the GPU for a while. But there came a point when the Titan was not selling well enough, leaving them with a pile of fully functional chips, so they figured it was better to release a new top-end uncut GPU they could make more profit on. And the card sold extremely well; having a new top-end card prompted a whole new wave of enthusiast adoption.
The success of the 780 Ti led them to do exactly the same thing with Maxwell. In 2014 they released the 970 and 980, both based on the GM204 GPU, then halfway through the product lifecycle released the 980 Ti based on the larger GM200. They repeated this with the 1000 series, initially releasing the 1080 based on the GP104 in 2016 and the 1080 Ti based on the larger GP102 in 2017. Each time they released a new series, the X80 was faster than the previous X80 Ti, meaning people who constantly craved the top-end GPU would pay twice per generation for the new top card.
Everything changed when the RTX attacked
With the release of the 20 series, however, this all changed. Since the addition of Ray Tracing and Tensor cores inflated the size of the SMs, they couldn't afford to add as many, even though they did increase performance per SM. The 2080 ended up barely matching the 1080 Ti in performance, with fewer CUDA cores in total and less RAM. Nvidia, knowing they would not get the desired sales from a 2080 that only matched the 1080 Ti, released the 2080 Ti at the same time, but at a massively inflated and unheard-of (for a consumer single-GPU card) price of $1199. While some have argued the cost was to justify the R&D of RT cores, or was due to the large size of the die, given the very low price of the mature 12nm node there were probably two main motivations for the price. The first was to recoup the loss from being unable to launch an 80 Ti halfway through the cycle and prompt enthusiasts to upgrade a second time; the second was that the lack of competition from Radeon at the time let them experiment with what they could get people to pay.
But why not just keep the X80 Ti branding? It's fine, right?
I'm saying the 3090 should be the next top-end single-GPU card. In fact, they should have started this with the 2000 series: instead of releasing the 2070, 2080, and 2080 Ti, they should have released the 2070, 2080, and 2090. The main reason is evidenced by the presence of the Super branding. They initially launched the 20 series intending for it to simply be their lineup until the 30 series came out. However, Radeon became a threat with the release of Navi, something they didn't anticipate, and in response they had to do a mid-generation refresh.

It would have been most convenient if they could have used their established Ti branding for that refresh, something they'd done many times before, but they already had the 2080 and 2080 Ti. Releasing the 2080 Ti alongside the 2080 ended up being a problem for them, and it's the reason they released the Super series: they had to come up with a new way to say there's a mid-generation upgrade. This meant the confusing situation of having a 2080, a 2080 Super, and a 2080 Ti. Had they released the 2060, 2070, 2080, and 2090, they could have made a 2060 Ti, 2070 Ti, and 2080 Ti, and even had the space to make a 2090 Ti with the fully unlocked TU102 chip and its 72 SMs (4 more than the cut-down TU102 in the 2080 Ti), again prompting enthusiast upgrades.
Summary
There are several points explored here supporting the 3090 branding.
- It's been 8 years since X90 meant dual-GPU. Many people, and certainly most of the general community, probably don't even remember that it ever did.
- Dual-GPU cards are probably not coming back anytime soon, and if they do, new branding can be made for them. Maybe Titans, like we saw with the Titan Z.
- Nvidia is still using the 104-tier GPU for the 80-tier cards, but they've brought in the 102-tier GPUs for the 80 Ti tier. It would make more sense for the 102-tier GPU to be represented by a halo-tier 90 series.
- We can see from the 20 series Super shenanigans how releasing the 2080 Ti alongside the 2080 bit them in the ass. Releasing a 2090 instead of a 2080 Ti would have kept the flexibility to use the Ti naming later on.
- An extra one not discussed earlier: having the 2090 as the $1199 GPU would have made marketing a LOT easier, as they wouldn't have been going from a $699 1080 Ti to a $1199 2080 Ti, prompting massive complaints. They could have simply argued that the 2090 was a new product tier, and that the 2080 was launching at the same $699 price the 1080 initially launched at while delivering higher performance.
I'm going to be honest: I'm about 97% sure there will be no 3090. I expect they're preparing the 3080 Ti, and depending on how well the 3080 competes with Big Navi, they may not even launch it until next year. I'm not predicting that they will release a 3090. I'm just saying they should.
Thank you for reading!
As always, come visit me in Discordland! I've got a flourishing little community that's been rather active so far, with a lot of great tech discussions, and I'd love more people around to tell me why I'm wrong.
Twitter: https://twitter.com/MeyerRants
Patrons: I'm going to give a shout out here to my Patron, KarbinCry. That's right, I have a Patron.