Comments by "IIIRattleHeadIII" (@badass6300) on "NVIDIA... Stop trying to trick consumers! (RANT)" video.
@Cube930 Die size is the cost factor, along with R&D. Nvidia doesn't pay for performance, they pay for die size, because they buy wafers, not individual chips. The smaller the die, the more chips per wafer and the cheaper each chip is, on top of it being more power efficient and needing a cheaper PCB and cooler.
Performance is a byproduct; it's not a cost factor.
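That wafer math is easy to sketch in code. This is a minimal illustration, not Nvidia's actual costing: the dies-per-wafer formula is the standard first-order approximation, and the wafer price and yield are made-up placeholders (the die areas are the ones cited below).

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order approximation: usable wafer area divided by die area,
    minus a correction for partial dies lost along the wafer's round edge."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

WAFER_COST_USD = 15_000  # placeholder figure, not a real quote
YIELD = 0.80             # assumed fraction of good dies

for name, area_mm2 in [("RTX 3070 die", 392.0), ("RTX 4080 die", 389.0)]:
    n = dies_per_wafer(area_mm2)
    print(f"{name}: ~{n} dies/wafer, ~${WAFER_COST_USD / (n * YIELD):.0f} per good die")
```

Halving the die area roughly doubles the dies per wafer, and the edge-loss term means smaller dies also waste proportionally less of the wafer's rim, which is exactly the cost argument above.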
And in general, the definition of a mid-range chip, regardless of whether it's a CPU, GPU, ASIC, FPGA, trigger, etc., is a chip that's roughly half the reticle limit.
The reticle limit has been 858mm^2 for years now, and it will stay there until they invent something that isn't based on UV.
At 389mm^2 and 392mm^2, both dies are roughly half the 858mm^2 reticle limit, especially considering you never want to actually hit that limit (never go above 95% of it). I'd say both the RTX 3070 and the RTX 4080 are mid-range GPUs.
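A quick sanity check of that "roughly half" claim, using only the figures quoted above:

```python
RETICLE_LIMIT_MM2 = 858  # roughly a 26 mm x 33 mm exposure field

for name, area_mm2 in [("RTX 3070", 392), ("RTX 4080", 389)]:
    print(f"{name}: {area_mm2} mm^2 is "
          f"{area_mm2 / RETICLE_LIMIT_MM2:.0%} of the reticle limit")
```

Both land around 45-46%, i.e. just under half, which is what makes them mid-range dies by this definition.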
Just like the GTX 1080 was a sub-mid-range GPU that should have cost $350 but instead cost $600-700... And if you go look at Nvidia's financial reports from 2016-2017, you'll see their profit margins were between 90 and 138%, which is why their stock price exploded back then, and that was before AI... The RTX 4000 series is just as bad as, if not worse than, the GTX 1000 series.