imo people are exaggerating. let’s take the 4070 Ti as an example: an $800 GPU that trades blows with a 3090 Ti. is that overpriced to you? an $800 GPU that fast? compare it to the GTX 1080 Ti, once Nvidia’s best GPU. the 1080 Ti launched at $700 about 6 years ago; the 4070 Ti is $800 in 2023. for $100 more after 6 years you get enthusiast-class 2023 performance, and you still complain? over double the performance, despite physical limits like not being able to shrink transistors forever without driving up cost, plus the silicon shortage and thermals, isn’t enough? the 12 GB of VRAM is still fine even for really heavy workloads like video editing as long as you’re working at a reasonable resolution, and for games, devs should stop inflating texture sizes for an unnoticeable bump in graphics quality.
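The $700-then vs $800-now comparison can be sanity-checked with a quick inflation adjustment. A minimal sketch, assuming a rough ~2.5% average annual inflation rate purely for illustration (actual US CPI over 2017–2023 differed; plug in real figures to taste):

```python
# Carry the 1080 Ti's $700 launch price (2017) forward 6 years.
# The 2.5% annual rate is an assumed round number, not official CPI data.
def inflation_adjust(price: float, years: int, rate: float = 0.025) -> float:
    """Compound a price forward by `rate` per year for `years` years."""
    return price * (1 + rate) ** years

adjusted = inflation_adjust(700, 6)
print(f"$700 in 2017 is roughly ${adjusted:.0f} in 2023 dollars")
```

Under that assumption, the 4070 Ti's $800 sticker is close to the 1080 Ti's launch price in real terms, which is the point the comment is making.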
also, i don’t think anything above a 4060 Ti/4070 should be bought purely for gaming. like, bro, you’re spending over $1000 on a toy; if you have something faster, at least use it to make money or learn a new skill (3D software, AI training, video editing, etc). any GPU over $600 should mainly target creative people, for whom the price tag isn’t a problem since they can easily earn it back using the very GPU they invested in
$800 is a lot if you have a family to feed.
or if you live outside the US.
And for what? Slightly shinier, more reflective visuals in games for a few hours a week?
More like physically accurate lighting. But ignorance is bliss I guess.
And considering AMD has better performance at that price, there’s really no reason to buy Nvidia unless you want a 4090 or ray tracing
To be honest though, AMD isn’t really trying either. They just undercut Nvidia by 5-10% and call it a day. Gone are the days when they would undercut by 30-50% and force Nvidia to up its game. Both companies used to target a flagship price point of $699; now we routinely see GPUs in the $1000-$2000 range.
Where I come from, that’s called overpriced. ‘Not trying’ is treating AMD with kid gloves while dumping the blame entirely on Nvidia