The only thing keeping 4080 (and 5080) cards “reasonably” priced is the fact that they only have 16GB, therefore they aren't that good for AI shit. You don't need more than 16GB VRAM for gaming. If those cards had more VRAM, the AI datacenters would pick them up, keeping their price even higher than it is.
You kinda can… Nvidia card users have been having the toughest time with the Hunt Showdown update because CryEngine is happily gobbling up VRAM. For AMD cards it's not a problem, but various Nvidia card owners have been having bad experiences running at the resolutions they normally do.
Maybe 16GB is the number where things are okay, I haven't heard complaints on cards above 12GB. However, point being… Nvidia being VRAM stingy has bitten some folks and at least one game developer.
I have a 7900xt and was using over 17gig in Jedi Survivor. No ray tracing, no frame gen. Just raw raster and max AA.
Granted, that’s because that game is so horribly optimized. But still… I used more than 16gig.
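If you want to check what your own card is actually using, here's a rough sketch. The sysfs paths and the `card0` index are assumptions that depend on your distro and driver version (the counters come from the Linux `amdgpu` driver; on Nvidia it falls back to `nvidia-smi`):

```shell
#!/bin/sh
# Sketch: report current VRAM usage on Linux.
# Assumptions: amdgpu exposes mem_info_vram_* sysfs counters (in bytes),
# and your GPU is card0; Nvidia systems ship nvidia-smi instead.

bytes_to_gib() {
  # Convert a raw byte count to GiB with one decimal place.
  awk -v b="$1" 'BEGIN { printf "%.1f", b / 1073741824 }'
}

if [ -r /sys/class/drm/card0/device/mem_info_vram_used ]; then
  used=$(cat /sys/class/drm/card0/device/mem_info_vram_used)
  total=$(cat /sys/class/drm/card0/device/mem_info_vram_total)
  echo "VRAM: $(bytes_to_gib "$used") / $(bytes_to_gib "$total") GiB"
elif command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=memory.used,memory.total --format=csv
else
  echo "no supported GPU interface found"
fi
```

Watching that while playing is how you'd catch a game like Jedi Survivor creeping past 16GB.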
Still, 32GB seems EXCESSIVE.
It's very low IMO if you want 4K gaming to work.
Nvidia and production yields decide how much VRAM the cards are gonna get.
If it made financial sense (if a market existed), Nvidia would stop making desktop cards overnight.
For VR, you do already.