The original post: /r/hardware by /u/No_Narcissisms on 2025-01-21 00:29:33.
Original Title: Why is AI preferred so much alongside the hardware instead of behind the hardware? How much performance that would otherwise come from other means (stream processors, clock rates, etc.) is being replaced by AI?
To put it better: let's say, for example, that instead of DLSS/FSR being features you can turn on or off, they were implemented behind the hardware, baked into the amount of performance you get, something you couldn't toggle at all. What exactly makes exposing AI as a user-facing feature more beneficial than a single fixed implementation behind the hardware? Doesn't carrying an AI side load mean GPU designers are reducing the true potential of a card, effectively selling us less as more no matter how much you spend?
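One way to frame the die-area question being asked here is a back-of-envelope frame-time comparison. The sketch below uses entirely hypothetical numbers (the die-area share of the AI units, the render-resolution scale, and the upscaler's per-frame cost are all assumptions, not measurements of any real GPU or of DLSS/FSR) to show the shape of the tradeoff between spending area on AI units versus spending it on more shaders.

```python
# Back-of-envelope sketch with HYPOTHETICAL numbers: it illustrates the shape
# of the tradeoff only, not any real GPU's die budget or DLSS/FSR cost.

NATIVE_FRAME_MS = 20.0        # assumed frame time at native resolution
TENSOR_AREA_SHARE = 0.07      # assumed fraction of die area spent on AI units
UPSCALE_INPUT_SCALE = 0.44    # assumed pixel-count ratio, e.g. ~2/3 scale per axis
UPSCALER_OVERHEAD_MS = 1.5    # assumed fixed cost of the AI upscaling pass

# Option A: drop the AI units and spend the same area on more shaders.
# Assume raster time shrinks roughly in proportion to the extra shader area.
frame_ms_more_shaders = NATIVE_FRAME_MS / (1.0 + TENSOR_AREA_SHARE)

# Option B: keep the AI units, render fewer pixels, and upscale the result.
# Assume raster time scales roughly with pixel count (often optimistic).
frame_ms_upscaled = NATIVE_FRAME_MS * UPSCALE_INPUT_SCALE + UPSCALER_OVERHEAD_MS

print(f"More shaders, native res : {frame_ms_more_shaders:.1f} ms/frame")
print(f"AI units + upscaling     : {frame_ms_upscaled:.1f} ms/frame")
```

Under those assumed numbers, reclaiming the AI units' area only buys a few percent of raster performance, while rendering fewer pixels and upscaling cuts frame time far more. That is also the usual argument for keeping it a toggle rather than baking it "behind the hardware": when the image-quality tradeoff isn't acceptable, the full native-resolution path is still there.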