• monte1ro@alien.topB

      I mean, what's the point? The 5700X3D will basically be a slower 5800X3D. Seems odd to me. I would have liked to see more CPUs with V-Cache, like a 5500X3D, and V-Cache on APUs.

      • TheDarthSnarf@alien.topB

        "The 5700X3D will basically be a slower 5800X3D. Seems odd to me."

        It lets them sell lower-binned chips that didn't pass all the qualifications required to be sold as a 5800X3D but can still run without issue at lower clock speeds.

      • sequentious@alien.topB

        My first thought was power usage: the 5700X is 65W, while the 5800X and 5800X3D were 105W. But that theory doesn't hold up, since a quick check shows the 5600X3D is rated at 105W while the 5600X was 65W. So that's not it.

        I suspect the real point is just yields, the same reason the 5600X3D existed (and maybe the 5700X itself, IIRC). These may be the same chips, just with 8 working cores (vs. 6 for the 5600X3D) and clocked lower.

      • UndergroundMartyn@alien.topB

        There’s also an NPU version coming, supposedly. In any case, it’s better to see what they release before buying anything. I was just gonna upgrade to the cheapest 5600 available.

        I don’t think I need the extra power right now. Most of my games run just fine. Except Starfield, but it’s a shitty game anyway.

      • WayDownUnder91@alien.topB

        Leftover chips that don’t meet EPYC power-draw requirements but also don’t clock high enough for a 5800X3D. Instead of throwing them away, they sell a slightly worse chip, which is basically how all GPU/CPU binning works.
        They’re reusing something that already exists, not making something entirely new.
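
        To make that binning flow concrete, here’s a toy Python sketch of the decision process. It’s purely illustrative: the power-draw cutoff, the clock qual, and the SKU assignments are all invented numbers, not AMD’s actual qualification criteria.

        ```python
        # Toy sketch of the binning flow described above. All thresholds
        # are made up for illustration, not AMD's real qual process.

        from dataclasses import dataclass

        @dataclass
        class Die:
            working_cores: int           # cores that passed validation
            max_stable_clock_ghz: float  # highest clock the die validated at
            power_draw_w: float          # measured draw at that clock

        def bin_die(die: Die) -> str:
            """Assign a hypothetical SKU based on validation results."""
            if die.working_cores < 6:
                return "scrap"
            if die.working_cores == 8 and die.power_draw_w <= 90:
                return "EPYC (made-up power-draw qual)"
            if die.working_cores == 8:
                # Full 8-core V-Cache die: top SKU if it clocks high enough,
                # otherwise sold as the lower-clocked part instead of scrapped.
                return "5800X3D" if die.max_stable_clock_ghz >= 4.5 else "5700X3D"
            return "5600X3D"  # 6 working cores

        # A die that works fine but just clocks lower -> sold as a 5700X3D
        print(bin_die(Die(working_cores=8, max_stable_clock_ghz=4.2, power_draw_w=100.0)))
        ```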