I was recently reading Tracy Kidder’s excellent book The Soul of a New Machine.

The author pointed out what a big deal the transition to 32-bit computing was.

However, in the last 20 years, I don’t really remember a big fuss being made of most computers going to 64-bit as a de facto standard. Why is this?

  • bankkopf@alien.topB · 1 year ago

    Memory bus width != CPU bitness

    Those two numbers describe totally different things.

    For GPUs, the bitness number is usually used to describe the width of the memory bus, i.e. how much data can be transferred concurrently.

    For CPUs, the bitness describes the size of the data the processor can handle in a single operation. With AVX-512, modern x86 CPUs can process data vectors that are up to 512 bits long.

    GPUs are in fact 64-bit processing units: the largest data type they are designed to handle is the 64-bit double-precision floating-point number.
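
    To make the CPU side of this concrete, here is a minimal C sketch (my own illustration, not from the book) showing that on a typical 64-bit system pointers and native integers are 8 bytes wide, while a SIMD extension like AVX-512 operates on much wider vectors; the two numbers measure different things:

    ```c
    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* On a typical 64-bit build, pointers and native-width integers
         * are 8 bytes (64 bits) wide -- that is the CPU's "bitness". */
        printf("pointer size:        %zu bytes\n", sizeof(void *));
        printf("native integer size: %zu bytes\n", sizeof(uintptr_t));

        /* An AVX-512 vector register, by contrast, is 512 bits wide,
         * i.e. 64 bytes -- a throughput figure, not the CPU's bitness. */
        printf("AVX-512 vector size: %d bytes\n", 512 / 8);
        return 0;
    }
    ```

    The same distinction applies to a GPU's "384-bit" memory bus: that number describes how many bits move per transfer, not the size of the values being computed on.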