I was recently reading Tracy Kidder’s excellent book Soul of a New Machine.

The author pointed out what a big deal the transition to 32-bit computing was.

However, in the last 20 years, I don’t really remember a big fuss being made of most computers going to 64-bit as a de facto standard. Why is this?

  • AcanthisittaFlaky385@alien.top
    11 months ago

    Gee really? Here’s me thinking 32-bit instruction sets were cosmetic. Thank you for ignoring the part where I said we’re still in a transition phase.

    Also, with a bit of tinkering, you can still run 16-bit applications. It’s just recommended to use virtualisation software, because Microsoft doesn’t ensure quality updates for 16-bit applications.
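
    For what it’s worth, the “bitness” being discussed here is visible from inside a program: it is (roughly) the width of pointers and of the native address space. A minimal C sketch, which will report 4 bytes when compiled as a 32-bit binary and 8 bytes as a 64-bit one:

    ```c
    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* Pointer width is the practical meaning of "32-bit" vs "64-bit":
           a 32-bit process can address at most ~4 GiB of virtual memory. */
        printf("pointer size: %zu bytes (%zu-bit)\n",
               sizeof(void *), sizeof(void *) * 8);

        /* UINTPTR_MAX is the largest value a pointer-sized integer can hold,
           i.e. the upper bound on the address space. */
        printf("max addressable value: %llu\n",
               (unsigned long long)UINTPTR_MAX);
        return 0;
    }
    ```

    Compiling the same file with `-m32` vs `-m64` (on a toolchain that supports both) is a quick way to see the two worlds side by side.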