I was recently reading Tracy Kidder’s excellent book The Soul of a New Machine.
The author pointed out what a big deal the transition to 32-bit computing was.
However, in the last 20 years, I don’t really remember a big fuss being made about most computers going to 64-bit as a de facto standard. Why is this?
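(For context, my rough understanding is that for ordinary application code the switch mostly shows up as wider pointers and a few wider integer types, rather than anything visibly dramatic. A tiny C sketch, assuming a gcc/clang x86 toolchain where both -m32 and -m64 builds are available, makes the difference visible:)

```c
/* Illustration only: compile the same file twice, e.g. "gcc -m32 sizes.c"
 * and "gcc -m64 sizes.c" on x86 Linux, and compare what actually changes. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    printf("void*     : %zu bytes\n", sizeof(void *));    /* 4 -> 8 */
    printf("long      : %zu bytes\n", sizeof(long));      /* 4 -> 8 on LP64 */
    printf("size_t    : %zu bytes\n", sizeof(size_t));    /* 4 -> 8 */
    printf("int       : %zu bytes\n", sizeof(int));       /* 4 in both */
    printf("uintptr_t : %zu bytes\n", sizeof(uintptr_t)); /* tracks pointer size */
    return 0;
}
```

On a typical LP64 Linux build the 64-bit binary reports 8-byte pointers and longs where the 32-bit one reports 4, and that (plus the larger address space) is most of what end users ever noticed.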
Gee really? Here’s me thinking 32-bit instruction sets were cosmetic. Thank you for ignoring the part where I said we’re still in a transition phase.
Also, with a bit of tinkering, you can still run 16-bit applications. It’s just recommended to use virtualisation instead, because Microsoft doesn’t ensure quality updates for 16-bit support, and 64-bit Windows dropped the NTVDM subsystem entirely.
Point is that seldom-used instructions are microcoded anyway, so they cost almost no die area compared with dedicated execution hardware.
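If anyone wants to see how much legacy is still carried along, here’s a quick sketch using GCC/Clang’s <cpuid.h> (x86/x86-64 only; the leaf-1 EDX bit positions are from memory, so worth double-checking against Intel’s SDM):

```c
/* Sketch: query CPUID leaf 1 and report a few decades-old feature bits
 * that current x86-64 CPUs still advertise for backward compatibility. */
#include <stdio.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        puts("CPUID leaf 1 not supported");
        return 1;
    }

    /* EDX bit positions per the documented leaf-1 layout. */
    printf("x87 FPU: %s\n", (edx & (1u << 0))  ? "yes" : "no");
    printf("MMX:     %s\n", (edx & (1u << 23)) ? "yes" : "no");
    printf("SSE:     %s\n", (edx & (1u << 25)) ? "yes" : "no");
    printf("SSE2:    %s\n", (edx & (1u << 26)) ? "yes" : "no");
    return 0;
}
```

All of those will report "yes" on any recent x86-64 part, which is exactly why it matters that keeping the rarely-used stuff around is cheap.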
I honestly don’t know what you are talking about. CPUs aren’t storage devices.