x86 came out in 1978,
21 years later, x64 came out in 1999.
We are three years overdue for a shift, and I don’t mean to ARM. Is there just no point to it? 128-bit computing is a thing and has been talked about since 1976, according to Wikipedia. Why hasn’t it been widely adopted by now?
Other people have addressed why 64-bit is still fine, but I just want to say that “x86” and “x64” are not two different architectures the way that you’re presenting them. We still use the x86 architecture, it’s just that x86-64, or AMD64, or whatever you want to call it, is a 64-bit extension of that architecture.
And this isn’t the first time that happened; the original 8086 was a 16-bit processor, as was the 286. The 386, however, was a 32-bit processor with backward compatibility for software written for the earlier 16-bit x86 CPUs.
The 386 came out in 1985, so there’s actually a 14-year gap, or really an 18-year gap, since a 64-bit x86 processor didn’t hit the market until 2003. The gap between 16-bit and 32-bit x86, by contrast, was only 7 years.
But ultimately, as others have said, the answer is that we don’t need to go beyond 64-bit right now. The gap between 16-bit and 32-bit was so short because the limits of a 16-bit architecture became practical obstacles to progress much sooner than the limits of 32-bit did, and it will take far longer still to outgrow 64-bit, because the address space grows exponentially with the bit width, not linearly.
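To put rough numbers on that exponential growth, here’s a quick back-of-the-envelope sketch (plain Python, nothing architecture-specific, just arithmetic) of how many bytes a flat address space covers at each pointer width:

```python
# Each extra address bit doubles the addressable range, so widening the
# pointer doesn't add "a bit more" room each time -- doubling the width
# squares the number of addressable bytes.
for bits in (16, 32, 64, 128):
    print(f"{bits:>3}-bit: {2**bits:,} addressable bytes")

#  16-bit: 65,536 bytes                      (64 KiB)
#  32-bit: 4,294,967,296 bytes               (4 GiB)
#  64-bit: 18,446,744,073,709,551,616 bytes  (16 EiB)
# 128-bit: about 3.4 * 10**38 bytes
```

And even that 64-bit figure overstates what we actually use: today’s x86-64 chips only wire up 48 (or, with 5-level paging, 57) of those 64 virtual-address bits, and nobody is close to filling that.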