It’s similar to how the term ‘digital’ has shifted from meaning numeric calculation and storage to how it’s used today: content you purchase and stream or download from the internet.
Terminology changes over time, and I’m not sure that I like some of the changes. 🤷
i dont see it. ill never stop reading a dumb terminal as meaning ‘no local processing’.
many LLMs are moving to locally run versions that need far less memory and processing power than the bigger versions… thats not going to just evaporate. lots of people dont trust cloud compute.
No telling for sure; I don’t write the terminology. Some changes have actually been good ones, like when they stopped designating PATA hard drives as master/slave and started using primary/secondary instead.
But think about the newer generation that has never seen a true dumb terminal; they have no concept of a device with no local processing. So it won’t surprise me a bit if that generation starts asking questions like ‘Does it have Artificial Intelligence? No? Well, it must be a dumb system then’…
yeah, but i only swim in technical circles and ive never, ever heard a general compute device referred to as ‘dumb’. ever. not even machines that only run an app connecting to a mainframe, essentially turning them into dumb terminals.
that said, im old. i had a rather technical coworker ask me for clarification when i made a napster reference. the young determine the language.
now get off my lawn.
Upvoted for get off my lawn 😂🤣