Yes, we still need hex, but I can't see much need for binary
A reader commented that there are still places where even web developers need hex, namely color codes. I agree. There are also some character codes that call for it, although in HTML/XML they are typically written in decimal rather than hex. But, clearly, hex is still needed.
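To make that concrete, here is about as far as a web developer's hex usually goes: pulling the red/green/blue components out of a CSS-style color code, and writing the same character reference in decimal and in hex. (A minimal Python sketch; the particular color value is just an arbitrary example.)

```
# Splitting a CSS-style hex color code into red/green/blue components.
color = "#1E90FF"                    # arbitrary example value
r, g, b = (int(color[i:i + 2], 16) for i in (1, 3, 5))
print(r, g, b)                       # 30 144 255

# The same HTML character reference in decimal and in hex: both are "e-acute".
# &#233; (decimal) and &#xE9; (hex) name the same code point.
print(chr(233) == chr(0xE9))         # True
```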
So, I think the reader proved my point that it is hex that is important, not so much binary itself. Yes, students need to be able to count from 0 to F, but how often do they need to know the actual bit encoding for "C" or "C136E9ABA6D8428DB935DF7BD587C0E6"? And, sure, some people do need to know about the bit-level details of Base64 encoding or cryptography and codecs, but how many out of every 1,000 software developers ever need to use actual 0 and 1 binary?
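For the curious, here is that bit-level detail in miniature: the actual bit encoding of "C", and the trick at the heart of Base64, which regroups three 8-bit bytes into four 6-bit values. (A hedged sketch using Python's standard base64 module; the input string is the classic textbook example.)

```
import base64

# The actual bit encoding of "C": rarely needed, but easy to see.
print(format(ord("C"), "08b"))                 # 01000011

# Base64's bit-level trick: regroup three 8-bit bytes into four
# 6-bit values, each of which indexes a 64-character alphabet.
data = b"Man"                                  # 3 bytes = 24 bits
bits = "".join(format(byte, "08b") for byte in data)
groups = [bits[i:i + 6] for i in range(0, 24, 6)]
print(groups)       # ['010011', '010110', '000101', '101110']
print(base64.b64encode(data).decode())         # TWFu
```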
I can't even remember the last time I needed the "&" (bitwise "AND") operator to "test" a bit or "mask" a bit-field. Probably not in the last 10 years. Not even 15 years. Maybe it was 20 years ago.
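For anyone who has genuinely never needed it, this is what that looks like: using "&" to test a flag bit and to mask a bit-field. (A minimal sketch; the flag names and field layout are made up for illustration.)

```
# Hypothetical permission flags, one bit each.
READ  = 0b001
WRITE = 0b010
EXEC  = 0b100

perms = READ | WRITE

# "Test" a bit: is a given flag set?
print(bool(perms & WRITE))    # True
print(bool(perms & EXEC))     # False

# "Mask" a bit-field: extract the low byte of a 16-bit value.
value = 0xC1A5
low_byte = value & 0x00FF
print(hex(low_byte))          # 0xa5
```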
So, colors or character codes in hex, yes, but where can I find anybody using binary these days, other than in hardware, hardware interfaces, and device drivers?
Just to be clear, there are and will be an elite few computer scientists and advanced practitioners who really do need to be able to work and think at "the bit level", but their numbers are dwindling, I think.
A related question is whether the vast majority of "modern" software developers even need to know about "shifting" or "rotating" bits.
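For the record, here is what shifting and rotating amount to. Python, like most high-level languages, has shift operators but no rotate operator, so the rotate below is composed from two shifts and a mask. (A hedged sketch; the 8-bit width is an arbitrary choice.)

```
x = 0b10110001

# Shifts, kept to 8 bits.
print(format((x << 1) & 0xFF, "08b"))   # 01100010
print(format(x >> 1, "08b"))            # 01011000

# An 8-bit left rotate, composed from two shifts and a mask,
# since Python has no built-in rotate operator.
def rotl8(value: int, n: int) -> int:
    n %= 8
    return ((value << n) | (value >> (8 - n))) & 0xFF

print(format(rotl8(x, 1), "08b"))       # 01100011
```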
Again, all of this said, we may be stuck with the binary mentality until there is some major computational advance on the order of quantum computing or Ray Kurzweil's Singularity.