Lately I have seen a couple of cultural references to "binary" and "the 0's and 1's of computers/digital data", but just this morning I realized that it has been a very long time since I needed to know much at all about "binary" data. Sure, I need to know how many "bits" are used for a character or integer or float value, but that is mostly just to know its range of values, not the linear 0/1 quality of the specific values. Sure, I've looked at hex data semi-frequently (e.g., a SHA hash), but even then the 0/1 aspect of "binary" is completely deemphasized, and we might as well be working with hex-based computers as with binary-based computers.
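To make that everyday experience concrete, here is a minimal sketch in Java (the class name and sample input are just placeholders I made up): the digest comes out as hex, and the "bits" of an int or long matter only as a range of values.

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;

    public class HexNotBits {
        public static void main(String[] args) throws NoSuchAlgorithmException {
            // A SHA-256 digest is almost always read as hex, never as raw 0's and 1's.
            byte[] digest = MessageDigest.getInstance("SHA-256")
                    .digest("hello".getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) {
                hex.append(String.format("%02x", b & 0xff));
            }
            System.out.println("SHA-256: " + hex);

            // What most programmers actually care about is the range a width implies,
            // not the individual bits: a 32-bit int is just "about +/- 2.1 billion".
            System.out.println("int range:  " + Integer.MIN_VALUE + " .. " + Integer.MAX_VALUE);
            System.out.println("long range: " + Long.MIN_VALUE + " .. " + Long.MAX_VALUE);
        }
    }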
Sure, hardware engineers still need to know about binary data.
And, on rare occasions, software developers do find a use for "bit" fields, but even though that usage benefits from the efficient storage of binary values, I'm sure a hex-based machine could implement bit fields virtually as efficiently. In any case, bit fields don't strictly depend on the computer being binary-based. How many "web programmers" or "database programmers" or even "java programmers" need even a rudimentary comprehension of "binary" data, as opposed to ranges of values?
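For what it's worth, here is a rough sketch of what a "bit field" usually amounts to in practice, in Java (the flag names are hypothetical): the code manipulates masks and positions as abstract values, and nothing in it depends on the hardware physically being binary.

    public class FlagDemo {
        // Hypothetical permission flags packed into one int.
        static final int READ    = 1 << 0; // 0x1
        static final int WRITE   = 1 << 1; // 0x2
        static final int EXECUTE = 1 << 2; // 0x4

        public static void main(String[] args) {
            int perms = READ | WRITE;                 // set two flags
            boolean canWrite = (perms & WRITE) != 0;  // test a flag
            perms &= ~WRITE;                          // clear a flag
            System.out.println("canWrite=" + canWrite
                    + ", perms=0x" + Integer.toHexString(perms));
        }
    }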
Besides, when data is "serialized" or "encoded", the nature of the original or destination machine's implementation or storage scheme is completely irrelevant. Sure, we use 8-bit and 16-bit "encodings", but those are really 256-value or 65,536-value encodings, or 1-byte vs. 2-byte. And the distinction would certainly be irrelevant if the underlying computer had 256-value or 65,536-value computing units.
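A small sketch of that point in Java (the sample string is arbitrary): the "encodings" are really just choices of code-unit size, and the program never looks at an individual bit.

    import java.nio.charset.StandardCharsets;

    public class EncodingDemo {
        public static void main(String[] args) {
            String s = "Hello, world";
            byte[] utf8  = s.getBytes(StandardCharsets.UTF_8);     // 256-value (1-byte) code units
            byte[] utf16 = s.getBytes(StandardCharsets.UTF_16BE);  // 65,536-value (2-byte) code units
            System.out.println("UTF-8 bytes:  " + utf8.length);    // 12 for this sample string
            System.out.println("UTF-16 bytes: " + utf16.length);   // 24 for this sample string
        }
    }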
Granted, software designers designing character encoding schemes (or audio or other media encodings) need to "lay out the bits", but so few people are doing that these days. It seems a supreme waste of time, energy, and resources to focus your average software professional on "the 1's and 0's of binary."
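To show what "laying out the bits" looks like for that small minority, here is a simplified illustration in Java of one narrow case, the two-byte UTF-8 sequence (the example code point is arbitrary, and this is nowhere near a full encoder):

    public class Utf8Layout {
        public static void main(String[] args) {
            // Two-byte UTF-8 layout: 110xxxxx 10xxxxxx, valid for U+0080..U+07FF.
            int cp = 0x00E9; // 'é' -- an arbitrary example code point
            int b0 = 0b110_00000 | (cp >> 6);    // high 5 bits of the code point
            int b1 = 0b10_000000 | (cp & 0x3F);  // low 6 bits of the code point
            System.out.printf("U+%04X -> %02X %02X%n", cp, b0, b1); // expect C3 A9
        }
    }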
My hunch is that "binary" and "1's and 0's" will stick with us until the underlying hardware implementation shifts from 1-bit binary to hex or byte-based units (or even double-byte units), and then for maybe another 5 to 10 years after that transition, if not longer. After all, we still "dial" phone numbers even though it has probably been 25 years or more since any of us had a phone "dial" in front of us, and certainly the younger generations have never had that experience.
-- Jack Krupansky