From: Marnen Laibow-Koser (marnen@marnen.org)
Date: Fri Jun 22 2007 - 11:04:28 CDT
On Jun 22, 2007, at 11:07 AM, George W Gerrity wrote:
[...]
>
> As a retired Computer Scientist who has published on Computer
> Arithmetic and Computer Design and someone well versed in Number
> Theory, I can assure you that there is absolutely no future for
> (human-readable) representations in bases larger than 16, even
> assuming that future internal representations do not use numbers
> based on a power of 2 (the smallest computationally-useful Prime
> Number), but some power of another small Prime Number, such as 3,
> 5, 7, 11, or even 13.
[...]
I agree with you that this thread is a bit strange, but I must take
issue with your statement here. I don't know why you seem to think
that there's some sort of magic limit at 16, and in fact I have seen
practical applications of number bases greater than 16. The one that
springs most immediately to mind is a hack for use in situations of
limited memory, which I have seen recommended in at least one
programming text. The hack takes advantage of the fact that 36^3 <
2^16 in order to represent 3-character strings in [0-9A-Z] as 16-bit
integers in base 36, thus using two bytes per pseudostring rather
than four (assuming a length byte or terminator). Also, while this
may not be current today, I remember seeing base-32 notation (with
digits up to V) in active use on the Amiga, and I wouldn't be
completely surprised if it were resurrected at some point. And of
course there's base64 encoding, though that's certainly not meant to
be human-readable...
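A minimal sketch of that base-36 packing trick in C (the helper names are mine, not from any particular text; inputs are assumed to be exactly three characters drawn from [0-9A-Z]):

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* 36^3 = 46656 < 65536 = 2^16, so a 3-character [0-9A-Z] string
   fits in one 16-bit word: two bytes instead of four. */

static int digit36(char c) {
    return (c >= '0' && c <= '9') ? c - '0' : c - 'A' + 10;
}

static uint16_t pack36(const char *s) {      /* s: 3 chars in [0-9A-Z] */
    return (uint16_t)((digit36(s[0]) * 36 + digit36(s[1])) * 36
                      + digit36(s[2]));
}

static void unpack36(uint16_t v, char *out) {   /* out: 4-byte buffer */
    static const char digits[] = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";
    out[2] = digits[v % 36]; v /= 36;
    out[1] = digits[v % 36]; v /= 36;
    out[0] = digits[v % 36];
    out[3] = '\0';
}
```

The packed value is just the string read as a three-digit base-36 numeral, so packing and unpacking round-trip exactly.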
Also, it's extremely common in developing for 32-bit architectures
(particularly Mac OS) to refer to certain 32-bit constants as
pseudostrings of four 8-bit ASCII characters, which is effectively
writing them in base 256. That may be a borderline case, however.
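To make the base-256 reading concrete, here's a hedged sketch (the `fourcc` helper is my own illustration, not an actual Mac OS API; the Toolbox spelled such constants as character literals like 'TEXT'):

```c
#include <assert.h>
#include <stdint.h>

/* A four-character constant such as 'TEXT' is the four ASCII bytes
   packed big-endian into a 32-bit word -- i.e., a four-digit numeral
   in base 256, with each character as one "digit". */
static uint32_t fourcc(const char *s) {     /* s: exactly 4 ASCII chars */
    return ((uint32_t)(uint8_t)s[0] << 24) |
           ((uint32_t)(uint8_t)s[1] << 16) |
           ((uint32_t)(uint8_t)s[2] <<  8) |
            (uint32_t)(uint8_t)s[3];
}
```

For example, 'TEXT' comes out as 0x54455854, since 'T' = 0x54, 'E' = 0x45, and 'X' = 0x58.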
I think we are really drifting off the list topic here.
Best,
-- Marnen Laibow-Koser
This archive was generated by hypermail 2.1.5 : Fri Jun 22 2007 - 11:09:51 CDT