RE: UTF-17

From: Edward Cherlin (Edward.Cherlin.SY.67@aya.yale.edu)
Date: Fri Jun 22 2001 - 23:52:45 EDT


At 05:12 PM 6/22/2001, Carl W. Brown wrote:
>Ken,
...
>Another approach that would be IBM 1401 friendly is to convert the Unicode
>code point into a decimal number and then convert each decimal digit into a
>base 5 and a base 2 number. We can call it UTF-5.2.
>
>The only use that I can see is for systems with 6-bit bytes. This would
>also work for the 1401, except that the max storage is 16,000 bytes. It
>could handle up to 2,000 characters of data minus the space needed for
>code. If this is what it is for, don't expect me to pull out my Autocoder
>manuals and start coding for it.
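The digit-splitting scheme Carl describes is essentially biquinary notation: each decimal digit d is written as 5*b + q, with a base-2 part b (0 or 1) and a base-5 part q (0..4). A minimal sketch in Python, with the function name `utf_5_2` being my own invention for this hypothetical encoding:

```python
def utf_5_2(codepoint):
    """Hypothetical 'UTF-5.2': write the code point as decimal digits,
    then split each digit into a (base-2, base-5) pair, biquinary-style.
    Each digit d becomes (d // 5, d % 5)."""
    return [(int(d) // 5, int(d) % 5) for d in str(codepoint)]

# U+0041 'A' is code point 65: digits 6 and 5 become (1, 1) and (1, 0).
print(utf_5_2(0x41))
```

Recovering the code point is just the reverse: rebuild each digit as 5*b + q and read the digits back as a decimal number.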

Oh, that takes me right back... My father, who started on vacuum-tube
computers, told me about Autocoder around the time he started teaching me
Fortran [shudder]. We were both extremely happy when he discovered APL.

>Carl

Anyway, why not? Then we can run it on the 1401 emulation for the 7090 that
we're emulating on the 360 emulator on my PC.

Yes, folks, that's how to make programs run slower on modern hardware than
on the original.[TM]



This archive was generated by hypermail 2.1.2 : Fri Jul 06 2001 - 00:17:18 EDT