RE: Perception that Unicode is 16-bit (was: Re: Surrogate space i

From: Marco Cimarosti (marco.cimarosti@essetre.it)
Date: Wed Feb 21 2001 - 04:17:51 EST


Peter Constable:
> On 02/20/2001 03:34:28 AM Marco Cimarosti wrote:
> > "Unicode is now a 32-bit character encoding standard,
> > although only about one million codes actually exist,
> > [...]
>
> Well, it's probably a better answer to say that Unicode is a 20.1-bit
> encoding since the direct encoding of characters is the coded [...]

Your explanation is quite correct. This is precisely how I used to start my
endless explanations to those colleagues :-) And they would invariably
interrupt me to ask: "So, how many bits does it have?"
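
For the record, Peter's "20.1 bits" follows from the size of the code space:
U+0000..U+10FFFF gives 0x110000 = 1,114,112 code points, and log2(1,114,112)
is roughly 20.09. Here is a minimal C sketch of that arithmetic (assuming a
C99 compiler for log2()):

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        /* Unicode code points run from U+0000 to U+10FFFF,
           i.e. 0x110000 = 1,114,112 possible values. */
        double bits = log2((double)0x110000);
        printf("bits needed: %.2f\n", bits);  /* prints 20.09 */
        return 0;
    }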

That's why I wanted to simplify even further, saying something like: "It is
32 bits (yes, 32, like 4 bytes, OK?) but, since not all combinations are
used, there are techniques to shrink it down a lot."
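
Those "techniques" are the Unicode encoding forms. As one illustration, here
is a minimal sketch of a UTF-8 encoder in C; the function name utf8_encode
and its error handling (it merely rejects values beyond U+10FFFF, without
excluding surrogates) are my own simplifications, not anything prescribed by
the standard:

    #include <stddef.h>
    #include <stdint.h>

    /* Minimal UTF-8 encoder sketch: writes 1 to 4 bytes into buf
       and returns the byte count (0 for values beyond U+10FFFF). */
    size_t utf8_encode(uint32_t cp, unsigned char buf[4])
    {
        if (cp < 0x80) {                /* up to 7 bits: 1 byte   */
            buf[0] = (unsigned char)cp;
            return 1;
        } else if (cp < 0x800) {        /* up to 11 bits: 2 bytes */
            buf[0] = (unsigned char)(0xC0 | (cp >> 6));
            buf[1] = (unsigned char)(0x80 | (cp & 0x3F));
            return 2;
        } else if (cp < 0x10000) {      /* up to 16 bits: 3 bytes */
            buf[0] = (unsigned char)(0xE0 | (cp >> 12));
            buf[1] = (unsigned char)(0x80 | ((cp >> 6) & 0x3F));
            buf[2] = (unsigned char)(0x80 | (cp & 0x3F));
            return 3;
        } else if (cp < 0x110000) {     /* up to 21 bits: 4 bytes */
            buf[0] = (unsigned char)(0xF0 | (cp >> 18));
            buf[1] = (unsigned char)(0x80 | ((cp >> 12) & 0x3F));
            buf[2] = (unsigned char)(0x80 | ((cp >> 6) & 0x3F));
            buf[3] = (unsigned char)(0x80 | (cp & 0x3F));
            return 4;
        }
        return 0;
    }

For example, utf8_encode(0x20AC, buf) yields the three bytes 0xE2 0x82 0xAC
for the euro sign: a "32-bit" code shrunk down to 3 bytes.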

_ Marco


