In a message dated 2001-09-24 11:16:29 Pacific Daylight Time,
cbrown@xnetinc.com writes:
> For many applications UTF-16 is a good compromise between a large code size
> and processing efficiency. As this industry changes, the decision points
> change. Then there is always the great argument that many applications that
> were written for UCS-2 are much easier to convert to UTF-16.
This last argument is especially compelling when you consider the large
number of people (programmers and others) who still think "Unicode" means
"16-bit" and whose entire concept of supporting Unicode is using WCHARs and
letting library string functions do the conversions automagically.
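For instance, UCS-2-era code tends to assume one WCHAR per character, which
breaks on supplementary characters; a proper UTF-16 loop has to recognize
surrogate pairs. A minimal sketch (assuming a 16-bit wchar_t, as on Windows;
the function name is just for illustration):

    #include <stddef.h>
    #include <wchar.h>

    /* Count Unicode code points in a UTF-16 string.
     * UCS-2-style code would just call wcslen(); real UTF-16 has to
     * treat a high/low surrogate pair as a single character. */
    size_t utf16_codepoint_count(const wchar_t *s)
    {
        size_t count = 0;
        while (*s) {
            if (*s >= 0xD800 && *s <= 0xDBFF &&      /* high surrogate */
                s[1] >= 0xDC00 && s[1] <= 0xDFFF) {  /* low surrogate  */
                s += 2;   /* one supplementary character */
            } else {
                s += 1;   /* one BMP character (or unpaired surrogate) */
            }
            count++;
        }
        return count;
    }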
-Doug Ewell
Fullerton, California