From: Arcane Jill (arcanejill@ramonsky.com)
Date: Tue Apr 05 2005 - 00:59:55 CST
Well, I'm a software engineer, and I would just like to say that I agree with
everything Ken has said on this thread.
In particular, I have played around with writing code generators of the ilk
Ken mentioned in another post on this thread, and I /never/ assumed that
all (or indeed, any) generated codepoints would be 16 bits wide. That would be
a really dumb thing to do. Why is anyone even mentioning this as a possibility?
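(For the record, a minimal sketch of the point being made here — this is not Jill's actual generator, just an illustration: any code point at or above U+10000 cannot fit in a single 16-bit unit, and in UTF-16 becomes a surrogate pair.)

```python
def utf16_units(codepoint: int) -> list[int]:
    """Return the UTF-16 code unit(s) for a single Unicode code point."""
    if codepoint < 0 or codepoint > 0x10FFFF:
        raise ValueError("not a Unicode code point")
    if codepoint <= 0xFFFF:
        # BMP character: fits in one 16-bit code unit
        return [codepoint]
    # Supplementary character: split the 20 bits above U+10000
    # across a high and low surrogate
    v = codepoint - 0x10000
    return [0xD800 | (v >> 10),   # high surrogate
            0xDC00 | (v & 0x3FF)] # low surrogate

# U+0041 LATIN CAPITAL LETTER A: one unit
assert utf16_units(0x0041) == [0x0041]
# U+1D11E MUSICAL SYMBOL G CLEF: two units, not one 16-bit value
assert utf16_units(0x1D11E) == [0xD834, 0xDD1E]
```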
Unicode should not pander to bad programmers. Just sack the lot of them, and
then employ people who do things properly!
Jill
-----Original Message-----
From: unicode-bounce@unicode.org [mailto:unicode-bounce@unicode.org]On
Behalf Of Kenneth Whistler
Sent: 04 April 2005 23:05
To: peterkirk@qaya.org
Cc: unicode@unicode.org; kenw@sybase.com
Subject: Re: Does Unicode 4.1 change NFC?
Anything more or less than that is just bad software engineering.
--Ken
This archive was generated by hypermail 2.1.5 : Tue Apr 05 2005 - 01:01:38 CST