Stability vs. correcting errors

From: Doug Ewell (dewell@adelphia.net)
Date: Fri Dec 26 2003 - 19:23:42 EST


    Those who still feel it is self-evident that the Unicode Technical
    Committee should correct "obvious" normalization errors, in things like
    compatibility equivalents and combining classes, might want to take a
    look at a new Internet-Draft:

    ftp://ftp.ietf.org/internet-drafts/draft-faltstrom-unicode-synchronisation-00.txt

    in which Patrik Fältström, a major contributor to the Internationalizing
    Domain Names in Applications (IDNA) effort, argues that the six
    normalization corrections already approved, together with the potential
    for more corrections, may create security and stability problems, and
    includes the following passage:

    > 3.1 Message to the Unicode Consortium
    >
    > The IETF strongly encourages the Unicode Consortium to keep the size
    > and rate of change of the correction list to an absolute minimum, as
    > it will be impossible for implementations (applications) to know
    > what version of the normalization tables which are in use. This is
    > because, in practice, the tables in many cases will be part of the
    > operating system. The end user will expect the same normalization
    > rules to be used in all applications in her environment.
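
    To make the version-skew problem concrete, here is an illustration of
    my own, not taken from the draft. If I have Corrigendum #3 right,
    U+F951 is one of the six entries in NormalizationCorrections.txt: its
    canonical decomposition was corrected from U+96FB to U+964B. A short
    Python sketch shows that the normalized form an application sees
    depends entirely on which data tables its runtime bundles:

        import unicodedata

        # U+F951 appears in NormalizationCorrections.txt; its canonical
        # decomposition was corrected from U+96FB to U+964B.  The output
        # below therefore depends on the Unicode data tables bundled with
        # this particular Python runtime.
        s = "\uF951"
        print("table version:", unicodedata.unidata_version)
        print("NFC:", [hex(ord(c)) for c in unicodedata.normalize("NFC", s)])
        print("NFD:", [hex(ord(c)) for c in unicodedata.normalize("NFD", s)])

    Run that under two runtimes that bundle different table versions and
    the printed forms will differ, which is exactly the disagreement the
    draft is worried about.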

    Obviously the issue of stability vs. "correctness" is not as
    black-and-white as some may think.

    -Doug Ewell
     Fullerton, California
     http://users.adelphia.net/~dewell/


