From: Doug Ewell (doug@ewellic.org)
Date: Thu May 19 2011 - 10:04:51 CDT
Christoph Päper <christoph dot paeper at crissov dot de> wrote:
> Changing people is harder than changing software, in general.
I don't disagree that improvements in software are needed to help people
work with text that includes combining marks. The same is true for
other aspects of Unicode as well.
A new encoding isn't necessary to solve problems with combining
characters or to enforce normalization forms. If the user presses
Delete and the software is smart enough to make the entire a-with-acute
go away, it doesn't matter if that was one code point or two (or more)
in the underlying encoding. An OS or application can always convert
everything to NFD (please, not NFKD) if it likes.
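To illustrate the point (a rough sketch, not production code -- it only
strips trailing combining marks rather than segmenting full UAX #29
grapheme clusters), in Python:

    import unicodedata

    def delete_last_user_character(text):
        # Skip back over trailing combining marks, then drop the base
        # character too, so the whole a-with-acute (etc.) goes away.
        i = len(text)
        while i > 0 and unicodedata.combining(text[i - 1]):
            i -= 1
        return text[:max(i - 1, 0)]

    precomposed = "caf\u00E9"    # 'e-acute' as one code point
    decomposed  = "cafe\u0301"   # 'e' + combining acute accent

    # Canonically equivalent text compares equal once normalized.
    assert unicodedata.normalize("NFD", precomposed) == decomposed

    # Either way, deleting the "last character" removes the whole e-acute.
    assert delete_last_user_character(precomposed) == "caf"
    assert delete_last_user_character(decomposed) == "caf"

Whether the backing store held one code point or two makes no difference
to what the user sees happen.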
If there are problems with typing combining characters, the solution is
to improve the typing experience, not change the encoding.
If there are problems with displaying combining characters, the solution
is to improve fonts and rendering engines, not change the encoding.
The ISO "Principles and Procedures" document should be required reading
for this discussion:
http://std.dkuug.dk/JTC1/SC2/WG2/docs/n3902.pdf
in particular the following passage:
<quote>
Annex G: Formal criteria for coding precomposed characters
G.1 Criteria
... Proposals that meet the negative criteria should use composed
character sequences instead...
Negative:
• If [the proposed character] were to introduce multiple spellings
(encodings) for a script where NO multiple spellings existed
previously...
• If solely intended to overcome short-term deficiency of rendering
technology.
</quote>
--
Doug Ewell | Thornton, Colorado, USA | RFC 5645, 4645, UTN #14
www.ewellic.org | www.facebook.com/doug.ewell | @DougEwell