From: D. Starner (shalesller@writeme.com)
Date: Thu Feb 03 2005 - 17:26:15 CST
"Hans Aberg" writes:
> That seems to be a problem with Unicode: by wanting to do too much, one
> will provide norms that will merely be disobeyed. This is a general problem
> with standards, not only Unicode. Therefore, quite a few standards will in
> effect never be used.
In many cases, so what? If you want to sort your words a different way, then
go for it, but it's nice to have a standard way of doing it.
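(To make that concrete, a quick sketch, assuming Python and a system with the
de_DE.UTF-8 locale installed; the word list is just mine:)

import locale

words = ["zebra", "\u00c4pfel", "apple"]    # U+00C4 sorts after "z" by code point

# Raw code-point order puts "Äpfel" last, after "zebra".
print(sorted(words))                        # ['apple', 'zebra', 'Äpfel']

# The locale's standard collation treats Ä like A.
locale.setlocale(locale.LC_COLLATE, "de_DE.UTF-8")
print(sorted(words, key=locale.strxfrm))    # ['Äpfel', 'apple', 'zebra']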
> I think that Unicode should focus on providing the character set, the
> character numbering, and in some cases, rules for combined characters. If
> the encoding issue had been handled correctly, it would have been
> completely independent of these issues.
>[...]
> I can think of more than one character set, going in different directions
> relative to Unicode: one that is optimized by having as few characters as
> possible. Another, going the opposite direction, might be more ample in the
> set of characters, perhaps having one for each language-locality combination
> that is unique. I do not think there is one set that is the right one; it
> depends on what design objectives one has.
Have you looked at the PC market? x86 makes up the large majority of the
market; PowerPC takes up the rest. I would be surprised if one in ten thousand
PCs used a different CPU. x86 is simply a lousy chip, and everyone knows it.
It's always been a lousy chip. But compatibility is more important than not
sucking.
You can effectively produce both of your effects in Unicode, one by
"normalizing" the text and merging the characters you want to treat the same,
and the other by using Plane 14 language tags and the PUA. But one standard
with problems is much better than dealing with multiple ones. If Everson and
Whistler and
the rest of Unicode and ISO 10646 could, they'd toss out every precomposed
character in a heartbeat. But the loss in compatibility is such that it's
just not worth it.
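(A rough sketch of both directions, again in Python; the sample strings and
the tag_chars helper are mine, purely illustrative:)

import unicodedata

# Fewer characters: NFD decomposition turns the precomposed U+00E9 into
# 'e' + U+0301, so both spellings of "café" become the same string.
precomposed = "caf\u00e9"
decomposed = "cafe\u0301"
assert unicodedata.normalize("NFD", precomposed) == \
       unicodedata.normalize("NFD", decomposed)

# More characters: Plane 14 tag characters spell out an invisible language
# tag, and the PUA (U+E000..U+F8FF) is open for private assignments.
LANGUAGE_TAG = "\U000E0001"                 # the tag introducer
def tag_chars(tag):
    # shift each ASCII character into the U+E0020..U+E007E tag range
    return "".join(chr(0xE0000 + ord(c)) for c in tag)

tagged = LANGUAGE_TAG + tag_chars("fr") + precomposed
print([hex(ord(c)) for c in tagged])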
> It
> is, in part, a question of how much traditional typesetting and rendering
> one should honor. With modern computer tools, it is perhaps best to find
> rendering techniques adapted to that medium.
Toss out other people's traditional typesetting because it's too hard
to get it right. Of course, every iota of Western typesetting must be
replicated exactly.
The whole point of TeX is that Donald Knuth wasn't willing to accept that
answer; he was tired of new-style typography done just because it was easier.
If you're going to toss out traditional typesetting, let's start by setting
Latin with monospace characters, and line-wrap like the Chinese and Japanese
do, by just wrapping when you hit the end of the line.
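(In code, that rule is about one line; a toy sketch, mine, with none of TeX's
paragraph-at-a-time optimization:)

def hard_wrap(text, width=40):
    # break after exactly `width` cells, anywhere, even mid-word --
    # just wrap when you hit the end of the line
    return [text[i:i + width] for i in range(0, len(text), width)]

for line in hard_wrap("Setting Latin text this way splits words without hyphenation.", 20):
    print(line)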
> In the LaTeX list, some folks wanted to standardize mathematical notation. I
> had to explain that mathematics has a set of conventions that mathematicians
> use, but there is no real standard. Further, if somebody attempted to
> standardize it, it would merely be ignored, because what dictates
> mathematical notation is a set of local traditions and the attempt to bring
> out the mathematical contents. Other mathematicians have independently
> remarked on the same thing. So this gives one example where attempting to
> standardize is not so wise.
No, this is an example where trying to standardize is opposed. There's a
difference. This is an example where the standards mostly exist, but there's
enough gratuitous differences and stubborn people to make it more confusing
than it needs to be.