On Wed, 16 Jul 1997, James E. Agenbroad wrote:
> Wednesday, July 16, 1997
> It seems to me that there are topics for standardization of interest to
> both "Internet protocol designers" (or those who get data based on their
> designs) and to "word processing file" folks. I refer to the need to
> render the various writing systems of South, Southeast and Western Asia,
> e.g., Thai, Tamil, Devanagari, Arabic, and Hebrew. While these scripts
> require much less memory than Han-based scripts, the rendering
> processing needed to produce a minimal level of legibility is much
> greater, IMHO. While the
> "character-glyph model" defines the distinction between the two it does
> little to bridge the gap between them. Unicode 2.0 explains many of the
> difficulties rendering these scripts pose, but must each vendor then
> tackle them separately?
No. I guess there are companies that sell their expertise and code in
this area.
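To make concrete how far this goes beyond a one-to-one character-glyph
mapping: in Arabic, each letter is rendered with one of up to four
contextual forms (isolated, initial, medial, final) depending on whether
it connects to its neighbours. Below is a minimal sketch in Python, with
made-up letter names and deliberately simplified joining rules; a real
renderer would drive this from the Unicode joining-type data and the
font's own tables.

    # A sketch of Arabic contextual shaping with simplified rules.
    # Letters are named here, not real code points; "dual" letters
    # join on both sides, "right" letters join only to the letter
    # that precedes them.
    JOINING = {
        "BEH": "dual",
        "SEEN": "dual",
        "LAM": "dual",
        "ALEF": "right",
        "DAL": "right",
    }

    def joins_forward(letter):
        # Can this letter connect to the letter that follows it?
        return JOINING.get(letter) == "dual"

    def joins_backward(letter):
        # Can this letter connect to the letter that precedes it?
        return JOINING.get(letter) in ("dual", "right")

    def shape(word):
        # word: list of letter names in logical order.
        # Returns one contextual form per letter.
        forms = []
        for i, letter in enumerate(word):
            prev = i > 0 and joins_forward(word[i - 1]) and joins_backward(letter)
            nxt = (joins_forward(letter) and i + 1 < len(word)
                   and joins_backward(word[i + 1]))
            if prev and nxt:
                forms.append((letter, "medial"))
            elif prev:
                forms.append((letter, "final"))
            elif nxt:
                forms.append((letter, "initial"))
            else:
                forms.append((letter, "isolated"))
        return forms

    print(shape(["BEH", "ALEF", "BEH"]))

Run on the word "bab" (BEH ALEF BEH), this yields initial, final,
isolated, because ALEF never connects to the letter that follows it.
Devanagari vowel reordering and Thai mark placement add further steps
of the same kind.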
> Is a "unirender" standard worth considering?
I think it is a bad idea to try to standardize typographic practice,
although this has been done in some cases, e.g. in Japan with JIS
X 4105, where the main intention was to improve the quality of
word-processing software.
Typography is a creative field that, in general, can only lose if it
gets standardized. What may well be worthwhile is expanding the
descriptions of rendering practice that are currently in the Unicode
standard. But this is a lot of work, and it needs a lot of expertise.
Regards, Martin.