From: John H. Jenkins (jenkins@apple.com)
Date: Sun Sep 29 2002 - 10:12:45 EDT
On Saturday, September 28, 2002, at 03:19 PM, David Starner wrote:
> On Sat, Sep 28, 2002 at 01:19:58PM -0700, Murray Sargent wrote:
>> Michael Everson said:
>>> I don't understand why a particular bit has to be set in
>>> some table. Why can't the OS just accept what's in the font?
>>
>> The main reason is performance. If an application has to check the
>> font cmap for every character in a file, it slows down reading the
>> file.
>
> Try, for example, opening a file for which you have no font coverage in
> Mozilla on Linux. It will open every font on the system looking for the
> missing characters, and it will take quite a while, accompanied by much
> disk thrashing, only to find that they aren't there.
>
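For concreteness, the per-character probing being described might look
something like the sketch below (Python; Font, has_glyph, font_for_char,
and layout_file are hypothetical stand-ins, not any real OS or toolkit
API):

    from typing import Iterable, Optional

    class Font:
        def __init__(self, name: str, cmap: set[int]):
            self.name = name
            self.cmap = cmap  # code points this font's cmap covers

        def has_glyph(self, ch: str) -> bool:
            return ord(ch) in self.cmap

    def font_for_char(ch: str, fonts: Iterable[Font]) -> Optional[Font]:
        # Worst case: a character no installed font covers forces a
        # full pass over every font -- the disk-thrashing case.
        for font in fonts:
            if font.has_glyph(ch):
                return font
        return None

    def layout_file(text: str, fonts: list[Font]) -> None:
        for ch in text:
            font = font_for_char(ch, fonts)  # O(number of fonts) per char
            ...  # draw ch with font, or a missing-glyph box if font is None

With no coverage anywhere, every character pays the full cost of probing
every installed font's cmap, which is exactly the slow case described
above.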
This per-character checking just seems wildly inefficient to me, but then
I'm coming from an OS where it isn't done. The app doesn't keep track of
whether a particular font can draw a particular character; that's handled
at display time. If a particular font doesn't cover a particular
character, the system invokes a fallback mechanism and caches the
necessary data. I really don't see why an application needs to check
every character as it reads in a file to make sure it can be drawn with
the current font.
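Concretely, a display-time fallback with a cache along these lines might
be sketched as below (again hypothetical names, reusing the Font class
from the earlier sketch; the real machinery is of course more involved):

    from typing import Optional

    class GlyphFallback:
        def __init__(self, fallback_fonts: list[Font]):
            self.fallback_fonts = fallback_fonts
            self.cache: dict[int, Optional[Font]] = {}

        def resolve(self, ch: str, preferred: Font) -> Optional[Font]:
            if preferred.has_glyph(ch):
                return preferred  # common case: no fallback needed
            cp = ord(ch)
            if cp not in self.cache:  # search only on the first miss
                self.cache[cp] = next(
                    (f for f in self.fallback_fonts if f.has_glyph(ch)),
                    None,  # cache the failure too, so it's paid only once
                )
            return self.cache[cp]

    def draw(text: str, current: Font, fallback: GlyphFallback) -> None:
        for ch in text:
            chosen = fallback.resolve(ch, current)
            ...  # render ch with chosen, or a missing-glyph box if None

The file-reading path never touches font data at all; the first draw of an
uncovered code point pays for one fallback search, and the cached answer,
even a failure, makes every later draw cheap.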
==========
John H. Jenkins
jenkins@apple.com
jhjenkins@mac.com
http://www.tejat.net/