Re: script or block detection needed for Unicode fonts

From: Michael Everson (everson@evertype.com)
Date: Fri Sep 27 2002 - 09:57:30 EDT

    At 04:34 +0000 2002-09-27, jameskass@att.net wrote:

    >Some apps won't display a glyph from a specified font if its corresponding
    >Unicode Ranges Supported bit in the OS/2 table isn't set. So, font
    >developers producing fonts intended to be used with such apps set the
    >corresponding bit even if only one glyph from the entire range is
    >present in the font.

    Good heavens.
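
    (For the curious, the bit James describes can be inspected with the
    fontTools Python library. A minimal sketch, assuming a hypothetical
    font file, and using bit 7 of ulUnicodeRange1, which the OS/2
    specification assigns to Greek and Coptic, U+0370..U+03FF:)

        from fontTools.ttLib import TTFont

        GREEK_BIT = 7                        # OS/2 bit for Greek and Coptic
        GREEK_RANGE = range(0x0370, 0x0400)

        font = TTFont("SomeFont.ttf")        # hypothetical font file
        os2 = font["OS/2"]

        # The four 32-bit ulUnicodeRange fields together hold the 128
        # "Unicode Ranges Supported" bits.
        bit_set = bool(os2.ulUnicodeRange1 & (1 << GREEK_BIT))

        # Compare the declared bit against the font's actual coverage.
        cmap = font.getBestCmap()
        mapped = sum(1 for cp in GREEK_RANGE if cp in cmap)

        print(f"Greek bit set: {bit_set}; code points mapped: {mapped}")

    (An app that gates on the bit will refuse to draw even a glyph the
    cmap does map, which is why developers set the bit for a range that
    contains only one glyph.)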

    In Mac OS X, if a glyph has a Unicode code point attached to it, it
    will display in any Unicode-capable application. I don't understand
    why a particular bit has to be set in some table. Why can't the OS
    just accept what's in the font?
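
    (What Mac OS X effectively does, on this account, is a plain cmap
    lookup. A sketch under the same assumptions as above:)

        from fontTools.ttLib import TTFont

        def has_glyph_for(font_path, char):
            """True if the font's cmap maps the character to a glyph,
            regardless of what the OS/2 range bits claim."""
            cmap = TTFont(font_path).getBestCmap()
            return ord(char) in cmap

        print(has_glyph_for("SomeFont.ttf", "\u0370"))  # hypothetical path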

    -- 
    Michael Everson * * Everson Typography *  * http://www.evertype.com
    48B Gleann na Carraige; Cill Fhionntain; Baile Átha Cliath 13; Éire
    Telephone +353 86 807 9169 * * Fax +353 1 832 2189 (by arrangement)
    

