Re: Fwd: RFC 8369 on Internationalizing IPv6 Using 128-Bit Unicode

From: Mark E. Shoulson via Unicode <unicode_at_unicode.org>
Date: Mon, 2 Apr 2018 21:06:51 -0400

On 04/02/2018 08:52 PM, J Decker via Unicode wrote:
>
>
> On Mon, Apr 2, 2018 at 5:42 PM, Mark E. Shoulson via Unicode
> <unicode_at_unicode.org> wrote:
>
> For unique identifiers for every person, place, thing, etc., consider
> https://en.wikipedia.org/wiki/Universally_unique_identifier
> which are indeed 128 bits.
>
> What makes you think a single "glyph" that represents one of these
> 3.4⏨38 (that is, 2^128) items could possibly be sensibly
> sort of glance (including long stares) from all the others?  I
> have an idea for that: we can show the actual *digits* of some
> encoding of the 128-bit number.  Then just inspecting for a
> different digit will do.
>
>
> There's no restriction that it be one character cell in size...
> rendered glyphs could be thousands of pixels wide...

Yes, but at that point it becomes a huge stretch to call it a
"character".  It becomes more like a "picture" or "graphic" or
something.  And even then, considering the tremendohunormous number of
them we're dealing with, can we really be sure each one can be uniquely
recognized as the one it's *supposed* to be, by everyone?
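
(To make the "just show the digits" idea concrete, here is a rough Python
sketch using the standard uuid module; the two UUID values are made up
purely for illustration.  Two 128-bit identifiers that differ in a single
digit are trivially told apart by comparing their 32 hex digits.)

import uuid

# Two 128-bit identifiers, written out as 32 hex digits each
# (illustrative values only).
a = uuid.UUID("123e4567-e89b-12d3-a456-426614174000")
b = uuid.UUID("123e4567-e89b-12d3-a456-426614174001")

print(a)   # 123e4567-e89b-12d3-a456-426614174000
print(b)   # 123e4567-e89b-12d3-a456-426614174001

# Which digit positions differ?  Here, only the last one.
diffs = [i for i, (x, y) in enumerate(zip(a.hex, b.hex)) if x != y]
print(diffs)   # [31]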

~mark