From: Arcane Jill (arcanejill@ramonsky.com)
Date: Thu Dec 18 2003 - 11:36:27 EST
 > -----Original Message-----
 > From: Eric Scace [mailto:eric@scace.org]
 > Sent: Thursday, December 18, 2003 3:57 PM
 > To: John Cowan; Arcane Jill
 > Cc: unicode@unicode.org
 > Subject: RE: American English translation of character names
 >
 >
 >    The logical "not" glyph got into EBCDIC because the
 > concept was needed in computer programming.
I'm a programmer, and I'm older than most programmers. I'm old enough to 
remember punched paper tape ... but not quite old enough to remember 
punched /cards/. I am interested in this, though. Could you possibly 
clarify /which/ computer language used (the EBCDIC equivalent of) 
U+00AC? I only ask because I'm not aware of one, and I'm intrigued.
 >    In the late 1970s the C programming language was one of
 > the first to use the glyph "!" to mean logical "not"; e.g.,
 > "!=".
"!" is used to mean "logical not" in contexts other than just "not 
equal". As in, for example: *bool b1 = ! b2;* (although there wasn't a 
bool type back then). I remember that BASIC used the keyword "NOT" for 
the same purpose. C also uses "~" as a "bitwise not".
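(In case the distinction isn't familiar, here is a tiny C sketch of the 
three operators in question - purely illustrative, not from any historical 
compiler; the exact value printed for "~" depends on the width of unsigned 
int on your machine:)

    #include <stdio.h>

    int main(void)
    {
        int b = 0;
        printf("!0     = %d\n", !b);       /* logical not: 0 -> 1, nonzero -> 0 */
        printf("1 != 2 = %d\n", 1 != 2);   /* "not equal" comparison, yields 1 (true) */
        printf("~0x0F  = 0x%X\n", ~0x0Fu); /* bitwise not: flips every bit,
                                              e.g. 0xFFFFFFF0 for a 32-bit unsigned */
        return 0;
    }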
So ... let me see if I have understood you correctly, because this is a 
tad confusing (but very interesting). You are saying that ... in the 
days of punched cards ... there was an EBCDIC code whose meaning was 
LOGICAL NOT. So far so good - but how would such a character code have 
been /written/? Was it written like the U+00AC glyph is now? Or did its 
visual appearance vary depending on who was writing it? Or ... did it 
even /have/ a visual appearance at all? I figure that, if it didn't have 
the visual appearance of the U+00AC glyph, then "logical not" would map 
better to the Unicode character U+223C TILDE OPERATOR (also known as "not", 
according to the code charts), which at least /looks/ like the character 
mathematicians use. On the other hand, if it did have the U+00AC appearance, 
then fair enough.
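(For anyone who wants to see the two candidates side by side, here is a 
trivial C snippet - it just emits the UTF-8 byte sequences for the two code 
points, so it assumes a UTF-8 capable terminal:)

    #include <stdio.h>

    int main(void)
    {
        /* UTF-8 encodings of the two code points under discussion */
        printf("U+00AC NOT SIGN:       \xC2\xAC\n");
        printf("U+223C TILDE OPERATOR: \xE2\x88\xBC\n");
        return 0;
    }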
 > etc).  Earlier keyboard languages used a different
 > workaround; e.g., "<>" for "not equal".
Yeah, I always wondered why C chose to deploy ! to mean "not". Weird. 
Maybe they just picked a character at random and said "Ah yes - we'll 
use that one - no-one else seems to be using it for anything"????
Jill