Re: UTF8 vs AL32UTF8

From: Antoine Leca (Antoine.Leca@renault.fr)
Date: Mon Jun 11 2001 - 13:05:57 EDT


Jianping Yang wrote:
>
> > Suppose you build your Unicode database as UTF8. You start using the
> > data for a web application. What happens when you send UTF-8s data to a web
> > browser? It will work most of the time but will give you funny results from
> > time to time. This could create a difficult bug for people to find.
>
> In this case, if you want to insert and retrieve the string in UTF-8 encoded
> form, you can set your NLS_LANG to AL32UTF8 on the client side. Oracle's
> architecture will provide character set conversion between the server (UTF8)
> and the client (AL32UTF8), so you can treat the server just as a black box.
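For readers unfamiliar with the two charsets being converted between above: Oracle's "UTF8" charset is what is now called CESU-8, where a supplementary character is stored as two UTF-8-style 3-byte sequences (one per UTF-16 surrogate), while AL32UTF8 is standard UTF-8 with a single 4-byte sequence. A minimal sketch of the difference (the encoder here is an illustration written for this post, not an Oracle API):

```python
# Illustrative comparison of AL32UTF8 (standard UTF-8) and Oracle's
# "UTF8" charset (CESU-8) for a character outside the BMP.

def cesu8_encode(s: str) -> bytes:
    """Encode a string as CESU-8: BMP characters as normal UTF-8,
    supplementary characters as two 3-byte encoded UTF-16 surrogates."""
    out = bytearray()
    for ch in s:
        cp = ord(ch)
        if cp < 0x10000:
            out += ch.encode("utf-8")
        else:
            cp -= 0x10000
            hi = 0xD800 + (cp >> 10)   # high (leading) surrogate
            lo = 0xDC00 + (cp & 0x3FF) # low (trailing) surrogate
            for sur in (hi, lo):
                # apply the 3-byte UTF-8 bit pattern to each surrogate
                out.append(0xE0 | (sur >> 12))
                out.append(0x80 | ((sur >> 6) & 0x3F))
                out.append(0x80 | (sur & 0x3F))
    return bytes(out)

ch = "\U00010400"  # DESERET CAPITAL LETTER LONG I, a supplementary character
print(ch.encode("utf-8").hex())  # f0909080     -- 4 bytes, AL32UTF8 style
print(cesu8_encode(ch).hex())    # eda081edb080 -- 6 bytes, UTF8/CESU-8 style
```

The "funny results from time to time" in the quoted scenario are exactly these characters: everything in the BMP encodes identically in both charsets, so the mismatch only surfaces when supplementary characters appear.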

And that has a (noticeable) performance cost, doesn't it? And you expect DBAs
or developers to behave this way, that is, to bite the bullet and accept the
performance hit, on top of maintaining non-obvious configuration? (Because,
as a new maintainer, one of the first things I might do when I inherit a
database to administer may very well be to revert the change and use UTF8
for my UTF-8 data, instead of the very bizarre AL32UTF8...)
I really do not believe UTF-8s is *the* solution to your problem...
What you should rather ask for is a change to the definition of UTF-8, to
match yours.

All of this reminds me of a hard debate on a different list, about a similar
issue... ;-)

Antoine



This archive was generated by hypermail 2.1.2 : Fri Jul 06 2001 - 00:17:18 EDT