[Openmcl-devel] how many angels can dance on a unicode character?
hamlink at comcast.net
Wed Apr 25 01:33:29 UTC 2007
I don't understand your statement about not being sure why you care;
maybe I missed something.
I thought this thread started (more or less) because someone wanted
UTF-16 as Lisp's string representation, with the rationale that talking
to the outside world (many libraries of which use UTF-16) would be
easier. I think Gail was suggesting that arrays of (unsigned-byte 16)
would meet most of the needs of that application without imposing
superfluous requirements on (for example) Lisp symbol names and doc
strings.
Since it appears that, at least out of the box, no appreciable space
savings would materialize from using UTF-16 instead of UTF-32 as the
base string representation in OpenMCL, what other reason for using
UTF-16 is there? Streams usually have their own element type... I can't
really think of many cases where the internal representation of a
string makes much difference other than for FFI (and then largely
because generating a copy is more expensive, in either direction, and
because, without the extra conversions, code that uses strings destined
for foreign libraries can't use schar, etc.).
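For what it's worth, here is a minimal sketch of the (unsigned-byte 16)
idea in portable Common Lisp: a helper (STRING->UTF-16 is a hypothetical
name, not anything in OpenMCL) that encodes a CL:STRING into a
(SIMPLE-ARRAY (UNSIGNED-BYTE 16)) of UTF-16 code units, which could then
be handed to a foreign library without ever being a Lisp string of that
element type:

```lisp
;; Hypothetical helper: encode a string as a vector of UTF-16 code
;; units (no byte-order mark), assuming CHAR-CODE returns the Unicode
;; code point, as it does in OpenMCL.
(defun string->utf-16 (string)
  (let ((units (make-array 0 :element-type '(unsigned-byte 16)
                             :adjustable t :fill-pointer 0)))
    (loop for ch across string
          for cp = (char-code ch)
          do (if (< cp #x10000)
                 ;; BMP characters are a single code unit
                 (vector-push-extend cp units)
                 ;; code points above the BMP become a surrogate pair
                 (let ((v (- cp #x10000)))
                   (vector-push-extend (+ #xD800 (ldb (byte 10 10) v)) units)
                   (vector-push-extend (+ #xDC00 (ldb (byte 10 0) v)) units))))
    (coerce units '(simple-array (unsigned-byte 16) (*)))))

;; (string->utf-16 "abc") => #(97 98 99)
```

Such an array has exactly the layout most UTF-16 libraries expect, but
it carries none of the obligations (schar, symbol names, doc strings)
that come with making it a CL:CHARACTER array.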
On Apr 24, 2007, at 9:02 AM, Takehiko Abe wrote:
> Gail Zacharias wrote:
>> At the risk of dragging this discussion out further than it needs to
>> go... I'm curious about one thing: if the main thing one is going to
>> do with UTF16 strings is pass them off to library functions for
>> proper interpretation and processing, why would one care whether they
>> have array element type of CL:CHARACTER? I mean, wouldn't arrays of
>> (UNSIGNED-BYTE 16) work just as well?
> You are right. I am not sure why I care though. I've never
> thought of the possibility before.