This is the mail archive of the gcc-patches@gcc.gnu.org mailing list for the GCC project.
Re: [PATCH] utf-16 and utf-32 support in C and C++
- From: "Andrew Pinski" <pinskia at gmail dot com>
- To: "Kris Van Hees" <kris dot van dot hees at oracle dot com>
- Cc: gcc-patches at gcc dot gnu dot org
- Date: Thu, 13 Mar 2008 12:45:36 -0700
- Subject: Re: [PATCH] utf-16 and utf-32 support in C and C++
- References: <20080313193208.GE19427@oracle.com>
On Thu, Mar 13, 2008 at 12:32 PM, Kris Van Hees
<kris.van.hees@oracle.com> wrote:
> This patch provides an implementation for support of UTF-16 and UTF-32
> character data types in C and C++, based on the ISO/IEC draft technical
> report for C (ISO/IEC JTC1 SC22 WG14 N1040) and the proposal for C++
> (ISO/IEC JTC1 SC22 WG21 N2249). Neither proposal defines a specific
> encoding for UTF-16. This implementation uses the target endianness
> to determine whether UTF-16BE or UTF-16LE will be used.
I have a couple of questions about the ABI with this patch. How do
char16_t and char32_t get mangled in C++ code, and is this documented
anywhere? How does promotion work with these types in C and C++, and
is this tested? I remember reading the technical draft for C, and it
mentioned that the size does not have to be exactly 16 (or 32) bits,
so it might be best if you added documentation about this extension to
the extensions page.
I don't see any of the testcases attached.
Thanks,
Andrew Pinski