[Bug c/41698] "\uFFFF" converts incorrectly to two-byte character

chasonr at newsguy dot com gcc-bugzilla@gcc.gnu.org
Tue Oct 13 21:05:00 GMT 2009



------- Comment #1 from chasonr at newsguy dot com  2009-10-13 21:04 -------
Created an attachment (id=18796)
 --> (http://gcc.gnu.org/bugzilla/attachment.cgi?id=18796&action=view)
Test case for this bug

This test uses the built-in __CHAR16_TYPE__ so that it demonstrates the bug
even when wchar_t is four bytes wide; as a result it compiles only on GCC 4.4.
For earlier compilers, change __CHAR16_TYPE__ to wchar_t and the test strings
to L"\uFFFF" and L"\U00010000".  When using wchar_t, the bug appears only when
wchar_t is two bytes wide.


-- 


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=41698
