[Bug libstdc++/70893] codecvt incorrectly decodes UTF-16be
- From: "redi at gcc dot gnu.org" <gcc-bugzilla at gcc dot gnu dot org>
- To: gcc-bugs at gcc dot gnu dot org
- Date: Wed, 04 May 2016 08:59:01 +0000
- Subject: [Bug libstdc++/70893] codecvt incorrectly decodes UTF-16be
- Auto-submitted: auto-generated
- References: <bug-70893-4 at http dot gcc dot gnu dot org/bugzilla/>
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=70893
--- Comment #7 from Jonathan Wakely <redi at gcc dot gnu.org> ---
(In reply to comment #5)
> It makes no sense, because you can explicitly specify little_endian in the
> template parameters, but not big_endian.
And the standard says that parameter is ignored for codecvt_utf8 and
codecvt_utf8_utf16.
But as I said, the spec is a mess.
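For context, a minimal sketch (not part of the original mail) of what the standard does specify for codecvt_utf16: big-endian is the default byte order when the Mode argument is omitted, so an explicit big_endian flag would be redundant, and std::little_endian is the flag that flips it. This assumes C++11's <codecvt>, which is deprecated since C++17:

```cpp
#include <cassert>
#include <codecvt>
#include <locale>
#include <string>

int main()
{
    // "A" encoded as UTF-16BE is the bytes 00 41. codecvt_utf16's
    // default mode (no little_endian flag) means big-endian input,
    // so no explicit big_endian flag is needed.
    std::string be{'\x00', '\x41'};
    std::wstring_convert<std::codecvt_utf16<char16_t>, char16_t> be_conv;
    assert(be_conv.from_bytes(be) == u"A");

    // Passing std::little_endian selects the other byte order instead.
    std::wstring_convert<
        std::codecvt_utf16<char16_t, 0x10ffff, std::little_endian>,
        char16_t> le_conv;
    assert(le_conv.from_bytes(std::string{'\x41', '\x00'}) == u"A");
}
```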
Clang's libc++ gives exactly the same result for your testcase as libstdc++
does, so its authors interpreted the spec the same way I did.
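The testcase itself is not quoted in this message; as a hypothetical stand-in, a conversion along these lines can be compiled against both libraries (g++ for libstdc++, clang++ -stdlib=libc++ for libc++) to compare their output:

```cpp
// Hypothetical stand-in for the unquoted testcase: decode UTF-16
// bytes that start with a big-endian BOM and let consume_header
// pick the byte order, then print the resulting code units.
#include <codecvt>
#include <cstdio>
#include <locale>
#include <string>

int main()
{
    // U+0041 U+0416 as UTF-16BE, preceded by the BOM FE FF.
    std::string bytes{'\xfe', '\xff', '\x00', '\x41', '\x04', '\x16'};
    std::wstring_convert<
        std::codecvt_utf16<char16_t, 0x10ffff, std::consume_header>,
        char16_t> conv;
    std::u16string out = conv.from_bytes(bytes);
    for (char16_t c : out)
        std::printf("U+%04X ", unsigned(c));
    std::printf("\n");  // expected on both implementations: U+0041 U+0416
}
```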