This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.
Re: Compile performance of Linux kernels in mainline gcc
On Oct 30, 2004, at 10:11 AM, Andi Kleen wrote:

> On Sat, Oct 30, 2004 at 10:09:29AM -0400, Andrew Pinski wrote:
>> On Oct 30, 2004, at 10:07 AM, Andi Kleen wrote:
>>> On Sat, Oct 30, 2004 at 09:48:53AM -0400, Andrew Pinski wrote:
>>>> Yes, the problem is that his terminal does not support UTF-8 while his
>>>> LC_* variables are set to UTF-8: a non-bug.
>>> Huh?? While that's possible, why does gcc need UTF-8 to print standard C
>>> identifiers?
>> It is printing quotes, not the identifier itself.
> Quotes? You mean >>"<< ?? That is 7-bit ASCII too.
>
> -Andi

On most planets, yes.

However, in the UTF-8 world, we apparently print nice quotes, not >>"<<.
(I'd put them here, but they would obviously display as garbage to you,
as we've determined ;P)

I discovered this running the testsuite once without LANG=C, and saw
tons of failures because the quotes looked different from the strings we
were expecting.