- From: "joseph at codesourcery dot com" <gcc-bugzilla at gcc dot gnu dot org>
- To: gcc-bugs at gcc dot gnu dot org
- Date: Wed, 24 Nov 2010 15:13:45 +0000
- Subject: [Bug other/46633] [meta-bug] frontends use BITS_PER_UNIT when they mean TYPE_PRECISION (char_type_node)
- Auto-submitted: auto-generated
- References: <bug-46633-4@http.gcc.gnu.org/bugzilla/>
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=46633
--- Comment #1 from joseph at codesourcery dot com <joseph at codesourcery dot com> 2010-11-24 15:13:38 UTC ---
On Wed, 24 Nov 2010, amylaar at gcc dot gnu.org wrote:
> E.g. consider this code from c-family/c-common.c:fix_string_type:
>
>   else if (TREE_TYPE (value) == char16_array_type_node)
>     {
>       nchars = length / (TYPE_PRECISION (char16_type_node) / BITS_PER_UNIT);
>       e_type = char16_type_node;
>     }
>
> On a bit-addressed architecture, you would have BITS_PER_UNIT == 1, but
> probably TYPE_PRECISION (char_type_node) == 8.
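To put concrete numbers on the quoted concern, here is an
illustrative worked example (both the target and the literal are
hypothetical; the figures are not from the bug report):

    /* Assume a bit-addressed target with BITS_PER_UNIT == 1,
       TYPE_PRECISION (char_type_node) == 8 and
       TYPE_PRECISION (char16_type_node) == 16, plus a four-character
       char16_t literal.  If TREE_STRING_LENGTH counted target chars,
       length == 8 and the quoted expression would compute
           nchars = 8 / (16 / 1) == 0   (truncated, plainly wrong)
       whereas if TREE_STRING_LENGTH counts units of BITS_PER_UNIT,
       length == 64 and the same expression gives
           nchars = 64 / (16 / 1) == 4  (correct).
       Which unit TREE_STRING_LENGTH uses therefore determines where
       the bug lies.  */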
String representations have their own problems when either
BITS_PER_UNIT or TYPE_PRECISION (char_type_node) doesn't match the
host char.  However, my inclination is that TREE_STRING_LENGTH counts
in units of BITS_PER_UNIT, which means the logic quoted above is
correct and the real problem is that the "char" case should involve a
division by (TYPE_PRECISION (char_type_node) / BITS_PER_UNIT) as
well.  <http://gcc.gnu.org/ml/gcc/2003-06/msg01159.html> gives some
of my thoughts on these string-representation issues.
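Under that interpretation, the fix would be to scale the "char" arm
of fix_string_type in the same way.  A minimal sketch of what that
arm might look like (the condition and variable names follow the
snippet quoted above; this is an untested illustration, not the
actual patch):

    if (TREE_TYPE (value) == char_array_type_node)
      {
        /* Divide the length, counted in units of BITS_PER_UNIT, by
           the number of such units per character, exactly as the
           char16_t case above does, instead of using the raw length
           as the character count.  */
        nchars = length / (TYPE_PRECISION (char_type_node) / BITS_PER_UNIT);
        e_type = char_type_node;
      }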