[Bug other/46633] New: [meta-bug] frontends use BITS_PER_UNIT when they mean TYPE_PRECISION (char_type_node)

amylaar at gcc dot gnu.org <gcc-bugzilla@gcc.gnu.org>
Wed Nov 24 05:59:00 GMT 2010


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=46633

           Summary: [meta-bug] frontends use BITS_PER_UNIT when they mean
                    TYPE_PRECISION (char_type_node)
           Product: gcc
           Version: 4.6.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: other
        AssignedTo: unassigned@gcc.gnu.org
        ReportedBy: amylaar@gcc.gnu.org
            Blocks: 46489


In some places the front ends use BITS_PER_UNIT when they mean
TYPE_PRECISION (char_type_node).
E.g. consider this code from c-family/c-common.c:fix_string_type:

  else if (TREE_TYPE (value) == char16_array_type_node)
    {
      nchars = length / (TYPE_PRECISION (char16_type_node) / BITS_PER_UNIT);
      e_type = char16_type_node;

On a bit-addressed architecture, BITS_PER_UNIT would be 1 while TYPE_PRECISION
(char_type_node) would most likely still be 8, so the divisor above evaluates
to 16 rather than the intended 2 (the number of chars per char16_t), and
nchars comes out wrong.
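
A minimal sketch of the substitution this report suggests for the excerpt
above (untested, and assuming TYPE_PRECISION (char16_type_node) is an exact
multiple of TYPE_PRECISION (char_type_node)):

  else if (TREE_TYPE (value) == char16_array_type_node)
    {
      /* Count elements in char units rather than BITS_PER_UNIT units,
         so the result stays correct when BITS_PER_UNIT differs from
         TYPE_PRECISION (char_type_node), e.g. on a bit-addressed target.  */
      nchars = length / (TYPE_PRECISION (char16_type_node)
                         / TYPE_PRECISION (char_type_node));
      e_type = char16_type_node;
    }

Presumably the other arms of fix_string_type (char32_t, wchar_t) and the
other front-end uses tracked by this meta-bug would need the analogous
change.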


