This is the mail archive of the gcc-bugs@gcc.gnu.org mailing list for the GCC project.
[Bug other/46677] frontends and tree optimizers use *_TYPE_SIZE
- From: "joseph at codesourcery dot com" <gcc-bugzilla at gcc dot gnu dot org>
- To: gcc-bugs at gcc dot gnu dot org
- Date: Fri, 26 Nov 2010 21:24:14 +0000
- Subject: [Bug other/46677] frontends and tree optimizers use *_TYPE_SIZE
- Auto-submitted: auto-generated
- References: <bug-46677-4@http.gcc.gnu.org/bugzilla/>
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=46677
--- Comment #3 from joseph at codesourcery dot com <joseph at codesourcery dot com> 2010-11-26 21:24:05 UTC ---
On Fri, 26 Nov 2010, amylaar at gcc dot gnu.org wrote:
> The frontends and tree optimizers use the *_TYPE_SIZE and POINTER_SIZE
> target macros.
>
> They should instead use to-be-created data members of targetm.
No, they shouldn't.
* I haven't looked at POINTER_SIZE and ADA_LONG_TYPE_SIZE in detail -
though I'm aware of the comment in cppbuiltin.c about why it uses
POINTER_SIZE (incorrectly divided by BITS_PER_UNIT rather than by
TYPE_PRECISION (char_type_node)) instead of ptr_type_node.
* WCHAR_TYPE_SIZE should go away completely (it's only used for Ada); this
size should be determined from WCHAR_TYPE.
* Modifiable data members of targetm are a bad idea and make LTO-based
devirtualization harder (I'd rather targetm were const for single-target
builds); since these values depend on command-line options, function
members are more appropriate.
* All the macros relating to the sizes of various C types in bits should
be replaced by hooks that are only called to create the associated tree
nodes; elsewhere they should be replaced by TYPE_PRECISION
(integer_type_node) etc. - certainly, it should be fine to replace uses of
the macros in the front ends by using TYPE_PRECISION right now. If the
tree optimizers are using these macros, they probably shouldn't be; to my
mind, it's a bug in a tree optimizer if it depends on details such as what
C "int" happens to be, as opposed to e.g. information about how efficient
computations on particular precisions happen to be. (Thus, existing uses
of TYPE_PRECISION (integer_type_node) in tree optimizers would also be
suspicious.)
* I already said in <http://gcc.gnu.org/ml/gcc/2010-11/msg00340.html> what
I thought hooks for *_TYPE_SIZE should look like; I strongly advise paying
attention to such comments. There might be a case for separating the
standard C integer types from the fixed-point types, say, but I think one
hook for all the standard types is better than separate hooks for each
type. (Rather than each having its own function, many targets might
define that hook to one of a few default versions provided for common
cases, such as integer_type_size_il32, integer_type_size_i32l64 or
integer_type_size_i16l32.)