This is the mail archive of the mailing list for the GCC project.

Index Nav: [Date Index] [Subject Index] [Author Index] [Thread Index]
Message Nav: [Date Prev] [Date Next] [Thread Prev] [Thread Next]
Other format: [Raw text]

Re: [PATCH] Add a character size parameter to c_strlen/get_range_strlen

On Fri, 17 Aug 2018, Jeff Law wrote:

> On 08/16/2018 05:01 PM, Joseph Myers wrote:
> > On Thu, 16 Aug 2018, Jeff Law wrote:
> > 
> >> restores previous behavior.  The sprintf bits want to count element
> >> sized chunks, which for wchars is 4 bytes (that count will then be
> > 
> >>    /* Compute the range the argument's length can be in.  */
> >> -  fmtresult slen = get_string_length (arg);
> >> +  int count_by = dir.specifier == 'S' || dir.modifier == FMT_LEN_l ? 4 : 1;
> > 
> > I don't see how a hardcoded 4 is correct here.  Surely you need to examine 
> > wchar_type_node to determine its actual size for this target.
> We did kick this around a little.  IIRC Martin didn't think that it was
> worth handling the 2 byte wchar case.

There's a difference between explicitly not handling it and silently 
passing a wrong value.

> In theory something like WCHAR_TYPE_SIZE / BITS_PER_UNIT probably does
> the trick.   I'm a bit leery of using that though.  We don't use it
> anywhere else within GCC AFAICT.

WCHAR_TYPE_SIZE is wrong because it doesn't account for flag_short_wchar.  
As far as I can see only ada/gcc-interface/targtyps.c uses WCHAR_TYPE_SIZE 
now.  TYPE_PRECISION (wchar_type_node) / BITS_PER_UNIT is what should be 
used.

Joseph S. Myers
