More C type errors by default for GCC 14
Thu May 11 19:25:53 GMT 2023
Eli Zaretskii <email@example.com> writes:
>> Cc: Jonathan Wakely <firstname.lastname@example.org>, email@example.com
>> Date: Thu, 11 May 2023 10:44:47 +0200
>> From: Arsen Arsenović via Gcc <firstname.lastname@example.org>
>> the current default of accepting this code in C is harmful to those
>> who are writing new code, or are learning C.
> People who learn C should be advised to turn on all the warnings, and
> should be educated not to ignore any warnings.  So this is a red
> herring.
Indeed they should be - but warning vs. error holds significance.  A
beginner is much less likely to be writing clever code that allegedly
uses these features properly than to be building new code, having simply
made an error that they do not want and will suffer through later.
>> This seems like a good route to me - it facilitates both veterans
>> maintaining code and beginners just learning how to write C.
> No, it prefers beginners (which already have the warnings, unless they
> deliberately turn them off) to veterans who know what they are doing,
> and can live with those warnings.
Indeed. I said facilitates, not treats equally. I think the veterans
here won't lose much by having to pass -fpermissive, and I think that's
a worthwhile sacrifice to make, to nurture the new without pressuring
the old very much.
> The right balance is exactly what we have now: emitting warnings
> without breaking builds.
I disagree - I think breaking builds here (remember, it takes 13 bytes
to fix them) is a much lower cost than the other case: being shot in
the foot because an easily detectable and treatable error was made
easily missable instead.  So I reckon the scale is tipped heavily
towards erroring out by default.
On that note - let's presume a beginner's role.  I've just started using
GCC.  I run 'gcc -O2 -Wall main.c fun.c' and I get an a.out.  It
mentions some 'implicit function generation', dunno what that means - if
it mattered much, it'd have been an error.  I wrote a function called
test that prints the int it got in hex, and I called it with 12.3, but
it printed 1.. what the heck?
Why that happened is obvious to you and me (if you're on the same CPU as
me), but to a beginner it is utter nonsense.
At this point, I can only assume one goes to revisit that warning.. I'd
hope so at least.
I doubt the beginner would know to pass
-Werror=implicit-function-declaration in this case (or even about
-Werror... I just told them to pass -Wall and to read the warnings,
which was gleefully ignored)
Anyway - I'm not making that up - this cycle of 'yeah, the warnings
actually matter.. a lot' has happened with everyone I've tutored that
has had little to no contact with programming before (those who had more
contact have a higher respect for the word 'warning'), and it will keep
happening.
Hell, I've seen professors do it, and for a simple reason: they knew how
to write code, not how to use a compiler. That's a big gap.
The beginner here can't adapt - they don't know what -Wall means, they
just pass it because they were told to do it (if they're lucky!).
At the same time, they lose out on what is, IMO, one of the most useful
pieces of the toolchain: _FORTIFY_SOURCE (assuming your vendor enables
it by default.. we do). It provides effective bug detection, when the
code compiles right. It regularly spots bugs that haven't happened yet
(and same goes for all the other useful analysis the toolchain can do
when it has sufficient information to generate correct code, or more;
some of which can't reasonably be a default)
(on a related note, IMO it's a shame that the toolchain hides so many
possibilities behind 'cult knowledge', the depths of many manuals, and
bad defaults)
On the other hand... I've been writing C for a long time, and you have
been doing so for a longer time. We see 'error: ...' in an old codebase
for a pedantic (rather, by my definition, bare essential) check and we
know where to look to fix it or ignore it.
I build old, unfamiliar codebases all the time, and I've been using a
modified toolchain that enables the error proposed today for a while; my
peers and I, who are proposing this, have gone over thousands of old
codebases, I've fixed some personally, yet the percentage of those that
have been full-out broken by this change is small (configure scripts are
almost all of it, too; Po has mentioned that they use a laxer compiler
for those - we (Gentoo) test two compilers in order to detect when these
differences happen, but keep the results of the laxer one, and then try
to fix the code upstream - often it comes down to just re-running
autoconf).
This sample is subject to selection bias. My testing targets mostly
more modern codebases that have long fixed these errors (if they have
active maintainers), and exclusively Free Software, so I expect that the
likelihood that you'll need to run `export CC='gcc -fpermissive'
CXX='g++ -fpermissive'` goes up the more you move towards old or more
corporate codebases, but, for a veteran, this is no cost at all.
Is it that much of a stretch to imagine that a maintainer of a codebase
that has not seen revisions to get it past K&R-esque practices would
know that they need to pass -std=c89 (or a variant of such), or even
-fpermissive - assuming that they could even spare to use GCC 14 as
opposed to 2.95?
As an anecdote, just recently I had to fix some code written for i686
CPUs, presumably for GCC 4.something or less, because the GCC I insist
on using (which is 13 and has been since 13.0 went into feature-freeze)
has started using more than the GPRs on that machine (which led to hard
to debug crashes because said codebase does not enable the requisite CPU
extensions, or handle the requisite registers properly). I think this
fits within the definition of 'worked yesterday, broke today'. Should
that change be reverted? Replacing it with -mmore-than-gprs would make
GCC more compatible with this old code.
I don't think so.
This is a sensitive codebase, and not just because it's written poorly,
but because it's a touchy thing it's implementing; any change in
compiler requires reverification.  The manifestation here had *no*
compile-time diagnostic at all.
Mind you, this GCC change cost me more than someone seeing 'error:
implicit declaration of function ...' before adding -fpermissive would
pay when compiling their code with the wrong -std= value, because that
fails early, and unambiguously, and I had to fight a variety of nonsense
to actually debug this error.
... speaking of that, if one builds their codebase without -std=..,
they're risking more than just optimization changes breaking code that
relies on bad assumptions; they're also risking a change in the default
language standard.
With all that to consider, is it *really* a significant cost to add
-fpermissive? Would that cost not be massively overshadowed by the cost
of a compiler change? It feels like it's a footnote compared to
checking whether added optimizations go against invalid assumptions
(which is, by the way, also mitigated by adding more hard, easy-to-see
errors).
I expect no change in behavior from those that maintain these old
codebases, they know what they're doing, and they have bigger fish to
fry - however, I expect that this change will result in:
- A better reputation for GCC and the GCC project (by showing that we do
care for code correctness),
- More new code being less error prone (by merit of simple errors being
detected more often),
- Less 'cult knowledge' in the garden path,
- More responsible beginners, and
- Fewer people being able to effectively paint GNU and/or C/C++ as the
backwards crowd using an error-prone technique of yesteryear.
(and yes, optics matter)
Builds break.  Builds breaking cleanly is a treat compared to the usual
breakage.  At least this change breaks the few builds that do break with
a clear, actionable error message up front.
Have a lovely evening.