Optimization Safety

LLeweLLyn Reese llewelly@lifesupport.shutdown.com
Sat May 24 13:02:00 GMT 2003


"John Anthony Kazos Jr." <jkazos@vt.edu> writes:

> I know that for some things, like GCC itself, or like GLibC, it's
> dangerous to screw around with the default optimization settings, but
> what about for random package X? Just how safe are the optimization
> algorithms? Let's say I compile with -O3, and it compiles and installs
> successfully, and seems to give no immediate horrific failures when
> running, is that good enough for a production system? Or can there be
> some tiny timebombs hidden in there because of some strange construct
> the package writer used?

The easy answer is to go with the rumor that -O2 is the most
    well-tested set of optimization flags, and use that.

The right answer (i.e., the hard way) is to develop a thorough set of
    test cases, run them at several optimization levels, and, if you
    discover a difference, file a bug report and reduce your default
    optimization level. (Here, I assume you have previously
    established a need for using the best possible optimization.)
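
A minimal sketch of that workflow: package_checksum() below is only
    a made-up stand-in for whatever routine in your package you
    actually care about. Build the same file at each level, run the
    resulting binaries, and diff their output; any difference points
    at a bug in the package (often undefined behavior) or, more
    rarely, in GCC itself.

    /* Build, e.g.:
     *     gcc -O0 -o test-O0 test.c
     *     gcc -O2 -o test-O2 test.c
     *     gcc -O3 -o test-O3 test.c
     * then run all three and diff the output. */
    #include <stdio.h>
    #include <stddef.h>

    /* Hypothetical routine under test: a simple rolling checksum. */
    static unsigned long package_checksum(const unsigned char *buf, size_t len)
    {
        unsigned long sum = 0;
        size_t i;
        for (i = 0; i < len; i++)
            sum = sum * 31UL + buf[i];
        return sum;
    }

    int main(void)
    {
        unsigned char buf[256];
        size_t i;

        for (i = 0; i < sizeof buf; i++)
            buf[i] = (unsigned char)(i * 7);

        /* Print the result so the -O0, -O2, and -O3 builds can be
         * compared. */
        printf("%lu\n", package_checksum(buf, sizeof buf));
        return 0;
    }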

Note that glibc is quite large, since it includes every Unix C library
    extension ever conceived, contains a fair portion of unusual
    constructs, and, most importantly, has extraordinarily stringent
    correctness demands, since nearly every program on a glibc-based
    system depends on it (some of them (mis)using it in unusual
    ways).
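
As an illustration of the sort of construct that can turn into one of
    those timebombs (just a sketch, not code from glibc or any real
    package): the function below breaks the strict-aliasing rules
    that GCC starts exploiting at -O2, so it usually prints 0 when
    built with -O0 but may print 1 at -O2 or -O3.

    #include <stdio.h>

    int set_and_read(int *i, float *f)
    {
        *i = 1;
        *f = 0.0f;   /* clobbers *i whenever f and i alias */
        return *i;   /* at -O2 and above GCC may assume this is still 1 */
    }

    int main(void)
    {
        int x = 0;
        /* Undefined behavior: the same object is accessed as both
         * int and float. */
        printf("%d\n", set_and_read(&x, (float *)&x));
        return 0;
    }

Building with -fno-strict-aliasing is the usual band-aid for this
    particular class of problem, but the real fix belongs in the
    package's source.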


