C++ performance regressions in gcc > 2.95.3


We observed that certain large C++ applications perform worse
in gcc-3.x and gcc-4.x than they did in gcc-2.95.3.
On the theory that at least part of the cause
would show up in microbenchmarks, we tried running
bench++ with both the old and new toolchains.
Because we suspect that part of the regression is due to
libstdc++, we also measured performance using STLport.

Here are our results.
A table of nanoseconds-per-iteration for each individual microbenchmark in bench++
for g++-2.95.3, and for g++-4.0.1 and g++-4.1.0-20050723
with and without STLport, is at
http://www.cis.udel.edu/~danalis/OSS/bench_plus_plus/files/report-f15_m2_X_2.6-absolute.txt
A table normalized relative to the gcc-2.95.3 results is at
http://www.cis.udel.edu/~danalis/OSS/bench_plus_plus/files/report-f15_m2_X_2.6.txt
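
For readers who haven't looked at bench++: each microbenchmark is
essentially a timed loop, and the reported number is elapsed time divided
by the iteration count, in nanoseconds.  The sketch below is only an
illustration of that kind of measurement and of how the normalized table
is derived; it is not the actual bench++ harness, and the loop count, toy
workload, and baseline figure are made up.

    // Minimal sketch of a nanoseconds-per-iteration measurement and of
    // normalizing it against a baseline compiler's result.  Not the
    // bench++ harness; values are illustrative only.
    #include <sys/time.h>
    #include <stdio.h>

    static double seconds_now()
    {
        struct timeval tv;
        gettimeofday(&tv, 0);
        return tv.tv_sec + tv.tv_usec * 1e-6;
    }

    int main()
    {
        const long iterations = 10000000L;
        volatile long sink = 0;

        double start = seconds_now();
        for (long i = 0; i < iterations; ++i)
            sink += i;                     // stand-in for one benchmark kernel
        double elapsed = seconds_now() - start;

        double ns_per_iter = elapsed * 1e9 / iterations;
        printf("%.1f ns/iteration\n", ns_per_iter);

        // The normalized table divides each compiler's ns/iteration by the
        // gcc-2.95.3 figure, so 1.0 means "same speed" and larger values
        // mean a regression.
        double baseline_ns = 10.0;         // hypothetical gcc-2.95.3 result
        printf("relative to baseline: %.2f\n", ns_per_iter / baseline_ns);
        return 0;
    }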

The interesting bits are summarized in a table showing just the performance regressions,
and annotated with descriptions of the microbenchmarks which regressed.
It's at
   http://www.cis.udel.edu/~danalis/OSS/bench_plus_plus/results.html

We have already reported one of the regressions as http://gcc.gnu.org/PR22563 .
There seem to be at least ten others that might be worth
reporting.  I'll try to post bug reports for a few, but
my summer internship is running out soon.  If anyone else
has time to look at the data, I'd appreciate suggestions or
criticism; maybe I can fix a few problems in my benchmark scripts
before I turn into a pumpkin.

Anthony Danalis & Dan Kegel

