This is the mail archive of the gcc@gcc.gnu.org
mailing list for the GCC project.
Re: Plans for ABI migration
- To: bernie at codewiz dot org
- Subject: Re: Plans for ABI migration
- From: Joe Buck <jbuck at racerx dot synopsys dot com>
- Date: Tue, 9 Jan 2001 13:16:05 -0800 (PST)
- Cc: gcc at gcc dot gnu dot org
Bernardo Innocenti writes:
> did the steering committee already set a roadmap to ease the
> migration from the old C++ ABI (gcc 2.9x) to the new one
> (gcc 3.x)?
No, because there is nothing special about an ABI change with a
major release of the compiler. Some background:
In the past, the C++ ABI broke with every major release. egcs 1.0
was different from 2.7.x. egcs 1.1 was different from egcs 1.0.
gcc 2.95 was different from egcs 1.1. Red Hat's "2.96" is
not compatible with either 2.95 or current snapshot code.
In many cases, these changes were required by the effort to support
more and more of an evolving language standard. But the ISO C++
standard has been complete for some time, and we'll now have an
almost complete (except for template "export" and the usual bugs)
implementation of C++.
> This is especially needed for systems using gcc as their
> official C++ compiler, such as Linux and *BSD.
The problem has been dealt with in the past by the shared library
version number: you can run binaries that use older and newer C++
by having multiple shared libraries on your machine. If you have
a GNU/Linux box, do

    ls /usr/lib/libstdc++*

You will see that there is more than one shared library.
Unfortunately other C++ libraries such as Qt will also need multiple
versions. This sucks, but there's nothing we can do about it.
What is new is that we intend to "stop the insanity" and have
g++ 3.1 still be binary compatible with g++ 3.0. We know that
we'll still have to make changes to fix bugs, but approaches
similar to those used for glibc will be used so that one library
can support multiple compiler versions.
> This is going to be quite a pain for most users and even
> for distribution makers unless something is done to allow
> a smooth migration scheme.
You assume that this is a new problem. Distribution makers already
know how to handle it. There is an unavoidable disk space cost.