This is the mail archive of the libstdc++@gcc.gnu.org mailing list for the libstdc++ project.
--enable-libstdcxx-time changes ABI?
- From: Kenny Simpson <theonetruekenny at yahoo dot com>
- To: libstdc++ at gcc dot gnu dot org
- Date: Sat, 1 Dec 2012 20:36:35 -0800 (PST)
- Subject: --enable-libstdcxx-time changes ABI?
http://gcc.gnu.org/onlinedocs/libstdc++/manual/configure.html documents the --enable-libstdcxx-time option but, unlike some of the other options, does not mention that it affects the ABI.
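For context, the option is given when configuring GCC; the kinds documented on that page are yes, no, and rt, e.g. (path and other flags here are illustrative):

    ../gcc/configure --enable-libstdcxx-time=rt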
Or maybe I'm not understanding ABI breakage.
My understanding of the effect of this is that std::chrono::system_clock::time_point would become a different type.
std::chrono::system_clock::time_point is defined as:

    typedef chrono::time_point<system_clock, duration> time_point;

and 'duration' depends on a configure-time setting based on --enable-libstdcxx-time; it is one of:

    typedef chrono::nanoseconds  duration;
    typedef chrono::microseconds duration;
    typedef chrono::seconds      duration;
What makes this affect the ABI is system_clock::now:
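In <chrono> it is declared roughly as (the exact spelling varies slightly by release):

    static time_point now() noexcept;

so the configure-time choice of duration is baked into its return type.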
Since only the return type is affected, and since the return type is not part of the mangled name, the exported symbol is not affected.
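Concretely, all three configurations should export the same Itanium-mangled symbol, since the mangling encodes only the qualified name and parameter list:

    _ZNSt6chrono12system_clock3nowEv   // std::chrono::system_clock::now()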
However, if a program at runtime were to run against a libstdc++.so built with a differing option, the results would be times/durations that are off by large factors - silently. The program could have compiled assuming nanoseconds, only to run against a libstdc++ that returns microseconds.
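A minimal sketch of the failure mode, assuming a program compiled against nanoseconds-configured headers but run against a microseconds-configured libstdc++.so (a hypothetical mismatch; the factor of 1000 follows from that pairing):

    #include <chrono>
    #include <iostream>

    int main()
    {
        // This translation unit was compiled against headers where
        // system_clock::duration = chrono::nanoseconds, so count() below
        // is interpreted as nanoseconds since the epoch.
        auto since_epoch = std::chrono::system_clock::now().time_since_epoch();
        std::cout << since_epoch.count() << " ticks since epoch\n";
        // But if the libstdc++.so actually loaded was configured with
        // duration = chrono::microseconds, now() hands back microsecond
        // ticks, and every value derived from them is silently 1000x off.
    }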
As such, it seems it would never be safe to mix code built with differing flavors of --enable-libstdcxx-time. If this is true, a note in the documentation would be nice - even if this isn't considered ABI breakage, since the exported symbols aren't affected.