This is the mail archive of the libstdc++@gcc.gnu.org mailing list for the libstdc++ project.
Re: libstdc++/10093: Regression: Setting failbit in exceptions doesn't work
- From: Jerry Quinn <jlquinn at optonline dot net>
- To: gcc-gnats at gcc dot gnu dot org, peturr02 at ru dot is, gcc-bugs at gcc dot gnu dot org, gcc-prs at gcc dot gnu dot org, libstdc++ at gcc dot gnu dot org
- Date: Sun, 23 Mar 2003 01:50:11 -0500
- Subject: Re: libstdc++/10093: Regression: Setting failbit in exceptions doesn't work
I'm a bit confused by this bug. What should the behavior be here? On my
reading of the standard, the code is operating correctly.
The test case is designed to throw if failbit is set. Failbit does get set,
which causes the exception to be thrown. However, it is caught in the
exception handler and not rethrown.
27.6.1.2.1 says that the exception is rethrown if badbit is set in the
exception mask (and badbit is set), otherwise not. This makes it sound like
the _only_ way to get an exception from the formatted input functions is to
set badbit in the exception mask.
Petur, can you explain what I'm missing?