This is the mail archive of the libstdc++@gcc.gnu.org mailing list for the libstdc++ project.
RE: libstdc++/10093: Regression: Setting failbit in exceptions doesn't work
- From: Pétur Runólfsson <peturr02 at ru dot is>
- To: "Jerry Quinn" <jlquinn at optonline dot net>,<gcc-gnats at gcc dot gnu dot org>,<gcc-bugs at gcc dot gnu dot org>,<gcc-prs at gcc dot gnu dot org>,<libstdc++ at gcc dot gnu dot org>
- Date: Sun, 23 Mar 2003 19:13:24 -0000
- Subject: RE: libstdc++/10093: Regression: Setting failbit in exceptions doesn't work
Jerry Quinn wrote:
> I'm a bit confused by this bug. What should the behavior be here?
> To my simple reading of the standard, it seems that the code is
> operating correctly. The test case is designed to throw if failbit
> is set. Failbit does get set, which causes the exception to be
> thrown. However, it is caught in the exception handler and not
> rethrown. 27.6.1.2.1 says that the exception is rethrown if badbit
> is set in the exception mask (and badbit is set), otherwise not.
> This makes it sound like the _only_ way to get an exception from
> the formatted input functions is to set badbit in the exception
> mask.
This is the relevant quote:

27.6.1.2.1 - Common requirements [lib.istream.formatted.reqmts]

-1- [...] If an exception is thrown *during input* then ios::badbit is
turned on in *this's error state. If (exceptions() & badbit) != 0 then
the exception is rethrown. In any case, the formatted input function
destroys the sentry object. If no exception has been thrown, it
returns *this.
Note the "during input". I read that as meaning that this refers only
to exceptions thrown by the call to num_get::get(), not to exceptions
thrown from other functions the extractors may call.
IMHO the resolution to DR 64 is in support of this reading, as well as
the description of badbit in 27.4.2.1.3.
Also, the function exceptions(iostate) is rather silly if only badbit
is supposed to cause exceptions to be thrown.
The whole text about the meaning of exceptions() and the handling of
exceptions thrown during calls to I/O functions seems rather messy.
Perhaps someone more familiar with the standard cares to comment?