libstdc++/6282: bitset<0> results in nonsense behavior
- From: pme@gcc.gnu.org
- To: gcc-gnats@gcc.gnu.org
- Date: 12 Apr 2002 20:08:24 -0000
- Subject: libstdc++/6282: bitset<0> results in nonsense behavior
- Reply-to: pme@gcc.gnu.org
>Number: 6282
>Category: libstdc++
>Synopsis: bitset<0> results in nonsense behavior
>Confidential: no
>Severity: critical
>Priority: low
>Responsible: unassigned
>State: open
>Class: sw-bug
>Submitter-Id: net
>Arrival-Date: Fri Apr 12 13:16:00 PDT 2002
>Closed-Date:
>Last-Modified:
>Originator: Phil Edwards
>Release: 3.x
>Organization:
>Environment:
All. (Not platform-specific code.)
>Description:
Creating a std::bitset instance of zero size still stores
a word's worth of data.
Consider

  std::bitset<1> x(555555);
  int i = x.count();

Only the least significant bit of 555555 is stored, and
i is correctly set to 1 (the count of "on" bits in x).
But if x is defined as
  std::bitset<0> x(555555);

then a full word (sizeof(unsigned long) * CHAR_BIT bits) is
still stored and initialized, and x.count() returns 9 (the
number of "on" bits in 555555) instead of the correct 0.
Other bitset member functions likewise return nonsensical
results.
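A minimal, self-contained program reproducing the behavior
(a sketch; the printed values assume the buggy behavior
described above):

  #include <bitset>
  #include <iostream>

  int main()
  {
      std::bitset<1> one(555555);   // keeps only the low bit
      std::bitset<0> zero(555555);  // should keep nothing

      std::cout << one.count() << '\n';   // prints 1, as expected
      std::cout << zero.count() << '\n';  // should print 0; the
                                          // buggy library prints 9
  }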
The base class for bitset does not take the zero-size
(empty) case into account.
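One plausible fix, sketched below with illustrative member
names (the actual libstdc++ internals differ), is to
specialize the word-storage base class for the zero-word
case so that it holds no state and every query reports an
empty set:

  #include <cstddef>  // for std::size_t

  // Sketch only; not the actual libstdc++ code.
  template<std::size_t _Nw>
  struct _Base_bitset
  {
      unsigned long _M_w[_Nw];  // _Nw words of storage
      // ... members operating on _M_w ...
  };

  // Specialization for the empty case: no storage, so no
  // bits are ever stored, counted, or converted.
  template<>
  struct _Base_bitset<0>
  {
      _Base_bitset() { }
      _Base_bitset(unsigned long) { }  // ignore the initializer
      std::size_t _M_do_count() const { return 0; }
      unsigned long _M_do_to_ulong() const { return 0; }
  };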
>How-To-Repeat:
>Fix:
>Release-Note:
>Audit-Trail:
>Unformatted: