This is the mail archive of the gcc-bugs@gcc.gnu.org mailing list for the GCC project.
[Bug c++/52654] [C++11] Warn on overflow in user-defined literals
- From: "3dw4rd at verizon dot net" <gcc-bugzilla at gcc dot gnu dot org>
- To: gcc-bugs at gcc dot gnu dot org
- Date: Fri, 06 Apr 2012 17:40:27 +0000
- Subject: [Bug c++/52654] [C++11] Warn on overflow in user-defined literals
- Auto-submitted: auto-generated
- References: <bug-52654-4@http.gcc.gnu.org/bugzilla/>
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=52654
--- Comment #16 from Ed Smith-Rowland <3dw4rd at verizon dot net> 2012-04-06 17:40:27 UTC ---
Thank you for your comments.
I was trying to follow the style of enum that I saw in the vicinity of the code
I was editing, but I was not able to discern a single style. If lower-case is
more modern (I like it), then that works for me. I also have no problem moving
it to real.h and using it everywhere (I think there is just one return type). I
used the enum in interpret_integer too, though maybe that's not really a
problem. Could I go as far as changing cpp_number.overflow to use this enum as
well, instead of a bool? No: real is not part of libcpp, by design it seems.
As far as just storing a string and parsing it later, you may be right. Up to
now it was just convenient to keep the numeric values. I tried to figure out
how to run interpret_integer, etc. in the parser, but I got stuck trying to
feed a cpp_token to the parser in the C++ FE when all I could see was
cp_token. Is there a way to get the preprocessor token from the C++ token, or
is the former stored somewhere? If so, I'll do that in a heartbeat. OK, I'll
try parse_in->cut_token.
I thought about breaking interpret_ up into separate pieces that take strings,
but that seemed like more trouble than it was worth.
Thank you for your comments.
Ed