cast to an enum does not throw errors even for invalid values

Young, Michael Michael.Young@paetec.com
Fri May 25 16:54:00 GMT 2007


Shriramana Sharma wrote:
> If an error is not called, I feel it defeats the very meaning of
> casting, to convert an object of one type to an object of another type.
> When there is no equivalent object of the target type, how can the
> casting happen?

1) "the very meaning of casting" is not necessarily conversion, but coercion.
2) the underlying object type of an enum is an integral type - it's just restricted to a particular domain.  Casting from an integer to an integral type (even an enum) is allowed by both C and C++ - you're not even going to get a run-time platform error/signal (typical for undefined behavior), as that would violate the behavior defined in the language standards.  There is an "equivalent object of the target type" here; just as you would expect to be able to cast (-3) to an unsigned short, you can cast it to an enum that doesn't have a defined domain value of -3.  (See the sketch right below.)
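A minimal sketch of that point (Color is a hypothetical enum I made up for illustration; it's not from the original post):

    #include <iostream>

    enum Color { RED, GREEN, BLUE };

    int main() {
        Color c = (Color)(-3);             // C-style cast: accepted
        Color d = static_cast<Color>(99);  // C++ cast: also accepted

        // No compile error, no signal -- on typical implementations
        // this simply prints the coerced integer values: -3 99
        std::cout << (int)c << " " << (int)d << std::endl;
        return 0;
    }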

Even if that were not the case (i.e., the target type is not compatible / equivalent), you wouldn't get an error from the compiler - and any run-time error would arise because you subsequently invoke some functionality on the target object that is not available/compatible with its actual binary layout, not because of the coercion by itself.
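A contrived sketch of that distinction (Plain and Virtual are hypothetical types of mine, not from the original post):

    #include <cstdio>

    struct Plain   { int n; };                        // no vtable
    struct Virtual { virtual void f() { } int n; };   // has a vtable pointer

    int main() {
        Plain p = { 42 };

        // The cast itself is accepted -- it just reinterprets the bits.
        Virtual* v = reinterpret_cast<Virtual*>(&p);
        std::printf("%p\n", (void*)v);  // fine: only the pointer value is used

        // v->f();  // THIS is where it would break: undefined behavior,
        //          // because Plain has no vtable to dispatch through.
        //          // The failure comes from the use, not from the cast.
        return 0;
    }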

When you cast (or static_cast or reinterpret_cast, but not dynamic_cast, in C++), you're saying, "hey, compiler, I'm the programmer and I know what I'm doing - allow me to break the type-safety rules here and treat this entity as if it were an X (even though in reality it may be a Y, where Y may be compatible or incompatible)."  Perhaps a warning for casting out-of-domain integer values to an enum would be OK, but not an error - that would affect too much existing code and violate the standard's definitions.  (I'm sure it's in the standard, but I can't quote chapter and verse.)  But why even warn about a "dangerous cast", when casting is inherently dangerous because you've explicitly escaped the type-safety system of the language?  The behavior you want is not even a candidate for an "extension", as it is contrary to the existing language definition.

This is not a compiler bug - it is a behavior of C++ (and there are pros and cons to that behavior).  gcc/g++ strives to implement the language as defined in the standard - its implementation is constrained by that definition.  Wishing, or even vigorously and sincerely insisting, otherwise doesn't change the facts.  (Yes, I've wished I could change certain things in C/C++, or the compiler, too.  Of course, the good news is that with gcc, you can, if you so choose!  If you actually change the behavior of the generated code, you are creating a non-standard C++ variant, which I don't recommend - but that really is up to you!)

BTW - perhaps there's an easy workaround here if you do some OO analysis.  I always look hard at whether I really need/want to use "enum", as I've discovered that, in my applications, enums are really a C++ object "wanna-be".  If you define Planet as an object (implemented using an integer, perhaps), you can restrict the "domain" (name attribute) of that object, and you can control conversion via ctors and assignment operators.  You can also provide meaningful string output without a free (encapsulation-breaking) decoder/conversion function that maps values to strings.  Consider also that there is no "inheritance" in enums - you can't define one enum as a superset of another without duplication / redefinition.  A sketch of this approach follows below.

The only place I generally use enums anymore is for providing readable parameters when invoking 3rd-party APIs that use bit-flag integer fields to specify behavior.

Finally, I've found that if you use macros, you can use an object type (class definition) during development to provide debugging features and checks, and then switch to enums late in the development cycle for speed, just by changing the macro.  (Of course, the production version can't use any of the extended functionality provided by the class.)  However, I've found that I rarely change the macro definition, as my apps usually don't benefit from any speed gained by this change.  This technique may or may not be viable for you - it's probably less appealing if you are working with a large base of legacy code that would need to be refactored.
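For what it's worth, a rough sketch of the Planet-as-object idea (every name here is mine and purely illustrative):

    #include <stdexcept>
    #include <string>

    // A domain-restricted Planet type implemented on top of an integer.
    // Construction and assignment are the only ways in, so out-of-range
    // values can be rejected -- something a bare enum cannot do.
    class Planet {
    public:
        enum Value { MERCURY, VENUS, EARTH, MARS };   // the legal domain

        explicit Planet(int v) : value_(check(v)) { }
        Planet& operator=(int v) { value_ = check(v); return *this; }

        // Meaningful string output stays encapsulated in the class,
        // instead of living in a free decoder function.
        std::string name() const {
            static const char* const names[] =
                { "Mercury", "Venus", "Earth", "Mars" };
            return names[value_];
        }

        operator int() const { return value_; }   // cheap read access

    private:
        static Value check(int v) {
            if (v < MERCURY || v > MARS)
                throw std::out_of_range("invalid Planet value");
            return static_cast<Value>(v);
        }
        Value value_;
    };

Usage would look like:

    Planet p(2);      // OK: Earth
    // Planet q(17);  // throws std::out_of_range at run time --
                      // exactly the check a cast to an enum never performs

The macro technique I mentioned would then be a typedef under an #ifdef that selects this class in debug builds and a bare enum in release builds.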

  - Michael


