The discussion didn't reach a definitive conclusion:
http://gcc.gnu.org/ml/gcc-patches/2007-01/msg01166.html
It seems that the consensus is that:
* Context matters, so we should warn for int i = int(1.0/0.0); (I am
not sure how to implement this yet, but I will try; see the example
after this list).
* Lexical form should not matter, so warnings for 1.0/0.0 and 1.0/0
should be consistent.
* C and C++ front-ends should agree.
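
To make the cases concrete, here is a small sketch (the variable
names are mine, just for illustration):

  double a = 1.0/0.0;      // evaluates to +inf; arguably a legitimate idiom
  double b = 1.0/0;        // lexically different, same value after conversion
  int    i = int (1.0/0.0); // +inf converted to int: undefined behaviour,
                            // so this one should warn regardless
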
So the only question is:
Do we warn for 1.0/0.0 (as the C++ front-end does), or do we consider
it a legitimate way to obtain infinities and not warn (as the C
front-end does)?
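
For comparison, and just to note the alternatives rather than argue
either way, the standard ways to obtain an infinity without dividing
by zero would be:

  #include <math.h>   // for the C99 INFINITY macro
  #include <limits>

  double a = INFINITY;                                  // C99
  double b = std::numeric_limits<double>::infinity ();  // C++
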
Thanks,
Manuel.