Summary: strange sign extension in gcc
Product: Gentoo Linux
Reporter: Joakim Tjernlund <joakim.tjernlund>
Component: [OLD] Development
Assignee: Gentoo Linux bug wranglers <bug-wranglers>
Status: RESOLVED INVALID
Severity: normal
CC: jer, truedfx
Priority: High
Version: unspecified
Hardware: All
OS: Linux
URL: http://en.wikipedia.org/wiki/Sign_bit
Whiteboard:
Package list:
Runtime testing required: ---
Description
Joakim Tjernlund 2008-05-15 15:15:47 UTC
> Be wary of mixing signed and unsigned integers.

Yes, but I don't see what is wrong in this case. Can you explain why the sign extension works differently in case 1 and 2?

> x is unsigned, so x-x and x*0 are unsigned, so (x-x)+y and (x*0)+y are
> unsigned. That's just the way C works, and you will see the same results
> with other compilers.

x*1 is also unsigned but works as expected in:

    z = (x*1) + y;

It appears that when the x expression becomes zero, the signed/unsigned conversions break down, which is strange.
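The responder's point is C's usual arithmetic conversions: one unsigned operand makes the whole expression unsigned, whether or not that operand folds to zero. A minimal sketch of the rule, assuming x is unsigned int, y is a negative int, and z is a wider signed type (the actual testcase is not quoted in this comment, so these types and values are illustrative):

    #include <stdio.h>

    int main(void)
    {
        unsigned int x = 0;   /* hypothetical values standing in for */
        int y = -1;           /* the unquoted original testcase      */
        long long z;

        /* x is unsigned int, so (x - x) + y has type unsigned int:
         * y is converted to unsigned int before the addition.  The
         * unsigned result is then zero-extended into the wider z,
         * so no sign extension takes place.                        */
        z = (x - x) + y;
        printf("%lld\n", z);  /* 4294967295 with 32-bit int         */

        /* Converting back to a signed type first (implementation-
         * defined for out-of-range values; modular with gcc)
         * recovers the expected -1.                                */
        z = (int)((x - x) + y);
        printf("%lld\n", z);  /* -1                                 */

        return 0;
    }

This would also explain why (x*1) + y appeared to work: whenever x + y is non-negative, the unsigned result equals the signed one, so the type difference only becomes visible once the x term vanishes.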