Speaking specifically of Objective-C (although it might apply elsewhere), I understand that there can be a problem with casting a pointer to a BOOL:
Code:
BOOL CanMakeAStringB(void) {
    // returnSomeValidStringOrReturnZero() is assumed to be declared
    // elsewhere, returning either a valid NSString* or nil.
    NSString* str = returnSomeValidStringOrReturnZero();
    return str;  // implicit NSString* -> BOOL conversion happens here
}
Not very ObjC-looking, sorry, but the point is that if an NSString* is converted to BOOL, then 64 (or at least 32) bits of pointer are being squeezed into an 8-bit type (BOOL is classically a signed char, not unsigned as I'd guessed). So, theoretically, about 1/256th of the time the pointer could be some humongous nonzero value whose lowest 8 bits all happen to be zero, and the BOOL would incorrectly come out as NO when it really should be YES. In this case, it could be fixed by saying "return ( str != 0 );".
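To make the truncation concrete, here is a small standalone sketch. MyBOOL and the pointer value are made up purely for illustration; as I understand it, real BOOL is a signed char on the platforms where this bites, while on arm64 BOOL is defined as a genuine bool, so any nonzero value converts cleanly to YES there.
Code:
#include <stdio.h>
#include <stdint.h>

typedef signed char MyBOOL;  // made-up stand-in for ObjC's BOOL

int main(void) {
    // Contrived nonzero pointer-sized value whose low 8 bits are all zero.
    uintptr_t fakePtr = 0x40100;

    // Narrowing to 8 bits keeps only the low byte (implementation-defined
    // for signed types, but truncation in practice), which here is 0.
    MyBOOL b = (MyBOOL)fakePtr;

    printf("fakePtr != 0     : %d\n", fakePtr != 0);            // 1
    printf("(MyBOOL)fakePtr  : %d\n", b);                       // 0 -- the bug
    printf("(MyBOOL)(p != 0) : %d\n", (MyBOOL)(fakePtr != 0));  // 1 -- the fix
    return 0;
}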
Am I correct so far? I think so; hope so.
So my question is: what happens in an "if" statement, or a while, or a ?:, or...
Code:
NSString* str = returnSomeValidStringOrReturnZero();
if ( str ) praiseBob();  // praiseBob() assumed to be declared elsewhere
In the above code, in ObjC, does the compiler at some point convert "str" to a BOOL type, in which case that code is buggy? Or does it proceed directly to testing whether str is zero or nonzero at its full width, no matter how many bits, in which case it's fine?
IOW, in that code, is it necessary to change it to "if ( str != 0 )"? I'm not asking for opinions about style; I mean, is it really, actually, totally Turing-machine necessary?
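Here is the kind of experiment I'd compile to check it, using the same made-up MyBOOL stand-in and a contrived value; my understanding of plain C is that the controlling expression of an if is simply compared against zero at its full width, and only a narrowing cast (or an implicit conversion to BOOL) throws bits away.
Code:
#include <stdio.h>
#include <stdint.h>

typedef signed char MyBOOL;  // made-up stand-in for ObjC's BOOL

int main(void) {
    uintptr_t p = 0x40100;  // contrived: nonzero, but low byte is zero

    if (p)                                   // full-width test against zero
        printf("if (p): branch taken\n");    // expected to print

    MyBOOL b = (MyBOOL)p;                    // explicit narrowing to 8 bits
    if (b)
        printf("if (b): branch taken\n");    // expected NOT to print
    return 0;
}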
Thx.