
abcdefg12345

macrumors 6502
Original poster
Jul 10, 2013
I'm trying to convert decimal to base 36. It works all the way up to 35 decimal (I get a value of Z), but when I enter 36 I get 0 instead of 10. Does anyone know what's wrong with my code?

Code:
- (IBAction)button:(id)sender {
        {
        NSString *base36 = @"0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";
        
        NSString *newvalue = @"";
        NSString *g = @"";
        
        int i = 0;
        do {
            int x ;
            if (i == 0)
            {
                x = fmod(_field1.integerValue, [base36 length] );
            }
            else {
                x = fmod([g doubleValue], [base36 length]);
            }
            
            NSString *y = [[NSString alloc] initWithFormat:@"%c", [base36 characterAtIndex:x]];
            newvalue = [y stringByAppendingString:newvalue];
            
            i++;
        } while ([g intValue] != 0);
        
            _field2.stringValue=newvalue;
    }
}
 
"g" is never set to anything other than the empty string. Therefore the loop can only go one iteration. Use a debugger and step through the loop.
 
"g" is never set to anything other than the empty string. Therefore the loop can only go one iteration. Use a debugger and step through the loop.

I set the value of g, but I still keep getting incorrect results.

Code:
- (IBAction)button:(id)sender {
    
        NSString *base36 = @"0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";
        
        NSString *newvalue = @"";
        NSString *g = @"";
    
        NSInteger value  = _field1.integerValue * 1;
            
        int i = 0;
        do {
            int x ;
            if (i == 0)
            {
                x = fmod(_field1.integerValue, [base36 length] );
            }
            else {
                x = fmod([g doubleValue], [base36 length]);
            }
            
            NSString *y = [[NSString alloc] initWithFormat:@"%c", [base36 characterAtIndex:x]];
            newvalue = [y stringByAppendingString:newvalue];
            
            value = value / 36;
            
            i++;
            g = [[NSString alloc] initWithFormat:@"%0.0f", value - 0.5];
            
        } while ([g intValue] != 0);

    [_field2 setStringValue:[NSString stringWithFormat:@"%@",newvalue]];
}
 
I fixed it up and values are showing correctly. The only problem is that 36 still shows 0, but anything above or below it shows the right answer.

Code:
- (IBAction)button:(id)sender {
    
   double value = [_field1 doubleValue];
    
    NSString *base36 = @"0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";

    NSString *returnValue = @"";
    NSString *g = @"0";
        
    int i = 0;
    do {
       int x ;
        if (i == 0)
        {
            x = fmod(value, [base36 length] );
        }
        else {
            x = fmod([g doubleValue], [base36 length]);
        }
        
        NSString *y = [[NSString alloc] initWithFormat:@"%c", [base36 characterAtIndex:x]];
        returnValue = [y stringByAppendingString:returnValue];
        
        value = value / 36;
        
        i++;
        
        g = [[NSString alloc] initWithFormat:@"%0.0f", value - 0.5];
        
    } while ([g intValue] != 0);
    [_field2 setStringValue:[NSString stringWithFormat:@"%@",returnValue]];
}
 
You need to Break Things Down and Confirm Your Expectations. See this post:
https://forums.macrumors.com/posts/20214279/

If you know what each variable is expected to be each time through the loop, then it should be easy for you to step through with the debugger (as mfram suggested), look at the variables, and find the discrepancy.

If you don't know what to expect, then stepping through is less likely to reveal anything, because you could step right past the crucial clue to understanding what's actually happening. That's the most important part of debugging, to understand what's actually happening.


If stepping with the debugger isn't in your skill set, now is a good time to learn how. The method is relatively small and self-contained. You know exactly what the input and the expected output are. The problem is you don't know or can't confirm what the intermediate expected results are. So learn to step with the debugger, write down the expected values at each iteration, and confirm your expectations.

Another pathway is NSLog'ing values, to see if the actual value matches your expectation. This may be easier to learn than the debugger, but the debugger has longer lasting value. NSLog won't remove the need to know what values are expected at each iteration. So be sure to do that no matter what else you do. It's an essential part of the process.
 
There should be no need for fmod, i, g, or the conditions inside the loop, and I don't know why you subtract 0.5 from g. The basics of what you need are:

Code:
while(value) {
    digit = value % base;
    value = value / base;
    // insert charset[digit] at beginning of result;
}
 
This is madness. Can you comment every line with what it's doing?

Can you write out valid input and output? Do you support floating-point values or only integers? If only integers, don't try floating point now; you'd have to deal with rounding estimates and such that will overcomplicate things.

Write out how you would manually convert an integer from base 10 to base 36. You're using mod, so you've got that much: you know to mod by 36, though you're doing a floating-point mod. A truncating divide ought to work too, but it has to be an integer divide. A lot of these exercises use bases that are powers of 2, which makes them a little easier.

Litter your code with prints. Every step. What is every value? What do you expect it to be?

Essentially you need to have an int, check mod 36, modify your value, check that it's nonzero and repeat. You've got the right ingredients, they're just combined in the wrong way.

-Lee
 