Hi all,
This is my first post, so please excuse me if I come off a little (extremely) inexperienced.
I'm trying to teach myself C because I'm an engineering student and have seen that it can be useful for simple analysis problems. I've been working through the lessons in a very out-of-date book (published in 1987). One of the lessons asks you to write a program that determines the number of letters between two input letters. I'm using Xcode to compile, and it has caused me no other issues so far. The problem is that the output number makes no sense.
The code that I have so far is as follows:
Code:
main()
{
    int a, b, diff;

    printf("Please type two characters below:\n");
    scanf("%c %c", &a, &b);

    if (a > b)
        diff = a - b;
    else
        diff = b - a;

    printf("The number of letters between each character is %d.", diff);
}
When I run the program, the result is orders of magnitude larger than the number of letters in the alphabet, so something must be wrong here. Any thoughts?
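In case it helps, here is a variant I'm considering trying next. It's based on nothing more than a guess that scanf's %c conversion wants char variables rather than int (plus a leading space before %c to skip stray whitespace), so treat it as a sketch rather than something I know to be the fix:

Code:
#include <stdio.h>   /* for printf and scanf */

int main(void)
{
    char a, b;   /* guessing %c needs char, not int */
    int diff;

    printf("Please type two characters below:\n");
    scanf(" %c %c", &a, &b);   /* leading space skips whitespace/newlines */

    if (a > b)
        diff = a - b;
    else
        diff = b - a;

    printf("The number of letters between each character is %d.\n", diff);
    return 0;
}

If the int declarations really are the culprit, I'd also appreciate an explanation of why the mismatch produces such a huge number.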
-John