
hoangluong, macrumors member (original poster), Jan 12, 2009
Hi, this program (a programming project in King's book) approximates the mathematical constant e with the series e = 1 + 1/1! + 1/2! + ... + 1/n!, where n is entered by the user; terms keep being added until the current term is smaller than epsilon, which is also entered by the user. I can't figure out where it goes wrong, and I suspect the if statement. Could you please help? Thanks in advance.

Code:
#include<stdio.h>

int main(void)
{
	float d, f, e, epsilon;
	int n, i;
	
	printf("Enter an integer: ");
	scanf("%d", &n);
	printf("Enter an epsilon: ");
	scanf("%f", &epsilon);
	
	f=1.0f;
	e=1.0f;
	epsilon=0.0f;
	i=1;
	
	for(i=1; i<=n; i++) {
		f=f*i;
		d=1.0f/f;
		if(d>=epsilon) 
		e+=d;	
	}
	printf("The approximate value of e for n = %d is: %0.8f.\n", n, e);
	
	return 0;
}
 
Epsilon is always 0 by the time the loop runs: you read it with scanf and then immediately overwrite it with epsilon = 0.0f;. So the if condition is always true until f gets so big that the floating-point approximation of 1.0f/f rounds to 0.

Also, never use float when double will do. By the time you run into the very few exceptions where float is the right choice, you'll know it.

-Lee
 
Thanks a lot for your help.
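For anyone else who finds this thread: deleting the epsilon = 0.0f; line that clobbers the value read by scanf, and switching from float to double as Lee suggested, seems to fix it. Here is a sketch of how the corrected version might look (breaking out of the loop once a term drops below epsilon is my own addition, since every later term is even smaller):

Code:
#include <stdio.h>

int main(void)
{
	double d, f, e, epsilon;
	int n, i;

	printf("Enter an integer: ");
	scanf("%d", &n);
	printf("Enter an epsilon: ");
	scanf("%lf", &epsilon);   /* %lf for double; do NOT overwrite epsilon afterwards */

	f = 1.0;   /* running factorial: after iteration i, f == i!   */
	e = 1.0;   /* the series starts with the 1/0! = 1 term        */

	for (i = 1; i <= n; i++) {
		f *= i;          /* f is now i!        */
		d = 1.0 / f;     /* current term 1/i!  */
		if (d < epsilon) /* terms only shrink, so we can stop here */
			break;
		e += d;
	}
	printf("The approximate value of e for n = %d is: %0.8f.\n", n, e);

	return 0;
}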
 