
jdrm

macrumors newbie
Original poster
Jan 16, 2006
15
0
Well, just a stupid question.....
would a function with O(2^(n/2)) running time be the same as just O(2^n)??
 

notjustjay

macrumors 603
Sep 19, 2003
6,056
167
Canada, eh?
Big-O notation and algorithm complexity were never my strongest subject, but I offer up the observation that 2^n is the square of 2^(n/2). That's most definitely an exponential increase.

2^8 = 256
2^16 = 65,536
2^32 = 4,294,967,296!
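
Just to check the arithmetic, here's a quick Python sketch (mine, not part of the original post):

```python
# The three values above: each is the square of the previous,
# since 2^(2k) = (2^k)^2.
vals = [2 ** n for n in (8, 16, 32)]
print(vals)  # [256, 65536, 4294967296]
```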

On the other hand, n/2 is simply another variable n', and if Big-O doesn't care about the constant factor between n and n', then they might be seen as the same complexity.
 

zimv20

macrumors 601
Jul 18, 2002
4,402
11
toronto
notjustjay said:
they might be seen as the same complexity.
that was my first thought, too. but then i started wondering why anyone would bother making the distinction. if they did, perhaps it is important.

jay -- what is the context?
 

telecomm

macrumors 65816
Nov 30, 2003
1,387
28
Rome
Ugh... memories of computer science classes come flooding back...

After a quick review of the entry on big-O notation on Wikipedia (see the formal definition), a function f is O(2^n) iff it is O(2^(n/2)). The function appearing in the O notation doesn't indicate anything about the actual number of computations required by a function, it only says that (in the first case above) "f is bounded above by some constant M times 2^n when the input values n of f are greater than some number a." Recasting this claim in terms of 2^(n/2), you can prove the two equivalent.
 

superbovine

macrumors 68030
Nov 7, 2003
2,872
0
zimv20 said:
it would have taken me infinity minus 20 years to figure that out ;-)

yeah, I had to take that class twice before I passed... the TA who graded my final, which was all big proofs, wrote on the last question: "Nice try".
 

mwpeters8182

macrumors 6502
Apr 16, 2003
411
0
Boston, MA
I'm not sure they'd be the same order, as it could also be written as

O(sqrt(2^n)) - I've seen stuff written like that in big O notation before. Also, there's a huge difference between 2^100 and 2^50 (if we look at a specific case). Not nearly the same order.
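
A small Python check of that rewriting (my illustration, picking n = 100):

```python
import math

n = 100
# 2^(n/2) is exactly the square root of 2^n (for even n):
assert 2 ** (n // 2) == math.isqrt(2 ** n)
# And the gap between the two is itself astronomical:
print(2 ** n // 2 ** (n // 2))  # 2^50 = 1125899906842624
```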
 

telecomm

macrumors 65816
Nov 30, 2003
1,387
28
Rome
Oops...:eek:

mwpeters8182 is right—I made a mistake in factoring the exponential (a factor I thought could be pulled out still carried with it an exponent n).

Anyway, it works out that any function f with O(2^(n/2)) is also O(2^n), but not vice versa.

Remember, though, that this notation considers the behavior of the function as n approaches infinity, and so specific calculations don't really yield a lot of info. For instance f(x) = 1000000x and g(x) = x^5 evaluated at x = 4 might incorrectly suggest that f is more complex than g, but as the numbers get larger...
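
To make that concrete, a short Python sketch (my own, using telecomm's f and g):

```python
# f looks worse than g at small inputs, but g overtakes it quickly.
def f(x):
    return 1_000_000 * x  # linear, huge constant factor

def g(x):
    return x ** 5         # quintic, no constant factor

print(f(4), g(4))  # 4000000 1024 -- f "looks" more expensive here

# First x where the polynomial wins:
crossover = next(x for x in range(1, 1000) if g(x) > f(x))
print(crossover)   # 32
```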
 

gnasher729

Suspended
Nov 25, 2005
17,980
5,566
telecomm said:
Ugh... memories of computer science classes come flooding back...

After a quick review of the entry on big-O notation on Wikipedia (see the formal definition), a function f is O(2^n) iff it is O(2^(n/2)). The function appearing in the O notation doesn't indicate anything about the actual number of computations required by a function, it only says that (in the first case above) "f is bounded above by some constant M times 2^n when the input values n of f are greater than some number a." Recasting this claim in terms of 2^(n/2), you can prove the two equivalent.

Your conclusion is not correct. Two functions are considered to have the same growth if there is only a constant factor between them, for example if one function is always between 100 times and 10,000 times the other function. In this case, there is no constant factor: 2^n is larger than 2^(n/2) by a factor of 2^(n/2), which becomes arbitrarily large as n grows.
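
The unbounded-ratio point is easy to see numerically (my sketch, not gnasher729's):

```python
# The ratio 2^n / 2^(n/2) is itself 2^(n/2): no constant bounds it.
for n in (10, 20, 40):
    ratio = 2 ** n // 2 ** (n // 2)
    print(n, ratio)  # the ratio doubles every time n goes up by 2
```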
 

mbabauer

macrumors regular
Feb 14, 2006
105
0
gnasher729 said:
Your conclusion is not correct. Two functions are considered to have the same growth if there is only a constant factor between them, for example if one function is always between 100 times and 10,000 times the other function. In this case, there is no constant factor: 2^n is larger than 2^(n/2) by a factor of 2^(n/2), which becomes arbitrarily large as n grows.

But again, my remembrance of Big-O notation and Analysis of Algorithms is that it's not so much concerned with raw numbers (i.e. the actual count of operations), only growth, and both show exponential growth. Obviously, by looking at it, you can tell 2^n will grow much faster, but that's not the point.

It's like saying quicksort does its work in O(n log n), but in actuality it's probably more like (n log n) + c. The constant c just doesn't matter, because what you are showing is the growth rate, not the total number of comparisons at 'n' elements.
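
A quick Python illustration of why an additive constant washes out (hypothetical c = 1000, my numbers):

```python
import math

c = 1000
for n in (10, 1_000, 1_000_000):
    nlogn = n * math.log2(n)
    # the relative contribution of c shrinks toward zero as n grows
    print(n, (nlogn + c) / nlogn)
```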
 