> notjustjay said: they might be seen as the same complexity.

that was my first thought, too. but then i started wondering why anyone would bother making the distinction. if they did, perhaps it is important.

telecomm said:
Ugh... memories of computer science classes come flooding back... After a quick review of the Wikipedia entry on big-O notation (see the formal definition), a function f is O(2^n) iff it is O(2^(n/2)). The function appearing in the O notation doesn't indicate anything about the actual number of computations required by a function; it only says that (in the first case above) "f is bounded above by some constant M times 2^n when the input values n of f are greater than some number a." Recasting this claim in terms of 2^(n/2), you can prove the two equivalent.

zimv20 said:
> telecomm said: Recasting this claim in terms of 2^(n/2), you can prove the two equivalent.

it would have taken me infinity minus 20 years to figure that out ;-)

gnasher729 said:
Your conclusion is not correct. Two functions are considered to have the same growth if there is only a constant factor between them, for example if one function is always between 100 times and 10,000 times the other function. In this case there is no constant factor: 2^n is larger than 2^(n/2) by a factor of 2^(n/2), which becomes arbitrarily large as n grows.

superbovine said:
> gnasher729 said: Your conclusion is not correct...

the TA who graded my final, which was all proofs, wrote on the last question: "Nice try".
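gnasher729's point is easy to check numerically. Here is a minimal sketch (Python is my choice here, not something from the thread): if 2^n were O(2^(n/2)), the ratio 2^n / 2^(n/2) would have to stay below some fixed constant M, but it equals 2^(n/2) and keeps growing.

```python
# Numeric check of gnasher729's point: if 2^n were O(2^(n/2)),
# the ratio 2^n / 2^(n/2) would stay below some constant M.
# Instead the ratio equals 2^(n/2), which grows without bound.
for n in (10, 20, 40, 60):
    ratio = 2**n // 2**(n // 2)  # = 2^(n/2) for even n
    print(f"n={n:2d}  2^n / 2^(n/2) = {ratio}")
# prints 32, 1024, 1048576, 1073741824 -- no constant bounds this,
# so 2^n and 2^(n/2) are genuinely different growth rates.
```

So the two functions fail the "same up to a constant factor" test that defines being in the same big-O class of each other.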