March 25th, 2003, 11:45 PM

really basic question
I am starting analysis of algorithms for a class and I am having trouble with the basics. Could someone help me work through this simple problem? I'm not asking you to do it for me, but I am not sure where to start. I read my prof's book (which the problem is from), but its explanations are too hard for me to understand.
prove or disprove each of the following:
f ( n ) = O(( f ( n ))^2)
if the statement is true, prove it; if false, give an example that demonstrates it's false.
March 26th, 2003, 03:26 AM

f ( n ) = O(( f ( n ))^2)
To me it looks like algebra: just respect the basic rules of precedence, PEMDAS. Sounds too hard to be true, but then again, who is O?
BTW, what language is that?
Words must be weighed, not counted.
March 26th, 2003, 06:11 AM

well it is 100% algorithms (complexity, I think it's called), but the question is what f(n) is.
I mean the very same n (an array[n], for example) has a different complexity {O[f(n)]} for a different algorithm f.
e.g. bubble sort is O(n^2) and quicksort is O(n log n) on average.
but just f( )..... it doesn't say a thing to me. sorry, if you could be more specific.
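to make the bubble sort vs quicksort comparison concrete, here is a small Python sketch (just illustrative numbers, not a benchmark) showing how fast n^2 pulls away from n*log2(n), the average-case comparison counts for those two sorts:

```python
import math

# Compare the growth of ~n^2 (bubble sort) against ~n*log2(n)
# (quicksort, average case). The gap widens quickly as n grows.
for n in (10, 1000, 10**6):
    print(n, n**2, round(n * math.log2(n)))
```

by n = 10^6 the quadratic count is larger by several orders of magnitude, which is why the leading term is all that matters asymptotically.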
March 26th, 2003, 10:59 PM

that is bigO notation, referring to the asymptotic complexity of the algorithm. if i remember correctly, bigO is the worst case scenario, or upper bound, on how an algorithm will perform. basically, the performance of an algorithm (f(n)) will never be worse than its bigO, times a constant. with simple polynomial functions, the term with the highest power is almost always the bigO. this is b/c with extremely large values of n, that is the term that grows at the highest rate. so for the function n^2 + 2n + 50, the bigO is n^2.
but be careful with the question itself: f(n) = O((f(n))^2) only claims that (f(n))^2 is an *upper bound* for f(n), not that the two are equal, so pointing out that n^2 + 2n + 50 is not the same as its square doesn't settle it. this stuff should be in ur book, and its rather hard to explain over a computer.
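to see the dominant-term idea numerically, here is a quick Python sketch (a sanity check, not a proof; the function name f is just for illustration). it shows the ratio f(n) / n^2 staying below a fixed constant, which is exactly what f(n) = O(n^2) requires:

```python
# Sanity check (not a proof) that f(n) = n^2 + 2n + 50 is O(n^2):
# the ratio f(n) / n^2 stays below a fixed constant c for all n >= 1.
def f(n):
    return n**2 + 2*n + 50

# The ratio shrinks toward 1 as n grows; c = 53 (with k = 1) works,
# since f(1) = 53 = 53 * 1^2 is the worst case.
for n in (1, 10, 100, 10**6):
    print(n, f(n) / n**2)
```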
March 27th, 2003, 04:16 AM

This link should be helpful.
Extract from the link:
Formal Definition: f(n) = O(g(n)) means there are positive constants c and k, such that 0 <= f(n) <= cg(n) for all n >= k. The values of c and k must be fixed for the function f and must not depend on n.
This formal definition is what you should be looking at to answer your question.
notice the difference between your prof's question and the definition
The prof replaced g(n) by f(n)^2
so the real question is
Can this hold for all large n:
0 <= f(n) <= c( f(n)^2 ) for all n >= k.
is f(n) always <= c( f(n)^2 ) once n is large enough?
could it ever be possible that f(n) is greater than c( f(n)^2 ) for arbitrarily large n?
I hope I have put you in the right direction, without answering the question for you.
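one way to build intuition about that definition (without being told the answer) is to probe the inequality numerically. a small Python sketch, with all names mine and only finitely many n checked, so it suggests rather than proves:

```python
# Numeric probe of the formal definition: does 0 <= f(n) <= c*g(n)
# hold for every n from k up to some test limit? Checking finitely
# many n can only suggest an answer, never prove it.
def holds_up_to(f, g, c, k, n_max):
    """Check 0 <= f(n) <= c * g(n) for k <= n <= n_max."""
    return all(0 <= f(n) <= c * g(n) for n in range(k, n_max + 1))

# Try it with the prof's pattern g(n) = f(n)^2, here for f(n) = n:
f = lambda n: n
g = lambda n: n ** 2   # this is f(n)^2
print(holds_up_to(f, g, c=1, k=1, n_max=10_000))
```

try swapping in other choices of f and see which ones survive as c and n_max grow.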
April 16th, 2003, 09:38 AM

just disprove by example
let f(n) = 1/n
then [ f(n) ]^2 = 1/(n^2)
for f(n) = O( [ f(n) ]^2 ) the definition requires constants c and k with
1/n <= c * ( 1/(n^2) ) for all n >= k
multiplying both sides by n^2, that says n <= c for all n >= k, which fails for any fixed c
hence
f ( n ) != O(( f ( n ))^2)
(note: something like f(n) = n^2 + 1 does NOT disprove it, because n^2 + 1 <= (n^2 + 1)^2 for all n, so there f(n) really is O( [ f(n) ]^2 ). you need a function whose values drop below 1.)
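to see the counterexample numerically (a Python sketch, names are mine): whatever constant c you pick, the required inequality for f(n) = 1/n breaks as soon as n exceeds c.

```python
# For f(n) = 1/n, the definition would need 1/n <= c * (1/n**2),
# i.e. n <= c, for ALL n beyond some k. No fixed c can do that.
def inequality_holds(n, c):
    f = 1.0 / n
    return f <= c * f**2   # equivalent to n <= c

c = 100
print(inequality_holds(50, c))    # True:  50 <= 100
print(inequality_holds(200, c))   # False: 200 > 100, the bound breaks
```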