Wed Apr 9 18:39:00 GMT 2003
On Wednesday, April 9, 2003, at 01:41 AM, Gerald Pfeifer wrote:
> On Wed, 9 Apr 2003, Andrew Pinski wrote:
>> But really the math is K*n*O(n) = O(n^2) for any K>0 aka you call the
>> function K*n times, and the function takes O(n) time. So really it is
>> still O(n^2).
> I was not talking about the specific algorithm here (and I agree with
> what you write above), I just disputed
>>>> If you take a linear operation (that is, O(n)), and then you perform
>>>> it K*n times for some fixed constant K, then the program is O(n*n),
>>>> which is quadratic.
> which is not correct (as far as big-O is concerned).
I thought what Geoff said was perfectly clear, but I believe you missed
what he meant. Read his statement in the sense he intended and the
claim holds:
If you take a linear algorithm (O(N)) and run it once (or any fixed
non-negative constant K times) per element -- that is, K*N runs of an
O(N) algorithm -- the resulting algorithm is in fact O(N^2).
If you take a linear algorithm (O(N)) and run it once (or any fixed
non-negative constant K times) over the entire dataset, the resulting
algorithm is in fact O(N).