Shmuelpro wrote: oh common can anyone at least tell me that they know what it is
I think we most often refer to "Big-O notation" when discussing the complexity of algorithms... is that where you're going with this?
There are several flavors of algorithmic analysis. Big O is the most common, and it's often quite intuitive: IIRC, O gives an upper bound on the time/resources that will ever be spent. There are other symbols for things like the lower bound ("minimum possible resource expenditure") and the average case. Average runtime analysis is hard, but needed.
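To make the upper/lower bound distinction concrete, here is a small sketch (my own illustration, not from the thread) using linear search: the best case finds the target immediately (a lower bound of 1 comparison), while the worst case scans the whole list (an upper bound of n comparisons).

```python
def linear_search(items, target):
    """Scan left to right; return (index, comparisons made), or (-1, n) if absent."""
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

data = [7, 3, 9, 1, 5]
# Best case: target is the first element -> 1 comparison (the lower bound)
print(linear_search(data, 7))   # (0, 1)
# Worst case: target absent -> n comparisons (the O(n) upper bound)
print(linear_search(data, 42))  # (-1, 5)
```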
Quicksort is an O(n^2) algorithm in the worst case, like bubble sort. That is, in the worst case the runtime of the algorithm is proportional to the square of the length of the input (the number of items to be sorted).
If you do a probabilistic analysis of the algorithm, however, you will find the expected run time to be O(n log n), which is a theoretical lower bound for comparison-based sorting (bucket sort is in theory faster, as it sorts without comparing elements). QS only ever hits the worst-case time when, while splitting the array to be sorted, the choice of pivot repeatedly peels off only a constant number of elements, such as:
[2 elem] [pivot elem] [rest of elem]
However, bad luck like this is extremely unlikely, and as long as you're getting splits like this:
[x % of elems] [pivot elem] [rest of elem]
then the expected running time will be O(n log n).
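A minimal sketch of the idea (my own, not from the thread): picking the pivot at random makes the bad, unbalanced splits described above very unlikely, giving the expected O(n log n) behavior. By contrast, always picking, say, the first element as pivot would produce exactly those degenerate splits on already-sorted input, and hence O(n^2).

```python
import random

def quicksort(items):
    """Quicksort with a random pivot; expected O(n log n) comparisons."""
    if len(items) <= 1:
        return items
    pivot = random.choice(items)          # random pivot -> balanced splits w.h.p.
    less    = [x for x in items if x < pivot]
    equal   = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```

This version copies sublists for clarity; a production quicksort partitions in place, but the analysis of the splits is the same.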
I don't know what Wikipedia says on the topic, but the page on QS is awfully long: http://en.wikipedia.org/wiki/Quicksort
A pretty standard book on the analysis of algorithms and data structures is "Introduction to Algorithms" by Cormen et al., from MIT Press.