Coffeehouse Thread

26 posts

The Big-O notation.

  • magicalclick

    Certainly we don't want an explosion of computation. But I also feel like Big-O notation is a bit skewed. I mean, O(100n) = O(n). So we just don't care about the constant anymore? O(n) sounds like one minute, but in reality it is 100n, which is 100 minutes. That's more than an hour and a half. Expressionless

     

    And often those are used in theoretical analysis, which is kind of scary.

     

    Leaving WM on 5/2018 if no apps, no dedicated billboards where I drive, no Store name.
  • Sven Groot

    Big-O notation isn't about how long something takes, it's about how well it scales.

     

    O(n) simply means that the execution time is linear in n. If with n=10 elements it takes 5 seconds, then with n=20 elements it takes 10 seconds. Increase the number of elements linearly, and the time increases linearly too.

     

    Therefore, it doesn't matter if it's really O(100n), because that still denotes a linear progression: even at O(100n), twice the number of elements still takes twice as long.

     

    Big-O doesn't tell you how long one "n" takes. It simply tells you what happens to the execution time (or whatever else you're measuring, like memory) when you change n. Multiplying by a constant has no bearing on that, so it's not important.
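
    To make that concrete, here is a rough sketch (Python, purely illustrative and not from the thread): whatever the constant factor, doubling n roughly doubles the running time, which is all that "O(n)" claims. Timings will of course be noisy for small n.

```python
import timeit

def work(n, reps):
    # Simulates an algorithm that does reps * n basic steps;
    # 'reps' plays the role of the hidden constant factor.
    total = 0
    for _ in range(reps):
        for i in range(n):
            total += i
    return total

for reps in (1, 100):
    t1 = timeit.timeit(lambda: work(10_000, reps), number=10)
    t2 = timeit.timeit(lambda: work(20_000, reps), number=10)
    # Both the cheap (reps=1) and the expensive (reps=100) version
    # should roughly double in time when n doubles.
    print(f"constant factor {reps:>3}: doubling n multiplies time by ~{t2 / t1:.2f}")
```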

  • W3bbo

    Sven Groot said:
    *snip*

    Can Big-O be used to express the complexity of every function? So far all the course material we've worked with has been fairly contrived stuff: processing arrays, lists, that sort of thing. Aren't there (non-deterministic?) algorithms whose behaviour isn't so predictable, and which would need a special notation, if one exists?

  • JoshRoss

    I'm not sure where you are going with this. Were you looking for a more comprehensive way to express the cost of execution?

  • Sven Groot

    W3bbo said:
    *snip*

    Big-O isn't a silver bullet. It probably won't work with non-deterministic algorithms, but even with deterministic algorithms it doesn't tell you everything. Quicksort, for example, is O(n log n) on average, which is good, but that doesn't tell you it has an O(n²) worst case (it is for that reason that heap sort is preferred under some circumstances).
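
    To see that worst case in action, here is a quick, throwaway sketch (not from the thread) of a naive quicksort that always picks the first element as pivot and just counts comparisons; on already-sorted input the count blows up to roughly n²/2.

```python
import random

def quicksort_comparisons(items):
    # Naive quicksort (first element as pivot), iterative so we don't
    # hit Python's recursion limit; returns only the comparison count.
    comparisons = 0
    stack = [list(items)]
    while stack:
        segment = stack.pop()
        if len(segment) <= 1:
            continue
        pivot, rest = segment[0], segment[1:]
        comparisons += len(rest)  # one comparison per remaining element
        stack.append([x for x in rest if x < pivot])
        stack.append([x for x in rest if x >= pivot])
    return comparisons

n = 2000
print("random input:", quicksort_comparisons(random.sample(range(n), n)))  # ~ n log n
print("sorted input:", quicksort_comparisons(list(range(n))))              # ~ n^2 / 2
```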

  • magicalclick

    W3bbo said:
    *snip*

    This is why you have upper-bound, lower-bound, and average-case analysis. And the average case is rather abstract, because you have to somehow formalize the input distribution correctly.

     

    I am not a big believer in those myself. Sure, I can use some of those mathematical values to characterize my program, but definitely not for in-depth analysis. I believe some people focus on analysis so much that it even biases their development. I often fear people go for one route of development because it is easier to analyze, which is not a good direction IMO.

     

    Leaving WM on 5/2018 if no apps, no dedicated billboards where I drive, no Store name.
  • Bass

    Tell me, if n is "close to infinity", which algorithm do you think would generally be faster: an O(100*n) one or an O(n lg n / 100) one?

     

    Big-O notation concerns itself with how an algorithm performs in the limit of n. When "n" is really large, these constants become increasingly meaningless for comparison purposes. For low values of "n", you probably shouldn't rely on Big-O notation to determine the efficiency of an algorithm.
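
    Just to see where those two cost functions actually cross over, a small sketch (Python, not from the thread), taking n = 2**k so that lg n is exactly k. The O(100*n) algorithm does win in the limit, but only once n passes 2**10000.

```python
# Exact integer comparison of the two cost functions, with n = 2**k so lg n = k.
for k in (10, 1_000, 9_999, 10_000, 10_001, 20_000):
    n = 2 ** k
    linear = 100 * n        # cost of the O(100*n) algorithm
    nlogn = n * k // 100    # cost of the O(n lg n / 100) algorithm
    if linear < nlogn:
        cheaper = "100*n"
    elif nlogn < linear:
        cheaper = "n lg n / 100"
    else:
        cheaper = "tie"
    print(f"n = 2**{k}: cheaper is {cheaper}")
```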

  • magicalclick

    Bass said:
    *snip*

    I thought the opposite: the N is large, thus the explosion of computation is scary. My concern is, O(C) = O(1). But what if C is "close to infinity"? I see a lot of people just throw the Big-O around, but they don't give a second thought to the crazy C they just gladly ignored.

     

    Or my original concern: in order to make the analysis easier, they dumb down their algorithm. Sad

    Leaving WM on 5/2018 if no apps, no dedicated billboards where I drive, no Store name.
  • Sven Groot

    magicalclick said:
    *snip*

    Because that's not what Big-O is measuring. O(1) tells you that no matter the number of items, it'll be the same. It doesn't tell you how long it'll be, just that it'll always be the same. If you want to know how long it'll actually take, rather than just the order, don't use Big-O.

  • magicalclick

    Sven Groot said:
    *snip*

    A lot of times you are required to use ONLY Big-O. That's why I am talking about this.

    Leaving WM on 5/2018 if no apps, no dedicated billboards where I drive, no Store name.
  • Sven Groot

    magicalclick said:
    *snip*

    Required? By whom?

     

    And Bass is right: assuming large n, the constants matter less. Let's assume n=1,000,000,000. That means with O(100n) it'd be 100,000,000,000. But with O(n²), it's 1,000,000,000,000,000,000. With O(2ⁿ) or O(n!), the numbers are so large that Windows Calculator can't represent them anymore. Compared to that, a constant factor of 100 is peanuts.

     

    And optimization (of the implementation, without changing the algorithm) can reduce the constants. But you can never optimize it so that it improves the order without changing the algorithm. I can easily write two versions of quicksort where one is twice as fast as the other, but they'd both still be O(n log n). And I can never do better than O(n log n) while still using quicksort. That's what Big-O is telling you. I can run it on a 16 MHz 286 or on a 3 GHz Core i7, and even though it'd run much faster on the Core i7, it's still O(n log n).
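
    To put rough numbers on the n = 1,000,000,000 example above (a throwaway Python sketch; the digit counts for 2ⁿ and n! are estimated with logarithms because the values themselves are far too large to print):

```python
import math

n = 1_000_000_000  # the n from the example above
print(f"100*n = {100 * n:,}")
print(f"n**2  = {n * n:,}")
# 2**n and n! are astronomically large, so just estimate their digit counts.
print(f"2**n  has about {int(n * math.log10(2)) + 1:,} digits")
print(f"n!    has about {int(math.lgamma(n + 1) / math.log(10)) + 1:,} digits")
```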

  • magicalclick

    Sven Groot said:
    *snip*

    Well, so far my AI master's course requires Big-O for space and time complexity ONLY. Obviously it is not just some simple sorting analysis or general program analysis.

    Leaving WM on 5/2018 if no apps, no dedicated billboards where I drive, no Store name.
  • Sven Groot

    magicalclick said:
    *snip*

    Let's put it this way. Big-O tells you something fundamental about the algorithm, not its implementation. The constants you're thinking of are typically tied to the implementation.

     

    I've been working on an FP-Growth implementation. I started out with a simple reference version that did everything in the most naive way possible. After optimization it uses about 10 times less memory and is 100 times faster. But it's still the same algorithm, and it still has the same time complexity, and without modifying the algorithm itself I can never improve on that complexity. That's the value of Big-O.

     

    Your AI master's course, if it's a science course, is probably more interested in the fundamental limits of the algorithms themselves rather than in implementation efficiency.

     

    The constants you can improve on, by improving the implementation, or just by throwing more hardware at it. The complexity you can only improve on by inventing a better algorithm. That's what Big-O is for.

  • magicalclick

    Sven Groot said:
    *snip*

    I am not talking about what Big-O is for. I have been saying we don't want an explosion of computation from the beginning.

    Leaving WM on 5/2018 if no apps, no dedicated billboards where I drive, no Store name.
  • kriskdf

    magicalclick said:
    *snip*

    In real life, Big-O is most important for design and architecture review. The C is important during code reviews, development, and testing. If you can't get the right algorithm to scale for your problem, you are doomed. Once you have the right algorithm, you should use a profiler and measure the best way to reduce the cost of "C". For school, it seems like the right thing to focus on is Big-O, since the amount of time spent on performance tuning is really about finding a good balance between meeting real-world requirements and managing the cost of development. In school, they don't tend to focus on the best compiler settings for performance, measuring unit test coverage, doing threat models, running PreFast, or other such tasks, so in most cases they also don't focus on other optimizations that are constant time.

     

    BTW, the constant time is sometimes considered upfront in real-world applications if it is obviously going to be a problem. For example, for each N, do you need to go to memory, disk, or network? Knowing these things can change an algorithm's design (maybe do work in batches of 1K N instead of individual N, especially if N is going to be very big).
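
    Roughly what that batching idea looks like in code (an illustrative sketch only; fetch_one and fetch_batch are hypothetical placeholders for whatever per-item I/O is actually involved):

```python
def process_individually(keys, fetch_one):
    # One round trip per key: total cost is about N * (round_trip + per_item_work).
    return [fetch_one(key) for key in keys]

def process_in_batches(keys, fetch_batch, batch_size=1000):
    # One round trip per batch of 1K keys: still O(N) items processed,
    # but only ~N/1000 round trips, so the constant in front of N shrinks.
    results = []
    for start in range(0, len(keys), batch_size):
        results.extend(fetch_batch(keys[start:start + batch_size]))
    return results
```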

  • AndyC

    kriskdf said:
    *snip*

    Big-O is great in 'theoretical' terms, but in practical terms 'N' is always bounded, and as such Big-O isn't necessarily a good indicator for real-world comparisons of two algorithms or data structures, since the effect of the constants dropped from a Big-O calculation can be very significant. Practical things like page faults and the non-linear speed of memory access (due to layers of caching) can be, and generally are, far more significant in determining how something scales within realistic bounds.

  • kriskdf

    AndyC said:
    *snip*

    Well... I guess if you make N a constant by giving it a practical bound, you can decide which constant has the biggest impact on overall performance. Smiley  I work on Bing, and N can be very big and continues to grow. Because N is so big, the Big-O is important, as is the constant. Spending 1 second on a web doc has a huge impact compared to 500 ms. Going from O(N) to O(N log N) is huge if N is bounded at 100B or so. How much would you need to reduce the constant time to make it worth sticking with O(N log N) over O(N)? How about at a trillion docs?

     

    So I guess the conclusion is that both Big-O and the constant are important, depending on the situation. Smiley
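
    To put rough numbers on that question (a back-of-the-envelope sketch, assuming a base-2 log and that everything else about the two algorithms is equal):

```python
import math

# How much smaller would the constant on an O(N log N) algorithm need to be
# just to break even with an O(N) one at web-scale N?
for n in (100_000_000_000, 1_000_000_000_000):  # 100 billion and 1 trillion docs
    factor = math.log2(n)
    print(f"N = {n:,}: O(N log N) does ~{factor:.1f}x more work per item, "
          f"so its constant must be ~{factor:.0f}x smaller to break even")
```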

  • magicalclick

    kriskdf said:
    *snip*

    Yeah, I think both are really important. But often people get obsessed with Big-O and ignore the C. And they often claim they are faster when they are just playing number tricks.

     

    And a lot of times, getting the Big-O is hard. And I fear some people are so obsessed with Big-O that they develop their solution based on how easy it is to analyze. I believe the better way is to develop ideas about the solution first and then do the analysis on that. But from the school trend, it feels like the analysis comes up front and people find a solution that's easier to analyze instead.

     

    Leaving WM on 5/2018 if no apps, no dedicated billboards where I drive, no Store name.