I'm going through some old past papers for my data structures class, and many questions come up along the lines of: "Name and describe a sorting algorithm that has average time complexity O(n log2 n). Show how this algorithm would start sorting the above array (three steps will do)." I've been looking at Wikipedia's list: http://en.wikipedia.org/wiki/Sorting_algorithm#List_of_sorting_algorithms

Shell sort is the only one with O(n log2 n), but that's under 'worst'. I'm a bit confused here. Also, Wikipedia has a 'Memory' column; would that be the best-case time complexity for an algorithm?

Shell sort is the only one with O(n log2 n) but that's under 'worst'.

O(n log2 n) has a similar complexity to O(n log10 n) (just like O(n^2) is considered similar to O(n^3)), so I would consider Heapsort and Quicksort to be in the same league as Shellsort.

also on wikipedia it says 'Memory' would that be the best case time complexity for an algorithm?

No. Time complexity and memory complexity are two different things. When one talks about complexity, time complexity is almost exclusively what's being referred to (the Average and Worst columns in that table). And "average" is the more important metric; "worst" (like "best") rarely occurs, so it generally is just a footnote.

You can think of it like this: Memory is cheap, so if the algorithm needs more of it, you can just buy more. But you can't buy time, so that's what you are concerned with.

Gary Goldsmith
Ranch Hand

Joined: Mar 06, 2007
Posts: 30

So is there a reason why the average column for bubble sort, cocktail sort, and shell sort is empty? Or are they just constant regardless of how much data they have to deal with?

Could someone go over these questions with me?

1. In this question refer to the following array of integers to illustrate your answers: 10 11 6 5 14 18 9 8 4 13 2

a) Name and describe a sorting algorithm that has average time complexity O(n^2). Show how this algorithm would start sorting the above array (four steps will do). [7 marks] So I would put down Selection sort here.
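To show the "four steps" for part a), here is a minimal Java sketch of selection sort (my own code, not from the question) run on the array from the paper, with the state after each of the first four passes noted in comments:

```java
import java.util.Arrays;

public class SelectionSortDemo {
    // Selection sort: on each pass, find the smallest remaining element and
    // swap it into the next position. Average (and worst) case is O(n^2)
    // because each pass scans the whole unsorted tail.
    static int[] selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++) {
                if (a[j] < a[min]) min = j;
            }
            int tmp = a[i]; a[i] = a[min]; a[min] = tmp;
        }
        return a;
    }

    public static void main(String[] args) {
        int[] data = {10, 11, 6, 5, 14, 18, 9, 8, 4, 13, 2};
        // Pass 1 swaps 2 to the front:    2 11 6 5 14 18 9 8 4 13 10
        // Pass 2 swaps 4 into position 1: 2 4 6 5 14 18 9 8 11 13 10
        // Pass 3 swaps 5 into position 2: 2 4 5 6 14 18 9 8 11 13 10
        // Pass 4 finds 6 already in position 3, so nothing moves
        System.out.println(Arrays.toString(selectionSort(data)));
    }
}
```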

b) What is the best case time complexity of the algorithm named above? When is it achieved and why? [3] Would the answer be O(n^2), because it goes through the list one element at a time?

c) Name and describe a sorting algorithm that has average time complexity O(n log2 n). Show how this algorithm would start sorting the above array (three steps will do.) [7] Not sure

d) What is the worst case time complexity of the algorithm named in c)? When is it achieved and why? [3] Not sure

e) Describe an O(n) sort. Under what circumstances would you be likely to use this sort? Would a BinaryTree sort be a fair answer?

If I go through my past papers and put the answers online, would anyone be willing to look them over to see if they are correct?

Ulf Dittmer
Rancher

Joined: Mar 22, 2005
Posts: 42954

b) What is the best case time complexity of the algorithm named above? When is it achieved and why? Would the answer be O(n^2), because it goes through the list one element at a time?

No, selection sort has a best-case complexity of O(n^2). Insertion sort has a best-case of O(n).
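The difference is easy to see by counting comparisons. A quick Java sketch (my own, not from the thread): on already-sorted input, insertion sort's inner loop exits after one comparison per element, so the best case is O(n); on reverse-sorted input it does all n(n-1)/2 comparisons.

```java
public class InsertionSortDemo {
    static long comparisons;

    // Insertion sort. Each element is shifted left past larger neighbours
    // until it finds its place; on sorted input no shifting is needed.
    static void insertionSort(int[] a) {
        comparisons = 0;
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            while (j >= 0) {
                comparisons++;
                if (a[j] <= key) break;  // already in place: stop shifting
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        insertionSort(new int[]{1, 2, 3, 4, 5, 6, 7, 8, 9, 10});
        System.out.println(comparisons);  // 9: one per element, i.e. O(n)

        insertionSort(new int[]{10, 9, 8, 7, 6, 5, 4, 3, 2, 1});
        System.out.println(comparisons);  // 45 = 10*9/2, i.e. O(n^2)
    }
}
```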

c) Name and describe a sorting algorithm that has average time complexity O(n log2 n).

As mentioned above, IMO heap, shell and quick sort all apply because they are O(n log n).

d) What is the worst case time complexity of the algorithm named in c)? When is it achieved and why?

Quicksort would be a good candidate, as it can degenerate into an O(n^2) algorithm.
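To see the degeneration concretely, here is a rough Java sketch (my own, not from the thread) of a naive quicksort that always picks the first element as pivot. On already-sorted input every partition is maximally unbalanced, and the comparison count hits the full n(n-1)/2:

```java
public class QuicksortDemo {
    static long comparisons;

    // Naive quicksort (Lomuto-style partition, first element as pivot).
    // Expected cost on random input is O(n log n); on presorted input the
    // pivot is always the minimum, so one side of each partition is empty
    // and the cost degenerates to O(n^2).
    static void quicksort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        int pivot = a[lo];
        int i = lo;
        for (int j = lo + 1; j <= hi; j++) {
            comparisons++;
            if (a[j] < pivot) {
                i++;
                int t = a[i]; a[i] = a[j]; a[j] = t;
            }
        }
        int t = a[lo]; a[lo] = a[i]; a[i] = t;  // pivot into final position
        quicksort(a, lo, i - 1);
        quicksort(a, i + 1, hi);
    }

    public static void main(String[] args) {
        int n = 1000;
        int[] sorted = new int[n];
        for (int k = 0; k < n; k++) sorted[k] = k;
        comparisons = 0;
        quicksort(sorted, 0, n - 1);
        System.out.println(comparisons);  // 499500 = n(n-1)/2, i.e. O(n^2)
    }
}
```

Real-world quicksorts avoid this by choosing the pivot randomly or via median-of-three.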

e) Describe a O(n) sort. Under what circumstances would you be likely to use this sort? Would a BinaryTree sort be a fair answer?

Not sure what you mean by that (heap sort?), but that's not O(n).
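For part e), one standard answer (the thread doesn't settle on one, so this is my suggestion) is counting sort: it runs in O(n + k) time when the keys are integers in a known range [0, k], which is O(n) when k is proportional to n. A rough sketch:

```java
import java.util.Arrays;

public class CountingSortDemo {
    // Counting sort: tally how many times each key occurs, then rebuild the
    // array by emitting the keys in order. No comparisons at all, so the
    // O(n log n) lower bound for comparison sorts doesn't apply.
    static int[] countingSort(int[] a, int max) {
        int[] count = new int[max + 1];
        for (int x : a) count[x]++;           // one pass to tally: O(n)
        int[] out = new int[a.length];
        int pos = 0;
        for (int v = 0; v <= max; v++)        // one pass over the range: O(k)
            for (int c = 0; c < count[v]; c++)
                out[pos++] = v;
        return out;
    }

    public static void main(String[] args) {
        int[] data = {10, 11, 6, 5, 14, 18, 9, 8, 4, 13, 2};
        System.out.println(Arrays.toString(countingSort(data, 18)));
    }
}
```

You'd use it when the keys are small integers (exam grades, ages, bytes), not for arbitrary comparable objects.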

Chapter 2 of Niklaus Wirth's freely available classic "Algorithms and Data Structures" is a good introduction to sorting and its analysis.

Nearly all "good" sort algorithms are classified as O(n log n) (it actually makes zero difference whether it's ln (base e) or log (base 10), since the difference is just a constant, and that falls out as the C in the Big-O definition).

The reason there are so many is: 1) back in the 1950s, you could get a paper published with a new one, and 2) while Big-O tells you some things, it doesn't tell you everything.

As mentioned upthread, it really depends on whether you are talking about 'typical' input data, and whether you mean the best, worst, or average case.

Some sorts are O(n log n) for random data, but O(n^2) for presorted data.

As long as n is small, you can ignore the difference. But if you are feeding in a big bank's nightly transactions, and the data is mostly sorted from the previous night's processing, it makes a huge difference.

[Ulf]: O(n log2 n) has a similar complexity to O(n log10 n) (just like O(n^2) is considered similar to O(n^3))

I disagree - O(n^2) and O(n^3) are different orders, while O(n log2 n) and O(n log10 n) are the same order. The difference between the two logs is just a constant factor which is ignorable in big O notation. The difference between n^2 and n^3 is not a constant, and not ignorable under most circumstances.
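Jim's point is easy to check numerically; a quick sketch (my own, not from the thread):

```java
public class LogBaseDemo {
    // log_2(n) / log_10(n) equals the constant log_2(10) ~ 3.3219 for every n,
    // so O(n log_2 n) and O(n log_10 n) are the same order. By contrast,
    // n^3 / n^2 = n grows without bound, so O(n^2) and O(n^3) are not.
    static double log2(double x) { return Math.log(x) / Math.log(2); }

    public static void main(String[] args) {
        for (double n : new double[]{100, 10_000, 1_000_000}) {
            System.out.println(log2(n) / Math.log10(n));  // same value each time
        }
    }
}
```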

"I'm not back." - Bill Harding, Twister

Ulf Dittmer

Jim is right to disagree; I was simplifying things a bit, and hand-waved away the simplification by using words like "similar" instead of "equal".

Gary Goldsmith

So what would be the correct answers to those questions, in your opinion?

Originally posted by Gary Goldsmith: So what be the correct answers to those questions in your opinion?

Do you have a real-world data set or computational goal? There are plenty of algorithms, but few details in your questions (most of them sound like an academic assignment in a two-year program). It is not the answers you gave (to what sounds like an undergrad intro to CS) but the challenge you are facing today that prompts you to investigate; that challenge will carry enough information that you'll have a crowd of experts competing for your accolades. Describe the challenge.

"The differential equations that describe dynamic interactions of power generators are similar to that of the gravitational interplay among celestial bodies, which is chaotic in nature."