
# Algorithmic Thinking - Complexity & Comparisons

Joe San
Ranch Hand
Posts: 10198
Does the book also talk about using different algorithmic techniques for a certain problem set, and go into enough depth to explain why a certain algorithm is best suited for certain tasks? Do you also explain time complexity in Big O notation?

Author
Posts: 34
Hi Joe,

Yes and yes

One motivational tool in the book is that I solve each example in several different ways, each solution better or more efficient than the last. Instead of starting with the best solution straight away, we go on a journey from problem to first attempt to second attempt to ultimate solution. I hope this approach helps readers appreciate what these algorithms and data structures are helping them do.

For example, in the Dynamic Programming chapter, we start by trying to solve an example without using Dynamic Programming, instead just using recursion. Spoiler: it doesn't work... recursion is not fast enough for these problems. But now we're in a position to understand why recursion isn't efficient enough, and why Dynamic Programming is so powerful.
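To illustrate the kind of jump Dan is describing (this sketch is mine, not an example from the book), here is plain recursion versus memoized dynamic programming on the classic Fibonacci problem:

```python
from functools import lru_cache

def fib_naive(n):
    # Plain recursion: the same subproblems are recomputed
    # exponentially many times, so this is hopeless for large n.
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    # Dynamic programming via memoization: each subproblem is
    # solved exactly once and cached, so this runs in linear time.
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(90))  # instant, while fib_naive(90) would never finish
```

The two functions are line-for-line identical except for the cache, which is exactly the point: once you've seen the naive version fail, the DP fix is a small, well-motivated change.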

We use this same approach to motivate data structures, too. For example, in one chapter, we solve a problem that requires us to find the minimum value in an array that keeps being updated. We can try solving it without the proper data structure, but that ends up being too slow. Now we're ready for heaps.
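As a rough Python illustration of that heap motivation (my sketch, not the book's example): rescanning a plain list for the minimum is O(n) per query, while a heap gives it back in O(log n) even as values keep arriving.

```python
import heapq

# Toy version of the "minimum under updates" pattern: values keep
# arriving, and we repeatedly need the current minimum.
heap = []
for value in [7, 3, 9]:
    heapq.heappush(heap, value)   # O(log n) per insert

smallest = heapq.heappop(heap)    # O(log n), vs O(n) to rescan a list
print(smallest)                    # prints 3
```

With thousands of interleaved updates and minimum queries, that per-operation difference is what turns a too-slow solution into a fast one.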

I use big O notation sometimes in the book, but for readers not familiar with it I included a big O crash course in an appendix.

Thank you,
Dan

Joe San
Ranch Hand
Posts: 10198
That's a great reply! Solving a problem with different approaches and finding out which one is efficient is a good way to learn and reason about algorithms. Thanks!

Marshal
Posts: 72672
I know it is less of a problem now that I can buy space for 84594592634937658327687429856934 bytes (well, ≥ 500GB anyway) for about the same price as a takeaway pizza, but do you say anything about space complexity? I remember being warned against recursion on small devices because it is expensive on space. Does that still apply?

Daniel Zingaro
Author
Posts: 34
Hi Campbell,

lol indeed. Aging myself I bet, but my family's first computer had like a 300MB hard drive in it. That's, what, enough storage for half a CD???

You're right about space complexity, especially with recursion. Even something like quicksort can crash by using too much stack space if it isn't implemented carefully. I do address space complexity in the book (e.g. replacing recursion with iteration, and dropping unused parts of arrays in dynamic programming).
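As a generic illustration (mine, not the book's code) of both tricks at once — replacing recursion with iteration, and dropping the parts of a DP array you no longer need — a Fibonacci computation only ever needs the last two values, not the whole table or a deep call stack:

```python
def fib_iter(n):
    # Iterative DP with a "rolling" pair of values: O(1) space,
    # no recursion, so no risk of blowing the call stack.
    prev, curr = 0, 1
    for _ in range(n):
        prev, curr = curr, prev + curr
    return prev

print(fib_iter(100000).bit_length())  # huge n, still no stack overflow
```

A recursive version at n = 100000 would exceed Python's default recursion limit; the iterative one uses constant extra space regardless of n.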

Thank you,
Dan

Campbell Ritchie
Marshal
Posts: 72672

Daniel Zingaro wrote:. . . enough storage for half a CD??? . . .

Left half or right half?

Joking aside, thank you for those answers and for answering all the interesting questions people are asking

Daniel Zingaro
Author
Posts: 34

Campbell Ritchie wrote:

Daniel Zingaro wrote:. . . enough storage for half a CD??? . . .

Left half or right half?

Joking aside, thank you for those answers and for answering all the interesting questions people are asking

I'm off topic, wow, but

... a memory just came to me. I remember trying to copy a few songs from my No Doubt CD back in the day, but they were taking up too much space on my hard drive. So I compressed the heck out of them, like down to mono voice quality, all scratchy and barely listenable. To this day when I think about the song Don't Speak I hear that muffled version in my head...

...

Thanks, everyone. Having fun here.
Dan

Sai Hegde
Posts: 229

Daniel Zingaro wrote:
... a memory just came to me. I remember trying to copy a few songs from my No Doubt CD back in the day, but they were taking up too much space on my hard drive. So I compressed the heck out of them, like down to mono voice quality, all scratchy and barely listenable. To this day when I think about the song Don't Speak I hear that muffled version in my head...

LOL... been there, done that.

Sai Hegde
Posts: 229
On the same topic... maybe more from an agile project management perspective, how do you go from the ideation of an algorithm to a reasonably optimal solution in a packed sprint?

Daniel Zingaro
Author
Posts: 34

Sai Hegde wrote:On the same topic... maybe more from an agile project management perspective, how do you go from the ideation of an algorithm to a reasonably optimal solution in a packed sprint?

Hmm, interesting question! I think that some (many? not all, for sure) problems, if read/understood from the perspective of an algorithm designer, have clues that tell you which algorithm to use. I have a section in most chapters to try to distill these clues for the reader.

As for algorithm implementation in a sprint: I'd go for the easiest implementation that uses the algorithm correctly. I don't do much code-level optimization in the book, and often you won't have to because these algorithms are super fast anyway.

What's that saying: don't optimize code unless it truly is a bottleneck? Something like that

Thank you,
Dan