So if quantum computers are on the way, should beginner programmers already keep that in mind and do something differently in their learning? I mean, are there any programming languages, strategies, or technologies that will soon become "too old" to start learning now?
Welcome to the Ranch
The basics of imperative programming have been the same since the 1950s. New techniques such as structured programming, functional programming, and object-oriented programming have been added since then (in fact, Lisp introduced functional programming as early as 1958). So I would think all programmers will still have to learn the same basics. I can't see any programming paradigm disappearing.
Hi Stella! Thank you so much for posting your question.
Programming Languages: Regarding programming languages, none of them will become "too old" because of the emergence of QPUs (quantum processing units). Whether you're reading this reply on a phone or a computer, your machine has a GPU (graphics processing unit) handling graphics functions. That GPU has a special instruction set the CPU doesn't understand, but when you write programs you can still use all of the existing languages. The same is true of a QPU: when you need to access the special (QPU or GPU) functions, you call a library, which uses drivers made by the device manufacturer.
Do something differently: For quantum computing, the new things to learn involve identifying what kinds of problems are suited for QC, and how to implement solutions for those. That's what our book covers. Like GPUs, the new functions are really interesting, but they don't stop anyone from learning the essentials of programming, and being able to make their own software.
Sort Algorithms: This is a really interesting topic. There is a great deal of recent research on using QPUs to sort information, including this Cardinal, Joret, Roland paper just published last February. However, many of the key QPU algorithms and functions involve solving problems (search and others) without the need to sort at all. While a CPU might spend time shuffling items in a list to learn which three are at the top, a QPU can run its program and then retrieve the top results more quickly. Although which result is returned is random, it's weighted by probability, so the closer an item would have been to the top of the sort, the more likely it is to come out as the answer. In the QC community, it's common practice to run a program more than once to produce a selection of the most probable results.
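To make the "weighted by probability, so run it more than once" idea concrete, here's a toy classical simulation of Grover's search in plain Python. This is not real QPU code, just math on a list of amplitudes (the 8-item search space and the marked index 5 are arbitrary choices for illustration): the oracle flips the sign of the amplitude of the item we're searching for, the diffusion step reflects every amplitude about the mean, and after a few iterations almost all of the measurement probability sits on the right answer. The repeated-sampling step at the end mimics running the quantum program many times and keeping the most frequent outcome.

```python
import math
import random
from collections import Counter

def grover_probabilities(n_items, marked):
    """Classically simulate Grover's search over n_items entries.

    Returns the measurement probability of each index after the
    standard ~(pi/4)*sqrt(N) amplitude-amplification iterations.
    """
    amp = [1 / math.sqrt(n_items)] * n_items          # uniform superposition
    iterations = int(math.pi / 4 * math.sqrt(n_items))
    for _ in range(iterations):
        amp[marked] = -amp[marked]                    # oracle: flip marked amplitude
        mean = sum(amp) / n_items
        amp = [2 * mean - a for a in amp]             # diffusion: invert about the mean
    return [a * a for a in amp]                       # Born rule: prob = amplitude^2

# "Run the program" 1000 times: each shot is a weighted random draw,
# then we keep the most frequent outcome, as described above.
probs = grover_probabilities(8, marked=5)
shots = Counter(random.choices(range(8), weights=probs, k=1000))
most_common_index = shots.most_common(1)[0][0]
```

For 8 items, two Grover iterations already put roughly 95% of the probability on the marked index, so the histogram of shots is dominated by the right answer even though any single run is random.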
In the book, lots of information and hands-on code can be found in chapters 6 and 10, including really good ready-to-run code samples such as this one.
Regarding BogoSort (a diabolically random sorting algorithm), I don't think that'll be better than using one of the QPU algorithms designed to make sorting unnecessary, but I'm always ready to be proven wrong. :]
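For anyone who hasn't met BogoSort, it really is as random as it sounds: shuffle the list and check whether it happens to be sorted, repeating until it is. A minimal Python sketch:

```python
import random

def is_sorted(items):
    """True if the list is in non-decreasing order."""
    return all(items[i] <= items[i + 1] for i in range(len(items) - 1))

def bogosort(items):
    """Shuffle until sorted -- expected O(n * n!) time, so only ever
    run this on tiny lists (and never in production)."""
    items = list(items)                # don't mutate the caller's list
    while not is_sorted(items):
        random.shuffle(items)
    return items
```

It terminates with probability 1, but the expected number of shuffles grows factorially, which is why it's a running joke rather than an algorithm.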