People don't change how they program to accommodate new computers; they change how the new computer behaves to make it act like an old one, or use new programming languages so they can interact with it the same way as before. (People can write Visual Basic in any language, and if we get a quantum computer, they'll write Visual Basic on that too.)
Look at the past ten years of computing development: your data no longer travels over bus lines through the northbridge, it goes over a QPI link. You no longer have plain RAM, you have flash RAM. You no longer have a single-platter disk, you have a multi-platter magnetic RAID array that uses striping to improve performance and redundancy. And you don't have one CPU, you have four, all connected along a hybrid bus on the outside of the CPU. Has any of this changed how we program? Not one jot.
If and when quantum computers come along, they'll be used for specialist applications like encryption, sure. But if they ever reach the desktop (and that's a long way off), then Microsoft and Linux will write an OS that sits on top, treats the "quantum" as nothing more than a speed improvement over other computers, and throws the rest of the magic physics away, because programmers don't care about physics; they care about getting their product to the maximum number of users with the minimum amount of effort.
Although in some respects abstractions will be made to hide certain kinds of complexity, in the big scheme of things I don't agree, for a few reasons. First, we are already changing the way we program for multiple CPUs and distributed systems; I'm certainly doing so for the Azure project I'm on. It uses the same languages, but the design is very different. Parallel and asynchronous programming idioms like C# Async, the TPL, and Rx provide an abstraction to make things simpler, but the result is still quite different and leads to a different design. Rx code looks very little like "ordinary" C#.
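As a rough analogue in Python rather than C# (asyncio standing in for C# Async; the function names below are made up for illustration), the asynchronous idiom restructures what would otherwise be a sequence of blocking calls into concurrently awaited ones, which is exactly the kind of design shift being described:

```python
import asyncio

async def fetch(name, delay):
    # Stand-in for an I/O-bound call (a web request, a queue read, etc.).
    await asyncio.sleep(delay)
    return name

async def main():
    # Both "requests" run concurrently; total time is roughly
    # max(delay), not the sum, and the code is structured around
    # awaiting futures rather than calling functions in order.
    results = await asyncio.gather(fetch("a", 0.01), fetch("b", 0.01))
    return results

print(asyncio.run(main()))  # ['a', 'b']
```

The same two calls written synchronously would look like ordinary sequential code; the async version forces the coordination points (`await`, `gather`) into the design itself.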
Also, the fundamental algorithms for solving problems on a quantum computer are drastically different from classical algorithms. They are so different that simply understanding them requires at least a working knowledge of quantum mechanics. There may be a group of programmers who never get into that depth, and they may be the majority, but there will still be a lot who must, just as today there are still a lot of programmers who need to program close to the machine (C++ is still very much alive). So, as part of a decent computer science education, students will at least have to learn the basics, just as I had to learn all about memory management even though I rarely work on code that manages its own memory any more.
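To make the "drastically different" point concrete, here is a minimal sketch of Deutsch's algorithm, the textbook introductory example (not anything specific to the discussion above), simulated classically with plain Python. It decides whether a one-bit function f is constant or balanced using a single query to f, which no classical algorithm can do, and the reasoning behind it (superposition, interference, phase kickback) has no classical counterpart:

```python
import math

def hadamard(state, pos):
    # Apply a Hadamard gate to the qubit at bit position `pos` of a
    # state vector given as a list of 2**n amplitudes.
    # H maps |b> -> (|0> + (-1)**b |1>) / sqrt(2).
    s = 1 / math.sqrt(2)
    new = [0.0] * len(state)
    for i, amp in enumerate(state):
        b = (i >> pos) & 1
        new[i & ~(1 << pos)] += s * amp             # |0> component
        new[i | (1 << pos)] += s * amp * (-1) ** b  # |1> component
    return new

def oracle(state, f):
    # U_f: |x>|y> -> |x>|y XOR f(x)>, with basis index i = (x << 1) | y.
    new = [0.0] * 4
    for i, amp in enumerate(state):
        x, y = i >> 1, i & 1
        new[(x << 1) | (y ^ f(x))] += amp
    return new

def deutsch(f):
    state = [0.0, 1.0, 0.0, 0.0]   # start in |0>|1>
    state = hadamard(state, 1)     # H on the input qubit
    state = hadamard(state, 0)     # H on the ancilla qubit
    state = oracle(state, f)       # a single query to f
    state = hadamard(state, 1)     # H on the input qubit again
    # Probability of measuring 1 on the input qubit (indices 2 and 3):
    p1 = state[2] ** 2 + state[3] ** 2
    return "balanced" if p1 > 0.5 else "constant"

print(deutsch(lambda x: 0))  # constant
print(deutsch(lambda x: x))  # balanced
```

Even in this toy form, the design revolves around interference between amplitudes rather than branching on values, which is the kind of thinking a programmer targeting quantum hardware would need a basic grounding in.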
But, who knows. We'll see. It's just the beginning.