There are simply some practical problems that cannot be parallelized effectively: if an algorithm depends on intermediate data, you are SOL until that intermediate data is computed. That's not science fiction, it's a logical impossibility. This whole parallel affliction
is one of the worst things to ever happen to the software industry. I don't think it's something that most software developers should have to worry about.
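To make the dependency point concrete, here's a minimal sketch of an inherently serial computation (the function name `hash_chain` is just illustrative): each step consumes the previous step's output, so no step can start early no matter how many cores you have.

```python
import hashlib

def hash_chain(seed: bytes, n: int) -> bytes:
    # Step i takes step i-1's digest as input, so the n steps
    # form a strict dependency chain: they cannot run in parallel.
    digest = seed
    for _ in range(n):
        digest = hashlib.sha256(digest).digest()
    return digest
```

This is the same structure that makes proof-of-work and iterated key stretching deliberately hard to speed up with more hardware.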
Sometimes that is true; at other times it is only true because we are attempting to do the absolute minimum amount of work possible to solve a problem. I suspect that as CPU cores become more numerous, you'll start to see much wider use of algorithms and languages
that rely on attempting multiple possible solutions simultaneously and discarding the results of those that turn out to be unnecessary.
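A minimal sketch of that idea using Python's standard `concurrent.futures` (the function name `first_of` and the candidate strategies are made up for illustration): launch every candidate at once, keep whichever finishes first, and throw the rest away.

```python
from concurrent.futures import ThreadPoolExecutor, FIRST_COMPLETED, wait

def first_of(candidates):
    """Run every candidate strategy concurrently and return whichever
    result arrives first; the losers' work is simply discarded."""
    pool = ThreadPoolExecutor(max_workers=len(candidates))
    try:
        futures = [pool.submit(fn) for fn in candidates]
        done, _pending = wait(futures, return_when=FIRST_COMPLETED)
        return next(iter(done)).result()
    finally:
        # Drop any candidate that hasn't started yet; already-running
        # losers finish in the background and are ignored (Python 3.9+).
        pool.shutdown(wait=False, cancel_futures=True)
```

The deliberate waste is the point: with cores to spare, burning three of them on dead-end attempts is cheaper than serializing on the one "right" approach.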