Right tool for the right job. That's the key.
Increasingly (and for 99%* of applications, as exemplified by Bing, we are already there), managed code pays more dividends, because the bottleneck continues to be something other than the CPU.
One has to realize that while safe/managed code has overhead -- array bounds checks, for example -- that overhead is often dwarfed by the application's memory access patterns: code performance is generally dictated by locality of reference. And if you're I/O-bound, you have a totally different set of problems to solve.
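A minimal sketch of that point. The thread is about .NET, but the same reasoning applies to any managed runtime, so this uses Java for illustration; the class and method names are made up. Both loops below do identical work and both pay for bounds checks, yet the row-major walk touches memory sequentially while the column-major walk strides across cache lines -- on most hardware the access pattern, not the safety checks, dominates the runtime:

```java
// Illustration: memory-access patterns usually dominate bounds-check
// overhead. Both methods compute the same sum over the same matrix.
public class Locality {
    static final int N = 2048;

    // Sequential walk: each access lands in the same (or next) cache line.
    static long rowMajor(int[][] m) {
        long sum = 0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                sum += m[i][j];
        return sum;
    }

    // Strided walk: each access jumps to a different row's backing array.
    static long colMajor(int[][] m) {
        long sum = 0;
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                sum += m[i][j];
        return sum;
    }

    public static void main(String[] args) {
        int[][] m = new int[N][N];
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                m[i][j] = i + j;

        long t0 = System.nanoTime();
        long a = rowMajor(m);
        long t1 = System.nanoTime();
        long b = colMajor(m);
        long t2 = System.nanoTime();

        System.out.println("sums equal: " + (a == b));
        System.out.printf("row-major %d ms, col-major %d ms%n",
                (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000);
    }
}
```

Exact timings depend on the machine, but the gap between the two traversals is typically far larger than anything bounds checking costs.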
But, of course, one can contend that this is not always true. Sometimes you are truly compute-bound -- say, looping over an array that fits inside the processor's fastest cache. Even in that case, it's not as if the JIT compiler is going to do something bizarrely different from an ahead-of-time compiler; it just has to be safe. Oh, and there is a time component involved, because JIT-compiler optimizations have to justify the time taken to generate the optimized code.
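Even in the compute-bound case, the "safety tax" can shrink to near zero when the compiler can prove accesses are in range. A hedged sketch, again in Java rather than C# (the names here are made up, and the optimization behavior described is typical of JITs such as HotSpot, not a guarantee): because the loop index ranges over exactly `[0, a.length)`, the per-access bounds check can be eliminated or hoisted out of the loop.

```java
// The canonical loop shape a JIT can prove safe: the induction
// variable i is bounded by a.length, so the range check on a[i]
// can typically be elided rather than executed on every iteration.
public class BoundsCheck {
    static long sum(int[] a) {
        long s = 0;
        for (int i = 0; i < a.length; i++) {
            s += a[i]; // index provably in range for this loop shape
        }
        return s;
    }

    public static void main(String[] args) {
        int[] a = {1, 2, 3, 4, 5};
        System.out.println(sum(a)); // prints 15
    }
}
```

The flip side is exactly the time component mentioned above: proving this kind of thing takes compile time, so the JIT only spends it on code hot enough to repay the cost.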
Sigh, I digressed.
Back to the question about Bing's GC and JIT improvements (TechNet blog post). The point I think the video is trying to drive home is that building on an established framework like .NET gives you these "free gains": you didn't change a line of code, yet you reaped the benefits.