First off, I can see where you're coming from. I've never looked at it that way myself, though.

Computer science is a fast-moving world. Hardware capabilities grow at an unbelievable rate, and what we, as developers/designers/engineers/whatever-you-do-with-computers, are expected to do changes with it. Nobody will like it if you produce an app that looks and feels like Windows 3.1, because people today expect more.

One of the biggest challenges with the evolving complexity of software (and hardware) is how to manage it effectively. The technology that was big a few years ago may be wholly inadequate for the software projects of today. As such, new technologies frequently emerge, and you are frequently left to guess which will be worthwhile. You cannot keep up with all of them.

What doesn't change so often, though, is the programming paradigm. Back in the olden days, when memory was limited and machines were slow, programmers were expected to do it all themselves. Optimization in assembly was daily practice. As machines got better, and the software that was written got more complex, people came to appreciate that software engineering was itself a problem, and that things should be done to make it easier. So steps were taken to lighten the programmer's load, sometimes at the cost of a few CPU cycles. First came the move to higher-level languages. Next came object-oriented programming. And nowadays you have component-based software engineering. But I think these are the only major shifts that have occurred over the years. (Yes, I know that's not a complete list.)

Learning new technology is easy. The more technologies you know, the more similarities you see, the easier it gets to learn the next big thing.

Learning C# and .NET didn't take me nearly as much time as learning COM or C++. Why? Because the underlying idea isn't different. Changing the way you think about developing software is difficult. Changing whether or not you need to manually free your objects is not.
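
To make that last point concrete, here's a minimal sketch (my own toy example, not from any particular codebase) of the kind of difference that's easy to absorb:

```csharp
// In C# / .NET, object lifetime is managed by the garbage collector.
class Greeter
{
    public void SayHello() => System.Console.WriteLine("Hello");
}

class Program
{
    static void Main()
    {
        var g = new Greeter(); // allocated on the managed heap
        g.SayHello();
        // No explicit cleanup here: the GC reclaims g once it's unreachable.
        // In C++, the equivalent "new Greeter()" would need a matching "delete".
    }
}
```

That's a mechanical habit to unlearn, not a new way of thinking.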

This is a dynamic industry. Those who are not willing to learn become obsolete. But in the end, I don't believe things change that radically very often.

What's the point of doing "Hello World" in yet another language? No point at all. The C# language is not important. The .NET Framework is important. It is the new technology; C# is just a tool for using it. If you know the .NET Framework, you can learn most of the languages that use it in a pretty short time. Object-oriented languages are object-oriented languages, no matter what flavour they come in. It shouldn't take much time to learn them.

But in the end, what's important is that you have a language and technology/API that allows you to do what needs to be done, and in a way that's efficient and easy. And if you're doing it professionally, in a way that keeps you employed. All other concerns are secondary, really.

To quote a saying from martial arts:
"It doesn't matter which path you take to the top of the mountain, in the end we all see the moon."