Vance Morrison: CLR Through the Years

CLR Architect Vance Morrison has been very busy working on the future of the CLR, especially as it relates to execution performance and the type system. Some of his latest work is present in the upcoming 4th version of the virtual machine that powers all things .NET, CLR 4, which ships with Visual Studio 2010. Vance has been on the CLR team since its inception. MSIL, the intermediate language produced by the compilers of all .NET languages, is primarily Vance's doing.

Here, Vance guides us through some of the history of the CLR, a look inside the upcoming version, and some insights into the future. One of the things Vance is thinking about with respect to type inheritance is what he calls default interfaces: they remain contractual, but carry default implementations, as opposed to being purely abstract as interfaces are today. A default-implementation interface is therefore capable of changing without breaking the systems that implement it. Wait a minute, that goes against the basic rules of interfaces in the OO world. Vance explains. Relax.
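To make the idea concrete, here is a small, purely illustrative C# sketch of what a default-implementation interface could look like. The ILogger/Log names are made up, and nothing like this syntax ships in the CLR 4 / Visual Studio 2010 timeframe; this is only the idea Vance describes:

    using System;

    // Hypothetical sketch only: the *idea* of a default interface, not a CLR 4 feature.
    public interface ILogger
    {
        void Log(string message);                  // classic, purely abstract member

        // A later version of the interface adds a member WITH a default body.
        // Classes written against the earlier version keep working; they
        // simply inherit this behavior instead of being broken by the change.
        void Log(string message, Exception error)
        {
            Log(message + ": " + error.Message);
        }
    }

    public class ConsoleLogger : ILogger           // implements only the v1 member
    {
        public void Log(string message)
        {
            Console.WriteLine(message);
        }
    }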

Meet Vance, the face of MSIL. There's much of his thinking and code inside the CLR. Learn about some of it here. Tune in.

Enjoy.

Follow the Discussion

  • stevo_ Human after all

    Great video.. so what's the deal with the interface thing? I love interfaces but hate the versioning trouble.. it sounds like there's been a decision to allow a default member implementation.. but when is this coming?

    Also, what's the story about a different exception handling model? Couldn't you guys put your thoughts out there for the masses to critique?

  • WOOOT for default interface implementation!  This is something that I have wanted for a long time.  Is this going to be in .NET 4? Or some later version?

    I know in the VS2010 beta released yesterday I got the error "interface members cannot have a definition", so I assume it's coming in some later version of .NET?
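    For reference, this is the kind of declaration that triggers that error in the C# compiler shipping with VS2010 (ILogger and Log are just placeholder names):

        using System;

        interface ILogger
        {
            // compile error: "interface members cannot have a definition"
            void Log(string message)
            {
                Console.WriteLine(message);
            }
        }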

  • Charles Welcome Change

    As stated, these are potential futures, not presently implemented or upcoming in 4...

    You're going to see more CLR4-specific interviews over the next few weeks. Don't worry. There's a lot in 4.

    C

  • Grant Boyle (GrantB) What the hell are we supposed to use man? Harsh language?

    Top notch work guys. I need to watch this one again.

  • PerfectPhase "This is not war, this is pest control!" - Dalek to Cyberman

    The COM stuff mentioned at the end, was that the PIA-less deployment? I thought that was part of V4; has it been bumped?

  • Charles Welcome Change

    No-PIA is in V4. More on this in the near future, including a deep look at Type Equivalence in V4.

    C

  • At around the 26-minute mark, he says:

    "There is this notion of server GC and workstation GC, ...... this distinction came back from the good old days when only servers were multi core, no one had a multi core box as a client"

    This is plain wrong, really wrong. Workstations, which of course are not servers, have had multiple processors (basically multiple cores, but on separate chips) for years, going back to the nineties. You could get boxes (they were terribly expensive) from Digital, Sun, HP, SGI, and IBM with Alpha, SPARC, PA-RISC, MIPS, and POWER processors respectively, configured with 2-way or 4-way processing. Basically, when those processors became multi-processing capable, they found their way into servers and high-end workstations at around the same time. And by the way, all those processors were 64-bit.

    You could even get cheap dual-G4 machines from Apple back in 2001, when no one else was talking about multi-processing in the personal computer world (the workstation market was considered a high-end market distinct from the personal computer market). This was the time when Apple was trying to convince the PC industry (and Intel) that chasing gigahertz was pointless and would have to come to an end. It eventually happened....

    Taking advantage of multi-processing is not a new problem; it is a more important problem now because the number of customers and developers affected by it is larger, since the whole industry has embraced multi-processing as the main path to higher performance. So we will see more and more cores that we need to use.

    Before, the problem was limited to people doing rocket science who needed to reduce the computing time on large data sets, develop complicated visualization apps, or do anything involving high performance. They had (and still have) to write concurrent code for 4 or 16 cores. From now on, we will have to write concurrent code for 64 cores and more in the not-too-distant future, making the problem even bigger.

    But again, saying that multi-processing was limited to servers is totally wrong; it appeared on the client many years ago, back when multi-processing was not a big player in the Windows world.

    Some Microsoft people should really go out sometimes, really...

  • Christian Liensberger (littleguru) <3 Seattle

    Hmmm... I haven't watched the video yet, but I'm convinced it was meant in the sense that most people (or the average guy) didn't have a multi-core machine at home.

  • stevo_ Human after all

    Sorry, but it's just not true at all that multi-proc/core CPUs existed on the desktop when the CLR was first released.. it's only in the last 2-3 years that multi-core has become common.. and perhaps only in the last 1-2 years that average users have started buying them..

    Until it's a common scenario for the average desktop to have this capability, it would be pointless, and I'm guessing ever so slightly damaging, to enable a concurrency-oriented GC on a single-core machine (I'm guessing the workstation GC runs slightly better than the server one in that setup; see the sketch below for checking which collector a process actually got).

    Also Charles, c'mon, he basically said they'd made a decision, and highlighted how annoying it is today ;)
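    As an aside, the workstation/server choice is an application-level opt-in in the .NET Framework (the <gcServer enabled="true"/> element in app.config), and a process can check at run time which collector it actually got; a minimal sketch, with GcModeCheck being just a placeholder name:

        using System;
        using System.Runtime;

        class GcModeCheck
        {
            static void Main()
            {
                // True when the host process opted into the server collector
                // via <gcServer enabled="true"/>; false means workstation GC.
                Console.WriteLine("Server GC:    {0}", GCSettings.IsServerGC);

                // Concurrent (background) collection behavior shows up here.
                Console.WriteLine("Latency mode: {0}", GCSettings.LatencyMode);
            }
        }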

  • I agree with you. I only just got a multi-core machine; multi-core machines aren't common where we are.

  • rhm

    I think you need to be a bit more realistic in how you interpret what people say. Of course you could get multi-CPU machines for desktop use a long time ago - everyone knows that - so obviously the guy meant that the user-base of such machines was so small it didn't matter to most developers.

     

  • Allan Lindqvist (aL_) Kinect ftw

    @ 29 min there is a mention that "nobody knows what to do with 100+ cores".
    I don't fully agree :) In the graphics and gaming world people actually have those kinds of core counts available. My AMD 4850 X2 card at home has 1200 stream processors.. 1200! (Yes, I know that's not the same as regular CPUs, but still :) ) And it's not even a super-high-end card.. granted, not a whole lot of people work directly with GPU processors, but GPGPU apps are coming on strong..

  • stevo_ Human after all

    aL_, that's because GPU SIMD processing is easy to parallelize.. the general question is: with an operating system and applications running on 100 general-purpose x86 cores, how do you best put those 100 cores to use?
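    For the embarrassingly parallel subset of that question, the Task Parallel Library shipping with .NET 4 is one answer: you express the data parallelism and let the runtime decide how many cores to use. A minimal sketch (the array size and the math inside the loop are made up):

        using System;
        using System.Threading.Tasks;

        class ParallelSketch
        {
            static void Main()
            {
                double[] input = new double[1000000];
                double[] output = new double[input.Length];

                // Parallel.For partitions the index range across the available
                // cores; the same code runs unchanged on 2 cores or on 100.
                Parallel.For(0, input.Length, i =>
                {
                    output[i] = Math.Sqrt(input[i] + 1.0);
                });

                Console.WriteLine("Processed {0} elements.", output.Length);
            }
        }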

  • First few minutes of the video:

    "Most of the original CLR guys have left."

    "Where did they go?"

    "Most of them are working on Midori."

    Man, MS is really putting some valuable resources on Midori. I think Chris Brumme and Joe Duffy are now working on Midori (they haven't said it explicitly, but you can read between the lines). And Midori is a rather secretive project; searching for it on MS Research just points to the old Singularity project. MS seems to have big plans for this.

  • DouglasH Just Causual

    I seriously doubt we will actually see 100-core x86-64.

    I see a lot of specialization starting to occur. Larrabee and Fusion are a couple of examples.

    We may see up to 64 cores at the server level, but memory constraints would be a huge issue on such a platform, especially the way Intel chips are currently designed, with their need for memory bandwidth for performance.

    At the consumer level, specialization will be much more prominent: 8 cores with 4 graphics cores and 4 GPGPU cores, as an example.

    Excel would benefit on some calculations from SIMD support, so perhaps an expansion of SIMD processors.

    The constraint really is going to be how to connect memory to all those cores (unless we build memory onto the CPU itself).

    douglas

  • Charles Welcome Change

    No. Some of the CLR's original people are working across the company, in various roles on various projects, as often happens at Microsoft... In fact, most of the folks I interviewed for CLR 4 are original members of the CLR team...

    Midori is a private incubation. There are no release plans, no public disclosure roadmap, no product planning. It's fundamental research, in a real sense. It is not meant to be publicized as a result... Joe Duffy works on the Parallel Computing Platform team and has nothing to do with Midori.

    I need to start editing my videos, I guess. I forgot we had mentioned this. Apologies for releasing inaccurate information or information that causes confusion. Vance was talking about a few fellow architects and not most of the CLR team...... Still, this was something better left out of the conversation to limit confusion.

    C

  • Charles,

    Come on, that wasn't a slip; I'm sure if it was, you would've caught it.  I would love a completely managed OS.  That would be mind-blowing!  Think of all the problems it would solve.  Being able to install a new runtime and suddenly every app on your system gets faster, with no code changes!  Imagine the possibilities.  I'm sure that is sci-fi as well, but one can dream, can't they?

    I really enjoyed this interview.  I think many of us have forgotten the days before managed code.  We have all forgotten how painful it was to share code with each other.  Now we have one unified runtime, we can use reflection to inspect running processes, we can use Reflector to disassemble, and we have amazing technologies like LINQ, MVC, WCF, and WPF, which are huge foundations.  Sometimes we complain, sometimes we expect a lot from Microsoft, but we have come a long way with .NET.

    I mean, think about it: besides what gets added in a new release, the .NET Framework provides so many other features, most of which we don't even think about.  We have complete type safety (not possible in native code); you can't have something that is 2 bytes long and suddenly start talking to it as if it were 4 bytes long.  We can't touch uninitialized memory; buffer overruns are not possible because of bounds checking.  And strings, we have one string type in .NET!! Try that in native code.  We also have assurances like: a rogue process can't take over our whole machine.  We can write a piece of code and use it from a Windows, Web, or Service app.

    And the Studio, we can use one product to develop everything.  We can write add-ins, we can "attach to process", we can debug pretty much anything, and to top it off we're now going to get parallelism in the next release, thank you.

    I probably sound like a fan boy and some of you may want to shoot me, but .NET has made all of our lives easier whether we choose to admit it or not.

    Cheers!

  • vesuvius Count Orlock

    Watched this on the way back from work and thought to myself, why do people like Vance never get sufficient exposure on Channel 9?

    All of us .NET developers are minnows when it comes to MSIL. Were intelligence not a barrier for me, that is where I'd want to work.

    Definitely an interview I will watch again.

    Greetings from England Vance!

  • Charles Welcome Change

    Vance will appear on C9 again next week. Is that often enough? :)

    Keep in mind that folks like Vance are beyond busy. They don't have a ton of time to spend in front of a camera. For big releases, they find the time because they are the right people to speak to the technology at hand, for obvious reasons (they thought it up, for one thing).

    Glad you enjoyed this conversation. I know I did (as I did all of the CLR pieces).

    C

  • br1

    This is the best Channel 9 video I've seen.  The only things I would have added are more about scaling down the framework, making it fit Silverlight and Windows CE, and also the rest of the things that could have been made better, like UTF-8 strings.

  • Vance mentioned that there's little opportunity for the CLR itself to leverage multiple cores/processors other than in the GC. But what about the JITter? If there are enough processing resources free, wouldn't it be nice to let some background thread collect profile information and feed it back to the JITter, so that after code pitching happens the JITter could produce better code? Or, to put it another way, why not make the Execution Engine more adaptive? And how adaptive is it nowadays in CLR 4?

  • Charles Welcome Change

    Good questions, Ravenex. In fact, these questions were asked of the GM of the CLR, Ian Carmichael, in a conversation I had with him just a few weeks ago. I'll post this in the next couple of days.

    Of course, the CLR itself is pretty good at dealing with multiple-core chip architectures. That said, it's not as though it doesn't need some work to scale to many-core. I agree with you that there is some exotic execution potential afforded by lots and lots of processing resources on the host machine. We'll follow the progress of this very interesting topic, many-core's impact on software architectures, very closely on Channel 9 (we have been to some extent already, but perhaps we've not been explicit enough - well, the reality is that the industry is not quite sure what 100 cores means to client computing... LOTS of thinking going on. Much thought, there is.)

    C

  • We are finding opportunities to leverage more cores for code generation in the CLR. For version 4, in some scenarios we will NGen multiple assemblies across many cores. This should have the advantage of faster install times. We're looking at other options too.
