
Concurrency and Parallelism: Native (C/C++) and Managed (.NET) Perspectives

Download

Right click “Save as…”

  • High Quality WMV (PC)
  • MP3 (Audio only)
  • MP4 (iPhone, Android)
  • Mid Quality WMV (Lo-band, Mobile)
  • WMV (WMV Video)

Parallel Computing Platform team members Stephen Toub, Rick Molloy, Don McCrady and Dana Groff join me for a chat about the differences and similarities in their conceptual approach to designing and building concurrent programming abstractions targeting .NET developers and native (C/C++) developers.

Beyond the obvious semantic (and runtime) differences between purely managed (.NET) and native (C/C++) code, how does the Parallel Computing Platform team develop technologies for each domain, and how do those technologies differ? Surely, system-level developers need system-level tooling that can improve their experience writing native code that effectively (and safely) scales to 8 cores (and that's nothing; how many cores will be the norm in 5-8 years? Most likely significantly more than 8...). There's no PLINQ for C++, for example. That said, the fundamental problems the Parallel Computing Platform team is trying to solve span languages and runtimes, so are the differences only in implementation details and programming abstractions? What's the same? What's different? How? Why?
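
By way of illustration (not something shown in the video), here is a minimal sketch of the kind of data-parallel loop the native Parallel Patterns Library (PPL, built on ConcRT) exposes in <ppl.h>; the vector contents and the per-element work are hypothetical placeholders.

    // Minimal sketch: a data-parallel loop with the Parallel Patterns Library (PPL/ConcRT).
    // The data and the per-element work below are hypothetical placeholders.
    #include <ppl.h>
    #include <vector>
    #include <cmath>

    int main()
    {
        std::vector<double> data(1000000, 2.0);

        // parallel_for partitions the index range across the available cores;
        // the runtime, not the programmer, decides how the work is scheduled.
        Concurrency::parallel_for(size_t(0), data.size(), [&data](size_t i)
        {
            data[i] = std::sqrt(data[i]) * data[i];
        });

        return 0;
    }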

This is another great conversation with some of the folks designing and building technologies that will ultimately, in one form or another, converge into tools (or components of tools) that will help software developers effectively, efficiently and reliably compose applications and services in a Many Core world.

Enjoy!


Follow the Discussion

  • Most of this comes down to design patterns and making sure that your algorithms are correct for function and performance. While I appreciate the efforts and the ease of coding that .NET attempts to offer, working in oil/gas and DOT/DOD we don't allow for such overhead. 70% of all our work is done in 'C', 25% in 'C++', and 5% in MASM; that is not a breakdown of lines of code but a reference to module count. Parallel computing is designed to make use of cores to their fullest, and this is a good idea.

    I feel that a 'C' API for every function is important for "system" developers like myself. When I need parallel processing I already know it, because the design dictates such an effort and I understand those needs. Using generalized tools such as PCP can have a great positive impact on people who do not yet understand how their computers work, or how to properly create algorithms for anything other than pretty syntax. Functional programming will be forced more and more upon the newbs of computer science, and the more they learn the less they will truly understand about their own computers.

    I'm not a .NET "hater" and think that C# is fun for prototyping and stepping through the lines with its fancy debugger. But once the algorithm is as complete in C# as it can get, our situations require us to recode that algorithm into 'C' (normally, but 'C++' or MASM if needed) to get the performance we need. I agree that this is not for everyone, but we have the luxury of changing OSs based on our needs. If we no longer had strong 'C' support then we would need to seek alternatives outside the Microsoft offering.

    Please keep up the awesome work on PCP. Many people will surely gain much from its work. I would like to see CRT versions with PCP ... but I would also like to see heavily documented results that compare its true performance when used in 'C', 'C++', and anything .NET. I would love to see PCP coupled with the GPU for the CRT, letting the standard Windows API really push its performance levels. I think only then will all applications, compiled from different languages, along with all runtimes and codecs, really feel the performance boost that can potentially happen.

    For my last paragraph, I would just like to rant even more freely. Because we have professional teams that use Visual Studio and the latest MASM for production, I feel we have a unique perspective: we have an understanding of the machines we deal with outside the .NET hype. It's disgusting to see so many people in public today (including MSDN Magazine) who don't understand or teach good programming tactics. It's too obvious that the internal Microsoft academia loves self-promotion and ensures class separation by creating constructs and paradigms that only tons of sample code and a heavy dose of UML can explain. If an understanding of true computer concepts/architecture/design/instructions were expected, then we could get over this bloat (<cough> .NET) and really use those cores as they were designed to be used. Allocation/free is not hard, people.
  • Are you sure you're not a .NET hater? How quickly can you crank out web apps or Windows GUIs with MASM? How about we use the right tool for the job and not assume that C/C++/MASM is the best tool for everything that isn't a prototype?

  • Nope. Not a .NET hater. If you read my post, you would understand that "cranking out [x]" is not really part of what I need.

    In paragraph #1, I advocate looking at your algorithms and being watchful of overhead (including overhead that might be in your .NET application).
    In paragraph #2, I categorize myself as a "system" programmer ... and if you watched the video then you would understand that "system" (later "gamer") level programmers might not benefit from this technology because they claim themselves to be in their own infancy.
    In paragraph #3, I mention that "we" (not you) need this extra performance, I agree that this is not for everyone, and I state a fact that Microsoft itself knows: if you can't keep your system-level applications competitive, people will go elsewhere (that's the whole Windows/*nix debate).
    In paragraph #4, I encourage further development of PCP, would even like to use PCP from the CRT, and would like to see performance reports (especially if there is GPU involvement), because companies like mine need every ounce of performance we can get.
    In paragraph #5, I echo the video in voicing that system/game-level programming requires more than most of .NET (which I find to be hype).

    It's in the last paragraph where I voice my disgust with people who do not, in your words, "use the right tools for the job and assume [.NET] is the best tool for everything". It's not my job to "crank out" code but to get it right the first time (or really close) and ensure that performance is at its peak. You argue your ability to make a web app or a Windows GUI versus "system" programming. It's not the same ... and if you think that creating services, oil/gas calculations, live traffic projections, ISAPI extensions, and the like are equal to a SharePoint application, then hopefully you understand now that they are not.

    PCP, from my point of view, is being designed for me, for my team, for my company. Within the two groups that need PCP you have: 1) people who don't thread or need performance but want a magic bullet, and 2) people who do this by hand all day long to get the best of the best and would love some relief in the form of hardened/tested/proven code/runtimes/libs. I am in the second group, but don't be upset with me if you are in the first, because these tools are being designed for you too. Will I want to hug/kiss the people who might actually make a multi-core compiler (especially for my 'C'/MASM)? Yes!

    .NET has its place. It's the more-than-lukewarmest technology on the market today! It's just not for my industry. I would love some help with multi-core, but for us to use it we would need to prove that it's faster than what we have today. For me, .NET is a cool prototyping tool. In that capacity it *is* the best tool, because prototyping in MASM is not always a good first choice.

    Lastly, please note that ISAPI, Windows GUIs, NT services, MAPI and other APIs are all pretty easy with MASM once you get used to them. Just like anything else, you create a set of helper functions or tools that assist, and you create project wizards that help with the creation. It's just like anything else, really.
  • Charles (Welcome Change)
    I think the uber point in all of this is that it's not really about the languages used to express algorithmic intention, but rather about enabling the design and construction of composable systems for as many levels of programming capability as possible. As we move farther and farther away from the machine, it follows that a complete understanding of the physiology of the target machine becomes an optional requirement for successfully programming it and building scalable software empowered by it.

    .NET is more than a collection of libraries targeting a garbage-collected, type-safe and verifiable runtime. It's an emergent system (more than the sum of its parts). The motivation behind .NET is based squarely on the idea of software composability, which allows for unusually scalable, predictable, debuggable, parallelizable and coordinating applications, services and runtime systems. The addition of new features and algorithms in this model can't destabilize or undermine the overall system. Of course, .NET has not yet realized this vision of composable software on every desktop and in every device (Smiley), but it makes it practically possible as the technology matures, given the "composable trajectory" of the evolving platform.

    In terms of the native vision, of course it inherits the base goal of composability (it must, otherwise we're wasting our time, at least in the current iteration of hardware and OS architectures), but the problem space is significantly more complex for native given the intrinsic unsafety of the executing system (non-verifiable, type-unsafe, lacking sufficient memory-protection abstractions, etc.). As you learned in this video (and in several others - just click on the concurrency tag for an abundance of information), the PCP team is working hard on the fundamentals of the problem (at the software layer...), finding common ground across platforms and creating useful general-purpose solutions for .NET developers (TPL) and native developers (ConcRT); see the sketch after this thread for a taste of the native side.

    C
  • ZippyV (Fired Up)
    What about OpenCL, which is coming in the new OS X? Is Microsoft going to implement it too?
  • Cheval (Why not null?)
    First up, great video (except for the sound clipping)! Please keep chasing them on this subject. I'm also curious to see a video on the testing lab mentioned.

    Unlike t.man above, we are more aligned towards Business Intelligence and Line of Business work, so we only need the power this provides in bursts, and .NET works well enough for us. As a side thought, what I would like in the parallel/concurrency model is more of a hardware/software combination: a hardware "turbo" mode plus software linear types. On the hardware side, it would be great to have a large-capacitor design that could push huge juice for a very small amount of time to process the complex calculations, and the rest of the time sit cooling down, doing nothing but waiting for user input or report printing. On the software side, while I can see some benefit to the functional world, teams are expensive to run when you need multi-syntax skills (e.g. t.man's C/C++/MASM, or functional/iterative), so we would prefer to stay with .NET but slowly "remove/isolate" shared state/variables with linear types, i.e. bring back true maths to software development. Basically, the ability to code at times how to do things, and at other times just declare parallel work without shared state. This is why LINQ is great: we can write some setup code, tell LINQ to filter/process it, and pull the result apart to get the desired information.

    Anyway, keep up the hard questions, Charles, as it's good to see the body language of the toolmakers in response.
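
As a footnote to Charles's comment above about TPL and ConcRT: a minimal sketch, assuming Visual Studio's ConcRT/PPL headers, of the task_group style of structured fork/join work on the native side; the two work items are hypothetical placeholders.

    // Minimal sketch: structured fork/join with ConcRT's task_group (<ppl.h>).
    // The two work items are hypothetical placeholders.
    #include <ppl.h>
    #include <iostream>

    int main()
    {
        Concurrency::task_group tasks;

        // Each run() hands a lambda to the ConcRT scheduler, which maps the
        // queued work onto the machine's cores.
        tasks.run([] { std::cout << "computing part A\n"; });
        tasks.run([] { std::cout << "computing part B\n"; });

        // wait() blocks until both tasks have completed.
        tasks.wait();

        return 0;
    }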


Comments Closed

Comments have been closed since this content was published more than 30 days ago, but if you'd like to continue the conversation, please create a new thread in our Forums, or Contact Us and let us know.