
Parallel Computing Platform: An Integrated Approach to Tooling

Download

Right-click “Save as…”

  • High Quality WMV (PC)
  • MP3 (Audio only)
  • MP4 (iPhone, Android)
  • Mid Quality WMV (Lo-band, Mobile)
  • WMV (WMV Video)

The Parallel Computing Platform team's Steve Teixeira (PUM), Daniel Moth (PM and Channel 9 Screencaster Extraordinaire!) and Sean Nordberg (GPM) sit down with me to discuss Microsoft's overall approach to parallel computing inside Visual Studio (improved concurrency debugging and new runtime analysis tools inside VS). Of course, we also address the high-level issues of making parallel computing easier to compose on the Microsoft stack, and we peek into the future. At PDC, Steve et al. will go much deeper into the VS tooling support for parallel computing in several sessions. Parallel computing is a major theme at this year's PDC, so I strongly encourage you to head down to LA and pay close attention to our plans for making writing concurrent code reasonable and readily understandable by a much larger class of developers.

See you at PDC.

Follow the Discussion

  • stevo_ (Human after all)
    That was really cool. It makes a lot of sense that we don't really want to be working purely with threads, but rather expressing concurrency with an abstraction that can more intelligently handle threads, thread pools, etc. (see the first sketch after this thread). The task panel seems like such a natural thing that should exist.
  • Simon Hughes (sjh37) Software should not break. It should just work. Forever. No matter what...

    You have to be careful with parallel execution when it comes to disk I/O.
    If each thread is accessing a different file, the whole thing slows to a crawl as the hard disk's read head has to jump between files every 20 ms. It would be MUCH better if the operating system could allocate more time to read a file before allowing a context switch, say 500 ms. That would allow more data to be retrieved from the disk, cause less head thrash and less time waiting for the head to move, and performance would go up greatly.

    Just try creating two or more zip archives at the same time, then time it again doing only one at a time. WinRAR has a feature where it will wait (probably using a global mutex) for other WinRAR windows to finish before the next one starts (see the second sketch after this thread).

    You can context-switch CPU threads till the cows come home, but a physical device needs more time to read/write once the head arrives.
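
stevo_'s point about expressing concurrency through tasks rather than raw threads can be made concrete with a small sketch. This is purely illustrative, not the VS tooling or the Task Parallel Library the video discusses: it uses standard C++ std::async, and the work-splitting scheme and function names are invented for the example. The idea is that you describe units of work and let the runtime decide how they map onto threads.

    // Minimal sketch: express work as tasks, not raw threads.
    // Standard C++ std::async stands in for the task abstractions
    // (TPL, PPL) discussed in the video; all names are illustrative.
    #include <functional>
    #include <future>
    #include <iostream>
    #include <numeric>
    #include <vector>

    // A CPU-bound chunk of work: sum a slice of the input.
    long long sum_range(const std::vector<int>& data, size_t begin, size_t end) {
        return std::accumulate(data.begin() + begin, data.begin() + end, 0LL);
    }

    int main() {
        std::vector<int> data(1'000'000, 1);
        const size_t mid = data.size() / 2;

        // Describe *what* can run concurrently; the runtime schedules it
        // onto threads (and may reuse pooled threads underneath).
        auto left  = std::async(std::launch::async, sum_range, std::cref(data), 0, mid);
        auto right = std::async(std::launch::async, sum_range, std::cref(data), mid, data.size());

        std::cout << "total = " << left.get() + right.get() << '\n';
        return 0;
    }

The point of the abstraction is exactly what stevo_ describes: the code states the available concurrency, and thread creation, pooling and scheduling become the runtime's problem instead of the developer's.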
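Simon's suggestion amounts to keeping CPU-bound work parallel while funnelling the disk-bound phase through a single gate, much like the WinRAR behaviour he describes. The single-process sketch below is an assumption of how that might look, not WinRAR's actual code; a true cross-process lock of the kind he suspects would need a named OS mutex instead, and all names here are hypothetical.

    // Minimal sketch: parallel CPU work, serialized disk writes, so the
    // read/write head streams one file at a time instead of seeking
    // between files on every time slice.
    #include <iostream>
    #include <mutex>
    #include <string>
    #include <thread>
    #include <vector>

    std::mutex disk_gate;  // gates only the I/O-bound phase

    void compress_file(const std::string& name) {
        // CPU-bound phase (the compression itself) runs fully in
        // parallel; stand-in busy work here.
        volatile unsigned long x = 0;
        for (unsigned long i = 0; i < 10'000'000; ++i) x += i;

        // I/O-bound phase: one archive writes at a time.
        std::lock_guard<std::mutex> gate(disk_gate);
        std::cout << "writing " << name << " sequentially\n";
    }

    int main() {
        std::vector<std::thread> workers;
        for (const auto& f : {"a.zip", "b.zip", "c.zip"})
            workers.emplace_back(compress_file, std::string(f));
        for (auto& t : workers) t.join();
    }

The design point is that serialization is applied only around the seek-sensitive phase: you still get the parallel speedup on compression, while the disk sees sequential access.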
