
CCR at MySpace

19 minutes, 29 seconds



MySpace has done some pretty amazing things with the Robotics Developer Studio.  When they found out it contained a very powerful component, the Concurrency and Coordination Runtime (CCR), the architects built it into the architecture of MySpace, the largest .NET site in the world.  At MySpace, I met Principal Architect Erik Nelson and Senior Architect Akash Patel, who walked me through how they are using the CCR.
If you have an MSDN subscription, are a student, or are at a qualifying startup, you can now download and use the Robotics Developer Studio as part of the program.


Follow the discussion

  • What is the average throughput for MySpace? How much has CCR improved performance?

  • Those are both pretty broad questions. I can't actually give a figure for our total throughput, but that's a combination of many technologies, so it's difficult to say how much CCR affects it directly.


    As to how it improved performance, it was used as part of a rearchitecture that happened concurrently with the site growing in size many times over. Since our load increased dramatically during the time period we were implementing it, and its use came along with other changes in our middle tier, there really isn't an apples-to-apples comparison that can be made.


    I apologize for the non-answers here, but if you have more specific questions, I can try to address them!

  • Hey Erik - In the presentation you mentioned that even the search team at MySpace is using CCR. Could you elaborate a bit on it?

    As you mentioned, it made sense to use CCR to improve throughput in the communication layer when you have a lot of messages and data to transfer between a lot of components.

    So how does the search team benefit from the high concurrency that CCR enables? Is some kind of map/reduce paradigm being used to query a lot of nodes/documents in parallel?

  • Within our search infrastructure we use the CCR to manage concurrency in our processing pipeline, assigning indexing tasks to a pool of workers.  The benefits we've received from using the CCR are that it simplifies concurrency management and provides a very high level of throughput.  We are not currently using the CCR during search execution, but we are examining ways in which we can.
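The pipeline described in that last comment — indexing tasks posted to a port and drained by a fixed pool of workers — is the core CCR pattern. CCR itself is a .NET library, so as a rough analogy only, here is a minimal Python sketch of the same shape: a queue standing in for a CCR port, and a worker pool consuming from it. The names (`run_indexing_pipeline`, the `"indexed:"` stand-in work) are hypothetical, not from MySpace's code.

```python
import queue
import threading

def run_indexing_pipeline(documents, num_workers=4):
    """Sketch of a CCR-style pipeline: tasks posted to a port (here a
    queue) are consumed by a fixed pool of workers."""
    tasks = queue.Queue()   # rough analogue of a CCR Port<T>
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            doc = tasks.get()
            if doc is None:              # sentinel: shut this worker down
                tasks.task_done()
                return
            indexed = f"indexed:{doc}"   # stand-in for the real indexing work
            with lock:
                results.append(indexed)
            tasks.task_done()

    workers = [threading.Thread(target=worker) for _ in range(num_workers)]
    for w in workers:
        w.start()
    for doc in documents:
        tasks.put(doc)                   # "post" each task to the port
    for _ in workers:
        tasks.put(None)                  # one sentinel per worker
    tasks.join()
    for w in workers:
        w.join()
    return sorted(results)

print(run_indexing_pipeline(["a", "b", "c"]))
```

The real CCR goes further than this sketch — its arbiters let you declaratively join and interleave receivers on multiple ports without managing threads or locks yourself, which is the "simplifies concurrency management" benefit mentioned above.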


