
Building Scalable N-Tier Apps with Windows Azure Cloud Services & Virtual Machines

Duration: 1 hour, 5 minutes, 17 seconds


Learn common patterns that will help you build cloud services that scale seamlessly using Windows Azure Virtual Machines and Cloud Services. This session covers topics such as multi-tier deployment patterns, high availability, data access, and parallel processing, leveraging the best of IaaS and PaaS.
For more information, check out this course on Microsoft Virtual Academy:

Follow the discussion

  • Great session, good suggestions to consider for a scalable empire App. Would have liked more info on high-availability practices, especially for the DB (and how to leverage SQL Azure services to protect and distribute a persistent DB).

  • Greg Holt

    Awesome presentation. Several from my company were present in the session and we all enjoyed it and found value.

  • Tim Sanders

    The most practical and informative session on Windows Azure I have attended. Especially, I liked the very candid view of what works and what does not work on Windows Azure, without beating around the bush.

    Nice job, presenter!

  • Kumar Sachin

    Nice presentation. Very practical point of view.

    Thank you for sharing the tricks on deploying to a VNet from a web role.

  • Jeff Conelly

    Best Azure session I attended. The content & the demos were brilliant. The presenter was kind to answer my technical challenges offline after the presentation. Well done!

  • The link to the presentation is broken; I'm receiving the following message. Can someone from Channel 9 help, please?

    The specified blob does not exist. RequestId:fb8efe71-d4ad-4e77-b649-834bc7dd0243 Time:2013-06-29T18:07:38.3833925Z

  • Can someone share more technical details about the Heineken project? Specifically, I have the following questions.

    - Why were there so many storage accounts? Any particular reason?

    - How did you come up with 10,000 data partitions (why not 20,000 or 5,000)?

    - What were the partition key and row key?

    - A message from the UI was pushed across 4 data centers, I think. How was the worker implemented so as to avoid processing the same message more than once? My understanding is that you had 4 different queues!

    - It appears that SQL Database was not utilized; rather, a SQL Server-based database infrastructure was established. How was that infrastructure configured for high availability? Was it a traditional SQL Server active-passive configuration, or does Azure offer something else for implementing fault tolerance / high availability for SQL Server?

  • Kevin Ransom

    Great talk and very practical. Thank you!

    Can you please update the MSDN documentation and forum FAQs with the tips and tricks and samples from your session? It would be really valuable.

  • @mannysiddiqui Let me answer your questions:

    - Why many storage accounts?

    We don't need that many, to be honest, but storage accounts do come with limits (see this great blog post: http://blogs.msdn.com/b/windowsazure/archive/2012/11/02/windows-azure-s-flat-network-storage-and-2012-scalability-targets.aspx)

    Since storage accounts are essentially free we didn't want to take the risk of hitting a bottleneck there.

    - We came up with 10,000 partitions to hit our targets (1 million game plays per hour). This is really tied to our use case and always requires testing. With more partitions you get more compute power serving your requests to storage, which is why 10 partitions didn't work.

    - Partition key: the last 5 digits of the Facebook ID; row key: the Facebook ID.

    - Messages were pushed from the UI to a Web API that would put the message in 4 queues.
    Every DC had its own workers that would process the messages. Azure queues hide messages during processing, and after processing we delete them. We had some automatic conflict handling based on timestamps.

    - There was no SQL Server used as part of the solution in production, only a simple SQL Server Express instance to collect the load-testing data.

    Let me know if you need more information. My email address is dmulder @ microsoft.com
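    The answer above describes three concrete mechanics: deriving the table-storage partition key from the Facebook ID, fanning each UI message out to four per-DC queues, and relying on queue visibility timeouts plus explicit deletes, with timestamp-based conflict handling. Here is a minimal, self-contained Python sketch of those mechanics — all names are illustrative, the real solution was not written in Python, and this toy queue only mimics (not calls) the Azure queue semantics described:

    ```python
    import time
    from collections import deque

    def partition_key(facebook_id: str) -> str:
        # Partition key = last 5 digits of the Facebook ID (per the answer
        # above); the row key is the full Facebook ID.
        return facebook_id[-5:]

    class ToyQueue:
        """In-memory stand-in for an Azure queue: a received message becomes
        invisible for a visibility timeout, and the worker must delete it
        explicitly after successful processing."""
        def __init__(self):
            self._visible = deque()
            self._invisible = {}  # id(msg) -> (msg, visible_again_at)

        def put(self, msg):
            self._visible.append(msg)

        def get(self, visibility_timeout=30.0):
            now = time.time()
            # Messages whose timeout expired (worker crashed) become visible again.
            for key, (msg, due) in list(self._invisible.items()):
                if due <= now:
                    del self._invisible[key]
                    self._visible.append(msg)
            if not self._visible:
                return None
            msg = self._visible.popleft()
            self._invisible[id(msg)] = (msg, now + visibility_timeout)
            return msg

        def delete(self, msg):
            # Called only after processing succeeds, so a crash mid-processing
            # lets another worker pick the message up again.
            self._invisible.pop(id(msg), None)

    def fan_out(msg, dc_queues):
        # The Web API put each UI message into all four DC queues;
        # every DC's workers drained only their own queue.
        for q in dc_queues:
            q.put(msg)

    def resolve(stored, incoming):
        # Timestamp-based conflict handling: the newest version wins.
        if stored is None or incoming["timestamp"] > stored["timestamp"]:
            return incoming
        return stored
    ```

    Since each DC processes the same logical message independently, the timestamp rule makes the replicas converge regardless of processing order.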

  • @dennismulder: Thanks, Dennis. The answers make sense. I will reach out by email because I have one or two questions about a project we are working on. Thank you.

  • Keith Taylor

    Excellent and very practical session. Nice job to the presenter for being candid.



Comments closed

Comments have been closed since this content was published more than 30 days ago, but if you'd like to continue the conversation, please create a new thread in our Forums, or Contact Us and let us know.