The Big Dimmer Switch

Today, we're introducing Vector. It's a blog about developers, platform, and big ideas/trends in tech, but with a slightly different bent from what folks are used to on Channel 9...

It's been eight years since Channel 9 first introduced many of our key engineering folks to the developer community to deliver a transparent view into how we think about the technologies we're working on, the teams doing the work, and the huge community of people in and around Microsoft that are connected to all of it. None of that changes ... the team continues to produce the great content that has personified Channel 9 since 2004. But we're also going to spend some time on a more 'elevated' view of what we're thinking when it comes to our own strategy, our competitors, broad-based industry stuff, and of course the intersection of all of this with business. Sometimes it'll be topical, other times it will just offer commentary on something that people are talking about. We will also have guest posters talking about what they're doing that fits in this vein.

So that brings us to an actual post on something meaningful ... the disruptive impact of the services model on businesses.

- The Channel 9 Team

In 2007, Nick Carr published "The Big Switch," which chronicles the evolution of electricity from being locally-generated by businesses on their factory floors to something we all now consume as a pay-for-what-you-use utility. The book draws parallels between electricity and packaged software as a means to offer up a potential end-state for cloud computing, hence the title ... there will be, the book asserts, a switchover from in-house datacenters to software delivered as a utility, and it's just a matter of time. It's a good book, not steeped in technical jargon but rather a set of thoughtful mappings between these two eerily similar eras of technology disruption.

Nearly five years have gone by since it was first published, and a lot has happened (and not happened) relative to the pace of cloud adoption ... we know a lot more about what motivates companies to push some apps out the door and into public clouds in a hurry, while other apps will take their time, or maybe even continue to run on-premises in so-called private cloud environments. So it raises a bunch of questions: What is the end state for the services disruption? Is it public cloud platforms and SaaS, or a hybrid of public and private cloud deployments, combined with traditional IT? In other words, is the big switch actually more of a dimmer switch, in the sense that it's not just a simple matter of on/off? Are there any other historical lessons or examples of disruption we can draw insight from?


Disruption is a great word ... if you talk to enough developers, IT folks, and industry pundits about services, "disruption" shows up as by far the most-used and (IMO) best descriptor for what's happening in computing today. Scenarios that used to be impractical, uneconomical, and just plain impossible are now fair game for developers to build and deliver, all because of increasingly cheap and abundant resources like compute, storage, and bandwidth, available at scale and on a pay-as-you-go basis. So when people talk about disruption, the shape of its impact on the market is generally assumed to be one of outright replacement, such as the advent of electricity as a utility, the combustion engine's replacement of the horse-drawn carriage, and digital media's disruption of physical media, to name a few - basically, the end state in which the disruptive technology means buggy-whip obsolescence for the existing technology. But not all disruptions play out this way, and there are more than a few historical examples of the old and new technologies ending up coexisting, my personal favorite being the captivating story of the microwave business. Seriously, it's actually pretty interesting ...

So here's the story (courtesy of Wikipedia):

As we all know, radar research ramped up in the 1940s for military purposes, but one of the offshoots of it was the discovery that you could actually use microwave radiation to heat food. Like most technology disruptions (including cloud computing), the discovery & development pre-dated mass adoption by many years. In the case of the microwave, the earliest patents were filed by Raytheon just after World War II, and were licensed to Tappan for the first home-use microwave. It was introduced in 1955 and cost over $1,000, but not surprisingly it didn't do well in the market. Raytheon got back into the game by acquiring Amana and introducing the Radarange in 1967 for about $500, and that's really where market adoption began to take shape. In 1971, 1% of US households owned a microwave, by 1986 it was 25%, and today it's over 90%.

What's interesting here is how the adoption curve was shaped based on the market's education on what you could and couldn't do with this thing. Keep in mind that the value prop of the microwave was time-savings for the subset of cooking tasks for which the new technology could be used. Can you bake a cake with a microwave? They're not ideal for that. Can you thaw out frozen stuff? Yeah, it's great for that. What about broiling a salmon? Well, no. How about reheating leftovers? Yeah, it's perfect for that. Why do sparks fly everywhere when I put metal in it? You should really read the owner's manual. This was all part of what could best be described as a partitioning process ... partitioning what you do in a kitchen between the existing thing and the new thing. How was this process accelerated? It was just outright education, in many cases through print and TV advertising, which were rife with "ideas" about what you could actually cook, but also by shipping microwave cookbooks with the actual units. Some of the recipe ideas were a stretch (Thanksgiving turkey in a microwave?), but over time, people figured it out and knew what they should and shouldn't be cooking with it, and that essentially determined the end state for the disruptive technology: every modern kitchen will generally have both a conventional oven and a microwave.


So what can we learn from this? Allowing for the fact that the oven business and computing are two entirely different animals, the biggest and most obvious parallel is the ongoing education of the market that we're seeing now about which apps are and are not well-suited to public cloud deployment. In other words, within any business' app portfolio, there are no-brainers for cloud deployment (web workloads, email, collab, CRM, test, HPC, etc.), while other apps and workloads are subjected to more scrutiny, at least for the time being (ERP, mission-critical apps, apps with HBI data, etc.). Every business, no matter how small, has a portfolio of apps, and this process of portfolio partitioning is pretty similar to the task-partitioning process that shaped microwave adoption. In the software industry, we see this in business scenarios, in which there's a lot of focus these days on things like PII at massive scale, infrastructure security, data sovereignty, the regulatory environment, and a host of other factors that business folks consider as part of the go/no-go decision on cloud computing.

At any rate, the end state is becoming increasingly clear: businesses end up with a mixed bag of delivery and deployment approaches to deal with the variable needs & complexities of each & every app in their respective portfolios, at least for the foreseeable future. If you pay attention to cloud computing rhetoric in the industry, hybrid cloud is the new black, but for our part, the answer was there all along. There's an obvious tension between the possibilities afforded by what is arguably the biggest shift in our industry since the advent of client/server, and the practical realities of technology change for businesses that are grappling with this new era of computing. But from a technical strategy standpoint, there is no confusion on our part ... the design point that defines cloud computing is the path forward for app dev: the next set of apps that matter will be designed for scale and elasticity. They'll be resilient, multi-instance, and highly available. Everything we're doing in the platform across Windows Azure and Windows Server is geared toward enabling developers to meet this design point with the new apps they're building.
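
To make that design point a little more concrete, here's a minimal sketch (plain Python, purely illustrative and not tied to Windows Azure or any particular platform) of the shape these apps tend to take: stateless workers that can run as any number of identical instances, pulling work from a shared queue and retrying transient failures with backoff rather than assuming any single call or instance is reliable. The process_item, handle_with_retries, and worker names are hypothetical stand-ins, not a real API.

```python
import queue
import random
import time


def process_item(item):
    """Hypothetical unit of work; fails transiently some of the time."""
    if random.random() < 0.3:          # simulate a transient downstream fault
        raise TimeoutError("downstream service timed out")
    return f"processed {item}"


def handle_with_retries(item, max_attempts=4, base_delay=0.5):
    """Retry transient failures with exponential backoff, then give up."""
    for attempt in range(1, max_attempts + 1):
        try:
            return process_item(item)
        except TimeoutError:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))


def worker(work_queue):
    """Stateless worker loop: any number of identical instances can run this."""
    while True:
        try:
            item = work_queue.get_nowait()
        except queue.Empty:
            break                      # no more work; this instance can be scaled away
        try:
            print(handle_with_retries(item))
        except TimeoutError:
            print(f"giving up on {item} after repeated transient failures")


if __name__ == "__main__":
    q = queue.Queue()                  # stand-in for a durable, shared queue service
    for i in range(10):
        q.put(i)
    worker(q)                          # in a real deployment, many instances would share the queue
```

Because the worker keeps no local state, instances can be added or removed as load changes, which is the elasticity half of the design point; the retry-with-backoff loop is the resilience half.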

This means that today's great debate is twofold: a.) where will these new apps run? And b.) where will those existing apps end up? Off-prem in the cloud, or on-prem in the data center? The answer is "yes." And that's really the point ... the discussion about *where* the apps run (and what most people fixate on when they talk about the services disruption) is orthogonal to the discussion about whether it's an app that meets the bar for the cloud design point vs. an n-tier app running in a VM that's really yesterday's design point. Most businesses' portfolio of apps will have both kinds, old school and new school, and they'll be partitioned across off-prem and on-prem. Cloud adoption in big companies still has a ways to go, but even in these early days, the emerging trend line is becoming increasingly clear. Nick Carr even called this one in "The Big Switch", in the form of this excerpt from pg. 118...

"...larger companies...can be expected to pursue a hybrid approach for many years, supplying some hardware and software requirements themselves and purchasing others over the grid. One of the key challenges for corporate IT departments, in fact, lies in making the right decisions about what to hold on to and what to let go."

The idea that app portfolios will be partitioned in this way seems pretty intuitive to the folks that are grappling with the change, at least based on the customer discussions we're having these days. To be clear ... the cloud design point is, without a doubt, what we're headed toward with a new generation of apps that are going live in increasing numbers every day (and the subject of a future post), but their place in the broader business app portfolio makes its trajectory more akin to a dimmer that's turned up over time than a simple on/off.

Thanks for listening – comments & feedback welcome.

-Tim

Follow the Discussion

  • Dave Williamson (davewill)

    Yes and yes, Tim.  All businesses change from state A to state B by way of a transition that weighs the downside risk against the upside gains.  Your dimmer analogy is right on.

    The stress these days on the development supply side is the loss of market supply at scale for on-premises. To parallel the electricity analogy, the cost of backup generators today is way higher than it would be if everyone had a generator for primary power. The same holds true for the physical on-premises IT parts.

    Developers have to follow the dimmer and move their finely tuned code base from state A to state B just as carefully (change, retune, change, retune, change, retune,...).

  • tobi

    like

  • figuerres

    Interesting...

    I wonder if this can follow through with what it says....

    What is the strategy? How to get on the proactive / advanced side.

    How to not let others reap the benefit of your work...

  • SpragueD

    Nice start. Good luck with the blog. Cheers!

  • Charles

    @figuerres: What are you asking, exactly?

  • figuerres

    Charles wrote: "@figuerres: What are you asking, exactly?"

    Well, honestly, several things are on my mind that seem related to this post:

    1) Microsoft seems to be trying to change how they share information with developers. I see some valid reasons, but I also see problems with this shift. It's a double-edged sword.

    2) Related to #1 is the issue of how the tech changes, and how in recent history developers have been left with a feeling that Microsoft has tried to "jump" on a new trend at the cost of dropping another tech, or at least making the announcements come across to many developers as a kind of "slap in the face," as it were.

    3) Also related to #1, but also to the larger picture, is the list of things that Microsoft has done amazing work on, yet this work has failed in many ways to get the public acceptance we would like to see. Also the way other companies have been able to benefit from Microsoft's hard work.

    Just look around at the many "tablet devices" from multiple companies that are hitting the market and have sold many, many units.... A large part of the work to make them was done by Microsoft, but not one of the popular devices today is making money for Microsoft or running a Microsoft OS, as far as I know.

    Granted, just around the corner we have Windows 8 and Windows on ARM, but my concern is that, like Windows Phone, this will be coming to the market too late to get the market share needed to attract companies to invest in developing for it.

    I have yet to see any real demand for Windows Phone apps in the real world. I do see demand for Android and iOS apps.

    And this is interesting, as I also see MonoDroid and MonoTouch being used to make apps for these devices, again a case where Microsoft has done a HUGE amount of work but gets no fame, no money, and no real advantage.

    There is for sure a paradox: if you try to be "open and share," you might be giving stuff away.

    But if you do not share, then you can be accused of all kinds of evil things ....

    I do not have any simple answers .... but I do see a lot of things that concern me.
