The Big Dimmer Switch
- Posted: Feb 15, 2012 at 12:18 PM
Today, we're introducing Vector. It's a blog about developers, platform, and big ideas/trends in tech, but with a slightly different bent from what folks are used to on Channel 9...
It's been eight years since Channel 9 first introduced many of our key engineering folks to the developer community to deliver a transparent view into how we think about the technologies we're working on, the teams doing the work, and the huge community of people in and around Microsoft that are connected to all of it. None of that changes ... the team continues to produce the great content that has defined Channel 9 since 2004. But we're also going to spend some time on a more 'elevated' view of what we're thinking when it comes to our own strategy, our competitors, broad-based industry stuff, and of course the intersection of all of this with business. Sometimes it'll be topical, other times it will just offer commentary on something that people are talking about. We will also have guest posters talking about what they're doing that fits in this vein.
So that brings us to an actual post on something meaningful ... the disruptive impact of the services model on businesses.
- The Channel 9 Team
In 2007, Nick Carr published "The Big Switch," which chronicles the evolution of electricity from being locally-generated by businesses on their factory floors to something we all now consume as a pay-for-what-you-use utility. The book draws parallels between electricity and packaged software as a means to offer up a potential end-state for cloud computing, hence the title ... there will be, the book asserts, a switchover from in-house datacenters to software delivered as a utility, and it's just a matter of time. It's a good book, not steeped in technical jargon but rather a set of thoughtful mappings between these two eerily similar eras of technology disruption.
Nearly five years have gone by since it was first published, and a lot has happened (and not happened) relative to the pace of cloud adoption ... we know a lot more about what motivates companies to push some apps out the door and into public clouds in a hurry, while other apps will take their time, or maybe even continue to run on-premises in so-called private cloud environments. So it raises a bunch of questions: What is the end state for the services disruption? Is it public cloud platforms and SaaS, or a hybrid of public and private cloud deployments, combined with traditional IT? In other words, is the big switch actually more of a dimmer switch, in the sense that it's not just a simple matter of on/off? Are there any other historical lessons or examples of disruption we can draw insight from?
Disruption is a great word ... if you talk to enough developers, IT folks, and industry pundits about services, "disruption" shows up as by far the most-used and (IMO) best descriptor for what's happening in computing today. Scenarios that used to be impractical, uneconomical, and just plain impossible are now fair game for developers to build and deliver, all because of increasingly cheap and abundant resources like compute, storage, and bandwidth, available at scale and on a pay-as-you-go basis. So when people talk about disruption, the shape of its impact on the market is generally assumed to be one of outright replacement, such as the advent of electricity as a utility, the combustion engine's replacement of the horse-drawn carriage, and digital media's disruption of physical media, to name a few - basically, the end state in which the disruptive technology means buggy-whip obsolescence for the existing technology. But not all disruptions play out this way, and there are more than a few historical counterexamples, my personal favorite being the captivating story of the microwave business. Seriously, it's actually pretty interesting ...
So here's the story (courtesy of Wikipedia):
As we all know, atomic research started in the 1940s for military purposes, but one of its offshoots was the discovery that you could actually use microwave radiation to heat food. Like most technology disruptions (including cloud computing), the discovery & development pre-dated mass adoption by many years. In the case of the microwave, the earliest patents were filed by Raytheon just after World War II, and were licensed to Tappan for the first home-use microwave. It was introduced in 1955 and cost over $1,000, but not surprisingly it didn't do well in the market. Raytheon got back into the game by acquiring Amana and introducing the Radarange in 1967 for about $500, and that's really where market adoption began to take shape. In 1971, 1% of US households owned a microwave; by 1986 it was 25%, and today it's over 90%.
What's interesting here is how the adoption curve was shaped based on the market's education on what you could and couldn't do with this thing. Keep in mind that the value prop of the microwave was time-savings for the subset of cooking tasks for which the new technology could be used. Can you bake a cake with a microwave? They're not ideal for that. Can you thaw out frozen stuff? Yeah, it's great for that. What about broiling a salmon? Well, no. How about reheating leftovers? Yeah, it's perfect for that. Why do sparks fly everywhere when I put metal in it? You should really read the owner's manual. This was all part of what could best be described as a partitioning process ... partitioning what you do in a kitchen between the existing thing and the new thing. How was this process accelerated? It was just outright education, in many cases through print and TV advertising, which were rife with "ideas" about what you could actually cook, but also by shipping microwave cookbooks with the actual units. Some of the recipe ideas were a stretch (Thanksgiving turkey in a microwave?), but over time, people figured it out and knew what they should and shouldn't be cooking with it, and that essentially determined the end state for the disruptive technology: every modern kitchen will generally have both a conventional oven and a microwave.
So what can we learn from this? Allowing for the fact that the oven business and computing are two entirely different animals, the biggest and most obvious parallel is the ongoing education of the market that we're seeing now about which apps are and are not necessarily well-suited to public cloud deployment. In other words, within any business' app portfolio, there are no-brainers for cloud deployment (web workloads, email, collab, CRM, test, HPC, etc.), while other apps and workloads are subjected to more scrutiny, at least for the time being (ERP, mission-critical apps, apps with HBI data, etc.). Every business, no matter how small, has a portfolio of apps, and this process of portfolio partitioning is pretty similar to the task-partitioning process that shaped microwave adoption. In the software industry, we see this in business scenarios, in which there's a lot of focus these days on things like PII at massive scale, infrastructure security, data sovereignty, the regulatory environment, and a host of other factors that business folks consider as part of the go/no-go decision on cloud computing.
At any rate, the end state is becoming increasingly clear: businesses end up with a mixed bag of delivery and deployment approaches to deal with the variable needs & complexities of each & every app in their respective portfolios, at least for the foreseeable future. If you pay attention to cloud computing rhetoric in the industry, hybrid cloud is the new black, but for our part, the answer was there all along. There's an obvious tension between the possibilities afforded by what is arguably the biggest shift in our industry since the advent of client/server, and the practical realities of technology change for businesses that are grappling with this new era of computing. But from a technical strategy standpoint, there is no confusion on our part ... the design point that defines cloud computing is the path forward for app dev: the next set of apps that matter will be designed for scale and elasticity. They'll be resilient, multi-instance, and highly available. Everything we're doing in the platform across Windows Azure and Windows Server is geared toward enabling developers to meet this design point with the new apps they're building.
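As an aside, that design point (multi-instance, elastic, resilient) has a concrete shape for developers, and a tiny sketch can make it tangible. This is purely illustrative, not Windows Azure code; the `ExternalStore` and `StatelessInstance` names are made up for the example. The idea is that an app instance holds no state of its own, so any request can land on any instance and instances can be added or removed without losing data.

```python
class ExternalStore:
    """Stands in for a shared cloud storage or cache service
    (in a real app: a table store, blob store, or distributed cache)."""
    def __init__(self):
        self._data = {}

    def get(self, key):
        return self._data.get(key)

    def put(self, key, value):
        self._data[key] = value


class StatelessInstance:
    """An app instance keeps no session state locally; it reads and
    writes the external store, so instances are interchangeable."""
    def __init__(self, store):
        self.store = store

    def handle(self, user, item):
        # Fetch the user's state from shared storage, not local memory.
        cart = self.store.get(user) or []
        cart.append(item)
        self.store.put(user, cart)
        return cart


store = ExternalStore()
# Two interchangeable instances behind a hypothetical load balancer.
a, b = StatelessInstance(store), StatelessInstance(store)
a.handle("alice", "book")
cart = b.handle("alice", "pen")  # a different instance sees the same state
```

Because neither instance owns the state, you get elasticity for free in this model: spinning up a third instance or killing one mid-stream doesn't lose anything, which is exactly what scale-out and high availability demand.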
This means that today's great debate is twofold: (a) where will these new apps run? And (b) where will existing apps end up? Off-prem in the cloud, or on-prem in the data center? The answer is "yes." And that's really the point ... the discussion about *where* the apps run (and what most people fixate on when they talk about the services disruption) is orthogonal to the discussion about whether it's an app that meets the bar for the cloud design point vs. an n-tier app running in a VM that's really yesterday's design point. Most businesses' portfolios of apps will have both kinds, old school and new school, and they'll be partitioned across off-prem and on-prem. Cloud adoption in big companies still has a ways to go, but even in these early days, the emerging trend line is becoming increasingly clear. Nick Carr even called this one in "The Big Switch," in the form of this excerpt from p. 118...
"...larger companies...can be expected to pursue a hybrid approach for many years, supplying some hardware and software requirements themselves and purchasing others over the grid. One of the key challenges for corporate IT departments, in fact, lies in making the right decisions about what to hold on to and what to let go."
The idea that app portfolios will be partitioned in this way seems pretty intuitive to the folks who are grappling with the change, at least based on the customer discussions we're having these days. To be clear ... the cloud design point is, without a doubt, what we're headed toward with a new generation of apps that are going live in increasing numbers every day (and the subject of a future post), but their place in the broader business app portfolio makes its trajectory more akin to a dimmer that's turned up over time than a simple on/off.
Thanks for listening – comments & feedback welcome.