It would be handy to see the issues broken down by version, so you can tell how many versions you could upgrade a database before running into problems.
That said, the complexity often arises when a solution comprises multiple SQL Servers. The upgrade then has dependencies on a mirrored cluster with log shipping, replicated to a reporting / DW server, with some distributed queries & Service Broker in the mix as well. The order in which you upgrade becomes vital if you don't want to rebuild everything.
Personally I find the older-style Azure Portal much easier to use & find things in than the "New Look" Azure portal.
I'd also like the results to export directly to a SQL database table of my choosing, rather than loading into Excel & importing.
I'd also hope you follow the lead of the Best Practices Analyzer in Windows Server's Server Manager tool. When it displays an error, I get a link to a web page that gives me background on the issue & step-by-step instructions on how to fix it, often with screenshots or PowerShell commands.
I would like this feature to be optimized for bandwidth-sensitive sites too. Would it be possible to skip a backup if nothing has changed?
Example: a full backup daily, with log backups scheduled every 2 hours. But this is a 9-5, Mon-Fri operation. Outside those hours no data modification happens, though there may be an occasional query.
Could you check the LSN against the LSN of the prior backup, & just skip the operation if it's not required? Clearly it's not a huge win for log backups, but it may reduce clutter in the tables that track the backups taken. Avoiding the unnecessary backups on Sat & Sun could result in a 28.5% saving in bandwidth & storage.
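As a rough illustration of the check being requested (a sketch, not an existing product feature): on builds that expose sys.dm_db_log_stats (SQL Server 2016 SP2 onwards), an Agent job step could make the decision itself. The database name & backup path below are hypothetical.

```sql
-- Sketch only: take the log backup only if log records have been
-- generated since the previous log backup. 'Sales' & the path are
-- hypothetical placeholders.
IF EXISTS (SELECT 1
           FROM sys.dm_db_log_stats(DB_ID(N'Sales'))
           WHERE log_since_last_log_backup_mb = 0)
    PRINT N'No changes since the last log backup - skipping.';
ELSE
    BACKUP LOG [Sales] TO DISK = N'X:\Backups\Sales.trn';
```

On older versions the same idea could be approximated by comparing the last_lsn values recorded in msdb.dbo.backupset between runs.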
As much as we all like to get excited about 7*24 systems, there are still a large number of systems that are only active during office hours. Similarly, there are tons of reporting & DW systems that only change at night.
Nice. Seems we've come full circle. Like SQL 2000's SQL Notification Services, it uses SQL queries for rapid development, but takes advantage of the Azure framework to overcome the clunky setup that plagued SQL NS.
Constraining images to an image library seems quite a limitation. Personal libraries lock you into a single user. Public libraries solve that issue but are limited to a single drive, often the C drive, which is often quite full. It would be much more flexible to have an option to point to any file location; that way I could access images from a large, cheap external USB drive or a fileserver.
Bye Nathan. We'll miss you. I'll look forward to the new enhancements you'll be involved in making to the Azure platform.
PS: It would be REALLY nice if you could get some kind of event notification / event handler feature on the Azure Storage queues. Polling queues sucks. SQL's Service Broker is a significantly more efficient model & much easier to program against too.
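For contrast, a Service Broker receive blocks inside the engine until work arrives, instead of the client polling on a timer. A minimal sketch (the queue name & variable usage are hypothetical):

```sql
-- Blocks until a message arrives or the timeout elapses - no
-- spin/poll loop in the client. dbo.OrderQueue is a made-up name.
DECLARE @body   varbinary(max),
        @mtype  sysname,
        @handle uniqueidentifier;

WAITFOR (
    RECEIVE TOP (1)
        @handle = conversation_handle,
        @mtype  = message_type_name,
        @body   = message_body
    FROM dbo.OrderQueue
), TIMEOUT 5000;  -- give up after 5 seconds if the queue stays empty
```

Compare that with an Azure Storage queue client that has to issue a GetMessage call every few seconds whether or not anything is there.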
James, I am very disappointed by your omissions re the NN environment. You comment above about your experience with SQL's NN, yet your talk made no mention of it as an alternative. That work & the related DM algorithms were developed by MSR. The papers they wrote on this topic were seen by the global DM research community as significant breakthroughs. They worked closely with the dev teams to incorporate all those learnings into the SQL Data Mining engine. Unlike the core algorithm you present here, they scale massively & solve many of the "limited by memory" & model-training issues most DM algorithms / products have.

They also solve a ton of other issues you'd need to think about when embedding an NN system into your code. eg: a pluggable interface that works with most standards (ADO, OLE DB); a language, DMX, that makes it easy to enhance, configure & use with no code change; tools that automate the training & evaluation of your model; & the ability to tweak your model's parameters. AND, as it is a platform, it is easy to extend the DM experience by embedding your own algorithm into the platform to create new mining models. (Given you prefer your own algorithm, consider writing it as a DM plug-in & comparing your perf with what already exists. It frees you from the plumbing & allows you to focus on the bit you do best.)
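To illustrate the "no code change" point about DMX: a singleton prediction against a mining model is a single SQL-like query. This is a hedged sketch only; the model & column names are made up for illustration.

```sql
-- DMX, run against a SQL Server Analysis Services mining model.
-- [TM Neural Net] & the input columns are hypothetical names.
SELECT
    Predict([Bike Buyer])            AS PredictedPurchase,
    PredictProbability([Bike Buyer]) AS Confidence
FROM [TM Neural Net]
NATURAL PREDICTION JOIN
    (SELECT 35 AS [Age],
            2  AS [Number Cars Owned]) AS t
```

Swapping the model for a different algorithm changes only the FROM clause; the application code around the query stays the same.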
Yet you turned your back on it all, suggesting NN wasn't well documented (which it is) & talking about "the only thing going was a java app" (there are heaps of products). Now you're encouraging attendees to completely reinvent the wheel. Instead of a project that could take hours or days (6 lines of code to embed into their app, a SQL report, etc.), they will spend weeks or months doing everything from scratch & are still unlikely to get close to the multi-core performance, the parallel model training, the scale, or the benefit of the insight outlined in the many research papers published on this subject by the MSR folks.
In isolation, it was a nice talk; you covered the technical details & background of NN well.
But as a representative of MSR &/or Microsoft, you failed to accurately brief these folks & did your audience a disservice by implying that their only option was to start at square one.
You are a smart man & do write great articles. You have great influence. Please be more responsible in the future.