I just wondered whether anyone else had seen Blaise's talk yesterday at noon (well, at least it was noon on the US west coast) at O'Reilly Where 2.0 2011. He announced Bing Maps' new collaborative Read/Write World project, which revolves around the realtime geo-registration of media to existing aligned imagery such as Streetside panoramas, etc.
From what I heard yesterday, they've already signed on 360 Cities (a popular high quality panorama site) as well as C3 Technologies (3D aerial city-wide scanned models). They also have partners like Navteq, who have been driving around with Microsoft's Ultracams on top of their cars capturing Streetside panoramas for the last year, and DigitalGlobe, who is in the process of completing a high resolution satellite imagery scan of the entire 48 contiguous United States at a resolution of one foot (30 cm) per pixel.
Here's a link that collects the YouTube video of his talk, the official website, the Bing Maps Blog entry, and a few social network sites to keep up with the latest from the team: http://bit.ly/readwriteworld
The site is still very much in alpha, but they're taking the approach of letting the community know their intentions early and gathering feedback early and often as they develop. That's extremely similar to the Live Labs approach in many ways, so I'm glad to see the spirit of Live Labs live on even though the organization is gone.
For me, this is absolutely the most exciting announcement to come out of Microsoft since the initial Photosynth announcement, mainly because Photosynth, as it currently exists, has not delivered the critical piece of connecting neighbouring and overlapping synths and panoramas. With Read/Write World, we will not only get that piece of the puzzle finally put in place, but also be able to match other media from around the web against existing panoramas, synths, and model textures. There is also an aspect of the project aimed at developing unified viewers that can handle any number of different media types (synths, panoramas, videos, etc.) and then open sourcing those viewers to the community, so that robust viewers can be built in many different technologies; their team could never hope to complete viewers for all of those on their own without exceeding either their manpower or allotted resources.
Their indexing and matching services are all running on Azure, if that interests you, so this is also a throwback to some of the initial concept of Photosynth (after all, if Live Labs had been formed after June 2009, it would have been called 'Bing Labs'), where it runs on servers in the cloud as a service. Once they scale that up, they can open the doors to contributing to synths from any device, rather than only those that can run the Photosynth app for Windows client-side.
@Charles, I would love to see you sit down with David Gedye, Blaise, Gur, or whoever else is appropriate to talk to over there, as long as it doesn't mean them giving away their game plan to the Californians.