I started using the ARPAnet in 1975. It seemed harmless enough at first, but within a few years I was spending my whole day reading E-mails from our Program Manager at ARPA. The subsequent decades online have been, um, a mixed blessing...
Thanks for the video. For all the talk about how wonderful .NET is and how "the future of programming is managed code," etc., it would be nice to have a follow-up video specifically about programming unmanaged code with VC 8 and about MS's roadmap for native code development in future versions. After all, Vista itself is based on the Windows Server 2003 R2 code base, so I wonder to what extent the future of programming in Windows really is managed code. The Win32 API will hang around for a long time yet, and it would be good to have a video on new and forthcoming tools for programming with that API and with MFC or ATL.
More videos on C++ programming emphasizing concurrency and clustering would also be welcome!
GaryBushey wrote: The real question that didn't get asked: did she write the infamous "Developers" speech?
Watch for the follow-up video: "Steve Ballmer's Choreographer".
Then, of course, everyone will want to see the "Steve Ballmer Workout" video, in which his personal trainer demonstrates the chair-toss exercise.
scobleizer wrote: > they are probably taking source code from linux and putting it into longhorn right now...!!!!!!
Did you know it's a fireable offense for anyone working on Windows at Microsoft to look at GPL code? So, I doubt it.
Since GPL code such as Linux is regarded as being that "cancerous", I suppose MS people wanting to recycle *NIX code will have to dig into their 16-bit archives and pull out a listing for one of MS's first products, namely XENIX. MS still owns some of the rights to that, don't they, or is the source code now owned by SCO (present or previous incarnation thereof)? XENIX incorporated some BSD features into the AT&T code base, so Linux fans might find a few worthwhile tidbits in it, albeit ones that you could more easily write from scratch than extract. (I'm assuming it would be of very limited use, unless MS has plans to release Longhorn for the 8086 or 80286, that is!)
Blackcomb will be the first version that fully implements the vision laid out in a platform called Cairo in 1995, although it has greatly expanded since that time.
Not to belittle LH, but it is like the middle film in a trilogy.
Ah, so Blackcomb will be "The Godfather Part III". Now, that will really be worth waiting for!
I think instead I'll peek in and see what's playing on the next screen over, here at the multiplex. Pass the popcorn, Mr. Jobs!
> I think our track record has shown a consistent track record of winning and even in areas, like the Internet, where people [said] we weren't gonna win we came back to win.
Gee, I was unaware that the Internet was even a "competition". Actually, I had always thought of it more as a "cooperation", but obviously that attitude is alien to MS.
I do not want MS to "win" the Internet. I have been using the Net since way before MS even existed, and not only does it not need MS (or anyone else, for that matter, including Sun) to "win" it and control it, but in fact it's better if no one does so. To a certain extent, Hailstorm was MS's attempt to "win" the Internet, by coming between the end users and the content providers and other commercial businesses (no doubt with the ambition that eventually the users would regard the MS middleman as their single point of contact, thereby enabling MS to assume the back-end role themselves and then take over whatever business area they intended to "win"), but that failed, and I'm glad it did. Palladium / DRM is another approach to MS's "winning" the traffic between the end users and the content providers / retail businesses, and I hope that initiative fails too, for the same reason.
So Mr. Ballmer is mistaken in his belief that developers want, most of all, MS to "win". Perhaps developers **really** want MS to conform to agreed-upon standards, so then the developers' apps will succeed, regardless of which platform provider "wins"!
It should be interesting to see whether users can develop GIS-like client-side apps for Virtual Earth and for the Google Maps / Keyhole product, without having a "real" GIS, by using only AJAX. If that is possible, it must be painful for MS to watch, given that for years now (ever since the Netscape wars) the mantra there has been "Windows good, browser bad". That's why IE has been stagnant, with just one security Band-aid after another applied to it, instead of being completely re-worked, as so many users wanted. The philosophy of MS has been "rich clients" (and hence "rich MS"!) requiring Windows, ActiveX (or now .NET), etc., not browser-hosted scripts. After all, if your apps run in Firefox, who cares about the OS?
I do wish this feature hadn't been dropped for C++. I realize that C++ is a lot more, er, "unmanageable" than the "pure" .NET languages, so it gives extra headaches beyond those that result from building something like this for C#. However, (based on the C#/VB work) couldn't a tool have been completed at least for the **managed** part of C++/CLI? It's disappointing for MS to come out with a class designer that doesn't handle C++ at all, especially considering how many CASE tools there were for C and C++ back in the era when those were so popular.
It doesn't seem to me that either Part 1 or Part 2 of this video made a very compelling case for why Mom and Dad should want a 64-bit computer. That's of course because in fact they don't - E-mail, Word, Excel, Intuit, Photoshop, etc., all work fine now in 32 bits. I think the speakers got closer to the truth in urging developers to climb the mountain of 64 bits simply because, in effect, "it's there". Intel and AMD will sell mainly 64-bit CPUs starting within the next year or so, so it becomes a moot point whether they're needed. It won't really change the world - after all, lots of us have been using 64-bit SPARCstations etc. as our main work platform for years and years. What **may** have a big impact, though, is 64-bit multi-core, multi-processor, distributed, and eventually grid computing. I'm not saying Mom and Dad will have a Beowulf cluster in the basement to run Intuit and Photoshop faster, but the "utility computing", "Web tone", etc., model of computers everywhere (and seen nowhere) could come to pass, and developers (who are already thinking in terms of 'Can I make this run on a cell phone?' when they plan an app) need to be thinking of 'What if this app had access to arbitrarily many CPUs?' when they do their design. How about some Channel 9 videos, and a Route64 road show, on 64-bit C++ using OpenMP?