Comments

earnshaw (Jack Sleeps)
  • Rob Short (and kernel team) - Going deep inside Windows Vista's kernel architecture

    I think videos like this one with Rob Short are *GREAT* because they provide context for understanding the product, how it got that way, and how people think it should evolve.  Learning about a product in a contextual vacuum is, for me at least, an unrewarding exercise.
  • Life and Times of Anders Hejlsberg

    Anders is THE MAN
  • Steve Ballmer - Quick chat with Microsoft's CEO

    The "take home" message of the Steve Ballmer video is that the next 10 years, for developers, will be as good as or better than the last 10 years.  It is left to the viewer to decide what "better" means.  Perhaps it means more interesting work.  Perhaps it means higher volume of work in general.  Perhaps it means increased gross receipts for developers.  What Mr. Ballmer thinks when he thinks "developer" remains a mystery.  What aids to developers Microsoft plans to offer over the next 10 years is unspecified.  I come away from his brief comments with the same confused, "what was THAT all about?" that I used to experience at half-time when coach would give us a pep talk.

    Nevertheless, it is extremely good to have the big cheeses honor us with a few comments.  I don't think this qualifies as a contribution to the still-missing dialogue between Microsoft management and developers, but it is a start.  (I know, management believes there is an ongoing, long-term, in-depth dialogue that makes the world safe for innovation.  In my neck of the woods, the means to advise Microsoft management are non-existent.)  Good show!

  • Tony Goodhew - The path to Orcas (future Visual Studio), studying the market research

    By observation I conclude that decisions made in Redmond are based on the idea that the ENTIRE customer base has the enthusiasm for product churn that dotnetjunkie embraces.  The young, enthusiastic lions in Redmond steep in a go-go atmosphere and come to think that the go-go atmosphere exists everywhere.  Well, it doesn't.  I love new technology that adds value.  New technology that is simply recycled old technology is a marketing trick, not an advancement (e.g., renaming OLE to COM, and COM to ActiveX controls).  It is strange to think .NET is now considered "old" technology.  From a teaching, learning, and doing point of view, .NET has got COM beat hands down.  Now we are to jettison .NET for something bigger, beefier, and bouncier.  If so, let it be.  But first, figure out how those of us who prefer to take our lessons gradually and systematically can ease into it.  It is valuable to see the taxonomy of technologies from a historical perspective and to arrange one's teaching plan so that the true core technologies are introduced first.  Or not.  I could be way off base here.
  • Windows, Part I - Dave Probert

    Very fine presentation!  The old mainframe that I used to work on had processes (called "runs") and threads (called "activities").  The thread dispatcher maintained context for each thread in a structure called a "switch list" (SWL), which was a misnomer because the actual switch list was a priority table of linked lists of SWLs.  Paired with the SWL was the Activity Save Area (ASA), which contained processor state (relocation information for program instructions and data, which were separated) and the contents of the CPU registers the last time the thread halted in favor of a different thread.

    There was no paged memory.  Programs were either entirely in memory or entirely out on the swapping drum (yes, drum).  That decision was taken to avoid thrashing: working-set theory didn't exist yet, and paging would have been overkill for this machine's small memory of iron cores.  The OS was written to run equally well, and in parallel, on all available CPUs.  By 1970, this machine only crashed once or twice a day because of bugs in the OS.  It supported dozens of users in interactive mode on teletypewriters (Model 33, Model 35), and it also ran jobs in the background as "batch" processing.  A huge backlog of batch jobs would ordinarily accumulate during the day and would be worked off at night.  Many trees died to afford users something to look at as output.

    The whole thing was royally pooh-poohed by sophisticated faculty from universities "of quality" where they had adopted Unix wholesale (and Multics before that) as the sine qua non of operating systems.  The thing they didn't like was that the interactive mode presented exactly the same user interface as the batch mode.  (The text-based user interface was nothing like JCL: it was NOT compiled, it comprised simple commands.)  That was quite an advantage when creating production code and for testing out production run control language.  But hell, what good is an operating system without redirection and piping and a tree-structured file system, etc., etc.?

    Anyway, my point is that Windows NT is a very good operating system, beats the pants off of Linux in terms of out-of-box usability, and builds on the valuable legacy of the OS I described above, which was very good for its time -- and still exists -- and still can run binary programs written for it 40 years ago, if you can figure out how to read in the deck.
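
    Roughly, the dispatcher's bookkeeping could be sketched like this in C (the names SWL and ASA come from the description above; the fields and layout are invented for illustration):

    #include <stdint.h>

    /* Activity Save Area: the processor state saved the last time a
       thread ("activity") halted in favor of another.  Fields invented. */
    typedef struct ASA {
        uint64_t registers[16];   /* CPU register contents at last halt */
        uint64_t reloc_instr;     /* relocation info for instructions */
        uint64_t reloc_data;      /* relocation info for data (separate) */
    } ASA;

    /* One "switch list" entry: a per-thread context node. */
    typedef struct SWL {
        struct SWL *next;         /* next activity at the same priority */
        ASA        *save_area;    /* the paired Activity Save Area */
        int         run_id;       /* owning process ("run") */
    } SWL;

    /* The actual "switch list": a priority table of linked lists of SWLs. */
    enum { NUM_PRIORITIES = 32 };
    static SWL *switch_list[NUM_PRIORITIES];

    /* Dispatch the highest-priority ready activity, or idle on NULL. */
    static SWL *dispatch(void) {
        for (int p = 0; p < NUM_PRIORITIES; p++) {
            if (switch_list[p] != 0) {
                SWL *t = switch_list[p];
                switch_list[p] = t->next;   /* unlink the list head */
                return t;                   /* restore its ASA and run */
            }
        }
        return 0;                           /* nothing runnable */
    }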

    When I first read about hyperthreading in 2002, I decided that Intel had built a chip that was able to hold context for two threads at the same time.  From what I have read in response to Dave Probert's talk, I was right.  Windows must somehow schedule the right two threads on the chip so that the fast context switch in the chip can be used; otherwise, HT is of no value.  I imagine the top two threads on the priority queue would ordinarily be a good choice, assuming they aren't already scheduled on some other chip.  Then, when one of the two threads is blocked waiting for, say, an I/O completion, the other thread can instantly be restarted using context already onboard the chip.  There are a lot of CPU cycles to be saved by avoiding the slow context switch!
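
    A minimal sketch of that pairing idea, assuming a simple priority model (purely illustrative; not how the Windows scheduler is actually implemented):

    #include <stddef.h>

    typedef struct Thread { int id; int priority; int runnable; } Thread;

    /* Pick the two highest-priority runnable threads for the two logical
       processors of one HT core, so that when one blocks (say, on I/O)
       the core's second onboard context can start instantly, skipping
       the slow software context switch. */
    void schedule_ht_core(Thread *ready[], int n,
                          Thread **logical0, Thread **logical1) {
        Thread *best = NULL, *second = NULL;
        for (int i = 0; i < n; i++) {
            if (!ready[i]->runnable)
                continue;
            if (best == NULL || ready[i]->priority > best->priority) {
                second = best;            /* old best demoted to sibling */
                best = ready[i];
            } else if (second == NULL || ready[i]->priority > second->priority) {
                second = ready[i];
            }
        }
        *logical0 = best;     /* runs immediately (may be NULL if idle) */
        *logical1 = second;   /* parked in the chip's spare context */
    }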
  • Kang Su Gatlin - On the 64-bit Whiteboard

    As a matter of policy, I would avoid unmanaged code in a 64-bit environment.  Of course, there is the porting problem.  People can and do write code with pointers, employing pointer arithmetic; that was peachy in a time, long past and not far removed from assembly language, when any geeky method to save a few CPU cycles or bytes of memory was smiled upon.

    Any time the underlying architecture changes in some fundamental way, these insects come out of the woodwork.  Mercy, with the high level of abstraction now possible, can't we stop revisiting this stuff?
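
    One classic specimen of that porting problem, for the record: code that stashes a pointer in a 32-bit integer works by accident on a 32-bit system and truncates silently on a 64-bit one.

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        int value = 42;
        int *p = &value;

        /* The old sin: assuming a pointer fits in 32 bits.  On a 64-bit
           system the upper half of the address is silently discarded. */
        uint32_t stashed = (uint32_t)(uintptr_t)p;
        int *bad = (int *)(uintptr_t)stashed;   /* may point anywhere */
        (void)bad;                              /* do NOT dereference  */

        /* The portable fix: uintptr_t is guaranteed to round-trip. */
        uintptr_t safe = (uintptr_t)p;
        int *good = (int *)safe;
        printf("%d\n", *good);                  /* prints 42 */
        return 0;
    }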
  • Herb Sutter - The future of Visual C++, Part I

    I rode an old war-horse called assembly language through the bulk of my programming career.  I also dabbled in Fortran, APL, COBOL, even did some Algol and Snobol.  So it's nice to hear something positive said about the value of getting closer to the iron, as the trend has been to abstract the iron away.  I get a charge out of C# because it makes doing simple things simple and increases my productivity.  And I don't have to create for the 100th time some variation on a collection class.

    I had a conversation last summer with one of my contemporaries during which I remarked that today's Computer Science student may not be getting fully exposed to core concepts like trees, queues, hash tables, deques, stacks, spin locks, and so forth, because these are abstracted away as prewritten classes.  Not that that's bad in general.  It's not.  But it poses a problem for teachers of computer science, who must ensure that the way these things work under the hood is revealed.

    Of course, this piece is about C++, which I used for many years as a systems programming language.  When I first read the C++ for .NET book, I was frankly appalled at how different the language I had grown so familiar with looked.  That's when I learned C#.  I don't denigrate C++, and I am happy to learn that the problems with using C++ in a managed-code environment are being addressed.  For me, though, I use C++ only when C# does not fulfill my needs.
  • John Pruitt - Thinking about the customer in design

    That a product does not end up looking like Frankenstein's monster makes sense from many points of view: end users, marketers, developers.  Still, Microsoft customers receive products that present an inscrutable public face, with many controls whose purpose, rationale, and existence are not at all clear from their appearance in the product, let alone from in-product "help" and published literature, if any.

    I am reminded of the repetitive experience of relearning the IDE as each release is published.  If I perform a certain task using a certain idiom in release X, then that task is performed using a different idiom in release X+1 without so much as a how-do-you-do.  There are tons of widgets that I don't care about and won't ever use.  There are some that I should know exist and should be able to learn in under an hour through well-defined teaching aids integrated with the product.  Differences between releases should be better explained.  What is available should be made explicit.  A good presentation of the design philosophy of a release would help users be better users of that release and willing buyers of the next release.

    Back in the early 1970s I looked forward to receiving weekly updates (natural-language summaries of the purpose of a feature set and how to use it) on a locally developed text editor.  It was a pleasure to vicariously experience the product as it was being built and to learn each feature as it was added.  In the aughts, things have devolved so that I see only the end product, with no systematic approach provided to learn what it offers and how to use it.  This goes for everything from the IDE to the operating system.

    Some of the Knowledge Base articles may as well be written in Martian for all the insight they provide.  There must be a tacit rule that all such writing be overly concise, rigorously accurate, commit to nothing, and, in the end, explain nothing, unless you came to the article already more or less understanding the solution to the problem you were trying to solve.  What must be obvious to the people who work with products on a daily basis, because they create them, need not be and usually isn't obvious once the product is bought, paid for, and sitting on someone's desk.  Crossing the gap has no single or obvious solution, but the gap should be recognized.
  • Don Box - What goes into a great technical presentation?

    I think nudity should become de rigueur for all technical talks.  At least, when the talk is going badly, there will be other things to think about.
  • Jason Zander - Tour of the .NET CLR team

    It is interesting to note that the same kinds of memory-management bugs of 30 years ago occur today.  The technology for resolving them is oh so much better, but I have often had to put PRINT statements into memory managers to detect by whom, when, why, and in what order some buffer got doubly allocated or orphaned.  Fortunately, these bugs are now closeted away from application programmers.  They are left for the systems programmers of Washington State to detect and destroy.

    One machine I worked on had 18-bit addresses of 36-bit words.  Inside the OS these addresses were not relocated; they pointed to actual hardware locations.  It was great fun when code using these addresses ended up clobbering memory completely unrelated to the code's function.  Evidence that such corruption had happened could surface literally days after the fact.  Tracking such phenomena back to the point of origin was very, very difficult.  Eventually the hardware evolved to the point that so-called absolute pointers into memory were abolished, so that memory limit registers were always used and corrupt pointers usually caused an immediate, diagnosable machine stop.  But I digress.
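
    A minimal sketch of that PRINT-statement technique, in C with invented names (the original would have been assembly):

    #include <stdio.h>
    #include <stdlib.h>

    /* Wrap the allocator and log every acquisition and release; a doubly
       freed or orphaned buffer then shows up in the trace with who did
       what, and in what order. */
    static void *traced_alloc(size_t size, const char *who) {
        void *p = malloc(size);
        fprintf(stderr, "[alloc] %p, %zu bytes, by %s\n", p, size, who);
        return p;
    }

    static void traced_free(void *p, const char *who) {
        fprintf(stderr, "[free ] %p, by %s\n", p, who);
        free(p);
    }

    int main(void) {
        void *buf = traced_alloc(256, "parser");
        traced_free(buf, "parser");
        /* A second traced_free(buf, ...) here would print a second
           [free ] line for the same address: the smoking gun. */
        return 0;
    }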