Tony, it’s a shame you’re a physicist. I say this because in your talk you never alluded to biology or anything like it. There are superb examples of componentisation in nature, at both the macro and micro level, and from an architecture viewpoint they are something to be admired.
This is not to say they don’t exist in quantum physics.
For instance, some basic building blocks are common to all forms of life on earth, such that a butterfly might become an albatross given time. As software’s existence is imperceptible on biological timescales, it could just be that one of your legacies, or a colleague’s, will have been to help create a software molecule that becomes fundamental to all future software.
My favourite from all the years of Word, Excel, etc. has got to be drag and drop.
I wonder whether Mary and her colleagues operate in the stratosphere of Microsoft, or whether they become involved, at the developer or detail level, with the vision of product evolution as it affects users legally?
The legal integrity, or fidelity, of a user’s documents in the Office product range still has a long way to go. When I look at the evolution of the Office product, I see its genesis as starting in the non-legal domain, as a toy, in a paper-bound world. The product still lives there, is no toy, and has been taken up in a big way by global government agencies and by industries like pharmaceuticals, where there is an FDA or other legal requirement.
Mary works in an undiscovered country and alludes to the difficulties of relating previous case law to new technology. This seems especially so when the case law can come from many countries and change dynamically.
What I would like to see is discussion of how the fidelity of an electronic Office document cited in evidence would be established beyond repudiation. Perhaps there should be a version of Office designed for this purpose, used specifically when there is a legal requirement or obligation.
Another issue is technology-related. If legal documents were stored electronically, their content would need technology to read it, technology that will not necessarily be available in the long term. This contrasts with paper, which requires no technology to impart its meaning and which has considerable longevity compared with a hard drive platter.
Is this the push behind Office 12 going XML, I wonder? It goes a long way towards overcoming the technology issue, but it weakens the fidelity guarantee. Presumably, digital fingerprints for archived Office documents will be built into Office 12?
This has to be a good move! Outside of XML, is this a philosophical return to WordPerfect/WordStar? Those technologies had the equivalent of tags surrounding text.
Given that Office has such a wide customer base, it will make the document’s content technology-proof.
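To make the fingerprint idea concrete, here is a minimal sketch in Python of how a cryptographic digest could pin down a document’s bit-for-bit fidelity. The file name and the choice of SHA-256 are my own assumptions for illustration, not anything Office actually does.

```python
import hashlib

def fingerprint(path: str, algorithm: str = "sha256") -> str:
    """Compute a cryptographic digest of a document file.

    Anyone holding the same digest can later verify that an archived
    copy is bit-for-bit identical to the original, which is the basis
    of a non-repudiation check.
    """
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        # Read in chunks so arbitrarily large documents fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage: fingerprint("contract.doc") returns a 64-character
# hex string that changes if even one bit of the file changes.
```

In practice a bare hash only proves integrity; establishing *who* produced the document and *when* would need the hash to be signed and timestamped as well.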
Long-term storage of documents/data is becoming a problem as they are tied to the version of an application. The change from O97 to O2000 is a case in point. Documents may have to be kept for 30 years or more.
A rule of thumb has always been to save the info in ASCII format. However, given that the new docs will be compressed, it seems that bit 8 is still alive.
In this respect, one assumes that the zip format in use is not proprietary and will still be available in 30 years’ time.
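For what it’s worth, the zip format is publicly documented (PKWARE’s APPNOTE) and can be read with nothing beyond a stock language runtime. A small sketch in Python, writing and re-reading an archive in memory and checking whether the payload stays within 7 bits:

```python
import io
import zipfile

def is_seven_bit(data: bytes) -> bool:
    """True if every byte fits in 7 bits, i.e. plain ASCII ('bit 8' unused)."""
    return all(b < 0x80 for b in data)

# Build a small zip archive in memory and read it back using nothing
# but the standard library; the container format itself is open.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("note.txt", "Plain ASCII survives compression.")

with zipfile.ZipFile(buf) as zf:
    text = zf.read("note.txt")

print(is_seven_bit(text))  # prints True: the text payload is pure ASCII
```

The payload comes back as pure ASCII even though the compressed container bytes themselves make free use of bit 8, which is exactly the distinction the rule of thumb is about.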
Who cares, I hear you say!
Well, it could be information relating to your mortgage; the one that you pass down to your children.
I was doing some research into the trade-offs between SATA and SCSI hard drive technologies when I came across the fact that Server 2003 supports RAID 5 striping, something that is new, I believe, in the server product.
I was also looking at Tagged Command Queuing (TCQ) and the related SATA NCQ for hard drives.
Does Server 2003 support TCQ, or something akin to it, when managing RAID 5, or does the kernel leave all of this to the disk controller hardware?
I ask because a number of motherboard manufacturers, Asus for one, have implemented RAID 5 but ignored TCQ, I guess because the intended market is workstations. However, Abit has implemented NCQ without featuring RAID on a server board.
Should the kernel glue these two together somehow?
Another reason for asking is one of vision: perhaps in the future distributed computing may become an everyday occurrence. In that scenario, workstation hard drive access begins to mimic that of a server, in that jobs may be running from many diverse sources, giving TCQ a role to play.
What Dave brings out in this piece reminds me of something that has sadly disappeared from modern computing by way of evolution: in those early days of the ’70s and ’80s, you were more than likely to be an electronics engineer as opposed to a programmer.
Chicken and egg: which came first, the hardware or the software?
One can sense the potential of multi-core CPUs, but as Dave points out, in many senses they have been around for a while at the machine level. The challenge for contemporary programmers is to exploit this new technology, and it’s not going to be easy.
Dave has a lot more control at the kernel level than, say, a C# programmer has with a real-time thread. So do we end up with an NT5-style programmer’s kernel sitting atop the machine kernel simply to control such threads?
In my experience, real-time threads are of limited use outside of machine control and mechanical automation, e.g. CNC lathes. The operating systems that control these machines are very similar in philosophy to what is being done in NT5. The only real difference is that real time means literally that: a tool tip must arrive at its destination at the right time, or crunch.
Now that C9 is in the movie business, and doing it so well, it’s good manners to name the cameraman, the sound man, and the interviewer, as well as the interviewees, in a fixed-location piece like this one.
It would also be nice to see a list of published material such as whitepapers, books, etc. It could, after all, supplement someone’s income.
I must admit to feeling a little jealous of the kernel designers and maintainers in that, though I don’t know for sure, I guess the scope of what they do has remained manageable and somewhat stable over the years.
Whilst for the rest of us in the application layer, well, change is the name of the game, and the scope sometimes feels infinite and overwhelming.
The application layer has seen explosive growth in support technologies for application design. I just wish the same were available for OS design. I’d love to see thousands of prospective OSes being churned out, each one relevant to the moment and not constrained by commercial or backward-compatibility hang-ups. Maybe this is so and I’m just out of touch.
Perhaps Neil [et al.], given their collective experience, should write a book (or books) along the lines of Donald Knuth and the MIX language, but instead of a language, define a model OS from first principles.
My perception of MS products is that versions seem to be coming thick and fast of late, which is different from what used to happen. Is the business guilty of moving the stock around the store to keep the buyer from becoming habitual?
Given that we are all subject to some degree of Pavlovian conditioning, the small changes that take place seem more of a nuisance than of real value.
Instead of tinkering with a product’s look and feel, should MS instead be investing in heuristic learning as the user interacts with the product?
This is partly there in the IntelliSense features.
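As a toy illustration of what such heuristic learning might look like, here is a sketch in Python (my own invention, not how IntelliSense actually works) that ranks completions for a prefix by how often the user has typed each word before:

```python
from collections import Counter

class CompletionModel:
    """Toy sketch of learning from user behaviour: suggest completions
    for a prefix, ranked by how often the user has typed each word."""

    def __init__(self):
        self.counts = Counter()

    def observe(self, word: str) -> None:
        # Called every time the user finishes typing a word.
        self.counts[word] += 1

    def suggest(self, prefix: str, k: int = 3) -> list:
        # Most frequently seen matches first.
        matches = [w for w in self.counts if w.startswith(prefix)]
        return sorted(matches, key=lambda w: -self.counts[w])[:k]

model = CompletionModel()
for w in ["margin", "margin", "merge", "macro", "margin", "merge"]:
    model.observe(w)

print(model.suggest("m"))  # prints ['margin', 'merge', 'macro']
```

A real product would obviously need far more context (document type, recency, surrounding words), but the principle is the same: the interface adapts to the user rather than the user adapting to a rearranged interface.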
It’s been a long, long time since I worked with FoxPro, or was it FoxBase+, back in 1988.
At the time it was the fastest thing around, with version 2.x coming on five 1.44 MB diskettes.
We won't talk about the Rushmore heist!
Given that kind of footprint, have you considered re-issuing a cut-down version of a similar size as part of the Express range of products? It would be a cool way to introduce a new generation of programmers and hobbyists to an xBase legacy.
I'd like to see some benchmarks that compare FoxPro with other MS database technologies - anybody up for the challenge?