Good stuff. I remember the early days of the Internet, when I used to spend quite a lot of my time on Usenet (newsgroups). I finally came to the realisation that I was spending way too much time on it and email, so I went cold turkey - changing email
addresses, very cautiously giving the new ones out, deleting the Usenet setup and history, etc. The best thing I ever did.
Well, I did lose a lot of "customers" over this. But in reality, they were mostly scavengers, rarely ever paying for any of my time or assistance. So apart from the loss of some self-esteem, I was better off.
I suppose the time will come when I'll have to do the same with Ch9. Though, I do have better discipline, these days...
PS: Tim, I can be a pedantic sod at times. Which means that when I notice a problem, I tend to be distracted by it, and investigate. I figure there might be a few others like me, so letting them know up front
may allow them to see past it. Good work, anyway. I doubt I could operate a camera AND engage in a coherent conversation at the same time - assuming you weren't cheating...
Yeah, the first edit-point does break the thought, so here's a filler for you...
I wrote a short (colourful) article (many years ago) that talked about being aware of unexpected behaviours, which I think is relevant to this topic of UAC spoofing. The article was specifically about floppy-based virus infections, and how, through
the discipline of keeping the write-protect tabs in place at all times (yes, 5.25" floppies), I was able to detect suspicious behaviours, like the floppy being accessed at (repeatedly) inappropriate times.
By familiarising myself with the expected behaviours, awareness of any unexpected ones would trigger an investigation, checking for viruses, etc.
So in the case of UAC spoofing (without the Secure Attention Sequence - Ctrl-Alt-Del), if you see more than one elevation request, be suspicious!
Do I think that's a sustainable practice, having to train users in what are expected and unexpected behaviours? No, but until UAC is nailed down and "hardened", so that it becomes a (first-class) security boundary, you are stuck with having to
re-live (some of) the past...
Because one of the aims of a virus (at that time) was to spread itself via floppies, a virus would repeatedly attempt to write itself to the floppy until it finally succeeded. In some cases, however, the virus would continue to check (regularly), even though
it had successfully written itself to (infected) a floppy. Given that the floppy drives were quite noisy, it wasn't difficult to notice.
So does this mean we are going to get a WinRE video ?
With my skillset, I'm seriously considering whether there's any opportunity for someone like me in creating "addons" for WinRE, or perhaps convincing them to allow some kind of addon API.
Now a direct question for Jamie/Andrew.
I ran into an issue a couple of days ago which I thought had been fixed since Windows 2000 SP2: 48-bit LBA on ATA (IDE) hard disks.
In this case, there was an existing Windows 2000 Server where the Admin needed to set up a parallel install of Windows on a second D: volume. This was a 250GB basic disk volume, and it was ~60-70% full. The Admin booted off his Windows 2000 SP4 (slip-streamed) CD
and proceeded through the text-mode setup, installing to D:\WINNT. Upon reboot, entering what should be the second-stage, GUI part of the installer, you get a BSOD and a message about D:\WINNT\SYSTEM32\NTOSKRNL.EXE being missing.
Upon investigation, it became apparent that NTOSKRNL.EXE was located well beyond the old 28-bit LBA limit (the ~128GB point). But I wasn't able to investigate any further, due to urgency. Is this likely an issue of a bad service-pack integration on the Windows
2000 Server CD? That is, was the CD still using the old, pre-48-bit-LBA IDE/ATAPI driver(s)?
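For anyone who hasn't run into this before, the arithmetic behind that ~128GB point is simple: 28-bit LBA can only address 2^28 sectors of 512 bytes each. A quick sketch (the 150GB offset below is just a hypothetical placement for NTOSKRNL.EXE, not a measured value from the actual disk):

```python
# 28-bit LBA ceiling: with 512-byte sectors, only 2**28 sectors are
# addressable, i.e. 137,438,953,472 bytes = 128 GiB (~137 GB decimal).
SECTOR_SIZE = 512
LBA28_MAX_SECTORS = 2 ** 28

limit_bytes = LBA28_MAX_SECTORS * SECTOR_SIZE
print(limit_bytes)             # 137438953472
print(limit_bytes // 2 ** 30)  # 128 (GiB)

def needs_lba48(byte_offset: int) -> bool:
    """True if a byte offset on the disk lies beyond the 28-bit LBA window."""
    return byte_offset // SECTOR_SIZE >= LBA28_MAX_SECTORS

# A file sitting ~150 GB into the disk is unreachable to a driver
# that only speaks 28-bit LBA:
print(needs_lba48(150 * 10 ** 9))  # True
```

Which would neatly explain the symptom: a 48-bit-aware installer writes the file out past the 128GiB mark, then a 28-bit-only boot-time driver can't read it back.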
PS: I side-stepped the issue by resizing and moving the partition up, leaving a 2GB partition at the start of the disk for the new parallel install.
I suspect that what I'm about to say is already understood, but the thing that bugs me about this personalisation stuff is that it does it without my permission, and typically in a half-arsed (appendaged/simple) fashion.
I would probably use at least some of it if it instead collected the appropriate information (tightly integrated), and then, only after it had figured out that there would be a
significant improvement in my productivity, made an offer to personalise the program. The offer should provide supporting evidence and appropriately focused solutions, much like you would have to do in the real world. eg:
"Hey, I noticed that you keep opening and closing the object browser. You don't have any of the accessibility features enabled, and you have enough screen real estate - did you know that you can dock it so it is accessible via a Tab?" with options for "Show
me" and "Bugger off".
Or the opposite: "Hey, I noticed that your screen redraws are piling up because you're using the Visual Studio IDE via Remote Desktop. If you unpin these panes..."
Just like in the real world, these offers should only occur at appropriate times (scheduled?). Naturally, this includes the ability to turn this stuff off completely. Logging and all...
RichardRudek wrote: Unless you're unfortunate enough to do some
sub-contracting work for a large Corporation or Government who are just now deploying Windows XP to desktops with IE6 and Office 2003...
In that case, why not just deploy .Net 2.0 with the image?
What image ? You mean the new desktop deployments ?
Note that as a sub-contractor, I have very little say about how the client runs their IT departments or infrastructure.
To that end, these types of clients are so anal about what is supplied to them that even when you follow everything to the letter, it takes a year to deploy really simple stuff.
I can understand their motivations: ALL of the stuff which becomes part of their SOE (Standard Operating Environment) needs to be regression tested, which includes security audits.
So the reality of the situation is that deploying something like the .NET Framework to desktops or servers will fail the security audits, because the auditors are basically overworked (perhaps lazy) and/or incompetent - the better-safe-than-sorry principle; when in
doubt, leave it out... To achieve a "breakthrough", there has to be a significant amount of back-pressure from many vendors. Usually they yield without having actually performed any competent form of security audit...
Hell, just trying to have them set up Integrated Security between their servers is a non-starter. So you have this insane situation where, as a developer, you have to involve yourself in their security administrivia - violating the "need to know" security principle.
Stupid stuff, really.
But then again, these IT departments are just appendages - their core business/purpose is not IT.