The question was asked: "How many developers do you need for these tools to be effective?" It came on the back of a short explanation of code coverage.
Instantly I said ONE. Code coverage for a solo developer would be great. [....]
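And coverage itself is cheap to get. As a small sketch (the function and the numbers here are made up for illustration), even Python's standard-library trace module will show a solo developer which lines actually ran:

```python
import trace

def price_with_discount(total, is_member):
    # Two branches; the single call below exercises only one of them,
    # so the other will be missing from the line counts.
    if is_member:
        return total * 0.9
    return total

tracer = trace.Trace(count=True, trace=False)  # count lines, no live trace output
tracer.runfunc(price_with_discount, 100, True)
results = tracer.results()

# results.counts maps (filename, lineno) -> execution count;
# lines of the non-member branch never appear in it.
covered = sorted(lineno for (_, lineno) in results.counts)
print(covered)
```

Even this crude report is enough for one person to spot an untested branch; dedicated tools mostly add nicer reporting on top of the same idea.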
I've found a good side to such high pricing. Currently, huge corporations have a lot of people working on those tedious tasks: testing, code coverage, review, documentation, and so on.
Team System promises to make all of that as easy as "Click Here to Start" in Windows 95. So if a single developer were able to use the entire product effectively, all those other people would be fired (think about it: most companies now care
about cost saving, not market-share growth).
So if those tools require 5 people to use them, then 5 people will have a job. A PAID JOB!!
You have to train yourself to find something good in any event.
As a developer I wholeheartedly disagree. Microsoft's new, open attitude is a great thing for me and helps me do my job more effectively. OK, maybe I can't plan 10 years down the road, but I can certainly plan 3-5 years. Discussing features that will or
won't make it into the final release, and keeping us updated, is really important to developers. Keep up the good work, guys!
3-5 years? Can you tell me when your next paycheck will arrive? Every two weeks? Ohh...
That means the company you work for needs a solution now, today, already installed and working!
I believe this focus on the far future is flawed. I'd prefer more focus on current Microsoft offerings.
As a developer, visit https://msdn.microsoft.com and take a look at what you see.
Instead of help with current development problems, you see articles about a product you will not be using in production for at least 1-1.5 years! This is clearly an attempt to push you to buy the new product.
Sure, reading all those articles in advance would benefit you.
But most msdn.microsoft.com visitors need their projects done today, not on a 2+ year timeframe, so they cannot benefit from those articles right now.
I just find it hilarious how people are saying that LH is going down the tubes JUST BECAUSE MS removed WinFS from the initial release of LH, which, as many people here have stated, IS NOT, I repeat IS NOT, a "File System".
Nobody here said they were worried about whether it is a file system or not.
The main issue is trust. The next time Microsoft announces that they are going to build the "next big thing", nobody will believe them.
If you are unable to keep to a schedule, keep it private!
Look at Apple. They keep all information about the future private, yet they produce cool products!
I believe there is no need for Microsoft to announce their plans up to 10 years ahead.
They could simply keep the old (perhaps slightly modified) cover-page wording from the 1990 annual report: "Our most important accomplishment during our first 15 years has been to prepare ourselves for the next 15."
Okay, basically, nothing's really changed... if you go back far enough. Back during the Whistler beta, the roadmap was quite simple - Whistler, followed by Blackcomb. The specifications for Blackcomb were laid down, UI prototypes and concepts were created and
demoed at the Financial Analysts meeting by Steve Guggenheimer, and Blackcomb was all set to become the next major client release of Windows.
I can confirm this. Here is an email in my Inbox from Microsoft, dated Tue, 27 Feb 2001 19:01:27 -0800:
Paul Richardson (WINDOWS) wrote:
Thank you for the report.
The WindowsXP development team is aware of this issue and there have
been other reports of similar behavior. As such, this bug has been
resolved as "duplicate". But at this time, only the most critical bugs -
bugs that will stop the shipping and/or deployment of WindowsXP are
being fixed. The master bug will remain active and be re-visited after
the release of WindowsXP for further consideration/investigation.
The problem you describe is currently not possible because the XXXX
XXXXXXXX is not fully PnP (among other reasons). When it is uninstalled,
it requires a reboot before it can be re-installed due to some services
not being able to shut down. We don't allow components that require
reboots on uninstall to be re-installed without rebooting first (for
many, many good reasons). Resolving this as a duplicate of the
XXXXX-removal bug for Blackcomb, which has the XXXXX PnP bug as a
If you disagree with this course of action please send me mail
explaining the reasons why the bug must be considered for the WindowsXP
release and we will revisit the issue at that time and escalate
Thank you again for your efforts and support,
Windows Whistler Beta Team
So? What is your current estimate for when this bug report will be fixed? 2001-2008 (Blackcomb) or 2001-2005 (Longhorn)?
Anyway, several years for a fix. How cool is that.
I expected this. Separating Windows server and Windows client releases has a clear business benefit for Microsoft (read: more money collected!).
The motivation is similar to the one behind separating the Winter and Summer Olympics.
It's a pretty smart move.
Microsoft already releases software on an annual basis. Service packs always add new features (in contrast with the legendary "no new features in service packs" promise), while
pay-for-upgrade versions ('major' releases) have become rare.
Separating client and server releases into different financial years is a smart move. This way, businesses can pay twice (once for the clients and once more for the servers).
I think Microsoft will have to release a Service Pack (a free upgrade) for Longhorn clients at the same time they release Longhorn servers,
just like it was with WinXP SP1 and Windows 2003.
As for trimming features, this was to be expected given the
announced delays. They could either ship the new technologies as Service Packs or stand-alone applications for WinXP/2003, or bundle part of them into a new (but trimmed relative to the original vision) OS version. They decided to do both.
The issue at hand requires a user with administrator access to run a program that an attacker has sent them.
Sure, running as root or Administrator makes your system insecure. But it is common practice for Windows users to run everything as Administrator.
I did not expect it would be so easy to bypass the security measures in the XP SP2 Security Center.
For example, consider the
System File Checker, which can verify the digital signatures of system files.
Even after a successful attack, you can (partially) repair your system, if not totally prevent system-file replacement in the first place (at least against outdated, non-SFC-aware viruses).
But that is not true in this case: there is no easy way to detect or repair Security Center spoofing after an attack.
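For readers unfamiliar with the mechanism: SFC-style protection is essentially baseline-and-compare (the real tool checks Microsoft-signed catalogs, not a plain hash list). A minimal cross-platform sketch of the idea, with a throwaway file standing in for a protected system file:

```python
import hashlib
import os
import tempfile

def build_baseline(paths):
    """Record a SHA-256 digest for each protected file."""
    baseline = {}
    for path in paths:
        with open(path, "rb") as f:
            baseline[path] = hashlib.sha256(f.read()).hexdigest()
    return baseline

def verify(baseline):
    """Return the files whose contents no longer match the baseline."""
    changed = []
    for path, digest in baseline.items():
        try:
            with open(path, "rb") as f:
                current = hashlib.sha256(f.read()).hexdigest()
        except OSError:
            current = None  # file was deleted
        if current != digest:
            changed.append(path)
    return changed

# Demo: take a baseline, then simulate a virus replacing the file.
tmpdir = tempfile.mkdtemp()
protected = os.path.join(tmpdir, "system.dll")
with open(protected, "w") as f:
    f.write("original contents")
baseline = build_baseline([protected])
print(verify(baseline))   # untouched file: nothing reported
with open(protected, "w") as f:
    f.write("replaced by malware")
print(verify(baseline))   # the tampered file is reported
```

The reason the real SFC signs its baseline is exactly the point about Security Center components below: an unsigned hash list could simply be rewritten by the same malware that replaced the files.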
What's more, nobody has yet checked whether it's possible to spoof this before SP2 installation, to hide existing malware and trojans on not-yet-patched PCs.
I do not understand the Windows team's motivation for simplifying access to the Security Center. There are not that many
firewall vendors. Requiring these important system components to be digitally signed (just like WHQL-signed drivers), and checking those signatures, would not hurt (AFAIK, it's already done for the kernel part of those components).
Security development is totally different from regular software development. If you miss a few user scenarios in a regular software product, you can address them in the next version; but if you miss a single attack scenario, there is no easy way to fix it.
I'm pretty sure that there will be security flaws in software products for a long time.
I would like Microsoft to spend more resources not only on attack prevention, but also on post-attack recovery. For example, some kind of
bootable CD/CD-R with
diagnostic utilities, antivirus scanners, and system recovery tools would decrease recovery costs after incidents and make it possible for moms and dads to repair their systems easily.
P.S. A few quotes from the
"Handling Bad Publicity" marketing guide for small businesses, published on Microsoft bCentral:
"After a crisis, emphasising positive stories such as improved practices and community involvement will help to restore your reputation in the longer term."