Pablo Z


Niner since 2005


  • Outstanding Technical Achievement: C# Team

    I guess I can’t expect people to recognize the difference between saying “we are coming to a point where there is a fundamental change in Moore’s Law” and saying something like “we are coming to a point where there is a fundamental change in the outcome produced by Moore’s Law”.

    Also, I don’t feel any better or worse for correcting a small mistake in the way that Anders expressed the future of Moore’s Law; doing that doesn’t make me a better or worse person.
  • Outstanding Technical Achievement: C# Team

    I would like to point out that Anders Hejlsberg made a little mistake when he was talking about the multi-core stuff. He said, “we are coming to a point where there is a fundamental change in Moore’s Law, it’s not going to go away, but it’s going to change in that it’s not going to give us faster CPUs anymore, it’s going to give us more CPUs”. While it is a fact that we are coming to the point where CPUs are not getting faster, Moore’s Law never referred to the speed of CPUs; it referred to the number of transistors on an integrated circuit. While in the past this translated into faster CPUs, Moore’s Law continues to apply, but instead of a faster single CPU we get more transistors in a multi-core CPU running at a speed that isn’t increasing like before.
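    As a small illustrative sketch (my own, not from the video): Moore’s Law is a statement about transistor counts doubling on a roughly fixed period, which you can project with a simple formula; the starting count and the ten-year horizon below are made-up example numbers, not figures from the talk.

    ```python
    # Sketch of Moore's Law as Gordon Moore stated it: transistor counts
    # double roughly every two years. Clock speed is a separate trend
    # that plateaued in the mid-2000s; the doubling shows up as more
    # cores rather than a proportionally faster single core.
    def transistors(initial_count, years, doubling_period=2.0):
        """Project transistor count after `years` of Moore's Law growth."""
        return initial_count * 2 ** (years / doubling_period)

    # Example with made-up numbers: starting from ~100 million transistors,
    # ten years of doubling every two years gives ~3.2 billion.
    print(transistors(100e6, 10))
    ```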

    But besides that little mistake, the video is great.

  • VC++: Safe Libraries with Martyn Lovell

    You may want to change the name of the library; SecureCRT is the name of a very well-known SSH client.
  • Alan Cooper - Questions after his keynote

    God, I hope Alan is working on About Face 3.0 or something, because I'm pretty sure most people will make programs that are horrible from a usability point of view once Vista is released. All the cool Avalon stuff will get overused and misused in horribly wrong ways.
  • Martin Taylor and Bill Hilf - Linux at Microsoft, Part I

    Beer28 wrote:
    Microsoft only releases patches for its OS and related Microsoft products. In a standard Red Hat sources file you can have repositories for everything from jpackages for Java applications, to DAG, freshrpms, and fedora-extras for thousands of non-OS client and server applications, to the regular repos for system updates.

    So I'd like to see Microsoft try to keep dependencies and packages consistent between itself and everything available on its Windows Update site. That would be a more realistic comparison.

    So once in a while a DAG package may mismatch with a fedora-extras package, but RPM will catch that, warn you about it, and make you resolve it before you install a package that doesn't check out against the others.

    So yes, sometimes, if you have lots of repos, there are package mismatches, but they are few and far between, and considering how much non-OS software is available in repositories vs. the limited number of MS system updates and Office updates in the MS Windows Update repository, I think it's pretty amazingly good.

    As far as package testing from repositories goes, I'm not sure how much testing goes into DAG, freshrpms, or the Fedora Core repos, or jpackages or Mono; I couldn't say, because I haven't been on a team that maintains those repositories. I'm not there, so it would be more prudent not to speculate on that.
    I can imagine Microsoft does a lot of testing of its packages before releasing updates, but I can't imagine commenting unless I knew for sure.

    I do know that almost all packages I've used over the years do work and the dependencies match up correctly. It wasn't that great when I used to use urpmi, but with yum and up2date, it's pretty good.

    PS- I have no doubt of the man-hours and money MS puts into testing; I'm merely stating I don't have hard facts or test-hour stats on either side, so a comparison wouldn't be fair. I do test before I rpmbuild on the target.

    That doesn’t mean it’s good at all, or that it’s better than Microsoft’s approach. It’s just one way to do it, and to tell you the truth, I’m happier most of the time with Microsoft’s approach. Many times I’ve had to deal with the problems caused by some distribution shipping old, buggy versions of different applications on some servers. So I’d rather go to a centralized place to get my updates, a place that I know is up to date and where the updates are formally tested, than have updates for all my apps that aren’t formally tested. There are many more problems with such a system that the Linux world hasn’t resolved yet. But of course there are many benefits too, so it’s a question of what’s best for each. For Microsoft it’s better to have only formally tested applications in Windows Update. I think the Linux world in general can’t afford to do formal testing (unless IBM, Novell or whoever puts money into it), but they turned that problem into an advantage and built systems such as urpmi, up2date, etc., where updates are available for a huge number of applications.

    To make my point clear, I think the problem is that for Microsoft to do what you ask, they would have to give up many things that they are not willing to give up. In the same way, for the Linux community to do what Microsoft does successfully, they would have to give up things that they are not willing to give up. What needs to be realized by both Linux and Windows trolls is that the two are not the same, that they probably never will be, and that each one is good at some things and bad at others. Choosing Linux simply because Microsoft “is a monopoly that just cares about money” is just wrong, and choosing Windows just because “the Linux people are a bunch of communists” is wrong too. What’s best is to choose on a per-situation basis; that way you know you will always use the OS and tools that you think are best for what you have to achieve.