Coffeehouse Thread

21 posts

Forum Read Only

This forum has been made read only by the site admins. No new threads or comments can be added.

BUGS = FEATURES, when will that end?

  • codeRonin

    When I purchase a product, I expect it to work, correct? Yes, maybe there are bugs, but Microsoft has had a history, in my book, of selling 90%-finished products for the price of a Rolls-Royce. So I have to wait and twiddle my thumbs until there are enough fixes/service packs before I can actually use it. Or they say that it'll be in the next version.

    But you said it would work in your very own manual?! What's up with that? Heck, by the time all the patches etc. are out, you say version X+1 is better, but it never is. It's the same headache all over again. So:

    1. When will you start shipping products that are 99.999% finished AND working?

    2. When are you going to use service packs to actually add features and fixes for technology that came out "after the fact", instead of just trying to fix up what wasn't there before but should have been?

  • HE3

    AHAHAH. Here we go!

  • mvuijlst

    Seriously, name one software product that's "99.99999% complete and bug-free". Oh yes, and that's as complex as, say, Windows. Or Office. And that needs to run on practically antediluvian hardware.

    Heck, if even Don Knuth can't do bug-free...

  • codeRonin

    OK. In my six years of working with and talking to numerous Internet Service Providers who have run WinNT, Windows 2000, and Linux (mostly Red Hat) as their web servers, I have heard the same broken record from every one of them.

      "With Windows I always had problems. People breaking in, servers locking up for various reasons, issues logging in for dial-ups, slow, poor documention, etc, should I say any more? Ever since I formated and loaded Linux, I've never even had to touch thing. It's like it's not I don't have customers anymore becuase I'm never getting any calls about the server being down or something not working right."

    Not only have I heard it from other ISPs, I've experienced it myself, and our ISP down the hall has told me that very same thing. Every weekend or two he'd have to come in just to reboot the machine. No formats and reloads could ever fix it. Since Red Hat 9 was installed, he hasn't had any problems. And actually I think it's been about a year now. But before that, other Linux geeks would tell me that it was a ritual to reboot their Linux server every year, just to remember what it was like to reboot a machine Big Smile

    Go ahead. Tell me that isn't 99.999% bug free.

  • codeRonin

    Sorry, I forgot to address the antediluvian comment.

    My server was a Pentium 100MHz 486 with 80MB of RAM and a 4GB HD running Red Hat 7. This was faster, yes, faster than my 350MHz, 128MB (PC100) system which was running Windows 2000 Adv. Server. The year was 1999/2000/2001? Whenever Windows 2000 came out, I guess.

    I hope this helps.

  • Karim

    I looked at your other posts and they kind of sound like a broken record too.  Did an MCSE steal your girlfriend or something?  LOL 

    Golly, you're probably right about that whole "99.999% bug free" thing.  Sayyyyy... why don't you post the IP addresses of these Linux boxes that have had Linux installed and never been "touched" or rebooted.  I can run a quick, uh, "vulnerability test" to see if they're missing any of the 5,892 Linux security patches that came out in the last year.  Muahahahah....

  • Lowrez

    I'm no Windows cheerleader, but I've got to say you're being a bit unreasonable.  Comparing Red Hat 7 (with Apache, I assume) to Win2K Advanced Server for basic web hosting is a bit like comparing how good a grocery-getter a Corvette is versus a Shelby race car.  One's gonna take a lot more work and still not do the same job.  Now, since the product I work on happens to be a web load testing tool (cross-platform, natch), I feel pretty confident in saying for pure speed Apache has got it down on almost all other web servers.  Then it becomes a matter of getting it onto a platform where the OS makes the smallest dent in the overall performance so Linux is your winner.  There are reasons for Win2K, but speed isn't it.  Tying it together with your NT domain is really what should be driving those installs.

    Now, as to the lower bug counts, that's simply a matter of corporate culture and user expectations.  Like I mentioned before, our software is cross-platform, and the bug counts are pretty similar between Solaris, AIX, HP-UX, Linux, and Windows.  None of the platforms makes it any easier or harder to write bug-free code.  However, different modules are used by different types of customers, so we know what severity of bugs we can and can't ship with.  The report analysis tool which goes out to every manager needs to be a lot cleaner than the toolkit which goes to your site developers.  The advantage that a lot of Open Source projects have is that they release the buggy code with a little tag that says "not for general use" and the users who want the newest features swallow the bugs for the latest Good Thing(tm).  I wish we could do the same thing, but when people pay for software they expect the latest features and stability all the time.

  • Pseudo

    Lowrez wrote:
    I feel pretty confident in saying for pure speed Apache has got it down on almost all other web servers.  Then it becomes a matter of getting it onto a platform where the OS makes the smallest dent in the overall performance so Linux is your winner.  There are reasons for Win2K, but speed isn't it. 


    Why do people keep saying things like this without any benchmarks to back it up?  I'm not saying you're flat out wrong, but at least drop a link to SOMETHING showing such a bold claim.  It sort of reminds me of the NT days when everyone said how slow it was compared to Linux/Apache, and then the OSS guys finally got what they wanted (the Open Benchmark) and the benchmarks proved them wrong under their own conditions.  My point isn't that Windows is faster; it's that in the past everyone assumed Linux was faster, and then when they were challenged, and agreed to all the conditions of the benchmark beforehand, the results showed that the "general knowledge" that they were faster was in fact wrong.

  • Lowrez

    Pseudo wrote:
    Lowrez wrote: I feel pretty confident in saying for pure speed Apache has got it down on almost all other web servers.  Then it becomes a matter of getting it onto a platform where the OS makes the smallest dent in the overall performance so Linux is your winner.  There are reasons for Win2K, but speed isn't it. 


    Why do people keep saying things like this without any benchmarks to back it up?  I'm not saying you're flat out wrong, but at least drop a link to SOMETHING showing such a bold claim.


    Sorry, we don't post the results of our tests online.  I can't really speak to the testing methodologies of other companies, but we run Apache and IIS on identically configured Windows boxes and Apache gives better results.  Personally, I have problems with URL blasters like web bench and I think everyone should be using Compuware's QALoad (blatant plug) or at least Mercury's LoadRunner when doing performance analysis.

  • Karim

    Lowrez wrote:

    Sorry, we don't post the results of our tests online.  I can't really speak to the testing methodologies of other companies, but we run Apache and IIS on identically configured Windows boxes and Apache gives better results.  Personally, I have problems with URL blasters like web bench and I think everyone should be using Compuware's QALoad (blatant plug) or at least Mercury's LoadRunner when doing performance analysis.


    I think you are proving his point about anecdotal, non-published "test results."  Smiley  I know a guy who knows a guy who knows a guy who has a brother whose third cousin swears that Apache is faster... Smiley

    I'm not saying I doubt your results, but you didn't say anything about your testing methodology.

    Are you comparing apples to oranges?  That is, a web server designed (by default) to serve static content (Apache) versus a web server designed (by default) to serve dynamic content (IIS 5.1 and earlier)?  Maybe an apples-to-apples comparison would be dynamic (PHP) vs. dynamic (ASP).  IIS can also be tuned to provide better performance if it's only delivering static pages (i.e. ASP not required), which would be a much fairer comparison to Apache, but IIS (5.1 and earlier) isn't tuned that way out of the box.

    Also, are you referring to IIS 5.1 or earlier, or IIS 6?  IIS 6 only does static content "by default" (you actually have to go out of your way to enable ASP!!!) and the performance is far, far better than previous flavors of IIS.

    You don't have to tell the world your test results, but knowing a little about your testing methodology would give us a clue about whether you're smoking dope or not when you say "Apache is faster."

    Sometimes people omit details for a reason (e.g. claiming that their '74 Pinto is faster than a Ferrari, while conveeeeeeeniently neglecting to mention that the Pinto has been fitted with solid rocket boosters and the Ferrari is missing all four wheels).
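
    To make that concrete, here is a bare-bones sketch of the kind of reproducible, apples-to-apples check I mean. It's Python, it's sequential (a real test needs concurrency, which is why tools like QALoad or LoadRunner exist), and the two URLs are placeholders you would point at the same page on identically configured boxes:

      import time
      import urllib.request

      # Placeholders: point both at the SAME page, served by identically
      # configured boxes, or the comparison means nothing.
      URLS = {
          "apache": "http://apache-box.example.com/test.html",
          "iis": "http://iis-box.example.com/test.html",
      }
      REQUESTS = 200  # small sequential sample, purely illustrative

      def measure(url, n=REQUESTS):
          start = time.perf_counter()
          for _ in range(n):
              with urllib.request.urlopen(url) as resp:
                  resp.read()  # drain the body so transfer time counts
          elapsed = time.perf_counter() - start
          return n / elapsed  # requests per second

      for name, url in URLS.items():
          try:
              print(f"{name}: {measure(url):.1f} req/s")
          except OSError as err:
              print(f"{name}: request failed ({err})")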


  • troublefunk

    "The advantage that a lot of Open Source projects have is that they release the buggy code with a little tag that says "not for general use" and the users who want the newest features swallow the bugs for the latest Good Thing(tm).  I wish we could do the same thing, but when people pay for software they expect the latest features and stability all the time."

    Are you suggesting that Microsoft does not have Beta test programs? There appears to be one for Longhorn, and I imagine it comes with the same kind of disclaimer.

  • Manip

    Personally I would not use IIS for any task even if the company had paid for it... To an extent I don't care which is faster; it is only a few extra connections one way or the other anyway... my problem with IIS is firstly that its track record is very poor and secondly that it is overly complicated. With Apache (on Windows and Linux) you have folders for this and that, and you can see plugins and remove them. That means you have complete and total control over what plugins are running on your server and can view it just as easily... with IIS you have an insane hierarchical system where you have no clue what components are interfacing with your server.

    Here is a basic off-the-top-of-my-head list:
    - Security (Apache is more secure)
    - Control (Apache gives greater control, nothing to do with OSS)
    - Automation (Apache has plain-text configs, so automation is easy; see the sketch below this list)
    - Separation of system security / web-server security (adding a new user to a Windows/*nix system does not affect the web server)
    - More modular (you can add and remove stuff more easily)
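
    To illustrate the automation point above: here is a rough Python sketch of the sort of thing a plain-text config makes trivial to script. The httpd.conf path and the module line are just placeholders, not a recipe:

      from pathlib import Path

      # Placeholders: adjust the config path and module line for your own layout.
      CONF = Path("/etc/httpd/conf/httpd.conf")
      MODULE = "LoadModule status_module modules/mod_status.so"

      def enable_module(conf_path=CONF, line=MODULE):
          """Uncomment (or append) a LoadModule line in a plain-text config."""
          text = conf_path.read_text()
          commented = "#" + line
          if commented in text:
              conf_path.write_text(text.replace(commented, line))  # uncomment it
              return True
          if line in text:
              return False  # already enabled
          conf_path.write_text(text.rstrip("\n") + "\n" + line + "\n")  # append it
          return True

      if __name__ == "__main__":
          changed = enable_module()
          print("config updated, reload Apache to pick it up" if changed else "module already enabled")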

    I am not even going to pick at the free FTP server, it is too easy! That thing should just be removed.

    Microsoft's engineers don't think stuff through enough and just keep building more components on top of old ones and cross-linking until we reach a point where, if one component high up the chain has a problem, the entire tree inherits it.

    Also, linking a web-server / FTP server into a domain is NOT a good idea and NEVER was.

  • Jeremy W

    Not even sure where to start on this list. I'll leave security alone since it's such a hotspot (and is so incredibly relative it isn't even funny).

    You'll need to define what you mean by Control for me... Webserving is an incredibly simple task; how much control do you want? Between IIS and web.config you have incredible amounts of control, in fact more than with Apache/.htaccess.

    Automation... Between servers? Scheduled tasks? Remoting? Web service-fired events? OS-based listeners?

    Wink

    You're right about the separation of windows / web users. They are all in the same place, but they are easily separated. It's called sandboxing and has been done since the beginning of time. But yes, it would be nice to somehow have different users in some cases.

    However, in non-ISP environments you can be damn sure I want to use NT/Directory authentication. Saves the trouble of login scripts entirely. If someone's logged into the domain they have access to all the apps they need.

    Try doing that with Linux Wink (oh, is That what you meant about control? Wink).

    Linking a webserver to a domain... Again, in an ISP environment, sure. But since most of the webservers out there are in corporate environments, it sure does make sense to link it to a domain. Inside a DMZ or semi-safe zone sure, but still within the sphere of domains that a corporation operates (we have 50+).

  • JParrish

    And then there's the fact that the current release of IIS has its configuration files in pure XML... Apache should have done that years ago... and that is coming from an avid supporter of both.
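
    For what it's worth, here is a tiny Python sketch of why XML configs are nice to script against. The element and attribute names below are made up for illustration; they are not the actual IIS metabase schema:

      import xml.etree.ElementTree as ET

      # Hypothetical config, NOT the real IIS metabase schema.
      SAMPLE = """
      <serverConfig>
        <site name="Default Web Site" enabled="true" port="80"/>
        <site name="Intranet" enabled="false" port="8080"/>
      </serverConfig>
      """

      root = ET.fromstring(SAMPLE)
      for site in root.findall("site"):
          state = "enabled" if site.get("enabled") == "true" else "disabled"
          print(site.get("name"), "on port", site.get("port"), "is", state)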

  • Manip

    Control as in the ability to select exactly which modules your web server is using, the control to set up your web server for the exact load you want, and the control to have your web server act differently from the norm.

    I like how you compare IIS's MAIN config file with one of Apache's policy config files. The main Apache configuration file makes IIS's look silly.

    Actually you CAN do that with Linux, FYI. But by default that is not how Apache is shipped or run, which also makes it more portable. Try linking IIS into a GroupWise system... go ahead! I'm waiting...

  • Jeremy W

    Linking IIS into Groupwise?

    How about linking IIS into Groupwise, Netware and eDirectory (along with ZEN) across multiple sites, in multiple cities, where authentication and directory services are replicated from individual trees and forests to the main tree without conflict.

    Oh, and ditto for printing, browsing and helpdesk.

    Yes, that's exactly what we have Tongue Out

    As has been mentioned, IIS 5/6's config is much better than 3/4's (which is what most people know) and allows for just as much control (based on what you've defined).

    There is still room to grow, but IIS isn't the same beast you are obviously used to.

  • Lowrez

    troublefunk wrote:
    "The advantage that a lot of Open Source projects have is that they release the buggy code with a little tag that says "not for general use" and the users who want the newest features swallow the bugs for the latest Good Thing(tm).  I wish we could do the same thing, but when people pay for software they expect the latest features and stability all the time."

    Are you suggesting that Microsoft does not have Beta test programs? There appears to be one for Longhorn, and I imagine it comes with the same kind of disclaimer.


    Beta test program != dev branch release.  Like all the other developers I work with, I have full access to the MSDN developer downloads and access to beta code, but it isn't nearly the same thing.  You can download the latest Open Source code branch for most projects at any time.  If someone checked it in yesterday, you have access today.  I know that I checked in a really cool feature yesterday to our code base, but even the beta user group won't get to peek at it for nearly 9 months while other modules are stabilized.

  • troublefunk

    Ok, fair enough. I guess the betas you release have to be stable enough so that the feedback you get from external users is meaningful, and not along the lines of "Where did my hard drive go?" Smiley

    However, does this mean there's another nine-month cycle before feedback from a wider group of beta testers is incorporated? Or is this particular case unusual?

    Have you ever considered using a "Stable/Testing/Unstable/Current/Frankly we are surprised it even compiled" type scheme?

    Some of us are still using Win2K as it has been patched enough to be truly reliable, and any little quirks with particular apps are well documented. As there is no official 'Stable' version of Windows, though, the only way to find the most reliable version is to see which one has been around the longest and still does what you need.

    Sorry for all the questions!
