I have code I wrote for a big supermarket 3 or 4 years ago that is still used daily. I wrote the application over 3 or 4 months with no time constraints, because the software had to work and be error free: any mistake would have put their ordering system out of step, which was not acceptable.
Given the same project today, someone would wet their finger with some spit, hold it up in the air to see which way the wind is blowing, and shout "42! This project should take 42 days!"
I would probably attain some functionality by the time the 42 days were up, but I'd be overworked and overstressed, and I'd hate the job I was doing. I also have no doubt that the code would have needed several updates, which in the long run would add up to more time than I initially spent without the pandemonium of micromanagement making the project managers feel as if good work is being done.
The long and the short of it is that project managers generally take a very short-term view, and never, ever factor in producing code of high quality and the benefits that brings. We have made advances with memory-managed platforms like Java and .NET, but C# isn't some magic programming language that makes everything easy. You can still create spaghetti applications as easily in .NET as in C++ or VB6.
In several years' time, the focus on SCRUM and the Dickensian-style production of code that has become unmaintainable will leave people scratching their heads, wondering how they ever got into this mess. I already know of some people who are up sh** creek because their .NET code is not just in need of a refactor but of a complete rewrite [shudder].
Programmers are expensive for a reason. If you treat them like children, and if you make them work in ways where they cannot practice the science of computing and the art of making things easy to understand, you end up with the worst of both worlds.
Micromanagement is intrusive and stressful, but I came to accept it as a necessary evil. Developers (and I'm no exception) are bound to obsess over a single aspect of a product (be it an algorithm, an architecture, a framework, whatever), forgetting when it's high time to call it good enough and start turning that shiny new toy into an actual product that customers would use and buy.
Our quest for high-quality code is just a step away from wasting time: software has always rotted, but nowadays it rots faster because platforms evolve at breakneck speed. Take any C# code written before 2005. Would you reuse it? Would you ship it now? No generics, no lambdas, no LINQ, no automatic properties; shiny custom controls that just don't look right in Windows 7 (and that aren't written in WPF anyway); use of deprecated classes and methods; code that duplicates what is now in the BCL. It was perfectly fine when it was written; now it's a stinking mess.
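To make that concrete, here's a small sketch (a hypothetical order-total routine, not from any real system) contrasting the pre-generics style with what C# 2+ generics and LINQ allow:

```csharp
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;

class OrderTotals
{
    // Pre-2005 style: untyped ArrayList, runtime casts, hand-rolled loop.
    // A wrong element type compiles fine and only blows up at run time.
    static decimal TotalOldStyle(ArrayList amounts)
    {
        decimal total = 0m;
        foreach (object o in amounts)
        {
            total += (decimal)o; // InvalidCastException waiting to happen
        }
        return total;
    }

    // Modern style: generics catch type errors at compile time,
    // and LINQ states the intent in one line.
    static decimal TotalModern(List<decimal> amounts)
    {
        return amounts.Sum();
    }

    static void Main()
    {
        var amounts = new List<decimal> { 19.99m, 5.00m, 0.01m };
        Console.WriteLine(TotalModern(amounts));                  // 25.00
        Console.WriteLine(TotalOldStyle(new ArrayList(amounts))); // 25.00
    }
}
```

Both produce the same number, which is exactly the point: the old code wasn't wrong when it was written, it's just that the platform has since absorbed the boilerplate.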
Not to mention death by a thousand cuts: customers asking for a list of what they deem minor changes, which get funded accordingly. You barely have the budget to make it work, let alone to make it right; that gets postponed to the next mythical big rewrite.
Marketing might be at fault there, but the real problem is that few customers understand the value of good code. (Or of bad code, for that matter: when they ask that their order system optionally work with the Mayan calendar, all they perceive is the new checkbox in the options dialog... "Oh, come on, how hard can it be?")
This is not an apology for writing sloppy code. I still try hard to produce the best code that circumstances allow, and I fight customers and marketing to get a little respect for quality. But I came to realize that I need someone, or a process, to keep me close to reality. And, sad as it might sound, the reality is that it's "good enough" that pays the bills.