kettch wrote:

Even stuff like HTML and JavaScript aren't helping. The only reason we can get anything done is because of the huge numbers of script libraries that provide canned functionality. HTML5 is really only requiring that browsers implement natively what people have been doing by hand for years. Then, tooling like VS helps developers herd all of these cats.

++

The biggest problem with HTML5 is that it's providing code in a can, not a better way to program. That's why jQuery is a more important innovation on the web than all of HTML5, by some margin.

To get to a better web, we need people to stop spitting out webpages from the back end of a console PHP/ASP/RoR/whatever program into a DOM that works differently in every browser. We need web developers to write classes and components.

jQuery helps in this regard -- it encourages developers to write their JavaScript as a "sort of" class, instead of blindly handing them yet more elements to shove out the back end of a PHP script because you object on principle to Adobe Flash and you've realized your platform can't live without it.
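Here's roughly what I mean -- a minimal sketch of the common jQuery plugin pattern (the plugin name and behaviour are made up for illustration):

    // A "sort of" class: state lives on an object, behaviour on its
    // prototype, and jQuery ties it to the DOM -- instead of globals
    // and inline onclick soup.
    (function ($) {
        function Counter(element) {
            this.$el = $(element);
            this.count = 0;
            this.$el.on('click', $.proxy(this.increment, this));
        }

        Counter.prototype.increment = function () {
            this.count += 1;
            this.$el.text(this.count);
        };

        // Expose it as a plugin, one instance per element:
        $.fn.counter = function () {
            return this.each(function () {
                if (!$.data(this, 'counter')) {
                    $.data(this, 'counter', new Counter(this));
                }
            });
        };
    }(jQuery));

Now $('.counter').counter(); gives you a component with its own state and methods, not another blob of markup squirted out of a back end.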

In the 1970s we invented a language called C where everything was an int or a pointer (which is an int), the compiler second-guessed what you meant (rather than helping you weed out the ambiguities), and no two platforms were the same - so writing cross-compatible code was a total ball-ache. Those ambiguities led to unstable programs that were difficult to understand, but pretty easy to hack.

Fast-forward to 2013, where by the grace of God we've somehow invented a series of dangerous languages where everything is a variant or a string (which is a variant), the compiler and the browser both compete to come up with some baffling interpretation of what you've written (rather than helping you weed out the ambiguities), and no two platforms work the same - so writing cross-compatible code is a total ball-ache.
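To make that concrete, all of these are perfectly legal JavaScript, and every engine will cheerfully run them:

    '1' + 1;            // "11"  -- + means concatenate, so the number becomes a string
    '1' - 1;            // 0     -- but - means subtract, so the string becomes a number
    [] + {};            // "[object Object]"
    [] == false;        // true  -- [] coerces to "", which coerces to 0
    null == undefined;  // true, yet null === undefined is false

Not one of these is a compile error, a runtime error, or even a warning. The platform's answer to ambiguity is to guess.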

Oh yes - and it's also really easy to hack web applications, because they're just spaghetti that's been duct-taped together by monkeys and rammed into a command prompt on a Linux box.

Desktop developers on the outskirts aren't "jumping ship" to the web because the web is shiny and beautiful and does everything they want it to. They're standing on their aircraft carrier laughing at your dinghy made of logs tied together with some string you found on a beach.

Don't get me wrong, we're impressed that you made it this far into the ocean with an oar made out of a plastic spade and a seagull tied to the front of your dinghy. But that doesn't make you the king of the navy, or demonstrate that the world had "better look out" because of your platform's new innovations.

I'm not sticking with the Desktop because I'm stubborn. I'm sticking here because it's better. The web is no less than 20 years behind the desktop in terms of security, the ability to write good code, and platform stability - and a solid five or six years behind in terms of speed. If the platform of the web didn't suck so much, maybe people would be jumping ship to the web instead of jumping ship from the web to write apps. But sadly it does. The web is just awful, and it's an embarrassment to those of us who can actually program worth a damn.