I don't think PHP is well designed. But people make great stuff in it. Any time someone brings up .NET vs PHP, it reminds me of MySpace vs Facebook. It proves that languages are ultimately kind of unimportant compared to having talented people and sane design.
The problem is that people often point out that you can write both secure and insecure code in X and Y, for any Turing-complete languages X and Y, which ultimately misses the point: that doesn't make X and Y equally good languages.
For example, it's possible for an expert to write a lovely, swishy app entirely in x86 assembly, assembled with NASM. But it's a whole lot easier to just use WPF.
And shouting "yeah, but you can do a buffer overflow in C# too if you're really dumb, and you can write code with no buffer overflows or memory leaks in C as well" kind of misses the point. It's way harder to write good C code with no buffer overflows or memory leaks than it is to write good C# code.
Just because it's possible to write crappy code in .NET and crappy code in PHP doesn't make them equal. Case in point:
One of those is an authentication bypass. The other is a compile-time error.
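A representative pair, assuming the classic PHP "magic hash" pitfall (loose `==` on hash strings), looks something like this:

```php
<?php
// Hypothetical login check. $stored and $supplied are both strings of the
// form "0e<digits>", well-known MD5 "magic hashes". PHP's loose ==
// treats numeric-looking strings as numbers, so both parse as 0 * 10^n
// and compare equal: any such collision bypasses the password check.
$stored   = "0e462097431906509019562988736854"; // md5("240610708")
$supplied = "0e830400451993494058024219903391"; // md5("QNKCDZO")

var_dump($supplied == $stored);  // bool(true)  -- loose: authentication bypass
var_dump($supplied === $stored); // bool(false) -- strict: what you meant
```

The like-for-like C# can't even express the bug: `==` on two `string`s is an ordinal comparison with no numeric coercion, and comparing a `string` to an `int` is rejected outright at compile time (error CS0019).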
The list goes on, but the point is that crappy code in PHP/MySQL/Python/RoR is vastly more likely to become critical than like-for-like stupid code in .NET/C#.
In a random sample of the thousands of companies I've visited to audit their code, everyone writes crappy code. Some experts write crappy code once a month; some junior coders write it every day of the week. But a language that encourages you to do it right and makes it hard to do it wrong (without making it impossible to do at all) means that people gravitate to doing it right when the deadlines are near, instead of writing shoddy code to get the product (and, all too often, their users' credit card details) out of the door and onto the Internet.
I write bugs all of the time. But I kind of like the fact that 95% of them are caught by my compiler, 4.99% of them are found by my unit tests, and of the 0.01% that are left to run on customer machines, Microsoft makes sure that only 0.000001% of those are actually exploitable.
That's better for my customers than finding 99% of the bugs with unit testing and having half of the 1% that get through be trivially exploitable by a hacker determined to get root on my server.