Off the top of my head, I'll say --

1) It helps the intended users be more productive in their tasks,

2) It stays out of the users' way,

3) It does not force them to work harder at learning the application than at doing their actual job (and, therefore, violate #2),

4) It doesn't leave the user with a conscious or subconscious fear that it will crash (and, again, violate #2),

5) It doesn't force the user to read a manual in order to do at least 80% of what it's meant to automate (in other words, it's intuitive).

I'm sure others can bring up many other points.  Many developers are good at creating software that works well.  The hard part is satisfying the list above.  I have known very few developers and teams who were able to satisfy all of those points.

One of developers' biggest weaknesses is the inability to place themselves in the users' shoes.  They find it very hard (and often don't even bother) to adopt the mindset of the people who will use their software.  Ignoring that is one of the biggest sins of software development, in my opinion.  This is why "eating your own dogfood" is so very important (though not always possible).  It's so much easier to meet these goals when you are your own target audience.

If you really want to be proud of the applications you develop, your goal should be to empathize with your target audience.  Otherwise, it is just a job, and you'll do just enough to get by (writing code that merely "does" something).