What benefit you get from a static analysis tool depends on the tool. Many of them, including FxCop, are simple best-practices checkers: does your naming follow .NET conventions, do you dispose of every object that implements IDisposable, do objects that manage native resources implement the dispose pattern properly, do you use the recommended patterns for events, exceptions, etc., do you put the proper attributes on classes and methods, do you always specify a culture for culture-sensitive methods, that kind of thing. For most of these, only function-level analysis is needed, so the effort is not too great.
Sven, if those are the only reasons for doing static analysis, I will be pretty disappointed. Most of the benefits you stated above can be achieved through code review, design patterns, and best practices, which are all conventions. I was kind of expecting static code analysis to be more of a mathematical way of proving correctness. I guess what I'm expecting is a tool that takes the whole apparatus of formal methods and proves code correctness using a mathematical model. Like AndyC pointed out up there, I guess the effort of doing that kind of static code analysis is too great to be practical right now. I don't know if FxCop does anything in line with formal methods, but if it is a static code analysis tool, then it probably still lacks the power and benefit of formal methods (that is, proving code correctness). Would it be fair to say that FxCop is a static code analysis tool in its early stages?
On the other hand, with Terminator, just looking at the introduction there, I feel they are gearing towards formal methods. But then again, there is still nothing solid yet. Looking at Wikipedia, it seems that the tools that do static analysis based on formal methods only implement parts of them. I guess the complexity is too great.
I'm wondering, has anyone done (or tried) static code analysis by hand?
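For what it's worth, the function-level checks Sven describes are simple enough that you can sketch one "by hand" in a few lines. Here's a toy naming-convention rule in Python using the standard `ast` module; the rule itself (CapWords classes, lowercase functions) is just made up for illustration and is not one of FxCop's actual rules:

```python
import ast

SOURCE = """
class widget:          # class names should be CapWords
    def DoWork(self):  # function names should be lowercase
        pass
"""

def check_naming(source):
    """Toy FxCop-style check: flag classes that aren't CapWords
    and functions that aren't all-lowercase."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ClassDef) and not node.name[:1].isupper():
            findings.append((node.lineno, f"class '{node.name}' is not CapWords"))
        elif isinstance(node, ast.FunctionDef) and node.name != node.name.lower():
            findings.append((node.lineno, f"function '{node.name}' is not lowercase"))
    return findings

for lineno, message in check_naming(SOURCE):
    print(f"line {lineno}: {message}")
# → line 2: class 'widget' is not CapWords
# → line 3: function 'DoWork' is not lowercase
```

Of course, this is exactly the kind of convention checking being discussed above, nowhere near a formal proof of correctness; it just shows why the function-level rules are cheap to implement.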