When to stop following the advice of static code analysis?

I have been using static code analysis on a project with more than 100,000 lines of Java code for quite a while now. I started with FindBugs, which gave me around 1500 issues at the beginning. I fixed the most severe ones over time and started using additional tools like PMD, Lint4J, JNorm and now Enerjy.

With the more severe issues fixed, a huge number of low-severity issues remains. How do you handle these low-priority issues?

  • Do you try fixing all of them?
  • Or only in newly written code?
  • Do you regularly disable certain rules? (I found that I do this with nearly all of the available tools.)

And if you ignore or disable rules, do you document those? What do your managers say about "leaving a few thousand low-priority issues unfixed"? Do you use (multiple) tool-specific comments in the code, or is there a better way?
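For illustration, this is the kind of tool-specific suppression I mean (a rough sketch; the class name is made up, the exact annotation classes depend on the FindBugs and PMD versions in use, and the FindBugs annotations jar has to be on the classpath):

    import edu.umd.cs.findbugs.annotations.SuppressFBWarnings;

    public class LegacyParser {

        // FindBugs: silence one specific bug pattern, with a justification.
        @SuppressFBWarnings(value = "DM_DEFAULT_ENCODING",
                            justification = "encoding is validated upstream")
        // PMD: the standard @SuppressWarnings annotation with a "PMD." prefix
        // silences a single PMD rule on this method.
        @SuppressWarnings("PMD.EmptyCatchBlock")
        public String read(byte[] raw) {
            try {
                return new String(raw);
            } catch (RuntimeException ignored) {
            }
            return "";
        }
    }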


Keep in mind that static analysis tools are meant to generate a lot of false positives; this is the price you pay for generally avoiding false negatives. That is, they assume that you would much rather be told incorrectly that something looks suspicious (a false positive) than be told that something that's actually a problem is perfectly fine (a false negative).

So in general, you should be configuring these tools rather than accepting the out-of-the-box defaults, which generate a lot of noise.

Do you try fixing all of them?

On projects where I have technical control, my standard modus operandi is to encourage a culture where people review all new static analysis defects reported by our CI system. If, over a period of time, we decline to fix enough defects of a specific kind, we disable that rule, since it has become just noise. Every so often we look at the disabled rules to make sure they are still relevant.

But once we've turned off the less effective rules, yes, we fix all the defects. If you think that something is a problem, you should fix it, as long as other work doesn't take priority. Ignoring it damages your team's culture and sends the wrong message.
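As a rough illustration of the "review all new defects from CI" habit described above (every name here is invented; in practice you would diff the tools' XML reports rather than hand-built strings):

    import java.util.HashSet;
    import java.util.Set;

    // Hypothetical sketch: compare the current report against a baseline that
    // is committed with the project, so only genuinely new defects are queued
    // for review. The fingerprint format (rule:location) is made up here.
    public class NewDefectCheck {

        static Set<String> newDefects(Set<String> baseline, Set<String> current) {
            Set<String> added = new HashSet<String>(current);
            added.removeAll(baseline);
            return added;
        }

        public static void main(String[] args) {
            Set<String> baseline = new HashSet<String>();
            baseline.add("DM_DEFAULT_ENCODING:Parser.read");

            Set<String> current = new HashSet<String>(baseline);
            current.add("NP_NULL_ON_SOME_PATH:Cache.get");

            // Anything printed here goes into the team's review queue.
            System.out.println(newDefects(baseline, current));
        }
    }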

And if you ignore or disable rules, do you document those?

The rules file is part of our project, so a commit message is sufficient to document the fact that such-and-such rules were disabled in this commit.

What do your managers say about "leaving a few thousand low-priority issues unfixed"?

Nothing. We make sure that they understand what we're doing and why, but they're usually (rightfully so) focused on higher-level metrics, like velocity and overall project health.


What do your managers say about "leaving a few thousand low-priority issues unfixed"?

I expect managers to prioritize: to decide (or be told) whether any given task is high- or low-priority, and to be happy with people spending their time on high-priority rather than low-priority tasks.


Consider the analogy of your bug tracking database: a good number of the bugs reported there are low-priority ones that you'll probably never get to. Sure, they are real bugs and you would like to fix them, but most programmers work under very real constraints and don't have the time to address every concern. I wrote an article recently about the special nature of static analysis defects.

One important difference with static analysis bugs, though, is that they are typically much easier to deal with than a regularly reported bug. So a quick scan of the defects to identify not only the high-priority items to fix but also the ones that are easiest to fix can be useful. Static analysis defects are, after all, detected very early in the development process, and the specific parts of the code in question are plainly spelled out. So you'll likely catch quite a bit of low-hanging fruit among the lower-priority ones.

The various strategies I've seen used to make this successful include:

  • First of all, make sure the analysis is tuned properly. Static analysis comes "out of the box" with factory settings and can't possibly understand all code. If you can't tune it yourself, get some help (disclaimer: we provide some of that type of help). You'll lower the false positive rate and find more good bugs.
  • Identify the characteristics that, for the most part, prioritize the defects (they could be specific categories, specific areas of the code, built-in prioritization scoring provided by the static analysis tool, etc.).
  • Determine what severity threshold is important and possibly make it an acceptance criterion, e.g. all high and critical defects need to be addressed (see the sketch below).
  • Make sure that each defect that blocks the acceptance criterion is addressed ("addressed" meaning that it at least has to be looked at, because some could be false positives).
  • Make sure that the defects marked as false positives are checked, either through a peer code review process or through a tail-end audit process, so you don't have any mismarking problems.

Bottom line: choose what to focus on, review what is necessary, and don't fix everything unless your business requirements make you.
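A minimal sketch of the severity-threshold gate mentioned in the list above; every type and field name here is made up for illustration, since each tool reports defects in its own format:

    import java.util.Arrays;
    import java.util.List;

    // Hypothetical acceptance gate: every defect at or above the chosen
    // severity threshold must have been addressed (fixed, or reviewed and
    // marked as a false positive). All names are invented for illustration.
    public class DefectGate {

        enum Severity { LOW, MEDIUM, HIGH, CRITICAL }

        static class Defect {
            final String rule;
            final Severity severity;
            final boolean addressed;

            Defect(String rule, Severity severity, boolean addressed) {
                this.rule = rule;
                this.severity = severity;
                this.addressed = addressed;
            }
        }

        static boolean passes(List<Defect> defects, Severity threshold) {
            for (Defect d : defects) {
                if (d.severity.compareTo(threshold) >= 0 && !d.addressed) {
                    return false;
                }
            }
            return true;
        }

        public static void main(String[] args) {
            List<Defect> report = Arrays.asList(
                    new Defect("NP_NULL_ON_SOME_PATH", Severity.HIGH, true),
                    new Defect("UnusedPrivateField", Severity.LOW, false));

            // Acceptance criterion: all HIGH and CRITICAL defects addressed.
            System.out.println(passes(report, Severity.HIGH) ? "OK" : "FAIL");
        }
    }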


Decide what rules you want to follow at your company and disable the rest. A lot of the rules that you can set up with those tools are subjective style issues and can safely be ignored. You should document the rules you are following once, in a company style guide (so you don't clutter your code with that information).

I wouldn't make a change to old code just based on the recommendation of a static analysis tool. That code is already tested and presumably working. If I need to go into the code and make a change for any other reason, then I'll run the analysis and make any changes it recommends.


The key, in my mind, is to at least review all the issues, even if you decide in the end not to fix them. The reason is that bugs, like misery, love company. I can't count how many times I've found all kinds of nasty bugs that FindBugs didn't report, but found them by looking at seemingly unimportant ones that it did report.


I personally find code analysis useful but overrated.

I can't speak for Java, but C# also has code analysis. I agree it gives you a lot of smart ideas about how to do things better, but at times the recommendations are just annoying. Some suggestions are not based on common sense but are purely a matter of style. After playing around with code analysis for a while, I stopped using it. The reason is that I disagreed with many warnings and didn't want to follow the proposals.

In general, I would recommend doing what the code analysis says, but only up to a certain point. Whatever seems to be a matter of the personal style of whoever wrote the rule definitions can easily be ignored. You can add exceptions for rule 1, then 2, then 3... until it gets old. Then you just deactivate the feature.
