JavaRanch » Java Forums » Engineering » IDEs, Version Control and other tools
Author

SonarQube - how to configure it to meet your needs?

Mani Sarkar
Greenhorn

Joined: Jun 12, 2013
Posts: 5
Hi,

I've been a Sonar fan for a long time, and I strongly believe it's a very useful tool, but you can also lose yourself sifting through all the numbers, graphs, etc.

What would you suggest, or how should one configure it, to come up with a compact and effective dashboard for reading the metrics and taking action?

More specifically, I'd like to know of a methodology to use with SonarQube so as to use all the signals and make sensible decisions about the code base. Do you have an existing configuration I can refer to, or an online project like Nemo that I can look at to learn from?

Thanks,

Regards,
mani
PS: I have also written a blog post on SonarQube recently - one more will follow on usability, and your input will strongly help in the process. See http://neomatrix369.wordpress.com/2013/09/16/installing-sonarqube-formely-sonar-on-mac-os-x-mountain-lion-10-8-4/
G. Ann Campbell
Author
Ranch Hand

Joined: Aug 06, 2013
Posts: 33
Mani,

It sounds to me like you're asking which metrics you can ignore, although I could be reading that wrong.

I'd flip that around and ask: which metrics do you find most important? Whatever they are, scoot their widgets to the top of the dashboard. I don't like to hide/remove any of the widgets because I don't want to sweep any dirt under the rug. The one exception to that at my day job is the test coverage widget. Nearly all of our projects are legacy projects. None of them have unit tests, and none of them are going to get unit tests. Rather than putting numbers in my users' faces that I know they'll tune out automatically, I just remove the widget.

All that said, the default project dashboard is a bit of a grey haze of numbers. I've got mine juiced up with a treemap (showing rules compliance) and a timeline widget. In addition to being communicative, they add some color and interest to the page and make it more palatable/attractive.
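As a side note: if you'd rather keep the coverage widget visible but stop coverage from being reported for legacy code at all, recent SonarQube versions support coverage exclusions in sonar-project.properties. A rough sketch - the property name is from the SonarQube docs, but the paths here are just placeholders for wherever your legacy code lives:

```
# Exclude legacy source trees from coverage calculation
# (path patterns below are placeholders - adjust to your project layout)
sonar.coverage.exclusions=src/main/java/legacy/**/*
```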


HTH,
Ann
Patroklos Papapetrou
Author
Ranch Hand

Joined: Aug 06, 2013
Posts: 32

Hi Mani

I'd like to know of a methodology to use with SonarQube so as to use all the signals and make sensible decisions about the code base. Do you have an existing configuration I can refer to, or an online project like Nemo that I can look at to learn from?

I'd suggest that you read the post about Continuous Inspection


Follow me on Twitter ( @ppapapetrou76 ) or see my LinkedIn profile and connect with me.
You can also subscribe to my technical blog.
Burk Hufnagel
Ranch Hand

Joined: Oct 01, 2001
Posts: 814
G. Ann Campbell wrote:I'd flip that around and ask: which metrics do you find most important? Whatever they are, scoot their widgets to the top of the dashboard. I don't like to hide/remove any of the widgets because I don't want to sweep any dirt under the rug. The one exception to that at my day job is the test coverage widget. Nearly all of our projects are legacy projects. None of them have unit tests, and none of them are going to get unit tests. Rather than putting numbers in my users' faces that I know they'll tune out automatically, I just remove the widget.

All that said, the default project dashboard is a bit of a grey haze of numbers. I've got mine juiced up with a treemap (showing rules compliance) and a timeline widget. In addition to being communicative, they add some color and interest to the page and make it more palatable/attractive.

Ann,
I think that reorganizing the dashboard so the most interesting widgets are on top makes a lot of sense - as does using the treemap (though I didn't know that's what it was called) to make the numbers more visual. It's easier for most people to see differences when they're presented as an image rather than just numbers, so showing the relative size of components and then color-coding them based on code quality can quickly give people an overview of the project's status. "Wow, everything's green but that one red chunk - what's up with that?"

From your description, I think it makes sense to hide the test coverage widget. If there were a chance it would encourage folks to improve the number, I'd leave it in, but towards the bottom. But if there's no way that will happen, then it's just noise and a reminder of a problem, and people may start avoiding the dashboard so they aren't reminded of it.

Thanks for sharing that,
Burk


SCJP, SCJD, SCEA 5 "Any sufficiently analyzed magic is indistinguishable from science!" Agatha Heterodyne (Girl Genius)
Mani Sarkar
Greenhorn

Joined: Jun 12, 2013
Posts: 5
Hi,

Thanks for the response. I have looked at the Continuous Inspection thread - interestingly, I didn't know that what I did personally with Sonar was called Continuous Inspection - but I always did it locally on my PC.
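For reference, my local setup is nothing fancy - roughly the following, assuming a Maven project and a SonarQube server already running locally on the default port (adjust the URL if yours differs):

```shell
# Build the project, then run an analysis against a local SonarQube server
# (9000 is the default server port)
mvn clean install
mvn sonar:sonar -Dsonar.host.url=http://localhost:9000
```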

It does answer, or at least confirm, one of my queries, but the other one - to do with widget settings and configuration - is something I would love to hear about as well. Even when using this methodology, one only looks at a few widget stats to take action - so which ones should we enable, and what configuration should we set them to, so as to use them as a minimum benchmark?

Cheers,
mani
PS: Thanks for selecting me as one of the winners of the book, happy to be a contributor to this thread as well.

Patroklos Papapetrou wrote:Hi Mani

I'd like to know of a methodology to use with SonarQube so as to use all the signals and make sensible decisions about the code base. Do you have an existing configuration I can refer to, or an online project like Nemo that I can look at to learn from?

I'd suggest that you read the post about Continuous Inspection
G. Ann Campbell
Author
Ranch Hand

Joined: Aug 06, 2013
Posts: 33
Mani Sarkar wrote:It does answer, or at least confirm, one of my queries, but the other one - to do with widget settings and configuration - is something I would love to hear about as well. Even when using this methodology, one only looks at a few widget stats to take action - so which ones should we enable, and what configuration should we set them to, so as to use them as a minimum benchmark?


Mani,

I'm not quite sure I understand what you mean about widget settings and configs. If you're logged in you should see "configure widgets" and/or "manage dashboards" links toward the top-right. They'll let you create and rearrange dashboards.

But if you're looking for what metrics to pay attention to, you may have missed the point of the Continuous Inspection post. Just use the 'Time changes' dropdown to compare current values to the previous ones. The previous values are your minimum benchmarks.

Of course, it's not always possible to avoid adding technical debt. As they say, you have to break a few eggs to make an omelet. Just be very aware of what you're doing when you do it.
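And if you want to tune what the 'Time changes' dropdown compares against, the differential periods themselves are configurable (under the server's general settings). A sketch of the relevant properties - the values shown are examples, not necessarily your instance's defaults:

```
# Differential view periods (Settings > General Settings > Differential Views)
# Accepted values include: previous_analysis, previous_version,
# a number of days (e.g. 30), or a date (e.g. 2013-01-01)
sonar.timemachine.period1=previous_analysis
sonar.timemachine.period2=30
sonar.timemachine.period3=previous_version
```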
 
 