We had an interesting encounter earlier this year. Our automaton friend, Helpful Robot, extended the olive branch mentioning that it had been busy inspecting our community modules and looking for ways to be, well... helpful.
To help us make sense of Helpful Robot's interesting statistics about our modules, we ran a survey, in which many of you took part.
Here's what you told us about the importance of quality modules in the SilverStripe ecosystem:
- 81% said that it was likely or very likely that quality modules are a key factor when looking at adopting a particular CMS product.
- Only 40% said that the module website was currently effective in helping you select modules.
- 86% said that seeing how modules rated when compared against the SilverStripe supported module standard would be a valuable addition to the addons website.
Based on these three key pieces of data, we are delighted to say we've integrated with Helpful Robot's module scoring API and can now provide a simple way to check a module's rating against the more objective aspects of the module standard. We're starting with a simple visual indicator (the new "dot" next to every module), using shape and contrasting colours so you can see at a glance how a module is doing.
The dot indicator can be read as follows:
- Green: the module meets 70% or more of the standard, as measured by the API.
- Half green: the module scores above the overall community average, but below 70%.
- Grey: the module scores below the community average.
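For the curious, the three dot states above boil down to a simple classification rule. Here's a minimal sketch in Python; the function name, the `GREEN_THRESHOLD` constant, and the score inputs are our own illustrative assumptions, not the actual addons implementation:

```python
# Hypothetical sketch of the dot-indicator rules described above.
# Scores are percentages of the module standard met, per the API.

GREEN_THRESHOLD = 70.0  # meeting 70+% of the standard earns a green dot

def dot_indicator(module_score: float, community_average: float) -> str:
    """Classify a module's score into one of the three dot states."""
    if module_score >= GREEN_THRESHOLD:
        return "green"       # meets 70+% of the standard
    if module_score > community_average:
        return "half-green"  # above the community average, below 70%
    return "grey"            # at or below the community average
```

So a module scoring 60% against a community average of 50% would show a half-green dot.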
Each module also has a more detailed UI showing the areas in which it could be improved.
If you are a module maintainer (65% of survey respondents were!), this will help you know where to put your efforts to gain some additional module standard points. If you are looking at modules for a project, it should help you make an informed decision when selecting them. For this first roll-out, we've kept things simple; we'll adapt the indicator and scoring later based on further feedback and practical usage. We figure it's best to get something released, test it, and fine-tune.
The reason we've gone in this direction (apart from an overwhelming number of you asking for this score to be published) is that one of the top ways developers said they currently select modules is by looking at simple indicators of module quality: the presence of documentation, tests, good module structure, and other aspects like a module being correctly licensed under a free (as in freedom) and open source (FOSS) license. The score is calculated on some of these key areas, so it will save you time not having to peek at the module source when making that initial shortlist of modules for a project.
Just note that, at this point, Helpful Robot sticks to things that can be objectively assessed against the standard. The score is also weighted by the complexity and effort maintainers put into setting up the things being measured. For example, docs and tests earn a higher portion of the overall score (because we know how much you love writing these things, and the effort required to do them well). We may revise the weightings once we've had time to determine their usefulness.
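To make the weighting idea concrete, here's a tiny sketch of how a weighted checklist score could work. The check names and weight values are purely illustrative assumptions; the actual API's checks and weights aren't reproduced here:

```python
# Illustrative only: hypothetical checks and weights, not the real API's.
# Higher-effort items like docs and tests carry more weight, as described above.
CHECK_WEIGHTS = {
    "has_docs": 25,
    "has_tests": 25,
    "good_structure": 15,
    "foss_license": 15,
    "other_checks": 20,
}

def module_score(results: dict) -> float:
    """Return the percentage of the weighted checks a module passes."""
    earned = sum(w for check, w in CHECK_WEIGHTS.items() if results.get(check))
    return 100.0 * earned / sum(CHECK_WEIGHTS.values())
```

Under these made-up weights, a module with only docs and tests would already reach 50%, while one with only good structure would sit at 15%.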
So check out the new module scores on addons, and let us know below in the comments if you find them useful.
For maintainers, Helpful Robot is working to raise some handy pull requests against many modules. If they make sense, gaining a better score may be as simple as merging a pull request or two.
For users of modules, we're hoping this helps raise module quality and improve the selection process when making use of the many community modules in projects.
Thanks Helpful Robot, you sure live up to your name! Go check out the update to the Addons website now.