The Wikimedia Foundation this week aborted a plan that would have seen version 5 of the Article Feedback tool (AFTv5) rolled out to all English Wikipedia articles (Editor-Engagement mailing list). As a result of fairly damning community feedback (see previous Signpost coverage), the extension, which adds a box to the bottom of articles asking for comments, will now only appear when the article has been added to a certain category. According to a revised release plan, the tool will continue to receive updates, though the focus will be on making it available to other wikis.
Together with last month's "undeployment" of the MoodBar extension and its associated Feedback Dashboard, the move marks the end of the line for two of 2011's bigger projects. "As an experiment, Moodbar was a fair success", wrote the WMF's Brandon Harris on 6 February, "but we have come to the conclusion that it will require a fair chunk of development work (on the Feedback Dashboard side) to make it fully usable as a mechanism for new user engagement... [which will only now be as] part of the upcoming Flow initiative".
Despite the suggestion that MoodBar may yet be revived, the outcomes can only be demoralising from a developer standpoint: the Article Feedback tool was ultimately rejected despite an energetic community engagement campaign, and MoodBar simply never took off, even though it filled an even more obvious need. It would be tempting, then, to think that the English Wikipedia community rejects those tools that are seen to create burdens and embraces those that are seen to empower (the VisualEditor, Lua, Wikidata). However, the success of the Teahouse points to the dangers of drawing overhasty conclusions on this point. In any case, with AFTv5 almost entirely switched off, there will be much for WMF team leaders to ponder over the coming weeks.
In late September, the Signpost published an independent analysis of code review times, an analysis it repeated in November. To the 23,900 changesets analysed the first time and the 9,000 added in the revised edition, a further 20,000 have since been added. Across those 51,380 changesets, developers (human and bot) have contributed some 73,000 patchsets and 167,000 reviews. This report is designed to supersede those earlier pieces, bringing the figures up to date in time for the first anniversary of the Git switchover. The methodology remains the same, though the list of WMF-deployed extensions has been updated and changing Gerrit practice has required a slight revision to some figures; interested users should consult the preceding reports. As with all data, the possibility of error is always present, though the figures presented should be robust at the margins.
The undeniable conclusion is that code review times have stabilised at a good but far from perfect equilibrium. The headline figure – median review time for a proposed change to WMF-deployed code – crept up only slightly after October's low of 2 hours, 20 minutes, reaching 3 hours, 29 minutes in January. Over the same period, the 75th percentile was unchanged at approximately 22 hours. Early indications for February suggest no great shift. Fears expressed a year ago – that code review would grind to a halt once a pre-merge review system was brought in – appear, then, to be unfounded, at least in aggregate terms.
Unfortunately, however, the composition of those aggregate times is also stable: staff get their patches reviewed two to three times faster than volunteers (illustrated right). Even if staff write smaller patches – and there is no particular reason to think that they do – that multiple seems stubbornly high. All five of the most prolific all-time first-reviewers for core code are staff; between them, they have provided 40% of the first-reviews over the last 12 months, though that figure is tracking downwards at a healthy rate. In total, staff have provided ~70% of first reviews for core code – also tracking downwards – a percentage which rises to ~80% if WMF-deployed extensions are also included (the all-time top 19 reviewers for such extensions all being staff). Thus, staff still do more of the reviewing and get their own code reviewed faster; but at least more staff are now becoming proficient reviewers.
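For readers curious how such headline figures are derived, the core calculation is straightforward: for each changeset, take the interval between a patchset's submission and its first review, then compute the median and 75th percentile across all such intervals. The sketch below illustrates the idea with hypothetical timestamps (the real report, as described above, draws on tens of thousands of Gerrit changesets; the sample data and the `percentile` helper here are illustrative only).

```python
from datetime import datetime
from statistics import median

# Hypothetical sample: (patchset submitted, first review received) pairs.
# The actual analysis uses timestamps extracted from Gerrit changeset data.
events = [
    (datetime(2013, 1, 3, 9, 0),  datetime(2013, 1, 3, 11, 30)),
    (datetime(2013, 1, 5, 14, 0), datetime(2013, 1, 6, 13, 0)),
    (datetime(2013, 1, 8, 8, 0),  datetime(2013, 1, 8, 10, 5)),
    (datetime(2013, 1, 9, 16, 0), datetime(2013, 1, 10, 20, 0)),
]

# Wait times in hours, sorted for percentile computation.
waits = sorted((review - submit).total_seconds() / 3600
               for submit, review in events)

def percentile(sorted_vals, p):
    """Linearly interpolated percentile (0 <= p <= 100) of a sorted list."""
    k = (len(sorted_vals) - 1) * p / 100
    lo, hi = int(k), min(int(k) + 1, len(sorted_vals) - 1)
    return sorted_vals[lo] + (sorted_vals[hi] - sorted_vals[lo]) * (k - lo)

print(f"median wait: {median(waits):.2f} h")
print(f"75th percentile: {percentile(waits, 75):.2f} h")
```

Splitting the same calculation over two subsets of changesets – staff-authored versus volunteer-authored – yields the staff/volunteer multiple discussed above.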
Not all fixes may have gone live to WMF sites at the time of writing; some may not be scheduled to go live for several weeks.
Discuss this story
Article Feedback tool
It's good to see that the WMF has listened to the feedback it received about this tool through the RFC. This was a worthwhile experiment, but IMO it didn't work and it's good that it's going to be changed to opt-in. Nick-D (talk) 07:22, 13 March 2013 (UTC)[reply]
Lua templates
Whilst there are over 80 templates in Category:Lua-based templates, it should be noted that about 20 of these are sandboxes of their parent templates, plus one or two user sandboxes, leaving about 60 legitimate templates. An optimist on the run! 08:17, 13 March 2013 (UTC)[reply]
Code review
I can understand why you focus on the first review rather than on the final result (rejection or merge), but I'm not so sure of this WMF vs. non-WMF distinction, because I think the point should rather be how good we are at bringing fresh blood. We'd need a third line for non staff (please try to include WMDE in staff), non-mediawiki patches: non-WMF staff teams may skew the numbers, as well as core volunteer developers (whose patches' quality is probably, on average, better than the WMF's, given the higher and longer MediaWiki experience they often have, although more are being added lately). As for the merges, the number of open commits is still increasing (as far as I can see), and 80% of them are non-WMF. --Nemo 09:24, 13 March 2013 (UTC)[reply]
MoodBar
"the outcomes can only be demoralising from a developer standpoint" -- I wouldn't assume that. In fact, I would definitely ask folks on the Product and Engineering teams what their opinion was before speaking for them. Some of us do love MoodBar and the Feedback Dashboard (I helped start the response team) but part of being professionals in software development is that you can't be afraid to kill your babies, if you catch my drift. Steven Walling (WMF) • talk 18:24, 13 March 2013 (UTC)[reply]
AFT on request
So I take it from this we can place AFT on certain articles on an opt-in basis... is there a page with more detail on how to do this somewhere? -- phoebe / (talk to me) 03:25, 14 March 2013 (UTC)[reply]
AFT question
Will there be automatic anti-spam, anti-abuse measures on opt-in articles containing AFT? --74.202.39.3 (talk) 22:18, 18 March 2013 (UTC)[reply]
AFT from watchlist