As of today, there are a total of 385,469 assessed articles in the English Wikipedia. If we use the figure from Special:Statistics of 1,688,879 articles (sorry, no permalink here), that means that 22.8% of articles have been looked at by someone and classified according to their quality. (If we also count the unassessed articles that are stored in the 1.0 database, we have 40.7% of the article base covered.) While the numbers themselves make for interesting observations, most users do not know where they come from, or how they are processed. Since I was involved in the design of the WP:1.0 bot framework, I thought it would be a good idea to explain how it works. It is quite a fascinating process, if I may say so myself... ;)
First, the WikiProject needs to do a bit of legwork. The WP 1.0 bot is based on the Mathbot code, written in Perl, which determines whether there were any additions to or removals from a particular set of categories. The vast majority of the processing is just a matter of asking: "Was an article added to this category? Was one removed? Is an article now in a different category than it was yesterday?" Before those operations can be done, the categories to process must, well, exist. Picking on WikiProject Tropical cyclones (as usual), the categories to make are:
- Category:Hurricane articles by quality
- Category:FA-Class hurricane articles
- Category:A-Class hurricane articles
- Category:GA-Class hurricane articles
- Category:B-Class hurricane articles
- Category:Start-Class hurricane articles
- Category:Stub-Class hurricane articles
- Category:Unassessed-Class hurricane articles
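All of these names follow the same mechanical pattern, so a project only needs to decide on its adjective. A quick sketch of the naming scheme (in Python; the bot itself is Perl, and this helper is purely illustrative):

```python
# Quality classes used by the WP 1.0 assessment framework.
QUALITY_CLASSES = ["FA", "A", "GA", "B", "Start", "Stub", "Unassessed"]

def quality_categories(project_adjective):
    """Build the category names a project must create, e.g. for 'hurricane'."""
    parent = f"Category:{project_adjective.capitalize()} articles by quality"
    children = [
        f"Category:{cls}-Class {project_adjective} articles"
        for cls in QUALITY_CLASSES
    ]
    return parent, children

parent, children = quality_categories("hurricane")
# parent      -> "Category:Hurricane articles by quality"
# children[0] -> "Category:FA-Class hurricane articles"
```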
Projects also have the ability to categorize pages by importance or priority. This is slightly more controversial (take WikiProject Biography, for example: how can you say that someone is of "Low importance" without upsetting that person?), and it is not required. However, for the projects that want to use this portion of the framework, there is another set of categories to create, parallel to the quality categories:
- Category:Hurricane articles by importance
- Category:Top-importance hurricane articles
- Category:High-importance hurricane articles
- Category:Mid-importance hurricane articles
- Category:Low-importance hurricane articles
- Category:No-importance hurricane articles
Once this is done, the way most projects feed their articles into the bot framework is by adding a "class" parameter to their WikiProject banner, and optionally an "importance" parameter as well. Once the MediaWiki job queue does its thing, all of the articles land in the "Unassessed" category.
| Category | Articles |
|---|---|
| Category:Unassessed-Class hurricane articles | Hurricane Katrina, Hurricane Nora (1997), Hurricane Wilma |
Assuming for a moment that these articles are brand-new and unassessed (which they aren't), at this point the bot has still not done its daily run. At about 3:00 UTC that day, the bot starts its run. It reads the category tree and copies it into its internal database, so the table above is now mirrored on the hard drive of the bot's computer. The bot also spits out a log, and updates both the statistics table for the project and the global database.
Over the course of the day, these articles are assessed. This is done by updating the parameters on the WikiProject banner. An example of this would be the following, on Talk:Hurricane Katrina:
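In wikitext, such a banner call looks something like this (the template name and exact parameter values here are illustrative; each project's banner differs):

```
{{WikiProject Tropical cyclones|class=FA|importance=Top}}
```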
That same day, the three articles are assessed: Hurricanes Nora and Katrina to FA-Class, and Wilma to B-Class. So, our categories in the wiki are now the following:
| Category | Articles |
|---|---|
| Category:FA-Class hurricane articles | Hurricane Katrina, Hurricane Nora (1997) |
| Category:B-Class hurricane articles | Hurricane Wilma |
| Category:Unassessed-Class hurricane articles | (none) |
At this point, the bot runs. Again, all it does is compare the current categories and the previous category snapshot. The bot sees that the FA-Class was formerly empty, and now contains two articles: Hurricane Katrina and Hurricane Nora; the B-Class category contains Hurricane Wilma, and the Unassessed category is now empty. The bot now updates the statistics tables and the log, and updates its own snapshot with the current data.
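The comparison itself is just set arithmetic over the two snapshots. A rough Python sketch of the idea (the real bot is Perl, and these data structures are assumptions, not its actual schema):

```python
# Yesterday's snapshot, as stored in the bot's internal database:
# category name -> set of articles in it.
yesterday = {
    "FA-Class hurricane articles": set(),
    "B-Class hurricane articles": set(),
    "Unassessed-Class hurricane articles": {
        "Hurricane Katrina", "Hurricane Nora (1997)", "Hurricane Wilma",
    },
}

# Today's snapshot, freshly read from the wiki's category tree.
today = {
    "FA-Class hurricane articles": {"Hurricane Katrina", "Hurricane Nora (1997)"},
    "B-Class hurricane articles": {"Hurricane Wilma"},
    "Unassessed-Class hurricane articles": set(),
}

def diff(old, new):
    """Per-category additions and removals between two snapshots."""
    changes = {}
    for cat in old.keys() | new.keys():
        added = new.get(cat, set()) - old.get(cat, set())
        removed = old.get(cat, set()) - new.get(cat, set())
        if added or removed:
            changes[cat] = (added, removed)
    return changes

changes = diff(yesterday, today)
# e.g. the B-Class category gained Hurricane Wilma and lost nothing.
```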
The internal logic of the bot allows it to see that in these two cases the articles were upgraded from one class to another, and the log will reflect that. If the bot sees a new article, the log will identify it as a new addition; if an article was removed, the bot will flag its removal in the log as well. A recent modification to the code also allows page moves to be recorded as such, instead of as an addition plus a deletion.
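Collapsing the per-category view to a per-article view is what lets the log tell a reassessment apart from a genuine addition or removal. A sketch of that logic (again in Python, with invented names; the bot's actual page-move matching is omitted here):

```python
def by_article(snapshot):
    """Invert a {category: set_of_articles} snapshot to {article: category}."""
    return {art: cat for cat, arts in snapshot.items() for art in arts}

def log_entries(old, new):
    """Classify each article's change between two snapshots."""
    old_class, new_class = by_article(old), by_article(new)
    entries = []
    for article in old_class.keys() | new_class.keys():
        before, after = old_class.get(article), new_class.get(article)
        if before is None:
            entries.append(f"{article}: added as {after}")
        elif after is None:
            entries.append(f"{article}: removed (was {before})")
        elif before != after:
            entries.append(f"{article}: reassessed {before} -> {after}")
    return sorted(entries)
```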
This is the bot structure that 427 WikiProjects, task forces, regional noticeboards, etc. use nowadays to monitor article quality and editorial progress. It works almost flawlessly; the one problem is that it cannot scale forever. The bot now runs every other day, as each of its runs takes approximately 36 hours to finish. The framework is also used by the Hungarian and Spanish Wikipedias; so if a friendly developer wants to make this part of the Wikimedia wiki farm's software arsenal, a lot of people would be happy... :)