
User talk:Zerote000

From WikiApiary, monitoring the MediaWiki universe

Welcome to WikiApiary!

Hello,

Welcome to WikiApiary, a wiki that collects, displays and analyzes information about the usage and performance of 21,561 independent MediaWiki installations, as well as 7,531 wikis running in 220 wiki farms, and growing every day! WikiApiary is a great place to see what other wikis are doing and where 8,712 extensions and 3,347 skins are being used. Start your discovery tour: browse through the sites listed on the main page, explore what they are all about and get some inspiration.

Thank you for registering! Please add your website to WikiApiary, if it isn't already here, to see important information about it at a glance, such as editing activity. In the future WikiApiary will be sending out notifications that will greatly help wiki administrators and editors keep an eye on their sites; you can register for them while adding your website. If you are a wiki maven and routinely discover new wikis, use the WikiApiary bookmarklet to easily add wikis you discover as you browse the web — it only takes a couple of clicks to add a site this way!

There are also mailing lists for WikiApiary. For general discussion sign up on the wikiapiary-l list. If you would just like announcements, wikiapiary-announce is the list for you. If you are interested in helping build WikiApiary, check out wikiapiary-dev.

You can also show the world your site is monitored and share your support for WikiApiary by adding the Monitored by WikiApiary icon to your wiki footer!

Cheers 🐝,


-- Welcome Bee (talk) 20:22, 18 April 2019 (UTC)

WikiTeam and status updates

Excellent work, thank you! You probably saw it already, but just to be sure: emijrp also had a script for some WikiTeam-WikiApiary updates. https://github.com/WikiTeam/wikiteam/tree/master/wikiapiary --Nemo 09:41, 20 April 2019 (UTC)

Thanks. Hadn't seen that script before, but unfortunately, it looks like its search is based on the "Originalurl" tag on IA, which not all dumps have. Would it make sense to change it to search using the identifier instead, or at least use it as a fallback option? Should I use a separate bot user when running the script?--Zerote000 (talk) 14:44, 20 April 2019 (UTC)
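The fallback discussed here — searching by item identifier when a dump lacks the "Originalurl" field — could be sketched roughly as below. This is a hypothetical helper, not the actual script: the function name `build_ia_query` is invented, and it assumes the archive.org advancedsearch endpoint, which accepts a Lucene-style query string.

```python
from urllib.parse import urlencode

# Hypothetical helper: build an archive.org advancedsearch query that
# matches a dump by its "originalurl" metadata field, with the item
# identifier as a fallback for dumps uploaded without that field.
def build_ia_query(wiki_url: str, identifier: str = "") -> str:
    clauses = [f'originalurl:"{wiki_url}"']
    if identifier:
        # Fallback: dumps without originalurl can still be found by
        # their identifier (e.g. something like "wiki-examplecom").
        clauses.append(f'identifier:"{identifier}"')
    query = " OR ".join(clauses)
    params = urlencode({"q": query, "output": "json", "rows": "5"})
    return f"https://archive.org/advancedsearch.php?{params}"
```

Fetching and parsing the JSON result is left out here; the point is only that a single query can cover both lookup paths.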
I don't think the bot flag here is that important. The originalurl field should be added to the dumps which lack one: if you make a list (maybe in a spreadsheet), I can make the changes. Nemo 15:01, 20 April 2019 (UTC)
I found that if the originalurl points to the index page instead of the API page (e.g. when the API was not available or not working properly), the script will also fail to find the page. Should the originalurl attribute be changed on IA, or should the script also attempt to search for the index URL? --Zerote000 (talk) 16:26, 20 April 2019 (UTC)
IA metadata should preferably record the URL which was actually used to make the dump, so it might be the index.php URL rather than the api.php URL. The script for WikiApiary should probably ignore the difference. Nemo 17:39, 20 April 2019 (UTC)
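Ignoring the api.php/index.php difference, as suggested above, amounts to normalizing either entry-point URL to a common base and then searching for both forms. A minimal sketch (the helper name `candidate_urls` is invented, and it assumes the standard MediaWiki layout where both scripts share a script path):

```python
from urllib.parse import urlparse

# Hypothetical helper: reduce an api.php or index.php URL to its base
# path, then return both entry points so a metadata search matches a
# dump regardless of which URL was recorded in "originalurl".
def candidate_urls(url: str) -> set:
    parsed = urlparse(url)
    path = parsed.path
    for suffix in ("api.php", "index.php"):
        if path.endswith(suffix):
            path = path[: -len(suffix)]
            break
    base = f"{parsed.scheme}://{parsed.netloc}{path}"
    return {base + "api.php", base + "index.php"}
```

With this, a search for either form of the URL finds the same wiki, which is what the merged PR effectively does.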
I created a PR on GitHub that searches using both the API and Index URL. Not sure if it's the best way to do it, but I checked that it worked with the wiki I originally found the issue with. --Zerote000 (talk) 21:24, 20 April 2019 (UTC)
Thanks for testing. Merged. Please check that it works with a larger amount of wikis too, I wonder what happens if there are multiple items. Nemo 07:00, 21 April 2019 (UTC)

Thanks both for this. Line 27 in the script,

  • "gen = pagegenerators.CategorizedPageGenerator(cat, start='Spyropedia')"

should be changed to

  • "gen = pagegenerators.CategorizedPageGenerator(cat, start='!')"

I did the commit. Emijrp (talk) 08:26, 11 June 2019 (UTC)
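The reason "!" works as a starting point: the generator's start parameter skips category members whose sortkey sorts before it, and "!" sorts before letters and digits in ASCII, so it effectively means "start from the first page", whereas 'Spyropedia' would skip every title sorting before it. A toy illustration of that behavior (plain Python, not pywikibot itself):

```python
# Simulate how a start parameter cuts off category members: only titles
# sorting at or after `start` are yielded. "!" (ASCII 0x21) sorts before
# any letter, so it passes every title through.
def categorized_pages(members, start):
    return [title for title in sorted(members) if title >= start]

members = ["Apfelwiki", "Spyropedia", "Zelda Wiki"]
print(categorized_pages(members, start="Spyropedia"))  # skips Apfelwiki
print(categorized_pages(members, start="!"))           # all three members
```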