- User Since: Feb 27 2015, 8:23 PM
- MediaWiki User: Jimkont
Nov 21 2019
I read the post; very good documentation.
Sep 2 2019
Aug 27 2019
Thank you for your answer; it is quite detailed and clear.
Aug 23 2019
Aug 22 2019
Jul 10 2019
The solution is to add $wgServer with the proper value in LocalSettings.php.
To clarify, the QueryService syncs data correctly with the subdomain I set in the Docker Compose file as a namespace.
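As a sketch, the relevant line in LocalSettings.php might look like the following; wiki.example.com is a placeholder hostname, not a value from this thread:

```php
<?php
// LocalSettings.php — set the canonical server URL so that generated
// links (and services like the Query Service updater) use the public
// subdomain rather than an internal container name.
// "wiki.example.com" is a placeholder, not the real hostname.
$wgServer = "https://wiki.example.com";
```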
Aug 17 2016
I find this info useful as I do not need to parse the wikitext to see where an item redirects to (see T143200).
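A hedged sketch of the idea: the MediaWiki action API can resolve a redirect server-side (with redirects=1 on an action=query call), so there is no need to parse the wikitext for a #REDIRECT line. The helper name and the example title below are illustrative, not from this thread:

```python
# Build a MediaWiki action=query request that resolves redirects
# server-side. With "redirects": 1 the API follows the redirect and
# reports the source -> target mapping in the "redirects" part of
# the response, so no wikitext parsing is needed.
def redirect_query_params(title):
    """Parameters for an action=query call that resolves redirects."""
    return {
        "action": "query",
        "titles": title,
        "redirects": 1,   # follow redirects and report the mapping
        "format": "json",
    }

# Usage (endpoint is illustrative; requires network access):
# import requests
# resp = requests.get("https://en.wikipedia.org/w/api.php",
#                     params=redirect_query_params("NYC")).json()
# resp["query"]["redirects"] lists {"from": ..., "to": ...} pairs.
```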
Nov 27 2015
Perfect! Thanks @ori
Nov 26 2015
@Aklapper can we push the deadline back 2 months, to the end of February? Thanks
Sep 18 2015
We are low on human resources at the moment but will try to push this a bit, hopefully in the coming 2-3 months.
We had some problems during the summer with the switch from HTTP to HTTPS in the Wikipedia APIs, and some live clients were broken for a period.
IIRC, when this project started (5 or more years ago) there was a 1 req/sec restriction and we were processing ~130 pages/minute.
This was the reason we chose the OAI + Wikipedia mirror approach.
Read API calls. For each page we process we make two calls: one to get the wikitext, and another to get the text of the first paragraph of the wiki page (the abstract) using the TextExtracts extension.
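The two read calls per page could be sketched as parameter sets for the MediaWiki action API: prop=revisions for the wikitext, and prop=extracts (from the TextExtracts extension) for the plain-text intro. The helper names are illustrative, and the rvslots parameter follows the current API rather than the API as it was at the time:

```python
# Sketch of the two read calls made per processed page, expressed as
# parameter dicts for the MediaWiki action API.

def wikitext_params(title):
    """Call 1: fetch the page's raw wikitext via prop=revisions."""
    return {
        "action": "query",
        "titles": title,
        "prop": "revisions",
        "rvprop": "content",
        "rvslots": "main",   # current API; newer than the period discussed
        "format": "json",
    }

def abstract_params(title):
    """Call 2: fetch the plain-text intro (abstract) via TextExtracts."""
    return {
        "action": "query",
        "titles": title,
        "prop": "extracts",
        "exintro": 1,        # only the section before the first heading
        "explaintext": 1,    # plain text instead of limited HTML
        "format": "json",
    }
```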
We need OAI for two reasons: (1) to get the update stream (which is solved by RCStream), and, just as important, (2) to have a local Wikipedia mirror where we can exceed the Wikimedia API rate limits.
Jul 28 2015
Does RCStream have a plugin that keeps a local Wikipedia copy up to date?
Jul 24 2015
Hi, due to the recent switch in the Wikipedia API from HTTP to HTTPS, DBpedia stopped feeding for 1-2 weeks until we identified and fixed the problem.
Are there any plans for OAI? We certainly want to keep getting update feeds, but we can switch to a new service if we can get a similar API.
May 6 2015
Other examples of old serializations can be found here: