User Details
- User Since
- Nov 16 2019, 12:38 PM
- MediaWiki User
- Tilomi
Jul 17 2020
You can close it, thanks again for the answer!
Jun 20 2020
@AntiCompositeNumber Thanks for the answer, this is really useful information for us. From your description, I would say it is the concurrent request limit that gets us the 429 error, since we load more than four pictures at once.
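One way to stay under such a limit client-side is to gate the downloads behind a semaphore so that no more than four run at once. A minimal sketch of the idea (the limit of 4, the `fetch_picture` stub, and the URLs are assumptions for illustration, not the real download code):

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

MAX_CONCURRENT = 4          # assumed server-side concurrent-request cap
_slots = threading.Semaphore(MAX_CONCURRENT)
_lock = threading.Lock()
active = 0
peak = 0

def fetch_picture(url):
    """Stub standing in for the real HTTP download; only tracks concurrency."""
    global active, peak
    with _slots:                       # block until one of the 4 slots is free
        with _lock:
            active += 1
            peak = max(peak, active)
        time.sleep(0.005)              # stand-in for network time
        with _lock:
            active -= 1
    return url

urls = [f"img{i}" for i in range(20)]  # hypothetical picture identifiers
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(fetch_picture, urls))
```

Even though the pool has ten workers, the semaphore guarantees that at most four downloads are in flight at any moment.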
Jun 19 2020
@Aklapper Thanks for the fast answer.
Nov 17 2019
OK, interesting. So I guess I have to wait until this issue is resolved, or is there other functionality I can use that won't load and initialize sitelinks? For a new feature we use the Wikipedia sitelink, but the performance problem occurred even before we used sitelinks.
Thanks for the answers.
I also tried the default value as well as higher values like 10, 20, or 60 for maxlag, but unfortunately there was no improvement after changing maxlag.
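For reference, maxlag works by having the server reject the request with a maxlag error whenever replica lag exceeds the given threshold; the client is then expected to wait and retry. A minimal sketch of that retry loop with a stubbed API call (`call_api` and the response shapes are simplified assumptions, not the real MediaWiki client):

```python
import time

def call_api(params, responses):
    """Stub standing in for an action-API HTTP request."""
    return responses.pop(0)

def request_with_maxlag(params, responses, maxlag=5, retries=3):
    """Send a request with maxlag set; retry when the server reports lag."""
    params = dict(params, maxlag=maxlag)
    for _ in range(retries):
        resp = call_api(params, responses)
        if resp.get("error", {}).get("code") != "maxlag":
            return resp
        time.sleep(0)  # real code should honor the Retry-After header
    raise RuntimeError("replication lag did not recover")

# Simulated server: one maxlag rejection, then a normal reply.
responses = [{"error": {"code": "maxlag", "info": "lagged"}},
             {"query": {"pages": []}}]
result = request_with_maxlag({"action": "query"}, responses)
```

Note that a *higher* maxlag value makes the client more tolerant of lag (fewer rejections), which is why raising it is the usual advice when requests keep being deferred.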
I'll take a look at WikidataIntegrator and compare it with pywikibot.
Nov 16 2019
I thought about data dumps as well, but I think it's a bit of overkill to download 60 GB to query about 100 MB of data, so I wanted to check first whether there are any other possibilities regarding the SPARQL endpoint or pywikibot.
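For the SPARQL route, the Wikidata Query Service endpoint accepts the query as a plain GET parameter, so no dump download is needed for a targeted subset of the data. A minimal sketch of building such a request (the example query, which asks for items with an English Wikipedia article, is a hypothetical illustration):

```python
from urllib.parse import urlencode

ENDPOINT = "https://query.wikidata.org/sparql"

# Hypothetical example query: items that have an English Wikipedia article.
QUERY = """
SELECT ?item ?article WHERE {
  ?article schema:about ?item ;
           schema:isPartOf <https://en.wikipedia.org/> .
} LIMIT 100
"""

def build_request_url(endpoint, query):
    """The endpoint accepts GET with 'query' and 'format' parameters."""
    return endpoint + "?" + urlencode({"query": query, "format": "json"})

url = build_request_url(ENDPOINT, QUERY)
```

The resulting URL can then be fetched with any HTTP client; `format=json` asks the service for machine-readable results.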
I played around with them, but I saw no change in performance. I set the following options to the following values because I thought they would affect performance:
minthrottle = 0   # default value
maxthrottle = 1   # default 60
put_throttle = 0  # default 10 (I only read pages, so this was probably unnecessary)
maxlag = 1        # default 5
step = -1         # default value
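For context, minthrottle and maxthrottle bound pywikibot's client-side delay between consecutive requests. The underlying idea can be sketched as follows (this is a simplified stand-in, not pywikibot's actual Throttle class, and the 0.05-second delay is an arbitrary example):

```python
import time

class Throttle:
    """Simplified stand-in for a client-side request throttle:
    wait at least `delay` seconds between consecutive requests."""

    def __init__(self, delay):
        self.delay = delay
        self.last = 0.0

    def wait(self):
        # Sleep only for whatever part of the delay has not yet elapsed.
        remaining = self.last + self.delay - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)
        self.last = time.monotonic()

throttle = Throttle(delay=0.05)
start = time.monotonic()
for _ in range(3):
    throttle.wait()   # first call passes immediately, later calls pause
elapsed = time.monotonic() - start
```

With the throttle effectively disabled (as in the settings above), request pacing stops being the bottleneck, which is consistent with seeing no performance change from these options.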