User Details
- User Since
- Apr 24 2016, 2:38 AM
- Availability
- Available
- LDAP User
- Unknown
- MediaWiki User
- Termininja [ Global Accounts ]
May 26 2016
Oops, sorry, I meant to write 1 second. And yes, I will make the requests serial, so every previous request has to finish first.
Of course, this will solve the problems; in the next few days I'll refactor my bot code. I plan to make it do all POST requests serially, while continuing to use parallel GET requests. In any case, from now on there will be a minimum 1-minute interval between my bot's contributions.
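As a minimal sketch of that serialization (the bot's actual language and code are not shown here, so the class name, the injectable `clock`/`sleep` parameters, and the interval handling below are all illustrative assumptions):

```python
import time
from collections import deque

class SerialEditQueue:
    """Illustrative sketch: each write (POST) runs only after the previous
    one has fully finished, with a minimum interval between edits.
    Reads (GETs) could still be issued in parallel elsewhere."""

    def __init__(self, min_interval=60.0):
        self.min_interval = min_interval
        self._last_finish = None
        self._tasks = deque()

    def submit(self, task):
        """Queue a zero-argument callable that performs one edit."""
        self._tasks.append(task)

    def run(self, clock=time.monotonic, sleep=time.sleep):
        """Drain the queue serially; clock/sleep are injectable for testing."""
        results = []
        while self._tasks:
            task = self._tasks.popleft()
            if self._last_finish is not None:
                wait = self.min_interval - (clock() - self._last_finish)
                if wait > 0:
                    sleep(wait)  # enforce the minimum gap between edits
            results.append(task())  # this edit fully finishes before the next starts
            self._last_finish = clock()
        return results
```

With `min_interval=60.0`, three queued edits complete at least one minute apart, matching the interval described above.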
May 19 2016
Yes, I understood this yesterday for the first time. Nobody had informed me about it before on Wikidata, where my bot mainly works. Also, nobody taught me how to write my bot code; I learned as I went. When I started a year and a half ago, my bot used a web browser to make edits, and it took ~15 seconds to add a link to bgwiki to an item (for example). I was reading only Wikidata:Bots, where nothing about parallel or serial requests was mentioned. Even if it had been mentioned, I didn't know anything about asynchronous requests, so I could not have understood it. But later I improved my bot code, it started working fast, and I was happy... After the first warning about the speed I changed the code; the other people were happy, and so was I... Since then I have improved my bot code non-stop, which means I improve everything else but do not increase the speed! And as I already mentioned above, every time someone tells me my bot is fast, I decrease the speed further than the last time (see above: 300 → 100 → 60 sec as the limit for median lag). When I was warned last time, 4 days ago, by User:Hoo man, he only suggested using the maxlag parameter, so I did. And I thought everyone was happy after that, but no: I didn't know anything about problems on other wikis, and how was I supposed to know?!
May 18 2016
Thanks Aklapper, I'll use it if I decide to do something on Commons. And yes, Legoktm, I never said that my bot uses synchronous requests; why did you have to believe that?
May 17 2016
The block itself is not the problem; blocking is a normal and correct action when there is a problem with a bot. But don't forget that this bot is operated by me, and I don't want to harm any wiki, so keeping the block in place is of course unnecessary and a very poor decision. As you can see, every time someone informed me of a problem I took immediate action to resolve it (the same happened yesterday: when I was warned, the advice was to use maxlag, and I did), but with what you are discussing here I can't help. The guideline you mentioned above is too abstract for anyone to follow strictly; we need concrete measures and values. Take whatever decision is needed and just inform me; I'll take the necessary actions on my side afterwards, as always. I'm here to help, not to harm! Just don't forget to clarify these changes in https://www.wikidata.org/wiki/Wikidata:Bots#Bot_accounts for all bots globally, not only for mine.
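For context, the MediaWiki API's maxlag parameter mentioned here works as follows: the client adds `maxlag=N` to each request, and when database replication lag exceeds N seconds the server rejects the request with an error of code `maxlag`, so the client should wait and retry. A small Python sketch, assuming a `post` callable that performs the HTTP request and returns decoded JSON (the retry count and backoff values below are illustrative, not the bot's actual settings):

```python
import time

def api_call_with_maxlag(post, params, maxlag=5, retries=5, sleep=time.sleep):
    """Send an API request with the maxlag parameter; on a 'maxlag' error,
    back off and retry. `post` is any callable taking a params dict and
    returning the decoded JSON response."""
    params = dict(params, maxlag=maxlag, format="json")
    resp = {}
    for attempt in range(retries):
        resp = post(params)
        if resp.get("error", {}).get("code") != "maxlag":
            return resp  # success, or an unrelated error for the caller to handle
        sleep(5 * (attempt + 1))  # server is lagged: wait, then try again
    return resp
```

Because the server itself refuses requests while lagged, this makes a bot automatically yield to replication pressure without hand-tuned thresholds.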
One year ago there was a similar issue; back then my bot made ~200 edits/sec, and I didn't know anything about bot policies and lag. When I was warned, my bot started checking the median lag every minute and stopped working if the lag reached 300. But the problem continued, so I decreased the maximum lag to 100, and later I made the bot slow down if the lag exceeds 50 and stop if it reaches 60. This is how my bot has worked for the last year.
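The staged policy described above can be sketched as a pure function. The 50 and 60 second thresholds come from the text; the function name and the concrete delay values are hypothetical, since the actual delays are not stated:

```python
def edit_delay(median_lag, soft_limit=50.0, hard_limit=60.0,
               normal_delay=1.0, slow_delay=30.0):
    """Staged throttle for the median replication lag (seconds):
    - below soft_limit: edit at the normal pace,
    - between soft_limit and hard_limit: slow down,
    - at or above hard_limit: return None, i.e. stop editing
      until the lag recovers."""
    if median_lag >= hard_limit:
        return None  # stop entirely until the next lag check
    if median_lag >= soft_limit:
        return slow_delay  # back off while the servers catch up
    return normal_delay
```

The bot would call this once per lag check (every minute, per the text) and sleep for the returned number of seconds, pausing entirely on `None`.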