In T200339#4577502, @MichaelMaggs wrote: Have I missed something?
Sep 19 2018
Sep 13 2018
Sep 12 2018
In T163642#4503980, @Smalyshev wrote: @Multichill I think with the new description it is clearer what this is about.
Sep 4 2018
I checked several systems and they all run 2.7.6. This task is way too soon. Come back in a couple of years. You're going way too fast with this campaign to drop Python 2.
Sep 3 2018
In T200339#4552218, @Lokal_Profil wrote: Thanks @Multichill for finding the source for this.
Not finding the header doesn't stop the page from being harvested, but not finding the row template does. In both cases redirects are not resolved so those entries are skipped.
{{HB Scotland header}} and {{HB Scotland row}} are still referred to from the template documentation, so it's unclear whether a search/replace is desired.
I'll at least update the config to use the two target templates ("HS listed building header" and its row counterpart).
https://gerrit.wikimedia.org/r/#/c/pywikibot/core/+/371659/6/pywikibot/site.py sure looks like it. Good to see it implemented!
Sep 2 2018
Multichill lowered the priority of T192912: Unexpected ratelimit makes Pywikibot crash from High to Low.
In T192912#4551136, @Framawiki wrote: @Multichill is the error still given? I've read somewhere that this hacky limitation was removed from wd.org.
Aug 15 2018
Mvolz awarded T163642: Index Wikidata strings in statements for fulltext search a Love token.
Aug 4 2018
@Smalyshev / @debt: I think this is one of those tasks where we have a bit of a misunderstanding about scope (see https://lists.wikimedia.org/pipermail/wikidata/2018-August/012282.html ). Shall we close this one as resolved and make clearly scoped follow-up tasks to untangle this? :-)
Jul 25 2018
I don't think anything else is using it. I would check two weeks of logs to see if anything tried to use it and if not, just kill it.
Jul 22 2018
Yup, had this too. In https://commons.wikimedia.org/w/index.php?title=Campaign%3Apainting-pd-art-self&type=revision&diff=312102316&oldid=312090259 I fixed the license. That change didn't have any effect: I kept getting a broken message, see for example https://commons.wikimedia.org/wiki/File:Jos%C3%A9_Garcia_Ramos_-_El_ni%C3%B1o_del_viol%C3%ADn.jpg where it still used the old template. Later I did https://commons.wikimedia.org/w/index.php?title=Campaign%3Apainting-pd-art-self&type=revision&diff=312135588&oldid=312102316 and that solved it almost instantly.
Jul 18 2018
Framawiki awarded T192690: Mass message broken on Wikidata after ratelimit workaround a The World Burns token.
Jul 13 2018
In T102533#4422722, @Liuxinyu970226 wrote: For Norwegian, I strongly encourage @Multichill and other Dutch users to cease the "one Norwegian" glitch; rather we, just all Wikimedians around the world, should always separate them as one of
<snip>
Jul 12 2018
In T194950#4420642, @Addshore wrote: There are still some bots and tools that are not setting appropriate maxlag values for their requests, but we can follow up elsewhere with that.
In T199379#4419930, @Addshore wrote: Should be fixed now, but pages that are already showing bad stuff will need a purge!
Jul 10 2018
Multichill added projects to T102533: [Bug] Disallow (or resolve) dummy language codes.: I18n, MediaWiki-Internationalization.
I ran into this because I imported data that was tagged as "NOR", which is a valid ISO 639-2 language code mapping to the ISO 639-1 code "no"; see https://en.wikipedia.org/wiki/Norwegian_language . Norwegian is a valid macrolanguage (see https://en.wikipedia.org/wiki/ISO_639_macrolanguage ) and wouldn't be the first macrolanguage to include: we also have ar (Arabic) and ne (Nepali) as valid language codes.
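The normalisation described above could be sketched roughly like this (a hypothetical helper, not from the importer's actual code; only the codes mentioned in this thread are listed):

```python
# Hypothetical normalisation step for imported language tags: map ISO 639-2
# codes of the macrolanguages mentioned above to their ISO 639-1 equivalents.
ISO639_2_TO_1 = {
    "nor": "no",   # Norwegian (macrolanguage)
    "ara": "ar",   # Arabic (macrolanguage)
    "nep": "ne",   # Nepali (macrolanguage)
}

def normalize_lang(code):
    """Lower-case the tag and map ISO 639-2 to ISO 639-1 where known."""
    code = code.strip().lower()
    return ISO639_2_TO_1.get(code, code)

print(normalize_lang("NOR"))  # no
```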
Jul 9 2018
Multichill added a comment to T199146: "Blocked" response when trying to access constraintsrdf action from production host.
In T199146#4409514, @Smalyshev wrote: Yeah, looks like the ipblocks table for wikidata has a block on 2620:0:862:101:0:0:0:0/96 by user "Merlissimo" with the comment 'Toolserver Range - no anon edits', but this doesn't seem to match wdq9. So probably not this one.
Jul 8 2018
Jul 5 2018
Multichill added a comment to T198849: Argument processing chokes on python 2 when an argument contains non-ascii.
Looks like @Dalba introduced it in https://gerrit.wikimedia.org/r/#/c/pywikibot/core/+/440096/
Jul 2 2018
Multichill renamed T194392: Add throttle exception for Wikimedia hackathon 2018 in Barcelona from j7caaaaaaa to Add throttle exception for Wikimedia hackathon 2018 in Barcelona.
Jun 29 2018
Steinsplitter awarded T110833: Provide service to filter over categorization from a list of Commons categories a Pterodactyl token.
Multichill moved T194950: Include Wikibase dispatch lag in API "maxlag" enforcing from Backlog to Upstream on the Pywikibot-Wikidata board.
Multichill added a project to T194950: Include Wikibase dispatch lag in API "maxlag" enforcing: Pywikibot-Wikidata.
Jun 28 2018
@Xqt : Why are you deprecating perfectly valid generators?
In T194950#4322078, @Magnus wrote: Excellent, that means my existing code should Just Work (tm).
Is the "lag" value in the API reply in seconds? maxlag does not appear to be documented for the query action...
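To make the question concrete, here is a hedged sketch of how a client could read the lag out of a maxlag error. The payload shape is an assumption based on the usual action API error format (a numeric `lag` field plus a human-readable `info` string), not something confirmed in this task:

```python
import re

def lag_seconds(api_error):
    """Return the reported replication lag in seconds, or None.

    Assumes the action API error object looks like
    {"code": "maxlag", "info": "Waiting for db1234: 1.3 seconds lagged.", ...}
    and may carry an explicit numeric "lag" field.
    """
    if api_error.get("code") != "maxlag":
        return None
    # Prefer an explicit numeric field if the API provides one ...
    if "lag" in api_error:
        return float(api_error["lag"])
    # ... otherwise fall back to parsing the human-readable message.
    m = re.search(r"([0-9.]+) seconds lagged", api_error.get("info", ""))
    return float(m.group(1)) if m else None

example = {"code": "maxlag", "info": "Waiting for db1234: 1.3 seconds lagged."}
print(lag_seconds(example))  # 1.3
```

If the value is indeed in seconds, a client would typically sleep for at least that long (or honour the Retry-After header) before retrying.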
Jun 14 2018
Multichill added a comment to T192800: WDQS could allow discovery of skos:exactMatch links via autocomplete.
In T192800#4273584, @LJ wrote: @Smalyshev I think we couldn't agree on implementation details, so this is probably going to be discussed at the Berlin workshop, at which time someone will add a more detailed example & description.
So I stopped operating the bot back in 2015 because the time it would cost to fix it wasn't worth it given the (negative) feedback.
If there are multiple people who are willing to help out here I'm more than happy to invest a bit of time to set the thing up again.
Jun 10 2018
Multichill reopened T138517: mysqldump is timing out preventing all tables from being included in the dump as "Open".
tools.heritage@tools-bastion-02:~/logs$ date
zo jun 10 11:44:09 UTC 2018
tools.heritage@tools-bastion-02:~/logs$ ls -alt update_monuments.log
-rw-rw---- 1 tools.heritage tools.heritage 96412359 jun 10 11:44 update_monuments.log
tools.heritage@tools-bastion-02:~/logs$ grep mysqldump update_monuments.log
mysqldump: Error 2013: Lost connection to MySQL server during query when dumping table monuments_am_(hy) at row: 13534
mysqldump: Error 2013: Lost connection to MySQL server during query when dumping table monuments_be-vlg_(fr) at row: 19365
mysqldump: Error 2013: Lost connection to MySQL server during query when dumping table monuments_be-vlg_(fr) at row: 56932
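For triage, "Lost connection to MySQL server during query" on large tables is commonly mitigated by raising the dump-side packet and buffer limits. A hedged sketch only (the flag values are illustrative guesses, and the database/table names are placeholders, not taken from the log above):

```shell
# Illustrative mitigation sketch, not the tool's actual invocation.
# --quick streams rows instead of buffering whole tables in memory;
# the packet/buffer sizes here are guesses to be tuned against the server.
mysqldump --quick --skip-lock-tables \
    --max-allowed-packet=512M \
    --net-buffer-length=1M \
    "$DATABASE" "$TABLE" > dump.sql
```

Whether this helps depends on the server-side `net_write_timeout` and `max_allowed_packet` settings as well, which the tool account may not control.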
Jun 1 2018
May 30 2018
In T163642#4237402, @Smalyshev wrote: @Multichill I think the point of this task was to index the statements, which is done. For searching, you can use haswbstatement for now. I am not sure whether it makes sense to copy the statement value into the "all" field, where it would then be searchable by plain search too; it may be useful for distinctive IDs, but I am not sure how many of them are distinctive. I think it's better to make a separate task for this.
Multichill reopened T163642: Index Wikidata strings in statements for fulltext search, a subtask of T46529: Wikidata search problems (tracking), as Open.
Multichill reopened T163642: Index Wikidata strings in statements for fulltext search, a subtask of T179815: Enable searching by author name string, as Open.
May 28 2018
Freenode implemented the I-line. I didn't hear anyone complain about IRC, so I guess it worked.
@Smalyshev https://www.wikidata.org/w/index.php?search=%22SK-C-5%22 doesn't work yet, but this task has been closed. Can you explain? This is listed in the task description as something that should work.
May 24 2018
action=purge?
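The purge suggested here goes through the standard MediaWiki action API. A minimal offline sketch that only builds the request (the title Q42 is an illustrative placeholder, not from this task):

```python
from urllib.parse import urlencode

def purge_request(api="https://www.wikidata.org/w/api.php", titles=("Q42",)):
    """Build the endpoint and POST body for an action=purge request."""
    params = {"action": "purge", "titles": "|".join(titles), "format": "json"}
    return api, urlencode(params)

url, body = purge_request()
print(url)   # https://www.wikidata.org/w/api.php
print(body)  # action=purge&titles=Q42&format=json
```

In practice a purge should be sent as a POST, and recent MediaWiki versions require a CSRF token for it.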
May 20 2018
Multichill updated the task description for T195178: New constraint type to ensure that Items have a Label in a specific language.
This is part of https://www.wikidata.org/wiki/Help_talk:Property_constraints_portal#Improvements_for_2018 . I would start with just a binary check in the first version and maybe do a second version that looks for real references (and not "imported from").
May 18 2018
Multichill added projects to T194956: corrupted files in the cache: Multimedia, SDC General, Cloud-Services.
Multichill added a comment to T89552: Implement International Image Interoperability Framework (IIIF) prototype service on Wikimedia labs.
Multichill moved T194933: Discuss property creation on Wikidata for Structured Data on Commons from Backlog to Session on the Wikimedia-Hackathon-2018 board.
Multichill updated the task description for T194933: Discuss property creation on Wikidata for Structured Data on Commons.
May 17 2018
Multichill added a comment to T194503: Special:EditWatchlist TOC navigation doesn't work since OOUI conversion: <legend> elements have no "id" parameters.
Confirmed on both Wikidata and the Dutch Wikipedia using Firefox.
May 16 2018
Multichill triaged T194391: Freenode Iline for Wikimedia Hackathon 2018 in Barcelona as High priority.
May 15 2018
I emailed Freenode. We should poke them if we haven't gotten confirmation by Thursday.
Multichill added a comment to T194346: Add throttle exception for Netherlands Hackathon 2018 - Women Tech Storm.
Note: Because we were a bit late with deploying this change, it didn't actually work and we hit the limit. @Reedy what was the limit again and where is this documented? I would like to update https://meta.wikimedia.org/wiki/Mass_account_creation#Requesting_temporary_lift_of_IP_cap to prevent other people from running into the same problem.
Multichill updated subscribers of T194392: Add throttle exception for Wikimedia hackathon 2018 in Barcelona.
Multichill renamed T194392: Add throttle exception for Wikimedia hackathon 2018 in Barcelona from [Ipaddress still missing] Add throttle exception for Wikimedia hackathon 2018 in Barcelona to Add throttle exception for Wikimedia hackathon 2018 in Barcelona.
May 11 2018
I cloned Pywikibot using $ git clone --recursive ssh://multichill@gerrit.wikimedia.org:29418/pywikibot/core.git pywikibot
May 10 2018
Multichill updated subscribers of T194392: Add throttle exception for Wikimedia hackathon 2018 in Barcelona.
And https://www.wikidata.org/w/index.php?search=haswbstatement%3AP217%3DSK-C-5 works :-). https://www.wikidata.org/w/index.php?search="SK-C-5" doesn't work (yet?). Is that the next step?
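The two searches compared above can be reproduced programmatically. A small sketch that only constructs the search URLs (the endpoint is the one used in the links above):

```python
from urllib.parse import urlencode

def search_url(query):
    """Build a Wikidata fulltext-search URL for the given query string."""
    return "https://www.wikidata.org/w/index.php?" + urlencode({"search": query})

works = search_url('haswbstatement:P217=SK-C-5')   # statement-based search
not_yet = search_url('"SK-C-5"')                   # plain quoted-string search
print(works)
print(not_yet)
```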
May 9 2018
Multichill moved T194307: Structured Wikiquote from Backlog to Project on the Wikimedia-Hackathon-2018 board.
Multichill moved T166956: Cannot use Composer's CLI to manage a project's dependencies from Backlog to Project on the Wikimedia-Hackathon-2018 board.
I'm sorting out the hackathon 2018 board (on which this task is), and well-described tasks generally have a higher chance of actually being worked on. This might not be the best board for a tracking task.
Multichill moved T27000: Deploy ThrottleOverride extension to Wikimedia wikis from Backlog to Project on the Wikimedia-Hackathon-2018 board.
Multichill moved T192067: Prototype ideas for translation on mobile from Backlog to Project on the Wikimedia-Hackathon-2018 board.
Multichill moved T193395: QuickPresets v2 from Backlog to Project on the Wikimedia-Hackathon-2018 board.
Multichill moved T172600: Tool "globalsearch" loads bootstrap from bootstrapcdn from Backlog to Project on the Wikimedia-Hackathon-2018 board.
What is the goal of this task? Can you elaborate?
Multichill moved T194254: Documentation sprint on Wikibase installation from Backlog to Session on the Wikimedia-Hackathon-2018 board.
Multichill moved T194279: Lexicographical data on Wikidata: what's coming? from Backlog to Session on the Wikimedia-Hackathon-2018 board.
Multichill moved T194275: Structured Data on Commons and GLAM session at the Wikimedia Hackathon from Backlog to Session on the Wikimedia-Hackathon-2018 board.
Multichill moved T192567: Expose constraint violations to WDQS from Backlog to Project on the Wikimedia-Hackathon-2018 board.
Multichill moved T194123: How to properly design and write browser tests from Backlog to Session on the Wikimedia-Hackathon-2018 board.
Multichill moved T193701: Explore using user clicks data to tune Wikidata search parameters from Backlog to Project on the Wikimedia-Hackathon-2018 board.
Multichill moved T192526: Diff-view: indicators should not be copied on copy-and-paste from Backlog to Project on the Wikimedia-Hackathon-2018 board.
Multichill moved T138371: WordPress plugin to associate tags with Wikidata IDs from Backlog to Project on the Wikimedia-Hackathon-2018 board.
Multichill moved T189791: 2018 Hackathon: Tell Me Why Your Search Sucks! from Backlog to Session on the Wikimedia-Hackathon-2018 board.
Multichill closed T163475: Populate the page_props table on Wikidata with wb-identifiers as Resolved.
I'm pretty sure this is done. Please re-open if that's not the case.
In T163642#4192776, @Smalyshev wrote: @Lea_Lacroix_WMDE Also, for newly edited items it should be working as soon as wmf.3 is deployed. But for older items it will need a reindex.
May 8 2018
@Lydia_Pintscher @Ladsgroup any status update on this?
Apr 24 2018
In T192690#4153593, @Legoktm wrote: This seems inherently broken; the noratelimit right exists for a reason. I'm not really sure how to work around this in MassMessage besides disabling rate limits, I suppose?
Did I miss an announcement to wikitech-l/wikidata-tech that something like this was going to happen?
Multichill updated the task description for T192690: Mass message broken on Wikidata after ratelimit workaround.
For me the image is a link to https://commons.wikimedia.org/wiki/File:Douglas_adams_portrait_cropped.jpg and the link goes to the local page; no MediaViewer. Linking directly to Commons is something most Wikipedias have been doing for years to eliminate the extra local step.
Apr 23 2018
I don't agree on renaming, that would break it for the users who use it. It's just a bunch of symlinks to /data/project/pywikibot/ anyway. What I would propose:
- Change owner to tools.pywikibot for /shared/pywikipedia/ (now a big mix)
- Link /shared/pywikibot to /data/project/pywikibot/public_html/core/
- Leave a readme in /shared/pywikipedia/ to say it's deprecated and people should use /shared/pywikibot/
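The proposal above can be rehearsed in a sandbox before touching the real paths. A sketch only: the real paths /shared/pywikipedia and /data/project/pywikibot come from the comment above, but here everything happens inside a throwaway temp directory (and the chown step is skipped, since ownership changes need the real tool account):

```shell
# Rehearse the symlink migration in a sandbox; nothing outside $sandbox is touched.
sandbox=$(mktemp -d)
mkdir -p "$sandbox/data/project/pywikibot/public_html/core"
mkdir -p "$sandbox/shared/pywikipedia"

# Step 2: link /shared/pywikibot to the pywikibot tool's core checkout.
ln -s "$sandbox/data/project/pywikibot/public_html/core" "$sandbox/shared/pywikibot"

# Step 3: leave a README telling people the old path is deprecated.
printf 'This directory is deprecated; use /shared/pywikibot/ instead.\n' \
    > "$sandbox/shared/pywikipedia/README"

readlink "$sandbox/shared/pywikibot"
cat "$sandbox/shared/pywikipedia/README"
```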