Wed, May 26
Sun, May 23
Sat, May 22
Multichill's SDC editing code with pywikibot, as a reference
Thu, May 20
This affects the query service which uses the formatter URL to render the URI.
Wed, May 19
I think that this will still work as an example of reading a CSV/TSV file row by row and updating Wikidata based on values from the file.
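As a minimal sketch of the row-by-row pattern described above (the column names and sample data are assumptions for illustration; the actual pywikibot update step is only indicated in a comment):

```python
import csv
import io

def read_rows(tsv_text):
    """Yield one (qid, property, value) tuple per data row of a TSV file."""
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    for row in reader:
        yield row["qid"], row["property"], row["value"]

# Hypothetical sample data; the header names are assumptions for this sketch.
sample = "qid\tproperty\tvalue\nQ42\tP31\tQ5\n"

rows = list(read_rows(sample))
# For each row, a pywikibot script would then load the item and add or
# update the claim; that part is omitted here to keep the sketch runnable.
```

The point is only the shape of the loop: one file row in, one Wikidata edit out.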
May 10 2021
Apr 14 2021
Apr 13 2021
Apr 11 2021
Apr 9 2021
Unwanted regression: the CSS classes on the lines in the history/recent changes etc. lists, which tell whether a revision is reviewed or not, are gone. As an example, see the light blue lines in this screenshot for the edits reviewed by Opa.
Mar 31 2021
Given the context above, it may be worth considering whether to focus this change only on newcomers. That way, the more experienced editors would still get opportunities to contribute in a way that better suits them.
Mar 24 2021
Hmm, if I understand this correctly, the "sighted" (stable) version is enough for Google News, and it doesn't use the quality tier for aggregating news. It would be nice to confirm this somehow.
Another example case from Finnish Wikipedia
Mar 21 2021
Mar 20 2021
In Finnish Wikipedia, they are used so that there is some set of articles (say, featured articles) whose edits are checked for vandalism as normal, and then there is also a slower round where users from time to time check more deeply what has changed since the last quality revision.
I understand. But the choice here is not between "having tiers" and "not having tiers". It's between "not having tiers" and "undeploying flaggedrevs". The code under the hood is a big mess. My suggestion is to remove the tiers (and keep levels for now) and if needed:
- Either implement it later in a better way inside FlaggedRevs
- or have a new small self-contained extension to handle that. I know there were some efforts in WMF to write something to allow users to mark a revision/page in a structured way, but I think this is out of scope for this extension. The extension's job is "pending changes" and it should only do that (and do it well)
Ah, you mean that if the user can still select "well-sourced" when reviewing, then it is good enough(?) I'll try to explain again what I tried to say earlier.
Mar 19 2021
Hmm, it seems that seeing the info requires a suitable user group. However, here is a screenshot.
Not sure if I was able to follow how you defined "tiers". In fiwiki, the user interface has:
- stable = alias for any review
- checked = levels 1, 2
- quality = level 3
Right. You know that at least fiwiki is using more than two levels... I may not even be against axing the levels, but I thought I was at least somewhat following what is happening with FlaggedRevs, and it would be nice to at least hear about, or participate in, the discussion where you are deciding these things.
@Jdforrester-WMF if we are keeping levels inside a single dimension, it would be more useful to keep quality and pristine and link them together in the documentation.
"SimpleUI" will currently break fiwiki's page layout, as it collides with some templates. The "non-simple UI", aka the basic text box, works well when the number of pages where it is visible is low (as it is in fiwiki), and it actually notifies users that there are new changes to check.
Mar 18 2021
I would keep the wording the same as in the documentation, i.e. "Flagged revisions will no longer have multiple tags like "accuracy", "tone" or "depth" and will only have a single tag"...
Mar 16 2021
Bug confirmed with Windows 7 and Chrome 89.0.4389.82. It seems that if the link is on the bottom line, it is not activated when the mouse cursor hovers over it. The link cannot be clicked either.
If somebody is fast enough, there is still time to submit an application for this year's software/research project grants. The deadline is today, 16.3.2021.
@Ladsgroup just some notes:
Mar 15 2021
This is a summary by Kyykaarme of the reasons why the community wants to disable the invitations to translate articles.
Mar 14 2021
@Jdforrester-WMF Why is the WMF not able to fund the development of FlaggedRevs (or a replacement)?
Mar 2 2021
I copied the acceptance criteria from the older ticket, as these are merged. Feel free to edit them, but if you are dropping "if the local wiki configuration allows it, users can publish translations anonymously", then please give a well-thought-out rationale for it, as it feels like a point where what the user communities think collides with the technical design.
Feb 18 2021
The root cause of the problem could be a race condition when configuration values are merged (warning: this was never validated).
Feb 5 2021
Feb 4 2021
A very low-tech workaround for decreasing the current load could be to generate unsanitized id/timestamp pair tables, where users could query the possible min/max values for the primary ids. This could even be compacted so that it would contain only one pair per day.
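A minimal sketch of the compaction idea above (the timestamp format and sample ids are assumptions for illustration):

```python
def compact_pairs(pairs):
    """Compact (timestamp, id) pairs to one (min_id, max_id) entry per day.

    Timestamps are assumed to be ISO strings like '2021-02-04T10:00:00'
    (an assumption for this sketch), so the day is just the first 10 chars.
    """
    per_day = {}
    for ts, pk in pairs:
        day = ts[:10]
        lo, hi = per_day.get(day, (pk, pk))
        per_day[day] = (min(lo, pk), max(hi, pk))
    return per_day

# Hypothetical sample data: three rows collapse to one entry per day.
table = compact_pairs([
    ("2021-02-04T10:00:00", 120),
    ("2021-02-04T23:59:59", 180),
    ("2021-02-05T00:10:00", 181),
])
```

A query for a given day then becomes a cheap lookup of the possible id range instead of a scan over the big table.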
Feb 2 2021
Yes, the connection was started after your "Tue, Feb 2, 3:53 PM" comment.
Thanks for fixing this. I was able to log in, but the connection (or timeouts) is still a little bit shaky.
Seems to be down
Jan 21 2021
Jan 19 2021
Sure; in July 2019, when I wrote the comments, the rev_actor fields weren't documented.
Jan 18 2021
I just tested pHash (ImageHash.py's implementation), and for detecting duplicates it would be a huge improvement compared to SHA1, even if it were only used for testing exact matches, which would be fast using SQL. There would be false negatives, though.
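To illustrate the trade-off above: an exact pHash match is a plain equality lookup (fast in SQL), while near-duplicates need a Hamming-distance comparison, which is where exact matching produces false negatives. A small sketch, with made-up hash values and an arbitrary example threshold:

```python
def hamming(h1: int, h2: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(h1 ^ h2).count("1")

def near_duplicates(hashes, max_distance=4):
    """Pairwise near-duplicate scan over {name: hash}; the threshold of
    4 bits is an arbitrary example value, not a recommended setting."""
    items = list(hashes.items())
    out = []
    for i, (name_a, ha) in enumerate(items):
        for name_b, hb in items[i + 1:]:
            if hamming(ha, hb) <= max_distance:
                out.append((name_a, name_b))
    return out
```

An exact-match SQL query would only find pairs with distance 0; anything within a few bits of each other is invisible to it.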
Ok, thanks. By default, Python's ImageHash library's pHash length is 64 bits, and even if I make it longer it doesn't generate the same hashes. So it is confirmed that its pHash and JImageHash's PerceptiveHash don't generate the same hashes.
Jan 17 2021
Hi, is it possible to get some pHash values for pictures in Wikimedia Commons? I would like to see whether JImageHash would generate hash values compatible with Python's ImageHash, but I expect that it doesn't.
Use case example
Jan 15 2021
Jan 13 2021
Is it possible to get information on how big the Commons/Wikidata tables are on disk? (i.e. which tables are problematic from the replication point of view)
Some generic examples of how I am using cross-database joins. The problem is not really whether I know how to recreate them in code; rather, if cross-database joins go away, they become a lot more complex to implement (= more work to write the code). They are also a lot slower to run if I first need to fetch all the data locally and do the matching manually, as currently that work is done transparently by the SQL server.
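For a sense of what "doing the matching manually" means, here is a sketch of a client-side hash join over two separately fetched result sets (the table names and columns are invented for illustration):

```python
def hash_join(left_rows, right_rows, key):
    """Join two locally fetched result sets on `key`, emulating what the
    SQL server would otherwise do transparently in a cross-database join."""
    index = {}
    for row in right_rows:
        index.setdefault(row[key], []).append(row)
    joined = []
    for row in left_rows:
        for match in index.get(row[key], []):
            merged = dict(row)
            merged.update(match)
            joined.append(merged)
    return joined

# Hypothetical rows, as if fetched separately from two wiki databases.
left = [{"page_title": "Foo.jpg", "qid": "Q1"}]
right = [{"qid": "Q1", "label": "foo"}]
result = hash_join(left, right, "qid")
```

This works, but both result sets must fit in memory and be transferred over the network first, which is exactly the extra cost described above.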
Jan 7 2021
Jan 1 2021
If this is still true:
Just FYI, there is a discussion on Commons about this tool.
Dec 10 2020
More specifically, I am interested in the largest editor communities, not readership, as readership is rather skewed when comparing Wikidata/Wikipedia/Commons. I tried to analyze this myself by checking how many Wikidata editors from the last 30 days have ever edited Finnish Wikipedia in any significant amount, and the same for Commons. However, this ignores all of those who don't edit fiwiki, and includes users whose contributions are historical langlink edits and global vandal fighting.
Dec 9 2020
Ok, thanks for the info. Also, if you caught the general idea of the information I am trying to get, then feel free to refine the question into something more easily resolvable if needed.
Dec 8 2020
@Aklapper I didn't say that you would have a bunch of unused developers. I wrote that prioritizing this as low priority and postponing it to some unknown future is an error.
@MarkTraceur I would like to question your triage to low priority. I demonstrated in the ticket that responsiveness could mostly be fixed with CSS alone, which is trivial, while by focusing on creating a new upload tool the problem will still be there in 2025. Also, uploading files to Commons is one of the most basic features of the site. It is a more elemental feature for the usage of the site than, for example, structured data.
Dec 7 2020
One use case for the historical data.
Dec 2 2020
And as for current uses, at least I am using them for graphs. Though this is not realtime, as it aggregates data from other tables besides flaggedrevs_statistics, so I have updated the stats when I have needed them.
I think the most important use case is statistical analysis. Also, afaik it is also needed when WMF/the community figures out what to do with FlaggedRevs, as we don't really have much data on how FlaggedRevs is used.
Please don't purge them. If one wants to make timelines or graphs (like I do) of how the reviewing has developed, or wants to compare different language versions, he or she needs the historical data.
Dec 1 2020
Actually, not least because of T237191, as Tgr noted in the second ticket.
Nov 30 2020
Backlink to the Community Wishlist Survey 2021 proposal
Nov 29 2020
This seems to be the same as T246746
Another idea: currently the recentchanges rc_params contains this information for remote edits, and it is already used for generating backlinks to Wikidata. It could also contain the rc_patrolled value from Wikidata, which could be used for filtering.
One solution could be to hide all edits which are autopatrolled on Wikidata. However, currently all Wikidata edits are propagated to client wikis (what is the correct term for this?) with the value rc_patrolled=2, so there should either be a correct value for the autopatrol status OR more possible values, to separate local autopatrolled edits from remote autopatrolled edits.
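A sketch of the filtering the extra value would enable. Note that the distinct "remote autopatrolled" value is hypothetical here, not an existing MediaWiki value; the row shape is also simplified for illustration:

```python
# Hypothetical patrol values for this sketch: 0 = unpatrolled,
# 1 = patrolled, 2 = autopatrolled; the value 3 is an invented
# marker for edits autopatrolled on the remote (Wikidata) side.
REMOTE_AUTOPATROLLED = 3  # hypothetical, not an existing MediaWiki value

def hide_remote_autopatrolled(changes):
    """Keep only the recent changes not marked as remote-autopatrolled."""
    return [rc for rc in changes if rc["rc_patrolled"] != REMOTE_AUTOPATROLLED]

visible = hide_remote_autopatrolled([
    {"rc_id": 1, "rc_patrolled": 0},
    {"rc_id": 2, "rc_patrolled": 3},
])
```

Without a distinct value, local and remote autopatrolled edits both show as 2 and cannot be filtered apart, which is the problem described above.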
Nov 27 2020
Yes, it sounds agreeable to me.
On Finnish Wikipedia there are two use cases for disabling the links: the page is archived, or it is a project page. Currently there are no known problems with this, except that the CSS is a custom local change.
Nov 12 2020
Nov 11 2020
First, thank you for fixing this. Do you have any best/worst guess for when the tooldb will be up again? There's WLM jurying going on using http://montage.toolforge.org, which is down, and we need to inform the jurors of when reviewing the photos is expected to continue.
@Vis_M in which wiki were you planning to use it?
Nov 9 2020
@ppelberg just FYI that I added some quick CSS workarounds for the problems mentioned earlier on this page.
Oct 28 2020
There was also a comment that the Reply tool currently adds [ reply ] to archived parts of the discussion, which should not be changed. These links can be hidden even with local CSS, so it is not a blocker, but just FYI that it is something which needs to be solved somehow.
Oct 27 2020
Is it possible to configure the Reply tool so that it would add the [ reply ] link only when the line starts with a colon (:)? I think there is never a case in Finnish Wikipedia where the user should reply when a line starts with # or *. All of those are either votes OR participant lists, and the actual discussion is in a separate section.
Oct 26 2020
Article previews don't work because the RESTBase config is broken? This is a visible problem because article previews are toggled on by default in the settings.
Oct 25 2020
I created a discussion on the Wikidata project chat for defining the community side of the process.
Oct 21 2020
Oct 19 2020
@Zache are you able to describe what you are referring to above in a bit more detail? I want to make sure I'm understanding what you are saying before responding.
Oct 17 2020
Oct 15 2020
It seems that we could also do the spacing with nowiki instead of HTML comments. With nowiki, broken comments would not fail silently.
Oct 14 2020
In enwiki's help page, the spacing between items should be done using comments, like this in wikicode.
@ppelberg It was to make editing the wikicode easier. The original proposal was in the village pump discussion "Selkeyttävä ehdotus" ("A clarifying proposal") in 2005. It was low volume and contained only three comments. The next village pump discussion which referred to empty lines in discussions was in 2015 (Muistutus_keskustelusivukäytännöstä, "A reminder about the talk page practice"), when Otrfan noted that writers should follow the new-line guideline to make commenting easier.
Oct 13 2020
Thanks. It should, however, also be able to handle cases like this (in wikicode), where a comment is split across multiple dd:s.
At the HTML level it looks like this. So, with newlines, the extra margin between rendered comments comes from the margin between separate lists. However, as Whatamidoing (WMF) noted in the village pump, the separate lists are an accessibility problem for screen readers etc.
Oct 1 2020
Hmm, it seems to be working now, though I tested the URL before I submitted the bug report. If it stays up, then this ticket can be closed.