Fri, Jan 18
Per the decision in T213913, removing the instance should solve the outdated-OS problem.
That was fast, thanks!
Thu, Jan 17
oh, sorry for duplicating the effort. I dropped out of IRC and also didn't refresh the task, so I missed all the comments. I'll make sure this doesn't happen again.
Wed, Jan 16
Cool, let's nuke it then!
@Lydia_Pintscher any concerns about disposing of this instance?
It is about https://wikidata-lexeme.wmflabs.org/index.php/Main_Page
Mon, Jan 14
@hashar Please do not remove these jobs. Those are the daily jobs of the Wikibase(Lexeme) extensions, which we want to keep. It is true, though, that they have been red for quite a while. We're working on fixing those failures so that we can gradually migrate from the Ruby tests to the Node ones.
It is actually my personal goal for this quarter to get all of Wikibase's daily Selenium jobs green. It is simply embarrassing that these problems have gone unsolved for so long.
Sun, Jan 13
Thu, Jan 10
As Noa's manager, I endorse the request.
Wed, Jan 9
I've submitted an RFC about the whole concept of the Wikibase front-end changes as T213318. I've taken the liberty of subscribing everyone who was kind enough to comment on this task to the RFC.
This ticket was intended as the "pure" service request, hence I'm removing the TechCom-RFC tag. I'm also marking it as stalled for now, to focus on the RFC ticket first, as the service request has little point without our general approach being discussed first.
Tue, Jan 8
It did (today, not on Monday, though). I hope the outcome is that @Joe and @akosiaris have a better understanding of what we have in mind. What we talked about (Wikibase front-end architecture) is also going to be turned into an RFC in the next 24 hours.
Mon, Jan 7
T200011 got resolved and revealed that, at least when targeting "real" sites rather than local wikis, Wikibase(Lexeme) browser tests should include a login step.
See subtask for details of this blocker.
Disclaimer: In this comment I focus exclusively on the technical aspects of the topic, i.e. on Wikibase as a piece of software. I am deliberately not commenting on what might or might not be good social/community process additions to what the software allows/provides.
Besides T205192, there is also the issue that EntityInfoBuilder is involved in generating the links on the item etc. page, and it is still hitting wb_terms. That is T198868, or T205193 in particular.
Arguably, changing this is not required to consider this use case solved, but it would be really nasty to leave it unchanged.
Very much appreciated @JAllemandou!
@JAllemandou: as we at WMDE need the December data (Dec 1st in particular), could you please work your magic again and re-run the job to backfill the data? Thanks in advance.
Thu, Jan 3
Thanks everyone for the comments so far. You're right that this ticket in its current state is definitely not a ready RFC. We're going to turn it into one, or create a separate RFC ticket, in the upcoming days.
As preparation, we're going to have a little chat with @Joe on Monday to talk about our plans and see which elements of the plan are particularly unclear or problematic. Things are clear in our heads, but that doesn't mean it's all obvious to other people :) Interested CET-timezone people are of course welcome to join.
This talk is of course not meant as a replacement for the RFC review process.
good catch. Please ignore this message; it is an artefact from the days when the tests were run using Sauce Labs. The same message is shown in passing tests too, I believe.
Looking at job 443 which you used as an example, the first failing test is https://integration.wikimedia.org/ci/view/Selenium/job/selenium-Wikibase-chrome/BROWSER=chrome,MEDIAWIKI_ENVIRONMENT=beta,PLATFORM=Linux,label=DebianJessie%20&&%20contintLabsSlave/443/testReport/junit/(root)/Using%20time%20properties%20in%20statements/Check_UI_for_invalid_values__outline_example_____32_12_2015___/
In this case, the actual error message is:
timed out after 30 seconds (Watir::Wait::TimeoutError)
Dec 21 2018
To avoid misunderstandings: I was not questioning MediaWiki's action API being performant. By "lightweight" I was referring to the "PHP has high startup time" point @daniel made above as one of the reasons why no service should call the MW API.
the request to index.php is conditionally routed directly to the SSR service. In our world, the SSR service is there, so we configure it in Varnish; it returns HTML, and Vue takes over client-side. For other MediaWiki installations, index.php knows to render a basic version of the HTML which pulls in the Vue.js modules. Once this loads in the browser, it renders the interface.
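To make the fallback logic concrete, here is a minimal sketch (in Python, purely illustrative and not the actual Wikibase/Varnish implementation; the function names and the shell markup are assumptions) of the decision described above: serve pre-rendered HTML when an SSR service is configured and reachable, otherwise emit a basic shell that loads the Vue.js modules for client-side rendering.

```python
from typing import Callable, Optional

# Hypothetical placeholder for the client-side shell: the browser loads
# the Vue.js modules and renders the interface itself.
CLIENT_SIDE_SHELL = '<div id="wb-termbox"></div><script src="termbox.vue.js"></script>'

def render_termbox(ssr_service: Optional[Callable[[], str]]) -> str:
    """Return termbox HTML, preferring server-side rendering when available."""
    if ssr_service is not None:
        try:
            # Pre-rendered HTML from the SSR service; Vue hydrates it client-side.
            return ssr_service()
        except Exception:
            # SSR service unreachable/broken: fall through to the shell.
            pass
    return CLIENT_SIDE_SHELL
```

Usage: `render_termbox(None)` (no SSR configured) yields the client-side shell, while `render_termbox(lambda: "<p>pre-rendered</p>")` returns the service's output directly.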
Dec 18 2018
and LabelsProviderEntityIdHtmlLinkFormatter indeed seems to be used more broadly than ItemIdHtmlLinkFormatter, so it might be better to be careful about jumping to conclusions too early (I mean conclusions about merging those two classes)
It is indeed not only a config change.
I tend to agree, but I'd like to discuss with you further how much of a difference this really makes in the code.
Trying to take a step back and think about how the functionality in question was implemented/outlined two years ago, here are my thoughts:
- Having two not-necessarily-identical configs flying around, with some parts of the code arbitrarily picking one config and other parts picking the other, seems like a bug/unfinished implementation to me. I am surprised it only surfaces now (it has been 99% me who messed that up), but it should be fixed. If the current "broken" state actually makes Commons work, the fixing schedule can of course be postponed :)
- It looks to me like the Commons-Wikidata federation is a bit of a special case of federation as it was envisioned as a general concept. There is no need to have both local and Wikidata items (which is what most non-Wikimedia Wikibase instances request in the context of federation). "Funnily" enough, the "typical" federation is nowhere in use due to the current implementation's limitations, so the only real use case is this special/reverse one.
- I don't claim to have a thorough understanding of the Commons issues now, but it seems to me that those two kinds of federation, i.e. the one intending to have different entity types in different repos, possibly with e.g. items from multiple repos, and the one where it is clear that some entity types come from repo A and some from repo B, are actually separate things; they don't really overlap. The former requires and is based on the concept of prefixes (to be able to distinguish between different sources of items), whereas the latter could actually do without prefixes at all. Both make sense as separate approaches (the former for non-Wikimedia Wikibases, the latter for Commons, for instance). I am not aware of any practical or planned instance where mixing both concepts would actually be needed. Therefore I would strongly encourage NOT mixing both approaches in the implementation and NOT creating a super-generic federation where everything can be done with some config magic. The existing code is already overly complicated; let's at least not make it worse.
Dec 17 2018
I've checked both affected instances and confirmed they're no longer in use and not needed. Hence, I've deleted them.
Apologies for taking so many weeks over such an easy task.
Dec 14 2018
I believe the point of running those not per commit but, say, once a day was raised as a counterweight to the possible concern that additional browser tests would significantly slow down per-change CI jobs.
If that concern no longer applies at this point, I guess there is no point in building additional browser-test infrastructure.
Dec 12 2018
Dec 10 2018
Gerrit project created: https://gerrit.wikimedia.org/r/#/admin/projects/wikibase/termbox
NOT to be used just yet. The code is falling behind the WMDE GitHub repo, and no CI is set up yet.
As Greta's manager I endorse this request.