Search Open Tasks
    • Task
    Cloud VPS Project Tested:
    Site/Location: codfw
    Number of systems: 1
    Service: kube-apiserver and etcd
    Networking Requirements: internal
    Processor Requirements: 4
    Memory: 5GB
    Disks: 30GB
    Other Requirements: No DRBD
    • Task
    [MinT](https://www.mediawiki.org/wiki/MinT) supports machine translation for different products. In the initial integrations of MinT to support Content Translation and the Localization infrastructure, the contents requested for translation are largely unique. That is, an editor creating a new translation for a Wikipedia article will generate requests for MinT to translate the different sentences of that article, which will be very different from those generated by another editor working on a different article. In this context, there was no need for caching. However, as MinT is applied in contexts where multiple users may be reading the same translated content, caching may provide a greater benefit by reducing waiting times and the general workload for the service. This may be relevant for contexts such as the translation of wishes (T363306) for the new Wishlist process and MinT for Wikipedia readers (T359072). This ticket proposes to consider ways in which the MinT service could use caching to provide better performance in the described contexts. Given a certain request (text, language pair and model), the result should come from the caching system if the exact same request was made previously/recently. We need to consider that language models are not updated frequently, but we also plan to allow users to provide community-verified translations (T351748), and the caching system should not get in the way of users being able to correct a bad translation (i.e., it should not take long for changes to be reflected). A minimal sketch of a possible cache-key scheme is shown below.
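    To make the proposal concrete, here is a minimal sketch (not the MinT implementation) of a cache keyed by source text, language pair and model, with a TTL so that corrected or community-verified translations eventually show up. The `translate_uncached` callable and the TTL value are assumptions for illustration.
```lang=python
# Minimal sketch of a cache keyed by (source text, language pair, model).
# Assumptions: `translate_uncached` is a hypothetical call to the MinT backend;
# the TTL is a placeholder so corrected translations are not stale forever.
import hashlib
import time

_CACHE = {}          # key -> (timestamp, translation)
_TTL_SECONDS = 3600  # placeholder; tune so corrections show up reasonably fast


def _cache_key(text: str, source_lang: str, target_lang: str, model: str) -> str:
    raw = "\x1f".join([model, source_lang, target_lang, text])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()


def translate(text, source_lang, target_lang, model, translate_uncached):
    key = _cache_key(text, source_lang, target_lang, model)
    hit = _CACHE.get(key)
    if hit and time.time() - hit[0] < _TTL_SECONDS:
        return hit[1]
    result = translate_uncached(text, source_lang, target_lang, model)
    _CACHE[key] = (time.time(), result)
    return result
```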
    • Task
    In order to test migrating the wikikube control planes to hardware nodes co-located with etcd, we should migrate the wikikube staging clusters from the current setup of 2 control-plane VMs + 3 etcd VMs to 3 VMs that co-locate kube-apiserver and etcd. This will be useful for testing the new puppet roles as well as the procedure, and will ensure that the staging clusters are set up the same way as prod (which is required so we can catch errors early during future k8s upgrades etc.).
    [] Request a new VM (4 CPU/5 GB) in codfw (currently etcd runs on 2 CPU/3 GB and apiserver on 1 CPU/3 GB) without DRBD
    [] Create a new puppet role `role::kubernetes::staging::master_stacked`
    [] Configure the new instance with that role so that it joins the existing etcd and k8s cluster
    [] Repeat the above for two additional new VMs
    [] Remove the old control planes and etcd nodes (delete VMs etc.)
    [] Repeat the process for staging-eqiad
    • Task
    This #epic ticket compiles the different efforts needed for translation in the new approach to support the Community Wishlist process. These include the translation of the wish pages using machine translation when community translations are not available (T361514), the localization of the wish form (T361512), the automatic translation of discussions (T295862), and other initiatives captured [in the sub-tasks](https://phabricator.wikimedia.org/maniphest/?parentIDs=363306#R).
    • Task
    ## Background
    There are no guidelines for the `External link`, so we need to document the External link in:
    - [[ https://doc.wikimedia.org/codex/latest/components/mixins/link.html#types | Link > Types ]]
    - [[ https://doc.wikimedia.org/codex/latest/components/mixins/link.html#best-practices | Link > Best practices ]]
    ### Acceptance criteria
    [] Include the External link guidelines in:
      [] Link > Types
      [] Link > Best practices
    • Task
    We need to create OpenSearch saved views for each Superset deployment, to get easy access to their logs, the same way we did for the Spark History service (https://logstash.wikimedia.org/goto/bb9f9bca1c6f3cce40acb2b86d306a77) and for [[ https://logstash.wikimedia.org/app/dashboards#/view/5ab748b0-d228-11ee-985c-97a00bd32564?_g=(filters:!(),refreshInterval:(pause:!t,value:0),time:(from:now-120m,to:now))&_a=h@476e85d | superset ]].
    • Task
```
import pywikibot

asite = pywikibot.Site('de')
#asite.login()
page = pywikibot.Page(asite, "Wien")
print(page.title())  # works
print(page.coordinates(primary_only=True))  # fails with
# WARNING: API error mwoauth-invalid-authorization-invalid-user: The authorization headers in your request are for a user that does not exist here
# NoUsernameError: Failed OAuth authentication for wikipedia:test: The authorization headers in your request are for a user that does not exist here
```
    Why can I access some properties (like title()) and get an error when accessing coordinates()? With my config I can access wikipedia, commons and wikidata without problems, but coordinates() fails.
    Full stacktrace:
```
--------------------------------------------------------------------------- NoUsernameError Traceback (most recent call last) Cell In[60], line 6 4 page = pywikibot.Page(asite, "Wien") 5 print (page.title()) # works ----> 6 print (page.coordinates(primary_only=True)) # fails with 7 # WARNING: API error mwoauth-invalid-authorization-invalid-user: The authorization headers in your request are for a user that does not exist here 8 # NoUsernameError: Failed OAuth authentication for wikipedia:test: The authorization headers in your request are for a user that does not exist here 9 (...) 13 #api.loadpageinfo(page) 14 #page.loadpageprops() File /srv/paws/lib/python3.10/site-packages/pywikibot/page/_basepage.py:1673, in BasePage.coordinates(self, primary_only) 1671 if not hasattr(self, '_coords'): 1672 self._coords = [] -> 1673 self.site.loadcoordinfo(self) 1674 if primary_only: 1675 for coord in self._coords: File /srv/paws/lib/python3.10/site-packages/pywikibot/site/_decorators.py:58, in need_extension.<locals>.decorator.<locals>.callee(self, *args, **kwargs) 54 if not self.has_extension(extension): 55 raise UnknownExtensionError( 56 'Method "{}" is not implemented without the extension {}' 57 .format(fn.__name__, extension)) ---> 58 return fn(self, *args, **kwargs) File /srv/paws/lib/python3.10/site-packages/pywikibot/site/_extensions.py:183, in GeoDataMixin.loadcoordinfo(self, page) 175 title = page.title(with_section=False) 176 query = self._generator(api.PropertyGenerator, 177 type_arg='coordinates', 178 titles=title.encode(self.encoding()), (...) 
181 'globe'], 182 coprimary='all') --> 183 self._update_page(page, query) File /srv/paws/lib/python3.10/site-packages/pywikibot/site/_apisite.py:1438, in APISite._update_page(self, page, query, verify_imageinfo) 1435 if not self.sametitle(pageitem['title'], 1436 page.title(with_section=False)): 1437 raise InconsistentTitleError(page, pageitem['title']) -> 1438 api.update_page(page, pageitem, query.props) 1440 if verify_imageinfo and 'imageinfo' not in pageitem: 1441 if 'missing' in pageitem: File /srv/paws/lib/python3.10/site-packages/pywikibot/data/api/_generators.py:1021, in update_page(page, pagedict, props) 1018 page._langlinks = set() 1020 if 'coordinates' in pagedict: -> 1021 _update_coordinates(page, pagedict['coordinates']) 1023 if 'pageimage' in pagedict: 1024 page._pageimage = pywikibot.FilePage(page.site, pagedict['pageimage']) File /srv/paws/lib/python3.10/site-packages/pywikibot/data/api/_generators.py:950, in _update_coordinates(page, coordinates) 948 coords = [] 949 for co in coordinates: --> 950 coord = pywikibot.Coordinate(lat=co['lat'], 951 lon=co['lon'], 952 typ=co.get('type', ''), 953 name=co.get('name', ''), 954 dim=int(co.get('dim', 0)) or None, 955 globe=co['globe'], # See [[gerrit:67886]] 956 primary='primary' in co 957 ) 958 coords.append(coord) 959 page._coords = coords File /srv/paws/lib/python3.10/site-packages/pywikibot/_wbtypes.py:129, in Coordinate.__init__(self, lat, lon, alt, precision, globe, typ, name, dim, site, globe_item, primary) 127 self.name = name 128 self._dim = dim --> 129 self.site = site or pywikibot.Site().data_repository() 130 self.primary = primary 132 if globe: File /srv/paws/lib/python3.10/site-packages/pywikibot/__init__.py:243, in Site(code, fam, user, interface, url) 241 key = f'{interface.__name__}:{fam}:{code}:{user}' 242 if key not in _sites or not isinstance(_sites[key], interface): --> 243 _sites[key] = interface(code=code, fam=fam, user=user) 244 debug(f"Instantiated {interface.__name__} object '{_sites[key]}'") 246 if _sites[key].code != code: File /srv/paws/lib/python3.10/site-packages/pywikibot/site/_apisite.py:140, in APISite.__init__(self, code, fam, user) 138 self._loginstatus = login.LoginStatus.NOT_ATTEMPTED 139 with suppress(SiteDefinitionError): --> 140 self.login(cookie_only=True) File /srv/paws/lib/python3.10/site-packages/pywikibot/site/_apisite.py:400, in APISite.login(self, autocreate, user, cookie_only) 398 try: 399 del self.userinfo # force reload --> 400 if self.userinfo['name'] == self.user(): 401 self._loginstatus = login.LoginStatus.AS_USER 402 return File /srv/paws/lib/python3.10/site-packages/pywikibot/site/_apisite.py:668, in APISite.userinfo(self) 661 if not hasattr(self, '_userinfo'): 662 uirequest = self.simple_request( 663 action='query', 664 meta='userinfo', 665 uiprop='blockinfo|hasmsg|groups|rights|ratelimits', 666 formatversion=2, 667 ) --> 668 uidata = uirequest.submit() 669 assert 'query' in uidata, \ 670 "API userinfo response lacks 'query' key" 671 assert 'userinfo' in uidata['query'], \ 672 "API userinfo response lacks 'userinfo' key" File /srv/paws/lib/python3.10/site-packages/pywikibot/data/api/_requests.py:1087, in Request.submit(self) 1085 pywikibot.error(f'Retrying failed {msg}') 1086 continue -> 1087 raise NoUsernameError(f'Failed {msg}') 1089 if code == 'cirrussearch-too-busy-error': # T170647 1090 self.wait() NoUsernameError: Failed OAuth authentication for wikipedia:test: The authorization headers in your request are for a user that does not exist here ```
    • Task
    Set up alerts for when the pods fail or are not running, similar to [[ https://wikitech.wikimedia.org/wiki/Analytics/Systems/Superset/Administration | Analytics/Systems/Superset/Administration ]].
    • Task
    We need to move the helmfile deployments to `dse-k8s-services`.
    • Task
    Redirect the datahub traffic to the DSE-k8s cluster
    • Task
    During the move we need to create the `datahub` namespace on dse-k8s
    • Task
    There are a number of ACLs, rate-limits, quotas, and network policies on the production wiki endpoints, to protect the services from misbehaving clients, or single clients consuming too many resources. Some of these policies are IP-based. Some of these rules, policies and constraints are documented here as seen from the Toolforge side: https://wikitech.wikimedia.org/wiki/Help:Toolforge#Constraints_of_Toolforge
    To mitigate Cloud VPS and Toolforge hitting the limits, we have an exemption mechanism from the general Cloud VPS egress NAT for wiki endpoints. As of this writing, all traffic leaving Toolforge to establish a connection with production wiki endpoints is exempt from this general Cloud VPS egress NAT. Unfortunately, such wiki endpoints could potentially still see many tools from a single source IP address: the Toolforge k8s worker node address.
    We have seen Toolforge tools consuming the wiki endpoints' quotas in different ways. In some cases, in ways that would prevent or limit what other "neighbor" tools running on the same Toolforge k8s worker node could do network-wise with the wiki endpoints.
    This ticket is to explore whether we could introduce some egress network quotas, to limit what a single tool can consume from the wiki endpoints, things like:
    * number of concurrent open connections
    * bandwidth
    Some examples of the problems described here:
    * {T308931}
    * {T329327}
    * {T356164}
    * {T356163}
    * {T356160}
    • Task
    The LiftWing model-server repo has a [[ https://github.com/wikimedia/machinelearning-liftwing-inference-services/blob/main/Makefile | Makefile ]] that makes building and running model-servers locally much easier. In this task, we will update the existing Makefile to support building and running the [[ https://github.com/wikimedia/machinelearning-liftwing-inference-services/tree/main/logo_detection | logo-detection ]] model-server.
    • Task
    **Goal**
    - Surface metric numbers from scraper data
    - Think about how we support self-serve access to scraper data
    **Steps**
    - Retrieve the data from the results
    - Document instructions on how the data can be found/extracted (see the sketch after this list)
    **Metrics**: should be retrievable from current scraper results
    [] # of duplicate (identical) refs in a given wiki
    [] # of articles with at least one identical ref
    [] # of articles with more than 25 refs and at least one identical reference
    [] proportion of duplicate refs in articles with >25 refs vs. proportion of duplicates in articles with <25 refs, split by wiki
    - Assumption: longer reference lists have more duplicates because they are hard to find and manage
    [] # of articles without references
    [] ratio of references to paragraphs per wiki **(TBD: can we even do that without a code change to the scraper and a re-run?)**
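    As a rough illustration of the self-serve idea, the sketch below computes per-wiki duplicate-reference counts from scraper output. It assumes a hypothetical JSON-lines format with `wiki` and `references` fields, which may not match the real scraper schema.
```lang=python
# Illustrative only: assumes scraper results as JSON lines, one object per
# article, with hypothetical fields "wiki" and "references" (a list of
# rendered reference strings). The real output format may differ.
import json
from collections import Counter, defaultdict


def duplicate_ref_metrics(path):
    per_wiki = defaultdict(lambda: {"dup_refs": 0, "articles_with_dup": 0, "articles": 0})
    with open(path, encoding="utf-8") as f:
        for line in f:
            article = json.loads(line)
            refs = article.get("references", [])
            counts = Counter(refs)
            dups = sum(n - 1 for n in counts.values() if n > 1)
            stats = per_wiki[article["wiki"]]
            stats["articles"] += 1
            stats["dup_refs"] += dups
            if dups:
                stats["articles_with_dup"] += 1
    return dict(per_wiki)


# Example usage (hypothetical path):
# print(duplicate_ref_metrics("scraper_results.jsonl"))
```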
    • Task
    **Validate Assumption**
    The #citoid extension is used by most users of Visual Editor and the 2017 Wikitext Editor to insert references. We have seen this in multiple discovery interviews; however, more quantitative data is required to validate this.
    **Steps**
    - Check if data already exists in [[ https://www.mediawiki.org/wiki/VisualEditor/FeatureUse_data_dictionary | VisualEditor/FeatureUse_data_dictionary ]] and document the keys
    - If not, implement instrumentation using `ve.track` and document the keys
    **Questions and Metrics**
    [] How often is automatic reference generation used across sessions, broken down by #visualeditor-mediawiki-2017wikitexteditor and #visualeditor?
    - e.g. 20% of all sessions have used Citoid at least once
    [] How often do users interact with Citoid successfully? Defined by adding a reference using the reference generation.
    [] How often do users fail to add an automatically generated reference? Defined by interacting with the input field and then aborting the dialog.
    [] How often and in which situations does the tool fail to retrieve results? Defined by how often the error message is triggered.
    - Error message: We couldn't make a citation for you. You can create one manually using the "Manual" tab above.
    • Task
    Dear Wikipedia team,
    First of all, I am writing to you as a representative from UIN Sunan Ampel Surabaya, an academic institution located in Surabaya, Indonesia. We have a plan to utilize Wikipedia as a valuable source of information for the public. As part of this plan, we require at least one account for each lecturer and staff member at our university. However, we have encountered difficulties in creating multiple accounts due to the limitations imposed by the existing API. Therefore, we would like to request a whitelist for our IPs, allowing us to proceed with our mass account creation program. Below, you will find the details of our request:
    Event Title: Mass Account Creation at Wikipedia
    Date: 25th April 2024 until 25th August 2024
    Wikis: id.wikipedia & en.wikipedia
    Number of Participants: 1000 accounts
    Public IPv4 Addresses:
    IP-1: 103.211.49.0/24
    IP-2: 103.165.213.112/29
    IP-3: 36.66.204.104/29
    IP-4: 43.218.32.106
    Our Domain: uinsa.ac.id
    Thank you for considering our request. We believe that by expanding our access to Wikipedia, we can contribute to the dissemination of knowledge and enhance the learning experience for our students and the wider community.
    Best Regards,
    Fariq Maulana
    IT Department
    UIN Sunan Ampel, Surabaya, Indonesia
    • Task
    Gerrit reviewer bot should add reviewers as CC instead of actual reviewers, so that the change doesn't get continually marked with the attention flag for them on every new patchset. This is particularly irksome with people who added themselves to the list of reviewers long ago and are no longer active, and leads to Gerrit dashboards being less useful, because they can't rely on the attention flags to tell who is required to take action next.
    * https://www.mediawiki.org/wiki/Git/Reviewers
    * https://github.com/valhallasw/gerrit-reviewer-bot
    * https://toolsadmin.wikimedia.org/tools/id/gerrit-reviewer-bot
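    For reference, a hedged sketch of how an account could be added as CC through Gerrit's REST API (a ReviewerInput with `state=CC`). This is not the bot's actual code; the host, credentials and change number are placeholders.
```lang=python
# Hedged sketch: add an account to a change's CC list instead of its reviewer
# list, via Gerrit's REST API. Placeholder host/credentials/identifiers.
import requests

GERRIT = "https://gerrit.wikimedia.org/r"
AUTH = ("bot-username", "http-password")  # placeholder credentials


def add_cc(change_id: str, account: str) -> None:
    # POST /changes/{change-id}/reviewers with state=CC adds the account as CC.
    resp = requests.post(
        f"{GERRIT}/a/changes/{change_id}/reviewers",
        json={"reviewer": account, "state": "CC"},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()


# add_cc("123456", "example.reviewer@wikimedia.org")
```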
    • Task
    jenkins-bot should clear the attention flag for the patch uploader after a new patchset passes tests (if it would vote V+1), if it had set the flag after a previous patchset failed tests (if it voted V-1). As a patch uploader, removing that flag yourself is tedious, especially if you contribute to a variety of repositories and often rely on Jenkins to run your tests instead of setting up everything locally. It's tedious enough that it leads some people to ignore this feature, which they might otherwise find useful. This in turn leads to Gerrit dashboards being less useful, because they can't rely on the attention flags to tell who is required to take action next.
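    A hedged sketch of how the attention-set entry could be cleared via Gerrit's REST API after a passing verification; this is not jenkins-bot's actual code, and all identifiers are placeholders.
```lang=python
# Hedged sketch: remove an account from a change's attention set via Gerrit's
# REST API. Placeholder host/credentials/identifiers.
import requests

GERRIT = "https://gerrit.wikimedia.org/r"
AUTH = ("jenkins-bot", "http-password")  # placeholder credentials


def clear_attention(change_id: str, account: str, reason: str) -> None:
    # POST /changes/{change-id}/attention/{account-id}/delete removes the
    # account from the attention set; "reason" is shown in the change log.
    resp = requests.post(
        f"{GERRIT}/a/changes/{change_id}/attention/{account}/delete",
        json={"reason": reason},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()


# clear_attention("123456", "uploader@wikimedia.org", "Tests passed (V+1)")
```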
    • Task
    7 new wikis were created on 2024-04-23. We should run [the code to add these](https://github.com/wikimedia-research/canonical-data/tree/master/wiki) to the canonical wiki dataset. The sites table isn't updated immediately, so we will have to wait some time for the source data to populate. I'm not sure how long that will take, but it should be done within a week.
    • Task
    This may not really be a bug, but I would appreciate help in investigating it regardless. I have this SQL query which creates some statistics about fawiki users in terms of how many articles they have created and how large the articles were at the time of creation. Please ignore that the query may not be optimized (e.g., I could have used `actor_revision` instead of `actor`, etc.); that's not the question here.
    When I run the query using #superset.wikimedia.org it finishes in about 10 minutes or so. I also have another process where I schedule these queries to run on the Toolforge Jobs framework. The process consists of a python wrapper called [[https://github.com/PersianWikipedia/fawikibot/blob/master/HujiBot/stats.py|stats.py]] which finds each query listed in a different file called [[https://github.com/PersianWikipedia/fawikibot/blob/master/HujiBot/weekly-slow.py|weekly-slow.py]] and runs it. The query in question is the sixth one in `weekly-slow.py` (look for `"sqlnum": 6,` and please ignore that I just updated the query to actually use `actor_revision` .. again, besides the point), and `stats.py` puts a limit on how long these queries can run; the limit is 30 minutes and is enforced using a `SET SESSION MAX_STATEMENT_TIME` command [[https://github.com/PersianWikipedia/fawikibot/blob/master/HujiBot/stats.py#L153|on line 153]].
    The issue: when the python-based process runs, this query times out. In other words, a query that runs in ~10 minutes when executed via Superset takes more than 30 minutes when executed via this python wrapper program. If you are curious, the command for the TFJ job is [[https://github.com/PersianWikipedia/fawikibot/blob/master/HujiBot/tfj/commands.sh|here]] (last line) and the job itself is in [[https://github.com/PersianWikipedia/fawikibot/blob/master/HujiBot/tfj/jobs/weekly-slow.sh|this sh file]], which essentially sets up the python virtual environment and runs the `stats.py` script.
    Is it because I am not specifying a memory limit for my job and it is slow as a result of the small amount of memory assigned? Can this be checked somehow? Are there other ways to investigate why the query runs more slowly through the jobs framework?
```name=the query, lang=sql
SELECT
  actor_name,
  STR_TO_DATE(LEFT(MIN(rev_timestamp), 8), '%Y%m%d'),
  SUM( CASE WHEN rev_len BETWEEN 0 AND 2048 THEN 1 ELSE 0 END ),
  SUM( CASE WHEN rev_len BETWEEN 2048 AND 15 * 1024 THEN 1 ELSE 0 END ),
  SUM( CASE WHEN rev_len BETWEEN 15 * 1024 AND 70 * 1024 THEN 1 ELSE 0 END ),
  SUM( CASE WHEN rev_len > 70 * 1024 THEN 1 ELSE 0 END ),
  COUNT(rev_first) tot
FROM revision_userindex r
JOIN (
  SELECT MIN(rev_id) rev_first, rev_page
  FROM revision
  GROUP BY rev_page
) f ON r.rev_id = f.rev_first
JOIN page ON page_id = r.rev_page
JOIN actor ON rev_actor = actor_id
LEFT JOIN user_groups ON actor_user = ug_user AND ug_group = "bot"
WHERE actor_user <> 0
  AND ug_group IS NULL
  AND page_namespace = 0
  AND page_is_redirect = 0
GROUP BY rev_actor
ORDER BY tot DESC
LIMIT 200
```
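    One way to narrow this down could be to run the exact same query from a minimal Python client with the same session time limit and compare wall-clock times against Superset and the jobs framework. The sketch below is illustrative only; the host, database, credentials file handling and query file name are placeholders, not the stats.py implementation.
```lang=python
# Hedged sketch: run the query once with the same MAX_STATEMENT_TIME that
# stats.py sets, and record how long it takes. Placeholder host/db/file names.
import os
import time
import pymysql

conn = pymysql.connect(
    host="fawiki.analytics.db.svc.wikimedia.cloud",  # placeholder replica host
    database="fawiki_p",
    read_default_file=os.path.expanduser("~/replica.my.cnf"),
    charset="utf8mb4",
)

with conn.cursor() as cur:
    cur.execute("SET SESSION MAX_STATEMENT_TIME = 1800")  # 30 minutes, as in stats.py
    start = time.monotonic()
    with open("query6.sql") as f:  # hypothetical file holding the query text
        cur.execute(f.read())
    rows = cur.fetchall()
    print(f"{len(rows)} rows in {time.monotonic() - start:.1f}s")
conn.close()
```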
    • Task
    **Feature summary** (what you would like to be able to do and where):
    * the discussiontoolsedit API should have the option `noautosignature`
    * defaults to false
    * when set to true, prevents DiscussionTools from auto-adding a signature
    **Use case(s)** (list the steps that you performed to discover that problem, and describe the actual underlying problem which you want to solve. Do not describe only a solution):
    * prevent over-signing, for example [[ https://en.wikipedia.org/w/index.php?title=User_talk:Saanvi_Samanta&oldid=1220439649 | here ]] or [[ https://en.wikipedia.org/w/index.php?title=User_talk:1882claret&oldid=1220429426 | here ]]
    * I chose to use the discussiontoolsedit API instead of the edit API in the examples above because discussiontoolsedit makes it easy to subscribe to the section, but the third signature is undesirable
    * there are many repos where I plan to convert the edit API to the discussiontoolsedit API so that I can give users the option of auto-subscribing, including the [[ https://github.com/wikimedia-gadgets/afc-helper/pull/337 | AFC helper script ]], [[ https://github.com/wikimedia-gadgets/twinkle/issues/1848 | Twinkle ]], and PageTriage (T329346)
    **Benefits** (why should this be implemented?):
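    A hedged sketch of what a request with the proposed parameter could look like. `noautosignature` does not exist today (it is exactly what this task asks for), and the token handling and endpoint are simplified placeholders.
```lang=python
# Hedged sketch of the *proposed* parameter in use. "noautosignature" is not
# an existing parameter; it is the feature requested in this task. Token
# handling is simplified and the target page is a placeholder.
import requests

API = "https://en.wikipedia.org/w/api.php"
session = requests.Session()
csrf_token = "..."  # obtained via action=query&meta=tokens in a real client

resp = session.post(API, data={
    "action": "discussiontoolsedit",
    "paction": "addtopic",
    "page": "User talk:Example",
    "sectiontitle": "Notice",
    "wikitext": "Message text. ~~~~",  # signature already included by the caller
    "noautosignature": 1,              # proposed option: do not append another signature
    "token": csrf_token,
    "format": "json",
})
print(resp.json())
```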
    • Task
    **Note**: the reproduction steps for this issue are somewhat incomplete. Some steps may be redundant or missing. **Steps to replicate the issue** (include links if applicable): * Have a MW environment using SQLite and CentralAuth installed * Run the [[https://gerrit.wikimedia.org/g/mediawiki/extensions/CampaignEvents/+/3c2c1084e64f23263604421b2cbf1008717350fc/tests/selenium/specs/editEventRegistration.js | editEventRegistration.js]] selenium spec in CampaignEvents (`npm run selenium-test -- --spec tests/selenium/specs/editEventRegistration.js`) **What happens?**: It goes to the login page, submits the form, and then it hangs forever. **What should have happened instead?**: It should login normally. **Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia): master **Other information** (browser name/version, screenshots, etc.): The following lines get written to the log file when the test times out: ```name=mw-dberror.log,lines=12 Wed Apr 24 0:42:49 UTC 2024 4133e281f1a3 my_wiki Error 5 from Wikimedia\Rdbms\Database::beginIfImplied (MessageCache::loadFromDB(en)-big), database is locked BEGIN IMMEDIATE localhost #0 /var/www/html/w/includes/libs/rdbms/database/Database.php(1161): Wikimedia\Rdbms\Database->getQueryExceptionAndLog() #1 /var/www/html/w/includes/libs/rdbms/database/Database.php(652): Wikimedia\Rdbms\Database->reportQueryError() #2 /var/www/html/w/includes/libs/rdbms/database/DatabaseSqlite.php(660): Wikimedia\Rdbms\Database->query() #3 /var/www/html/w/includes/libs/rdbms/database/Database.php(2334): Wikimedia\Rdbms\DatabaseSqlite->doBegin() #4 /var/www/html/w/includes/libs/rdbms/database/Database.php(942): Wikimedia\Rdbms\Database->begin() #5 /var/www/html/w/includes/libs/rdbms/database/Database.php(711): Wikimedia\Rdbms\Database->beginIfImplied() #6 /var/www/html/w/includes/libs/rdbms/database/Database.php(643): Wikimedia\Rdbms\Database->executeQuery() #7 /var/www/html/w/includes/libs/rdbms/database/Database.php(1350): Wikimedia\Rdbms\Database->query() #8 /var/www/html/w/includes/libs/rdbms/database/DBConnRef.php(126): Wikimedia\Rdbms\Database->select() #9 /var/www/html/w/includes/libs/rdbms/database/DBConnRef.php(358): Wikimedia\Rdbms\DBConnRef->__call() #10 /var/www/html/w/includes/libs/rdbms/querybuilder/SelectQueryBuilder.php(730): Wikimedia\Rdbms\DBConnRef->select() #11 /var/www/html/w/includes/language/MessageCache.php(622): Wikimedia\Rdbms\SelectQueryBuilder->fetchResultSet() #12 /var/www/html/w/includes/language/MessageCache.php(511): MessageCache->loadFromDB() #13 /var/www/html/w/includes/language/MessageCache.php(428): MessageCache->loadFromDBWithMainLock() #14 /var/www/html/w/includes/language/MessageCache.php(348): MessageCache->loadUnguarded() #15 /var/www/html/w/includes/language/MessageCache.php(1317): MessageCache->load() #16 /var/www/html/w/includes/language/MessageCache.php(1222): MessageCache->getMsgFromNamespace() #17 /var/www/html/w/includes/language/MessageCache.php(1193): MessageCache->getMessageForLang() #18 /var/www/html/w/includes/language/MessageCache.php(1091): MessageCache->getMessageFromFallbackChain() #19 /var/www/html/w/includes/Message/Message.php(1536): MessageCache->get() #20 /var/www/html/w/includes/Message/Message.php(1024): MediaWiki\Message\Message->fetchMessage() #21 /var/www/html/w/includes/Message/Message.php(1111): MediaWiki\Message\Message->format() #22 /var/www/html/w/includes/language/Language.php(715): MediaWiki\Message\Message->text() #23 /var/www/html/w/includes/language/Language.php(4293): 
Language->getMessageFromDB() #24 /var/www/html/w/includes/language/Language.php(4310): Language->formatComputingNumbers() #25 /var/www/html/w/includes/debug/MWDebug.php(771): Language->formatSize() #26 /var/www/html/w/includes/debug/MWDebug.php(724): MWDebug::getDebugInfo() #27 /var/www/html/w/includes/api/ApiMain.php(1954): MWDebug::appendDebugInfoToApiResult() #28 /var/www/html/w/includes/api/ApiMain.php(924): ApiMain->executeAction() #29 /var/www/html/w/includes/api/ApiMain.php(895): ApiMain->executeActionWithErrorHandling() #30 /var/www/html/w/includes/api/ApiEntryPoint.php(158): ApiMain->execute() #31 /var/www/html/w/includes/MediaWikiEntryPoint.php(199): MediaWiki\Api\ApiEntryPoint->execute() #32 /var/www/html/w/api.php(44): MediaWiki\MediaWikiEntryPoint->run() #33 {main} ``` ```name=mw-debug-www.log,lines=12 Start request GET /w/index.php?title=Special%3AUserLogin [... various request headers...] ParserFactory: using default preprocessor [CentralAuth] Loading state for global user 172.18.0.1 from DB [CentralAuth] Loading attached wiki list for global user 172.18.0.1 from DB [CentralAuth] Loading groups for global user 172.18.0.1 [CentralAuth] Loading CentralAuthUser for user 172.18.0.1 from cache object [cookie] setcookie: "my_wiki_session", "5rels4rvgvgguu9p2auuq1i0ho4fperv", "0", "/", "", "", "1", "" [cookie] already deleted setcookie: "my_wikiUserID", "", "1682383307", "/", "", "", "1", "" [cookie] already deleted setcookie: "my_wikiToken", "", "1682383307", "/", "", "", "1", "" [cookie] already set setcookie: "my_wiki_session", "5rels4rvgvgguu9p2auuq1i0ho4fperv", "0", "/", "", "", "1", "" [cookie] already deleted setcookie: "my_wikiUserID", "", "1682383307", "/", "", "", "1", "" [cookie] already deleted setcookie: "my_wikiToken", "", "1682383307", "/", "", "", "1", "" [BlockManager] Block cache miss with key BlockCacheKey{request=#413,user=#642,replica} MediaWiki\MediaWikiEntryPoint::commitMainTransaction: primary transaction round committed MediaWiki\MediaWikiEntryPoint::commitMainTransaction: pre-send deferred updates completed MediaWiki\MediaWikiEntryPoint::commitMainTransaction: session changes committed [BlockManager] Block cache hit with key BlockCacheKey{request=#413,user=#642,replica} [gitinfo] Candidate cacheFile=/var/www/html/w/gitinfo.json for /var/www/html/w [gitinfo] Cache incomplete for /var/www/html/w MediaWiki\Output\OutputPage::sendCacheControl: no caching ** Request ended normally Start request POST /wiki/Special:UserLogin [... various request headers...] ParserFactory: using default preprocessor Start request POST /w/api.php?format=json [... various request headers...] 
[cookie] already deleted setcookie: "my_wiki_session", "", "1682383309", "/", "", "", "1", "" [cookie] already deleted setcookie: "my_wikiUserID", "", "1682383309", "/", "", "", "1", "" [cookie] already deleted setcookie: "my_wikiToken", "", "1682383309", "/", "", "", "1", "" ParserFactory: using default preprocessor [CentralAuth] Loading state for global user 172.18.0.1 from DB [CentralAuth] Loading attached wiki list for global user 172.18.0.1 from DB [CentralAuth] Loading groups for global user 172.18.0.1 [CentralAuth] Loading CentralAuthUser for user 172.18.0.1 from cache object [cookie] setcookie: "my_wiki_session", "ebpiinuidmef085smeuopca1qvph9ver", "0", "/", "", "", "1", "" [cookie] already deleted setcookie: "my_wikiUserID", "", "1682383309", "/", "", "", "1", "" [cookie] already deleted setcookie: "my_wikiToken", "", "1682383309", "/", "", "", "1", "" [cookie] already set setcookie: "my_wiki_session", "ebpiinuidmef085smeuopca1qvph9ver", "0", "/", "", "", "1", "" [cookie] already deleted setcookie: "my_wikiUserID", "", "1682383309", "/", "", "", "1", "" [cookie] already deleted setcookie: "my_wikiToken", "", "1682383309", "/", "", "", "1", "" [BlockManager] Block cache miss with key BlockCacheKey{request=#413,user=#665,replica} [authevents] Login attempt [gitinfo] Candidate cacheFile=/var/www/html/w/gitinfo.json for /var/www/html/w [gitinfo] Cache incomplete for /var/www/html/w User: cache miss for user 1 [CentralAuth] Loading state for global user Admin from DB ApiMain::setCacheMode: setting cache mode private [cache-cookies] Cookies set on /w/api.php?format=json with Cache-Control "<not set>" [CentralAuth] Loading attached wiki list for global user Admin from DB [CentralAuth] Loading groups for global user Admin [CentralAuth] Loading CentralAuthUser for user Admin from cache object User: cache miss for user 1 [CentralAuth] authentication for 'Admin' succeeded [authentication] Primary login with MediaWiki\Extension\CentralAuth\CentralAuthPrimaryAuthenticationProvider succeeded [authentication] Login for Admin succeeded from 172.18.0.1 [cookie] setcookie: "my_wiki_session", "gjrn7ehspf2l9bu6cf89o0pad77605k1", "0", "/", "", "", "1", "" [cookie] setcookie: "my_wikiUserID", "1", "1716511369", "/", "", "", "1", "" [cookie] setcookie: "my_wikiUserName", "Admin", "1716511369", "/", "", "", "1", "" [cookie] already deleted setcookie: "my_wikiToken", "", "1682383369", "/", "", "", "1", "" [cookie] already set setcookie: "my_wiki_session", "gjrn7ehspf2l9bu6cf89o0pad77605k1", "0", "/", "", "", "1", "" [cookie] already set setcookie: "my_wikiUserID", "1", "1716511369", "/", "", "", "1", "" [cookie] already set setcookie: "my_wikiUserName", "Admin", "1716511369", "/", "", "", "1", "" [cookie] already deleted setcookie: "my_wikiToken", "", "1682383369", "/", "", "", "1", "" [cookie] already set setcookie: "my_wiki_session", "gjrn7ehspf2l9bu6cf89o0pad77605k1", "0", "/", "", "", "1", "" [cookie] already set setcookie: "my_wikiUserID", "1", "1716511369", "/", "", "", "1", "" [cookie] already set setcookie: "my_wikiUserName", "Admin", "1716511369", "/", "", "", "1", "" [cookie] already deleted setcookie: "my_wikiToken", "", "1682383369", "/", "", "", "1", "" [cookie] already set setcookie: "my_wiki_session", "gjrn7ehspf2l9bu6cf89o0pad77605k1", "0", "/", "", "", "1", "" [cookie] already set setcookie: "my_wikiUserID", "1", "1716511369", "/", "", "", "1", "" [cookie] already set setcookie: "my_wikiUserName", "Admin", "1716511369", "/", "", "", "1", "" [cookie] already deleted setcookie: 
"my_wikiToken", "", "1682383369", "/", "", "", "1", "" [authevents] Login attempt MediaWiki\MediaWikiEntryPoint::commitMainTransaction: primary transaction round committed MediaWiki\MediaWikiEntryPoint::commitMainTransaction: pre-send deferred updates completed MediaWiki\MediaWikiEntryPoint::commitMainTransaction: session changes committed [BlockManager] Block cache miss with key BlockCacheKey{request=none,user=#1168,replica} MediaWiki\Output\OutputPage::sendCacheControl: no caching ** Request ended normally Start request GET /wiki/Main_Page [... various request headers...] User: cache miss for user 1 [CentralAuth] Loading state for global user Admin from DB [CentralAuth] Loading attached wiki list for global user Admin from DB [CentralAuth] Loading groups for global user Admin [CentralAuth] Loading CentralAuthUser for user Admin from cache object ParserFactory: using default preprocessor Unstubbing $wgLang on call of $wgLang::getDatePreferenceMigrationMap from ParserOptions::initDateFormat Article::generateContentOutput: doing uncached parse MediaWiki\MediaWikiEntryPoint::commitMainTransaction: primary transaction round committed MediaWiki\MediaWikiEntryPoint::commitMainTransaction: pre-send deferred updates completed MediaWiki\MediaWikiEntryPoint::commitMainTransaction: session changes committed MediaWiki\Output\OutputPage::haveCacheVaryCookies: found my_wiki_session [BlockManager] Block cache miss with key BlockCacheKey{request=none,user=#678,replica} [gitinfo] Candidate cacheFile=/var/www/html/w/gitinfo.json for /var/www/html/w [gitinfo] Cache incomplete for /var/www/html/w MediaWiki\Output\OutputPage::sendCacheControl: private caching (config); ** Request ended normally ``` ```name=mw-error.log,lines=12 2024-04-24 00:42:49 4133e281f1a3 my_wiki: [5828c5ad7746ae71241d5012] /w/api.php?format=json Wikimedia\Rdbms\DBUnexpectedError: Cannot execute Wikimedia\Rdbms\Database::runOnTransactionIdleCallbacks critical section while session state is out of sync. 
A critical section from Wikimedia\Rdbms\Database::executeQuery has failed #0 /var/www/html/w/vendor/wikimedia/request-timeout/src/CriticalSectionScope.php(44): Wikimedia\Rdbms\Database->Wikimedia\Rdbms\{closure}() #1 /var/www/html/w/includes/libs/rdbms/database/Database.php(643): Wikimedia\RequestTimeout\CriticalSectionScope->__destruct() #2 /var/www/html/w/includes/libs/rdbms/database/Database.php(1350): Wikimedia\Rdbms\Database->query() #3 /var/www/html/w/includes/libs/rdbms/database/DBConnRef.php(126): Wikimedia\Rdbms\Database->select() #4 /var/www/html/w/includes/libs/rdbms/database/DBConnRef.php(358): Wikimedia\Rdbms\DBConnRef->__call() #5 /var/www/html/w/includes/libs/rdbms/querybuilder/SelectQueryBuilder.php(730): Wikimedia\Rdbms\DBConnRef->select() #6 /var/www/html/w/includes/language/MessageCache.php(622): Wikimedia\Rdbms\SelectQueryBuilder->fetchResultSet() #7 /var/www/html/w/includes/language/MessageCache.php(511): MessageCache->loadFromDB() #8 /var/www/html/w/includes/language/MessageCache.php(428): MessageCache->loadFromDBWithMainLock() #9 /var/www/html/w/includes/language/MessageCache.php(348): MessageCache->loadUnguarded() #10 /var/www/html/w/includes/language/MessageCache.php(1317): MessageCache->load() #11 /var/www/html/w/includes/language/MessageCache.php(1222): MessageCache->getMsgFromNamespace() #12 /var/www/html/w/includes/language/MessageCache.php(1193): MessageCache->getMessageForLang() #13 /var/www/html/w/includes/language/MessageCache.php(1091): MessageCache->getMessageFromFallbackChain() #14 /var/www/html/w/includes/Message/Message.php(1536): MessageCache->get() #15 /var/www/html/w/includes/Message/Message.php(1024): MediaWiki\Message\Message->fetchMessage() #16 /var/www/html/w/includes/Message/Message.php(1111): MediaWiki\Message\Message->format() #17 /var/www/html/w/includes/language/Language.php(715): MediaWiki\Message\Message->text() #18 /var/www/html/w/includes/language/Language.php(4293): Language->getMessageFromDB() #19 /var/www/html/w/includes/language/Language.php(4310): Language->formatComputingNumbers() #20 /var/www/html/w/includes/debug/MWDebug.php(771): Language->formatSize() #21 /var/www/html/w/includes/debug/MWDebug.php(724): MWDebug::getDebugInfo() #22 /var/www/html/w/includes/api/ApiMain.php(1954): MWDebug::appendDebugInfoToApiResult() #23 /var/www/html/w/includes/api/ApiMain.php(924): ApiMain->executeAction() #24 /var/www/html/w/includes/api/ApiMain.php(895): ApiMain->executeActionWithErrorHandling() #25 /var/www/html/w/includes/api/ApiEntryPoint.php(158): ApiMain->execute() #26 /var/www/html/w/includes/MediaWikiEntryPoint.php(199): MediaWiki\Api\ApiEntryPoint->execute() #27 /var/www/html/w/api.php(44): MediaWiki\MediaWikiEntryPoint->run() #28 {main} #0 /var/www/html/w/includes/libs/rdbms/database/Database.php(2035): Wikimedia\Rdbms\Database->commenceCriticalSection() #1 /var/www/html/w/includes/libs/rdbms/loadbalancer/LoadBalancer.php(1525): Wikimedia\Rdbms\Database->runOnTransactionIdleCallbacks() #2 /var/www/html/w/includes/libs/rdbms/lbfactory/LBFactory.php(377): Wikimedia\Rdbms\LoadBalancer->runPrimaryTransactionIdleCallbacks() #3 /var/www/html/w/includes/libs/rdbms/lbfactory/LBFactory.php(352): Wikimedia\Rdbms\LBFactory->executePostTransactionCallbacks() #4 /var/www/html/w/includes/exception/MWExceptionHandler.php(168): Wikimedia\Rdbms\LBFactory->rollbackPrimaryChanges() #5 /var/www/html/w/includes/exception/MWExceptionHandler.php(193): MWExceptionHandler::rollbackPrimaryChanges() #6 /var/www/html/w/includes/api/ApiMain.php(977): 
MWExceptionHandler::rollbackPrimaryChangesAndLog() #7 /var/www/html/w/includes/api/ApiMain.php(933): ApiMain->handleException() #8 /var/www/html/w/includes/api/ApiMain.php(895): ApiMain->executeActionWithErrorHandling() #9 /var/www/html/w/includes/api/ApiEntryPoint.php(158): ApiMain->execute() #10 /var/www/html/w/includes/MediaWikiEntryPoint.php(199): MediaWiki\Api\ApiEntryPoint->execute() #11 /var/www/html/w/api.php(44): MediaWiki\MediaWikiEntryPoint->run() #12 {main} 2024-04-24 00:42:49 4133e281f1a3 my_wiki: [5828c5ad7746ae71241d5012] /w/api.php?format=json Wikimedia\Rdbms\DBQueryError: Error 5: database is locked Function: Wikimedia\Rdbms\Database::beginIfImplied (MessageCache::loadFromDB(en)-big) Query: BEGIN IMMEDIATE #0 /var/www/html/w/includes/libs/rdbms/database/Database.php(1187): Wikimedia\Rdbms\Database->getQueryException() #1 /var/www/html/w/includes/libs/rdbms/database/Database.php(1161): Wikimedia\Rdbms\Database->getQueryExceptionAndLog() #2 /var/www/html/w/includes/libs/rdbms/database/Database.php(652): Wikimedia\Rdbms\Database->reportQueryError() #3 /var/www/html/w/includes/libs/rdbms/database/DatabaseSqlite.php(660): Wikimedia\Rdbms\Database->query() #4 /var/www/html/w/includes/libs/rdbms/database/Database.php(2334): Wikimedia\Rdbms\DatabaseSqlite->doBegin() #5 /var/www/html/w/includes/libs/rdbms/database/Database.php(942): Wikimedia\Rdbms\Database->begin() #6 /var/www/html/w/includes/libs/rdbms/database/Database.php(711): Wikimedia\Rdbms\Database->beginIfImplied() #7 /var/www/html/w/includes/libs/rdbms/database/Database.php(643): Wikimedia\Rdbms\Database->executeQuery() #8 /var/www/html/w/includes/libs/rdbms/database/Database.php(1350): Wikimedia\Rdbms\Database->query() #9 /var/www/html/w/includes/libs/rdbms/database/DBConnRef.php(126): Wikimedia\Rdbms\Database->select() #10 /var/www/html/w/includes/libs/rdbms/database/DBConnRef.php(358): Wikimedia\Rdbms\DBConnRef->__call() #11 /var/www/html/w/includes/libs/rdbms/querybuilder/SelectQueryBuilder.php(730): Wikimedia\Rdbms\DBConnRef->select() #12 /var/www/html/w/includes/language/MessageCache.php(622): Wikimedia\Rdbms\SelectQueryBuilder->fetchResultSet() #13 /var/www/html/w/includes/language/MessageCache.php(511): MessageCache->loadFromDB() #14 /var/www/html/w/includes/language/MessageCache.php(428): MessageCache->loadFromDBWithMainLock() #15 /var/www/html/w/includes/language/MessageCache.php(348): MessageCache->loadUnguarded() #16 /var/www/html/w/includes/language/MessageCache.php(1317): MessageCache->load() #17 /var/www/html/w/includes/language/MessageCache.php(1222): MessageCache->getMsgFromNamespace() #18 /var/www/html/w/includes/language/MessageCache.php(1193): MessageCache->getMessageForLang() #19 /var/www/html/w/includes/language/MessageCache.php(1091): MessageCache->getMessageFromFallbackChain() #20 /var/www/html/w/includes/Message/Message.php(1536): MessageCache->get() #21 /var/www/html/w/includes/Message/Message.php(1024): MediaWiki\Message\Message->fetchMessage() #22 /var/www/html/w/includes/Message/Message.php(1111): MediaWiki\Message\Message->format() #23 /var/www/html/w/includes/language/Language.php(715): MediaWiki\Message\Message->text() #24 /var/www/html/w/includes/language/Language.php(4293): Language->getMessageFromDB() #25 /var/www/html/w/includes/language/Language.php(4310): Language->formatComputingNumbers() #26 /var/www/html/w/includes/debug/MWDebug.php(771): Language->formatSize() #27 /var/www/html/w/includes/debug/MWDebug.php(724): MWDebug::getDebugInfo() #28 
/var/www/html/w/includes/api/ApiMain.php(1954): MWDebug::appendDebugInfoToApiResult() #29 /var/www/html/w/includes/api/ApiMain.php(924): ApiMain->executeAction() #30 /var/www/html/w/includes/api/ApiMain.php(895): ApiMain->executeActionWithErrorHandling() #31 /var/www/html/w/includes/api/ApiEntryPoint.php(158): ApiMain->execute() #32 /var/www/html/w/includes/MediaWikiEntryPoint.php(199): MediaWiki\Api\ApiEntryPoint->execute() #33 /var/www/html/w/api.php(44): MediaWiki\MediaWikiEntryPoint->run() #34 {main} 2024-04-24 00:42:49 4133e281f1a3 my_wiki: [5828c5ad7746ae71241d5012] /w/api.php?format=json Wikimedia\Rdbms\DBUnexpectedError: Cannot execute Wikimedia\Rdbms\Database::runOnTransactionIdleCallbacks critical section while session state is out of sync. A critical section from Wikimedia\Rdbms\Database::executeQuery has failed #0 /var/www/html/w/vendor/wikimedia/request-timeout/src/CriticalSectionScope.php(44): Wikimedia\Rdbms\Database->Wikimedia\Rdbms\{closure}() #1 /var/www/html/w/includes/libs/rdbms/database/Database.php(643): Wikimedia\RequestTimeout\CriticalSectionScope->__destruct() #2 /var/www/html/w/includes/libs/rdbms/database/Database.php(1350): Wikimedia\Rdbms\Database->query() #3 /var/www/html/w/includes/libs/rdbms/database/DBConnRef.php(126): Wikimedia\Rdbms\Database->select() #4 /var/www/html/w/includes/libs/rdbms/database/DBConnRef.php(358): Wikimedia\Rdbms\DBConnRef->__call() #5 /var/www/html/w/includes/libs/rdbms/querybuilder/SelectQueryBuilder.php(730): Wikimedia\Rdbms\DBConnRef->select() #6 /var/www/html/w/includes/language/MessageCache.php(622): Wikimedia\Rdbms\SelectQueryBuilder->fetchResultSet() #7 /var/www/html/w/includes/language/MessageCache.php(511): MessageCache->loadFromDB() #8 /var/www/html/w/includes/language/MessageCache.php(428): MessageCache->loadFromDBWithMainLock() #9 /var/www/html/w/includes/language/MessageCache.php(348): MessageCache->loadUnguarded() #10 /var/www/html/w/includes/language/MessageCache.php(1317): MessageCache->load() #11 /var/www/html/w/includes/language/MessageCache.php(1222): MessageCache->getMsgFromNamespace() #12 /var/www/html/w/includes/language/MessageCache.php(1193): MessageCache->getMessageForLang() #13 /var/www/html/w/includes/language/MessageCache.php(1091): MessageCache->getMessageFromFallbackChain() #14 /var/www/html/w/includes/Message/Message.php(1536): MessageCache->get() #15 /var/www/html/w/includes/Message/Message.php(1024): MediaWiki\Message\Message->fetchMessage() #16 /var/www/html/w/includes/Message/Message.php(1111): MediaWiki\Message\Message->format() #17 /var/www/html/w/includes/language/Language.php(715): MediaWiki\Message\Message->text() #18 /var/www/html/w/includes/language/Language.php(4293): Language->getMessageFromDB() #19 /var/www/html/w/includes/language/Language.php(4310): Language->formatComputingNumbers() #20 /var/www/html/w/includes/debug/MWDebug.php(771): Language->formatSize() #21 /var/www/html/w/includes/debug/MWDebug.php(724): MWDebug::getDebugInfo() #22 /var/www/html/w/includes/api/ApiMain.php(1954): MWDebug::appendDebugInfoToApiResult() #23 /var/www/html/w/includes/api/ApiMain.php(924): ApiMain->executeAction() #24 /var/www/html/w/includes/api/ApiMain.php(895): ApiMain->executeActionWithErrorHandling() #25 /var/www/html/w/includes/api/ApiEntryPoint.php(158): ApiMain->execute() #26 /var/www/html/w/includes/MediaWikiEntryPoint.php(199): MediaWiki\Api\ApiEntryPoint->execute() #27 /var/www/html/w/api.php(44): MediaWiki\MediaWikiEntryPoint->run() #28 {main} #0 
/var/www/html/w/includes/libs/rdbms/database/Database.php(2035): Wikimedia\Rdbms\Database->commenceCriticalSection() #1 /var/www/html/w/includes/libs/rdbms/loadbalancer/LoadBalancer.php(1525): Wikimedia\Rdbms\Database->runOnTransactionIdleCallbacks() #2 /var/www/html/w/includes/libs/rdbms/lbfactory/LBFactory.php(377): Wikimedia\Rdbms\LoadBalancer->runPrimaryTransactionIdleCallbacks() #3 /var/www/html/w/includes/libs/rdbms/lbfactory/LBFactory.php(352): Wikimedia\Rdbms\LBFactory->executePostTransactionCallbacks() #4 /var/www/html/w/includes/exception/MWExceptionHandler.php(168): Wikimedia\Rdbms\LBFactory->rollbackPrimaryChanges() #5 /var/www/html/w/includes/exception/MWExceptionHandler.php(193): MWExceptionHandler::rollbackPrimaryChanges() #6 /var/www/html/w/includes/exception/MWExceptionHandler.php(236): MWExceptionHandler::rollbackPrimaryChangesAndLog() #7 /var/www/html/w/includes/MediaWikiEntryPoint.php(221): MWExceptionHandler::handleException() #8 /var/www/html/w/includes/MediaWikiEntryPoint.php(205): MediaWiki\MediaWikiEntryPoint->handleTopLevelError() #9 /var/www/html/w/api.php(44): MediaWiki\MediaWikiEntryPoint->run() #10 {main} 2024-04-24 00:42:49 4133e281f1a3 my_wiki: [5828c5ad7746ae71241d5012] /w/api.php?format=json Wikimedia\Rdbms\DBTransactionError: Transaction round stage must be 'cursory' (not 'within-commit-callbacks') #0 /var/www/html/w/includes/libs/rdbms/lbfactory/LBFactory.php(304): Wikimedia\Rdbms\LBFactory->assertTransactionRoundStage() #1 /var/www/html/w/includes/MediaWikiEntryPoint.php(297): Wikimedia\Rdbms\LBFactory->commitPrimaryChanges() #2 /var/www/html/w/includes/MediaWikiEntryPoint.php(188): MediaWiki\MediaWikiEntryPoint->commitMainTransaction() #3 /var/www/html/w/includes/MediaWikiEntryPoint.php(171): MediaWiki\MediaWikiEntryPoint->doPrepareForOutput() #4 /var/www/html/w/includes/MediaWiki.php(90): MediaWiki\MediaWikiEntryPoint->prepareForOutput() #5 /var/www/html/w/includes/api/ApiMain.php(952): MediaWiki::preOutputCommit() #6 /var/www/html/w/includes/api/ApiMain.php(895): ApiMain->executeActionWithErrorHandling() #7 /var/www/html/w/includes/api/ApiEntryPoint.php(158): ApiMain->execute() #8 /var/www/html/w/includes/MediaWikiEntryPoint.php(199): MediaWiki\Api\ApiEntryPoint->execute() #9 /var/www/html/w/api.php(44): MediaWiki\MediaWikiEntryPoint->run() #10 {main} 2024-04-24 00:42:49 4133e281f1a3 my_wiki: [5828c5ad7746ae71241d5012] /w/api.php?format=json Wikimedia\Rdbms\DBUnexpectedError: Cannot execute Wikimedia\Rdbms\Database::runOnTransactionIdleCallbacks critical section while session state is out of sync. 
A critical section from Wikimedia\Rdbms\Database::executeQuery has failed #0 /var/www/html/w/vendor/wikimedia/request-timeout/src/CriticalSectionScope.php(44): Wikimedia\Rdbms\Database->Wikimedia\Rdbms\{closure}() #1 /var/www/html/w/includes/libs/rdbms/database/Database.php(643): Wikimedia\RequestTimeout\CriticalSectionScope->__destruct() #2 /var/www/html/w/includes/libs/rdbms/database/Database.php(1350): Wikimedia\Rdbms\Database->query() #3 /var/www/html/w/includes/libs/rdbms/database/DBConnRef.php(126): Wikimedia\Rdbms\Database->select() #4 /var/www/html/w/includes/libs/rdbms/database/DBConnRef.php(358): Wikimedia\Rdbms\DBConnRef->__call() #5 /var/www/html/w/includes/libs/rdbms/querybuilder/SelectQueryBuilder.php(730): Wikimedia\Rdbms\DBConnRef->select() #6 /var/www/html/w/includes/language/MessageCache.php(622): Wikimedia\Rdbms\SelectQueryBuilder->fetchResultSet() #7 /var/www/html/w/includes/language/MessageCache.php(511): MessageCache->loadFromDB() #8 /var/www/html/w/includes/language/MessageCache.php(428): MessageCache->loadFromDBWithMainLock() #9 /var/www/html/w/includes/language/MessageCache.php(348): MessageCache->loadUnguarded() #10 /var/www/html/w/includes/language/MessageCache.php(1317): MessageCache->load() #11 /var/www/html/w/includes/language/MessageCache.php(1222): MessageCache->getMsgFromNamespace() #12 /var/www/html/w/includes/language/MessageCache.php(1193): MessageCache->getMessageForLang() #13 /var/www/html/w/includes/language/MessageCache.php(1091): MessageCache->getMessageFromFallbackChain() #14 /var/www/html/w/includes/Message/Message.php(1536): MessageCache->get() #15 /var/www/html/w/includes/Message/Message.php(1024): MediaWiki\Message\Message->fetchMessage() #16 /var/www/html/w/includes/Message/Message.php(1111): MediaWiki\Message\Message->format() #17 /var/www/html/w/includes/language/Language.php(715): MediaWiki\Message\Message->text() #18 /var/www/html/w/includes/language/Language.php(4293): Language->getMessageFromDB() #19 /var/www/html/w/includes/language/Language.php(4310): Language->formatComputingNumbers() #20 /var/www/html/w/includes/debug/MWDebug.php(771): Language->formatSize() #21 /var/www/html/w/includes/debug/MWDebug.php(724): MWDebug::getDebugInfo() #22 /var/www/html/w/includes/api/ApiMain.php(1954): MWDebug::appendDebugInfoToApiResult() #23 /var/www/html/w/includes/api/ApiMain.php(924): ApiMain->executeAction() #24 /var/www/html/w/includes/api/ApiMain.php(895): ApiMain->executeActionWithErrorHandling() #25 /var/www/html/w/includes/api/ApiEntryPoint.php(158): ApiMain->execute() #26 /var/www/html/w/includes/MediaWikiEntryPoint.php(199): MediaWiki\Api\ApiEntryPoint->execute() #27 /var/www/html/w/api.php(44): MediaWiki\MediaWikiEntryPoint->run() #28 {main} #0 /var/www/html/w/includes/libs/rdbms/database/Database.php(2035): Wikimedia\Rdbms\Database->commenceCriticalSection() #1 /var/www/html/w/includes/libs/rdbms/loadbalancer/LoadBalancer.php(1525): Wikimedia\Rdbms\Database->runOnTransactionIdleCallbacks() #2 /var/www/html/w/includes/libs/rdbms/lbfactory/LBFactory.php(377): Wikimedia\Rdbms\LoadBalancer->runPrimaryTransactionIdleCallbacks() #3 /var/www/html/w/includes/libs/rdbms/lbfactory/LBFactory.php(352): Wikimedia\Rdbms\LBFactory->executePostTransactionCallbacks() #4 /var/www/html/w/includes/exception/MWExceptionHandler.php(168): Wikimedia\Rdbms\LBFactory->rollbackPrimaryChanges() #5 /var/www/html/w/includes/exception/MWExceptionHandler.php(193): MWExceptionHandler::rollbackPrimaryChanges() #6 /var/www/html/w/includes/MediaWikiEntryPoint.php(499): 
MWExceptionHandler::rollbackPrimaryChangesAndLog() #7 /var/www/html/w/includes/MediaWikiEntryPoint.php(453): MediaWiki\MediaWikiEntryPoint->doPostOutputShutdown() #8 /var/www/html/w/includes/MediaWikiEntryPoint.php(208): MediaWiki\MediaWikiEntryPoint->postOutputShutdown() #9 /var/www/html/w/api.php(44): MediaWiki\MediaWikiEntryPoint->run() #10 {main} 2024-04-24 00:42:49 4133e281f1a3 my_wiki: [5828c5ad7746ae71241d5012] /w/api.php?format=json Wikimedia\Rdbms\DBTransactionError: Transaction round stage must be 'cursory' (not 'within-commit-callbacks') #0 /var/www/html/w/includes/libs/rdbms/lbfactory/LBFactory.php(304): Wikimedia\Rdbms\LBFactory->assertTransactionRoundStage() #1 /var/www/html/w/includes/MediaWikiEntryPoint.php(659): Wikimedia\Rdbms\LBFactory->commitPrimaryChanges() #2 /var/www/html/w/includes/MediaWikiEntryPoint.php(495): MediaWiki\MediaWikiEntryPoint->restInPeace() #3 /var/www/html/w/includes/MediaWikiEntryPoint.php(453): MediaWiki\MediaWikiEntryPoint->doPostOutputShutdown() #4 /var/www/html/w/includes/MediaWikiEntryPoint.php(208): MediaWiki\MediaWikiEntryPoint->postOutputShutdown() #5 /var/www/html/w/api.php(44): MediaWiki\MediaWikiEntryPoint->run() #6 {main}
```
    While I'm not familiar with the RDBMS internals, this seems to suggest a deadlock of some sort. The selenium spec does:
```name=editEventRegistration.js,lang=js
event = Util.getTestString( 'Event:Test EditEventRegistration' );
await LoginPage.loginAdmin();
await EventRegistrationPage.createEventPage( event );
```
    where `createEventPage` calls `await Api.bot()`, which calls `MWBot.loginGetEditToken()`, which in turn makes an API login request. The `loginAdmin` call ends successfully, but then it's the API request in `Api.bot()` which hangs.
    Putting all this together, what I think is happening is that `loginAdmin` does not wait for the form to be successfully submitted, and instead exits immediately after the submit button is clicked. The login API request is then also sent immediately, so there are two login requests being processed at the same time (one via the UI, one via the API). Something in there is likely causing a deadlock, but I haven't investigated what this would be exactly. Note that CentralAuth is required for this bug to appear; normal login works just fine. Also supporting the deadlock theory is the fact that the bug goes away if we sleep after the `loginAdmin` call (e.g., via `browser.pause()`).
    I think it would make sense for `loginAdmin` to wait until the form has been submitted successfully. This would resolve this bug, and also generally make our selenium tests more robust.
    • Task
    == Background
    Our night mode fixes do not apply to desktop for the many projects that do not use Main_Page. This blocks a deploy to the desktop beta feature. Certain styles should only apply to the main page. On the Minerva skin we do this via a class, [[ https://gerrit.wikimedia.org/g/mediawiki/skins/MinervaNeue/+/4fe2a79c88b2408b937437b277ff4e0d8d1bb682/includes/Skins/SkinMinerva.php#639 | page-Main_Page ]]. On desktop this class doesn't exist, as it is translated to the local language.
    == User story
    As a template editor on Vietnamese Wikipedia I want to add support for the night theme at my own pace.
    == Acceptance criteria
    [] When visiting https://vi.wikipedia.org/wiki/Trang_Ch%C3%ADnh?vectornightmode=1 there should be no background on "Bài viết chọn lọc" - the styling should be consistent with mobile https://vi.m.wikipedia.org/wiki/Trang_Ch%C3%ADnh?minervanightmode=1
    == Communication criteria - does this need an announcement or discussion?
    Not necessary.
    • Task
    The app is sending BCP47 language codes to the endpoints. For Chinese (Taiwan) this is `zh-Hant-TW`, but Wikidata only accepts the old code: `zh-tw`.
    **Steps to replicate the issue** (include links if applicable):
    - Add `Accept-Language: zh-tw` or `Accept-Language: zh-Hant-TW` when requesting https://zh.wikipedia.org/api/rest_v1/page/mobile-html/八尺門的辯護人 to see the difference.
    **What happens?**:
    The Wikidata endpoint does not support `BCP47` language codes, and will fall back to `zh-hant`.
    - Accept-Language: `zh-Hant`, `zh-Hant-TW` → 臺灣電視劇
    - Accept-Language: `zh-TW` → 台灣電視劇
    **What should have happened instead?**:
    It should show the `zh-tw` variant when requesting the endpoint with `Accept-Language: zh-Hant-TW`.
    - Accept-Language: `zh-Hant` → 臺灣電視劇
    - Accept-Language: `zh-TW`, `zh-Hant-TW` → 台灣電視劇
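    As an illustration of the kind of fallback that could avoid the problem, the sketch below maps BCP47 Chinese variant codes to the legacy MediaWiki codes before calling the endpoint. The mapping table is an assumption for illustration, not an authoritative or exhaustive list.
```lang=python
# Illustrative sketch of a BCP47 -> legacy MediaWiki variant mapping that the
# app (or the service) could apply before calling endpoints that only accept
# the old codes. Covers only the Chinese variants; assumed, not authoritative.
BCP47_TO_LEGACY = {
    "zh-Hans": "zh-hans",
    "zh-Hant": "zh-hant",
    "zh-Hans-CN": "zh-cn",
    "zh-Hans-SG": "zh-sg",
    "zh-Hans-MY": "zh-my",
    "zh-Hant-TW": "zh-tw",
    "zh-Hant-HK": "zh-hk",
    "zh-Hant-MO": "zh-mo",
}


def to_legacy_variant(code: str) -> str:
    """Return the legacy variant code for a BCP47 code, else the code unchanged."""
    return BCP47_TO_LEGACY.get(code, code)


# to_legacy_variant("zh-Hant-TW") -> "zh-tw"
```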
    • Task
    **Steps to replicate the issue** (include links if applicable): * Navigate to explore settings either through settings or using the explore feed card menu * View the explore settings toggles **What happens?**: There is no settings toggle for Add an Image **What should have happened instead?**: There should be a settings toggle for Add an Image **Software version**: Tested on 7.4.11 (76) **Other information** (browser name/version, screenshots, etc.): {F48345010} {F48345025}
    • Task
    The new wiki's visibility will be: **Private**.
    • Task
    Please import `mywikisource` from incubator, once it is created. Thanks!
    • Task
    Per https://wikitech.wikimedia.org/wiki/Add_a_wiki once the wiki has been created
    • Task
    Per https://wikitech.wikimedia.org/wiki/Add_a_wiki once the wiki has been created
    • Task
    [] [[https://gerrit.wikimedia.org/g/mediawiki/services/restbase/deploy/+/master/scap/vars.yaml|RESTbase]]
    [x] [[https://gerrit.wikimedia.org/g/mediawiki/services/cxserver/+/master/config/languages.yaml|CX Config]]
    [x] [[https://gerrit.wikimedia.org/g/analytics/refinery/+/master/static_data/pageview/allowlist/allowlist.tsv|Analytics refinery]]
    [x] [[https://gerrit.wikimedia.org/g/pywikibot/core/+/master/pywikibot/families/wikisource_family.py|Pywikibot]]
    [] [[https://www.wikidata.org/w/api.php|Wikidata]]
    [] Namespaces
    [] Logos and wordmarks
    [] Import from Incubator
    [] Clean up old interwiki links
    [] Add the wiki to a CVNBot for SWMT monitoring
    • Task
    Please import `iglwiki` from incubator, once it is created. Thanks!
    • Task
    Per https://wikitech.wikimedia.org/wiki/Add_a_wiki once the wiki has been created
    • Task
    Per https://wikitech.wikimedia.org/wiki/Add_a_wiki once the wiki has been created
    • Task
    [] [[https://gerrit.wikimedia.org/g/mediawiki/services/restbase/deploy/+/master/scap/vars.yaml|RESTbase]]
    [x] [[https://gerrit.wikimedia.org/g/mediawiki/services/cxserver/+/master/config/languages.yaml|CX Config]]
    [x] [[https://gerrit.wikimedia.org/g/analytics/refinery/+/master/static_data/pageview/allowlist/allowlist.tsv|Analytics refinery]]
    [x] [[https://gerrit.wikimedia.org/g/pywikibot/core/+/master/pywikibot/families/wikipedia_family.py|Pywikibot]]
    [] [[https://www.wikidata.org/w/api.php|Wikidata]]
    [] Namespaces
    [] Logos and wordmarks
    [] Import from Incubator
    [] Clean up old interwiki links
    [] Add the wiki to a CVNBot for SWMT monitoring
    • Task
    Please import `kaawiktionary` from incubator, once it is created. Thanks!
    • Task
    Per https://wikitech.wikimedia.org/wiki/Add_a_wiki once the wiki has been created
    • Task
    Per https://wikitech.wikimedia.org/wiki/Add_a_wiki once the wiki has been created
    • Task
    [] [[https://gerrit.wikimedia.org/g/mediawiki/services/restbase/deploy/+/master/scap/vars.yaml|RESTbase]] [x] [[https://gerrit.wikimedia.org/g/mediawiki/services/cxserver/+/master/config/languages.yaml|CX Config]] [x] [[https://gerrit.wikimedia.org/g/analytics/refinery/+/master/static_data/pageview/allowlist/allowlist.tsv|Analytics refinery]] [x] [[https://gerrit.wikimedia.org/g/pywikibot/core/+/master/pywikibot/families/wiktionary_family.py|Pywikibot]] [] [[https://www.wikidata.org/w/api.php|Wikidata]] [] Namespaces [] Logos and wordmarks [] Import from Incubator [] Clean up old interwiki links [] Add the wiki to a CVNBot for SWMT monitoring
    • Task
    Please import `mswikisource` from incubator, once it is created. Thanks!
    • Task
    Per https://wikitech.wikimedia.org/wiki/Add_a_wiki once the wiki has been created
    • Task
    [x] [[https://gerrit.wikimedia.org/g/mediawiki/services/restbase/deploy/+/master/scap/vars.yaml|RESTbase]] [x] [[https://gerrit.wikimedia.org/g/mediawiki/services/cxserver/+/master/config/languages.yaml|CX Config]] [x] [[https://gerrit.wikimedia.org/g/analytics/refinery/+/master/static_data/pageview/allowlist/allowlist.tsv|Analytics refinery]] [x] [[https://gerrit.wikimedia.org/g/pywikibot/core/+/master/pywikibot/families/wikisource_family.py|Pywikibot]] [x] [[https://www.wikidata.org/w/api.php|Wikidata]] [] Namespaces [] Logos and wordmarks [] Import from Incubator [] Clean up old interwiki links [] Add the wiki to a CVNBot for SWMT monitoring
    • Task
    Please import `kawikisource` from incubator, once it is created. Thanks!
    • Task
    Per https://wikitech.wikimedia.org/wiki/Add_a_wiki once the wiki has been created
    • Task
    Per https://wikitech.wikimedia.org/wiki/Add_a_wiki once the wiki has been created
    • Task
    [] [[https://gerrit.wikimedia.org/g/mediawiki/services/restbase/deploy/+/master/scap/vars.yaml|RESTbase]] [x] [[https://gerrit.wikimedia.org/g/mediawiki/services/cxserver/+/master/config/languages.yaml|CX Config]] [x] [[https://gerrit.wikimedia.org/g/analytics/refinery/+/master/static_data/pageview/allowlist/allowlist.tsv|Analytics refinery]] [x] [[https://gerrit.wikimedia.org/g/pywikibot/core/+/master/pywikibot/families/wikisource_family.py|Pywikibot]] [] [[https://www.wikidata.org/w/api.php|Wikidata]] [] Namespaces [] Logos and wordmarks [] Import from Incubator [] Clean up old interwiki links [] Add the wiki to a CVNBot for SWMT monitoring
    • Task
    Given that I am interested in the Wishlist, I should see a homepage to help me understand where to go. From this page, I should be able to - View highlighted or recent focus areas - View recent or top-voted wishes (sortable table) - Link to all focus areas - Link to all wishes - Link to archived wishes - Submit a wish - View FAQ about wishes
    • Task
    Given that I'm interested in exploring and contributing to discussions about problem areas, when I go to the wishlist dashboard or focus area, I should see a link to view all focus areas And on this page, I should see a card for the focus area with a - Title - Truncated description - Status of focus area - Aggregate rating + participants
    • Task
    Citation Watchlist 1.0 https://en.wikipedia.org/w/index.php?title=Josiah_Leming&curid=20847171&diff=1220451442&oldid=1220451321 https://en.wikipedia.org/w/index.php?title=Josiah_Leming&curid=20847171&diff=1220451321&oldid=1220446960 https://en.wikipedia.org/wiki/Josiah_Leming{F48338787}
    • Task
=== Background In order to answer the research questions in T360195, we'd like to create a schema and server-side instrumentation to record IP reputation data (from stopforumspam.com), if it exists, for a given IP when edits and account creations occur. ==== Acceptance criteria [] Create a measurement plan for legal review [] Create an instrumentation spec for legal review
    • Task
    Given that I am viewing the community wishlist survey homepage and I want to view all old wishes, then I should see a link to "view all" **archived** wishes, and clicking that should open a table with all wishes that are archived - Wish name - # of votes - Assigned focus area ('uncategorized' if not in one) - Category/type - Project(s) - I should be able to sort by all columns - I should be able to paginate wishes (30) - I should be able to search by title
    • Task
    Given that I am viewing the community wishlist survey homepage and I want to view all wishes, then I should see a link to "view all" open wishes, and clicking that should open a table with all wishes that are not archived - Wish name - # of votes - Assigned focus area ('uncategorized' if not in one) - Category/type - Project(s) - wish status - I should be able to sort by all columns - I should be able to paginate wishes (30) - I should be able to search by title
    • Task
    As a viewer of a focus area, I should not be able to edit the page, with the exception of 1) discussions and 2) rating the focus area
    • Task
Given that I am a native speaker of X language, and a Focus Area (and discussion) is written in Y language, when I open the Focus Area Page, then I should see a disclaimer that the focus area is translatable to my native language - And first, I should see the ability to translate the page using extension:translate - And I should see a "quick translation" powered by MinT - And if I click extension:translate, take me to the translate extension - And if I opt for MinT translation, the page should reload in my native language, and I should see a disclaimer that the content is machine translated and may not be accurate - And I should see a disclaimer that the contents are machine translated - Given that a page has been translated via Extension:Translate, any user should still visit the single “Wish page” to view all content, discussion, and voting in their native language.
    • Task
**Steps to replicate the issue** (include links if applicable): * Tap on Add an Image * Tap No * Choose a reason, and tap submit **What happens?**: The same image recommendation appears, forcing the user to tap 'Not sure' to load the next recommendation. **What should have happened instead?**: The next image recommendation should appear **Software version**: Tested on 7.4.11 (75) **Other information** (browser name/version, screenshots, etc.): {F48335444}
    • Task
Given that I’m logged in and have an opinion about a focus area, when I go to the focus area page, then I should see an average rating for the focus area and the number of people who rated it, and be able to rate the focus area on impact. - And my ratings should be counted as one participant - And the score should be updated - And I should not be able to “re-rate” the area unless I "remove" my rating - And ratings by individuals should not be visible The rating system should increment by 1 star (1-5)
    • Task
    When nested, only the outermost curly braces should have meaning, the rest should be treated as part of the comment string. Instead the parser treats each of the braces as the start of a new comment.
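A depth-counting scan is the usual fix: only the brace that brings the nesting depth back to zero closes the comment, and inner braces stay part of the comment text. A minimal sketch (the function name and delimiters are illustrative, not the parser's actual API):

```
<?php
// Sketch: extract a brace-delimited comment, treating nested braces as text.
// $start must point at the outermost opening brace.
function extractComment( string $input, int $start ): string {
	$depth = 0;
	$comment = '';
	for ( $i = $start, $len = strlen( $input ); $i < $len; $i++ ) {
		$ch = $input[$i];
		if ( $ch === '{' ) {
			$depth++;
			if ( $depth === 1 ) {
				continue; // outermost opening brace is not part of the comment
			}
		} elseif ( $ch === '}' ) {
			$depth--;
			if ( $depth === 0 ) {
				break; // only the outermost closing brace ends the comment
			}
		}
		$comment .= $ch;
	}
	return $comment;
}

// extractComment( 'before {outer {inner} text} after', 7 ) === 'outer {inner} text'
```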
    • Task
Given that I am WMF staff and logged in, I should be able to create or update a focus area page. The focus area page should include - Name (page slug + title) - Description - Project(s) impacted - Status of the focus area (archived, open, in progress, complete) - Responsible team(s) I should be able to assign wishes to the focus area, which should populate a table - Wish name (link) - Supports/votes - Category (bug, feature request, etc.)
    • Task
Given that I am staff or admin, I should be able to assign a “status” for any wish: - Archived - Submitted (default) - In progress - Completed And when the wish status is updated - People on the wish's watchlist should be notified - The “status” pill should be updated on all touchpoints (wish page, index pages) for any user to see And when the wish is “archived,” then the wish should be removed from the wish index page, but kept on any focus area pages (if applicable)
    • Task
Given that I am a native speaker of X language, and a Wish (and discussion) is written in Y language, when I open the Wish Page, then I should see a disclaimer that the wish is translatable to my native language - And first, I should see the ability to translate the page using extension:translate - And I should see a "quick translation" powered by MinT - And if I click extension:translate, take me to the translate extension - And if I opt for MinT translation, the page should reload in my native language, and I should see a disclaimer that the content is machine translated and may not be accurate - And I should see a disclaimer that the contents are machine translated Given that a page has been translated via Extension:Translate, any user should still visit the single “Wish page” to view all content, discussion, and voting in their native language.
    • Task
# This is a help page for [[ https://el.wikipedia.org/wiki/Βικιπαίδεια:Ionian_Wikithon_2024 | Ionian Wikithon 2024]]. ### What is Ionian Wikithon 2024 - in Corfu, Greece, 15th May, 2024 An annual event organised by the Networks, Multimedia and Systems Security Laboratory of the Department of Informatics of the Ionian University in collaboration with the Wikimedia Community User Group Greece. [[ https://meta.wikimedia.org/wiki/Event:Ionian_Wikithon_2024 | Program in English ]] & [[ https://w.wiki/9oFV | Program in Greek ]] ### What do I need to do to take part in Ionian Wikithon 2024 a. Have registered your participation via the form: https://forms.gle/K8quCCV6g4KRKV1V7 b. //Before// coming to Galinos, charge your laptop (although the lab computers will be available) and create a Wikipedia account (instructions: https://ionian-wikithon-instructions.netlify.app/); at the last step of the instructions, preferably add `Συμμετέχω στο Ionian Wikithon 2024` to your page (a year has passed since the first event, but the instructions have not changed). {F37081553} c. Whatever question you have, do not hesitate to ask one of the organisers or volunteers. ### preTask: Fill in the questionnaire Fill in the introductory [[ https://forms.gle/tSsSBpjTd8BpQZsEA | questionnaire ]] ## Task 1: Outreach Dashboard From a device where you are logged in to Wikipedia, visit the [[ https://outreachdashboard.wmflabs.org/courses/Ionian_University/Ionian_Wikithon_2023_(June_2023)?enroll=xiaogyth | Outreach Dashboard page of Ionian Wikithon 2023 ]] and enrol with your Wikipedia username. Enrolling in the Outreach Dashboard ensures that your contributions to Wikipedia, Commons and Wikidata are counted and your work is recognised! ## Task 2: Making use of Wikidata Let's look at: https://riggas-ionio.github.io/heritage-promotion/pois/001.html Static HTML + https://github.com/riggas-ionio/heritage-promotion/blob/master/assets/js/get_wiki_item.js That is, `static html` + `wikidata` Or, if we want, we can also supplement with a bit of text from Wikipedia: https://el.wikipedia.org/w/api.php?format=json&action=query&prop=extracts&exintro&explaintext&redirects=1&titles=Κέρκυρα Besides the query API (`api.php`), a REST API is also provided. Let's get to know it through PAWS, a Jupyter Notebooks service of Wikimedia Cloud: https://hub-paws.wmcloud.org/user/DimitriosRingas/lab/tree/Wikidata_REST_API.ipynb Public link: https://public-paws.wmcloud.org/User:DimitriosRingas/Wikidata_REST_API.ipynb More on the Wikidata REST API: https://www.wikidata.org/wiki/Wikidata:REST_API More on Jupyter Notebooks: https://www.datacamp.com/tutorial/tutorial-jupyter-notebook ## Task 3: WikiShootMe From a laptop or mobile, visit [[ https://wikishootme.toolforge.org/#lat=39.61911212892075&lng=19.91548961028457&zoom=15 | WikiShootMe ]]. It will ask you to log in with the credentials you created on Wikipedia. The basic view of WikiShootMe is a map: {F37080297} The map shows data of various types, in layers that can be toggled on/off. 
{F37080300} Specifically: * the larger green circles represent Wikidata items with an image * the larger red circles represent **Wikidata items without an image** * the smaller blue circles represent **Commons images that are not linked to a Wikidata item** * the smaller yellow circles represent Wikipedia articles in the current language edition (see the language selector in the top right corner) #### What you can do with this data: a. You can **add an image to a Wikidata item that does not already have one** (remember, each item should ideally have only one image). {F37080335} The image is either uploaded from your device, or you find one on https://commons.wikimedia.org and link it to the item. b. You can **find a photo on Commons and create a Wikidata item** that refers to it. {F37080339} Note that the photo's coordinates may be wrong; correct them. c. You can select a point on the map (right-click on a laptop, long press on mobile) to see the coordinates at that point or to create a new Wikidata item. {F37080345} More help: https://meta.wikimedia.org/wiki/WikiShootMe ## Task 4: Wikidata & the Ionian University Welcome to [[ https://w.wiki/Jg | Wikidata ]]! Let's first learn [[ https://w.wiki/rV | what Wikidata is ]]. Study a typical item, that of [[ https://www.wikidata.org/wiki/Q42 | Douglas Adams ]] {F37081716}. On the [[ https://w.wiki/Jg | Wikidata ]] page we run a search for Ionian University {F37081671} Our goal is to update/correct existing items and to create Wikidata items for all departments of the Ionian University, and for the Department of Informatics to add all the laboratories and teaching staff. - Before you start correcting and adding Wikidata items, make sure you have completed the basic [[ https://w.wiki/6mNA | Wikidata Tours ]] on how to declare new items, their statements, related photos and coordinates. Do not forget how important it is to add correct references (sources) describing the origin of your statements, and the official website of the item being created (if one exists). - Then study a few typical examples such as [[ https://www.wikidata.org/wiki/Q49108 | MIT ]], [[ https://www.wikidata.org/wiki/Q35794 | University of Cambridge ]], [[ https://www.wikidata.org/wiki/Q547867 | ΕΚΠΑ ]], [[ https://www.wikidata.org/wiki/Q111995001 | the Department of Informatics of the Ionian University ]], [[ https://www.wikidata.org/wiki/Q92621 | Andrew S. Tanenbaum ]]. - When you are ready to create your own items, it is best to organise into small groups and divide up what you will create. - It would also be useful to upload related photos, but carefully, following the [[ https://w.wiki/rV | rules ]]! Of course you can also create other items based on your own interests. ## Task 5: Creating articles Create from scratch or translate (e.g. with [[ https://el.wikipedia.org/w/index.php?title=Special:ContentTranslation&campaign=ionianwikithon&to=el | Special:ContentTranslation ]]) articles that are missing from the Greek Wikipedia. 
The **list of [[ https://el.wikipedia.org/wiki/%CE%A7%CF%81%CE%AE%CF%83%CF%84%CE%B7%CF%82:Geraki/Ionian_Wikithon | requested articles on Informatics and other topics ]]** contains titles that are red links in other Wikipedia articles. ## finalTask: Before you leave... * Upload your photos from our event to Wikimedia Commons (mind the rules) in the category [[ https://commons.wikimedia.org/wiki/Category:Ionian_Wikithon_2024 | Ionian Wikithon 2024 ]] * Do not forget to share your experience of Ionian Wikithon 2024 and write about it at [[ https://app.sli.do/event/mrTt1rY9k5pXfcivnuN9mN | My experience at the Ionian Wikithon ]].
    • Task
    Given that I’m logged in and am viewing a wish page, when I see a discussion that I’d like to participate in, or start, I should be able to - Reply to a discussion - Start a discussion - Reply using wikitext or VE All language variants of a wish should direct to a single talk page. Discussions should be visible in any user’s native language.
    • Task
    Given that I am logged in and viewing a wish, when the wish is not archived or completed, then I should see a “voting” button on the wish page and be able to “support” a wish. And when I click support, I should be able to write context to support the wish and click "submit." And all supporters and their comments should be visible on the wish page. --- When I am not logged in, I should see a prompt to log in to submit my "support."
    • Task
    Given that my wish submission is valid, when I click submit, then a Wish Page should be created
and - I should be directed to the new wish page - The page should be hosted on meta - The page URL slug should be the title of the wish - The new wish should appear in the wish index page with a status of "submitted" The wish should clearly show - Title - “Problem” (description) - Affected users - Date + time submitted - Proposer - Category (ie, bug) - Project(s) - Wish status: “Submitted” - Focus area (if applicable) - Adopted team (if applicable) Given that I am viewing a wish, when I click “edit,” then I should see my editor of choice, and be able to edit the contents of a wish.
    • Task
    **Steps to replicate the issue** (include links if applicable): * Tap on Add an Image * Tap on Yes * Add a caption and tap on Next * View the preview **What happens?**: Often the added image is not visible to the user without scrolling the preview. This seems to happen in many cases, but I have noticed that images added to an infobox reproduce this consistently. **What should have happened instead?**: The image should be visible **Software version** : Tested on 7.4.11 (76) **Other information** (browser name/version, screenshots, etc.): {F48332557} {F48332588} Preview 1st view {F48332611} After scrolling {F48332637}
    • Task
    Given that I am on the wish submission form and am not logged in, I should see a notification to log in to submit a wish.
    • Task
    **Steps to replicate the issue** (include links if applicable): * With the device in any theme except 'light' * Tap on Add an Image * Tap Yes **What happens?**: The theme seems to switch to Light. This happens on Sepia, Dark, and Black modes **What should have happened instead?**: The theme should match the selected theme throughout the image recs flow **Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia): Tested on 7.4.11 (76) **Other information** (browser name/version, screenshots, etc.): {F48331738} {F48331737} {F48331736}
    • Task
    The wish intake form should be readable by a screenreader, and A11Y users should be able to submit a valid wish without using a mouse
    • Task
Show an error state when a user hasn't selected a required field, or if a field doesn't meet its character limits - Category: Select 1 - Project(s): Select 1 or “other”. Other needs 3+ characters - Wish name: 10+ characters, max 70 - Description: 50+ characters - Impacted users: 10+ characters The submit button should be disabled until all required form fields are valid.
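A server-side sketch of these rules, assuming an array-shaped submission; the field keys, messages and function name are illustrative, not an agreed interface:

```
<?php
// Sketch: validate a wish submission against the limits listed above.
// Returns error messages keyed by field; an empty array means the wish is valid.
function validateWish( array $wish ): array {
	$errors = [];
	if ( empty( $wish['category'] ) ) {
		$errors['category'] = 'Select a category.';
	}
	if ( empty( $wish['projects'] ) ) {
		$errors['projects'] = 'Select at least one project or "other".';
	} elseif ( $wish['projects'] === [ 'other' ] && mb_strlen( $wish['otherProject'] ?? '' ) < 3 ) {
		$errors['projects'] = '"Other" needs 3 or more characters.';
	}
	$titleLength = mb_strlen( $wish['title'] ?? '' );
	if ( $titleLength < 10 || $titleLength > 70 ) {
		$errors['title'] = 'Wish name must be between 10 and 70 characters.';
	}
	if ( mb_strlen( $wish['description'] ?? '' ) < 50 ) {
		$errors['description'] = 'Description must be at least 50 characters.';
	}
	if ( mb_strlen( $wish['impactedUsers'] ?? '' ) < 10 ) {
		$errors['impactedUsers'] = 'Impacted users must be at least 10 characters.';
	}
	return $errors; // the submit button stays disabled while this is non-empty
}
```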
    • Task
    As reported in T363057, logo-test is down. ``` 2024-04-23T19:59:54+00:00 [logo-test-765c84f5c6-p2ksr] Error: Rocket failed to launch due to the following route collisions: 2024-04-23T19:59:54+00:00 [logo-test-765c84f5c6-p2ksr] >> (healthz) GET /healthz collides with (healthz) GET /healthz 2024-04-23T19:59:54+00:00 [logo-test-765c84f5c6-p2ksr] >> Note: Route collisions can usually be resolved by ranking routes. ``` This seems to be caused by https://gitlab.wikimedia.org/toolforge-repos/logo-test/-/commit/b0591f58b27e30ed1fc0c7e28c69862e8fa3eac8 which missed that there already is a `/healthz` route.
    • Task
    **Steps to replicate the issue** (include links if applicable): * Set Phabricator Locale to Finnish (presumably) * Try searching something from the top right corner * **What happens?**: ``` Unhandled Exception ("PhutilAggregateException") All of the configured Fulltext Search services failed. - Exception: DateTime::__construct(): Failed to parse time string (2024-04-23 12:00 epp.) at position 17 (e): The timezone could not be found in the database ``` **What should have happened instead?**: Search results. **Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.): Started happening after the latest update. ``` [2024-04-23 19:28:57] EXCEPTION: (Exception) DateTime::__construct(): Failed to parse time string (2024-04-23 12:00 epp.) at position 17 (e): The timezone could not be found in the database at [<phorge>/src/view/form/control/AphrontFormDateControlValue.php:267] arcanist(), ava(), phorge(), translations(), wmf-ext-misc() #0 <#2> DateTime::__construct(string, DateTimeZone) called at [<phorge>/src/view/form/control/AphrontFormDateControlValue.php:267] #1 <#2> AphrontFormDateControlValue::getFormattedDateFromParts(string, string, string, string) called at [<phorge>/src/view/form/control/AphrontFormDateControlValue.php:110] #2 <#2> AphrontFormDateControlValue::newFromEpoch(PhabricatorUser, string) called at [<phorge>/src/applications/calendar/query/PhabricatorCalendarEventSearchEngine.php:590] #3 <#2> PhabricatorCalendarEventSearchEngine::getSafeDate(NULL) called at [<phorge>/src/applications/calendar/query/PhabricatorCalendarEventSearchEngine.php:155] #4 <#2> PhabricatorCalendarEventSearchEngine::getQueryDateRange(NULL, NULL, string, boolean) called at [<phorge>/src/applications/calendar/query/PhabricatorCalendarEventSearchEngine.php:120] #5 <#2> PhabricatorCalendarEventSearchEngine::buildQueryFromParameters(array) called at [<phorge>/src/applications/search/engine/PhabricatorApplicationSearchEngine.php:168] #6 <#2> PhabricatorApplicationSearchEngine::buildQueryFromSavedQuery(PhabricatorSavedQuery) called at [<phorge>/src/applications/calendar/query/PhabricatorCalendarEventSearchEngine.php:86] #7 <#2> PhabricatorCalendarEventSearchEngine::buildQueryFromSavedQuery(PhabricatorSavedQuery) called at [<phorge>/src/applications/search/fulltextstorage/PhabricatorFerretFulltextStorageEngine.php:77] #8 <#2> PhabricatorFerretFulltextStorageEngine::executeSearch(PhabricatorSavedQuery) called at [<phorge>/src/infrastructure/cluster/search/PhabricatorSearchService.php:266] #9 phlog(Exception) called at [<phorge>/src/infrastructure/cluster/search/PhabricatorSearchService.php:275] #10 PhabricatorSearchService::newResultSet(PhabricatorSavedQuery, PhabricatorSearchDocumentQuery) called at [<phorge>/src/applications/search/query/PhabricatorSearchDocumentQuery.php:52] #11 PhabricatorSearchDocumentQuery::loadPage() called at [<phorge>/src/infrastructure/query/policy/PhabricatorPolicyAwareQuery.php:251] #12 PhabricatorPolicyAwareQuery::execute() called at [<phorge>/src/infrastructure/query/PhabricatorOffsetPagedQuery.php:46] #13 PhabricatorOffsetPagedQuery::executeWithOffsetPager(PHUIPagerView) called at [<phorge>/src/applications/search/engine/PhabricatorApplicationSearchEngine.php:1036] #14 PhabricatorApplicationSearchEngine::executeQuery(PhabricatorSearchDocumentQuery, PHUIPagerView) called at [<phorge>/src/applications/search/controller/PhabricatorApplicationSearchController.php:280] #15 
PhabricatorApplicationSearchController::processSearchRequest() called at [<phorge>/src/applications/search/controller/PhabricatorApplicationSearchController.php:91] #16 PhabricatorApplicationSearchController::processRequest() called at [<phorge>/src/aphront/AphrontController.php:29] #17 AphrontController::handleRequest(AphrontRequest) called at [<phorge>/src/aphront/AphrontController.php:71] #18 AphrontController::delegateToController(PhabricatorApplicationSearchController) called at [<phorge>/src/applications/search/controller/PhabricatorSearchController.php:96] #19 PhabricatorSearchController::handleRequest(AphrontRequest) called at [<phorge>/src/aphront/configuration/AphrontApplicationConfiguration.php:284] #20 AphrontApplicationConfiguration::processRequest(AphrontRequest, PhutilDeferredLog, AphrontPHPHTTPSink, MultimeterControl) called at [<phorge>/src/aphront/configuration/AphrontApplicationConfiguration.php:204] #21 AphrontApplicationConfiguration::runHTTPRequest(AphrontPHPHTTPSink) called at [<phorge>/webroot/index.php:35] [2024-04-23 19:28:57] EXCEPTION: (PhutilAggregateException) All of the configured Fulltext Search services failed.\n - Exception: DateTime::__construct(): Failed to parse time string (2024-04-23 12:00 epp.) at position 17 (e): The timezone could not be found in the database at [<phorge>/src/infrastructure/cluster/search/PhabricatorSearchService.php:279] arcanist(), ava(), phorge(), translations(), wmf-ext-misc() #0 <#3> DateTime::__construct(string, DateTimeZone) called at [<phorge>/src/view/form/control/AphrontFormDateControlValue.php:267] #1 <#3> AphrontFormDateControlValue::getFormattedDateFromParts(string, string, string, string) called at [<phorge>/src/view/form/control/AphrontFormDateControlValue.php:110] #2 <#3> AphrontFormDateControlValue::newFromEpoch(PhabricatorUser, string) called at [<phorge>/src/applications/calendar/query/PhabricatorCalendarEventSearchEngine.php:590] #3 <#3> PhabricatorCalendarEventSearchEngine::getSafeDate(NULL) called at [<phorge>/src/applications/calendar/query/PhabricatorCalendarEventSearchEngine.php:155] #4 <#3> PhabricatorCalendarEventSearchEngine::getQueryDateRange(NULL, NULL, string, boolean) called at [<phorge>/src/applications/calendar/query/PhabricatorCalendarEventSearchEngine.php:120] #5 <#3> PhabricatorCalendarEventSearchEngine::buildQueryFromParameters(array) called at [<phorge>/src/applications/search/engine/PhabricatorApplicationSearchEngine.php:168] #6 <#3> PhabricatorApplicationSearchEngine::buildQueryFromSavedQuery(PhabricatorSavedQuery) called at [<phorge>/src/applications/calendar/query/PhabricatorCalendarEventSearchEngine.php:86] #7 <#3> PhabricatorCalendarEventSearchEngine::buildQueryFromSavedQuery(PhabricatorSavedQuery) called at [<phorge>/src/applications/search/fulltextstorage/PhabricatorFerretFulltextStorageEngine.php:77] #8 <#3> PhabricatorFerretFulltextStorageEngine::executeSearch(PhabricatorSavedQuery) called at [<phorge>/src/infrastructure/cluster/search/PhabricatorSearchService.php:266] #9 <#2> PhabricatorSearchService::newResultSet(PhabricatorSavedQuery, PhabricatorSearchDocumentQuery) called at [<phorge>/src/applications/search/query/PhabricatorSearchDocumentQuery.php:52] #10 <#2> PhabricatorSearchDocumentQuery::loadPage() called at [<phorge>/src/infrastructure/query/policy/PhabricatorPolicyAwareQuery.php:251] #11 <#2> PhabricatorPolicyAwareQuery::execute() called at [<phorge>/src/infrastructure/query/PhabricatorOffsetPagedQuery.php:46] #12 <#2> 
PhabricatorOffsetPagedQuery::executeWithOffsetPager(PHUIPagerView) called at [<phorge>/src/applications/search/engine/PhabricatorApplicationSearchEngine.php:1036] #13 <#2> PhabricatorApplicationSearchEngine::executeQuery(PhabricatorSearchDocumentQuery, PHUIPagerView) called at [<phorge>/src/applications/search/controller/PhabricatorApplicationSearchController.php:280] #14 <#2> PhabricatorApplicationSearchController::processSearchRequest() called at [<phorge>/src/applications/search/controller/PhabricatorApplicationSearchController.php:91] #15 <#2> PhabricatorApplicationSearchController::processRequest() called at [<phorge>/src/aphront/AphrontController.php:29] #16 <#2> AphrontController::handleRequest(AphrontRequest) called at [<phorge>/src/aphront/AphrontController.php:71] #17 <#2> AphrontController::delegateToController(PhabricatorApplicationSearchController) called at [<phorge>/src/applications/search/controller/PhabricatorSearchController.php:96] ... ```
    • Task
This task has been created by DC Ops for #serviceops implementation tracking (per serviceops request when filing racking tasks). Once racking task T363212 has been completed, this task can be taken by serviceops for implementation. Please note this task is not monitored by DC Ops, and any questions should be directed to the racking task.
    • Task
    This task will track the racking, setup, and OS installation of kafka-main100[6789] and kafka-main1010 == Hostname / Racking / Installation Details == **Hostnames:** kafka-main100[6789] and kafka-main1010 **Racking Proposal:** Row A (`A7` if possible), Row B (`B7` if possible), Row C (`C7` if possible), Row D (`D7` if possible), Row E (`E7` if possible) **Networking Setup:** 1 connection, 1G **VLAN:** Private, AAAA records: Y Additional IP records (Cassandra)? No **Partitioning/Raid:** HW Raid: N, Partman recipe and/or desired Raid Level: raid10-4dev.cfg **OS Distro:** Bullseye **Sub-team Technical Contact:** Alex == Per host setup checklist == ==== kafka-main1006: [] Receive in system on #procurement task T361360 & in Coupa [] Rack system with proposed racking plan (see above) & update Netbox (include all system info plus location, state of planned) [] Run the [[ https://netbox.wikimedia.org/extras/scripts/provision_server.ProvisionServerNetwork/ | Provision a server's network attributes ]] Netbox script - Note that you must run the DNS and Provision cookbook after completing this step [] **Immediately** run the `sre.dns.netbox` cookbook [] **Immediately** run the `sre.hosts.provision` cookbook [] Run the `sre.hardware.upgrade-firmware` cookbook [] Update the `operations/puppet` repo - this should include updates to preseed.yaml, and site.pp with roles defined by service group: https://wikitech.wikimedia.org/wiki/SRE/Dc-operations [] Run the `sre.hosts.reimage` cookbook ==== kafka-main1007: [] Receive in system on #procurement task T361360 & in Coupa [] Rack system with proposed racking plan (see above) & update Netbox (include all system info plus location, state of planned) [] Run the [[ https://netbox.wikimedia.org/extras/scripts/provision_server.ProvisionServerNetwork/ | Provision a server's network attributes ]] Netbox script - Note that you must run the DNS and Provision cookbook after completing this step [] **Immediately** run the `sre.dns.netbox` cookbook [] **Immediately** run the `sre.hosts.provision` cookbook [] Run the `sre.hardware.upgrade-firmware` cookbook [] Update the `operations/puppet` repo - this should include updates to preseed.yaml, and site.pp with roles defined by service group: https://wikitech.wikimedia.org/wiki/SRE/Dc-operations [] Run the `sre.hosts.reimage` cookbook ==== kafka-main1008: [] Receive in system on #procurement task T361360 & in Coupa [] Rack system with proposed racking plan (see above) & update Netbox (include all system info plus location, state of planned) [] Run the [[ https://netbox.wikimedia.org/extras/scripts/provision_server.ProvisionServerNetwork/ | Provision a server's network attributes ]] Netbox script - Note that you must run the DNS and Provision cookbook after completing this step [] **Immediately** run the `sre.dns.netbox` cookbook [] **Immediately** run the `sre.hosts.provision` cookbook [] Run the `sre.hardware.upgrade-firmware` cookbook [] Update the `operations/puppet` repo - this should include updates to preseed.yaml, and site.pp with roles defined by service group: https://wikitech.wikimedia.org/wiki/SRE/Dc-operations [] Run the `sre.hosts.reimage` cookbook ==== kafka-main1009: [] Receive in system on #procurement task T361360 & in Coupa [] Rack system with proposed racking plan (see above) & update Netbox (include all system info plus location, state of planned) [] Run the [[ https://netbox.wikimedia.org/extras/scripts/provision_server.ProvisionServerNetwork/ | Provision a server's network attributes ]] Netbox script 
- Note that you must run the DNS and Provision cookbook after completing this step [] **Immediately** run the `sre.dns.netbox` cookbook [] **Immediately** run the `sre.hosts.provision` cookbook [] Run the `sre.hardware.upgrade-firmware` cookbook [] Update the `operations/puppet` repo - this should include updates to preseed.yaml, and site.pp with roles defined by service group: https://wikitech.wikimedia.org/wiki/SRE/Dc-operations [] Run the `sre.hosts.reimage` cookbook ==== kafka-main1010: [] Receive in system on #procurement task T361360 & in Coupa [] Rack system with proposed racking plan (see above) & update Netbox (include all system info plus location, state of planned) [] Run the [[ https://netbox.wikimedia.org/extras/scripts/provision_server.ProvisionServerNetwork/ | Provision a server's network attributes ]] Netbox script - Note that you must run the DNS and Provision cookbook after completing this step [] **Immediately** run the `sre.dns.netbox` cookbook [] **Immediately** run the `sre.hosts.provision` cookbook [] Run the `sre.hardware.upgrade-firmware` cookbook [] Update the `operations/puppet` repo - this should include updates to preseed.yaml, and site.pp with roles defined by service group: https://wikitech.wikimedia.org/wiki/SRE/Dc-operations [] Run the `sre.hosts.reimage` cookbook
    • Task
Given an event has ended and the final aggregation has occurred, when there are no participant questions enabled and there have never been participant questions enabled, the Response statistics tab displays the following message: `This event has no responses from participants to aggregate.` This is accurate, but it would be more precise to say that there were never any participant questions enabled. {F48323907} --- Given an event has ended and the final aggregation has occurred, when there are no participant questions enabled, but participant questions were previously enabled and participants recorded their answers before the organizer removed all participant questions, the Response statistics tab displays the following message: `Participants' responses will be aggregated and made available here shortly.` This is inaccurate, as the final aggregation has already occurred and there are no questions enabled to be aggregated (though there were questions enabled and answered previously). {F48324638} **Acceptance criteria** Do not show the Response statistics tab when no participant questions are enabled. This should apply in all scenarios where there are no participant questions enabled, including 1. if there were never any questions enabled, 2. if questions were created, answered, and then removed, and 3. if questions were enabled, answered, aggregated after 90 days, and then removed.
    • Task
Per #serviceops request, all hosts being imaged and set up by DC Ops will have a sub-task for tracking serviceops implementation. This task is for #service-ops, and all questions regarding the status of the hosts should be directed to the parent task T363209. Once that task is resolved, this one can proceed.
    • Task
    This task will track the racking, setup, and OS installation of X == Hostname / Racking / Installation Details == **Hostnames:** kafka-main200[6789] and kafka-main2010 **Racking Proposal:** Row A (`A4` if possible), Row B (`B4` if possible), Row C (`C7` if possible), Row D (`D7` if possible), Row D (`D4` if possible) **Networking Setup:** 1 connection, 1G **VLAN:** Private, AAAA records: Y, Additional IP records (Cassandra)? No **Partitioning/Raid:** HW Raid: N, Partman recipe and/or desired Raid Level: raid10-4dev.cfg **OS Distro:** Bullseye **Sub-team Technical Contact:** @akosiaris == Per host setup checklist == ==== kafka-main2006: [] Receive in system on #procurement task T361362 & in Coupa [] Rack system with proposed racking plan (see above) & update Netbox (include all system info plus location, state of planned) [] Run the [[ https://netbox.wikimedia.org/extras/scripts/provision_server.ProvisionServerNetwork/ | Provision a server's network attributes ]] Netbox script - Note that you must run the DNS and Provision cookbook after completing this step [] **Immediately** run the `sre.dns.netbox` cookbook [] **Immediately** run the `sre.hosts.provision` cookbook [] Run the `sre.hardware.upgrade-firmware` cookbook [] Update the `operations/puppet` repo - this should include updates to preseed.yaml, and site.pp with roles defined by service group: https://wikitech.wikimedia.org/wiki/SRE/Dc-operations [] Run the `sre.hosts.reimage` cookbook ==== kafka-main2007: [] Receive in system on #procurement task T361362 & in Coupa [] Rack system with proposed racking plan (see above) & update Netbox (include all system info plus location, state of planned) [] Run the [[ https://netbox.wikimedia.org/extras/scripts/provision_server.ProvisionServerNetwork/ | Provision a server's network attributes ]] Netbox script - Note that you must run the DNS and Provision cookbook after completing this step [] **Immediately** run the `sre.dns.netbox` cookbook [] **Immediately** run the `sre.hosts.provision` cookbook [] Run the `sre.hardware.upgrade-firmware` cookbook [] Update the `operations/puppet` repo - this should include updates to preseed.yaml, and site.pp with roles defined by service group: https://wikitech.wikimedia.org/wiki/SRE/Dc-operations [] Run the `sre.hosts.reimage` cookbook ==== kafka-main2008: [] Receive in system on #procurement task T361362 & in Coupa [] Rack system with proposed racking plan (see above) & update Netbox (include all system info plus location, state of planned) [] Run the [[ https://netbox.wikimedia.org/extras/scripts/provision_server.ProvisionServerNetwork/ | Provision a server's network attributes ]] Netbox script - Note that you must run the DNS and Provision cookbook after completing this step [] **Immediately** run the `sre.dns.netbox` cookbook [] **Immediately** run the `sre.hosts.provision` cookbook [] Run the `sre.hardware.upgrade-firmware` cookbook [] Update the `operations/puppet` repo - this should include updates to preseed.yaml, and site.pp with roles defined by service group: https://wikitech.wikimedia.org/wiki/SRE/Dc-operations [] Run the `sre.hosts.reimage` cookbook ==== kafka-main2009: [] Receive in system on #procurement task T361362 & in Coupa [] Rack system with proposed racking plan (see above) & update Netbox (include all system info plus location, state of planned) [] Run the [[ https://netbox.wikimedia.org/extras/scripts/provision_server.ProvisionServerNetwork/ | Provision a server's network attributes ]] Netbox script - Note that you must run the 
DNS and Provision cookbook after completing this step [] **Immediately** run the `sre.dns.netbox` cookbook [] **Immediately** run the `sre.hosts.provision` cookbook [] Run the `sre.hardware.upgrade-firmware` cookbook [] Update the `operations/puppet` repo - this should include updates to preseed.yaml, and site.pp with roles defined by service group: https://wikitech.wikimedia.org/wiki/SRE/Dc-operations [] Run the `sre.hosts.reimage` cookbook ==== kafka-main2010: [] Receive in system on #procurement task T361362 & in Coupa [] Rack system with proposed racking plan (see above) & update Netbox (include all system info plus location, state of planned) [] Run the [[ https://netbox.wikimedia.org/extras/scripts/provision_server.ProvisionServerNetwork/ | Provision a server's network attributes ]] Netbox script - Note that you must run the DNS and Provision cookbook after completing this step [] **Immediately** run the `sre.dns.netbox` cookbook [] **Immediately** run the `sre.hosts.provision` cookbook [] Run the `sre.hardware.upgrade-firmware` cookbook [] Update the `operations/puppet` repo - this should include updates to preseed.yaml, and site.pp with roles defined by service group: https://wikitech.wikimedia.org/wiki/SRE/Dc-operations [] Run the `sre.hosts.reimage` cookbook
    • Task
This is happening both in LibUp and locally in `fresh`. Running the `npm run check-built-assets` script after bumping `eslint-config-wikimedia` to 0.27.0 and running the ESLint autofixer fails with no output: ``` I have no name!@4d3a8163e8d6:/CodeMirror$ npm run check-built-assets > check-built-assets > { git status src/ | grep "nothing to commit, working tree clean"; } && { echo 'CHECKING BUILD SOURCES ARE COMMITTED' && npm run build && git status resources/dist/ | grep "nothing to commit, working tree clean" || { npm run node-debug; false; }; } I have no name!@4d3a8163e8d6:/CodeMirror$ echo $? 1 ``` The script should ideally pass, or at least produce a clear error explaining why it does not pass. LibUp runs with Node 18.19 and npm 9.2; Fresh is on Node 18.17 and npm 9.6.7.
    • Task
All OWID graphs and datasets were exported to Commons in 2019, in Category:Our World in Data. Searching for relevant images is easier if the OWID categorization is preserved, as the number of images and datasets is otherwise too large to navigate. [X] Create all subcategories. [X] Move datasets to category [ ] Categorize all images
    • Task
    **Steps to replicate the issue** (include links if applicable): * Create the following module: skin.json ``` "testit": { "class": "MediaWiki\\ResourceLoader\\CodexModule", "styles": [ "resources/test.less" ] }, ``` test.less ``` div { position: absolute; left: 0; } ``` * Visit http://localhost:8888/w/load.php?lang=fa&modules=testit&only=styles **What happens?**: CSS is not flipped. ``` div{position:absolute;left:0}. ``` **What should have happened instead?**: CSS should be flipped. ``` div{position:absolute;right: 0}. ``` **Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.):
    • Task
    OWID is currently only available in English. Making it multilingual would be a great addition to the project.
    • Task
    **Steps to replicate the issue** (include links if applicable): * Call `https://api.wikimedia.org/service/lw/inference/v1/models/revertrisk-language-agnostic:predict` for the following languages - dga - zgh - fon - bbc **What happens?**: 400 Bad Request ``` { "error": "Unsupported lang: dga." } ``` **What should have happened instead?**: Score returned **Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia): **Other information** (browser name/version, screenshots, etc.):
    • Task
This task groups some of the tasks needed to improve the OurWorldInData popup gadget https://www.mediawiki.org/wiki/Template:OWID
    • Task
In this session we will learn how to deploy the Our World in Data pop-up template in your wiki, so data from OWID can be shown after the user gives permission. More information: https://www.mediawiki.org/wiki/Template:OWID
    • Task
**Acceptance Criteria:** * Set the config values needed in mediawiki-config to enable the CampaignEvents extension on Igbo Wikipedia Steps: [] Add `igwiki` to `wmgUseCampaignEvents` in wmf-config/InitialiseSettings.php [] Create the `event-organizer` user group, assign the standard user rights, make it editable by sysops. Remove those rights from the `user` group. (wmf-config/core-Permissions.php) ``` // groupOverrides '+igwiki' => [ 'event-organizer' => [ // T362675 'campaignevents-enable-registration' => true, 'campaignevents-organize-events' => true, 'campaignevents-email-participants' => true, ], 'user' => [ 'campaignevents-enable-registration' => false, // T362675 'campaignevents-organize-events' => false, // T362675 'campaignevents-email-participants' => false, // T362675 ], ], // wgAddGroups '+igwiki' => [ 'sysop' => [ 'event-organizer', // T362675 ], ], // wgRemoveGroups '+igwiki' => [ 'sysop' => [ 'event-organizer', // T362675 ], ], ``` [] Schedule a deployment window
    • Task
In order to use the Catalyst API from patchdemo, there needs to be a PHP API usable by patchdemo to launch MediaWiki instances. The scope of this task is to develop a set of PHP classes that could be called via a script like: ``` require_once 'catalyst.php'; $c = new Catalyst( /* host */ 'blah' ); $c->launch( /* commits */ [ 'abc123' ] ); ``` This can act as the starting point for integration with patchdemo. The code for this will live in our fork of patchdemo.
    • Task
NOTE: Blocks deployment to French and Italian Wikipedia and other projects (see impacted projects) == Background Various projects use infobox markup that does not follow the guidance in [[ https://www.mediawiki.org/wiki/Recommendations_for_mobile_friendly_articles_on_Wikimedia_wikis#Use_standardized_class_names_in_HTML_markup_for_components_in_templates_across_projects | Recommendation: Use standardized class names in HTML markup for components in templates across projects ]] As a result: * Infoboxes do not adapt to mobile screens consistently with other projects * Infoboxes are not relocated after the first paragraph * These projects have infoboxes that are not optimized for mobile or night mode. These projects are likely to see a delay in getting night mode. == Impacted projects * https://it.m.wikipedia.org/wiki/Acacia_dealbata?minervanightmode=1 * https://qu.m.wikipedia.org/wiki/Paris?minervanightmode=1 * https://pnb.m.wikipedia.org/wiki/%D9%BE%DB%8C%D8%B1%D8%B3?minervanightmode=1 * https://fr.m.wikipedia.org/wiki/Paris?minervanightmode=1 * https://oc.m.wikipedia.org/wiki/Par%C3%ADs_(Fran%C3%A7a)?minervanightmode=1 * https://pcd.m.wikipedia.org/wiki/Paris?minervanightmode=1 * https://nds.m.wikipedia.org/wiki/Paris?minervanightmode=1 * https://krc.m.wikipedia.org/wiki/%D0%9F%D0%B0%D1%80%D0%B8%D0%B6?minervanightmode=1 == User story As a user of a project using non-standard infobox markup, I would like infoboxes to display correctly in night mode. == Requirements [] All projects must mark their infoboxes with the infobox class to get optimizations. == Acceptance criteria On a mobile device: [] The infobox should appear below the lead paragraph [] The infobox should display correctly in night mode.
    • Task
I need to test against the Gerrit API and have noticed that the test instances have been deleted. I'd like a new instance so I can test, pretty please!
    • Task
    **Steps to replicate the issue**: * Login to the app * Change the primary language to `Test` wiki * Go to "Edits" and see the edit count * Click on "More" menu on the bottom and go to "Settings" to change the primary language to `English` wiki * Go back to "Edits" and wait for the screen to be refreshed automatically. * Pull down to refresh the screen to fetch the updated edit count **What happens?**: The edit count from manual refresh is different from the initial refresh. **What should have happened instead?**: The edit counts should be the same. **Software version**: 2.7.50482-r-2024-04-16
    • Task
    Add new GPay donor flow events to instrumentation docs and create instrumentation ticket for engineers. Use screens from [[ https://www.figma.com/file/yapMJ6cGCJpYWhLmnSYScJ/iOS-and-Android-%E2%86%92-Donor-experiences?type=design&node-id=1248-532&mode=design&t=eIonvbheJPX6dSye-0 | Figma ]] / Design ticket: T362698 Plan to report on specific metrics for GPay: - **Indicator**: 30% higher donation completion rate (Success rate %) for donations made through Native GPay as compared to web view donations (all payment types) - **Indicator**: 25% increase in donation completion rate (Success rate %) for all donations made in Android app during campaign as compared to previous year - **Curiosity**: What percentage of users use native payment GPay compared to GPay on Web? - **Curiosity**: How did the share of GPay donations as a portion of all Android donations change? (compare to last campaign, or before and after of Apps Menu donations) - **Guardrail**: Abandonment rate is no more than 25%
    • Task
When a participant registers and answers PII questions, and their answers are aggregated after 90 days (or sooner, if changing the TTL to test locally), and the participant cancels their registration and then re-registers, then the participant should not be shown the participant questions or be able to re-answer them. This can be reproduced on the event page, in Special:RegisterForEvent, and via the API (answers are not rejected). --- Chrome, local env, MediaWiki 1.43.0-alpha This is what is happening currently: {F48307443}
    • Task
AS a patchdemo user GIVEN I have signed into patchdemo AND configured a MediaWiki with the Thanks extension WHEN I hit Start AND wait five minutes THEN I see my MediaWiki running at the displayed URL GIVEN I have deployed a MediaWiki with patchdemo WHEN I curl the catalyst API: GET /api/environments THEN I see a JSON response with my environment
    • Task
In T362984 we are investigating some issues with GPUs on k8s, and something came up: do we need to deploy ROCm Debian packages on k8s nodes, or can we just rely on the libs shipped with PyTorch and similar? Rationale: On k8s nodes we deploy ROCm libs (~10G+) that shouldn't be used, since the only GPU workloads are run through containers. We have a device plugin that exposes the GPU device to the kubelet (which in turn exposes it to the containers that request it), but it relies only on the Linux kernel driver to recognize the device and nothing more. We should try to remove the ROCm libs from ml-staging2001 and test whether we can just use the ROCm libs bundled with PyTorch. This would simplify our lives a lot, since we would then need to update our internal APT repos only for the training infra (stat nodes etc., i.e. the bare-metal nodes that actually use those packages).
    • Task
    ``` elif right is not None and not self.has_right(right): raise UserRightsError('User "{}" does not have required ' 'user right "{}"' > .format(self.user(), right)) E pywikibot.exceptions.UserRightsError: User "Pywikibot-oauth" does not have required user right "upload" ``` Possibly the user is not logged in.
    • Task
T358610 made various changes that need to be reflected in phabricator-translations (i.e. it deleted chatlog (although I started on that in T318763) and changed some source strings, which need to be pushed to translatewiki and then retranslated and deployed)
    • Task
    **Steps to replicate the issue** (include links if applicable): * Create an event linked to a course on the P&E Dashboard * Delete the event (which will also unsync the course) * Restore the event manually with the following DB query: `UPDATE campaign_events SET event_deleted_at=NULL where event_id=YOUR_EVENT_ID` * Go to Special:EditEventRegistration for this event * Remove the tracking tool * Submit **What happens?**: Form submission fails with the following error: > The course $COURSE_NAME is not connected to this event, and therefore it cannot be synchronized. **What should have happened instead?**: The "unsynced course" error should be ignored, and the course should be removed. **Software version** (on `Special:Version` page; skip for WMF-hosted wikis like Wikipedia): master **Other information** (browser name/version, screenshots, etc.): While the reproduction steps described here require you to alter the data at the DB level, the course might potentially get unsynced through other means (e.g., some kind of failure on the dashboard side). In general, when a course is removed, it makes sense to ignore this error if the course is not currently linked with the event. See T362365 for an instance of this problem in production (following manual DB intervention). Similar to T358732, but for a slightly different scenario (course unsynced instead of deleted).
    • Task
Following up on the [[ https://wikitech.wikimedia.org/wiki/Incidents/2024-04-17_mw-on-k8s_eqiad_outage | 2024-04-17 mw-on-k8s eqiad outage ]], where the root cause was the number of DNS resolution requests from MediaWiki pods towards CoreDNS to resolve mw-mcrouter's location `mcrouter-main.mw-mcrouter.svc.cluster.local`: to mitigate the issue, we have added a [[ https://gerrit.wikimedia.org/r/c/operations/deployment-charts/+/1020768/2/helmfile.d/services/_mediawiki-common_/global.yaml | trailing dot ]] to speed up the FQDN resolution, with good results on codfw CoreDNS rps: * Green: rps with mw-on-k8s using `mcrouter-main.mw-mcrouter.svc.cluster.local` * At ~10:50 UTC we switched to `mcrouter-main.mw-mcrouter.svc.cluster.local.` * Yellow: baseline rps for CoreDNS {F48302605} While this looks alright for now, we are unsure how things may go in times of high traffic, or during deployments. For that reason, we would like to cache the IP that `mcrouter-main.mw-mcrouter.svc.cluster.local` resolves to in APCu, with a TTL of 1s.
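One possible shape for that cache, sketched below; the APCu key name and the fallback behaviour on resolution failure are assumptions, not the agreed implementation:

```
<?php
// Sketch: cache the resolved IP for the mcrouter service hostname in APCu
// with a 1 second TTL, so most requests skip the CoreDNS lookup entirely.
function getMcrouterIp(): string {
	$host = 'mcrouter-main.mw-mcrouter.svc.cluster.local.';
	$cacheKey = 'mcrouter-main-ip';

	$ip = apcu_fetch( $cacheKey, $found );
	if ( $found && is_string( $ip ) ) {
		return $ip;
	}

	$ip = gethostbyname( $host );
	// gethostbyname() returns the hostname unchanged on failure;
	// only cache an actual resolution result.
	if ( $ip !== $host ) {
		apcu_store( $cacheKey, $ip, 1 ); // TTL of 1s, per the proposal above
	}
	return $ip;
}
```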
    • Task
    In WikimediaEvents, `includes/BlockUtils.php` and `includes/BlockMetrics/BlockMetricsHooks.php` use the deprecated `User::getGlobalBlock()` method.
    • Task
    **Feature summary**: I would like to request for the [[ https://dumps.wikimedia.org/backup-index.html | Wikimedia Dumps ]] infrastructure to compute and publish a SHA-256 digest for each dumped data file. **Use case(s)**: I'm working on a [[ https://toolsadmin.wikimedia.org/tools/id/timestamper | Toolforge service ]] to perform automated, cryptographically-sound timestamping of Wikipedia snapshots. I'm using the opentimestamps.org service, which by default uses the SHA-256 hash. In order to get the SHA-256 digest of database files for the timestamp, my project currently needs to apply sha256sum to compute the digest of each file (example [[ https://timestamper.toolforge.org/data/wikimedia_digests_20240420.json | here ]]). This computation is somewhat expensive for larger dump files, though not prohibitive. Currently the xml data dumps provide only the MD5 and SHA-1 digests ([[ https://dumps.wikimedia.org/fawiki/20240420/dumpstatus.json | here ]] is an example). Both of these hash functions are obsolete because they are cryptographically broken. **Benefits**: It would be helpful if the Dumps service provided SHA-256 digests in addition to MD5 and SHA-1, because SHA-256 is cryptographically sound. Not only would it reduce the computing resources needed by my timestamping project, but it would also be useful to anyone else looking to verify the integrity of the dump files.
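For reference, this is the per-file work consumers currently have to do themselves and that the dumps pipeline could instead do once at publish time. A sketch; the file name is illustrative:

```
<?php
// Sketch: compute the SHA-256 digest of a dump file, as a verifying consumer
// (or the timestamping tool described above) currently has to do per file.
$path = 'fawiki-20240420-pages-articles.xml.bz2'; // illustrative file name
$sha256 = hash_file( 'sha256', $path );
if ( $sha256 === false ) {
	fwrite( STDERR, "Could not hash $path\n" );
	exit( 1 );
}
echo "$sha256  $path\n"; // same format as sha256sum output
```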
    • Task
As part of MinT for Wikipedia Readers MVP (T359072), this ticket proposes to provide an entry point from the mobile language selector on Wikipedia. In this way, users interested in reading an article in another language will have a way to access a machine-translated version in case the content is not available in the language they are searching for. Currently, the language selector already provides access to Content Translation for users to create a new translation. [Based on data](https://htmlpreview.github.io/?https://github.com/wikimedia-research/content-translation-funnel-analysis/blob/main/T328913_sx_mobile_entry_points.html), the language selector represents the most common way for users to access Content Translation on mobile. This ticket proposes to make adjustments to the mobile language selector, in order to present a new option to read a machine translation next to the option to create a translation. Recent research suggests that this contrast may help users understand the different possibilities. You can [check the prototype](https://www.figma.com/proto/RTBmXJ4ZGmhPoNP1nlnm5C/MinT-for-Readers-MVP?page-id=562%3A8207&type=design&node-id=562-13405&viewport=467%2C-1681%2C0.72&t=Lc3sTXvQsbSkMiIr-8&scaling=scale-down&starting-point-node-id=562%3A13405&hotspot-hints=0&hide-ui=1) and the video below: {F48594653} Below is more detail on the current situation and the aspects we need to change to support the proposed solution. # Current {F48298930, size=full} Currently on mobile, users have access to the language selector on Wikipedia articles. The language selector provides different mechanisms to find a language: - **Search** allows users to type the name of a language. When the search fails, the user is shown a "Language not available" view, which can happen in two cases: - The language is identified but content is not available. In this case, the user is invited to make a translation, and a set of actions is shown for users to start the translation of the current article into the languages matching the search query or any other language. - The search query is not identified as a language (e.g., random text); in this case a generic error message is shown. - **Favorite languages** is a section where the user's frequently selected languages are surfaced for easy access. As part of Section Translation entry points, an element also shows when the content is missing in some of the user's frequently used languages. Tapping on this links users directly to Section Translation. # Proposed {F48594377, size=full} This ticket proposes to make the following changes: - **Update the "Language not available" view** to include two different options: "Write a new translation" and "Read an automatic translation". This will only show when the search query is identified as a language. Otherwise the generic message will be shown, as it is now. - **Adjust the navigation from the missing favorite languages.** When missing languages are surfaced, tapping on them will have the same effect as searching for the first of them. That is, tapping on the "Missing in Igbo and Hausa" message will result in "Igbo" being typed in the search bar and, as a result, the "Language not available" view with the options to "Write a new translation" and "Read an automatic translation" will be shown. 
## Design details A [Codex Card component](https://doc.wikimedia.org/codex/latest/components/demos/card.html) is used to present the options: - **Write a new translation** leads to Content Translation with the information pre-filled for the article to translate (current article from which the user accessed the language selector), source language (the content language of the article to translate), and target language (first match for the user search term). - Title and visual element: "Write a new translation" with "Add" icon. - Description: "Step-by-step translation process to create a new page in your language." - **Read an automatic translation** leads to MinT for Wikipedia Readers Confirm step (T359512) with the information pre-filled in the same way as in the previous case. - Title and visual element: "Read an automatic translation" with "Robot" icon. - Description: "Automatic preview of contents in other languages for you to review and fix." More details [in Figma](https://www.figma.com/file/RTBmXJ4ZGmhPoNP1nlnm5C/MinT-for-Readers-MVP?type=design&node-id=581%3A6000&mode=design&t=9Xmv0vDaKqntxEiG-1)
    • Task
    **Kickoff meeting minutes** * https://docs.google.com/document/d/1N9Dkz9yjYZGqtur51FEMPedZqmYQP_s9ZnoaoRDgbRo/edit **Current draft questionnaire** * https://docs.google.com/document/d/1e1y2cBCHEwtYZI_fCg9wY6Z-Xywex7tvP-YgQ88pDWM/edit **General deliverables** * A basic, high-level form or questionnaire and associated risk calculations * A set of guidelines/best practices that define that #security-team's rapid risk assessment process **Resources** * [[ https://www.rra.rocks/ | rra.rocks ]] (based upon Mozilla's RRA program) * [[ https://owasp.org/www-community/OWASP_Risk_Rating_Methodology | OWASP risk-rating methodology ]] * [[ https://owasp.org/www-project-web-security-testing-guide/assets/archive/OWASP_Web_Application_Penetration_Checklist_v1_1.pdf | OWASP pen-testing checklist ]]