Add something to the Elasticsearch indexes, presumably?
|Status|Assignee|Task|
|Open|None|T44725 Multimedia file format support (tracking)|
|Declined|None|T107410 Wiki 3d warehouse|
|Open|None|T133526 Epic saga: immersive hypermedia (Myst for Wikipedia)|
|Resolved|TheDJ|T3790 Allow uploading of 3D files to Wikimedia Commons|
|Resolved|dr0ptp4kt|T157348 Have search recognise STL files as a new kind of media file ('type:3d' or whatever)|
Dear wonderful DBAs, could one of you briefly look at https://gerrit.wikimedia.org/r/#/c/336454/ (especially https://gerrit.wikimedia.org/r/#/c/336454/6/maintenance/archives/patch-add-3d.sql ) to see if it's OK from your POV? If so, we'll merge and then create a task for a production DB adjustment for this. Thanks!
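For readers without Gerrit access, this is a hedged sketch of the kind of change a patch like patch-add-3d.sql makes: widening the media-type ENUM on the image table. The exact column and value lists below are illustrative, not copied from the patch; see the Gerrit change for the authoritative version.

```sql
-- Sketch only: extend img_media_type with a '3D' value.
-- /*_*/ is MediaWiki's table-prefix placeholder; the value list is assumed.
ALTER TABLE /*_*/image
  MODIFY img_media_type ENUM('UNKNOWN','BITMAP','DRAWING','AUDIO','VIDEO',
                             'MULTIMEDIA','OFFICE','TEXT','EXECUTABLE',
                             'ARCHIVE','3D') DEFAULT NULL;
```

On a table the size of Commons' image table, a MODIFY that rebuilds the ENUM is exactly the kind of statement that can run for a long time, which is why the DBAs weigh in below.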
I would have to test it to confirm that I am right at the CR before I can provide a more accurate ETA. It will not be 100% trivial in any case because of Commons' heavy usage of the image table, but we will see how best to do it.
Merge. Slow or not, it has to be applied; the only difference is whether it will take hours or weeks. As long as no code depends on the column existing, there is no problem. Just follow https://www.mediawiki.org/wiki/Development_policy#Database_patches and https://wikitech.wikimedia.org/wiki/Schema_changes#Workflow_of_a_schema_change
@jcrespo, since this change is only really necessary on the beta cluster right now, do we need to file a ticket for that specifically, or is it a separate process that we should handle ourselves?
Also, it looks like the wikis we need to apply this to (i.e. 'wikis where uploads are enabled') may be difficult to list explicitly based on the current WMF config - is there a better way to specify that?
I have no power over there... on beta. Beta is supposed to be used by people to test their own changes before anything reaches production. In cases where beta testing is not adequate, testwiki and its sister projects (on production) can be used, and those are all normally very easy to apply a schema change to.
The workflow I mentioned is for WMF production deployment (that means enwiki and the other ~900 wikis with well-known domains).
"may be difficult to list explicitly based on the current WMF config" — if you do not know the list, I will not be able to apply it :-) Looking at your change: given that it alters tables.sql in core, that would normally mean you reference all.dblist, unless something very specific is needed (all minus test, because testing has already been done there; only enwiki, because of a bug that only applies to wikis older than 17 years; etc.).
From the guide above:

> Note that this means your schema change should be optional in code - for wikimedia deployments, it is expected that every wiki with the relevant database table(s) will have the schema change applied to them. If you need different schema for different wikis, then apply the change using an extension and creating new tables dependent on that extension.
That means that if the code is deployed, a core schema change should be requested for all wikis, but the code should not expect the change to be available from the beginning: a schema change takes, in the best case, one week to propagate; in the worst case, several years.
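The "optional in code" requirement above can be sketched as follows. This is a language-agnostic illustration (MediaWiki itself is PHP), and the function name, the flag, and the pre-change value set are all hypothetical, not MediaWiki APIs: code that ships ahead of the schema change must degrade gracefully on wikis where the new ENUM value does not exist yet.

```python
# Hypothetical sketch: media types the pre-change ENUM already accepts.
KNOWN_MEDIA_TYPES = {"UNKNOWN", "BITMAP", "DRAWING", "AUDIO", "VIDEO",
                     "MULTIMEDIA", "OFFICE", "TEXT", "EXECUTABLE", "ARCHIVE"}

def media_type_for_storage(detected: str, db_supports_3d: bool) -> str:
    """Return a value that is safe to write to img_media_type.

    `db_supports_3d` stands in for whatever feature check the deployment
    uses to know whether the schema change has reached this wiki.
    """
    if detected == "3D" and not db_supports_3d:
        # Schema change not yet applied here: degrade to UNKNOWN instead of
        # writing a value the ENUM would reject or silently truncate.
        return "UNKNOWN"
    if detected == "3D" or detected in KNOWN_MEDIA_TYPES:
        return detected
    return "UNKNOWN"

print(media_type_for_storage("3D", db_supports_3d=False))  # UNKNOWN
print(media_type_for_storage("3D", db_supports_3d=True))   # 3D
```

Once the change has propagated everywhere, the fallback branch (and the flag) can be removed in a follow-up.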
This needs to update the other DB backends as well, not just MySQL.
This patch caused the Travis CI build for wikimedia/mediawiki to fail on Postgres: the ENUM there was left incomplete, which broke the Upload and Search unit tests.
Re-opening until "the build" is no longer broken. The patch for this task added a MySQL schema change to MediaWiki without updating the other schemas, which creates a fatal error for all non-MySQL installations (including SQLite and Postgres).
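For the Postgres side, a native ENUM can be extended in place since PostgreSQL 9.1. The type name below is assumed for illustration; MediaWiki's actual Postgres schema may model media types differently, so check tables.sql for Postgres before relying on this.

```sql
-- Hypothetical Postgres counterpart to the MySQL patch (type name assumed):
ALTER TYPE media_type ADD VALUE '3D';
```

SQLite has no ENUM type at all (MediaWiki maps ENUM columns to plain text-like columns there), so the SQLite fix is usually limited to regenerating the schema definition rather than any DDL on existing databases.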