#Research would like to load recommendations into a production MySQL database. @bmansurov would like to pair with someone from #operations on this task:
- Data to be imported is at stat1007:/home/bmansurov/tp9/article-recommender-deploy/predictions-06032018-11302018. This data was generated using the Wikidata dumps of 10/01/2018 and page views through 11/30/2018. Language pairs included are en-es, en-fa, and ru-uz.
- The import script is [[ https://gerrit.wikimedia.org/r/#/admin/projects/research/article-recommender/deploy | here ]].
- [[ https://gerrit.wikimedia.org/r/#/c/mediawiki/services/recommendation-api/+/450601/4/scripts/article-recommendation.sql | Database schema ]]
- [[ https://gerrit.wikimedia.org/r/#/c/mediawiki/services/recommendation-api/+/450601/4/scripts/article-recommendation-data-importer.py | Script that inserts data into the database ]]
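For reference, a minimal sketch of how the importer might batch rows into multi-row INSERT statements. The table and column names (`article_recommendation`, `wikidata_id`, `source_id`, `target_id`, `score`) are placeholders here; the real schema is in the linked article-recommendation.sql change.

```python
# Hypothetical sketch: batch prediction rows into parameterized
# multi-row INSERT statements. Table/column names are assumptions;
# see article-recommendation.sql in the linked change for the real schema.

def batch_insert_sql(rows, table="article_recommendation", batch_size=1000):
    """Yield (statement, flattened_values) pairs for each batch.

    `rows` is an iterable of (wikidata_id, source_id, target_id, score)
    tuples. Parameterized placeholders let the MySQL driver handle
    escaping instead of building literal SQL values.
    """
    cols = "(wikidata_id, source_id, target_id, score)"
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield _statement(table, cols, batch), [v for r in batch for v in r]
            batch = []
    if batch:
        yield _statement(table, cols, batch), [v for r in batch for v in r]

def _statement(table, cols, batch):
    # One "(%s, %s, %s, %s)" group per row in the batch.
    placeholders = ", ".join(["(%s, %s, %s, %s)"] * len(batch))
    return f"INSERT INTO {table} {cols} VALUES {placeholders}"
```

Each yielded pair can be passed to a DB-API cursor's `execute()`; batching keeps the number of round trips to the master low.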
- Host to connect to: m2-master.eqiad.wmnet
- Users: recommendationapi and recommendationapiservice (read only)
- Credentials: in private puppet (class passwords::recommendationapi::mysql, $recommendationapi_pass & $recommendationapiservice_pass)
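A small sketch of composing the client invocation from the connection details above. The database name `recommendationapi` is an assumption; the password is expected interactively via `-p` (the real credentials live in private puppet as noted above, and must never appear on a command line).

```python
# Sketch: build the mysql CLI command for the hosts/users in this task.
# The database name is an assumption; -p prompts for the password
# interactively rather than embedding it in the command.
import shlex

HOST = "m2-master.eqiad.wmnet"
USERS = {
    "read_write": "recommendationapi",
    "read_only": "recommendationapiservice",
}

def mysql_command(user_role, database="recommendationapi"):
    """Return a shell-safe mysql invocation for the given role."""
    user = USERS[user_role]
    args = ["mysql", "-h", HOST, "-u", user, "-p", database]
    return " ".join(shlex.quote(a) for a in args)
```

For example, `mysql_command("read_only")` yields a command connecting as recommendationapiservice for read-only queries.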
- [ ] Create a Puppet script that loads data from a git repository.
- [ ] Set up the script so that a configuration change will load the data automatically.
- [ ] Make sure the data is versioned. We'll periodically remove old versions of the data once we've confirmed the new version is better than the old ones.
- [ ] Actually load 'en-es', 'ru-uz', and 'en-fa' into the database.
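The versioning item above could be sketched as follows, assuming dataset directories keep the naming seen in this task (predictions-&lt;start&gt;-&lt;end&gt; with MMDDYYYY dates, e.g. predictions-06032018-11302018): the loader always picks the directory with the newest end date, and older directories stay around until confirmed obsolete.

```python
# Sketch of the versioning scheme: pick the dataset directory with the
# newest end date. Directory naming (predictions-MMDDYYYY-MMDDYYYY) is
# taken from the path in this task; the selection logic is illustrative.
import re
from datetime import datetime

PATTERN = re.compile(r"^predictions-(\d{8})-(\d{8})$")

def latest_dataset(dirnames):
    """Return the directory name with the newest end date, or None."""
    def end_date(name):
        m = PATTERN.match(name)
        return datetime.strptime(m.group(2), "%m%d%Y") if m else None
    dated = [(end_date(n), n) for n in dirnames if end_date(n)]
    return max(dated)[1] if dated else None
```

A Puppet-driven deploy could then load `latest_dataset(...)` on a configuration change, while a separate cleanup step removes superseded directories.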