**Participants**, please read/think about/research these, ahead of time:
Session description:
* Next Steps for Languages and Cross Project Collaboration
Goals:
* Identify the Language infrastructure capabilities and opportunities for the Wikimedia movement.
* Identify projects and tasks to meet the strategic goals of Wikimedia in this area.
* See also [[https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit/2018/Purpose_and_Results|DevSummit purpose and results]] for guidance
Related position statements:
* https://wikifarm.wmflabs.org/devsummit/index.php/Session:5
Structure (rough draft):
* The session will be divided into two parts: the first half for Languages infrastructure, and the second half for Cross Project Collaboration. Some discussion may overlap between the two topics.
* The session will open with an introduction to the topic, how it relates to the WMF Vision, and which problems we would like to discuss.
* Identify the key challenges in meeting the goals set by the WMF Vision for Languages.
* Propose ways to support future needs: concrete solutions and technologies to explore.
* The session will NOT be used to discuss any of the topics in-depth. Concrete engineering questions are not in scope.
Related background reading:
* Languages, Machine Translation:
** The [[ https://www.mediawiki.org/wiki/Wikimedia_Language_engineering | Language Team ]], which was responsible for language-related projects, no longer exists as a separate team; it was merged with the Collaboration team and is now part of the [[ https://www.mediawiki.org/wiki/Global_Collaboration | Global Collaboration team ]].
** The Language infrastructure projects that the former Language team was responsible for are [[ https://www.mediawiki.org/wiki/Wikimedia_Language_engineering#Projects | listed here ]].
** The Content Translation (CX) system for translating articles between languages: https://mediawiki.org/wiki/Content_translation
** The machine translation service used by the system is a public API provided by cxserver: https://www.mediawiki.org/wiki/Content_translation/cxserver. Although public, it is currently used only by Content Translation.
** Currently we have [[ https://www.mediawiki.org/wiki/Content_translation/Machine_Translation/Apertium | Apertium ]], [[ https://www.mediawiki.org/wiki/Content_translation/Machine_Translation/Yandex | Yandex ]], and [[ https://www.mediawiki.org/wiki/Content_translation/Machine_Translation/Youdao | Youdao ]] machine translation backends. Apertium is hosted by Wikimedia; the others are accessed through their APIs using special API keys. Legal contracts exist between the WMF and these providers. Discussions about using Google MT are in their final stages.
* Cross project collaboration
** [[ https://en.wikipedia.org/wiki/Wikipedia_talk:Manual_of_Style#RfC:_Linking_to_wikidata | Recent discussion about Wikidata usage on English Wikipedia ]]
** [[ https://grafana.wikimedia.org/dashboard/db/wikidata-entity-usage?refresh=5m&orgId=1 | Usage numbers of Wikidata's data across our projects ]]
** [[ http://wdcm.wmflabs.org | Wikidata Concept Monitor showing how data from Wikidata is used across our projects ]]
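The public cxserver API mentioned above is the piece that tools beyond Content Translation could potentially reuse. As a rough illustration only, a client might address the machine-translation endpoint as in the Python sketch below; the host, endpoint path, and provider name are assumptions for illustration, not confirmed API details (see the cxserver page for the actual interface).

```python
# Hypothetical sketch of addressing the public cxserver MT API.
# The URL shape below is an assumption; consult
# https://www.mediawiki.org/wiki/Content_translation/cxserver for the real API.
CXSERVER = "https://cxserver.wikimedia.org"  # assumed public host

def translate_url(source_lang: str, target_lang: str, provider: str) -> str:
    """Build the (assumed) translate endpoint URL for a language pair and MT backend."""
    return f"{CXSERVER}/v2/translate/{source_lang}/{target_lang}/{provider}"

# Example: Spanish -> English via the Apertium backend (provider name assumed).
print(translate_url("es", "en", "Apertium"))
# https://cxserver.wikimedia.org/v2/translate/es/en/Apertium
```

A real client would POST the source HTML to such a URL and receive translated HTML back, but the exact request and response formats should be taken from the cxserver documentation rather than this sketch.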
**Session notes**:
* https://etherpad.wikimedia.org/p/devsummit18-languagesandcollaboration
----
**Topic Leaders** (@Lydia_Pintscher @santhosh)
This is one of the 8 [[https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit/2018 |Wikimedia Developer Summit 2018]] topics.
----
Post-event Summary:
* ...
Action items:
* ...