We want to see whether Lift Wing can support a fastText model. Let's try loading the outlink topic model as a custom KFServing inference service.
Serving code for the current Cloud VPS API (Python Flask), which scores an article on demand via the MediaWiki APIs and the loaded model binary: https://github.com/wikimedia/research-api-endpoint-template/blob/master/model/wsgi.py
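For context, the scoring step itself is small: fastText's `predict()` returns parallel label/score sequences where each label carries a `__label__` prefix, and the serving code just has to reshape that into topic/score records. A minimal sketch of that post-processing (the helper name, response shape, and threshold here are illustrative, not taken from the repo):

```python
from typing import Dict, List, Sequence, Tuple


def format_topics(labels: Tuple[str, ...],
                  scores: Sequence[float],
                  threshold: float = 0.5) -> List[Dict]:
    """Convert fastText predict() output into topic/score records.

    `labels` and `scores` are parallel sequences as returned by
    fastText's model.predict(text, k=...); labels are prefixed with
    '__label__'. Only topics at or above `threshold` are kept.
    """
    return [
        {"topic": label.replace("__label__", ""), "score": float(score)}
        for label, score in zip(labels, scores)
        if score >= threshold
    ]
```

In the Flask app, a helper like this would sit between the `model.predict()` call and the JSON response.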
Current model binary: https://analytics.wikimedia.org/published/datasets/one-off/isaacj/articletopic/model_alloutlinks_202012.bin
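A rough sketch of what the custom KFServing service could look like, assuming the KFServing Python SDK (`kfserving.KFModel` / `kfserving.KFServer`) and the `fasttext` package. The class name, the request shape (`{"outlinks": [...]}` with pre-resolved outlink titles), and the model path are all placeholder assumptions; in practice the outlinks would be fetched from the MediaWiki API, as the Flask code above does:

```python
from typing import Dict

import fasttext
import kfserving


class OutlinkTopicModel(kfserving.KFModel):
    """Custom KFServing predictor wrapping the fastText outlink model."""

    def __init__(self, name: str, model_path: str):
        super().__init__(name)
        self.model_path = model_path
        self.ready = False

    def load(self):
        # fastText loads the whole .bin (vectors + classifier) into memory,
        # so the pod needs enough RAM for the full model binary.
        self.model = fasttext.load_model(self.model_path)
        self.ready = True

    def predict(self, request: Dict) -> Dict:
        # Assumed request shape: outlink titles already resolved upstream.
        features = " ".join(request["outlinks"])
        labels, scores = self.model.predict(features, k=10)
        return {
            "topics": [
                {"topic": label.replace("__label__", ""), "score": float(score)}
                for label, score in zip(labels, scores)
            ]
        }


if __name__ == "__main__":
    model = OutlinkTopicModel(
        "outlink-topic-model",
        "/mnt/models/model_alloutlinks_202012.bin",  # mount path is an assumption
    )
    model.load()
    kfserving.KFServer(workers=1).start([model])
```

This is only a starting point for the experiment; memory footprint at load time and the storage-initializer path for pulling the .bin are the likely first things to verify.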