Description
This is a tracking bug for shared efforts to investigate a surprising difference in latency between production and local deployments.
I've uploaded here the aligned logs for a single Wikifunctions request:
This request corresponds to the "eine schöne Katze" ("a beautiful cat") tester for Z20612.
The logs with messages like "calling Evaluator in orchestrator" and "finished calling Evaluator in orchestrator" correspond to REST requests made from the orchestrator to the evaluator. For this request, three such evaluator calls were made. The logs show that the first two of these requests took ~2.8 and ~3.03 seconds. Notably, these two requests send large (>100 KB) objects (in this case, Wikidata Lexemes) as part of the REST request.
When I run this same function in a local docker-compose deployment, these requests take ~800 ms each, which is still not great, but better.
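As a rough sanity check on whether payload size alone could explain these durations, the implied transfer rate is easy to compute (the figures come from the log timings above; 100 KB is a lower bound on the Lexeme payload size):

```python
# Back-of-the-envelope: if the whole observed duration were spent moving
# the payload, what transfer rate would that imply? (Durations from the
# logs above; 100 KB is a lower bound on the Lexeme payload size.)
payload_kb = 100
for elapsed_s in (2.8, 3.03):
    print(f"{elapsed_s} s -> {payload_kb / elapsed_s:.1f} KB/s implied")
```

Tens of KB/s would be far below what even a local Docker bridge network delivers, so raw byte transfer alone seems unlikely to account for the full duration; (de)serialization or validation of the large objects may be a likelier place to look.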
Desired behavior/Acceptance criteria (returned value, expected error, performance expectations, etc.)
- we should understand the causes of this latency discrepancy
Reproduction Instructions
- follow Docker installation instructions for MW core (DEVELOPERS.md) and WikiLambda (README.md)
- in core, create docker-compose.override.yaml as follows:

```
version: '3.7'
services:
  mediawiki:
    extra_hosts:
      - "host.docker.internal:host-gateway"
  function-orchestrator:
    image: docker-registry.wikimedia.org/repos/abstract-wiki/wikifunctions/function-orchestrator:latest
    environment:
      WIKI_API_URL: http://wikifunctions.org/w/api.php
      WIKIDATA_API_URL: https://wikidata.org
      ORCHESTRATOR_CONFIG: |
        {
          "doValidate": true,
          "addNestedMetadata": true,
          "generateFunctionsMetrics": true,
          "useWikidata": true,
          "evaluatorConfigs": [
            {
              "programmingLanguages": ["python-3-9", "python-3-8", "python-3-7", "python-3", "python"],
              "evaluatorUri": "http://core-python3-evaluator-1:6927/1/v1/evaluate/",
              "evaluatorWs": "",
              "useReentrance": false,
              "allowCustomDerializers": true
            },
            {
              "programmingLanguages": ["javascript-es2020", "javascript-es2019", "javascript-es2018", "javascript-es2017", "javascript-es2016", "javascript-es2015", "javascript"],
              "evaluatorUri": "http://core-javascript-evaluator-1:6927/1/v1/evaluate/",
              "evaluatorWs": "",
              "useReentrance": false,
              "allowCustomDerializers": true
            }
          ]
        }
    ports:
      - 6254:6254
      - 9100:9100
  python3-evaluator:
    image: docker-registry.wikimedia.org/repos/abstract-wiki/wikifunctions/function-evaluator/wasm-python3-all:latest
    ports:
      - 6928:6927
  javascript-evaluator:
    image: docker-registry.wikimedia.org/repos/abstract-wiki/wikifunctions/function-evaluator/wasm-javascript-all:latest
    ports:
      - 6926:6926
```

- run:

```
curl -X POST http://0.0.0.0:6254/1/v1/evaluate -H "Content-type: application/json" -d '{"zobject":{"Z1K1": "Z7", "Z7K1": "Z20612", "Z20612K1": {"Z1K1": "Z7", "Z7K1": "Z6825", "Z6825K1": {"Z1K1": "Z6095", "Z6095K1": "L816418"}}, "Z20612K2": {"Z1K1": "Z7", "Z7K1": "Z6825", "Z6825K1": {"Z1K1": "Z6095", "Z6095K1": "L495180"}}}}'
```
- latency numbers can be recovered with docker logs or by proxying requests to the evaluator service (I'd be interested to hear about other tools for inspecting this)
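For repeated measurements without eyeballing docker logs, a small timing wrapper is another option. This is only a sketch: it assumes the local orchestrator endpoint and the same ZObject as the curl step above, and the try/except simply keeps it from crashing when the local stack isn't up:

```python
import json
import time
import urllib.request

def timed(fn, *args, **kwargs):
    """Run fn and return (result, elapsed milliseconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, (time.perf_counter() - start) * 1000.0

def post_json(url, body, timeout=60):
    """POST a JSON body and return the decoded JSON response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.load(resp)

# Same ZObject as the curl reproduction step above.
zobject = {
    "Z1K1": "Z7", "Z7K1": "Z20612",
    "Z20612K1": {"Z1K1": "Z7", "Z7K1": "Z6825",
                 "Z6825K1": {"Z1K1": "Z6095", "Z6095K1": "L816418"}},
    "Z20612K2": {"Z1K1": "Z7", "Z7K1": "Z6825",
                 "Z6825K1": {"Z1K1": "Z6095", "Z6095K1": "L495180"}},
}

try:
    _, ms = timed(post_json, "http://0.0.0.0:6254/1/v1/evaluate",
                  {"zobject": zobject})
    print(f"orchestrator round trip: {ms:.0f} ms")
except OSError as exc:  # local orchestrator not running
    print(f"orchestrator not reachable: {exc}")
```

Running this in a loop gives a distribution of round-trip times rather than a single sample, which should make the local-vs-production comparison less noisy.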
Completion checklist
- Before closing this task, review one by one the checklist available here: https://www.mediawiki.org/wiki/Abstract_Wikipedia_team/Definition_of_Done#Back-end_Task/Bug_completion_checklist
