Yesterday I translated the https://en.wikipedia.org/wiki/Calcium_lactate article into French, which made me realize that the estimated translation time is probably far too low. The interface estimated a bit less than half an hour (22 minutes, if I remember correctly), and it actually took me several hours (more than 5). Granted, I'm an inexperienced editor, but it made me curious about how the estimate is computed.
My understanding of the code is that the translation time is estimated in https://gerrit.wikimedia.org/g/mediawiki/extensions/ContentTranslation/+/master/app/src/composables/useTranslationSize.js under the assumptions of 5 bytes per word and 200 words per minute.
This yields 12,000 words per hour, which is very unrealistic. For reference, https://en.wikipedia.org/wiki/Postediting (which feels like a reasonable approximation of ML-assisted content translation in the extension) cites estimates of about 1,000 words per hour for light post-editing, which is a) an order of magnitude lower and b) probably still an underestimate, since ContentTranslation usage is meant to go beyond "light" post-editing (and the tool is probably used by non-professional translators).
Such a discrepancy feels like it would discourage new editors/translators and reduce the number of returning users.
Update from team discussion on 2025-10-29: we want to revise the estimation algorithm based on historical data about time spent translating. We would like a data analyst to take this on, or at least to advise us on how to do it ourselves.