
Lua error: Not enough memory due to several templates in pages
Closed, Invalid · Public · BUG REPORT


A few weeks ago, some users reported an issue with the article Among Us at the Spanish Wikipedia: specifically, memory exhaustion when using the template Ficha de videojuego, which takes data from Wikidata using the function Argumentos.obtenerTablaDeArgumentos(), returning the Wikidata data as a Lua table.

I believed the large memory usage was caused by the large amount of data at Wikidata. However, we discovered that the issue is actually caused by the repeated usage of some templates (see Pandemia de COVID-19 en Costa Rica), some of which we tried to optimise. We're still researching the causes of the memory exhaustion while trying to optimise the templates.

So, in order to avoid further issues in pages, is it possible to increase the memory available to Lua from 50 MB to, for example, 100 MB?

Event Timeline

Amitie_10g renamed this task from Lua error: Not enough memory due to large amount of data from Wikidata to Lua error: Not enough memory due to several templates in pages. May 31 2022, 11:58 AM
Amitie_10g reopened this task as Open.
Amitie_10g updated the task description. (Show Details)

@Amitie_10g: Thanks for reporting this. Where exactly (full text) can the issue be seen? What are clear steps to confirm that this is due to "repeated usage of some templates"? For future reference, please use the bug report form (linked from the top of the task creation page) to create a bug report, and fill in the sections in the template. Thanks!

From reference 34 and on you can see that Lua complains about not having enough memory. We're wondering if it's us who run into a limitation of the language, or if an update broke something that we had been using. There are many more articles having this issue. Thanks for looking into it.

Looking at the HTML source code, it says Lua memory usage: 52428762/52428800 bytes

That one only has Lua memory usage: 6030563/52428800 bytes

T308893: Increase $wgMaxArticleSize to 4MB for ruwikisource, T275319: Raise limit of $wgMaxArticleSize for Hebrew Wikisource come to my mind regarding recent "raise some limits" discussions (though those are about the max article size instead).
See also T123844#6595413 or T165935#3281460 though - there's always going to be some limit.

Looking further, in Among Us the memory exhaustion happens when invoking Módulo:Lanzamientovj 🠒 Módulo:Wikidata/Fecha 🠒 Módulo:Fechas; specifically, at lines 76–93, the repeated invocation of Módulo:Wikidata/Fecha.

for k, v in pairs(Obj) do
    Valor    = require('Módulo:Wikidata/Fecha').FormateaFechaHora(elementoTabla(v, 'mainsnak', 'datavalue', 'value'), {['enlace']='no'})
    Regiones = elementoTabla(v, 'qualifiers', 'P291')
    if Regiones then
        for kk, vv in pairs(Regiones) do
            Region = elementoTabla(vv, 'datavalue', 'value', 'id')
            Valores[Region] = Valor
        end
        Valores['DESC'] = Valor
    end
end

Some users complained about the "unnecessary" nested loops, and I think so too. However, even after attempts at optimisation, the memory is still not enough, and I still see serious inefficiencies. The problem is that this happens only in a few pages, so I believe there are enough resources to increase the limit.
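As a sketch of the kind of optimisation discussed above (variable names taken from the quoted snippet; it assumes the same elementoTabla helper and surrounding Obj/Valores tables), hoisting the require() out of the loop and declaring the loop variables local keeps them out of the global table:

```lua
-- Sketch only: hoist the module lookup out of the loop and use locals.
-- require() results are cached by Scribunto, so this mostly saves a
-- repeated lookup per iteration; the bulk of the memory cost is in the
-- data FormateaFechaHora processes, not in the lookup itself.
local FormateaFechaHora = require('Módulo:Wikidata/Fecha').FormateaFechaHora

for k, v in pairs(Obj) do
    local Valor    = FormateaFechaHora(elementoTabla(v, 'mainsnak', 'datavalue', 'value'), {['enlace'] = 'no'})
    local Regiones = elementoTabla(v, 'qualifiers', 'P291')
    if Regiones then
        for kk, vv in pairs(Regiones) do
            local Region = elementoTabla(vv, 'datavalue', 'value', 'id')
            Valores[Region] = Valor
        end
        Valores['DESC'] = Valor
    end
end
```

This does not change what data is fetched, only how the module reference and intermediate values are held, so any large savings would still have to come from narrowing the Wikidata queries themselves.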

Some updates:

I've edited the module Lanzamientovj to optimise it and import data from Wikidata properly.

Working on Among Us, I restored the previous wikitext-based {{Lanzamientovj}} and put it in a sandbox (the main template still uses Lua), and that wikitext-based template is transcluded in Among Us. Comparing the Lua-based and the wikitext-based versions, the difference in memory usage (see the parser profiling data at the bottom) is minimal: one uses nearly 48 MB of memory, while the other uses almost 50 MB. Considering the page uses a large number of references, a memory increase is, IMHO, justified. Attempts to improve the performance of other templates and modules have already been made; however, the problem still persists in large articles like Among Us.

I recommend asking around in the community for someone more familiar with Wikidata usage from Lua to review these templates and look for common mistakes or inefficiencies. It seems very unlikely to me that so much memory would be needed at a single point to render even a large article like that. I suspect that it may be querying too wide a dataset instead of something more specific, and thus pulling in more data than it actually needs.

Whether something is "justified" is more a matter of server capacity and acceptable wait times for rendering/editing pages, and less about whether we think something is worthwhile or important in the context of a specific article. Server capacity, because we generally handle a few thousand requests at the same time, and allowing each one to consume more memory adds up very quickly. Wait times, because more likely than not, the memory increase is not the result of a cheap operation but of an expensive one that in turn has other infrastructure costs and side effects behind the scenes to seek and aggregate the asked-for data.

If someone more familiar with Wikidata and with Lua's Wikibase integration can confirm that the queries in these templates are as narrow and efficient as possible, and can indicate exactly what data is being fetched here that is so large, then we can look further at how to address this.

One possible solution might be for developers to improve the underlying data format that you are interacting with, so it is represented in a more efficient way; then you would be able to do exactly what you want to do, and have it count as less memory. 50 million bytes is quite a lot of bytes, and this smells to me like the system working as intended by flagging a real problem. The needed action here is first to understand that problem.

As far as Lua Wikidata calls go, you want to avoid mw.wikibase.getEntity, or at least narrow what you fetch, so each call only gets the badges or whatever you are after in that case (e.g. not local entity = mw.wikibase.getEntity()). That is because, unrestricted, it fetches the whole item. For statements you can use either mw.wikibase.getBestStatements(id, property)[1], which skips statements that are ranked as outdated/incorrect, or mw.wikibase.getAllStatements(id, property)[1], which gives you all statements. Looking at Module:Wikidata on es.wikipedia, that does mean changing the calls to the module as well.
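A minimal sketch of that difference, assuming the standard Scribunto Wikibase client (Q42 and P577 "publication date" are only illustrative identifiers, not taken from the affected templates):

```lua
-- Expensive: loads the entire entity (all statements, labels,
-- descriptions, sitelinks) into Lua memory, even when only one
-- property is actually needed.
local entity = mw.wikibase.getEntity('Q42')

-- Cheaper: fetches only the statements for one property.
-- getBestStatements() skips deprecated/superseded ranks;
-- getAllStatements() returns every statement regardless of rank.
local best = mw.wikibase.getBestStatements('Q42', 'P577')
local all  = mw.wikibase.getAllStatements('Q42', 'P577')

-- Reading the value out of the first statement, if present
-- (the nested shape follows the standard Wikibase snak layout):
if best[1] and best[1].mainsnak.datavalue then
    local fecha = best[1].mainsnak.datavalue.value.time
end
```

Since only the requested property's statements are materialised as Lua tables, the accounted Lua memory for the page drops accordingly when the full-entity calls are replaced.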

Legoktm added a subscriber: Legoktm.

@Amitie_10g: please see the comment above by @Snaevar on how to optimize the module. If you're still having issues, probably asking for help on m:Tech or on Wikidata itself would be better.