
Reduce memory overhead
Closed, ResolvedPublic

Description

Author: timwi

Description:
BUG MIGRATED FROM SOURCEFORGE
http://sourceforge.net/tracker/index.php?func=detail&aid=928878&group_id=34373&atid=411192
Originally submitted by Raphael Jolly (rjolly) 2004-04-03 17:58

I'm trying to have mediawiki running on a free.fr account.
When I post an article of 50-80KB, I get the following error:

Fatal error: Allowed memory size of 5242880 bytes exhausted (tried to allocate 186793 bytes) in /var/www/free.fr/e/9/raphael.jolly/include/MagicWord.php on line 194

Indeed phpinfo() tells me that memory_limit=5M on my provider's server.
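For reference, a small standalone script like the following shows the
effective limit and the actual usage (a rough sketch; ini_get() and
memory_get_usage() are standard PHP, though memory_get_usage() is only
available when PHP is built with --enable-memory-limit, and the 50KB
figure is just an example):

  <?php
  // Show the configured limit and the memory currently in use.
  echo "memory_limit: " . ini_get( 'memory_limit' ) . "\n";
  echo "usage before: " . memory_get_usage() . " bytes\n";

  // Hold a ~50KB string, roughly the size of one of the articles.
  $article = str_repeat( 'x', 50 * 1024 );
  echo "usage after:  " . memory_get_usage() . " bytes\n";
  ?>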
At first I thought it was a DB issue, but the problem also occurs when one chooses "show preview".
My website is here:
http://raphael.jolly.free.fr/wiki

I can preview much bigger articles on fr.wikipedia.org (say 120KB+). Could someone tell me what the memory_limit is on that server?

Of course I understand that the parser could really need 5M for a 50KB article. However, I would like to rule out the possibility that the problem is due to a mistake in my install/config, or tied to my provider's settings (disregarding the memory_limit issue).
(Note that I had to hack the in-place installer, which doesn't work perfectly on this platform.)

Additional info:

  • Currently the site is not localized (= utf8). When I set the fr localisation, I don't even get the PHP error above; I get either a 500 internal server error or no response at all.
  • The application I have in mind will be dealing with docs of precisely 50-80KB (laws), so this issue has me completely stuck.
  • I've opened a case with my provider asking for the Apache error messages, but they haven't responded yet.
  • MediaWiki is 1.2.3

Any clue as to what's going wrong is most welcome.

Raphael

Additional comments:

Date: 2004-04-05 17:33
Sender: SF user rjolly

Update:

A further inquiry convinced me that this isn't an isolated bug,
but rather an overall design issue. Indeed I've tried to
comment out the lines where the memory-size-exhausted error
occurred, but it just moves to other places. Specifically:

From MagicWord.php:194
to MagicWord.php:146
to Parser.php:443
to Parser.php:501

I reported to my ISP that this was a memory_limit issue. Their
response was that it can't be increased.

So my only hope is that some optimization is possible. This is
supported by the fact that phpBB, installed on the same
server, can handle bigger messages without any problem:

http://raphael.jolly.free.fr/phpBB2/

For comparison, on fr.wikipedia.org I can preview articles
as big as 3.2MB, so I guess that server's PHP is compiled
without --enable-memory-limit. (When the memory limit is
enabled, the default is 8M.)
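For completeness, where the host allows it, the limit can
normally be raised per-script with ini_set() (the 16M value
below is only an example, and the call returns false when the
setting cannot be changed):

  <?php
  // Try to raise the limit at runtime; ini_set() returns the old value
  // on success and false if the provider has locked the setting.
  $old = ini_set( 'memory_limit', '16M' );
  if ( $old === false ) {
      echo "memory_limit is locked by the provider\n";
  }
  ?>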


Date: 2004-04-06 10:29
Sender: SF user rjolly

Some more accurate data:

Previewing an article on my website:

                    works    fails
  phpBB (*)         360KB    392KB  (allowed memory size exhausted)
  mediawiki-utf8     76KB     84KB  (allowed memory size exhausted)
  mediawiki-fr       48KB     64KB  (internal server error)

Previewing an article on fr.wikipedia.org:

                    works    fails
  mediawiki-fr      3.2MB    6.4MB  (my proxy hits a timeout)

(*) with everything on: smileys, bbcode, html; localized in fr.


Date: 2004-04-06 17:27
Sender: SF user hashar

For the localization issue, please open a new bug ;0)


I have been able to preview a 63KB page on your wiki, but
bigger things fail :/ It seems we will have to clear
unused variables in the code.
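The idea, roughly, would be something like the following
(illustrative only; the function and variable names are made up,
not actual MediaWiki code):

  <?php
  function transformText( $text ) {
      // First pass builds a large intermediate copy of the article.
      $expanded = doExpensiveExpansion( $text );

      // The original input is no longer needed; release our reference
      // to it before the next pass so two full copies are not kept
      // alive at once.
      unset( $text );

      return doFinalPass( $expanded );
  }
  ?>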

Date: 2004-06-14 11:39
Sender: SF user hashar

We are currently profiling the MediaWiki software.
The CVS version actually uses about 7,580 KB of memory with
the fr language & PHPTAL monobook skin.

Some memory eaters have been spotted and might be fixed in
the next version (1.4).
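The profiling amounts to logging memory_get_usage() at
checkpoints and comparing the numbers; a rough sketch (the
logMemory() helper and the labels are made up):

  <?php
  // Crude memory profiling: log usage at interesting points and
  // compare the numbers between checkpoints.
  function logMemory( $label ) {
      $kb = round( memory_get_usage() / 1024 );
      error_log( "mem[" . $label . "] = " . $kb . " KB" );
  }

  logMemory( 'before parse' );
  // ... parse the article text here ...
  logMemory( 'after parse' );
  ?>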


Version: unspecified
Severity: normal

Details

Reference
bz318

Event Timeline

bzimport raised the priority of this task to Medium. Nov 21 2014, 6:47 PM
bzimport set Reference to bz318.
bzimport added a subscriber: Unknown Object (MLST).

My MediaWiki installation (fr, utf-8) runs with a memory limit of 5MB.
Looks like we hunted down enough memory eaters.