While investigating large NFS users on Toolforge on 2020-05-21, @bd808 noticed that the files in $HOME/public_html/cache for the templatetiger tool were very old. He poked around a bit and found that the cron job that cleans up the cache files for this tool, introduced in response to T208456: templatetiger is using 1.2TB out of 8T available Tools, had been disabled at some point. When the script ran again, it deleted all of the cache files, as they were all more than 90 days old.
Further poking around led to https://de.wikipedia.org/wiki/Wikipedia:WikiProjekt_Vorlagenauswertung where it appears that the project has been marked as inactive.
The referenced T184126: Templatetiger-Updating: Lost connection to MySQL server during query ticket was closed for inactivity by @bd808. The code that loads the multi-gigabyte data files needs to be rewritten; there is no reasonable way to tune ToolsDB to allow the multiple hours of uninterrupted runtime that this style of data loading requires:
#ukwiki-2019-02-20.txt
#Date: 20190220
# DB: 20170501 sk: 20190220
DROP TABLE IF EXISTS `ukwiki_old`;
DROP TABLE IF EXISTS `ukwiki_sum_old`;
DROP TABLE IF EXISTS `ukwiki_para_old`;
DROP TABLE IF EXISTS `ukwiki_neu`;
DROP TABLE IF EXISTS `ukwiki_sum_neu`;
DROP TABLE IF EXISTS `ukwiki_para_neu`;
CREATE TABLE `ukwiki_neu` (
  `name_id` bigint(20) NOT NULL,
  `name` varchar(180) NOT NULL,
  `tp_nr` bigint(20) NOT NULL,
  `tp_name` varchar(100) NOT NULL,
  `entry_nr` bigint(20) NOT NULL,
  `entry_name` varchar(200) NOT NULL,
  `Value` varchar(900) NOT NULL,
  KEY `tp_name` (`tp_name`(30)),
  KEY `name` (`name`(15)),
  KEY `name_id` (`name_id`),
  KEY `entry_name` (`entry_name`(15)),
  KEY `Value` (`Value`(15))
) ENGINE=ARIA DEFAULT CHARSET=utf8;
LOAD DATA LOCAL INFILE '/data/project/templatetiger/public_html/dumps/ukwiki-2019-02-20.txt'
  IGNORE INTO TABLE `ukwiki_neu`
  FIELDS TERMINATED BY '\t' ESCAPED BY '\\' LINES TERMINATED BY '\n';
CREATE TABLE `ukwiki_sum_neu` (
  `tp_name` varchar(100) COLLATE utf8_bin NOT NULL,
  `sum` bigint(20) NOT NULL,
  PRIMARY KEY (`tp_name`),
  KEY `sum` (`sum`)
) ENGINE=ARIA DEFAULT CHARSET=utf8 COLLATE=utf8_bin;
INSERT INTO `ukwiki_sum_neu`
  SELECT `tp_name`, COUNT(*)
  FROM `ukwiki_neu`
  WHERE `entry_nr` = '1'
  GROUP BY `tp_name`
  ORDER BY COUNT(*) DESC /* SLOW_OK */;
CREATE TABLE `ukwiki_para_neu` (
  `tp_name` varchar(100) COLLATE utf8_bin NOT NULL,
  `entry_name` varchar(200) COLLATE utf8_bin NOT NULL,
  `sum` BIGINT(20) NOT NULL,
  KEY `tp_name` (`tp_name`),
  KEY `entry_name` (`entry_name`)
) ENGINE=ARIA DEFAULT CHARSET=utf8 COLLATE=utf8_bin;
INSERT INTO `ukwiki_para_neu`
  SELECT `tp_name`, `entry_name`, COUNT(*)
  FROM `ukwiki_neu`
  GROUP BY `tp_name`, `entry_name`;
RENAME TABLE `ukwiki` TO `ukwiki_old`, `ukwiki_sum` TO `ukwiki_sum_old`, `ukwiki_para` TO `ukwiki_para_old`;
RENAME TABLE `ukwiki_neu` TO `ukwiki`, `ukwiki_sum_neu` TO `ukwiki_sum`, `ukwiki_para_neu` TO `ukwiki_para`;
DROP TABLE IF EXISTS `ukwiki_old`;
DROP TABLE IF EXISTS `ukwiki_sum_old`;
DROP TABLE IF EXISTS `ukwiki_para_old`;
DELETE FROM `info` WHERE `lang` = 'ukwiki';
INSERT INTO `info` (`lang`, `date`, `sum`, `date_old`)
  VALUES ('ukwiki', '20190220', (SELECT SUM(`sum`) FROM `ukwiki_sum`), '20170501');
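For reference, the usual way around this class of failure is to replace the single long-running LOAD DATA statement with many small batched inserts, each committed in its own short transaction, so that no individual query has to survive for hours. Below is a minimal sketch of that idea, not the tool's actual code: the column subset and table layout are illustrative, and an in-memory SQLite connection stands in for ToolsDB (a real rewrite would use a MariaDB client and the full `ukwiki_neu` schema).

```python
import csv
import io
import sqlite3

def batched_rows(fh, batch_size):
    """Yield lists of rows parsed from a tab-separated stream, batch_size at a time."""
    reader = csv.reader(fh, delimiter="\t")
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) >= batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

def load_in_batches(conn, fh, batch_size=5000):
    """Insert rows in small per-batch transactions instead of one giant statement."""
    total = 0
    for batch in batched_rows(fh, batch_size):
        conn.executemany(
            "INSERT INTO ukwiki_neu (name_id, name, tp_nr) VALUES (?, ?, ?)",
            batch,
        )
        conn.commit()  # short transaction; an interrupted run can resume from here
        total += len(batch)
    return total

# Demo: an in-memory SQLite database standing in for ToolsDB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ukwiki_neu (name_id INTEGER, name TEXT, tp_nr INTEGER)")
dump = io.StringIO("1\tFoo\t7\n2\tBar\t8\n3\tBaz\t9\n")
print(load_in_batches(conn, dump, batch_size=2))  # → 3
```

With this shape, each statement finishes in well under the server's connection timeout, at the cost of the load no longer being atomic; the existing `_neu`/`_old` table-swap pattern already compensates for that, since readers only see the new table after the final RENAME.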
The open question now is whether @Kolossos is interested in reworking the data loading process or has given up on the project.