
[jobs-api] Investigate if we can reuse the 'web' flavour pre-built images as regular images
Closed, Resolved · Public

Description

Doing so would reduce the number of images we generate and maintain.

The main difference seems to be that the web images include webservice-runner, and some (like the python ones) install a few extra packages.

This task is to double-check that those images are usable for jobs, and then generate only one image of each type, so it can be reused for both purposes.

This will also somewhat simplify adding webservice support to the jobs-api.
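The consolidation rule proposed here can be sketched in a few lines. This is a hypothetical helper (not part of any Toolforge component; the `"job"`/`"web"` variant keys are assumptions about the image-config structure): for each image type, prefer the web variant when one exists, since it is a functional superset of the jobs variant.

```python
# Hypothetical sketch of the proposed consolidation rule: prefer the
# 'web' variant of an image type when one exists, otherwise fall back
# to the jobs-framework variant.
def pick_image(variants: dict) -> str:
    """variants maps a variant name (e.g. 'job', 'web') to an image name."""
    if "web" in variants:
        return variants["web"]
    return variants["job"]


if __name__ == "__main__":
    # Both variants exist: the web one wins.
    print(pick_image({"job": "toolforge-node18-sssd-base",
                      "web": "toolforge-node18-sssd-web"}))
    # Only a jobs variant exists (e.g. mariadb): keep it.
    print(pick_image({"job": "toolforge-mariadb-sssd-base"}))
```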


Related Objects

| Status | Subtype | Assigned |
|---|---|---|
| Resolved | | LucasWerkmeister |
| Resolved | | matmarex |
| Resolved | | Legoktm |
| Resolved | | Legoktm |
| In Progress | | dcaro |
| Resolved | | dcaro |
| In Progress | | komla |
| Resolved | | dcaro |
| Resolved | | dcaro |
| Open | | dcaro |
| Open | | None |
| In Progress | Feature | Raymond_Ndibe |
| In Progress | | Raymond_Ndibe |
| Resolved | | Raymond_Ndibe |

Event Timeline

fnegri triaged this task as Medium priority.Nov 4 2025, 3:26 PM
fnegri subscribed.

In all cases where both variants exist, the webservice image is functionally a superset of the jobs-framework image, so the webservice image can most likely serve both purposes.

The table below is a detailed analysis of every image in the image-config configmap (on the main branch).

Table

| Image type | Jobs-framework image name (size) | Webservice image name (size) | New things added to webservice image | Recommendation |
|---|---|---|---|---|
| bookworm | toolforge-bookworm-sssd (246.76 MB) | toolforge-bookworm-web-sssd (246.76 MB) | /usr/bin/webservice-runner | Use toolforge-bookworm-web-sssd |
| bullseye | toolforge-bullseye-sssd (179.99 MB) | N/A | N/A | N/A |
| bullseye-standalone | toolforge-bullseye-standalone (98.11 MB) | N/A | N/A | N/A |
| buster-standalone | toolforge-buster-standalone (97.1 MB) | N/A | N/A | N/A |
| golang1.11 | toolforge-golang111-sssd-base (267.1 MB) | toolforge-golang111-sssd-web (275.69 MB) | N/A | Use toolforge-golang111-sssd-web |
| jdk8 | toolforge-jdk8-sssd-base (261.0 MB) | toolforge-jdk8-sssd-web (271.49 MB) | N/A (doesn't exist in the toollabs-images repo, but setup is likely like the other jdk images) | Use toolforge-jdk8-sssd-web |
| jdk11 | toolforge-jdk11-sssd-base (352.94 MB) | toolforge-jdk11-sssd-web (363.26 MB) | N/A (doesn't exist in the toollabs-images repo, but setup is likely like the other jdk images) | Use toolforge-jdk11-sssd-web |
| jdk17 | toolforge-jdk17-sssd-base (460.56 MB) | toolforge-jdk17-sssd-web (469.31 MB) | python3, /usr/bin/webservice-runner | Use toolforge-jdk17-sssd-web |
| jdk21 | toolforge-jdk21-sssd-base (503.75 MB) | toolforge-jdk21-sssd-web (503.75 MB) | /usr/bin/webservice-runner | Use toolforge-jdk21-sssd-web |
| mariadb | toolforge-mariadb-sssd-base (286.5 MB) | N/A | N/A | N/A |
| mono6.8 | toolforge-mono68-sssd-base (292.33 MB) | N/A | N/A | N/A |
| mono6.12 | toolforge-mono612-sssd-base (367.19 MB) | N/A | N/A | N/A |
| node6 | toolforge-node6-sssd-base (229.1 MB) | toolforge-node6-sssd-web (230.7 MB) | N/A (doesn't exist in the toollabs-images repo, but setup is likely like the other node images) | Use toolforge-node6-sssd-web |
| node10 | toolforge-node10-sssd-base (222.89 MB) | toolforge-node10-sssd-web (233.81 MB) | N/A (doesn't exist in the toollabs-images repo, but setup is likely like the other node images) | Use toolforge-node10-sssd-web |
| node12 | toolforge-node12-sssd-base (296.53 MB) | toolforge-node12-sssd-web (296.54 MB) | python3, /usr/bin/webservice-runner | Use toolforge-node12-sssd-web |
| node16 | toolforge-node16-sssd-base (304.62 MB) | toolforge-node16-sssd-web (309.48 MB) | python3, /usr/bin/webservice-runner | Use toolforge-node16-sssd-web |
| node18 | toolforge-node18-sssd-base (371.72 MB) | toolforge-node18-sssd-web (371.72 MB) | /usr/bin/webservice-runner | Use toolforge-node18-sssd-web |
| node20 | toolforge-node20-sssd-base (424.49 MB) | toolforge-node20-sssd-web (424.49 MB) | /usr/bin/webservice-runner | Use toolforge-node20-sssd-web |
| perl5.32 | toolforge-perl532-sssd-base (283.53 MB) | toolforge-perl532-sssd-web (292.75 MB) | lighttpd, lighttpd-mod-openssl, python3, /usr/bin/webservice-runner, /etc/toolforge-enable-perl | Use toolforge-perl532-sssd-web |
| perl5.36 | toolforge-perl536-sssd-base (350.31 MB) | toolforge-perl536-sssd-web (351.15 MB) | lighttpd, /usr/bin/webservice-runner, /etc/toolforge-enable-perl | Use toolforge-perl536-sssd-web |
| perl5.40 | toolforge-perl540-sssd-base (380.03 MB) | toolforge-perl540-sssd-web (380.87 MB) | lighttpd, /usr/bin/webservice-runner, /etc/toolforge-enable-perl | Use toolforge-perl540-sssd-web |
| php5.6 | toolforge-php5-sssd-base (171.88 MB) | toolforge-php5-sssd-web (186.5 MB) | N/A (doesn't exist in the toollabs-images repo, but setup is likely like the other php images) | Use toolforge-php5-sssd-web |
| php7.2 | toolforge-php72-sssd-base (173.69 MB) | toolforge-php72-sssd-web (194.92 MB) | N/A (doesn't exist in the toollabs-images repo, but setup is likely like the other php images) | Use toolforge-php72-sssd-web |
| php7.3 | toolforge-php73-sssd-base (221.3 MB) | toolforge-php73-sssd-web (242.54 MB) | N/A (doesn't exist in the toollabs-images repo, but setup is likely like the other php images) | Use toolforge-php73-sssd-web |
| php7.4 | toolforge-php74-sssd-base (433.65 MB) | toolforge-php74-sssd-web (446.53 MB) | lighttpd, lighttpd-mod-openssl, php7.4-cgi, python3, /usr/bin/webservice-runner, /etc/toolforge-enable-php | Use toolforge-php74-sssd-web |
| php8.2 | toolforge-php82-sssd-base (514.2 MB) | toolforge-php82-sssd-web (519.49 MB) | lighttpd, php8.2-cgi, /usr/bin/webservice-runner, /etc/toolforge-enable-php | Use toolforge-php82-sssd-web |
| php8.4 | toolforge-php84-sssd-base (297.84 MB) | toolforge-php84-sssd-web (303.36 MB) | lighttpd, php8.4-cgi, /usr/bin/webservice-runner, /etc/toolforge-enable-php | Use toolforge-php84-sssd-web |
| python2 | toolforge-python2-sssd-base (259.96 MB) | toolforge-python2-sssd-web (262.74 MB) | N/A (doesn't exist in the toollabs-images repo, but setup is likely like the other python images) | Use toolforge-python2-sssd-web |
| python3.4 | toolforge-python34-sssd-base (282.5 MB) | toolforge-python34-sssd-web (285.34 MB) | N/A (doesn't exist in the toollabs-images repo, but setup is likely like the other python images) | Use toolforge-python34-sssd-web |
| python3.5 | toolforge-python35-sssd-base (273.14 MB) | toolforge-python35-sssd-web (274.95 MB) | N/A (doesn't exist in the toollabs-images repo, but setup is likely like the other python images) | Use toolforge-python35-sssd-web |
| python3.7 | toolforge-python37-sssd-base (324.96 MB) | toolforge-python37-sssd-web (327.13 MB) | N/A (doesn't exist in the toollabs-images repo, but setup is likely like the other python images) | Use toolforge-python37-sssd-web |
| python3.9 | toolforge-python39-sssd-base (290.81 MB) | toolforge-python39-sssd-web (292.85 MB) | uwsgi, uwsgi-plugin-python3, /usr/local/bin/webservice-python-bootstrap, /usr/bin/webservice-runner | Use toolforge-python39-sssd-web |
| python3.11 | toolforge-python311-sssd-base (349.29 MB) | toolforge-python311-sssd-web (351.35 MB) | uwsgi, uwsgi-plugin-python3, /usr/local/bin/webservice-python-bootstrap, /usr/bin/webservice-runner | Use toolforge-python311-sssd-web |
| python3.13 | toolforge-python313-sssd-base (376.5 MB) | toolforge-python313-sssd-web (378.4 MB) | uwsgi, uwsgi-plugin-python3, /usr/local/bin/webservice-python-bootstrap, /usr/bin/webservice-runner | Use toolforge-python313-sssd-web |
| ruby2.1 | toolforge-ruby21-sssd-base (198.54 MB) | toolforge-ruby21-sssd-web (207.13 MB) | N/A (doesn't exist in the toollabs-images repo, but setup is likely like the other ruby images) | Use toolforge-ruby21-sssd-web |
| ruby2.5 | toolforge-ruby25-sssd-base (270.65 MB) | toolforge-ruby25-sssd-web (280.95 MB) | N/A (doesn't exist in the toollabs-images repo, but setup is likely like the other ruby images) | Use toolforge-ruby25-sssd-web |
| ruby2.7 | toolforge-ruby27-sssd-base (294.13 MB) | toolforge-ruby27-sssd-web (302.89 MB) | python3, /usr/bin/webservice-runner | Use toolforge-ruby27-sssd-web |
| ruby3.1 | toolforge-ruby31-sssd-base (364.55 MB) | toolforge-ruby31-sssd-web (364.55 MB) | /usr/bin/webservice-runner | Use toolforge-ruby31-sssd-web |
| ruby3.3 | toolforge-ruby33-sssd-base (417.92 MB) | toolforge-ruby33-sssd-web (417.92 MB) | /usr/bin/webservice-runner | Use toolforge-ruby33-sssd-web |
| tcl8.6 | toolforge-tcl86-sssd-base (259.19 MB) | toolforge-tcl86-sssd-web (270.79 MB) | libfcgi-dev, lighttpd, python3, /usr/bin/webservice-runner | Use toolforge-tcl86-sssd-web |
| trixie | toolforge-trixie-sssd (252.24 MB) | toolforge-trixie-web-sssd (252.24 MB) | /usr/bin/webservice-runner | Use toolforge-trixie-web-sssd |
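As a quick sanity check on the table, the size overhead of the web variant over the base variant is small. A sketch using a few (base, web) pairs copied from the table above:

```python
# Size overhead (in MB) of the web variant over the base variant,
# using a few (base_mb, web_mb) pairs taken from the table above.
PAIRS = {
    "bookworm": (246.76, 246.76),
    "jdk17": (460.56, 469.31),
    "python3.11": (349.29, 351.35),
    "php8.2": (514.20, 519.49),
}


def overhead_mb(base: float, web: float) -> float:
    """Difference in MB between the web and base image, rounded to 2 dp."""
    return round(web - base, 2)


for name, (base, web) in PAIRS.items():
    print(f"{name}: +{overhead_mb(base, web)} MB")
```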
dcaro updated the task description. (Show Details)
dcaro added a subscriber: Raymond_Ndibe.

I did not mean to unassign, sorry; I think we both edited at the same time.

Can you manually test that this is the case? For example, by running some code on each of them, even if it's just a shell script of sorts.

Also, there's some setup present in some images that isn't in others; if you check some of the Dockerfiles for webservice, there are also envvars set in some of them.

And, can you share the code you used to generate that table? It could be useful.

Also, what do you mean with "doesn't exist in the toollabs-images repo, but setup is likely like the other node images"? Those do exist there, just at a different revision; for example, for ruby 2.5: https://gerrit.wikimedia.org/r/plugins/gitiles/operations/docker-images/toollabs-images/+/9aaeb88e4af82a42f50146ef4ba97f6932d1e1b6/ruby25-sssd/

> Also, what do you mean with "doesn't exist in the toollabs-images repo, but setup is likely like the other node images"? Those do exist there, just at a different revision; for example, for ruby 2.5: https://gerrit.wikimedia.org/r/plugins/gitiles/operations/docker-images/toollabs-images/+/9aaeb88e4af82a42f50146ef4ba97f6932d1e1b6/ruby25-sssd/

Yeah, by this I meant that they don't exist on the master branch.

> I did not mean to unassign, sorry; I think we both edited at the same time.
>
> Can you manually test that this is the case? For example, by running some code on each of them, even if it's just a shell script of sorts.
>
> Also, there's some setup present in some images that isn't in others; if you check some of the Dockerfiles for webservice, there are also envvars set in some of them.
>
> And, can you share the code you used to generate that table? It could be useful.

The image sizes can be calculated with:

#!/usr/bin/env python3
import yaml
import json
import subprocess
import sys
import os
from typing import Union

CONFIG_FILE = 'config.yml'
OUTPUT_FILE = 'image_sizes.json'
IMAGE_TAG = 'latest'


def get_image_size_mb(image_name: str, tag: str) -> Union[float, str]:
    full_image_name = f"{image_name}:{tag}"
    cmd = ["docker", "manifest", "inspect", full_image_name]
    
    print(f"  inspecting: {full_image_name}...")

    try:
        result = subprocess.run(
            cmd,
            capture_output=True,
            text=True,
            check=True,
            encoding='utf-8'
        )
        manifest = json.loads(result.stdout)

        if manifest.get("mediaType") == "application/vnd.docker.distribution.manifest.list.v2+json":
            digest = None
            for m in manifest.get("manifests", []):
                if m.get("platform", {}).get("architecture") == "amd64":
                    digest = m["digest"]
                    break
            
            if not digest:
                return "Error: No amd64 manifest"

            full_image_name_digest = f"{image_name}@{digest}"
            result = subprocess.run(
                ["docker", "manifest", "inspect", full_image_name_digest],
                capture_output=True,
                text=True,
                check=True,
                encoding='utf-8'
            )
            manifest = json.loads(result.stdout)

        total_size_bytes = 0
        for layer in manifest.get("layers", []):
            total_size_bytes += layer.get("size", 0)

        if total_size_bytes == 0:
            return "error: No layers found"

        size_mb = total_size_bytes / (1024 * 1024)
        return round(size_mb, 2)

    except (subprocess.CalledProcessError, json.JSONDecodeError):
        return "error"


def main():
    if not os.path.exists(CONFIG_FILE):
        print(f"error: {CONFIG_FILE} not found.", file=sys.stderr)
        sys.exit(1)

    print(f"loading image config from {CONFIG_FILE}...")
    with open(CONFIG_FILE, 'r', encoding='utf-8') as f:
        full_config = yaml.safe_load(f)
        image_data = yaml.safe_load(full_config['data']['images-v1.yaml'])

    image_sizes = {}
    print("processing images...\n")

    for image_type, details in image_data.items():
        print(f"processing type: {image_type}")
        image_sizes[image_type] = {}
        variants = details.get('variants', {})

        for variant_name, variant_details in variants.items():
            image_to_check = variant_details.get('image')
            
            if image_to_check:
                size = get_image_size_mb(image_to_check, IMAGE_TAG)
                image_sizes[image_type][variant_name] = {
                    "image_name": image_to_check,
                    "tag": IMAGE_TAG,
                    "size_mb": size
                }
                print(f"  -> {variant_name}: {size} MB")
            else:
                image_sizes[image_type][variant_name] = {
                    "image_name": None,
                    "tag": IMAGE_TAG,
                    "size_mb": "skipped (no image)"
                }
        print("") 

    with open(OUTPUT_FILE, 'w', encoding='utf-8') as f:
        json.dump(image_sizes, f, indent=2)
    print(f"results saved to {OUTPUT_FILE}")

    print(json.dumps(image_sizes, indent=2))


if __name__ == "__main__":
    main()

The table itself still has to be put together by hand with find-and-replace.

quick throw-away script for simple deployments in lima-kilo using the web images:

#!/usr/bin/env python3

import subprocess

# web images
IMAGES = [
    "docker-registry.tools.wmflabs.org/toolforge-bookworm-web-sssd",
    "docker-registry.tools.wmflabs.org/toolforge-golang111-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-jdk8-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-jdk11-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-jdk17-sssd-web",
    "docker-registry.svc.toolforge.org/toolforge-jdk21-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-node6-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-node10-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-node12-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-node16-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-node18-sssd-web",
    "docker-registry.svc.toolforge.org/toolforge-node20-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-perl532-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-perl536-sssd-web",
    "docker-registry.svc.toolforge.org/toolforge-perl540-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-php5-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-php72-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-php73-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-php74-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-php82-sssd-web",
    "docker-registry.svc.toolforge.org/toolforge-php84-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-python2-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-python34-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-python35-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-python37-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-python39-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-python311-sssd-web",
    "docker-registry.svc.toolforge.org/toolforge-python313-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-ruby21-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-ruby25-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-ruby27-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-ruby31-sssd-web",
    "docker-registry.svc.toolforge.org/toolforge-ruby33-sssd-web",
    "docker-registry.tools.wmflabs.org/toolforge-tcl86-sssd-web",
    "docker-registry.svc.toolforge.org/toolforge-trixie-web-sssd",
]


DEFAULT_RESOURCES_BLOCK = """
          requests:
            cpu: "10m"
            memory: "16Mi"
          limits:
            cpu: "20m"
            memory: "32Mi"
"""

JAVA_RESOURCES_BLOCK = """
          requests:
            cpu: "50m"
            memory: "128Mi"
          limits:
            cpu: "100m"
            memory: "256Mi"
"""


DEPLOYMENT_TEMPLATE = """---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: {deploy_name}
  namespace: default
  labels:
    app: {app_name}
spec:
  replicas: 1
  selector:
    matchLabels:
      app: {app_name}
  template:
    metadata:
      labels:
        app: {app_name}
    spec:
      containers:
      - name: {app_name}
        image: "{image_url}"
        command: ["/bin/sh", "-c"]
        args: ["while true; do date; sleep 1; done"]
        
        # This block will be filled dynamically
        resources:{resources_block}
"""

def generate_all_manifests() -> str:
    all_manifests = []
    for image_url in IMAGES:
        app_name = image_url.split('/')[-1]
        deploy_name = f"{app_name}-deploy"

        if "jdk" in app_name:
            resources_to_apply = JAVA_RESOURCES_BLOCK
        else:
            resources_to_apply = DEFAULT_RESOURCES_BLOCK

        yaml_manifest = DEPLOYMENT_TEMPLATE.format(
            deploy_name=deploy_name,
            app_name=app_name,
            image_url=image_url,
            resources_block=resources_to_apply
        )
        all_manifests.append(yaml_manifest)

    # Join all manifests into one big string
    return "".join(all_manifests)

def create_deployments(yaml_data: str):
    command = ['kubectl', 'create', '-f', '-']

    print(f"attempting to create {len(IMAGES)} deployments via 'kubectl create -f -'...")
    process = subprocess.run(
        command,
        input=yaml_data,
        text=True,
        capture_output=True
    )
    print(process.stdout)

    if process.returncode != 0:
        print(process.stderr)
        print("\nkubectl returned an error; some deployments may not have been created.")
    else:
        print(f"\nsuccessfully created {len(IMAGES)} deployments.")

if __name__ == "__main__":
    all_yaml = generate_all_manifests()

    create_deployments(all_yaml)

Results:

NAME                                                   READY   STATUS    RESTARTS   AGE
toolforge-bookworm-web-sssd-deploy-756f66f54-9n85s     1/1     Running   0          5h49m
toolforge-golang111-sssd-web-deploy-d4575c67c-d2v69    1/1     Running   0          5h49m
toolforge-jdk11-sssd-web-deploy-754bc6fd99-29fh7       1/1     Running   0          5h49m
toolforge-jdk17-sssd-web-deploy-856b4d7577-qxr5n       1/1     Running   0          5h49m
toolforge-jdk21-sssd-web-deploy-f57c4bd6b-gf5r9        1/1     Running   0          5h49m
toolforge-jdk8-sssd-web-deploy-6845f8dc65-vhl7h        1/1     Running   0          5h49m
toolforge-node10-sssd-web-deploy-fcbddd9bf-vj45n       1/1     Running   0          5h49m
toolforge-node12-sssd-web-deploy-bfb8db47-sz8nc        1/1     Running   0          5h49m
toolforge-node16-sssd-web-deploy-56f97c4b4c-qz6nf      1/1     Running   0          5h49m
toolforge-node18-sssd-web-deploy-c9465c68f-bsmwx       1/1     Running   0          5h49m
toolforge-node20-sssd-web-deploy-695c574d86-4cs4d      1/1     Running   0          5h49m
toolforge-node6-sssd-web-deploy-7dff956b44-wjrm6       1/1     Running   0          5h49m
toolforge-perl532-sssd-web-deploy-79d4f77c59-gmbxb     1/1     Running   0          5h49m
toolforge-perl536-sssd-web-deploy-bf8d945ff-jbbjl      1/1     Running   0          5h49m
toolforge-perl540-sssd-web-deploy-787797d448-cv9vr     1/1     Running   0          5h49m
toolforge-php5-sssd-web-deploy-9cdcdbbcf-dwm2b         1/1     Running   0          5h49m
toolforge-php72-sssd-web-deploy-c5fb895f-626ll         1/1     Running   0          5h49m
toolforge-php73-sssd-web-deploy-76ff85c7dd-phgr5       1/1     Running   0          5h49m
toolforge-php74-sssd-web-deploy-5654686f97-2shnd       1/1     Running   0          5h49m
toolforge-php82-sssd-web-deploy-6bcfb9d5b4-pj456       1/1     Running   0          5h49m
toolforge-php84-sssd-web-deploy-774c4c5f7b-zrz6h       1/1     Running   0          5h49m
toolforge-python2-sssd-web-deploy-855778896b-knmch     1/1     Running   0          5h49m
toolforge-python311-sssd-web-deploy-856c9954c9-22q9z   1/1     Running   0          5h49m
toolforge-python313-sssd-web-deploy-674cd84979-65j7q   1/1     Running   0          5h49m
toolforge-python34-sssd-web-deploy-8488566d8f-4mjtc    1/1     Running   0          5h49m
toolforge-python35-sssd-web-deploy-85b87c8d8-22thf     1/1     Running   0          5h49m
toolforge-python37-sssd-web-deploy-765ffc9566-qxbpn    1/1     Running   0          5h49m
toolforge-python39-sssd-web-deploy-5d98f8f764-pdskl    1/1     Running   0          5h49m
toolforge-ruby21-sssd-web-deploy-77f554fb97-8d8jz      1/1     Running   0          5h49m
toolforge-ruby25-sssd-web-deploy-756c9ff65-zjjd2       1/1     Running   0          5h49m
toolforge-ruby27-sssd-web-deploy-66c88649c9-vxdx7      1/1     Running   0          5h49m
toolforge-ruby31-sssd-web-deploy-78896594bb-rv6p5      1/1     Running   0          5h49m
toolforge-ruby33-sssd-web-deploy-59846b9466-5g4j7      1/1     Running   0          5h49m
toolforge-tcl86-sssd-web-deploy-85f76f6d99-zxgfg       1/1     Running   0          5h49m
toolforge-trixie-web-sssd-deploy-5cf5db9989-s8gtc      1/1     Running   0          5h49m
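The check above can also be done mechanically. A sketch that takes the parsed output of `kubectl get pods -o json` (standard Kubernetes pod fields) and reports any pod not in the `Running` phase; the function is pure so it can be tested without a cluster:

```python
# Sketch: given the parsed JSON from `kubectl get pods -o json`,
# return the names of pods whose status.phase is not 'Running'.
def not_running(pods_json: dict) -> list:
    return [
        item["metadata"]["name"]
        for item in pods_json.get("items", [])
        if item.get("status", {}).get("phase") != "Running"
    ]


# Example with a hand-written fragment of kubectl output:
sample = {"items": [
    {"metadata": {"name": "toolforge-bookworm-web-sssd-deploy-756f66f54-9n85s"},
     "status": {"phase": "Running"}},
]}
print(not_running(sample))  # → []
```

Against a live cluster you would feed it `json.loads(...)` of the `kubectl get pods -o json` output instead of the hand-written sample.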

> quick throw-away script for simple deployments in lima-kilo using the web images:

That's ok, but can you test if we can use them as jobs from the jobs-api?
I'm sure they will be able to be pulled and run as plain images; the key point is running as jobs (envvars, entrypoints, resources, security policies, ...).

For that, you can try using the image-config patch you created in lima-kilo, start one job for each image type (might be easier using jobs.yaml), and make sure it runs ok (e.g. logging some string and checking that the logs come through correctly).

>> quick throw-away script for simple deployments in lima-kilo using the web images:
>
> That's ok, but can you test if we can use them as jobs from the jobs-api?
> I'm sure they will be able to be pulled and run as plain images; the key point is running as jobs (envvars, entrypoints, resources, security policies, ...).
>
> For that, you can try using the image-config patch you created in lima-kilo, start one job for each image type (might be easier using jobs.yaml), and make sure it runs ok (e.g. logging some string and checking that the logs come through correctly).

This will likely require creating a test PR for jobs-api, since we can't just deploy webservice images in jobs-api right now. I'll do that and respond here.

>>> quick throw-away script for simple deployments in lima-kilo using the web images:
>>
>> That's ok, but can you test if we can use them as jobs from the jobs-api?
>> I'm sure they will be able to be pulled and run as plain images; the key point is running as jobs (envvars, entrypoints, resources, security policies, ...).
>>
>> For that, you can try using the image-config patch you created in lima-kilo, start one job for each image type (might be easier using jobs.yaml), and make sure it runs ok (e.g. logging some string and checking that the logs come through correctly).
>
> This will likely require creating a test PR for jobs-api, since we can't just deploy webservice images in jobs-api right now. I'll do that and respond here.

jobs-api will use whatever is in the image-config configmap; you can change that locally in lima-kilo (I just did that to test harbor images), but sure, whatever you need 👍

Lima-kilo env configurations for anyone who wants to recreate this (configmaps, limitranges, resourcequotas, etc.): I basically maxed everything out to ensure those never become an issue while running these tests. Keeping everything in a doc so things don't clutter the task:
https://docs.google.com/document/d/1LfXdcVB-Vh0I0IuoniCN325Tofzu7MK9bBnA8G9aLM0/edit?tab=t.0

Images:
local.tf-test@lima-kilo:~$ toolforge jobs images

| Short name | Image |
|---|---|
| bookworm | docker-registry.tools.wmflabs.org/toolforge-bookworm-web-sssd:latest |
| bullseye | docker-registry.tools.wmflabs.org/toolforge-bullseye-sssd:latest |
| jdk17 | docker-registry.tools.wmflabs.org/toolforge-jdk17-sssd-web:latest |
| jdk21 | docker-registry.svc.toolforge.org/toolforge-jdk21-sssd-web:latest |
| mariadb | docker-registry.tools.wmflabs.org/toolforge-mariadb-sssd-base:latest |
| mono6.12 | docker-registry.tools.wmflabs.org/toolforge-mono612-sssd-base:latest |
| mono6.8 | docker-registry.tools.wmflabs.org/toolforge-mono68-sssd-base:latest |
| node16 | docker-registry.tools.wmflabs.org/toolforge-node16-sssd-web:latest |
| node18 | docker-registry.tools.wmflabs.org/toolforge-node18-sssd-web:latest |
| node20 | docker-registry.svc.toolforge.org/toolforge-node20-sssd-web:latest |
| perl5.32 | docker-registry.tools.wmflabs.org/toolforge-perl532-sssd-web:latest |
| perl5.36 | docker-registry.tools.wmflabs.org/toolforge-perl536-sssd-web:latest |
| perl5.40 | docker-registry.svc.toolforge.org/toolforge-perl540-sssd-web:latest |
| php7.4 | docker-registry.tools.wmflabs.org/toolforge-php74-sssd-web:latest |
| php8.2 | docker-registry.tools.wmflabs.org/toolforge-php82-sssd-web:latest |
| php8.4 | docker-registry.svc.toolforge.org/toolforge-php84-sssd-web:latest |
| python3.11 | docker-registry.tools.wmflabs.org/toolforge-python311-sssd-web:latest |
| python3.13 | docker-registry.svc.toolforge.org/toolforge-python313-sssd-web:latest |
| python3.9 | docker-registry.tools.wmflabs.org/toolforge-python39-sssd-web:latest |
| ruby2.1 | docker-registry.tools.wmflabs.org/toolforge-ruby21-sssd-web:latest |
| ruby2.7 | docker-registry.tools.wmflabs.org/toolforge-ruby27-sssd-web:latest |
| ruby3.1 | docker-registry.tools.wmflabs.org/toolforge-ruby31-sssd-web:latest |
| ruby3.3 | docker-registry.svc.toolforge.org/toolforge-ruby33-sssd-web:latest |
| tcl8.6 | docker-registry.tools.wmflabs.org/toolforge-tcl86-sssd-web:latest |
| tool-tf-test/component1:latest | 192.168.5.15/tool-tf-test/component1:latest |
| trixie | docker-registry.svc.toolforge.org/toolforge-trixie-web-sssd:latest |

Script:

#!/usr/bin/env python3

import subprocess

# short image names from `toolforge jobs images` (not all are web images)
IMAGES = [
    "bookworm", "bullseye", "jdk17", "jdk21", "mariadb", "mono6.12", "mono6.8",
    "node16", "node18", "node20", "perl5.32", "perl5.36", "perl5.40", "php7.4",
    "php8.2", "php8.4", "python3.11", "python3.13", "python3.9", "ruby2.1",
    "ruby2.7", "ruby3.1", "ruby3.3", "tcl8.6",
]


def create_jobs():
    print(f"attempting to create {len(IMAGES)} jobs via 'toolforge jobs run'...")
    for image in IMAGES:
        command = [
            "toolforge", "jobs", "run", image,
            "--command", "while true; do date; sleep 1; done",
            "--image", image, "--continuous", "--no-filelog",
        ]
        # the jdk images need more resources to start
        if image.startswith("jdk"):
            command += ["--cpu", "100m", "--mem", "256Mi"]

        process = subprocess.run(command, text=True, capture_output=True)
        print(process.stdout)
        if process.returncode != 0:
            print(f"error creating job {image}:")
            print(process.stderr)

    print(f"\ndone submitting {len(IMAGES)} jobs.")


if __name__ == "__main__":
    create_jobs()

Logs:

bookworm-5c5b978674-w7fsm job Thu Nov 13 10:26:09 PM UTC 2025
bookworm-5c5b978674-w7fsm job Thu Nov 13 10:26:10 PM UTC 2025
bookworm-5c5b978674-w7fsm job Thu Nov 13 10:26:11 PM UTC 2025

bullseye-54c667c89b-mfhd5 job Thu 13 Nov 2025 10:26:08 PM UTC
bullseye-54c667c89b-mfhd5 job Thu 13 Nov 2025 10:26:10 PM UTC
bullseye-54c667c89b-mfhd5 job Thu 13 Nov 2025 10:26:11 PM UTC

jdk17-b7b77845c-xm8ws job Thu 13 Nov 2025 10:26:08 PM UTC
jdk17-b7b77845c-xm8ws job Thu 13 Nov 2025 10:26:09 PM UTC
jdk17-b7b77845c-xm8ws job Thu 13 Nov 2025 10:26:10 PM UTC

jdk21-bb997974d-ms5ld job Thu Nov 13 10:26:09 PM UTC 2025
jdk21-bb997974d-ms5ld job Thu Nov 13 10:26:10 PM UTC 2025
jdk21-bb997974d-ms5ld job Thu Nov 13 10:26:11 PM UTC 2025

mariadb-5f5c8f8c5f-j8vrw job Thu Nov 13 10:26:09 PM UTC 2025
mariadb-5f5c8f8c5f-j8vrw job Thu Nov 13 10:26:10 PM UTC 2025
mariadb-5f5c8f8c5f-j8vrw job Thu Nov 13 10:26:11 PM UTC 2025

mono6.12-54cdf5dd85-xb4pk job Thu Nov 13 10:26:09 PM UTC 2025
mono6.12-54cdf5dd85-xb4pk job Thu Nov 13 10:26:10 PM UTC 2025
mono6.12-54cdf5dd85-xb4pk job Thu Nov 13 10:26:11 PM UTC 2025

mono6.8-5847fdf976-ns7h7 job Thu 13 Nov 2025 10:26:08 PM UTC
mono6.8-5847fdf976-ns7h7 job Thu 13 Nov 2025 10:26:09 PM UTC
mono6.8-5847fdf976-ns7h7 job Thu 13 Nov 2025 10:26:10 PM UTC

node16-5dbb9485cc-ndfcw job Thu 13 Nov 2025 10:26:09 PM UTC
node16-5dbb9485cc-ndfcw job Thu 13 Nov 2025 10:26:10 PM UTC
node16-5dbb9485cc-ndfcw job Thu 13 Nov 2025 10:26:11 PM UTC

node18-7bd968d6c5-zsvpb job Thu Nov 13 10:26:09 PM UTC 2025
node18-7bd968d6c5-zsvpb job Thu Nov 13 10:26:10 PM UTC 2025
node18-7bd968d6c5-zsvpb job Thu Nov 13 10:26:11 PM UTC 2025

node20-69947c5fc4-x8jrn job Thu Nov 13 10:26:08 PM UTC 2025
node20-69947c5fc4-x8jrn job Thu Nov 13 10:26:10 PM UTC 2025
node20-69947c5fc4-x8jrn job Thu Nov 13 10:26:11 PM UTC 2025

perl5.32-79d654f587-z9kp8 job Thu 13 Nov 2025 10:26:09 PM UTC
perl5.32-79d654f587-z9kp8 job Thu 13 Nov 2025 10:26:10 PM UTC
perl5.32-79d654f587-z9kp8 job Thu 13 Nov 2025 10:26:11 PM UTC

perl5.36-fd779457f-cqp6b job Thu Nov 13 10:26:08 PM UTC 2025
perl5.36-fd779457f-cqp6b job Thu Nov 13 10:26:10 PM UTC 2025
perl5.36-fd779457f-cqp6b job Thu Nov 13 10:26:11 PM UTC 2025

perl5.40-f74b7bb46-fvr8b job Thu Nov 13 10:26:08 PM UTC 2025
perl5.40-f74b7bb46-fvr8b job Thu Nov 13 10:26:10 PM UTC 2025
perl5.40-f74b7bb46-fvr8b job Thu Nov 13 10:26:11 PM UTC 2025

php7.4-6d87cfcfc7-vm945 job Thu 13 Nov 2025 10:26:08 PM UTC
php7.4-6d87cfcfc7-vm945 job Thu 13 Nov 2025 10:26:09 PM UTC
php7.4-6d87cfcfc7-vm945 job Thu 13 Nov 2025 10:26:10 PM UTC

php8.2-f465b4bf4-p4zgz job Thu Nov 13 10:26:08 PM UTC 2025
php8.2-f465b4bf4-p4zgz job Thu Nov 13 10:26:09 PM UTC 2025
php8.2-f465b4bf4-p4zgz job Thu Nov 13 10:26:10 PM UTC 2025

php8.4-59b56b7474-fxvmn job Thu Nov 13 10:26:08 PM UTC 2025
php8.4-59b56b7474-fxvmn job Thu Nov 13 10:26:10 PM UTC 2025
php8.4-59b56b7474-fxvmn job Thu Nov 13 10:26:11 PM UTC 2025

python3.11-7fb9d66f95-ftzc8 job Thu Nov 13 10:26:08 PM UTC 2025
python3.11-7fb9d66f95-ftzc8 job Thu Nov 13 10:26:10 PM UTC 2025
python3.11-7fb9d66f95-ftzc8 job Thu Nov 13 10:26:11 PM UTC 2025

python3.13-6bd6bb8c99-cl6nl job Thu Nov 13 10:26:08 PM UTC 2025
python3.13-6bd6bb8c99-cl6nl job Thu Nov 13 10:26:10 PM UTC 2025
python3.13-6bd6bb8c99-cl6nl job Thu Nov 13 10:26:11 PM UTC 2025

python3.9-64dbc457b-8wfvv job Thu 13 Nov 2025 10:26:09 PM UTC
python3.9-64dbc457b-8wfvv job Thu 13 Nov 2025 10:26:10 PM UTC
python3.9-64dbc457b-8wfvv job Thu 13 Nov 2025 10:26:11 PM UTC

ruby2.1-6b487dbf5f-t58ls job Thu Nov 13 22:26:08 UTC 2025
ruby2.1-6b487dbf5f-t58ls job Thu Nov 13 22:26:09 UTC 2025
ruby2.1-6b487dbf5f-t58ls job Thu Nov 13 22:26:11 UTC 2025

ruby2.7-56b6549c45-pplwp job Thu 13 Nov 2025 10:26:08 PM UTC
ruby2.7-56b6549c45-pplwp job Thu 13 Nov 2025 10:26:10 PM UTC
ruby2.7-56b6549c45-pplwp job Thu 13 Nov 2025 10:26:11 PM UTC

ruby3.1-5cd5d8db6-zbm9g job Thu Nov 13 10:26:08 PM UTC 2025
ruby3.1-5cd5d8db6-zbm9g job Thu Nov 13 10:26:09 PM UTC 2025
ruby3.1-5cd5d8db6-zbm9g job Thu Nov 13 10:26:11 PM UTC 2025

ruby3.3-788646bd9d-9q8kd job Thu Nov 13 10:26:08 PM UTC 2025
ruby3.3-788646bd9d-9q8kd job Thu Nov 13 10:26:10 PM UTC 2025
ruby3.3-788646bd9d-9q8kd job Thu Nov 13 10:26:11 PM UTC 2025

tcl8.6-6868745bcf-jbdc2 job Thu Nov 13 10:26:08 PM UTC 2025
tcl8.6-6868745bcf-jbdc2 job Thu Nov 13 10:26:10 PM UTC 2025
tcl8.6-6868745bcf-jbdc2 job Thu Nov 13 10:26:11 PM UTC 2025

It seems we don't need to do anything special to get these images to run as jobs @dcaro @fnegri

Raymond_Ndibe changed the task status from Open to In Progress.Nov 25 2025, 1:46 AM
fnegri moved this task from In Review to Done on the Toolforge (Toolforge iteration 25) board.

Marking as Resolved: the investigation was successful and we can proceed with T415322: Replace job image variants with webservice image variants.