
[dse-k8s] Provide common hadoop config for spark jobs
Open, In Progress, Low, Public

Description

Spark jobs require the Hadoop configuration (the core-site.xml and hdfs-site.xml files) to be available in order to access HDFS.

It is possible to push a configmap with this configuration for each job, but that leads to pushing the same config multiple times.

So we should add a common configmap for it (one per Hadoop cluster).
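
As a rough illustration, a shared per-cluster ConfigMap could look like the sketch below. All names and property values here are assumptions for illustration only and are not taken from the actual patch.

```lang=yaml
# Hypothetical shared ConfigMap, one per Hadoop cluster.
# Names and values are illustrative, not from the deployed chart.
apiVersion: v1
kind: ConfigMap
metadata:
  name: hadoop-analytics-config   # assumed name
  namespace: spark                # assumed namespace
data:
  core-site.xml: |
    <?xml version="1.0"?>
    <configuration>
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://analytics-hadoop</value>  <!-- assumed nameservice -->
      </property>
    </configuration>
  hdfs-site.xml: |
    <?xml version="1.0"?>
    <configuration>
      <property>
        <name>dfs.nameservices</name>
        <value>analytics-hadoop</value>  <!-- assumed nameservice -->
      </property>
    </configuration>
```

Every Spark job on the cluster could then reference this single ConfigMap instead of shipping its own copy of the Hadoop config.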

Event Timeline

nfraison changed the task status from Open to In Progress. Mar 23 2023, 3:52 PM
nfraison claimed this task.

Change 902402 had a related patch set uploaded (by Nicolas Fraison; author: Nicolas Fraison):

[operations/deployment-charts@master] spark: add hadoop conf configmap

https://gerrit.wikimedia.org/r/902402
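
For context, a Spark driver or executor pod would typically consume such a ConfigMap by mounting it and pointing HADOOP_CONF_DIR at the mount path. The fragment below is a hedged sketch of that pattern, not the content of change 902402; the container image, names, and paths are assumptions.

```lang=yaml
# Hypothetical pod spec fragment: mount the shared Hadoop config and
# expose it to Spark via HADOOP_CONF_DIR. Not taken from the patch.
spec:
  containers:
    - name: spark-driver
      image: spark:3.3.0            # assumed image
      env:
        - name: HADOOP_CONF_DIR     # standard Hadoop/Spark env var
          value: /etc/hadoop/conf
      volumeMounts:
        - name: hadoop-conf
          mountPath: /etc/hadoop/conf
          readOnly: true
  volumes:
    - name: hadoop-conf
      configMap:
        name: hadoop-analytics-config   # assumed ConfigMap name
```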

BTullis subscribed.

Removing inactive assignee.

BTullis renamed this task from Provide common hadoop config for spark jobs to [dse-k8s] Provide common hadoop config for spark jobs. Jul 3 2023, 12:02 PM
BTullis triaged this task as Low priority.