
[dse-k8s] Provide common hadoop config for spark jobs
Closed, ResolvedPublic

Description

Spark jobs require the Hadoop configuration (core-site.xml / hdfs-site.xml files) to be available in order to access HDFS.

It is possible to push a ConfigMap with this configuration for each job, but that would lead to pushing the same config multiple times.

So we should add a common ConfigMap for it (one per Hadoop cluster).
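A minimal sketch of what such a shared ConfigMap could look like; the resource name, cluster name, and property values below are illustrative assumptions, not the actual deployment-charts content:

```yaml
# Hypothetical shared ConfigMap, one per Hadoop cluster,
# referenced by every Spark job in the namespace instead of
# each job shipping its own copy of the Hadoop config.
apiVersion: v1
kind: ConfigMap
metadata:
  name: hadoop-config-analytics   # illustrative name
data:
  core-site.xml: |
    <configuration>
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://analytics-hadoop</value>
      </property>
    </configuration>
  hdfs-site.xml: |
    <configuration>
      <property>
        <name>dfs.nameservices</name>
        <value>analytics-hadoop</value>
      </property>
    </configuration>
```

Each Spark job pod would then mount this ConfigMap as a volume and point `HADOOP_CONF_DIR` at the mount path, rather than bundling the files per job.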

Details

Event Timeline

nfraison changed the task status from Open to In Progress. Mar 23 2023, 3:52 PM
nfraison claimed this task.

Change 902402 had a related patch set uploaded (by Nicolas Fraison; author: Nicolas Fraison):

[operations/deployment-charts@master] spark: add hadoop conf configmap

https://gerrit.wikimedia.org/r/902402

BTullis subscribed.

Removing inactive assignee.

BTullis renamed this task from Provide common hadooop config for spark jobs to [dse-k8s] Provide common hadoop config for spark jobs. Jul 3 2023, 12:02 PM
BTullis triaged this task as Low priority.
Aklapper changed the task status from In Progress to Open. Apr 11 2025, 10:06 PM

Resetting task status from "In Progress" to "Open" as this task has been "in progress" for more than two years.

Change #902402 abandoned by Btullis:

[operations/deployment-charts@master] spark: add hadoop conf configmap

Reason:

Now taking a different approach.

https://gerrit.wikimedia.org/r/902402

BTullis claimed this task.

Resolving this, as it is now achieved through the spark-support chart: T406833