Failure example:
ssh deployment.eqiad.wmnet
kube_env airflow-platform-eng-deploy dse-k8s-eqiad
kubectl exec -it $(kubectl get pod -l app=airflow,component=hadoop-shell --no-headers -o custom-columns=":metadata.name") -- bash
yarn logs -appOwner analytics-platform-eng -applicationId application_1764064841637_1305761
...
26/01/22 11:38:00 ERROR WMFSparkSQLCLIDriver: Could not open input file for reading. (File does not exist: /wmf/refinery/current/hql/aggregate_pageview_to_fundraising_projectview.hql
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:72)
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:62)
...
Checked manually; the file indeed doesn't exist at that path on HDFS.
And the default path from DagProperties doesn't exist either:

agg_for_fundraising_hourly_hql=f"{hql_directory}/fundraising/aggregate_pageview_to_fundraising_projectview.hql",
Job: https://airflow-platform-eng.wikimedia.org/dags/aggregate_for_fundraising_hourly/grid