```
NAME                                       READY   STATUS      RESTARTS   AGE
pod/spark-operator-5d7df588f-vctkn         1/1     Running     0          17h
pod/spark-operator-webhook-init--1-46xjf   0/1     Completed   0          17h

NAME                             TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)    AGE
service/spark-operator-webhook   ClusterIP   *            <none>        443/TCP    17h
service/spark-pi-1-ui-svc        ClusterIP   *            <none>        4040/TCP   15h
service/spark-pi-2-ui-svc        ClusterIP   *            <none>        4040/TCP   23m
service/spark-pi-ui-svc          ClusterIP   *            <none>        4040/TCP   55m

NAME                             READY   UP-TO-DATE   AVAILABLE   AGE
deployment.apps/spark-operator   1/1     1            1           17h

NAME                                       DESIRED   CURRENT   READY   AGE
replicaset.apps/spark-operator-5d7df588f   1         1         1       17h

NAME                                    COMPLETIONS   DURATION   AGE
job.batch/spark-operator-webhook-init   1/1           2m18s      17h
```
```
$ kubectl describe pod spark-pi-2-driver -n spark-latest
Events:
  Type     Reason       Age                From               Message
  ----     ------       ----               ----               -------
  Normal   Scheduled    28s                default-scheduler  Successfully assigned spark-latest/spark-pi-2-driver to *******, elapsedTime: 29.683152ms
  Warning  FailedMount  12s (x6 over 28s)  kubelet            MountVolume.SetUp failed for volume "spark-conf-volume-driver" : configmap "spark-drv-76a7dd8f561347f4-conf-map" not found
  Warning  FailedMount  12s (x6 over 28s)  kubelet            MountVolume.SetUp failed for volume "config-vol" : configmap "dummy-cm" not found
```
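If I read the events right, the second `FailedMount` comes from a user-defined volume that references a ConfigMap (`dummy-cm`) that does not exist in the `spark-latest` namespace. A minimal sketch of the relevant SparkApplication fields is below; the volume name and ConfigMap name are taken from the event above, while the application name and `mountPath` are assumptions for illustration:

```yaml
# Sketch only: "config-vol" and "dummy-cm" come from the FailedMount event;
# the app name and mountPath are assumed. Field names follow the
# sparkoperator.k8s.io/v1beta2 SparkApplication CRD.
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: spark-pi-2
  namespace: spark-latest
spec:
  volumes:
    - name: config-vol
      configMap:
        name: dummy-cm              # must exist in spark-latest before the driver starts
  driver:
    volumeMounts:
      - name: config-vol
        mountPath: /opt/spark/extra-conf   # assumed path
```

Creating the ConfigMap first (e.g. `kubectl create configmap dummy-cm -n spark-latest --from-literal=key=value`) should clear that particular mount failure. The first `FailedMount`, for `spark-conf-volume-driver`, refers to a ConfigMap that is generated per submission by Spark itself, so it may be a symptom of the same failed run rather than a separate misconfiguration.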
I submitted a test job, but the driver pod reported that the referenced ConfigMaps could not be found, and the whole test job failed.

App version: v1beta2-1.4.5-3.5.0
Commands used:

```
kubectl get all -n spark-latest
kubectl describe pod spark-pi-2-driver -n spark-latest
```