
[BUG] Unable to assign environment variables #2031

Closed
RyanZotti opened this issue May 17, 2024 · 4 comments

Comments

@RyanZotti

Description

For the past couple of weeks I've had intermittent issues getting environment variables added to my pods: the variables weren't being added about 20-30% of the time. Now they're not being added at all.

Reproduction Code

Create a values.yaml file.

serviceAccounts:
  spark:
    create: true
    name: spark
rbac:
  createClusterRole: true
webhook:
  enable: true
  port: 443
  namespaceSelector: "spark-webhook-enabled=true"
image:
  repository: docker.io/kubeflow/spark-operator
  tag: v1beta2-1.4.6-3.5.0

Install the Helm chart.

helm install my-release spark-operator/spark-operator \
    -f values.yaml \
    --namespace spark-operator-env \
    --create-namespace \
    --version 1.2.15
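If the webhook isn't registered, env patching silently does nothing, so before debugging the app itself it can help to confirm the operator and its webhook are actually up. A rough sanity check (the exact resource names depend on the chart and release, so the grep is a loose match):

```shell
# Confirm the operator pod is running in the release namespace
kubectl get pods -n spark-operator-env

# Confirm a mutating webhook configuration was registered for the operator
kubectl get mutatingwebhookconfigurations | grep -i spark
```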

Create a ScheduledSparkApplication in spark-py-pi.yaml:

apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: ScheduledSparkApplication
metadata:
  name: pi-scheduled-spark-application
  namespace: "spark-operator-env"
spec:
  schedule: "* * * * *"
  template:
    type: Python
    pythonVersion: "3"
    mode: cluster
    image: "spark:3.5.1-python3"
    mainApplicationFile: local:///opt/spark/examples/src/main/python/pi.py
    sparkVersion: "3.5.0"
    driver:
      env:
        - name: ABC
          value: "123"
      cores: 1
      coreLimit: "1200m"
      memory: "512m"
      labels:
        version: 3.5.0
      serviceAccount: spark
    executor:
      cores: 1
      instances: 1
      memory: "512m"
      labels:
        version: 3.5.0

Apply the scheduled app:

kubectl apply -f spark-py-pi.yaml
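With the schedule "* * * * *", a new run fires roughly every minute, and each run's driver pod gets a generated name suffix. One way to catch the current driver pod name is to watch the namespace:

```shell
# Watch for driver pods as the scheduled runs fire (about once a minute)
kubectl get pods -n spark-operator-env -w
```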

Expected behavior

I expect ABC to show up as an environment variable when I run:

kubectl describe pod -n spark-operator-env pi-scheduled-spark-application-1715975950367111977-driver 

Actual behavior

Instead, the only environment variables I see when I run kubectl describe are these:

Environment:
      SPARK_USER:                 root
      SPARK_APPLICATION_ID:       spark-fa6befd0696a4d679eb250971b8ff27d
      SPARK_DRIVER_BIND_ADDRESS:   (v1:status.podIP)
      SPARK_LOCAL_DIRS:           /var/data/spark-487682f1-a657-461b-a282-2e724fa3a92b
      SPARK_CONF_DIR:             /opt/spark/conf

Environment & Versions

  • Spark Operator App version:
  • Helm Chart Version: 1.2.14, and 1.2.15
  • Kubernetes Version: 1.28.3
  • Apache Spark version: 3.5.0
@imtzer commented May 20, 2024

spec.template.driver.env is set by the webhook, which listens on port 8080 by default. One way to fix this is to use the default port setting in values.yaml, like:

webhook:
  enable: true 
  port: 8080

The other way is to set the Spark operator's '--webhook-port' parameter to your chosen port, 443.

@RyanZotti (Author)

@imtzer I updated the port to point to 8080 like you described, but I'm still not seeing the environment variables added. What's the reason for updating the port?

@imtzer commented May 21, 2024

spec.template.driver.env is patched by the webhook, whose default port is 8080; alternatively, you can remove webhook.port entirely to fall back to the default.
Also, you have set webhook.namespaceSelector. That makes the webhook act only on namespaces carrying the matching key=value label; if you do not need that feature, remove it and the webhook will apply to everything.

@RyanZotti (Author)

@imtzer That was it! The problem was the namespaceSelector: once I removed it, the environment variables showed up. Thank you so much!
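For readers who want to keep the selector rather than remove it, labeling the namespace so it matches should also work (the label key/value here is the one from the values.yaml above):

```shell
# Make the namespace match webhook.namespaceSelector: "spark-webhook-enabled=true"
kubectl label namespace spark-operator-env spark-webhook-enabled=true
```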
