Release tag hard coded #155

@gustavosr98

Description

On 3.5 it will fail since the tag does not exist:

  Warning  Failed     7s (x2 over 25s)   kubelet            Failed to pull image "ghcr.io/canonical/charmed-spark:3.5-22.04": rpc error: code = NotFound desc = failed to pull and unpack image "ghcr.io/canonical/charmed-spark:3.5-22.04": failed to resolve reference "ghcr.io/canonical/charmed-spark:3.5-22.04": ghcr.io/canonical/charmed-spark:3.5-22.04: not found

spark.kubernetes.container.image=ghcr.io/canonical/charmed-spark:3.5-22.04


On 3.4/edge, when using a PySpark shell, it will use an old build (published almost two years ago) with the tag 3.4:
https://github.com/canonical/charmed-spark-rock/blob/3.4-22.04/edge/images/charmed-spark/conf/spark-defaults.conf#L1


Trying to explicitly set the image:

from pyspark.sql import SparkSession

# ---- Spark versions
# They should all match the Spark version of the local client
# SPARK_NOTEBOOK_IMG = "ghcr.io/canonical/charmed-spark-jupyterlab:3.4-22.04_edge"
SPARK_IMG = "ghcr.io/canonical/charmed-spark:3.4-22.04_edge"  # or ghcr.io/canonical/charmed-spark-gpu
# SPARK_IMG_GPU = "ghcr.io/canonical/charmed-spark-gpu:3.4-22.04_edge"

# Create a Spark session
spark = (
    SparkSession.builder
    .appName("S3example")
    .config("spark.kubernetes.container.image", SPARK_IMG)
    .config("spark.kubernetes.driver.container.image", SPARK_IMG)
    .config("spark.kubernetes.executor.container.image", SPARK_IMG)
    .config("spark.executor.instances", "2")
    .config("spark.executor.cores", "2")
    .config("spark.executor.memory", "8G")
    .config("spark.hadoop.fs.s3a.path.style.access", "true")
    .config("spark.hadoop.fs.s3a.endpoint", S3_ENDPOINT)
    .config("spark.hadoop.fs.s3a.access.key", S3_ACCESS_KEY)
    .config("spark.hadoop.fs.s3a.secret.key", S3_SECRET_KEY)
    .config("spark.hadoop.fs.s3a.aws.credentials.provider", "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider")
    .getOrCreate()
)

But I get:

2025-08-14T22:07:40.035Z [entrypoint] java.io.InvalidClassException: org.apache.spark.util.SerializableBuffer; local class incompatible: stream classdesc serialVersionUID = -5096107160662392145, local class serialVersionUID = -6275049695822025938
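The `InvalidClassException` (mismatched `serialVersionUID`) typically means the local PySpark client and the Spark build inside the container image are different versions. A minimal sketch of a sanity check, assuming the charmed-spark tag convention `<spark-version>-<ubuntu-version>_<risk>` (the helper name is mine, not part of any API):

```python
def spark_version_from_image(image: str) -> str:
    """Extract the Spark version from a charmed-spark image reference,
    e.g. 'ghcr.io/canonical/charmed-spark:3.4-22.04_edge' -> '3.4'."""
    tag = image.rsplit(":", 1)[1]   # "3.4-22.04_edge"
    return tag.split("-", 1)[0]     # "3.4"

SPARK_IMG = "ghcr.io/canonical/charmed-spark:3.4-22.04_edge"
print(spark_version_from_image(SPARK_IMG))  # 3.4
```

Comparing this against the local client (e.g. `pyspark.__version__`) before calling `getOrCreate()` would catch the mismatch early instead of failing at task serialization time.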
