Commit ef78b49 (1 parent: eaa2651)

Bumped version of Apache Spark to 3.5.5 to prevent 404 error
+ Fixed version of repo in dist files

File tree: 6 files changed (+8, -8 lines)


Dockerfile

Lines changed: 1 addition & 1 deletion

@@ -44,7 +44,7 @@ ENV PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH
 RUN mkdir -p ${HADOOP_VERSION}/logs
 
 # Install Spark
-ENV SPARK_VERSION=3.5.4
+ENV SPARK_VERSION=3.5.5
 RUN wget https://dlcdn.apache.org/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop3.tgz \
     && tar -xzf spark-${SPARK_VERSION}-bin-hadoop3.tgz -C /opt/ \
     && rm spark-${SPARK_VERSION}-bin-hadoop3.tgz
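The 404 this commit fixes follows from how Apache distributes releases: dlcdn.apache.org only serves current releases, so once 3.5.4 was superseded its URL went dead, while superseded versions remain available on archive.apache.org. A minimal sketch (a hypothetical helper, not part of the repo) that derives both the primary and archive URLs for a pinned version:

```shell
#!/bin/sh
# Hypothetical helper: build both download URLs for a pinned Spark version.
# dlcdn.apache.org hosts only current releases; older releases move to
# archive.apache.org, which is why pinning 3.5.4 began returning 404.
SPARK_VERSION="3.5.5"
TARBALL="spark-${SPARK_VERSION}-bin-hadoop3.tgz"
PRIMARY_URL="https://dlcdn.apache.org/spark/spark-${SPARK_VERSION}/${TARBALL}"
ARCHIVE_URL="https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/${TARBALL}"
echo "$PRIMARY_URL"
echo "$ARCHIVE_URL"
```

Trying the primary URL first and falling back to the archive URL on failure would make the Dockerfile resilient to the next version rotation.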

README.md

Lines changed: 1 addition & 1 deletion

@@ -31,7 +31,7 @@ This repository is inspired by and uses several scripts taken from [Rubenafo's r
 
 - ✅ Ready to deploy in a Docker Swarm cluster: all the networking and port configuration issues have been fixed so you can scale your cluster to as many worker nodes as you need.
 - ⚡️ Hadoop, HDFS, Spark, Scala and PySpark ready to use: all the tools are available inside the container globally so you don't have to fight with environment variables and executable paths.
-- 🌟 New technology: our image offers Hadoop 3.4.0, Spark 3.5.4 and Python 3.12.6
+- 🌟 New technology: our image offers Hadoop 3.4.0, Spark 3.5.5 and Python 3.12.6
 - ⚙️ Less configuration: we have removed some settings to keep the minimum possible configuration, this way you prevent errors, unexpected behaviors and get the freedom to set parameters via environment variables and have an agile development that does not require rebuilding the Docker image.
 - 🐍 Python dependencies: we include the most used Python dependencies like Pandas, Numpy and Scipy to be able to work on datasets and perform mathematical operations (you can remove them if you don't need them!)

docker-compose.yml

Lines changed: 2 additions & 2 deletions

@@ -1,7 +1,7 @@
 services:
   # Master
   master-node:
-    image: "jwaresolutions/big-data-cluster:0.5.0"
+    image: "jwaresolutions/big-data-cluster:1.0.1"
     container_name: "master-node"
     restart: "always"
     command: bash -c "/home/big_data/spark-cmd.sh start master-node"
@@ -18,7 +18,7 @@ services:
 
   # Workers
   worker:
-    image: "jwaresolutions/big-data-cluster:0.5.0"
+    image: "jwaresolutions/big-data-cluster:1.0.1"
     restart: "always"
     command: bash -c "/home/big_data/spark-cmd.sh start"
     deploy:

docker-compose_cluster.yml

Lines changed: 2 additions & 2 deletions

@@ -1,7 +1,7 @@
 services:
   # Master
   master-node:
-    image: "jwaresolutions/big-data-cluster:0.5.0"
+    image: "jwaresolutions/big-data-cluster:1.0.1"
     command: bash -c "/home/big_data/spark-cmd.sh start master-node"
     ports:
       - target: 8080
@@ -32,7 +32,7 @@ services:
 
   # Workers
   worker:
-    image: "jwaresolutions/big-data-cluster:0.5.0"
+    image: "jwaresolutions/big-data-cluster:1.0.1"
     command: bash -c "/home/big_data/spark-cmd.sh start"
     depends_on:
       - "master-node"

toy-cluster.sh

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
 #!/bin/bash
 
-imageName="jwaresolutions/big-data-cluster:0.5.0"
+imageName="jwaresolutions/big-data-cluster:1.0.1"
 
 # Bring the services up
 function startServices {

version.txt

Lines changed: 1 addition & 1 deletion

@@ -1 +1 @@
-1.0.0
+1.0.1
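The same image tag is pinned in three dist files plus version.txt, and this commit bumps all of them by hand. A hedged sketch of scripting that bump (a hypothetical helper, not part of the repo; `sed -i` without a suffix assumes GNU sed):

```shell
#!/bin/sh
# Hypothetical helper: bump the pinned image tag in every dist file,
# mirroring the manual edits in this commit.
# Note: `sed -i` with no backup suffix assumes GNU sed (BSD sed needs -i '').
OLD_TAG="0.5.0"
NEW_TAG="1.0.1"
for f in docker-compose.yml docker-compose_cluster.yml toy-cluster.sh; do
  # Only rewrite files that exist in the working directory.
  [ -f "$f" ] && sed -i "s|big-data-cluster:${OLD_TAG}|big-data-cluster:${NEW_TAG}|g" "$f"
done
printf '%s\n' "$NEW_TAG" > version.txt
```

Keeping the tag in one place (e.g. read from version.txt) would avoid the four-file edit entirely.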
