The Spark Operator uses [dep](https://golang.github.io/dep/) for dependency management. Please install `dep` following
the instructions on the website if you don't have it available locally. To install the dependencies, run the following
command:

```bash
$ dep ensure
```
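
`dep ensure` resolves the dependencies declared in `Gopkg.toml`/`Gopkg.lock` and populates the `vendor/` directory. A quick, purely illustrative way to confirm that the vendored packages landed is:

```bash
# List the vendored packages pulled in by `dep ensure`.
$ ls vendor/
```
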
To update the dependencies, run the following command:
```bash
$ dep ensure -update
```
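
`dep` can also update a single project instead of the whole dependency graph; for example (the import path below is only a placeholder, substitute a project listed in `Gopkg.toml`):

```bash
# Update a single dependency rather than everything at once.
$ dep ensure -update github.com/example/some-dependency
```
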
Before building the Spark Operator for the first time, run the following commands to get the required Kubernetes code
generators:

```bash
$ go get -u k8s.io/code-generator/cmd/deepcopy-gen
$ go get -u k8s.io/code-generator/cmd/defaulter-gen
```
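
`go get -u` builds these generators and installs their binaries into `$GOPATH/bin`. Assuming a standard `GOPATH` setup, a quick check that they are in place is:

```bash
# The generator binaries should now be present in $GOPATH/bin.
$ ls $GOPATH/bin | grep -E 'deepcopy-gen|defaulter-gen'
```
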
To build the Spark Operator, run the following command:
```bash
$ make build
```
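
The exact build steps and outputs are defined by the project's `Makefile`. If you want to see which other targets it exposes, a generic, project-agnostic heuristic is:

```bash
# Roughly list the targets defined in the Makefile.
$ grep -E '^[A-Za-z0-9_.-]+:' Makefile
```
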
To build a Docker image of the Spark Operator, run the following command:
```bash
$ make image-tag=<image tag> image
```
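
For example, with a hypothetical Docker Hub repository and tag (substitute your own):

```bash
# Build the operator image with an illustrative tag; replace "myuser" with your registry path.
$ make image-tag=myuser/spark-operator:latest image
```
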
To push the Docker image to Docker Hub, run the following command:
```bash
$ make image-tag=<image tag> push
```
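
Pushing assumes you are logged in to the registry and that the image was built with the same tag. An end-to-end example with the same illustrative tag as above:

```bash
# Log in to Docker Hub (interactive), then build and push with a matching tag.
$ docker login
$ make image-tag=myuser/spark-operator:latest image
$ make image-tag=myuser/spark-operator:latest push
```
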
## Installation
To install the Spark Operator on a Kubernetes cluster, run the following command:
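
A minimal sketch of such an installation, assuming the deployment manifests live in a `manifest/` directory of the repository (the path is an assumption; adjust it to your checkout):

```bash
# Apply the operator's deployment manifests to the current cluster context (manifest path is assumed).
$ kubectl apply -f manifest/
```
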
Currently the following annotations are supported:

|Annotation|Value|
|---|---|
|`sparkoperator.k8s.io/hadoopConfigMap`|Name of the Kubernetes ConfigMap storing Hadoop configuration files (to which `HADOOP_CONF_DIR` applies)|
|`sparkoperator.k8s.io/configMap.[ConfigMapName]`|Mount path of the ConfigMap named `ConfigMapName`|
|`sparkoperator.k8s.io/GCPServiceAccount.[ServiceAccountSecretName]`|Mount path of the secret storing GCP service account credentials (typically a JSON key file) named `ServiceAccountSecretName`|
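
For example, to see which of these annotations ended up on a running driver pod (the pod name below is illustrative):

```bash
# Print the annotations of a driver pod; replace the name with your application's driver pod.
$ kubectl get pod spark-pi-driver -o jsonpath='{.metadata.annotations}'
```
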
## Build
If you want to build the Spark Operator from the source code, e.g., to test a fix or a feature you are working on, you can do so by following the instructions below.

To get the Spark Operator, run the following commands:
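
A sketch of getting the source, assuming the upstream repository is `github.com/GoogleCloudPlatform/spark-on-k8s-operator` and a classic `dep`-era `GOPATH` layout (both are assumptions; adjust them to where the project actually lives):

```bash
# Clone the operator into the GOPATH so that `dep` and the code generators resolve imports correctly.
$ mkdir -p $GOPATH/src/github.com/GoogleCloudPlatform
$ cd $GOPATH/src/github.com/GoogleCloudPlatform
$ git clone https://github.com/GoogleCloudPlatform/spark-on-k8s-operator.git
$ cd spark-on-k8s-operator
```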