
Commit 8d97f65

Merge pull request #199 from marklogic/feature/docs-update

Bumping version in docs

2 parents a15dc23 + 84c461f

9 files changed: 23 additions, 174 deletions

docs/getting-started/jupyter.md

Lines changed: 2 additions & 2 deletions

@@ -32,15 +32,15 @@ connector and also to initialize Spark:
 
 ```
 import os
-os.environ['PYSPARK_SUBMIT_ARGS'] = '--jars "/path/to/marklogic-spark-connector-2.1.0.jar" pyspark-shell'
+os.environ['PYSPARK_SUBMIT_ARGS'] = '--jars "/path/to/marklogic-spark-connector-2.2.0.jar" pyspark-shell'
 
 from pyspark.sql import SparkSession
 spark = SparkSession.builder.master("local[*]").appName('My Notebook').getOrCreate()
 spark.sparkContext.setLogLevel("WARN")
 spark
 ```
 
-The path of `/path/to/marklogic-spark-connector-2.1.0.jar` should be changed to match the location of the connector
+The path of `/path/to/marklogic-spark-connector-2.2.0.jar` should be changed to match the location of the connector
 jar on your filesystem. You are free to customize the `spark` variable in any manner you see fit as well.
 
 Now that you have an initialized Spark session, you can run any of the examples found in the
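The notebook snippet changed above embeds the connector version in one string. A minimal sketch (the `/path/to/` prefix is the docs' own placeholder, and the single-variable layout is a suggestion, not part of the connector) that derives `PYSPARK_SUBMIT_ARGS` from one version constant, so a docs bump like this commit only touches one line:

```python
import os

# Hypothetical convention: keep the connector version in one place.
CONNECTOR_VERSION = "2.2.0"
CONNECTOR_JAR = f"/path/to/marklogic-spark-connector-{CONNECTOR_VERSION}.jar"

# Same environment variable the notebook sets so PySpark loads the connector jar.
os.environ["PYSPARK_SUBMIT_ARGS"] = f'--jars "{CONNECTOR_JAR}" pyspark-shell'
print(os.environ["PYSPARK_SUBMIT_ARGS"])
```

This must run before `SparkSession.builder` is first invoked, since `PYSPARK_SUBMIT_ARGS` is read when the JVM backing the session starts.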

docs/getting-started/pyspark.md

Lines changed: 1 addition & 1 deletion

@@ -29,7 +29,7 @@ shell by pressing `ctrl-D`.
 
 Run PySpark from the directory that you downloaded the connector to per the [setup instructions](setup.md):
 
-    pyspark --jars marklogic-spark-connector-2.1.0.jar
+    pyspark --jars marklogic-spark-connector-2.2.0.jar
 
 The `--jars` command line option is PySpark's method for utilizing Spark connectors. Each Spark environment should have
 a similar mechanism for including third party connectors; please see the documentation for your particular Spark
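The invocation changed above can be built from a single version value; a trivial sketch (the version-variable layout is a suggestion, assuming the jar sits in the current directory as the setup instructions arrange):

```python
# Version this commit bumps the docs to.
connector_version = "2.2.0"
jar = f"marklogic-spark-connector-{connector_version}.jar"

# The command the docs run; --jars is how PySpark is handed third-party connectors.
command = f"pyspark --jars {jar}"
print(command)
```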

docs/getting-started/setup.md

Lines changed: 2 additions & 2 deletions

@@ -31,10 +31,10 @@ have an instance of MarkLogic running, you can skip step 4 below, but ensure that the
 extracted directory contains valid connection properties for your instance of MarkLogic.
 
 1. From [this repository's Releases page](https://github.com/marklogic/marklogic-spark-connector/releases), select
-   the latest release and download the `marklogic-spark-getting-started-2.1.0.zip` file.
+   the latest release and download the `marklogic-spark-getting-started-2.2.0.zip` file.
 2. Extract the contents of the downloaded zip file.
 3. Open a terminal window and go to the directory created by extracting the zip file; the directory should have a
-   name of "marklogic-spark-getting-started-2.1.0".
+   name of "marklogic-spark-getting-started-2.2.0".
 4. Run `docker-compose up -d` to start an instance of MarkLogic
 5. Ensure that the `./gradlew` file is executable; depending on your operating system, you may need to run
    `chmod 755 gradlew` to make the file executable.

examples/entity-aggregation/build.gradle

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ repositories {
 
 dependencies {
     implementation 'org.apache.spark:spark-sql_2.12:3.4.1'
-    implementation "com.marklogic:marklogic-spark-connector:2.1.0"
+    implementation "com.marklogic:marklogic-spark-connector:2.2.0"
     implementation "org.postgresql:postgresql:42.6.0"
 }

examples/entity-aggregation/src/main/java/org/example/ImportCustomers.java

Lines changed: 1 addition & 1 deletion

@@ -64,7 +64,7 @@ public static void main(String[] args) {
             // The remainder calls use the MarkLogic Spark connector to write customer rows, with nested rentals, to
             // the Documents database in MarkLogic.
             .write()
-            .format("com.marklogic.spark")
+            .format("marklogic")
             .option("spark.marklogic.client.host", "localhost")
             .option("spark.marklogic.client.port", "8000")
             .option("spark.marklogic.client.username", "admin")

examples/entity-aggregation/src/main/java/org/example/ImportCustomersWithRentalsAndPayments.java

Lines changed: 1 addition & 1 deletion

@@ -58,7 +58,7 @@ public static void main(String[] args) {
             // The remaining calls use the MarkLogic Spark connector to write customer rows, with nested rentals and
             // sub-nested payments, to the Documents database in MarkLogic.
             .write()
-            .format("com.marklogic.spark")
+            .format("marklogic")
             .option("spark.marklogic.client.host", "localhost")
             .option("spark.marklogic.client.port", "8000")
             .option("spark.marklogic.client.username", "admin")

examples/getting-started/marklogic-spark-getting-started.ipynb

Lines changed: 13 additions & 164 deletions

Large diffs are not rendered by default.

examples/java-dependency/build.gradle

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ repositories {
 
 dependencies {
     implementation 'org.apache.spark:spark-sql_2.12:3.4.1'
-    implementation 'com.marklogic:marklogic-spark-connector:2.1.0'
+    implementation 'com.marklogic:marklogic-spark-connector:2.2.0'
 }
 
 task runApp(type: JavaExec) {

examples/java-dependency/src/main/java/org/example/App.java

Lines changed: 1 addition & 1 deletion

@@ -15,7 +15,7 @@ public static void main(String[] args) {
         try {
             List<Row> rows = session
                 .read()
-                .format("com.marklogic.spark")
+                .format("marklogic")
                 .option("spark.marklogic.client.uri", "spark-example-user:password@localhost:8003")
                 .option("spark.marklogic.read.opticQuery", "op.fromView('example', 'employee', '')")
                 .load()
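Both Java examples switch the data source name from `com.marklogic.spark` to the shorter `marklogic`. A sketch of the equivalent read configuration in Python (option keys and values are copied from the Java example above; the actual PySpark call, noted in the comment, is not executed here since it needs PySpark plus a running MarkLogic instance):

```python
# Short data source name this commit switches the examples to
# (previously "com.marklogic.spark").
MARKLOGIC_FORMAT = "marklogic"

# Options copied from the Java example; in PySpark these would be passed as
# spark.read.format(MARKLOGIC_FORMAT).options(**read_options).load()
read_options = {
    "spark.marklogic.client.uri": "spark-example-user:password@localhost:8003",
    "spark.marklogic.read.opticQuery": "op.fromView('example', 'employee', '')",
}
print(MARKLOGIC_FORMAT, sorted(read_options))
```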

0 commit comments