
Commit 7d4c044

Fixed issue with Jackson dependencies
Ran into an issue when testing the java-dependency example project, where the Java Client's preferred Jackson version, 2.15.2, was being used, which Spark 3.4 does not support. That made me realize our connector should not bring any Jackson dependencies with it - it needs to use whatever the user's Spark distribution provides. So I updated our build.gradle file to exclude the Jackson dependencies coming from marklogic-client-api, ml-app-deployer, and marklogic-junit5 (the latter two ensure that the tests also avoid Jackson 2.15.2 and instead use Spark 3.4.1's preferred version, 2.14.2). After fixing this, I realized it is perfectly safe to upgrade to Java Client 6.4.0, as it uses Jackson 2.15.2 just like Java Client 6.3.0 did. All tests pass, and I verified that every example in the docs works as well.
Parent commit: 79d3c5d
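Not part of this commit, but a quick way to confirm which Jackson version actually wins on an application's classpath is to print jackson-databind's self-reported version. The `PackageVersion` class is real jackson-databind API; the program around it is a hypothetical sketch:

```java
import com.fasterxml.jackson.databind.cfg.PackageVersion;

// Hypothetical sketch, not part of this commit: print the jackson-databind
// version that actually resolved on the classpath. With the excludes in this
// commit, a Spark 3.4.1 application should see Spark's preferred 2.14.2
// rather than the Java Client's 2.15.2.
public class JacksonVersionCheck {
    public static void main(String[] args) {
        System.out.println("jackson-databind on classpath: " + PackageVersion.VERSION);
    }
}
```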

File tree: 4 files changed (+25 −7 lines)

build.gradle (21 additions, 3 deletions)

```diff
@@ -21,7 +21,15 @@ repositories {
 
 dependencies {
     compileOnly 'org.apache.spark:spark-sql_2.12:' + sparkVersion
-    implementation "com.marklogic:marklogic-client-api:6.3.0"
+    implementation ("com.marklogic:marklogic-client-api:6.4.0") {
+        // The Java Client uses Jackson 2.15.2; Spark 3.4.x does not yet support that and will throw the following error:
+        // Scala module 2.14.2 requires Jackson Databind version >= 2.14.0 and < 2.15.0 - Found jackson-databind version 2.15.2
+        // So the 4 Jackson modules are excluded to allow Spark's own Jackson dependencies to be used.
+        exclude module: 'jackson-core'
+        exclude module: 'jackson-databind'
+        exclude module: 'jackson-annotations'
+        exclude module: 'jackson-dataformat-csv'
+    }
 
     // Makes it possible to use lambdas in Java 8 to implement Spark's Function1 and Function2 interfaces
     // See https://github.com/scala/scala-java8-compat for more information
@@ -31,8 +39,18 @@ dependencies {
     }
 
     testImplementation 'org.apache.spark:spark-sql_2.12:' + sparkVersion
-    testImplementation 'com.marklogic:ml-app-deployer:4.6.0'
-    testImplementation 'com.marklogic:marklogic-junit5:1.4.0'
+    testImplementation ('com.marklogic:ml-app-deployer:4.6.0') {
+        exclude module: 'jackson-core'
+        exclude module: 'jackson-databind'
+        exclude module: 'jackson-annotations'
+        exclude module: 'jackson-dataformat-csv'
+    }
+    testImplementation ('com.marklogic:marklogic-junit5:1.4.0') {
+        exclude module: 'jackson-core'
+        exclude module: 'jackson-databind'
+        exclude module: 'jackson-annotations'
+        exclude module: 'jackson-dataformat-csv'
+    }
     testImplementation "ch.qos.logback:logback-classic:1.3.5"
     testImplementation "org.slf4j:jcl-over-slf4j:1.7.36"
     testImplementation "org.skyscreamer:jsonassert:1.5.1"
```
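To verify that excludes like these take effect, Gradle's built-in `dependencyInsight` report (standard Gradle tooling, not something added by this commit) shows which jackson-databind version each configuration resolves:

```shell
# Report how jackson-databind resolves on the test classpath; with the
# excludes above, this should show Spark 3.4.1's preferred 2.14.2 rather
# than the Java Client's 2.15.2.
./gradlew dependencyInsight --dependency jackson-databind --configuration testRuntimeClasspath
```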

examples/java-dependency/build.gradle (2 additions, 2 deletions)

```diff
@@ -8,8 +8,8 @@ repositories {
 }
 
 dependencies {
-    implementation 'org.apache.spark:spark-sql_2.12:3.3.2'
-    implementation 'com.marklogic:marklogic-spark-connector:2.0-SNAPSHOT'
+    implementation 'org.apache.spark:spark-sql_2.12:3.4.1'
+    implementation 'com.marklogic:marklogic-spark-connector:2.1-SNAPSHOT'
 }
 
 task runApp(type: JavaExec) {
```
gradle-wrapper.properties (1 addition, 1 deletion)

```diff
@@ -1,5 +1,5 @@
 distributionBase=GRADLE_USER_HOME
 distributionPath=wrapper/dists
-distributionUrl=https\://services.gradle.org/distributions/gradle-7.5.1-bin.zip
+distributionUrl=https\://services.gradle.org/distributions/gradle-8.4-bin.zip
 zipStoreBase=GRADLE_USER_HOME
 zipStorePath=wrapper/dists
```

examples/java-dependency/src/main/java/org/example/App.java (1 addition, 1 deletion)

```diff
@@ -16,7 +16,7 @@ public static void main(String[] args) {
         List<Row> rows = session
             .read()
             .format("com.marklogic.spark")
-            .option("spark.marklogic.client.uri", "spark-example-user:password@localhost:8020")
+            .option("spark.marklogic.client.uri", "spark-example-user:password@localhost:8003")
             .option("spark.marklogic.read.opticQuery", "op.fromView('example', 'employee', '')")
             .load()
             .filter("City == 'San Diego'")
```
