"TypeError: 'JavaPackage' object is not callable" when using PySpark 3.3.0 and sparknlp 4.0.2 #12576
Unanswered
Colinnnnnm asked this question in Q&A
Replies: 1 comment
-
You forgot to include the Spark NLP Maven package in your SparkSession. Additional parameters can be set if needed; see the packages cheatsheet: https://github.com/JohnSnowLabs/spark-nlp#packages-cheatsheet

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName("Spark NLP") \
    .master("local[*]") \
    .config("spark.driver.memory", "16G") \
    .config("spark.driver.maxResultSize", "0") \
    .config("spark.kryoserializer.buffer.max", "2000M") \
    .config("spark.jars.packages", "com.johnsnowlabs.nlp:spark-nlp_2.12:4.0.2") \
    .getOrCreate()
```

Extra examples: https://github.com/JohnSnowLabs/spark-nlp-workshop
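Alternatively, the `spark-nlp` PyPI package ships a `sparknlp.start()` helper that builds a SparkSession with the matching Maven package preconfigured. A minimal sketch, assuming `pip install spark-nlp==4.0.2 pyspark==3.3.0` has been run:

```python
import sparknlp

# start() creates (or reuses) a SparkSession and adds the
# com.johnsnowlabs.nlp:spark-nlp Maven coordinate for you.
spark = sparknlp.start()

# Sanity check that the library loaded correctly.
print(sparknlp.version())
```

This avoids hand-writing the `spark.jars.packages` config and keeps the jar version in sync with the Python package.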
-
I am working with PySpark 3.3.0 and the latest sparknlp, 4.0.2.
When calling document_assembler or any other module under sparknlp, I get the following error: TypeError: 'JavaPackage' object is not callable
Here is my OS setting:
Java version:
Has anyone faced a similar situation before and has an idea how to solve it? Thanks in advance.
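For reference, a minimal snippet that triggers this error looks like the following (a sketch using the Spark NLP Python API; the exact pipeline around it doesn't matter):

```python
from sparknlp.base import DocumentAssembler

# If the spark-nlp jar is missing from the JVM classpath, Py4J resolves the
# underlying Java class to a bare JavaPackage object, and instantiating the
# annotator raises "TypeError: 'JavaPackage' object is not callable".
document_assembler = DocumentAssembler() \
    .setInputCol("text") \
    .setOutputCol("document")
```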