
Connecting and reading via Spark support #11

@mkegelCognism

Description


When trying to connect via Spark, I get this exception:

Exception in thread "main" java.sql.SQLFeatureNotSupportedException
	at jdbc.RedisStatement.setQueryTimeout(RedisStatement.java:91)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.getQueryOutputSchema(JDBCRDD.scala:67)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:58)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:242)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:37)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:350)
	at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:228)
	at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:210)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:210)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:171)
	at local.package.generate.Classname$.main(Sample.scala:57)

This is the code I am using; I assume the query parameter is not supported...
When I remove the line with the query parameter, I get another error saying that a "dbtable" or "query" parameter is mandatory.
When I try connecting without Spark, everything works as expected and I am able to retrieve values.

sparkSession
      .read
      .format("jdbc")
      .option("url", "jdbc:redis://<password>@<host>:<port>?ssl=true")
      .option("driver", "jdbc.RedisDriver")
      .option("query", "keys startsWith*")
      .load()
      .printSchema()
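
For reference, the stack trace points at Statement.setQueryTimeout rather than at the query option itself: Spark's JDBCRDD calls setQueryTimeout on the statement while resolving the schema, and RedisStatement throws SQLFeatureNotSupportedException instead of accepting the call. A minimal plain-JDBC sketch that should reproduce the same call sequence without Spark (placeholders for the real connection details, and assuming the driver accepts prepareStatement for this command syntax) would be something like:

import java.sql.DriverManager

object SetQueryTimeoutRepro {
  def main(args: Array[String]): Unit = {
    // Placeholders; same values as in the Spark "url" option above.
    val url = "jdbc:redis://<password>@<host>:<port>?ssl=true"
    Class.forName("jdbc.RedisDriver")

    val conn = DriverManager.getConnection(url)
    try {
      val stmt = conn.prepareStatement("keys startsWith*")
      // Spark makes this call while resolving the schema (see the trace above);
      // with this driver it throws SQLFeatureNotSupportedException.
      stmt.setQueryTimeout(0)
      val rs = stmt.executeQuery()
      while (rs.next()) println(rs.getString(1))
      rs.close()
    } finally {
      conn.close()
    }
  }
}

If that throws the same exception, the problem is the unsupported setQueryTimeout call rather than the query option, and a driver-side change that tolerates a zero timeout would presumably unblock the Spark path.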
