Posted to user@ignite.apache.org by eugene miretsky <eu...@gmail.com> on 2018/09/27 02:25:45 UTC

Ignite + Spark: json4s versions are incompatible

Hello,

Spark provides json4s 3.2.x, while Ignite ships a newer version. This
seems to cause an error when using some Spark SQL commands that call
json4s methods that no longer exist.

Adding Ignite to our existing Spark codebase seems to break things.

How do people work around this issue?

Stack trace:

[info] Caused by: java.lang.NoSuchMethodError: org.json4s.jackson.JsonMethods$.parse(Lorg/json4s/JsonInput;Z)Lorg/json4s/JsonAST$JValue;
[info]     at org.apache.spark.sql.types.DataType$.fromJson(DataType.scala:108)
[info]     at org.apache.spark.sql.types.StructType$$anonfun$6.apply(StructType.scala:414)
[info]     at org.apache.spark.sql.types.StructType$$anonfun$6.apply(StructType.scala:414)
[info]     at scala.util.Try$.apply(Try.scala:192)
[info]     at org.apache.spark.sql.types.StructType$.fromString(StructType.scala:414)
[info]     at org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport.init(ParquetWriteSupport.scala:80)
[info]     at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:341)
[info]     at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:302)
[info]     at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.<init>(ParquetOutputWriter.scala:37)
[info]     at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$$anon$1.newInstance(ParquetFileFormat.scala:159)
[info]     at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.newOutputWriter(FileFormatWriter.scala:303)
[info]     at org.apache.spark.sql.execution.datasources.FileFormatWriter$SingleDirectoryWriteTask.execute(FileFormatWriter.scala:312)
[info]     at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:256)
[info]     at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:254)
[info]     at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1371)
[info]     at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:259)
[info]     ... 8 more
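Before choosing a fix, it can help to confirm which json4s jar actually wins on the runtime classpath. Here is a small diagnostic sketch (not from the thread; the class name passed to it below is the one from the stack trace):

```scala
// Prints the jar (if any) that a class was loaded from on the current classpath.
// Classes loaded by the bootstrap loader (e.g. java.lang.String) have no
// code source, so they yield None.
object WhichJar {
  def whichJar(className: String): Option[String] =
    Option(Class.forName(className).getProtectionDomain.getCodeSource)
      .map(_.getLocation.toString)

  def main(args: Array[String]): Unit =
    // The json4s companion object from the NoSuchMethodError above.
    println(whichJar("org.json4s.jackson.JsonMethods$"))
}
```

Running this inside the Spark/Ignite application shows whether the 3.2.x jar or a newer one is being picked up.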

RE: Ignite + Spark: json4s versions are incompatible

Posted by Stanislav Lukyanov <st...@gmail.com>.
Hi,

AFAICS Ignite doesn’t even use json4s itself. I assume it’s only in the dependencies and binary distribution so that Spark works.
So, if Spark actually needs 3.2.x, you can try using that.
You can remove/replace Ignite’s json4s jar with the required one, or use your favorite build system’s dependency management
to override what gets picked up during the build.
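For what it’s worth, with sbt (the [info] prefixes in the stack trace suggest an sbt build) the override could be sketched as below. The exact json4s version and Scala suffix are assumptions; match them to the json4s version your Spark distribution was actually built against.

```scala
// build.sbt fragment — a sketch, not a verified configuration.
// 3.2.11 and the _2.11 suffix are assumptions; check your Spark release's POM.
dependencyOverrides += "org.json4s" % "json4s-jackson_2.11" % "3.2.11"
```

Maven users could do the equivalent with `<dependencyManagement>`, pinning the same coordinates.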

Please share with the community whether it works. Perhaps we need to adjust the Ignite build to use a different json4s version.

Thanks,
Stan

From: eugene miretsky
Sent: 27 September 2018 5:26
To: user@ignite.apache.org
Subject: Ignite + Spark: json4s versions are incompatible
