Posted to issues@spark.apache.org by "Manku Timma (JIRA)" <ji...@apache.org> on 2015/05/18 10:30:59 UTC

[jira] [Commented] (SPARK-4852) Hive query plan deserialization failure caused by shaded hive-exec jar file when generating golden answers

    [ https://issues.apache.org/jira/browse/SPARK-4852?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14547708#comment-14547708 ] 

Manku Timma commented on SPARK-4852:
------------------------------------

The following diff could solve the problem. The file is sql/hive/v0.13.1/src/main/scala/org/apache/spark/sql/hive/Shim13.scala; the change imports Kryo from the package it is relocated to inside the shaded hive-exec jar instead of the unshaded one:
{code}
   import java.io.{OutputStream, InputStream}
-  import com.esotericsoftware.kryo.Kryo
+  import org.apache.hive.com.esotericsoftware.kryo.Kryo
   import org.apache.spark.util.Utils._
   import org.apache.hadoop.hive.ql.exec.Utilities
   import org.apache.hadoop.hive.ql.exec.UDF
{code}
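
For context, here is a minimal, illustrative sketch of how the relocated class would be used to (de)serialize a plan object. The object and method names ({{PlanSerDeSketch}}, {{serializePlan}}, {{deserializePlan}}) are hypothetical, not the actual Shim13.scala members; the point is only that every Kryo class, including {{io.Input}} and {{io.Output}}, has to be referenced through the shaded {{org.apache.hive}} prefix so that Hive's bundled Kryo is used rather than Spark's Kryo 2.21.
{code}
// Illustrative sketch only: PlanSerDeSketch, serializePlan and deserializePlan
// are hypothetical names, not the real Shim13.scala members.
import java.io.{InputStream, OutputStream}

// Reference Kryo through the prefix that the shaded hive-exec jar relocates it
// to (assuming the whole kryo package tree, including io, is relocated there),
// so that Hive's bundled Kryo is picked up instead of Spark's unshaded Kryo 2.21.
import org.apache.hive.com.esotericsoftware.kryo.Kryo
import org.apache.hive.com.esotericsoftware.kryo.io.{Input, Output}

object PlanSerDeSketch {
  // Serialize an arbitrary plan object with the shaded Kryo.
  def serializePlan(plan: AnyRef, out: OutputStream): Unit = {
    val kryo = new Kryo()
    val output = new Output(out)
    try kryo.writeClassAndObject(output, plan) finally output.close()
  }

  // Deserialize a plan object and cast it to the expected type.
  def deserializePlan[T](in: InputStream): T = {
    val kryo = new Kryo()
    val input = new Input(in)
    try kryo.readClassAndObject(input).asInstanceOf[T] finally input.close()
  }
}
{code}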

> Hive query plan deserialization failure caused by shaded hive-exec jar file when generating golden answers
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-4852
>                 URL: https://issues.apache.org/jira/browse/SPARK-4852
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.0
>            Reporter: Cheng Lian
>            Priority: Minor
>
> When adding Hive 0.13.1 support for the Spark SQL Thrift server in PR [2685|https://github.com/apache/spark/pull/2685], the Kryo 2.22 used by the original hive-exec-0.13.1.jar was shaded by the Kryo 2.21 used by Spark SQL because of dependency hell. Unfortunately, Kryo 2.21 has a known bug that may cause Hive query plan deserialization failures. This bug was fixed in Kryo 2.22.
> Normally this issue doesn't affect Spark SQL, because we don't even generate Hive query plans. But when running Hive test suites like {{HiveCompatibilitySuite}}, golden answer files must be generated by Hive, which triggers this issue. A workaround is to replace {{hive-exec-0.13.1.jar}} under {{$HIVE_HOME/lib}} with Spark's {{hive-exec-0.13.1a.jar}} and {{kryo-2.21.jar}} (found under {{$SPARK_DEV_HOME/lib_managed/jars}}), and then add {{$HIVE_HOME/lib}} to {{$HADOOP_CLASSPATH}}.
> Upgrading to a newer Kryo version that is binary compatible with Kryo 2.22 (if one exists) may fix this issue.
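
To make the "shaded" wording in the quoted description concrete: shading relocates a dependency's packages inside the fat jar, which is why Kryo becomes reachable under the {{org.apache.hive}} prefix. hive-exec's own (Maven-based) build performs a relocation along these lines, which is not reproduced here; the sbt-assembly fragment below is only a hedged sketch of the same relocation idea.
{code}
// Sketch only: hive-exec 0.13.1 is actually built with Maven's shade plugin,
// not sbt-assembly; this build.sbt fragment merely illustrates the kind of
// package-relocation rule that yields the org.apache.hive.* prefix seen above.
// It assumes a ShadeRule-capable sbt-assembly plugin is enabled in plugins.sbt.
assemblyShadeRules in assembly := Seq(
  // Rewrite every class under com.esotericsoftware.kryo so that, inside the
  // assembled jar, it is addressed as org.apache.hive.com.esotericsoftware.kryo.*
  ShadeRule.rename(
    "com.esotericsoftware.kryo.**" -> "org.apache.hive.com.esotericsoftware.kryo.@1"
  ).inAll
)
{code}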



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
