Posted to dev@hive.apache.org by "liyunzhang_intel (JIRA)" <ji...@apache.org> on 2016/12/15 08:21:58 UTC

[jira] [Created] (HIVE-15432) java.lang.ClassCastException is thrown when setting "hive.input.format" as "org.apache.hadoop.hive.ql.io.CombineHiveInputFormat" in hive on spark

liyunzhang_intel created HIVE-15432:
---------------------------------------

             Summary: java.lang.ClassCastException is thrown when setting "hive.input.format" as "org.apache.hadoop.hive.ql.io.CombineHiveInputFormat" in hive on spark
                 Key: HIVE-15432
                 URL: https://issues.apache.org/jira/browse/HIVE-15432
             Project: Hive
          Issue Type: Bug
            Reporter: liyunzhang_intel


Set "hive.input.format" to "org.apache.hadoop.hive.ql.io.CombineHiveInputFormat" in itests/qtest/target/testconf/spark/standalone/hive-site.xml and run a qtest with the following command:
{code}
mvn test -Dtest=TestSparkCliDriver -Dtest.output.overwrite=true -Dqfile=union.q  >log.TestSparkCliDriver 2>&1
{code}
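
For reference, the hive-site.xml entry is a standard Hadoop-style property block (a sketch; only the property name and value above come from this report):
{code:xml}
<property>
  <name>hive.input.format</name>
  <value>org.apache.hadoop.hive.ql.io.CombineHiveInputFormat</value>
</property>
{code}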

The following exception appears in itests/qtest-spark/target/tmp/log/hive.log:
{code}
2016-12-14T23:43:17,819  INFO [stderr-redir-1] client.SparkClientImpl: java.lang.ClassCastException: Cannot cast org.apache.hadoop.hive.ql.io.CombineHiveInputFormat$CombineHiveInputSplit to org.apache.hadoop.mapred.InputSplitWithLocationInfo
2016-12-14T23:43:17,819  INFO [stderr-redir-1] client.SparkClientImpl:  at java.lang.Class.cast(Class.java:3094)
2016-12-14T23:43:17,819  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.rdd.HadoopRDD.getPreferredLocations(HadoopRDD.scala:318)
2016-12-14T23:43:17,819  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.rdd.RDD$$anonfun$preferredLocations$2.apply(RDD.scala:270)
2016-12-14T23:43:17,819  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.rdd.RDD$$anonfun$preferredLocations$2.apply(RDD.scala:270)
2016-12-14T23:43:17,819  INFO [stderr-redir-1] client.SparkClientImpl:  at scala.Option.getOrElse(Option.scala:121)
2016-12-14T23:43:17,819  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.rdd.RDD.preferredLocations(RDD.scala:269)
2016-12-14T23:43:17,819  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal(DAGScheduler.scala:1564)
2016-12-14T23:43:17,819  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal$2$$anonfun$apply$1.apply$mcVI$sp(DAGScheduler.scala:1575)
2016-12-14T23:43:17,819  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal$2$$anonfun$apply$1.apply(DAGScheduler.scala:1574)
2016-12-14T23:43:17,819  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal$2$$anonfun$apply$1.apply(DAGScheduler.scala:1574)
2016-12-14T23:43:17,819  INFO [stderr-redir-1] client.SparkClientImpl:  at scala.collection.immutable.List.foreach(List.scala:381)
2016-12-14T23:43:17,819  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal$2.apply(DAGScheduler.scala:1574)
2016-12-14T23:43:17,820  INFO [stderr-redir-1] client.SparkClientImpl:  at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal$2.apply(DAGScheduler.scala:1572)
2016-12-14T23:43:17,820  INFO [stderr-redir-1] client.SparkClientImpl:  at scala.collection.immutable.List.foreach(List.scala:381)

{code}
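
The trace shows Spark's HadoopRDD.getPreferredLocations casting each split to org.apache.hadoop.mapred.InputSplitWithLocationInfo; CombineHiveInputFormat$CombineHiveInputSplit does not implement that interface, so Class.cast throws. A minimal stand-alone Java sketch of the failing pattern (using hypothetical stand-in types, not the real Hadoop/Spark classes):
{code:java}
// Stand-ins for the Hadoop split types involved (hypothetical, for illustration).
interface InputSplit {}
interface InputSplitWithLocationInfo extends InputSplit {}

// Analogous to CombineHiveInputSplit: implements only the base interface.
class CombineSplit implements InputSplit {}

public class CastDemo {
    public static void main(String[] args) {
        InputSplit split = new CombineSplit();
        try {
            // Same pattern as HadoopRDD.getPreferredLocations: Class.cast on a
            // split that does not implement InputSplitWithLocationInfo.
            InputSplitWithLocationInfo.class.cast(split);
            System.out.println("cast succeeded");
        } catch (ClassCastException e) {
            // Matches the shape of the logged error:
            // "Cannot cast ... to ... InputSplitWithLocationInfo"
            System.out.println("ClassCastException: " + e.getMessage());
        }
    }
}
{code}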

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)