Posted to issues@spark.apache.org by "Paride Casulli (JIRA)" <ji...@apache.org> on 2017/03/30 14:22:41 UTC

[jira] [Created] (SPARK-20158) crash in Spark SQL insert into partitioned Hive tables

Paride Casulli created SPARK-20158:
--------------------------------------

             Summary: crash in Spark SQL insert into partitioned Hive tables
                 Key: SPARK-20158
                 URL: https://issues.apache.org/jira/browse/SPARK-20158
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.1.0, 2.0.0
         Environment: Hive 1.2.1, Parquet file table
            Reporter: Paride Casulli


Hi, I get the following exception while inserting data from a temp view into a partitioned Parquet table in Hive:

Job aborted due to stage failure: Task 3 in stage 177.0 failed 4 times, most recent failure: Lost task 3.3 in stage 177.0 (TID 3833, XXX.XXX.XXX.XXX, executor 1008): org.apache.spark.SparkException: Task failed while writing rows.
	at org.apache.spark.sql.hive.SparkHiveDynamicPartitionWriterContainer.writeToFile(hiveWriterContainers.scala:328)
	at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$3.apply(InsertIntoHiveTable.scala:159)
	at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$3.apply(InsertIntoHiveTable.scala:159)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:99)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ArrayIndexOutOfBoundsException
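For reference, here is a minimal Scala sketch of the kind of job that exercises this code path (the exact query is not included above, so the table and column names below are hypothetical; this sketch is not guaranteed to reproduce the crash):

import org.apache.spark.sql.SparkSession

object InsertRepro {
  def main(args: Array[String]): Unit = {
    // Hive support is required to go through SparkHiveDynamicPartitionWriterContainer
    val spark = SparkSession.builder()
      .appName("SPARK-20158-repro")
      .enableHiveSupport()
      .getOrCreate()

    // Source data registered as a temp view, as described in the report
    spark.range(0, 100)
      .selectExpr("id", "cast(id % 10 as string) as dt")
      .createOrReplaceTempView("src")

    // Dynamic-partition insert into a partitioned Parquet Hive table;
    // the stack trace above points at this write path
    spark.sql("SET hive.exec.dynamic.partition=true")
    spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
    spark.sql(
      """CREATE TABLE IF NOT EXISTS tgt (id BIGINT)
        |PARTITIONED BY (dt STRING)
        |STORED AS PARQUET""".stripMargin)
    spark.sql("INSERT INTO TABLE tgt PARTITION (dt) SELECT id, dt FROM src")
  }
}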


