Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/02/07 08:56:00 UTC

[jira] [Commented] (SPARK-23348) append data using saveAsTable should adjust the data types

    [ https://issues.apache.org/jira/browse/SPARK-23348?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16355160#comment-16355160 ] 

Apache Spark commented on SPARK-23348:
--------------------------------------

User 'cloud-fan' has created a pull request for this issue:
https://github.com/apache/spark/pull/20527

> append data using saveAsTable should adjust the data types
> ----------------------------------------------------------
>
>                 Key: SPARK-23348
>                 URL: https://issues.apache.org/jira/browse/SPARK-23348
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Wenchen Fan
>            Priority: Major
>
>  
> {code:scala}
> Seq(1 -> "a").toDF("i", "j").write.saveAsTable("t")
> Seq("c" -> 3).toDF("i", "j").write.mode("append").saveAsTable("t")
> {code}
>  
> The second write fails with a confusing error:
>  
> {code:java}
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 10.0 failed 1 times, most recent failure: Lost task 1.0 in stage 10.0 (TID 15, localhost, executor driver): java.lang.UnsupportedOperationException: Unimplemented type: IntegerType
>  at org.apache.spark.sql.execution.datasources.parquet.VectorizedColumnReader.readBinaryBatch(VectorizedColumnReader.java:473)
>  at org.apache.spark.sql.execution.datasources.parquet.VectorizedColumnReader.readBatch(VectorizedColumnReader.java:214)
>  at org.apache.spark.sql.execution.datasources.parquet.VectorizedParquetRecordReader.nextBatch(VectorizedParquetRecordReader.java:261)
> ...
> {code}
>  
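For context on the failure: the first write creates table t with schema (i: int, j: string), while the appended DataFrame has the types swapped (i: string, j: int). saveAsTable resolves append columns by name but, before this fix, did not adjust the data types, so the appended Parquet file appears to store column i as a string; reading the table back against the expected int schema then trips the vectorized reader error above. A minimal sketch of a workaround, assuming a spark-shell session where {{spark}} is the active SparkSession, is to cast the incoming DataFrame to the table's schema before appending:

{code:scala}
import org.apache.spark.sql.functions.col

// Table "t" is created with schema (i: int, j: string).
Seq(1 -> "a").toDF("i", "j").write.saveAsTable("t")

// The incoming DataFrame has the types swapped: (i: string, j: int).
val df = Seq("c" -> 3).toDF("i", "j")

// Workaround: look up the table's schema and cast each incoming
// column to the type the table expects before appending.
val tableSchema = spark.table("t").schema
val adjusted = df.select(tableSchema.map(f => col(f.name).cast(f.dataType)): _*)
adjusted.write.mode("append").saveAsTable("t")
{code}

Note that casting "c" to int yields null rather than corrupting the table files, which is the kind of type adjustment the issue title asks saveAsTable to perform automatically.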



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org