Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/06/14 10:05:00 UTC

[jira] [Commented] (SPARK-28032) DataFrame.saveAsTable() in AVRO format with Timestamps creates bad Hive tables

    [ https://issues.apache.org/jira/browse/SPARK-28032?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16863909#comment-16863909 ] 

Hyukjin Kwon commented on SPARK-28032:
--------------------------------------

It looks like the error message describes the limitation clearly, so what is the issue here? You can upgrade your Hive version to read the table.
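
For anyone who cannot upgrade Hive immediately, one possible workaround (a sketch only, not something suggested in this ticket) is to cast the timestamp column to a type that Hive 1.1's Avro SerDe understands before writing. A minimal sketch, assuming the reporter's DataFrame below, whose single column gets Spark's default name "value":

{code:java}
import org.apache.spark.sql.functions.col

// Sketch of a possible workaround: cast the timestamp column to a string
// before writing, so the written Avro schema carries no timestamp logical
// type that Hive 1.1 would reject. "value" is the default column name
// produced by the reporter's toDF() call.
val compatDf = df.withColumn("value", col("value").cast("string"))

compatDf.write.mode("overwrite").format("avro").saveAsTable("database.table_name")
{code}

The trade-off is that the column is no longer a native timestamp on the Hive side and would need to be parsed back into one when queried.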

> DataFrame.saveAsTable() in AVRO format with Timestamps creates bad Hive tables
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-28032
>                 URL: https://issues.apache.org/jira/browse/SPARK-28032
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.3
>         Environment: Spark 2.4.3
> Hive 1.1.0
>            Reporter: Mathew Wicks
>            Priority: Major
>
> I am not sure if it's my very old version of Hive (1.1.0), but when I use the following code, I end up with a table which Spark can read, but Hive cannot.
> That is to say, AVRO format tables written by Spark cannot be read in Hive if they contain timestamp columns.
> *Hive error:*
> {code:java}
> Error while compiling statement: FAILED: UnsupportedOperationException timestamp is not supported.
> {code}
> *Spark Code:*
> {code:java}
> import java.sql.Timestamp
> import spark.implicits._
>
> // Current wall-clock time as a java.sql.Timestamp
> val currentTime = new Timestamp(System.currentTimeMillis())
>
> // Single-column DataFrame (default column name "value") holding one timestamp
> val df = Seq(currentTime).toDF()
>
> // Write as an AVRO-backed Hive table; Hive 1.1 then fails to read it
> df.write.mode("overwrite").format("avro").saveAsTable("database.table_name")
> {code}
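
For reference, the asymmetry the reporter describes (readable from Spark, unreadable from Hive 1.1) can be confirmed from the Spark side with a quick read-back; a minimal sketch, assuming an active spark session and the table written above:

{code:java}
// Spark reads the AVRO-backed table back without error; only Hive 1.1
// fails with "UnsupportedOperationException timestamp is not supported".
val readBack = spark.read.table("database.table_name")
readBack.printSchema()  // root |-- value: timestamp
readBack.show(truncate = false)
{code}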


