Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2021/08/11 04:32:00 UTC
[jira] [Resolved] (SPARK-36459) Date value '0001-01-01' changes to '0001-12-30' when inserted into a parquet hive table
[ https://issues.apache.org/jira/browse/SPARK-36459?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-36459.
----------------------------------
Resolution: Incomplete
> Date Value '0001-01-01' changes to '0001-12-30' when inserted into a parquet hive table
> ---------------------------------------------------------------------------------------
>
> Key: SPARK-36459
> URL: https://issues.apache.org/jira/browse/SPARK-36459
> Project: Spark
> Issue Type: Bug
> Components: Spark Core, Spark Shell
> Affects Versions: 2.4.4
> Reporter: sindhura alluri
> Priority: Major
>
> Hi All,
> We are seeing this issue on Spark 2.4.4. Below are the steps to reproduce it.
> *Log in to the Hive terminal on the cluster and create the tables below.*
> create table t_src(dob timestamp);
> insert into t_src values('0001-01-01 00:00:00.0');
> create table t_tgt(dob timestamp) stored as parquet;
>
> *Spark-shell steps :*
>
> import org.apache.spark.sql.hive.HiveContext
> val sqlContext = new HiveContext(sc)
> val q0 = "TRUNCATE table t_tgt"
> val q1 = "SELECT alias.dob as a0 FROM t_src alias"
> val q2 = "INSERT INTO TABLE t_tgt SELECT tbl0.a0 as c0 FROM tbl0"
> sqlContext.sql(q0)
> sqlContext.sql(q1).select("a0").createOrReplaceTempView("tbl0")
> sqlContext.sql(q2)
>
> After this, check the contents of the target table t_tgt. You will see that the date "0001-01-01 00:00:00" has changed to "0001-12-30 00:00:00":
> select * from t_tgt;
> Is this a known issue? Is it fixed in any subsequent releases?
> Thanks & regards,
> Sindhura Alluri
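
For context (this is my own reading, not stated in the report): the symptom is consistent with the known hybrid-calendar vs. proleptic-Gregorian mismatch. Spark 2.x and java.sql.Timestamp label pre-1582 instants using Julian-calendar rules, while java.time-based readers label the same epoch value using the proleptic Gregorian calendar, and near year 1 the two labelings disagree by two days. A minimal JVM sketch of that disagreement (the class and method names are mine, for illustration only):

```java
import java.time.Instant;
import java.time.LocalDate;
import java.time.ZoneOffset;
import java.util.GregorianCalendar;
import java.util.TimeZone;

public class CalendarShift {
    // Take 0001-01-01 as labeled by the hybrid Julian/Gregorian calendar
    // (what java.util.GregorianCalendar and java.sql.Timestamp use; dates
    // before the 1582-10-15 cutover follow Julian rules), then relabel the
    // same instant with the proleptic Gregorian calendar used by java.time.
    static LocalDate hybridJan1AsProlepticGregorian() {
        GregorianCalendar hybrid = new GregorianCalendar(TimeZone.getTimeZone("UTC"));
        hybrid.clear();
        hybrid.set(1, 0, 1); // year 1, January 1 -- Julian rules apply here
        return Instant.ofEpochMilli(hybrid.getTimeInMillis())
                .atZone(ZoneOffset.UTC)
                .toLocalDate();
    }

    public static void main(String[] args) {
        // The same physical instant carries two different calendar labels:
        System.out.println(hybridJan1AsProlepticGregorian()); // prints 0000-12-30
    }
}
```

Note that legacy formatters such as SimpleDateFormat render the proleptic year 0 as "0001" (1 BC with the era marker dropped), which may be why the report shows 0001-12-30 rather than 0000-12-30. Spark 3.x added explicit rebase handling for this class of problem (e.g. the spark.sql.legacy.parquet.datetimeRebaseModeInWrite setting in 3.1+), though that is outside the scope of this 2.4.4 report.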
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org