Posted to issues@spark.apache.org by "huangtianhua (Jira)" <ji...@apache.org> on 2020/06/24 09:39:00 UTC

[jira] [Created] (SPARK-32088) test of pyspark.sql.functions.timestamp_seconds fails with a non-American timezone setting

huangtianhua created SPARK-32088:
------------------------------------

             Summary: test of pyspark.sql.functions.timestamp_seconds fails with a non-American timezone setting
                 Key: SPARK-32088
                 URL: https://issues.apache.org/jira/browse/SPARK-32088
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 3.1.0
            Reporter: huangtianhua


The Python test fails in the aarch64 job (see https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-python-arm/405/console) since commit https://github.com/apache/spark/commit/f0e6d0ec13d9cdadf341d1b976623345bcdb1028#diff-c8de34467c555857b92875bf78bf9d49 was merged:
**********************************************************************
File "/home/jenkins/workspace/spark-master-test-python-arm/python/pyspark/sql/functions.py", line 1435, in pyspark.sql.functions.timestamp_seconds
Failed example:
    time_df.select(timestamp_seconds(time_df.unix_time).alias('ts')).collect()
Expected:
    [Row(ts=datetime.datetime(2008, 12, 25, 7, 30))]
Got:
    [Row(ts=datetime.datetime(2008, 12, 25, 23, 30))]
**********************************************************************
   1 of   3 in pyspark.sql.functions.timestamp_seconds
***Test Failed*** 1 failures.
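
For reference, converting the doctest's input 1230219000 with the standard library shows why the two values differ: the instant is 2008-12-25 15:30 UTC, which reads as 07:30 at UTC-8 (the expected value above) and as 23:30 at UTC+8 (the value the ARM worker produced).

    # Standard-library check (not part of the test) of the instant 1230219000
    # and its wall-clock reading at a few fixed offsets.
    from datetime import datetime, timezone, timedelta

    ts = 1230219000
    print(datetime.fromtimestamp(ts, tz=timezone.utc))                   # 2008-12-25 15:30:00+00:00
    print(datetime.fromtimestamp(ts, tz=timezone(timedelta(hours=-8))))  # 2008-12-25 07:30:00-08:00 (America/Los_Angeles in December)
    print(datetime.fromtimestamp(ts, tz=timezone(timedelta(hours=8))))   # 2008-12-25 23:30:00+08:00 (matches the "Got" value above)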

This is not an arm64-related issue, though: I ran the test on an x86 instance with the timezone set to UTC and it failed there as well. The expected datetime appears to assume an America/** timezone, but it seems we do not set the timezone when running these timezone-sensitive Python tests.
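
For what it's worth, a minimal sketch of how the doctest could be pinned (assuming an active SparkSession named spark, as the existing doctests do): set spark.sql.session.timeZone around the example and compare the rendered string, which always uses the session timezone, rather than the collected datetime, which follows the driver's local timezone.

    >>> from pyspark.sql.functions import timestamp_seconds
    >>> spark.conf.set("spark.sql.session.timeZone", "UTC")   # pin the timezone for a stable result
    >>> time_df = spark.createDataFrame([(1230219000,)], ['unix_time'])
    >>> time_df.select(timestamp_seconds(time_df.unix_time).alias('ts')).show()
    +-------------------+
    |                 ts|
    +-------------------+
    |2008-12-25 15:30:00|
    +-------------------+
    >>> spark.conf.unset("spark.sql.session.timeZone")        # restore the session default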



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org