Posted to issues@spark.apache.org by "Brett Randall (JIRA)" <ji...@apache.org> on 2016/06/02 00:39:59 UTC

[jira] [Created] (SPARK-15723) SimpleDateParamSuite test is locale-fragile and relies on deprecated short TZ name

Brett Randall created SPARK-15723:
-------------------------------------

             Summary: SimpleDateParamSuite test is locale-fragile and relies on deprecated short TZ name
                 Key: SPARK-15723
                 URL: https://issues.apache.org/jira/browse/SPARK-15723
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.6.1
            Reporter: Brett Randall
            Priority: Minor


{{org.apache.spark.status.api.v1.SimpleDateParamSuite}} has this assertion:

{code}
    new SimpleDateParam("2015-02-20T17:21:17.190EST").timestamp should be (1424470877190L)
{code}

This test is fragile and fails when executed in an environment whose local default timezone causes {{EST}} to be interpreted as something other than US Eastern Standard Time.  If your local timezone is {{Australia/Sydney}}, then {{EST}} matches Australian Eastern Time (GMT+10, or GMT+11 under daylight saving, which applies on this date) and you will get:

{noformat}
date parsing *** FAILED ***
1424413277190 was not equal to 1424470877190 (SimpleDateParamSuite.scala:29)
{noformat}
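The gap between the two timestamps confirms the diagnosis. A quick sanity check (sketched in plain Java here, though the arithmetic is language-agnostic; the class name is illustrative):

```java
public class OffsetGap {
    public static void main(String[] args) {
        long expected = 1424470877190L; // "EST" read as US Eastern, UTC-5
        long actual   = 1424413277190L; // "EST" read via Australia/Sydney, UTC+11 (AEDT) on 20 Feb
        // 57,600,000 ms = 16 hours: exactly the spread between UTC-5 and UTC+11.
        long diffHours = (expected - actual) / (60L * 60 * 1000);
        System.out.println(diffHours); // prints 16
    }
}
```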

In short, {{SimpleDateFormat}} is sensitive to the local default {{TimeZone}} when interpreting short (three-letter) zone names.  According to the {{TimeZone}} javadoc, they ought not to be used:

{quote}
Three-letter time zone IDs

For compatibility with JDK 1.1.x, some other three-letter time zone IDs (such as "PST", "CTT", "AST") are also supported. However, their use is deprecated because the same abbreviation is often used for multiple time zones (for example, "CST" could be U.S. "Central Standard Time" and "China Standard Time"), and the Java platform can then only recognize one of them.
{quote}
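One way to make such an assertion deterministic, regardless of the JVM's default {{TimeZone}}, is to avoid short zone names entirely and use an explicit RFC 822 numeric offset (the {{Z}} pattern letter). This is a hedged sketch in plain Java, since {{SimpleDateFormat}} is a Java platform API; it is not necessarily the fix that will land in Spark:

```java
import java.text.SimpleDateFormat;
import java.util.Locale;

public class DeterministicParse {
    public static void main(String[] args) throws Exception {
        // "Z" parses an RFC 822 numeric offset such as "-0500", which is
        // unambiguous no matter what the JVM's default TimeZone is.
        SimpleDateFormat fmt =
            new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSZ", Locale.US);
        long ts = fmt.parse("2015-02-20T17:21:17.190-0500").getTime();
        System.out.println(ts); // prints 1424470877190 in any default timezone
    }
}
```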



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org