Posted to issues@spark.apache.org by "Stavros Kontopoulos (JIRA)" <ji...@apache.org> on 2015/10/09 12:05:26 UTC

[jira] [Comment Edited] (SPARK-11025) Exception key can't be empty at getSystemProperties function in utils

    [ https://issues.apache.org/jira/browse/SPARK-11025?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14950157#comment-14950157 ] 

Stavros Kontopoulos edited comment on SPARK-11025 at 10/9/15 10:04 AM:
-----------------------------------------------------------------------

Falling back to the previous implementation:
 System.getProperties.clone().asInstanceOf[java.util.Properties].toMap[String, String], which simply ignored the empty key. At the language level Java does not complain, so I think it is fine to ignore it, unless the general strategy is to catch everything that is wrong. In my view we should only validate the properties we actually use. I know a bare -D most likely only comes up as a mistake; I just wanted to put on the table what the strategy should be, and whether such minor mistakes should fail execution when the Spark config is created, etc.
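
For illustration, here is a minimal sketch (my own, not necessarily the committed change) of how getSystemProperties could skip an empty key instead of failing on it:

    import scala.collection.JavaConverters._

    // Sketch only: collect the system properties while skipping the "" key that a
    // bare "-D" JVM flag produces, so the per-key lookup never throws.
    def getSystemProperties: Map[String, String] = {
      System.getProperties.stringPropertyNames().asScala
        .filter(_.nonEmpty)                           // ignore the empty key instead of failing
        .map(key => (key, System.getProperty(key)))
        .toMap
    }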


was (Author: skonto):
Falling back to the previous implementation:
 System.getProperties.clone().asInstanceOf[java.util.Properties].toMap[String, String], which simply ignored the empty key. At the language level Java does not complain, so I think it is fine to ignore it, unless the general strategy is to catch everything that is wrong. In my view we should only validate the properties we actually use. I know a bare -D most likely only comes up as a mistake; I just wanted to put on the table what the strategy should be, and whether such minor mistakes should fail execution when the Spark context is created, etc.

> Exception key can't be empty at getSystemProperties function in utils 
> ----------------------------------------------------------------------
>
>                 Key: SPARK-11025
>                 URL: https://issues.apache.org/jira/browse/SPARK-11025
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.5.1
>            Reporter: Stavros Kontopoulos
>            Priority: Trivial
>              Labels: easyfix, easytest
>
> In core/src/main/scala/org/apache/spark/util/Utils.scala,
> the getSystemProperties function fails when someone passes a bare -D to the JVM, which results in a property whose key is "" (empty).
> Exception thrown: java.lang.IllegalArgumentException: key can't be empty
> Empty keys should either be ignored or passed through without filtering at that level, as in previous versions.
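
For reference, a small reproduction sketch (assumed for illustration, not taken from the ticket) of the failure mode described above:

    object EmptyKeyRepro {
      def main(args: Array[String]): Unit = {
        // Simulate what a bare "java -D ..." invocation does: it registers a
        // system property whose key is the empty string.
        System.getProperties.put("", "bogus")
        // Any per-key lookup of that property then fails, which is the exception
        // reported in this issue.
        try {
          System.getProperty("")
        } catch {
          case e: IllegalArgumentException =>
            println(s"Lookup failed: ${e.getMessage}")   // Lookup failed: key can't be empty
        }
      }
    }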



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org