Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2014/10/13 20:40:33 UTC

[jira] [Commented] (SPARK-1010) Update all unit tests to use SparkConf instead of system properties

    [ https://issues.apache.org/jira/browse/SPARK-1010?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14169724#comment-14169724 ] 

Sean Owen commented on SPARK-1010:
----------------------------------

Yes, there is still a lot of usage in tests, and a lot of it looks intentional.

{code}
find . -name "*Suite.scala" -type f -exec grep -E "System\.[gs]etProperty" {} \;
    ...
    """.format(System.getProperty("user.name", "<unknown>"),
    """.format(System.getProperty("user.name", "<unknown>")).stripMargin
    System.setProperty("spark.testing", "true")
    System.setProperty("spark.reducer.maxMbInFlight", "1")
    System.setProperty("spark.storage.memoryFraction", "0.0001")
    System.setProperty("spark.storage.memoryFraction", "0.01")
    System.setProperty("spark.authenticate", "false")
    System.setProperty("spark.authenticate", "false")
    System.setProperty("spark.shuffle.manager", "hash")
    System.setProperty("spark.scheduler.mode", "FIFO")
    System.setProperty("spark.scheduler.mode", "FAIR")
    ...
{code}
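
For comparison, the SparkConf-based equivalent of the setProperty pattern above would look roughly like the following sketch. The suite name, asserted values, and test body are illustrative, not taken from the actual test code; the point is that the settings are scoped to one SparkContext instead of leaking across suites through JVM-global system properties.

{code}
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.FunSuite

class ShuffleConfSuite extends FunSuite {
  test("configure shuffle via SparkConf, not system properties") {
    // These settings live on this SparkConf only, so they cannot leak
    // into other suites the way System.setProperty calls do.
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("ShuffleConfSuite")
      .set("spark.shuffle.manager", "hash")
      .set("spark.storage.memoryFraction", "0.01")
    val sc = new SparkContext(conf)
    try {
      assert(sc.getConf.get("spark.shuffle.manager") === "hash")
    } finally {
      sc.stop() // stop the context so the next suite starts clean
    }
  }
}
{code}

Tests that genuinely need to exercise the system-property fallback (e.g. the spark.testing usages above) would be the intentional exceptions.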


> Update all unit tests to use SparkConf instead of system properties
> -------------------------------------------------------------------
>
>                 Key: SPARK-1010
>                 URL: https://issues.apache.org/jira/browse/SPARK-1010
>             Project: Spark
>          Issue Type: New Feature
>    Affects Versions: 0.9.0
>            Reporter: Patrick Cogan
>            Assignee: Nirmal
>            Priority: Minor
>              Labels: starter
>
