Posted to reviews@spark.apache.org by sarutak <gi...@git.apache.org> on 2014/08/07 07:58:39 UTC

[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

GitHub user sarutak opened a pull request:

    https://github.com/apache/spark/pull/1825

    [SPARK-2894] spark-shell doesn't accept flags

    As @sryza reported, spark-shell doesn't accept any flags.
    The root cause is incorrect usage of spark-submit within spark-shell, which was surfaced by #1801.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/sarutak/spark SPARK-2894

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/1825.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #1825
    
----
commit 56df7aed29b7ebe36b7f95ea966633ac5bdcc9f1
Author: Kousuke Saruta <sa...@oss.nttdata.co.jp>
Date:   2014-08-07T05:52:41Z

    Modified spark-shell to execute spark-submit correctly

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r16027188
  
    --- Diff: bin/utils.sh ---
    @@ -0,0 +1,58 @@
    +#!/usr/bin/env bash
    +
    +#
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#    http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +#
    +
    +# Gather all all spark-submit options into SUBMISSION_OPTS
    +function gatherSparkSubmitOpts() {
    +
    +  if [ -z "$SUBMIT_USAGE_FUNCTION" ]; then
    +
    +    echo "Function for printing usage of $0 is not set." 1>&2
    +    echo "Please set usage function to shell variable 'SUBMIT_USAGE_FUNCTION' in $0" 1>&2
    +    exit 1
    +  fi
    +
    +  SUBMISSION_OPTS=()
    --- End diff --
    
    I still think it's inconsistent that we use `SUBMIT_*` somewhere, `SPARK_SUBMIT_*` elsewhere, and then `SUBMISSION_*` here. Not a huge deal, but the word "submission" actually has a different connotation that I don't think is intended.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r15975896
  
    --- Diff: bin/spark-shell ---
    @@ -46,14 +48,38 @@ function main(){
             # (see https://github.com/sbt/sbt/issues/562).
             stty -icanon min 1 -echo > /dev/null 2>&1
             export SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Djline.terminal=unix"
    -        $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main spark-shell "$@"
    +        $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SPARK_SUBMIT_ARGS[@]} spark-shell ${SPARK_SHELL_ARGS[@]}
             stty icanon echo > /dev/null 2>&1
         else
    +	extractLoadFileOpt "$@"
    --- End diff --
    
    We should move this call before the `if` statement to cover the Cygwin branch. And please use 4-space indentation here :)




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by sarutak <gi...@git.apache.org>.
Github user sarutak commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51685117
  
    @andrewor14 , @liancheng , the new commit includes improvements based on @liancheng 's great work. The main changes are as follows.
    
    1) Improved utils.sh to force scripts that use it to set a usage function
    2) Modified spark-shell and pyspark to accept arguments that include whitespace
    3) Modified java_gateway.py to enable pyspark to use application opts.
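
    Change (1) can be illustrated with a short standalone sketch (hypothetical function and script names; the real logic lives in `bin/utils.sh`): the parsing helper refuses to run until the calling script has pointed `SUBMIT_USAGE_FUNCTION` at its own usage printer.

    ```shell
    #!/usr/bin/env bash
    # Hypothetical stand-in for the guard in utils.sh: option parsing is
    # refused until the caller sets SUBMIT_USAGE_FUNCTION.
    function gatherOpts() {
      if [ -z "$SUBMIT_USAGE_FUNCTION" ]; then
        echo "Function for printing usage of $0 is not set." 1>&2
        return 1
      fi
      return 0
    }

    function usage() { echo "Usage: ./bin/example [options]"; }

    gatherOpts; FIRST=$?            # fails: usage function not yet set
    SUBMIT_USAGE_FUNCTION=usage
    gatherOpts; SECOND=$?           # succeeds: guard satisfied
    echo "first=$FIRST second=$SECOND"
    ```

    Running the sketch prints `first=1 second=0`: the first call is rejected, the second passes once the variable is set.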




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r16027150
  
    --- Diff: python/pyspark/java_gateway.py ---
    @@ -39,7 +39,11 @@ def launch_gateway():
             submit_args = os.environ.get("PYSPARK_SUBMIT_ARGS")
             submit_args = submit_args if submit_args is not None else ""
             submit_args = shlex.split(submit_args)
    -        command = [os.path.join(SPARK_HOME, script), "pyspark-shell"] + submit_args
    +        application_opts = os.environ.get("APPLICATION_OPTS")
    +        application_opts = application_opts if application_opts is not None else ""
    +        application_opts = shlex.split(application_opts)
    +        command = [os.path.join(SPARK_HOME, script)] + submit_args + \
    +            ["pyspark-shell"] + application_opts
    --- End diff --
    
    We actually do need the application arguments here. Note that this code path is only for the pyspark shell, and in `bin/pyspark` we only handle the application arguments for the python file case.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by rxin <gi...@git.apache.org>.
Github user rxin commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51434809
  
    Jenkins, add to whitelist.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51458395
  
    QA results for PR 1825:
    - This patch PASSES unit tests.
    - This patch merges cleanly
    - This patch adds the following public classes (experimental):
    $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SPARK_SUBMIT_ARGS[@]} spark-shell ${SPARK_SHELL_ARGS[@]}
    $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SPARK_SUBMIT_ARGS[@]} spark-shell ${SPARK_SHELL_ARGS[@]}
    
    For more information see test output:
    https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18118/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51435773
  
    Does the shell actually take flags? I didn't realize this when I OK'd #1801. If there are specific flags, we should trap them and pass them after `spark-shell`. For an example, see the code in the SQL shell:
    https://github.com/apache/spark/blob/master/bin/spark-sql#L66
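
    The approach described here — trap the shell-specific flags and emit them after the `spark-shell` argument — can be sketched as follows. This is a simplified illustration, not the final `bin/spark-shell` code; it assumes `-i <file>` is the only shell-specific flag and echoes the command it would run instead of executing it:

    ```shell
    #!/usr/bin/env bash
    # Simulated command line: one spark-submit flag, one shell-specific flag.
    set -- --master "local[2]" -i init.scala

    SUBMIT_ARGS=()
    SHELL_ARGS=()
    while (($#)); do
      case "$1" in
        -i)  SHELL_ARGS+=("$1" "$2"); shift 2 ;;  # trapped: goes after spark-shell
        *)   SUBMIT_ARGS+=("$1"); shift ;;        # forwarded to spark-submit
      esac
    done

    CMD="spark-submit --class org.apache.spark.repl.Main ${SUBMIT_ARGS[*]} spark-shell ${SHELL_ARGS[*]}"
    echo "$CMD"
    ```

    With the simulated arguments above, the echoed command places `--master local[2]` before `spark-shell` and `-i init.scala` after it.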




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51699589
  
    Okay cool - we can merge this once the tests pass.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51685115
  
    QA tests have started for PR 1825. This patch merges cleanly.
    View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18252/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r16027161
  
    --- Diff: bin/utils.sh ---
    @@ -0,0 +1,58 @@
    +#!/usr/bin/env bash
    +
    +#
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#    http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +#
    +
    +# Gather all all spark-submit options into SUBMISSION_OPTS
    +function gatherSparkSubmitOpts() {
    +
    +  if [ -z "$SUBMIT_USAGE_FUNCTION" ]; then
    +
    --- End diff --
    
    nit: remove random new lines here and other places?




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51636321
  
    QA tests have started for PR 1825. This patch merges cleanly.
    View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18211/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51699576
  
    LGTM, thanks @sarutak and @liancheng.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51550144
  
    @liancheng However, that means every time we want to change a spark-submit config, we need to change it in multiple places, which might make things harder to maintain. For spark-shell, aren't we only expecting one type of argument? I'm not super familiar with the functionality of passing a settings file to spark-shell, but I don't think this file path matches `--*`, so maybe we can filter this argument out that way. This might be simpler than enumerating all of spark-submit's flags.
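
    The filter suggested above — route anything matching `--*` to spark-submit and everything else to the application — might look like the sketch below (hypothetical variable names). One caveat, which is presumably why the merged utils.sh enumerates the flags that take values: a bare `--*` test would also misroute the *values* of flags like `--master`, so this example sticks to boolean flags:

    ```shell
    #!/usr/bin/env bash
    # Simulated command line: two boolean spark-submit flags plus a file path.
    set -- --verbose --supervise settings.scala

    SUBMISSION_OPTS=()
    APPLICATION_OPTS=()
    for arg in "$@"; do
      case "$arg" in
        --*) SUBMISSION_OPTS+=("$arg") ;;   # looks like a spark-submit flag
        *)   APPLICATION_OPTS+=("$arg") ;;  # anything else, e.g. a settings file
      esac
    done

    echo "submit: ${SUBMISSION_OPTS[*]}"
    echo "app: ${APPLICATION_OPTS[*]}"
    ```

    Here the two `--*` flags land in `SUBMISSION_OPTS` and the file path in `APPLICATION_OPTS`, without enumerating any flag names.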




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by sarutak <gi...@git.apache.org>.
Github user sarutak commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r15980352
  
    --- Diff: bin/spark-shell ---
    @@ -32,12 +32,28 @@ set -o posix
     FWDIR="$(cd `dirname $0`/..; pwd)"
     
     if [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then
    -  echo "Usage: ./bin/spark-shell [options]"
    -  $FWDIR/bin/spark-submit --help 2>&1 | grep -v Usage 1>&2
    -  exit 0
    +    echo "Usage: ./bin/spark-shell [options]"
    +    $FWDIR/bin/spark-submit --help 2>&1 | grep -v Usage 1>&2
    +    exit 0
     fi
     
    +CLI_ARGS=()
    +SUBMISSION_ARGS=()
    +
     function main(){
    +
    +    while (($#)); do
    +        case $1 in
    +	    -i)
    +	        CLI_ARGS+=($1); shift
    +                CLI_ARGS+=($1); shift
    --- End diff --
    
    Sorry, I included tabs.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r16027257
  
    --- Diff: python/pyspark/java_gateway.py ---
    @@ -39,7 +39,11 @@ def launch_gateway():
             submit_args = os.environ.get("PYSPARK_SUBMIT_ARGS")
             submit_args = submit_args if submit_args is not None else ""
             submit_args = shlex.split(submit_args)
    -        command = [os.path.join(SPARK_HOME, script), "pyspark-shell"] + submit_args
    +        application_opts = os.environ.get("APPLICATION_OPTS")
    +        application_opts = application_opts if application_opts is not None else ""
    +        application_opts = shlex.split(application_opts)
    +        command = [os.path.join(SPARK_HOME, script)] + submit_args + \
    +            ["pyspark-shell"] + application_opts
    --- End diff --
    
    Yes, @sarutak could you remove this? I didn't realize we already had an IPYTHON_OPTS for this purpose, and any application argument provided here will only go to `py4j.JavaGateway`, which is clearly not interested in our ipython arguments.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r16027306
  
    --- Diff: bin/utils.sh ---
    @@ -0,0 +1,58 @@
    +#!/usr/bin/env bash
    +
    +#
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#    http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +#
    +
    +# Gather all all spark-submit options into SUBMISSION_OPTS
    +function gatherSparkSubmitOpts() {
    +
    +  if [ -z "$SUBMIT_USAGE_FUNCTION" ]; then
    +
    +    echo "Function for printing usage of $0 is not set." 1>&2
    +    echo "Please set usage function to shell variable 'SUBMIT_USAGE_FUNCTION' in $0" 1>&2
    +    exit 1
    +  fi
    +
    +  SUBMISSION_OPTS=()
    +  APPLICATION_OPTS=()
    +  while (($#)); do
    +    case $1 in
    +      --master | --deploy-mode | --class | --name | --jars | --py-files | --files | \
    +      --conf | --properties-file | --driver-memory | --driver-java-options | \
    +      --driver-library-path | --driver-class-path | --executor-memory | --driver-cores | \
    +      --total-executor-cores | --executor-cores | --queue | --num-executors | --archives)
    +        if [[ $# -lt 2 ]]; then
    +          "$SUBMIT_USAGE_FUNCTION"
    +          exit 1;
    +        fi
    +        SUBMISSION_OPTS+=($1); shift
    +        SUBMISSION_OPTS+=($1); shift
    --- End diff --
    
    Actually double quotes should be sufficient here, since it preserves whitespaces by default. The only reason why we needed the special handling for pyspark is that we need to pass the string literal in as an environment variable, otherwise python won't know how to split our arguments. We don't need to do the same here because this is already split by bash.
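
    The quoting point can be demonstrated in isolation: appending `"$1"` to a bash array keeps an argument containing spaces as a single element, while the unquoted form word-splits it. A minimal sketch:

    ```shell
    #!/usr/bin/env bash
    ARG="my app"          # an argument value containing a space

    QUOTED=()
    UNQUOTED=()
    QUOTED+=("$ARG")      # double quotes: stays one array element
    UNQUOTED+=($ARG)      # unquoted: word-split into two elements

    echo "quoted=${#QUOTED[@]} unquoted=${#UNQUOTED[@]}"
    ```

    The quoted array ends up with one element and the unquoted one with two, which is why `SUBMISSION_OPTS+=("$1")` with double quotes is enough to preserve whitespace here, with no shlex-style handling needed.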




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51699408
  
    QA tests have started for PR 1825. This patch merges cleanly.
    View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18261/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51686152
  
    QA tests have started for PR 1825. This patch merges cleanly.
    View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18253/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51434427
  
    Can one of the admins verify this patch?




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51435031
  
    QA tests have started for PR 1825. This patch merges cleanly.
    View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18100/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51436043
  
    AFAIK spark-shell doesn't take in any application-specific arguments, at least not documented ones. @sryza What file are you referring to?




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r15975980
  
    --- Diff: bin/spark-shell ---
    @@ -46,14 +48,38 @@ function main(){
             # (see https://github.com/sbt/sbt/issues/562).
             stty -icanon min 1 -echo > /dev/null 2>&1
             export SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Djline.terminal=unix"
    -        $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main spark-shell "$@"
    +        $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SPARK_SUBMIT_ARGS[@]} spark-shell ${SPARK_SHELL_ARGS[@]}
             stty icanon echo > /dev/null 2>&1
         else
    +	extractLoadFileOpt "$@"
             export SPARK_SUBMIT_OPTS
    -        $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main spark-shell "$@"
    +        $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SPARK_SUBMIT_ARGS[@]} spark-shell ${SPARK_SHELL_ARGS[@]}
         fi
     }
     
    +function extractLoadFileOpt() {
    +  args="$@"
    +  inLoadFileFlag=0
    +
    +  for arg in $args;do
    +    if [[ "$inLoadFileFlag" -eq 0 ]];then
    +      if [ "$arg" != "-i" ]; then
    +        SPARK_SUBMIT_ARGS=(${SPARK_SUBMIT_ARGS[@]} "$arg")
    +      else
    +        SPARK_SHELL_ARGS=(${SPARK_SHELL_ARGS[@]} "$arg")
    +	inLoadFileFlag=1
    --- End diff --
    
    indentation is weird




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-52346903
  
    Doh, I forgot about that.  I tried a bunch of other Python configurations and in every case the behavior seems to match 1.0.2, which is great.  Thanks!




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r16026159
  
    --- Diff: python/pyspark/java_gateway.py ---
    @@ -39,7 +39,11 @@ def launch_gateway():
             submit_args = os.environ.get("PYSPARK_SUBMIT_ARGS")
             submit_args = submit_args if submit_args is not None else ""
             submit_args = shlex.split(submit_args)
    -        command = [os.path.join(SPARK_HOME, script), "pyspark-shell"] + submit_args
    +        application_opts = os.environ.get("APPLICATION_OPTS")
    +        application_opts = application_opts if application_opts is not None else ""
    +        application_opts = shlex.split(application_opts)
    +        command = [os.path.join(SPARK_HOME, script)] + submit_args + \
    +            ["pyspark-shell"] + application_opts
    --- End diff --
    
    I don't think `application_opts` is needed as explained [here](https://github.com/apache/spark/pull/1864#discussion_r16026126). But anyway we can leave it as is.
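For reference, the `shlex.split` pattern used in that diff behaves like this (a minimal standalone sketch; the environment-variable name comes from the diff, the sample argument string is illustrative):

```python
import os
import shlex

# Split an environment variable into argv-style tokens, honoring quotes,
# the same way java_gateway.py splits PYSPARK_SUBMIT_ARGS.
submit_args = shlex.split(os.environ.get("PYSPARK_SUBMIT_ARGS", ""))

# Quoted values containing spaces survive as single tokens:
tokens = shlex.split('--master local --name "awesome name"')
print(tokens)  # ['--master', 'local', '--name', 'awesome name']
```

This is why passing options through an env var plus `shlex.split` preserves quoted application names, where naive whitespace splitting would not.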




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51641967
  
    QA results for PR 1825:<br>- This patch FAILED unit tests.<br>- This patch merges cleanly<br>- This patch adds the following public classes (experimental):<br>$FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SUBMISSION_ARGS[*]} spark-shell ${CLIARGS[*]}<br>$FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SUBMISSION_ARGS[*]} spark-shell ${CLI_ARGS[*]}<br><br>For more information see test output:<br>https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18211/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51537623
  
    It would be good to verify whether pyspark is affected too (see java_gateway.py)




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by sryza <gi...@git.apache.org>.
Github user sryza commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51435337
  
    This will allow spark-shell to take spark-submit options, but will remove its ability to take spark-shell-specific options (currently there's only one, "file").  I'm unclear on the best way to support both of these.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by sarutak <gi...@git.apache.org>.
Github user sarutak commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r16027352
  
    --- Diff: bin/utils.sh ---
    @@ -0,0 +1,58 @@
    +#!/usr/bin/env bash
    +
    +#
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#    http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +#
    +
    +# Gather all all spark-submit options into SUBMISSION_OPTS
    +function gatherSparkSubmitOpts() {
    +
    +  if [ -z "$SUBMIT_USAGE_FUNCTION" ]; then
    +
    +    echo "Function for printing usage of $0 is not set." 1>&2
    +    echo "Please set usage function to shell variable 'SUBMIT_USAGE_FUNCTION' in $0" 1>&2
    +    exit 1
    +  fi
    +
    +  SUBMISSION_OPTS=()
    --- End diff --
    
    Yes, I also noticed the inconsistent naming, but the top priority is to fix the broken spark-shell, so I think the inconsistency and other rough edges should be addressed in a separate ticket.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-52346371
  
    You need to set them through `IPYTHON_OPTS`
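For example, notebook options are passed via the environment rather than as pyspark flags (command shape as commonly shown in the PySpark docs of this era; the exact notebook flags depend on your IPython version):

```shell
IPYTHON=1 IPYTHON_OPTS="notebook --pylab inline" ./bin/pyspark
```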




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51705721
  
    Tests passed so I'm going to merge it




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51691295
  
    Hey @sarutak, thanks for fixing this :) I did some tests locally, and the only issue I found is the quoted-string case I just commented on. Otherwise LGTM.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by sarutak <gi...@git.apache.org>.
Github user sarutak commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r15976793
  
    --- Diff: bin/spark-shell ---
    @@ -46,14 +48,38 @@ function main(){
             # (see https://github.com/sbt/sbt/issues/562).
             stty -icanon min 1 -echo > /dev/null 2>&1
             export SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Djline.terminal=unix"
    -        $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main spark-shell "$@"
    +        $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SPARK_SUBMIT_ARGS[@]} spark-shell ${SPARK_SHELL_ARGS[@]}
             stty icanon echo > /dev/null 2>&1
         else
    +	extractLoadFileOpt "$@"
             export SPARK_SUBMIT_OPTS
    -        $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main spark-shell "$@"
    +        $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SPARK_SUBMIT_ARGS[@]} spark-shell ${SPARK_SHELL_ARGS[@]}
         fi
     }
     
    +function extractLoadFileOpt() {
    +  args="$@"
    +  inLoadFileFlag=0
    +
    +  for arg in $args;do
    --- End diff --
    
    O.K. I'll check it.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by freeman-lab <gi...@git.apache.org>.
Github user freeman-lab commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51667756
  
    @JoshRosen @liancheng regarding the iPython options, these are three setting combos we use frequently (and likely anyone else using notebooks): ``notebook``, ``notebook --pylab inline``, and ``notebook --pylab inline --profile=nbserver``




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51691001
  
    QA tests have started for PR 1825. This patch merges cleanly. <br>View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18254/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by sryza <gi...@git.apache.org>.
Github user sryza commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51436115
  
    org.apache.spark.repl.SparkRunnerSettings




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51635793
  
    This failed a flaky test. Retest this please




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51673034
  
    Thanks for pointing this out @andrewor14; I opened #1864 to fix both `pyspark` and `spark-shell`, and added related code to handle the gateway.
    
    @JoshRosen @freeman-lab, I tested options like `notebook --master <url>` locally against #1864 and it seems to work fine :)




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51568034
  
    QA results for PR 1825:<br>- This patch PASSES unit tests.<br>- This patch merges cleanly<br>- This patch adds the following public classes (experimental):<br>$FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SUBMISSION_ARGS[*]} spark-shell ${CLIARGS[*]}<br>$FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SUBMISSION_ARGS[*]} spark-shell ${CLI_ARGS[*]}<br><br>For more information see test output:<br>https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18190/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51569996
  
    QA tests have started for PR 1825. This patch merges cleanly. <br>View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18197/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r15980176
  
    --- Diff: bin/spark-shell ---
    @@ -32,12 +32,28 @@ set -o posix
     FWDIR="$(cd `dirname $0`/..; pwd)"
     
     if [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then
    -  echo "Usage: ./bin/spark-shell [options]"
    -  $FWDIR/bin/spark-submit --help 2>&1 | grep -v Usage 1>&2
    -  exit 0
    +    echo "Usage: ./bin/spark-shell [options]"
    +    $FWDIR/bin/spark-submit --help 2>&1 | grep -v Usage 1>&2
    +    exit 0
     fi
     
    +CLI_ARGS=()
    +SUBMISSION_ARGS=()
    +
     function main(){
    +
    +    while (($#)); do
    +        case $1 in
    +	    -i)
    +	        CLI_ARGS+=($1); shift
    +                CLI_ARGS+=($1); shift
    --- End diff --
    
    indentation is off




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51556563
  
    @andrewor14 Ah OK, I only had a glance at `repl.Main` and `SparkILoop` before writing my last comment and thought `spark-shell` might accept all standard Scala REPL options by mistake.
    
    @sarutak Thanks for helping to fix this. I'd also second what @pwendell suggested, i.e. using a while/case structure similar to the ones in [`bin/spark-sql`](https://github.com/apache/spark/blob/master/bin/spark-sql#L66) and [`sbin/start-thriftserver.sh`](https://github.com/apache/spark/blob/9de6a42bb34ea8963225ce90f1a45adcfee38b58/sbin/start-thriftserver.sh#L61-L76) to filter out the `-i <init-file>` option, which can be easier to read. And we need to do the same thing for `pyspark`.
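The while/case filtering structure referred to here can be sketched roughly as follows (function and variable names are illustrative only, not the actual `bin/spark-shell` code; `-i` is spark-shell's load-file flag discussed in this thread):

```shell
#!/usr/bin/env bash
# Minimal sketch: split the command line into spark-submit options and
# shell-specific options using a while/case loop.
split_args() {
  SUBMIT_OPTS=()
  SHELL_OPTS=()
  while (($#)); do
    case "$1" in
      -i)                          # load-file flag consumes a value
        SHELL_OPTS+=("$1"); shift
        SHELL_OPTS+=("$1"); shift
        ;;
      *)                           # everything else belongs to spark-submit
        SUBMIT_OPTS+=("$1"); shift
        ;;
    esac
  done
}

split_args --master local -i init.scala --name demo
echo "submit: ${SUBMIT_OPTS[*]}"   # submit: --master local --name demo
echo "shell: ${SHELL_OPTS[*]}"     # shell: -i init.scala
```

Each branch consumes exactly the tokens it recognizes, so option/value pairs stay together regardless of where they appear on the command line.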




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51454271
  
    QA tests have started for PR 1825. This patch merges cleanly. <br>View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18118/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51538397
  
    There seems to be a similar PySpark issue, too (I just ran into it during testing):
    
    https://issues.apache.org/jira/browse/SPARK-2110




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51573180
  
    QA results for PR 1825:<br>- This patch FAILED unit tests.<br>- This patch merges cleanly<br>- This patch adds the following public classes (experimental):<br>$FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SUBMISSION_ARGS[*]} spark-shell ${CLIARGS[*]}<br>$FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SUBMISSION_ARGS[*]} spark-shell ${CLI_ARGS[*]}<br><br>For more information see test output:<br>https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18197/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51436429
  
    ah... looks like we need some special logic to filter that one out here




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51557678
  
    @JoshRosen I'm checking `pyspark`, but I'm not very familiar with this part. Would you mind helping to confirm whether there are any other Python/IPython/PySpark-specific command line options that need to be handled?




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r16026483
  
    --- Diff: bin/utils.sh ---
    @@ -0,0 +1,58 @@
    +#!/usr/bin/env bash
    +
    +#
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#    http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +#
    +
    +# Gather all all spark-submit options into SUBMISSION_OPTS
    +function gatherSparkSubmitOpts() {
    +
    +  if [ -z "$SUBMIT_USAGE_FUNCTION" ]; then
    +
    +    echo "Function for printing usage of $0 is not set." 1>&2
    +    echo "Please set usage function to shell variable 'SUBMIT_USAGE_FUNCTION' in $0" 1>&2
    +    exit 1
    +  fi
    +
    +  SUBMISSION_OPTS=()
    +  APPLICATION_OPTS=()
    +  while (($#)); do
    +    case $1 in
    +      --master | --deploy-mode | --class | --name | --jars | --py-files | --files | \
    +      --conf | --properties-file | --driver-memory | --driver-java-options | \
    +      --driver-library-path | --driver-class-path | --executor-memory | --driver-cores | \
    +      --total-executor-cores | --executor-cores | --queue | --num-executors | --archives)
    +        if [[ $# -lt 2 ]]; then
    +          "$SUBMIT_USAGE_FUNCTION"
    +          exit 1;
    +        fi
    +        SUBMISSION_OPTS+=($1); shift
    +        SUBMISSION_OPTS+=($1); shift
    --- End diff --
    
    The following case failed because this line doesn't handle quoted string with spaces properly:
    
    ```
    ./bin/pyspark app.py --master spark://lian-laptop.local:7077 --name "awesome name"
    ```
    
    A possible fix is to replace this line with the trick `bin/pyspark` uses:
    
    ```bash
    whitespace="[[:space:]]"
    i=$1
    if [[ $1 =~ \" ]]; then i=$(echo $1 | sed 's/\"/\\\"/g'); fi
    if [[ $1 =~ $whitespace ]]; then i=\"$1\"; fi
    SUBMISSION_OPTS+=($i); shift
    ```
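The effect of that whitespace check can be demonstrated in isolation (a standalone illustration with made-up function names, not the actual utils.sh code):

```shell
#!/usr/bin/env bash
# Why the re-quoting matters: appending an unquoted $1 to an array
# word-splits values that contain spaces.
whitespace="[[:space:]]"

append_unquoted() { OPTS+=($1); }                  # buggy: splits on whitespace
append_requoted() {
  local i=$1
  if [[ $1 =~ $whitespace ]]; then i=\"$1\"; fi    # wrap spaced values in quotes
  OPTS+=("$i")
}

OPTS=(); append_unquoted "awesome name"
echo "${#OPTS[@]}"   # 2 -- the single value became two array elements

OPTS=(); append_requoted "awesome name"
echo "${#OPTS[@]}"   # 1 -- preserved as one element
```

The first append loses the grouping of `"awesome name"`, which is exactly the failure seen with `--name "awesome name"` above.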




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by sarutak <gi...@git.apache.org>.
Github user sarutak commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r16027266
  
    --- Diff: python/pyspark/java_gateway.py ---
    @@ -39,7 +39,11 @@ def launch_gateway():
             submit_args = os.environ.get("PYSPARK_SUBMIT_ARGS")
             submit_args = submit_args if submit_args is not None else ""
             submit_args = shlex.split(submit_args)
    -        command = [os.path.join(SPARK_HOME, script), "pyspark-shell"] + submit_args
    +        application_opts = os.environ.get("APPLICATION_OPTS")
    +        application_opts = application_opts if application_opts is not None else ""
    +        application_opts = shlex.split(application_opts)
    +        command = [os.path.join(SPARK_HOME, script)] + submit_args + \
    +            ["pyspark-shell"] + application_opts
    --- End diff --
    
    I'm now updating the PR to include this.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by sarutak <gi...@git.apache.org>.
Github user sarutak commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r16026493
  
    --- Diff: bin/utils.sh ---
    @@ -0,0 +1,58 @@
    +#!/usr/bin/env bash
    +
    +#
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#    http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +#
    +
    +# Gather all all spark-submit options into SUBMISSION_OPTS
    +function gatherSparkSubmitOpts() {
    +
    +  if [ -z "$SUBMIT_USAGE_FUNCTION" ]; then
    +
    +    echo "Function for printing usage of $0 is not set." 1>&2
    +    echo "Please set usage function to shell variable 'SUBMIT_USAGE_FUNCTION' in $0" 1>&2
    +    exit 1
    +  fi
    +
    +  SUBMISSION_OPTS=()
    +  APPLICATION_OPTS=()
    +  while (($#)); do
    +    case $1 in
    +      --master | --deploy-mode | --class | --name | --jars | --py-files | --files | \
    +      --conf | --properties-file | --driver-memory | --driver-java-options | \
    +      --driver-library-path | --driver-class-path | --executor-memory | --driver-cores | \
    +      --total-executor-cores | --executor-cores | --queue | --num-executors | --archives)
    +        if [[ $# -lt 2 ]]; then
    +          "$SUBMIT_USAGE_FUNCTION"
    +          exit 1;
    +        fi
    +        SUBMISSION_OPTS+=($1); shift
    +        SUBMISSION_OPTS+=($1); shift
    --- End diff --
    
    Thanks for pointing that out. The newer PR fixes this.
    I think the main reason is that `$1` is not quoted in utils.sh.
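
    The quoting issue can be reduced to a standalone sketch (hypothetical, not the actual `bin/utils.sh`): with quoted `"$1"`, an option value containing spaces stays a single array element, whereas an unquoted `$1` would be word-split.

    ```shell
    #!/usr/bin/env bash
    # Standalone sketch of the fix: quoting "$1" keeps an argument with
    # spaces (e.g. an app name) as a single array element.
    set -- --name "My App" --verbose

    SUBMISSION_OPTS=()
    APPLICATION_OPTS=()
    while (($#)); do
      case "$1" in
        --name)                              # an option that takes a value
          SUBMISSION_OPTS+=("$1"); shift
          SUBMISSION_OPTS+=("$1"); shift     # unquoted $1 would split "My App"
          ;;
        *)
          APPLICATION_OPTS+=("$1"); shift
          ;;
      esac
    done

    echo "${#SUBMISSION_OPTS[@]} submission opts; ${#APPLICATION_OPTS[@]} application opts"
    ```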




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-52345676
  
    Does this work with Python?  I ask because
    
    ```
    IPYTHON=1 ./bin/pyspark --no-banner --no-confirm-exit
    ```
    doesn't seem to respect my ipython flags.  Am I using this right?




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51638397
  
    @liancheng The only slightly complicated thing is the pyspark shell is launched in python, not bash, so we need to separate out the configs there as well, which adds a third source of all the spark-submit options. This may be fine as a temporary solution, seeing that the alternative of enumerating all of ipython's options is less manageable.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51558310
  
    @liancheng `ipython` has a ton of command line options that someone might hypothetically want to use (just take a look at `ipython --help`).  Maybe try things out with a few of these options?  I'm not sure which pyspark options users rely on.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51565466
  
    QA tests have started for PR 1825. This patch merges cleanly. <br>View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18190/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51636522
  
    @JoshRosen Thanks, then I think gathering all `spark-submit` options rather than options of `spark-shell`/`pyspark` or any other potential applications can be more scalable, since we have control over these options. I opened [a WIP branch](https://github.com/liancheng/spark/compare/apache:master...liancheng:spark-2894) for this, and it seems that we can fix both `pyspark` (Python, IPython and the deprecated `pyspark app.py` style) and `spark-shell` easily.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r15970742
  
    --- Diff: bin/spark-shell ---
    @@ -46,14 +48,38 @@ function main(){
             # (see https://github.com/sbt/sbt/issues/562).
             stty -icanon min 1 -echo > /dev/null 2>&1
             export SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Djline.terminal=unix"
    -        $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main spark-shell "$@"
    +        $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SPARK_SUBMIT_ARGS[@]} spark-shell ${SPARK_SHELL_ARGS[@]}
             stty icanon echo > /dev/null 2>&1
         else
    +	extractLoadFileOpt "$@"
             export SPARK_SUBMIT_OPTS
    -        $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main spark-shell "$@"
    +        $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main ${SPARK_SUBMIT_ARGS[@]} spark-shell ${SPARK_SHELL_ARGS[@]}
         fi
     }
     
    +function extractLoadFileOpt() {
    +  args="$@"
    +  inLoadFileFlag=0
    +
    +  for arg in $args;do
    --- End diff --
    
    For consistency, could you use the same approach that we use in the `spark-sql` script?




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r16027155
  
    --- Diff: bin/utils.sh ---
    @@ -0,0 +1,58 @@
    +#!/usr/bin/env bash
    +
    +#
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#    http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +#
    +
    +# Gather all all spark-submit options into SUBMISSION_OPTS
    +function gatherSparkSubmitOpts() {
    +
    +  if [ -z "$SUBMIT_USAGE_FUNCTION" ]; then
    +
    +    echo "Function for printing usage of $0 is not set." 1>&2
    +    echo "Please set usage function to shell variable 'SUBMIT_USAGE_FUNCTION' in $0" 1>&2
    +    exit 1
    +  fi
    +
    +  SUBMISSION_OPTS=()
    +  APPLICATION_OPTS=()
    +  while (($#)); do
    +    case "$1" in
    +      --master | --deploy-mode | --class | --name | --jars | --py-files | --files | \
    --- End diff --
    
    We should add a note here that says something like `All changes here must be reflected in SparkSubmitArguments.scala`




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r16027217
  
    --- Diff: python/pyspark/java_gateway.py ---
    @@ -39,7 +39,11 @@ def launch_gateway():
             submit_args = os.environ.get("PYSPARK_SUBMIT_ARGS")
             submit_args = submit_args if submit_args is not None else ""
             submit_args = shlex.split(submit_args)
    -        command = [os.path.join(SPARK_HOME, script), "pyspark-shell"] + submit_args
    +        application_opts = os.environ.get("APPLICATION_OPTS")
    --- End diff --
    
    Also, arguments passed to applications aren't restricted to `--*` options, so I think it makes sense to rename this to something ending in ARGS. That would also make the variable naming here more consistent (right now it's not clear that `PYSPARK_SUBMIT_ARGS` and `APPLICATION_OPTS` are actually analogous).




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r16026515
  
    --- Diff: bin/utils.sh ---
    @@ -0,0 +1,58 @@
    +#!/usr/bin/env bash
    +
    +#
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#    http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +#
    +
    +# Gather all all spark-submit options into SUBMISSION_OPTS
    +function gatherSparkSubmitOpts() {
    +
    +  if [ -z "$SUBMIT_USAGE_FUNCTION" ]; then
    +
    +    echo "Function for printing usage of $0 is not set." 1>&2
    +    echo "Please set usage function to shell variable 'SUBMIT_USAGE_FUNCTION' in $0" 1>&2
    +    exit 1
    +  fi
    +
    +  SUBMISSION_OPTS=()
    +  APPLICATION_OPTS=()
    +  while (($#)); do
    +    case $1 in
    +      --master | --deploy-mode | --class | --name | --jars | --py-files | --files | \
    +      --conf | --properties-file | --driver-memory | --driver-java-options | \
    +      --driver-library-path | --driver-class-path | --executor-memory | --driver-cores | \
    +      --total-executor-cores | --executor-cores | --queue | --num-executors | --archives)
    +        if [[ $# -lt 2 ]]; then
    +          "$SUBMIT_USAGE_FUNCTION"
    +          exit 1;
    +        fi
    +        SUBMISSION_OPTS+=($1); shift
    +        SUBMISSION_OPTS+=($1); shift
    --- End diff --
    
    Hmm, I'm afraid adding double quotes alone doesn't help; the case I mentioned above still fails, and the `pyspark` trick can be helpful here. And we only need to deal with the argument part of those options that take an argument.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/1825




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51695237
  
    I have tested this locally and was able to get both spark-shell and pyspark working as before. I haven't tried the settings file for spark-shell, however. Pending a few comments, this LGTM, and we should get it in quickly.




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1825#discussion_r16027159
  
    --- Diff: bin/utils.sh ---
    @@ -0,0 +1,58 @@
    +#!/usr/bin/env bash
    +
    +#
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#    http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +#
    +
    +# Gather all all spark-submit options into SUBMISSION_OPTS
    +function gatherSparkSubmitOpts() {
    +
    +  if [ -z "$SUBMIT_USAGE_FUNCTION" ]; then
    +
    +    echo "Function for printing usage of $0 is not set." 1>&2
    +    echo "Please set usage function to shell variable 'SUBMIT_USAGE_FUNCTION' in $0" 1>&2
    +    exit 1
    +  fi
    +
    +  SUBMISSION_OPTS=()
    +  APPLICATION_OPTS=()
    +  while (($#)); do
    +    case "$1" in
    +      --master | --deploy-mode | --class | --name | --jars | --py-files | --files | \
    --- End diff --
    
    and add a similar comment in SparkSubmitArguments.scala too. It would be bad if these two go out of sync




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51695588
  
    QA tests have started for PR 1825. This patch merges cleanly. <br>View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18257/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51437506
  
    QA results for PR 1825:<br>- This patch FAILED unit tests.<br>- This patch merges cleanly<br>- This patch adds no public classes<br><br>For more information see test output:<br>https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18100/consoleFull




[GitHub] spark pull request: [SPARK-2894] spark-shell doesn't accept flags

Posted by liancheng <gi...@git.apache.org>.
Github user liancheng commented on the pull request:

    https://github.com/apache/spark/pull/1825#issuecomment-51547892
  
    Maybe instead of filtering out all `spark-shell` (and `pyspark`) options, we can do the opposite of what I did in `bin/spark-sql`: filter out all `spark-submit` options and pass all the others to the main class. At least we have control over the `spark-submit` option list.
    
    I'll try to write a utility script to filter out `spark-submit` options today.

