Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2017/05/01 22:25:04 UTC

[jira] [Commented] (SPARK-20546) spark-class gets syntax error in posix mode

    [ https://issues.apache.org/jira/browse/SPARK-20546?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15991678#comment-15991678 ] 

Sean Owen commented on SPARK-20546:
-----------------------------------

Are there downsides to turning off posix mode?
It seems reasonable if it makes this case work and doesn't affect other behavior.
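
If the worry is side effects elsewhere in the script, one option is to disable posix mode only when it is actually on and restore it afterwards. The following is a self-contained sketch of that idea in plain bash, not Spark code; the restore_posix flag and the echo loop are just for illustration:

{code}
#!/usr/bin/env bash
# Sketch: disable posix mode only if it is enabled, remember that we did,
# and restore it once the process substitution has been parsed and run.
if [[ ":$SHELLOPTS:" == *":posix:"* ]]; then
  restore_posix=1
  set +o posix
fi

while IFS= read -r line; do
  echo "read: $line"
done < <(printf 'a\nb\n')

if [[ -n "${restore_posix:-}" ]]; then
  set -o posix                      # put back the caller's setting
fi
{code}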

> spark-class gets syntax error in posix mode
> -------------------------------------------
>
>                 Key: SPARK-20546
>                 URL: https://issues.apache.org/jira/browse/SPARK-20546
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 2.0.2
>            Reporter: Jessie Yu
>            Priority: Minor
>
> spark-class gets the following error when running in posix mode:
> {code}
> spark-class: line 78: syntax error near unexpected token `<'
> spark-class: line 78: `done < <(build_command "$@")'
> {code}
>
> It appears to be complaining about the process substitution: 
> {code}
> CMD=()
> while IFS= read -d '' -r ARG; do
>   CMD+=("$ARG")
> done < <(build_command "$@")
> {code}
>
> This can be reproduced by first turning on allexport and then posix mode:
> {code}
> set -a -o posix
> {code}
> and then running something like spark-shell, which calls spark-class.
>
> The simplest fix is probably to always turn off posix mode in spark-class before the while loop (a sketch follows below).
>
> This was previously reported in [SPARK-8417|https://issues.apache.org/jira/browse/SPARK-8417], which was closed as "Cannot Reproduce".
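
For reference, here is a minimal sketch of the fix described in the report, assuming bash and the loop quoted above; it is a sketch of the idea, not necessarily the change that was ultimately committed:

{code}
# Sketch: turn off posix mode so that the <(...) process substitution
# below parses, then build the command array exactly as the existing
# loop in spark-class does.
set +o posix

CMD=()
while IFS= read -d '' -r ARG; do
  CMD+=("$ARG")
done < <(build_command "$@")
{code}

Since bash reads and executes a script one command at a time, the set +o posix takes effect before the while loop is parsed, which is why placing it immediately before the loop is enough.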


