Posted to dev@mahout.apache.org by "Andrew Palumbo (JIRA)" <ji...@apache.org> on 2015/03/06 01:50:38 UTC
[jira] [Comment Edited] (MAHOUT-1643) CLI arguments are not being processed in spark-shell
[ https://issues.apache.org/jira/browse/MAHOUT-1643?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14349655#comment-14349655 ]
Andrew Palumbo edited comment on MAHOUT-1643 at 3/6/15 12:50 AM:
-----------------------------------------------------------------
Yeah - talking about the shell. Do we want to process CLI args for the Spark configuration here, i.e.:
{code}
$bin/mahout spark-shell -D:k=n
{code}
or should I just close this and we'll just go off of MAHOUT_OPTS?
was (Author: andrew_palumbo):
Yeah - talking about the shell. Do we want to process CLI args here, i.e.:
{code}
$bin/mahout spark-shell -D:k=n
{code}
or should I just close this and we'll just go off of MAHOUT_OPTS?
> CLI arguments are not being processed in spark-shell
> ----------------------------------------------------
>
> Key: MAHOUT-1643
> URL: https://issues.apache.org/jira/browse/MAHOUT-1643
> Project: Mahout
> Issue Type: Bug
> Components: CLI, spark
> Affects Versions: 1.0
> Environment: spark spark-shell
> Reporter: Andrew Palumbo
> Labels: DSL, scala, spark, spark-shell
> Fix For: 1.0
>
>
> The CLI arguments are not being processed in spark-shell. Most importantly the spark options are not being passed to the spark configuration via:
> {code}
> $ mahout spark-shell -D:k=n
> {code}
> The arguments are preserved through {code}$ bin/mahout{code}. There should be a relatively easy fix, either by using the MahoutOptionParser, Scopt, or simply by parsing the args array.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
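The last route proposed in the issue above - "simply parsing the args array" - could be sketched roughly as follows. This is a hypothetical illustration, not Mahout's actual code: the object and method names are invented, and only the `-D:k=n` syntax shown in the issue is taken from the source.

```scala
// Hypothetical sketch of parsing -D:key=value pairs out of the args array,
// so they could later be applied to the Spark configuration.
// DOptionParser and parseDOptions are illustrative names, not Mahout APIs.
object DOptionParser {
  // Matches the -D:k=n syntax used in the spark-shell invocation above.
  private val DPattern = "-D:([^=]+)=(.*)".r

  def parseDOptions(args: Array[String]): Map[String, String] =
    args.collect { case DPattern(key, value) => key -> value }.toMap
}

// Usage: each parsed pair would then be handed to Spark, e.g. conf.set(key, value).
println(DOptionParser.parseDOptions(Array("-D:spark.executor.memory=4g", "--other-flag")))
```

Non-matching arguments are simply ignored here; a real fix would presumably route them on to the existing option handling instead.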
Re: [jira] [Comment Edited] (MAHOUT-1643) CLI arguments are not being processed in spark-shell
Posted by Dmitriy Lyubimov <dl...@gmail.com>.
Note that with MAHOUT_OPTS you have a choice: you can either set it up in the environment, or use inline syntax like
MAHOUT_OPTS='-Dk=n' bin/mahout spark-shell
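The two routes described above can be sketched as a short shell session. The `bin/mahout` invocations are commented out (the binary's path is assumed); the `echo` lines just demonstrate the scoping difference between the two forms.

```shell
# Option 1: export MAHOUT_OPTS once, for the whole session
export MAHOUT_OPTS='-Dk=n'
# bin/mahout spark-shell               # would pick up the exported value

# Option 2: set it inline, scoped to a single invocation
# MAHOUT_OPTS='-Dm=x' bin/mahout spark-shell

# The inline form only affects the one command it prefixes:
MAHOUT_OPTS='-Dm=x' sh -c 'echo "inline: $MAHOUT_OPTS"'
echo "session: $MAHOUT_OPTS"
```

The inline form is standard POSIX behavior: a `NAME=value` prefix applies only to the environment of that single command, leaving the shell's own value untouched.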
On Thu, Mar 5, 2015 at 4:50 PM, Andrew Palumbo (JIRA) <ji...@apache.org> wrote: