Posted to dev@spark.apache.org by Ron Gonzalez <zl...@yahoo.com.INVALID> on 2014/08/25 01:12:40 UTC

Problems running examples in IDEA

Hi,
   After getting the code base to compile, I tried running some of the
Scala examples.
   They all fail because classes like SparkConf can't be found.
   If I change the .iml files to convert the dependency scope from
PROVIDED to COMPILE, I am able to run them. That's simple to do from the
root directory of the Spark code base:

  find . -name "*.iml" | xargs sed -i.bak 's/PROVIDED/COMPILE/g'

   Is this expected? I'd really rather not modify the .iml files, since
they were generated from the Maven pom.xml files, so if you have some
tips on doing this better, that would be great...

Thanks,
Ron



Re: Problems running examples in IDEA

Posted by Ron Gonzalez <zl...@yahoo.com.INVALID>.
Oh, OK. So running locally from the code base is left to each
developer's own setup then, right? I am indeed changing the code to set
the master to local[*], but I still get NoClassDefFoundError.

If that's the case, then I think I'm OK...

Thanks,
Ron


On 08/24/2014 04:21 PM, Sean Owen wrote:
> The examples aren't runnable quite like this. They're intended to be
> submitted to a cluster with spark-submit, which among other things
> provides Spark at runtime.
>
> I think you might get them to run this way if you set the master to
> "local[*]" and made a run profile that also included Spark on the
> classpath.
>
> You should never modify the .iml files anyway; change the Maven
> pom.xml files if you need to modify a dependency scope.
>
> On Mon, Aug 25, 2014 at 12:12 AM, Ron Gonzalez
> <zl...@yahoo.com.invalid> wrote:
>> [original message quoted above; snipped]




Re: Problems running examples in IDEA

Posted by Sean Owen <so...@cloudera.com>.
The examples aren't runnable quite like this. They're intended to be
submitted to a cluster with spark-submit, which among other things
provides Spark at runtime.
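
For example, a submission might look something like this (a sketch
only; the examples jar name and path depend on how you built Spark):

  ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
    --master local[4] \
    lib/spark-examples*.jar 100

(The bin/run-example script in the distribution wraps this up for the
bundled examples.)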

I think you might get them to run this way if you set the master to
"local[*]" and made a run profile that also included Spark on the
classpath.
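
A minimal sketch of that approach against the 1.x API (the object name
and the trivial job here are just illustrative):

  import org.apache.spark.{SparkConf, SparkContext}

  object IdeExample {
    def main(args: Array[String]): Unit = {
      // Hard-code a local master so the app runs inside the IDE without
      // spark-submit; local[*] uses one worker thread per core.
      val conf = new SparkConf().setAppName("IdeExample").setMaster("local[*]")
      val sc = new SparkContext(conf)
      // A trivial job to confirm the runtime classpath is complete.
      println(sc.parallelize(1 to 100).count())
      sc.stop()
    }
  }

With Spark's scope left as "provided", this still needs a run
configuration that puts the Spark jars on the classpath, as above.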

You should never modify the .iml files anyway; change the Maven
pom.xml files if you need to modify a dependency scope.
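
If you do go that route, the change is just the scope element on the
Spark dependency, something like this (the artifact and version shown
are illustrative):

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.0.2</version>
    <!-- "provided" keeps Spark off the runtime classpath; "compile"
         includes it, which is what IDE runs need -->
    <scope>compile</scope>
  </dependency>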

On Mon, Aug 25, 2014 at 12:12 AM, Ron Gonzalez
<zl...@yahoo.com.invalid> wrote:
> [original message quoted above; snipped]

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org