Posted to user@mahout.apache.org by hlqv <hl...@gmail.com> on 2014/12/21 18:28:18 UTC

No configuration setting found for key 'akka.event-handlers' when running spark-itemsimilarity

Hi everyone!

In Mahout 1.0-SNAPSHOT, I followed this introduction:
https://mahout.apache.org/users/recommender/intro-cooccurrence-spark.html

and ran the command below:

mahout spark-itemsimilarity --input hdfs://localhost:9000/test/logs.csv
--output hdfs://localhost:9000/test/output/ --master spark://localhost:7077
--filter1 purchase --filter2 view --itemIDColumn 2 --rowIDColumn 0
--filterColumn 1

then I got the error:

Exception in thread "main" com.typesafe.config.ConfigException$Missing: No
configuration setting found for key 'akka.event-handlers'
at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:115)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:136)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:142)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:150)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:155)
at com.typesafe.config.impl.SimpleConfig.getList(SimpleConfig.java:203)
at com.typesafe.config.impl.SimpleConfig.getHomogeneousUnwrappedList(SimpleConfig.java:260)
at com.typesafe.config.impl.SimpleConfig.getStringList(SimpleConfig.java:318)
at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:150)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:153)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
at org.apache.mahout.sparkbindings.package$.mahoutSparkContext(package.scala:95)
at org.apache.mahout.drivers.MahoutSparkDriver.start(MahoutSparkDriver.scala:81)
at org.apache.mahout.drivers.ItemSimilarityDriver$.start(ItemSimilarityDriver.scala:128)
at org.apache.mahout.drivers.ItemSimilarityDriver$.process(ItemSimilarityDriver.scala:211)
at org.apache.mahout.drivers.ItemSimilarityDriver$$anonfun$main$1.apply(ItemSimilarityDriver.scala:116)
at org.apache.mahout.drivers.ItemSimilarityDriver$$anonfun$main$1.apply(ItemSimilarityDriver.scala:114)
at scala.Option.map(Option.scala:145)
at org.apache.mahout.drivers.ItemSimilarityDriver$.main(ItemSimilarityDriver.scala:114)
at org.apache.mahout.drivers.ItemSimilarityDriver.main(ItemSimilarityDriver.scala)

I can't see any mistake in my configuration: I installed Hadoop from CDH4
and the CDH4 build of Spark, and put a sample file into HDFS.
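For reference, the column options in the command above (--rowIDColumn 0, --filterColumn 1, --itemIDColumn 2) assume a CSV laid out like the hypothetical sample below. The awk line is only an illustration of which columns --filter1 purchase would select; it is not part of the Mahout job.

```shell
# Hypothetical logs.csv matching --rowIDColumn 0 --filterColumn 1 --itemIDColumn 2:
# column 0 = user ID, column 1 = action (the filter value), column 2 = item ID.
cat > logs.csv <<'EOF'
u1,purchase,iphone
u1,view,ipad
u2,purchase,nexus
u2,view,iphone
EOF

# Illustration only: the (user,item) pairs that --filter1 purchase selects.
awk -F, '$2 == "purchase" { print $1 "," $3 }' logs.csv
# prints: u1,iphone
#         u2,nexus
```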

Thanks for your help

Re: No configuration setting found for key 'akka.event-handlers' when running spark-itemsimilarity

Posted by hlqv <hl...@gmail.com>.
@Pat Ferrel: thanks for your help.

My environment:
 Spark 1.1.0-cdh4 (built for Hadoop 2.0.0-mr1-cdh4.2.0)
 Hadoop 2.0.0-mr1-cdh4.1.2
 Mahout 1.0-SNAPSHOT built with defaults (mvn -DskipTests clean install)

But I still get these errors:

Initial job has not accepted any resources; check your cluster UI to ensure
that workers are registered and have sufficient memory
Job aborted due to stage failure: All masters are unresponsive! Giving up

I'm checking everything carefully.

On 23 December 2014 at 00:32, Pat Ferrel <pa...@occamsmachete.com> wrote:

> For local mode the job is running with the bits it was linked to. Make
> sure these are the same bits running on your cluster. Mahout links to Spark
> 1.1.0, is that what you are running on the cluster?
>
> On Dec 21, 2014, at 9:06 PM, hlqv <hl...@gmail.com> wrote:
>
> @Pat Ferrel: Thank you so much. I restarted Hadoop and now it works fine
> for local mode (--master local[n]).
> I ran an example JavaWordCount with --master spark://localhost:7077 and got
> the correct result. But when I ran spark-itemsimilarity again with --master
> spark://localhost:7077 then I got the error in the log:
> ERROR SparkDeploySchedulerBackend: Application has been killed. Reason: All
> masters are unresponsive! Giving up
> It seems related to this WARN log:
> Initial job has not accepted any resources; check your cluster UI to ensure
> that workers are registered and have sufficient memory
>
> I have checked the Spark UI; everything looks correct, with two workers
> registered and 4 GB of RAM available.
>
> On 22 December 2014 at 06:11, Pat Ferrel <pa...@occamsmachete.com> wrote:
>
> > Looks like it's refusing to initialize your spark context. Did you start
> > the master spark://localhost:7077 before running mahout? Can you see a UI
> > at localhost:8080?
> >
> > If no to any of those, try --master local[n], which will run the job
> > locally rather than on a single-machine cluster and will run much faster.
> > Replace n with the number of cores you want to allocate, like local[7].
> >
> > You should use the Spark start scripts when running a single-machine
> > cluster.
> >
> > If none of this helps, are you able to run any of the Spark examples on
> > spark://localhost:7077?
> >
> >

Re: No configuration setting found for key 'akka.event-handlers' when running spark-itemsimilarity

Posted by Pat Ferrel <pa...@occamsmachete.com>.
In local mode the job runs with the bits it was linked against. Make sure those same bits are running on your cluster. Mahout links to Spark 1.1.0; is that what you are running on the cluster?
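A concrete sketch of that check follows. The jar name below is a hypothetical example of what a CDH4 Spark assembly is called, not something read from this cluster; on a real host the version would come from `spark-submit --version` or the assembly jar on the workers.

```shell
# Mahout 1.0-SNAPSHOT links against Spark 1.1.0; the cluster should run the same.
built_against="1.1.0"

# Hypothetical cluster-side assembly jar name; substitute the real one.
cluster_jar="spark-assembly-1.1.0-hadoop2.0.0-mr1-cdh4.2.0.jar"
cluster_version=$(echo "$cluster_jar" | sed -E 's/^spark-assembly-([0-9.]+)-.*$/\1/')

if [ "$cluster_version" = "$built_against" ]; then
  echo "versions match: $cluster_version"
else
  echo "MISMATCH: Mahout built against $built_against, cluster runs $cluster_version"
fi
```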

On Dec 21, 2014, at 9:06 PM, hlqv <hl...@gmail.com> wrote:

@Pat Ferrel: Thank you so much. I restarted Hadoop and now it works fine
for local mode (--master local[n]).
I ran an example JavaWordCount with --master spark://localhost:7077 and got
the correct result. But when I ran spark-itemsimilarity again with --master
spark://localhost:7077 then I got the error in the log:
ERROR SparkDeploySchedulerBackend: Application has been killed. Reason: All
masters are unresponsive! Giving up
It seems related to this WARN log:
Initial job has not accepted any resources; check your cluster UI to ensure
that workers are registered and have sufficient memory

I have checked the Spark UI; everything looks correct, with two workers
registered and 4 GB of RAM available.

On 22 December 2014 at 06:11, Pat Ferrel <pa...@occamsmachete.com> wrote:

> Looks like it's refusing to initialize your spark context. Did you start
> the master spark://localhost:7077 before running mahout? Can you see a UI
> at localhost:8080?
> 
> If no to any of those, try --master local[n], which will run the job
> locally rather than on a single-machine cluster and will run much faster.
> Replace n with the number of cores you want to allocate, like local[7].
> 
> You should use the Spark start scripts when running a single-machine cluster.
> 
> If none of this helps, are you able to run any of the Spark examples on
> spark://localhost:7077?
> 
> 


Re: No configuration setting found for key 'akka.event-handlers' when running spark-itemsimilarity

Posted by hlqv <hl...@gmail.com>.
@Pat Ferrel: Thank you so much. I restarted Hadoop and now it works fine
for local mode (--master local[n]).
I ran an example JavaWordCount with --master spark://localhost:7077 and got
the correct result. But when I ran spark-itemsimilarity again with --master
spark://localhost:7077 then I got the error in the log:
ERROR SparkDeploySchedulerBackend: Application has been killed. Reason: All
masters are unresponsive! Giving up
It seems related to this WARN log:
Initial job has not accepted any resources; check your cluster UI to ensure
that workers are registered and have sufficient memory

I have checked the Spark UI; everything looks correct, with two workers
registered and 4 GB of RAM available.
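One hedged thing to try against that "has not accepted any resources" warning: explicitly cap what the job asks for, so the request is sure to fit on the registered workers. spark.executor.memory and spark.cores.max are standard Spark 1.x properties; the values, and passing them as JVM system properties through MAHOUT_OPTS, are assumptions to adapt to your setup.

```shell
# Illustrative values sized for two workers sharing 4 GB; not measured
# recommendations. MAHOUT_OPTS as the delivery mechanism is an assumption.
export MAHOUT_OPTS="-Dspark.executor.memory=1g -Dspark.cores.max=2"
echo "$MAHOUT_OPTS"
```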

On 22 December 2014 at 06:11, Pat Ferrel <pa...@occamsmachete.com> wrote:

> Looks like it's refusing to initialize your spark context. Did you start
> the master spark://localhost:7077 before running mahout? Can you see a UI
> at localhost:8080?
>
> If no to any of those, try --master local[n], which will run the job
> locally rather than on a single-machine cluster and will run much faster.
> Replace n with the number of cores you want to allocate, like local[7].
>
> You should use the Spark start scripts when running a single-machine cluster.
>
> If none of this helps, are you able to run any of the Spark examples on
> spark://localhost:7077?
>
>

Re: No configuration setting found for key 'akka.event-handlers' when running spark-itemsimilarity

Posted by Pat Ferrel <pa...@occamsmachete.com>.
Looks like it's refusing to initialize your Spark context. Did you start the master spark://localhost:7077 before running mahout? Can you see a UI at localhost:8080?

If no to any of those, try --master local[n], which will run the job locally rather than on a single-machine cluster and will run much faster. Replace n with the number of cores you want to allocate, like local[7].

You should use the Spark start scripts when running a single-machine cluster.

If none of this helps, are you able to run any of the Spark examples on spark://localhost:7077?

