Posted to user@spark.apache.org by Srikrishna S <sr...@gmail.com> on 2014/07/11 21:01:54 UTC
Job getting killed
I am trying to run Logistic Regression on the url dataset (from libsvm), using exactly the same code as the example, on a 5-node YARN cluster. I get a pretty cryptic error that says only "Killed" and nothing more.
Settings:
--master yarn-client
--verbose
--driver-memory 24G
--executor-memory 24G
--executor-cores 8
--num-executors 5
I also set spark.akka.frameSize to 200 MB.
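Put together, the job is launched roughly like this (a sketch only; the application class, jar name, and dataset path are placeholders, not from the original post):

spark-submit \
  --master yarn-client \
  --verbose \
  --driver-memory 24G \
  --executor-memory 24G \
  --executor-cores 8 \
  --num-executors 5 \
  --conf spark.akka.frameSize=200 \
  --class LogisticRegressionSGD \
  lr-sgd-example.jar hdfs:///path/to/url_dataset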
Script:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.classification.LogisticRegressionWithSGD
import org.apache.spark.mllib.util.MLUtils

// Enclosing object added so the snippet compiles as posted.
object LogisticRegressionSGD {
  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setMaster("yarn-client")
      .setAppName("Logistic regression SGD fixed")
      .set("spark.akka.frameSize", "200")
    val sc = new SparkContext(conf)

    // Load and parse the data
    val dataset = args(0)
    val maxIterations = 100
    val start_time = System.nanoTime()
    val data = MLUtils.loadLibSVMFile(sc, dataset)

    // Build the model
    val solver = new LogisticRegressionWithSGD()
    solver.optimizer.setNumIterations(maxIterations)
    solver.optimizer.setRegParam(0.01)
    val model = solver.run(data)

    // Measure the accuracy on the training data. Don't include this in the timing.
    val predictionsAndLabels = data.map { point =>
      val prediction = model.predict(point.features)
      (prediction, point.label)
    }
    val accuracy = predictionsAndLabels.filter(r => r._1 == r._2).count.toDouble / data.count
    val elapsed_time = (System.nanoTime() - start_time) / 1e9

    // Use the last known accuracy
    println(dataset + ",spark-sgd," + maxIterations + "," + elapsed_time + "," + accuracy)
    System.exit(0)
  }
}
Re: Job getting killed
Posted by akhandeshi <am...@gmail.com>.
Were you able to resolve this issue? I am seeing a similar problem! It seems
to be connected to using OFF_HEAP persist.
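For context, OFF_HEAP persist means caching an RDD outside the JVM heap. A minimal sketch, with "rdd" standing in for the actual dataset being cached:

import org.apache.spark.storage.StorageLevel

// Persist the RDD off the JVM heap (backed by Tachyon in Spark 1.x).
// "rdd" is a placeholder for whatever dataset is being cached.
rdd.persist(StorageLevel.OFF_HEAP)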
Thanks,
Ami