Posted to user@spark.apache.org by Sarath Chandra <sa...@algofusiontech.com> on 2014/09/22 08:28:01 UTC

Worker state is 'killed'

Hi All,

I'm executing a simple Spark job which reads a file from HDFS, processes
the lines, and saves the processed lines back to HDFS. All three stages
complete correctly and I can see the processed file on HDFS.

But in the Spark UI, the worker state is shown as "killed", and I'm not
finding any exceptions in the logs.

What could be going wrong?

...
val newLines = lines.flatMap(line => process(line))
newLines.saveAsTextFile(hdfsPath)
...
def process(line: String): Array[String] = {
  ...
  Array(str1, str2)
}
...
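
For context, here is a minimal self-contained sketch of a job with this
shape. The paths, the app name, and the body of process() are placeholders
(the real ones are elided above). One thing worth checking: in some Spark
versions, a driver that exits without calling sc.stop() can be reported in
the UI as KILLED even though the job itself succeeded, so the explicit stop
at the end is the part to compare against your code.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ProcessLines {
  // Placeholder processing step; the real process() body is not shown
  // in the post. This one just splits each line into two halves.
  def process(line: String): Array[String] = {
    val (str1, str2) = line.splitAt(line.length / 2)
    Array(str1, str2)
  }

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ProcessLines")
    val sc = new SparkContext(conf)

    // Hypothetical HDFS paths for illustration only.
    val lines = sc.textFile("hdfs:///input/data.txt")
    val newLines = lines.flatMap(line => process(line))
    newLines.saveAsTextFile("hdfs:///output/processed")

    // Stopping the context lets the application shut down cleanly and
    // register as FINISHED rather than KILLED with the master.
    sc.stop()
  }
}
```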

~Sarath.