Posted to issues@spark.apache.org by "Umesh Kacha (JIRA)" <ji...@apache.org> on 2015/04/18 23:40:58 UTC
[jira] [Updated] (SPARK-6995) Spark driver program throws exception can't get Master Kerberos principal for use as renewer
[ https://issues.apache.org/jira/browse/SPARK-6995?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Umesh Kacha updated SPARK-6995:
-------------------------------
Description:
I have thousands of files residing in HDFS that I would like to read, process, and write back to HDFS using Spark. I have the following Java Spark driver program:
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

SparkConf sparkConf = new SparkConf().setAppName("MyApp").setMaster("spark://myhost1.com");
JavaSparkContext sc = new JavaSparkContext(sparkConf);

// Parse each tab-separated line of the input files into a POJO.
JavaRDD<MySchemaPOJO> schemaRDD = sc.textFile("hdfs://path/to/file").map(
    new Function<String, MySchemaPOJO>() {
        public MySchemaPOJO call(String line) throws Exception {
            String[] fields = line.split("\\t");
            MySchemaPOJO schemaPOJO = new MySchemaPOJO();
            // set required fields on POJO from fields
            return schemaPOJO;
        }
    });

// collect() and saveAsTextFile() are actions; nothing touches HDFS before this point.
System.out.println("POJO list are " + schemaRDD.collect());
schemaRDD.saveAsTextFile("hdfs://path/to/save/file");
sc.stop();
I started the Master/Worker on myhost1.com and am running this driver program from IntelliJ on another host, myhost2.com. When I start the driver program, everything works fine until schemaRDD.collect(), which throws the following exception:
Exception in thread "main" java.io.IOException: Can't get Master Kerberos principal for use as renewer
I ran kinit on both hosts, the one running the driver and the one running the Master/Worker. My driver program can connect to the Master; the exception is thrown only when an actual action is invoked on the RDD, which is the first point at which Spark reads from HDFS. Please guide me; I am new to Spark.
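For reference, this error text appears to come from Hadoop's delegation-token code (TokenCache), which looks up a renewer principal, typically yarn.resourcemanager.principal, in the Hadoop configuration visible to the driver. Below is a minimal sketch of one way to check and set that configuration from the driver program; the principal value yarn/_HOST@EXAMPLE.COM is a placeholder assumption, not something taken from this report:

import org.apache.hadoop.conf.Configuration;

// Minimal sketch (an assumption, not a confirmed fix): make sure the driver
// sees the cluster's Kerberos settings. sc is the JavaSparkContext from above.
Configuration hadoopConf = sc.hadoopConfiguration();
hadoopConf.set("hadoop.security.authentication", "kerberos");
// Placeholder principal; the real value lives in the cluster's yarn-site.xml.
hadoopConf.set("yarn.resourcemanager.principal", "yarn/_HOST@EXAMPLE.COM");
System.out.println("renewer principal = " + hadoopConf.get("yarn.resourcemanager.principal"));

Normally this configuration is picked up automatically when HADOOP_CONF_DIR on the driver host points at the cluster's core-site.xml, hdfs-site.xml, and yarn-site.xml, which may be what is missing when running from IntelliJ on myhost2.com.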
> Spark driver program throws exception can't get Master Kerberos principal for use as renewer
> ---------------------------------------------------------------------------------------------
>
> Key: SPARK-6995
> URL: https://issues.apache.org/jira/browse/SPARK-6995
> Project: Spark
> Issue Type: Question
> Components: Java API
> Affects Versions: 1.1.0
> Environment: Linux, IntelliJ
> Reporter: Umesh Kacha
>
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org