Posted to issues@spark.apache.org by "Bolke de Bruin (JIRA)" <ji...@apache.org> on 2015/06/25 09:50:04 UTC

[jira] [Created] (SPARK-8623) Some queries in spark-sql lead to NullPointerException when using Yarn

Bolke de Bruin created SPARK-8623:
-------------------------------------

             Summary: Some queries in spark-sql lead to NullPointerException when using Yarn
                 Key: SPARK-8623
                 URL: https://issues.apache.org/jira/browse/SPARK-8623
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.5.0
         Environment: Hadoop 2.6, Kerberos
            Reporter: Bolke de Bruin


The following query was executed using "spark-sql --master yarn-client" on 1.5.0-SNAPSHOT:

select * from wcs.geolite_city limit 10;

This led to the following error:

15/06/25 09:38:37 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, lxhnl008.ad.ing.net): java.lang.NullPointerException
	at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:693)
	at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:442)
	at org.apache.hadoop.mapreduce.Job.<init>(Job.java:131)
	at org.apache.spark.sql.sources.SqlNewHadoopRDD.getJob(SqlNewHadoopRDD.scala:83)
	at org.apache.spark.sql.sources.SqlNewHadoopRDD.getConf(SqlNewHadoopRDD.scala:89)
	at org.apache.spark.sql.sources.SqlNewHadoopRDD$$anon$1.<init>(SqlNewHadoopRDD.scala:127)
	at org.apache.spark.sql.sources.SqlNewHadoopRDD.compute(SqlNewHadoopRDD.scala:124)
	at org.apache.spark.sql.sources.SqlNewHadoopRDD.compute(SqlNewHadoopRDD.scala:66)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
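For reference, the top frame (Configuration.java:693) is inside the Hadoop Configuration copy constructor, which dereferences the configuration it is handed. The following standalone sketch is an assumption about the failure mode, not a confirmed root cause: it produces the same NullPointerException signature by passing a null Configuration to JobConf, which is what would happen if the executor-side (e.g. broadcast or transient) configuration arrived as null when SqlNewHadoopRDD.getJob builds its Job.

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.mapred.JobConf

    object NullConfSketch {
      def main(args: Array[String]): Unit = {
        // Stand-in for an executor-side configuration that arrived as null,
        // e.g. a broadcast/transient field left unset after deserialization.
        val conf: Configuration = null
        // The Configuration copy constructor dereferences `conf`, so this line
        // throws java.lang.NullPointerException from Configuration.<init>,
        // matching the Configuration/JobConf frames of the trace above.
        val jobConf = new JobConf(conf)
        println(jobConf.size())
      }
    }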

The exception does not occur in every case, i.e. some queries execute fine, and it is unclear why.

When running just "spark-sql" (without Yarn) the query executes fine as well, so the issue seems to lie in the interaction with Yarn.
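One way to narrow this down (a suggestion, not something tried in the report) is to issue the same query through a HiveContext in a small driver program submitted with --master yarn-client, so the spark-sql shell itself is taken out of the picture. The table name is the one from the query above; everything else is a plain Spark 1.5 Hive-enabled setup.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object Spark8623Check {
      def main(args: Array[String]): Unit = {
        // Submit with: spark-submit --master yarn-client --class Spark8623Check <jar>
        val sc = new SparkContext(new SparkConf().setAppName("SPARK-8623 check"))
        val hive = new HiveContext(sc)
        // Same query as in the report; if this also fails with the
        // NullPointerException, the problem is in the executor-side Hadoop
        // configuration rather than in the spark-sql shell.
        hive.sql("select * from wcs.geolite_city limit 10").show()
        sc.stop()
      }
    }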



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org