Posted to user@spark.apache.org by Bijuna <bi...@gmail.com> on 2016/02/16 21:03:18 UTC

Spark null pointer exception and task failure

Spark Experts,

We are using Spark 1.6 (Java API) in standalone mode on Windows. Our program reads data from text files using JavaSparkContext and from SQL Server using SQLContext.

We use the DataFrame API to filter, join, and aggregate the data read from these sources. The collectAsList operation on the DataFrame fails intermittently with a NullPointerException and a task failure. I have attached two screenshots of the errors we see.
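For context, a minimal sketch of the pipeline described above, written against the Spark 1.6 Java API, might look like the following. All paths, connection options, table names, and column names here are placeholders of my own, not the actual values from the program, and running it requires the Spark 1.6 and SQL Server JDBC jars on the classpath:

```java
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.functions;

public class PipelineSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("PipelineSketch")
                .setMaster("local[*]");          // standalone local driver
        JavaSparkContext sc = new JavaSparkContext(conf);
        SQLContext sqlContext = new SQLContext(sc);

        // Text input via JavaSparkContext (path is a placeholder);
        // in the real program this RDD would be parsed into a DataFrame
        // and joined with the JDBC table below.
        JavaRDD<String> lines = sc.textFile("C:/data/input.txt");

        // SQL Server input via the JDBC data source (all options are placeholders)
        DataFrame orders = sqlContext.read()
                .format("jdbc")
                .option("url", "jdbc:sqlserver://dbhost:1433;databaseName=sales")
                .option("dbtable", "dbo.orders")
                .option("user", "spark")
                .option("password", "secret")
                .load();

        // Filter, aggregate, then collect to the driver
        DataFrame totals = orders
                .filter(orders.col("amount").isNotNull())
                .groupBy(orders.col("region"))
                .agg(functions.sum(orders.col("amount")));

        List<Row> rows = totals.collectAsList();
        System.out.println("Collected " + rows.size() + " rows");

        sc.stop();
    }
}
```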

Any help in debugging and fixing this error will be much appreciated.

Thank you,
Bijuna


Sent from my iPad


Re: Spark null pointer exception and task failure

Posted by bi...@gmail.com.
The program does not fail consistently. We are executing the Spark driver as a standalone Java program. If the cause were a null value in a field, I would expect the program to fail on every execution. Instead, it fails with the NullPointerException and task failure only on certain runs, even though the input stays the same and we restart the program.

Thanks
Bijuna


Sent from my iPhone
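For reference, the null check suggested in the reply quoted below could be written with the Spark 1.6 DataFrame API roughly as follows. This is only a sketch; `df` and the column name `field` are placeholder identifiers of my own, and it needs the Spark jars to compile:

```java
import java.util.List;

import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;

public class NullCheckSketch {
    // "field" is a placeholder column name, not the program's real identifier.
    static List<Row> collectNonNull(DataFrame df) {
        // Keep only rows where the suspect column is non-null before collecting.
        DataFrame cleaned = df.filter(df.col("field").isNotNull());
        // Alternatively, drop rows with a null in any column: df.na().drop()
        return cleaned.collectAsList();
    }
}
```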

> On Feb 17, 2016, at 3:18 AM, Sudhanshu Janghel <su...@cloudwick.com> wrote:
> 
> I think the value in the data frame is null for some field. Why not insert a check for it, e.g. keep only the rows where the field is not null before converting to a DataFrame.
> 
> Kind Regards,
> Sudhanshu
> 
>> On 16 Feb 2016, at 8:03 pm, Bijuna <bi...@gmail.com> wrote:

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org