Posted to issues@spark.apache.org by "Sujith (JIRA)" <ji...@apache.org> on 2018/09/25 06:36:00 UTC
[jira] [Commented] (SPARK-25521) Job id showing null when Job is finished.
[ https://issues.apache.org/jira/browse/SPARK-25521?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16626849#comment-16626849 ]
Sujith commented on SPARK-25521:
--------------------------------
[~Bjangir] I can see that the job context does not have a job ID by the time the insert flow reaches FileFormatWriter.scala. I will look into this issue further and raise a PR to handle it. Thanks for reporting.
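The symptom is consistent with the job ID never being propagated into the context that the commit log line reads from. A minimal Scala sketch of that failure mode, with purely illustrative names (WriteJobContext and commitMessage are not Spark's actual classes or methods):

```scala
// Illustrative only: a context whose jobId is never set ends up as null,
// and Scala string interpolation renders a null String as the text "null".
case class WriteJobContext(jobId: String)

def commitMessage(ctx: WriteJobContext): String =
  s"Job ${ctx.jobId} committed."

val unset = WriteJobContext(null)       // ID never propagated into the context
val set   = WriteJobContext("job_0")    // ID assigned before commit

println(commitMessage(unset))  // prints "Job null committed."
println(commitMessage(set))    // prints "Job job_0 committed."
```

This matches the log output below, where the surrounding DAGScheduler lines do know the job id (Job 0) but the FileFormatWriter commit line prints null.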
> Job id showing null when Job is finished.
> -----------------------------------------
>
> Key: SPARK-25521
> URL: https://issues.apache.org/jira/browse/SPARK-25521
> Project: Spark
> Issue Type: Bug
> Components: Spark Core, SQL
> Affects Versions: 2.3.1
> Reporter: Babulal
> Priority: Minor
> Attachments: image-2018-09-25-12-01-31-871.png
>
>
> scala> spark.sql("create table x1(name string,age int) stored as parquet")
> scala> spark.sql("insert into x1 select 'a',29")
> check logs
> 2018-08-19 12:45:36 INFO TaskSetManager:54 - Finished task 0.0 in stage 0.0 (TID 0) in 874 ms on localhost (executor
> driver) (1/1)
> 2018-08-19 12:45:36 INFO TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose tasks have all completed, from pool
> 2018-08-19 12:45:36 INFO DAGScheduler:54 - ResultStage 0 (sql at <console>:24) finished in 1.131 s
> 2018-08-19 12:45:36 INFO DAGScheduler:54 - Job 0 finished: sql at <console>:24, took 1.233329 s
> 2018-08-19 12:45:36 INFO FileFormatWriter:54 - Job null committed.
> 2018-08-19 12:45:36 INFO FileFormatWriter:54 - Finished processing stats for job null.
> res4: org.apache.spark.sql.DataFrame = []
>
> (see attachment image-2018-09-25-12-01-31-871.png)
>
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org