Posted to issues@spark.apache.org by "Babulal (JIRA)" <ji...@apache.org> on 2018/09/25 06:28:00 UTC

[jira] [Created] (SPARK-25521) Job id showing null when the job is finished.

Babulal created SPARK-25521:
-------------------------------

             Summary: Job id showing null when the job is finished.
                 Key: SPARK-25521
                 URL: https://issues.apache.org/jira/browse/SPARK-25521
             Project: Spark
          Issue Type: Bug
          Components: Spark Core, SQL
    Affects Versions: 2.3.1
            Reporter: Babulal


scala> spark.sql("create table x1(name string,age int) stored as parquet")
scala> spark.sql("insert into x1 select 'a',29")
Check the logs: the FileFormatWriter lines report the job id as null.
2018-08-19 12:45:36 INFO TaskSetManager:54 - Finished task 0.0 in stage 0.0 (TID 0) in 874 ms on localhost (executor
driver) (1/1)
2018-08-19 12:45:36 INFO TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose tasks have all completed, from pool
2018-08-19 12:45:36 INFO DAGScheduler:54 - ResultStage 0 (sql at <console>:24) finished in 1.131 s
2018-08-19 12:45:36 INFO DAGScheduler:54 - Job 0 finished: sql at <console>:24, took 1.233329 s
2018-08-19 12:45:36 INFO FileFormatWriter:54 - Job null committed.
2018-08-19 12:45:36 INFO FileFormatWriter:54 - Finished processing stats for job null.
res4: org.apache.spark.sql.DataFrame = []
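
For context, a minimal sketch of where the null most likely comes from. This is an assumption based only on the wording of the log lines, not a confirmed root cause: if FileFormatWriter interpolates the Hadoop Job's id into the message, a driver-side Job that never had a JobID assigned will print "null". The snippet below can be pasted into spark-shell to see the same output:

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.mapreduce.Job

    // A freshly created Hadoop Job has no JobID until one is explicitly
    // assigned, so interpolating getJobID into a string yields "null".
    val job = Job.getInstance(new Configuration())
    println(s"Job ${job.getJobID} committed.")   // prints: Job null committed.

If that assumption holds, assigning a proper job id to the driver-side Job (or logging the Spark job id instead) would make the message meaningful.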

[Attachment: image-2018-09-25-11-56-55-916.png]


