Posted to user@spark.apache.org by Mars Xu <xu...@gmail.com> on 2017/02/07 03:17:08 UTC
How to get a Spark SQL statement execution duration?
Hello All,
Some Spark SQL statements will produce one or more jobs. I have 2 questions:
1. How is the cc.sql("sql statement") call divided into one or more jobs?
2. When I execute a Spark SQL query in the spark-shell client, how do I get the execution time (Spark 2.1.0)? If a SQL query produced 3 jobs, in my opinion the execution time is the sum of the 3 jobs' durations.
Thanks.
Mars
---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org
Re: How to get a Spark SQL statement execution duration?
Posted by 萝卜丝炒饭 <14...@qq.com>.
You can find the duration in the web UI, e.g. http://xxx:8080 (the address depends on your settings).
About the shell, I do not know how to check the time.
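One way to check the time in the shell: Spark 2.1.0's SparkSession provides a time helper, so in spark-shell you can wrap the whole statement. Below is a minimal stand-alone sketch of the same idea; the spark.sql call in the comment assumes a running spark-shell, and the local time helper just mirrors what SparkSession.time does:

```scala
// A local timing helper mirroring SparkSession.time (available in Spark 2.1.0):
// in spark-shell you can write  spark.time { spark.sql("...").show() }
// directly, which prints the wall-clock time of the whole statement
// regardless of how many jobs it spawned.
def time[T](f: => T): T = {
  val start = System.nanoTime()
  val result = f  // run the block being measured
  val elapsedMs = (System.nanoTime() - start) / 1000000
  println(s"Time taken: $elapsedMs ms")
  result
}

// usage with a plain computation (a Spark query would go in the braces instead)
val answer = time { (1 to 100).sum }
```

Measuring at the statement level this way avoids having to sum up the individual job durations from the UI by hand.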
---Original---
From: "Jacek Laskowski"<ja...@japila.pl>
Date: 2017/2/8 04:14:58
To: "Mars Xu"<xu...@gmail.com>;
Cc: "user"<us...@spark.apache.org>;
Subject: Re: How to get a Spark SQL statement execution duration?
On 7 Feb 2017 4:17 a.m., "Mars Xu" <xu...@gmail.com> wrote:
Hello All,
        Some Spark SQL statements will produce one or more jobs. I have 2 questions:
        1. How is the cc.sql("sql statement") call divided into one or more jobs?
It's an implementation detail. You can have zero or more jobs for a single structured query (query DSL or SQL). 
        2. When I execute a Spark SQL query in the spark-shell client, how do I get the execution time (Spark 2.1.0)? If a SQL query produced 3 jobs, in my opinion the execution time is the sum of the 3 jobs' durations.
Yes. What's the question then?
Jacek
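The zero-jobs case follows from lazy evaluation: transformations only build a query plan, and jobs are only scheduled once an action runs. A rough plain-Scala analogy of that behavior (the commented lines show real Spark calls, which assume a running spark-shell; the iterator below merely mimics the laziness):

```scala
// In spark-shell (not runnable here without a SparkSession):
//   val df = spark.range(1000).filter("id % 2 = 0")  // no job yet: just a plan
//   df.count()                                       // an action: now jobs run

// Plain-Scala analogy: map on an iterator is lazy like a transformation.
var evaluated = 0
val it = (1 to 5).iterator.map { n => evaluated += 1; n * 2 }
// nothing has been computed yet ("zero jobs")
val before = evaluated
// consuming the iterator forces evaluation ("the action triggers the work")
val total = it.sum
```

How many jobs an action produces is then up to the planner, which is why the count per statement is an implementation detail.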
Re: How to get a Spark SQL statement execution duration?
Posted by Jacek Laskowski <ja...@japila.pl>.
On 7 Feb 2017 4:17 a.m., "Mars Xu" <xu...@gmail.com> wrote:
Hello All,
Some Spark SQL statements will produce one or more jobs. I have 2 questions:
1. How is the cc.sql("sql statement") call divided into one or more jobs?
It's an implementation detail. You can have zero or more jobs for a single
structured query (query DSL or SQL).
2. When I execute a Spark SQL query in the spark-shell client, how do I get the execution time (Spark 2.1.0)? If a SQL query produced 3 jobs, in my opinion the execution time is the sum of the 3 jobs' durations.
Yes. What's the question then?
Jacek