Posted to user@spark.apache.org by Michael Shtelma <ms...@gmail.com> on 2018/01/24 12:16:36 UTC

spark.sql call takes far too long

Hi all,

I have a problem with the performance of the sparkSession.sql call: it
takes up to a couple of seconds for me right now. I have a lot of
generated temporary tables registered within the session, and also a
lot of temporary data frames. Is it possible that the
analysis/resolution phases are taking far too long? Is there a way to
figure out what exactly takes so long?

Does anybody have any ideas on this?
Any assistance would be greatly appreciated!

Thanks,
Michael

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: spark.sql call takes far too long

Posted by "lucas.gary@gmail.com" <lu...@gmail.com>.
Hi Michael.

I haven't had this particular issue previously, but I have had other
performance issues.

Some questions which may help:

1. Have you checked the Spark Console?
2. Have you isolated the query in question? Are you sure it's actually
where the slowdown occurs?
3. How much data are you talking about and how complex is the query?

Usually, when debugging Spark slowness, it comes down to ineffective
data ingestion and/or a partition shuffle (in some cases both).

That can all be seen from the console.

Good luck!

Gary Lucas

On 24 January 2018 at 04:16, Michael Shtelma <ms...@gmail.com> wrote:

> Hi all,
>
> I have a problem with the performance of the sparkSession.sql call: it
> takes up to a couple of seconds for me right now. I have a lot of
> generated temporary tables registered within the session, and also a
> lot of temporary data frames. Is it possible that the
> analysis/resolution phases are taking far too long? Is there a way to
> figure out what exactly takes so long?
>
> Does anybody have any ideas on this?
> Any assistance would be greatly appreciated!
>
> Thanks,
> Michael
>