Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/06/23 10:47:16 UTC

[jira] [Commented] (SPARK-16169) Saving intermediate dataframe increases processing time up to 5 times.

    [ https://issues.apache.org/jira/browse/SPARK-16169?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15346250#comment-15346250 ] 

Sean Owen commented on SPARK-16169:
-----------------------------------

It's hard to help without knowing what you're saving, how, to where, and why you think 5 minutes is too long. 
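For context, the usual way to save an intermediate DataFrame and then keep using it without recomputing its whole lineage is to cache it before the first action. A minimal sketch, assuming an active SparkSession and a hypothetical output path (`saveAndReuse` and the path are illustrative names, not the reporter's actual helper):

```scala
import org.apache.spark.sql.DataFrame

// Hypothetical helper: persist the intermediate result once, reuse it downstream.
def saveAndReuse(df: DataFrame): DataFrame = {
  // cache() marks the DataFrame for reuse, so the write below and any later
  // transformations share one computation of the lineage instead of two.
  val cached = df.cache()

  // First action materializes the cache while saving to the flush location.
  cached.write.mode("overwrite").parquet("/tmp/intermediate")

  // Subsequent steps operate on the cached data, not a recomputed plan.
  cached
}
```

Without the `cache()`, the save and every downstream action each re-execute the full upstream plan, which is one common cause of the slowdown described below.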

> Saving intermediate dataframe increases processing time up to 5 times.
> ----------------------------------------------------------------------
>
>                 Key: SPARK-16169
>                 URL: https://issues.apache.org/jira/browse/SPARK-16169
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Submit, Web UI
>    Affects Versions: 1.6.1
>         Environment: Amazon EMR
>            Reporter: Manish Kumar
>              Labels: performance
>         Attachments: Spark-UI.png
>
>
> When a Spark application (written in Scala) saves an intermediate dataframe, the application takes almost 5 times as long to process.
> Although the Spark UI clearly shows that all stages are completed, the Spark application remains in running status.
> Below is the command for saving the intermediate output and then using the dataframe.
> {noformat}
> saveDataFrame(flushPath, flushFormat, isCoalesce, flushMode, previousDataFrame, sqlContext)
> previousDataFrame
> {noformat}
> Here, previousDataFrame is the result of the last step and saveDataFrame simply saves the DataFrame at the given location; previousDataFrame is then used by the next steps/transformations.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org