Posted to issues@spark.apache.org by "Brad Willard (JIRA)" <ji...@apache.org> on 2015/05/07 21:38:59 UTC
[jira] [Created] (SPARK-7447) Large Job submission lag when using Parquet w/ Schema Merging
Brad Willard created SPARK-7447:
-----------------------------------
Summary: Large Job submission lag when using Parquet w/ Schema Merging
Key: SPARK-7447
URL: https://issues.apache.org/jira/browse/SPARK-7447
Project: Spark
Issue Type: Bug
Components: PySpark, Spark Core, Spark Submit
Affects Versions: 1.3.1, 1.3.0
Environment: Spark 1.3.1, aws, persistent hdfs version 2 with ebs storage, pyspark, 8 x c3.8xlarge nodes.
Reporter: Brad Willard
I have 2.6 billion rows in Parquet format, and I'm trying to use the new schema merging feature (before, in 0.8-1.2, I was enforcing a consistent schema manually, which was annoying).
I have approximately 200 Parquet files partitioned with key=<date>. Loading the DataFrame through the SQLContext is understandably slow, because I assume it's reading all the metadata from the Parquet files and doing the initial schema merge. So that's OK.
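For anyone unfamiliar with what schema merging does during that load, here is a rough, self-contained sketch (hypothetical partition paths and column types, not the actual dataset): the driver has to read per-file metadata and union every file's schema into one global schema, which is O(number of files) of driver-side work before the DataFrame is usable.

```python
# Hypothetical sketch of Parquet schema merging: the driver reads each
# partition file's footer and unions the per-file schemas. Paths and
# column types below are made up for illustration only.

partitions = {
    "key=2015-05-01/part-0.parquet": {"id": "bigint", "name": "string"},
    "key=2015-05-02/part-0.parquet": {"id": "bigint", "score": "double"},
    "key=2015-05-03/part-0.parquet": {"id": "bigint", "name": "string", "score": "double"},
}

def merge_schemas(file_schemas):
    """Union all per-file schemas into one global schema (column -> type)."""
    merged = {}
    for schema in file_schemas:
        for col, typ in schema.items():
            # A real merger would also reconcile conflicting types here.
            merged.setdefault(col, typ)
    return merged

merged = merge_schemas(partitions.values())
print(sorted(merged))  # -> ['id', 'name', 'score']
```

The point of the sketch is that this cost scales with the file count, so it's acceptable once at load time, but it would explain the symptom below if Spark were somehow repeating it per job.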
However, the problem is that once I have the DataFrame, any operation on it has a 10-30 second lag before it actually starts processing and shows up as an Active Job in the Spark Manager. This was an instant operation in all previous versions of Spark. Once the job is actually running, performance is fantastic; the job submission lag, however, is horrible.
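To put numbers on that lag, a plain wall-clock wrapper around the action is enough. This is a pure-Python sketch; the stand-in `fake_count` below is hypothetical, and in the real scenario you would pass a Spark action such as `df.count()` instead:

```python
import time

def run_action(action):
    """Time an action end to end. With a Spark action like df.count(),
    the gap between this start time and when the job first appears as
    Active in the UI is the submission lag being reported."""
    start = time.time()
    result = action()
    elapsed = time.time() - start
    return result, elapsed

def fake_count():
    # Hypothetical stand-in for a Spark action: a short sleep models
    # the driver-side pre-submission work plus job execution.
    time.sleep(0.05)
    return 2_600_000_000

result, elapsed = run_action(fake_count)
print(f"count={result}, took {elapsed:.2f}s")
```

Comparing this end-to-end time against the execution time shown in the UI isolates how much is spent before the job is even submitted.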
I'm wondering if there is a bug causing the schema merge to be recomputed. Running top on the master node shows a thread maxed out on one CPU during the lag, which makes me think it's not network I/O but some pre-processing on the driver before job submission.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org