Posted to issues@spark.apache.org by "Samphel Norden (JIRA)" <ji...@apache.org> on 2015/07/25 21:29:04 UTC

[jira] [Commented] (SPARK-9347) spark load of existing parquet files extremely slow if large number of files

    [ https://issues.apache.org/jira/browse/SPARK-9347?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14641762#comment-14641762 ] 

Samphel Norden commented on SPARK-9347:
---------------------------------------

System info
System properties:
spark.logConf -> true
spark.yarn.jar -> <>/spark-assembly-1.3.1-hadoop2.3.jar
spark.executor.memory -> 4G
spark.executor.instances -> 20
spark.driver.memory -> 2G
spark.yarn.historyServer.address -> hbase-ctl03.303net.pvt:19888
SPARK_SUBMIT -> true
spark.app.name -> org.apache.spark.repl.Main
spark.driver.extraJavaOptions -> -XX:MaxPermSize=512m
spark.jars -> 
spark.master -> yarn-client
Classpath elements:



Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.3.1
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.6.0_31)
Type in expressions to have them evaluated.
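For context, the property dump above is what spark.logConf=true prints at shell startup. A rough standalone sketch that mirrors the same configuration in a small Scala application is shown below; the object name and app name are hypothetical, values are copied from the dump, and deployment-side settings (spark.driver.memory, spark.yarn.jar) are noted rather than set, since they must be supplied before the driver JVM starts.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Rough standalone equivalent of the spark-shell session logged above (Spark 1.3.1 API).
// Property values are copied from the dump; the YARN assembly jar path is omitted here.
object ParquetLoadRepro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("parquet-load-repro")      // the shell itself runs as org.apache.spark.repl.Main
      .setMaster("yarn-client")
      .set("spark.executor.memory", "4G")
      .set("spark.executor.instances", "20")
      .set("spark.logConf", "true")
    // spark.driver.memory and spark.yarn.jar are normally passed to spark-submit/spark-shell
    // on the command line, because they take effect before the driver JVM is launched.

    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    // ... load the parquet data here; see the reproduction sketch further below ...
    sc.stop()
  }
}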


> spark load of existing parquet files extremely slow if large number of files
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-9347
>                 URL: https://issues.apache.org/jira/browse/SPARK-9347
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.3.1
>            Reporter: Samphel Norden
>
> When the Spark SQL shell is launched and pointed at a folder containing a large number of parquet files, the sqlContext.parquetFile() call takes a very long time to load the tables.
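
To make the report concrete, a minimal timing sketch of the slow call inside the 1.3.1 shell shown above might look like the following; the directory path and table name are placeholders, and sqlContext is the SQLContext the shell already provides. Much of the delay typically occurs in parquetFile() itself, before any query runs, while the footers of the part files are read to discover the schema.

// Run inside the spark-shell session logged above (Spark 1.3.1).
// The path is a placeholder for a folder holding a very large number of parquet part files.
val t0 = System.nanoTime()
val df = sqlContext.parquetFile("hdfs:///data/with/many/parquet/files")
println(s"parquetFile() returned after ${(System.nanoTime() - t0) / 1e9} s")

df.registerTempTable("events")
sqlContext.sql("SELECT COUNT(*) FROM events").show()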



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org